The present disclosure relates to modules that provide optical signal detection.
Some handheld computing devices such as smart phones can provide a variety of different optical functions such as one-dimensional (1D) or three-dimensional (3D) gesture detection, 3D imaging, proximity detection, ambient light sensing, and/or front-facing two-dimensional (2D) camera imaging.
Proximity detectors, for example, can be used to detect the distance to (i.e., proximity of) an object up to distances on the order of about one meter. In some cases, a smudge (e.g., fingerprint) on the transmissive window or cover glass of the host device can produce a spurious proximity signal, which may compromise the accuracy of the proximity data collected.
The present disclosure describes optoelectronic modules operable to distinguish between signals indicative of reflections from an object of interest and signals indicative of a spurious reflection. Modules also are described in which particular light projectors in the module can serve multiple functions (e.g., can be used in more than one operating mode).
For example, in one aspect, a module is operable to distinguish between signals indicative of an object of interest and signals indicative of a spurious reflection, for example from a smudge (i.e., a blurred or smeared mark) on the host device's cover glass. The module can include a light projector operable to project light out of the module, and an image sensor including spatially distributed light sensitive components (e.g., pixels of a sensor) that are sensitive to a wavelength of light emitted by the light projector. The module includes processing circuitry operable to read signals from the spatially distributed light sensitive components of the image sensor and to assign peak signals associated, respectively, with particular ones of the light sensitive components either to a reflection from an object of interest (e.g., outside the host device) or to a spurious reflection (e.g., resulting from a smudge on a transmissive window of a host device).
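The peak-assignment step described above might be sketched as follows. The function, its parameter names, and the 0.9 guard factor are illustrative assumptions, not taken from this disclosure; the sketch assumes a simple triangulation geometry in which a reflection from distance Z produces a peak at offset x = f·B/Z from the optical axis, so a smudge on the nearby cover glass lands at a large, predictable offset.

```python
def classify_peaks(peak_offsets_mm, focal_len_mm, baseline_mm, cover_dist_mm):
    """Assign each detected peak either to a spurious (smudge) reflection
    or to an object of interest, based on its offset from the optical axis.

    Under simple triangulation (x = f * B / Z), a smudge on the cover
    glass (very small Z) produces a peak at a large, known offset.
    """
    # Offset at which a reflection from the cover glass itself would land.
    smudge_offset = focal_len_mm * baseline_mm / cover_dist_mm
    labels = []
    for x in peak_offsets_mm:
        # Peaks at or near the cover-glass offset are treated as spurious;
        # the 0.9 guard factor is an illustrative tolerance.
        labels.append("smudge" if x >= 0.9 * smudge_offset else "object")
    return labels
```

For example, with f = 2 mm, a 10 mm baseline, and the cover glass 15 mm away, a peak 0.02 mm from the axis would be assigned to an external object, while a peak 1.5 mm from the axis would be assigned to a smudge.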
In some implementations, a single module can be used for one or more of the following: proximity sensing, heart rate monitoring, and/or reflectance pulse oximetry. In each case, processing circuitry can distinguish between spurious signals (e.g., signals indicative of reflections caused by a smudge on a cover glass) and signals of interest (e.g., signals indicative of reflections from an object whose proximity is to be determined, or a person's finger or other body part, in the case of heart rate monitoring and/or reflectance pulse oximetry applications). The signals of interest then can be processed, depending on the application, to obtain a distance to an object, to determine a person's blood oxygen level, or to determine a person's heart rate. In some implementations, the module can be used for stereo imaging in addition to one or more of the foregoing applications. The addition of a light projector that provides structured light can be advantageous, for example, in some imaging applications as well. In some implementations, a particular light projector can serve multiple functions. For example, in some cases, a light projector that is operable to emit red light can be used when the module is operating in a flash mode or when the module is operating in a reflectance pulse oximetry mode.
When used for proximity sensing applications, some implementations can provide enhanced proximity detection. For example, some implementations include more than one light projector to project light out of the module toward an object of interest. Likewise, some implementations may include more than one optical channel. Such features can, in some cases, help improve accuracy in the calculation of the object's proximity.
In another aspect, a proximity sensing module includes a first optical channel disposed over an image sensor having spatially distributed light sensitive components. A first light projector is operable to project light out of the module. There is a first baseline distance between the first light projector and the optical axis of the channel. A second light projector is operable to project light out of the module. There is a second baseline distance between the second light projector and the optical axis. The spatially distributed light sensitive components of the image sensor are sensitive to a wavelength of light emitted by the first light projector and to a wavelength of light emitted by the second light projector. Processing circuitry is operable to read and process signals from the spatially distributed light sensitive components of the image sensor.
In some cases, the processing circuitry is operable to identify particular ones of the spatially distributed light sensitive components that sense peak signals based on light emitted by the first and second light projectors, and to determine a proximity to an object outside the module based at least in part on positions of the particular ones of the spatially distributed light sensitive components. In some instances, the first and second baseline distances differ from one another. Such features can, in some cases, help increase the range of proximities that can be detected.
In some cases, a particular optical channel and its associated spatially distributed light sensitive components can be used for other functions in addition to proximity sensing. For example, the same optical channel(s) may be used for proximity sensing as well as imaging or gesture recognition. In some cases, different imagers in the module or different parts of the light sensitive components can be operated dynamically in different power modes depending on the optical functionality that is required for a particular application. For example, a high-power mode may be used for 3D stereo imaging, whereas a low-power mode may be used for proximity and/or gesture sensing. Thus, in some cases, signals from pixels associated, respectively, with the different imagers can be read and processed selectively to reduce power consumption.
The modules may include multiple light sources (e.g., vertical cavity surface emitting lasers (VCSELs)) that generate coherent, directional, spectrally defined light emission. In some applications (e.g., 3D stereo matching), a high-power light source may be desirable, whereas in other applications (e.g., proximity or gesture sensing), a low-power light source may be sufficient. The modules can include both high-power and low-power light sources, which selectively can be turned on and off. By using the low-power light source for some applications, the module's overall power consumption can be reduced.
Thus, a single compact module having a relatively small footprint can provide a range of different imaging/sensing functions and can be operated, in some instances, in either a high-power mode or a low-power mode. In some cases, enhanced proximity sensing can be achieved. In some cases, by using different areas of the same image sensor for various functions, the number of small openings in the front casing of the smart phone or other host device can be reduced.
Other aspects, features and advantages will be readily apparent from the following detailed description, the accompanying drawings, and the claims.
As illustrated in
The module 100 also includes a light projector 114 such as a laser diode or vertical cavity surface emitting laser that is operable to emit coherent, directional, spectrally defined light emission. The light projector 114 can be implemented, for example, as a relatively low-power VCSEL (e.g., output power in the range of 1-20 mW, preferably about 10 mW) that can project infra-red (IR) light. The light projector 114 used for proximity sensing need not simulate texture and, therefore, can simply project an optical dot onto an object, whose distance or presence is to be detected based on light reflected from the object. In some implementations, the light projector 114 is operable to emit a predetermined narrow range of wavelengths in the IR part of the spectrum. The light projector 114 in some cases may emit light in the range of about 850 nm ± 10 nm, or in the range of about 830 nm ± 10 nm, or in the range of about 940 nm ± 10 nm. Different wavelengths and ranges may be appropriate for other implementations. The light emitted by the projector 114 may be reflected, for example, by an object external to the host device (e.g., a smart phone) such that the reflected light is directed back toward the image sensor 102.
In the illustrated module of
The module 100 can, in some cases, provide enhanced proximity sensing. For example, use of a VCSEL as the light projector 114 can provide more coherent, directional, and spectrally defined light emission than an LED. Further, as the image sensor 102 is composed of spatially distributed light sensitive components (e.g., pixels of a CMOS sensor), peaks in the detected intensity can be assigned by the processing circuitry 112 either to an object 124 of interest external to the host device or to a spurious reflection such as from a smudge 122 (i.e., a blurred or smeared mark) on the transmissive window 120 of the host device.
As shown in the example of
where “f” is the focal length of the lens stack, and Z is the proximity (i.e., the distance to the object 124 of interest). As the measured intensities are spatially defined and can be assigned either to the object 124 or to the smudge 122, the measured optical intensity associated with the object 124 can be correlated more accurately to distance. Such proximity detection can be useful, for example, in determining whether a user has moved a smart phone or other host device next to her ear. If so, in some implementations, control circuitry in the smart phone may be configured to turn off the display screen to save power. In some instances, the processing circuitry 112 can use the distance between the spurious reflection (e.g., the smudge signal) and the object signal as further input to correct for the measured intensity associated with the object 124.
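The triangulation relationship above can be illustrated with a minimal sketch, assuming a single projector at a baseline B from the optical axis and a peak offset x on the sensor obeying x = f·B/Z (the function and parameter names are illustrative, not from the disclosure):

```python
def proximity_from_peak(peak_offset_mm, focal_len_mm, baseline_mm):
    """Triangulation sketch: a peak at offset x from the optical axis
    corresponds to an object at distance Z = f * B / x."""
    return focal_len_mm * baseline_mm / peak_offset_mm
```

For example, with a 2 mm focal length and a 10 mm baseline, a peak 0.1 mm from the optical axis corresponds to an object about 200 mm away.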
In some cases, instead of, or in addition to, calculating the proximity of the object 124 using a triangulation technique, the intensity of the peak 134 associated with the object 124 can be correlated to a proximity (i.e., distance) using a look-up table or calibration data stored, for example, in memory associated with the processing circuitry 112.
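The look-up-table approach might be sketched as follows; the calibration values and linear interpolation scheme below are illustrative assumptions, standing in for calibration data that would be stored in memory associated with the processing circuitry 112.

```python
import bisect

# Illustrative calibration data (made up for this sketch): measured peak
# intensity (sensor counts) vs. object distance (mm).
CAL_INTENSITY = [900, 400, 100, 25]
CAL_DISTANCE = [50, 100, 200, 400]

def proximity_from_intensity(intensity):
    """Linearly interpolate a distance from the calibration look-up table."""
    # Sort the table by ascending intensity for interpolation.
    pts = sorted(zip(CAL_INTENSITY, CAL_DISTANCE))
    xs = [p[0] for p in pts]
    i = bisect.bisect_left(xs, intensity)
    if i == 0:
        return float(pts[0][1])   # below the calibrated range: clamp
    if i == len(pts):
        return float(pts[-1][1])  # above the calibrated range: clamp
    (x0, y0), (x1, y1) = pts[i - 1], pts[i]
    return y0 + (y1 - y0) * (intensity - x0) / (x1 - x0)
```

Intensities falling between calibration points are interpolated; intensities outside the calibrated range are clamped to the nearest table entry.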
In some implementations, it may be desirable to provide multiple optical channels for proximity sensing. Thus, data can be read and processed from more than one imager 104 (or an imager having two or more optical channels) so as to expand the depth range for detecting an object. For example, data detected by pixels 102B associated with a first optical channel may be used to detect the proximity of an object 124 at a position relatively far from the transmissive window 120 of the host device.
As shown in
where “f” is the focal length of the lens stack, “X1” is the distance (i.e., baseline) between the first light projector 114B and the optical axis 138 of the optical channel, and “X2” is the distance (i.e., baseline) between the second light projector 114A and the optical axis 138. In general, the greater the value of “Z,” the smaller will be the distance “d” between the pixels 140A, 140B. Conversely, the smaller the value of “Z,” the greater will be the distance “d” between the pixels 140A, 140B. Thus, since the value of “d” in
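The two-projector relationship between "d" and "Z" described above can be sketched as follows. The sketch assumes the two projectors sit on opposite sides of the channel's optical axis, so that the two spots land a distance d = f·(X1 + X2)/Z apart on the sensor; that geometry, and the names below, are assumptions for illustration.

```python
def proximity_two_projectors(d_mm, focal_len_mm, x1_mm, x2_mm):
    """Two-projector triangulation sketch (projectors assumed on opposite
    sides of the optical axis): the spot separation on the sensor is
    d = f * (X1 + X2) / Z, so Z = f * (X1 + X2) / d."""
    return focal_len_mm * (x1_mm + x2_mm) / d_mm
```

Consistent with the text, the computed Z grows as the measured separation d shrinks, and vice versa.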
In the examples of
In some implementations, as illustrated in
In some implementations, instead of (or in addition to) providing an emitter that emits light at an angle with respect to the emission channel's optical axis, the proximity detection module has a tilted field-of-view (FOV) for the detection channel. An example is illustrated in
The optics assembly 170 includes one or more beam shaping elements (e.g., lenses 174, 176) on the surface(s) of the transparent cover 172. The lenses 174, 176 are arranged over the image sensor 102 such that the optical axis 138A of the detection channel is tilted at an angle (α) with respect to a line 138B that is perpendicular to the surface of the image sensor 102. The lenses 174, 176 may be offset with respect to one another. In some implementations, the angle α is about 30° ± 10°. Other angles may be appropriate in some instances. A baffle 182 can be provided to reduce the likelihood that stray light will be detected and to protect the optics assembly 170. As illustrated in
Although the light beam emitted by the emitter 114 may have a relatively small divergence (e.g., 10°-20°), in some cases, it may be desirable to provide one or more beam shaping elements (e.g., collimating lenses 184, 186) on the surface(s) of the transparent cover 172 so as to reduce the divergence of the outgoing light beam even further (e.g., total divergence of 2°-3°). Such collimating lenses may be provided not only for the example of
In some cases, as illustrated by
The structured light emitted by the light projector 142 can result in a pattern 144 of discrete features (i.e., texture) being projected onto an object 124 external to the host device (e.g., a smart phone) in which the module is located. Light reflected by the object 124 can be directed back toward the image sensor 102 in the module. The light reflected by the object 124 can be sensed by the image sensor 102 as a pattern and may be used for proximity sensing. In general, the separation distances x1 and x2 in the detected pattern change depending on the distance (i.e., proximity) of the object 124. Thus, for example, assuming that the focal length ("f"), the baseline distance ("B") between the light projector 142 and the channel's optical axis 138, and the angle of emission from the structured light source 142 are known, the proximity ("Z") can be calculated by the processing circuitry 112 using a triangulation technique. The values of the various parameters can be stored, for example, in memory associated with the processing circuitry 112. Alternatively, the proximity can be determined from a look-up table stored in the module's memory. In some cases, the proximity of the object 124 can be determined based on a comparison of a measured disparity (e.g., x1) and a reference disparity, where a correlation between the reference disparity and distance is stored in the module's memory.
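The reference-disparity comparison might be applied as in the following sketch. It assumes the measured disparity scales inversely with distance relative to a reference disparity captured at a known distance; that scaling law and the names below are assumptions for illustration, not details from this disclosure.

```python
def proximity_from_disparity(measured_disp_mm, ref_disp_mm, ref_dist_mm):
    """Estimate proximity by comparing a measured pattern disparity to a
    stored reference disparity captured at a known reference distance.

    Assuming disparity scales as 1/Z, measured/ref = ref_dist/Z, so
    Z = ref_dist * ref_disp / measured_disp.
    """
    return ref_dist_mm * ref_disp_mm / measured_disp_mm
```

For example, if the measured disparity is half the reference disparity recorded at 100 mm, the object is estimated to be about 200 mm away.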
In some implementations, distances may be calculated from the projected structured light using the same triangulation method as for the non-structured light projector. The structured light emitter also can be useful for triangulation because it typically is located far from the imager (i.e., a large baseline). The large baseline enables better distance calculation (via triangulation) at longer distances.
In some implementations, the optical channel that is used for proximity sensing also can be used for other functions, such as imaging. For example, signals detected by pixels of the image sensor 102 in
As noted above, some implementations include two or more optical channels each of which is operable for use in proximity sensing. In some cases, the different channels may share a common image sensor, whereas in other cases, each channel may be associated with a different respective image sensor each of which may be on a common substrate. In implementations where multiple channels are used to acquire image data, the processing circuitry 112 can combine depth information acquired from two or more of the channels to generate three-dimensional (3D) images of a scene or object. Further, in some instances, as illustrated by
In some implementations, the structured pattern 144 generated by the light source 142 can be used for both imaging as well as proximity sensing applications. The module may include two different light projectors: a first light projector 142 that projects a structured pattern 144 used for imaging, and a second light projector 114 used for proximity sensing. Each light projector may have an optical intensity that differs from the optical intensity of the other projector. For example, the higher power light projector 142 can be used for imaging, whereas the lower power light projector 114 can be used for proximity sensing. In some cases, a single projector may be operable at two or more intensities, where the higher intensity is used for imaging, and the lower intensity is used for proximity sensing.
To enhance imaging capabilities, as shown in
In the examples illustrated in
The processing circuitry 112 can be configured to implement a triangulation technique to calculate the proximity of an object 124 in any of the foregoing module arrangements.
Some implementations include an autofocus assembly 164 for one or more of the optical channels. Examples are illustrated in
Also, in some implementations, as shown in
As noted above, in some implementations, the different light sources 114, 142 may be operable at different powers from one another such that they emit different optical intensities from one another. This can be advantageous to help reduce the overall power consumption in some cases.
In some implementations, control circuitry 113 mounted on the PCB 110 can operate the module selectively in different power modes (e.g., a low-power mode or a high-power mode).
As an example, in a low-power mode of operation, proximity data from the secondary imagers 104 can be read and processed. The proximity can be based on light emitted by a low-power light projector 114 and reflected by an object (e.g., a person's ear or hand). If 3D image data is not to be acquired, then data from the primary imager 154 need not be read and processed, and the high-power light projector 142 would be off. On the other hand, when 3D image data is to be acquired, the module can be operated in a high-power mode in which the high-power light projector 142 is turned on to provide a structured light pattern, and data from pixels in the primary imager 154, as well as data from pixels in the secondary imager(s) 104, can be read and processed.
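The mode-dependent selection of projectors and imagers described above might be organized as in the following sketch; the mode strings and component names are illustrative placeholders.

```python
def plan_capture(mode):
    """Return which light projector to enable and which imagers to read
    for a given operating mode (names are illustrative)."""
    if mode in ("proximity", "gesture"):
        # Low-power mode: low-power projector on, only secondary imagers
        # read, reducing overall power consumption.
        return {"projector": "low_power", "imagers": ["secondary"]}
    if mode == "3d_imaging":
        # High-power mode: structured-light projector on, primary and
        # secondary imagers both read for 3D stereo data.
        return {"projector": "high_power", "imagers": ["primary", "secondary"]}
    raise ValueError(f"unsupported mode: {mode}")
```

Control circuitry would then turn the selected projector on (and the other off) and read out only the listed imagers.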
In some implementations, the optical channels used for proximity sensing also can be used for gesture sensing. Light emitted by the low-power projector 114, for example, can be reflected by an object 124 such as a user's hand. As the user moves her hand, the processing circuitry 112 can read and process data from the secondary imagers 104 so as to detect such movement and respond accordingly. Signals indicative of hand gestures, such as left-right or up-down movement, can be processed by the processing circuitry 112 and used, for example, to wake up the host device (i.e., transition the device from a low-power sleep mode to a higher power mode). Referring to
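A left-right swipe gesture, for example, might be classified from the horizontal position of the reflection peak tracked over successive frames, as in this sketch (the threshold, units, and function name are illustrative assumptions):

```python
def detect_swipe(centroids_x, min_travel_px=20):
    """Classify a horizontal swipe from the x-centroid of the reflection
    peak across successive frames (pixel units; threshold illustrative)."""
    travel = centroids_x[-1] - centroids_x[0]
    if travel >= min_travel_px:
        return "swipe_right"
    if travel <= -min_travel_px:
        return "swipe_left"
    return "none"
```

A detected swipe could then be used, for example, to wake the host device from a low-power sleep mode.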
In the foregoing implementations, the modules are operable to distinguish between signals indicative of a reflection from an object of interest and signals indicative of a spurious reflection in the context of proximity sensing. However, similar arrangements and techniques also can be used for other reflective light sensing applications as well. In particular, the following combination of features also can be used in modules designed for reflectance pulse oximetry applications (e.g., to detect blood oxygen levels) and/or heart rate monitoring (HRM) applications: at least one collimated light source (e.g., a VCSEL), an image sensor including an array of spatially distributed light sensitive components (e.g., an array of pixels), and processing circuitry operable to read signals from the spatially distributed light sensitive components and to assign a first peak signal associated with a first one of the light sensitive components to a spurious reflection and to assign a second peak associated with a second one of the light sensitive components to a reflection from an object of interest. The signals (i.e., peaks) assigned to the object of interest then can be used by the processing circuitry 112 according to known techniques to obtain, for example, information about a person's blood oxygen level or heart rate.
Pulse oximeters, for example, are medical devices commonly used in the healthcare industry to measure the oxygen saturation levels in the blood non-invasively. A pulse oximeter can indicate the percent oxygen saturation and the pulse rate of the user. Pulse oximeters can be used for many different reasons. For example, a pulse oximeter can be used to monitor an individual's pulse rate during physical exercise. An individual with a respiratory condition or a patient recovering from an illness or surgery can wear a pulse oximeter during exercise in accordance with a physician's recommendations for physical activity. Individuals also can use a pulse oximeter to monitor oxygen saturation levels to ensure adequate oxygenation, for example, during flights or during high-altitude exercising. Pulse oximeters, for example, can include processing circuitry to determine oxygen saturation and pulse rate and can include multiple light emitting devices, such as one in the visible red part of the spectrum (e.g., 660 nm) and one in the infrared part of the spectrum (e.g., 940 nm). The beams of light are directed toward a particular part of the user's body (e.g., a finger) and are reflected, in part, to one or more light detectors. The amount of light absorbed by blood and soft tissues depends on the concentration of hemoglobin, and the amount of light absorption at each frequency depends on the degree of oxygenation of the hemoglobin within the tissues.
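The two-wavelength measurement described above is commonly reduced to a "ratio of ratios" computation, sketched below. The linear calibration constants (110 and 25) are typical textbook values for illustration, not ones taken from this disclosure; real devices use empirically calibrated curves.

```python
def spo2_ratio_of_ratios(red_ac, red_dc, ir_ac, ir_dc):
    """Classic 'ratio of ratios' sketch for pulse oximetry: compare the
    pulsatile (AC) to steady (DC) signal components at the red (~660 nm)
    and IR (~940 nm) wavelengths.

    Calibration constants below are illustrative textbook values:
    SpO2 (percent) is approximated as 110 - 25 * R.
    """
    r = (red_ac / red_dc) / (ir_ac / ir_dc)
    return 110.0 - 25.0 * r
```

A smaller pulsatile red absorption relative to IR (lower R) corresponds to higher estimated oxygen saturation.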
An example of an arrangement for a reflectance pulse oximetry module 200 is illustrated in
Processing circuitry in the modules of
In some cases, the pulse oximeter module includes more than one imager 104.
Each of the module arrangements of
In some implementations, as shown in
In view of the foregoing description, a single module can be used for one or more of the following applications: proximity sensing, gesture sensing, heart rate monitoring, reflectance pulse oximetry, flash and/or light indicators. For proximity sensing and heart rate monitoring applications, only a single light projector is needed, although in some cases, it may be desirable to provide multiple light projectors. For pulse oximetry applications, a second light projector can be provided as well. The processing circuitry 112 and control circuitry 113 are configured with appropriate hardware and/or software to control the turning on/off of the light projector(s) and to read and process signals from the imagers. In each case, the processing circuitry 112 can use the techniques described above to distinguish between spurious signals (e.g., signals indicative of reflections caused by a smudge on a cover glass) and signals of interest (e.g., signals indicative of reflections from an object whose proximity is to be determined, or a person's finger or other body part, in the case of heart rate monitoring and/or reflectance pulse oximetry applications). In some implementations, the module can be used for stereo imaging in addition to one or more of the foregoing applications. The addition of a light projector that provides structured light can be advantageous, for example, in some imaging applications.
Any of the foregoing module arrangements also can be used for other applications, such as determining an object's temperature. For example, if the imagers 104 are sensitive to infra-red light, the intensity of the detected signals can be indicative of the temperature (i.e., a higher intensity indicates a higher temperature). The processing circuitry 112 can be configured to determine the temperature of a person or object based on signals from the imagers using known techniques. Although light from the projector(s) is not required for such applications, in some cases, light from the projector (e.g., 114B) may be used to point to the object whose temperature is to be sensed.
Any of the foregoing module arrangements also can be used for determining an object's velocity. For example, the processing circuitry 112 can use signals from the imager(s) to determine an object's proximity as a function of time. In some cases, if it is determined by the processing circuitry 112 that the object is moving away from the module, the control circuitry 113 may adjust (e.g., increase) the intensity of light emitted by the structured light projector 142.
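The velocity determination described above can be sketched as a finite difference over proximity samples; the sample format and sign convention below are illustrative assumptions.

```python
def estimate_velocity(samples):
    """Estimate radial velocity (mm/s) from (time_s, distance_mm)
    proximity samples. Positive velocity means the object is moving
    away from the module."""
    (t0, z0), (t1, z1) = samples[0], samples[-1]
    return (z1 - z0) / (t1 - t0)
```

A positive result could then prompt the control circuitry 113 to increase the intensity of the structured light projector 142, as described above.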
In some implementations, the foregoing modules may include user input terminal(s) for receiving a user selection indicative of the type of application for which the module is to be used. The processing circuitry 112 would then read and process the signals of interest in accordance with the user selection. Likewise, the control circuitry 113 would control the various components (e.g., light projectors 114) in accordance with the user selection.
In general, the module's light projector(s) in the various implementations described above should be optically separated from the imagers such that the light from the light projector(s) does not directly impinge on the imagers. For example, an opaque wall or other opaque structure can separate the light projector(s) from the imager(s). The opaque wall may be composed, for example, of a flowable polymer material (e.g., epoxy, acrylate, polyurethane, or silicone) containing a non-transparent filler (e.g., carbon black, a pigment, an inorganic filler, or a dye).
Other implementations are within the scope of the claims.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/SG2015/050211 | 7/13/2015 | WO | 00

Number | Date | Country
---|---|---
62024040 | Jul 2014 | US
62051128 | Sep 2014 | US