This relates generally to electronic devices and, more particularly, to electronic devices with proximity sensors.
Electronic devices often include components that have sensors. For example, earbuds, cellular telephones, and other devices sometimes have light-based components such as light-based proximity sensors. A light-based proximity sensor may have a light source such as an infrared light-emitting diode and may have a light detector.
During operation, the light-emitting diode may emit light outwards from the electronic device. When the electronic device is near an external object, the emitted light may be reflected from the object and detected by the light detector. When the device is not in the vicinity of an external object, the light will not be reflected toward the light detector and only small amounts of reflected light will be detected by the light detector. However, some infrared light may reflect within the electronic device, thereby creating crosstalk. It may be difficult to determine whether light detected by the light detector is due to crosstalk or due to reflection from an external object.
Electronic devices may be provided with light-based components. The light-based components may include, for example, light-based proximity sensors. A light-based proximity sensor may have a light source such as an infrared light source and may have a light detector that detects whether light from the infrared light source has been reflected from an external object in the vicinity of an electronic device. Light sources may also be used as part of light-based transceivers, status indicator lights, displays, light-based touch sensors, light-based switches, and other light-based components. Illustrative configurations in which an electronic device is provided with a light-based component such as a light-based proximity sensor may sometimes be described herein as an example.
As shown in
Circuitry 16 may be used to run software on device 10. The software may control the operation of sensors and other components in device 10. For example, the software may allow circuitry 16 to control the operation of light-based proximity sensors and to take suitable actions based on proximity data gathered from the light-based proximity sensors. As an example, a light-based proximity sensor may be used to detect when a wireless earbud is in the ear of a user or may be used to detect when other user (human) body parts are in the vicinity of an electronic device. Based on information on whether or not the earbud is in the ear of a user or is otherwise in a particular position relative to a user, the software running on control circuitry 16 may adjust audio output and/or media playback operations, may change the operation of communications functions (e.g., cellular telephone operations) for a paired cellular telephone or other additional device that is associated with the earbud, or may take other suitable action.
As another example, the light-based proximity sensor may be used to detect when a cellular telephone has been brought into close proximity with a user's head or other body part (e.g., within 1 cm, within 2 cm, within 5 cm, etc.). Based on information about whether or not the cellular telephone is brought up to a user's head or is in a particular position relative to a user, the software running on control circuitry 16 may adjust the brightness of a display within device 10, may deactivate the display, may deactivate any touch functions associated with the display, or may take other suitable action.
To support interactions with external equipment, circuitry 16 may be used in implementing communications protocols. Communications protocols that may be implemented using circuitry 16 include wireless local area network protocols (e.g., IEEE 802.11 protocols—sometimes referred to as WiFi®), protocols for other short-range wireless communications links such as the Bluetooth® protocol, cellular telephone protocols, near-field communications protocols, and other wireless communications protocols.
Device 10 may include input-output devices 18. Input-output devices 18 may be used to allow data to be supplied to device 10 and to allow data to be provided from device 10 to external devices. Input-output devices 18 may include touch screens, displays without touch sensor capabilities, buttons, joysticks, scrolling wheels, touch pads, key pads, keyboards, microphones, cameras, speakers, status indicators, light sources, audio jacks and other audio port components, light sensors, accelerometers, and other sensors, and input-output components. These components may include light-based components such as components with light sources. As shown in
Proximity sensor 20 may include light source 22 and light detector 24. Light source 22 may emit light 26 that has the potential to be reflected from external objects such as object 28 (e.g., the ear or other body part of a user, inanimate objects, or other objects). Light detector 24 may measure how much of emitted light 26 is reflected towards device 10 as reflected light 30 and may therefore be used in determining whether an external object such as object 28 is present in the vicinity of device 10. Light 26 may be infrared light, visible light, or ultraviolet light (as examples). Infrared light is not visible to a user and is detectable by semiconductor infrared light detectors, so it may be desirable to form light source 22 from a component that emits infrared light. Light source 22 may be a light-emitting component such as a light-emitting diode or a laser diode (as examples). Proximity sensor 20 may output a proximity sensor reading (e.g., a proximity sensor output that is proportional to the distance between device 10 and object 28), and control circuitry 16 may monitor the proximity sensor reading and compare the proximity sensor reading to a predetermined threshold to detect proximity to external object 28. An example of device 10 is shown in
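The threshold comparison described above can be sketched as follows. This is a minimal illustration, not the sensor's actual interface: the function name, readings, and threshold value are all hypothetical.

```python
def object_in_proximity(sensor_reading, threshold):
    """Return True when a proximity sensor reading indicates a nearby object.

    The sensor output is treated as proportional to the distance between
    the device and the external object, so proximity is declared when the
    reading falls below a predetermined threshold.
    """
    return sensor_reading < threshold

# Hypothetical readings in arbitrary distance units.
near = object_in_proximity(1.2, threshold=5.0)   # object close to device
far = object_in_proximity(40.0, threshold=5.0)   # object far away
```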
As shown in
As discussed above in connection with
When a proximity sensor (or other input-output device 18) is formed in the border region, the opaque layer may be removed or modified to accommodate the proximity sensor. However, this is merely illustrative. Proximity sensor 20 may operate through the opaque layer or any other layer, if desired. In use, proximity sensor 20 may include a light emitter and a light detector. The light emitter may emit light (such as infrared light) through any overlapping layers (e.g., the display cover layer), and the light detector may receive reflected light that has bounced off of an external target through the display cover layer.
Alternatively or additionally, proximity sensor 20 may be formed at location 36, which is under display 14. For example, proximity sensor 20 may be formed under the array of pixels 32. In one example, proximity sensor 20 may include a light emitter and a light detector. The light emitter may emit light (such as infrared light) through the array of pixels (e.g., between adjacent pixels of the array of pixels), and the light detector may receive reflected light that has bounced off of an external target through the array of pixels. However, this is merely illustrative. If desired, one or more pixels may be removed to accommodate underlying proximity sensor 20. Alternatively, proximity sensor 20 may be formed in the same substrate as pixels 32.
Although proximity sensor 20 has been shown on the front face of device 10, this is merely illustrative. Proximity sensors may be formed on the front face, the rear face, and/or any other face of device 10. Regardless of the location of proximity sensor 20, the proximity sensor may detect the proximity of objects from electronic device 10. An example of proximity sensor 20 detecting an external object is shown in
As shown in
In some embodiments, light emitter 43 may be an infrared light emitter, such as an infrared light-emitting diode, or any other desired infrared light emitting device. In these embodiments, light detector 45 may be sensitive to infrared light. For example, light detector 45 may be an array of single-photon avalanche diode (SPAD) pixels that are each triggered and generate a signal in response to a single photon of infrared light. However, the use of an infrared light emitter and detector is merely illustrative. Light emitter 43 may emit light of any desired wavelength, such as visible, infrared, near-infrared, or ultraviolet, and light detector 45 may detect the same or a similar wavelength of light as the light emitted by light emitter 43.
Regardless of the wavelength of light emitted by light emitter 43, light emitter 43 may emit light 40 and 48 toward external target 42 through window 39 in cover layer 38. Cover layer 38 may be a display cover layer, such as the display cover layer that overlaps display 14 in
External target 42 may be exterior to device 10, and may be any desired object, and in some cases may be a portion of a user (e.g., a user's face, head, or hand). Once light 40 and 48 has reached target 42, the light may be respectively reflected as light 46 and light 50. Light 50 may be reflected back through cover layer 38 toward light detector 45, while light 46 may be reflected light that is reflected back toward device 10 away from light detector 45. In response to light 50, light detector 45 may generate a signal or signals representative of the amount and position of light 50. In turn, control and/or processing circuitry within device 10 may use the signal(s) to determine the proximity of external target 42 relative to proximity sensor 41 and therefore relative to device 10.
In addition to light 50 being reflected from external object 42, there may be stray light that reaches light detector 45. As shown in
As shown in
Portions 56 and 58 may correspond to target signals, or signals generated in response to light that has reflected off of an external target, such as target 42. In particular, portion 56 may correspond with a near-field target signal and portion 58 may correspond with a far-field signal (because the photons detected in portion 56 have returned to light detector 45 more quickly than the photons in portion 58, the target detected in portion 56 is closer than the target detected in portion 58). Portion 58 may include far-field signal 59 and baseline measurement 55 (e.g., known noise or error in light detector 45). As a result, for far-field signals, it may be straightforward for control circuitry or processing circuitry in device 10 to remove the baseline measurement 55 from the signals in portion 58 to determine the far-field signals 59.
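The baseline removal described above amounts to a per-bin subtraction. A minimal sketch, assuming the far-field portion of the histogram is a list of photon counts and the baseline is a known constant noise floor (the values below are hypothetical):

```python
def remove_baseline(far_field_bins, baseline):
    # Subtract the known baseline measurement (e.g., fixed noise or error
    # in the light detector) from each far-field bin, clamping at zero so
    # the subtraction cannot produce negative photon counts.
    return [max(count - baseline, 0) for count in far_field_bins]

# Hypothetical far-field histogram portion with a baseline of 10 counts.
far_field = remove_baseline([12, 11, 58, 94, 60, 13], baseline=10)
# far_field is [2, 1, 48, 84, 50, 3]; the remaining peak is the
# far-field target signal.
```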
Near-field portion 56, on the other hand, may include cross-talk, such as cross-talk caused by light 54 in
As shown in
To determine the amount of cross-talk detected by light detector 45, the SPAD pixel array may be split into two different sections, section 66 and section 68. Section 66 may be in a location with higher cross-talk (e.g., because of the location of light emitter 43 relative to light detector 45 in
A first illustrative near-field signal histogram 56A that may be produced using the SPAD pixels in section 66 is shown in
As a result of generating histograms 56A and 56B separately, control circuitry and/or processing circuitry in device 10 may weight the results of the two histograms to better approximate the actual near-field target signal. For example, control circuitry may apply a weight such that at least 60%, at least 70%, at least 80%, less than 90%, or any other desired percentage of near-field target signal 57 is generated by the SPAD pixels in section 68 (corresponding to histogram 56B). In this way, the near-field target signal may be more accurately measured, and near-field targets may be more accurately detected by proximity sensor 41.
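The weighting described above can be sketched as a per-bin blend of the two histograms. The function name, bin values, and the 80% default are illustrative assumptions (the text only states a range of roughly 60% to 90%):

```python
def weighted_near_field(hist_high_xtalk, hist_low_xtalk, low_xtalk_weight=0.8):
    """Combine the two near-field histograms bin by bin.

    hist_high_xtalk: counts from the SPAD section nearer the emitter
                     (more cross-talk; section 66 / histogram 56A).
    hist_low_xtalk:  counts from the section farther from the emitter
                     (less cross-talk; section 68 / histogram 56B).
    low_xtalk_weight: fraction of the estimate drawn from the cleaner
                      section; the text suggests at least 60% and up to
                      about 90%.
    """
    w = low_xtalk_weight
    return [w * lo + (1 - w) * hi
            for hi, lo in zip(hist_high_xtalk, hist_low_xtalk)]
```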
Although
Although
By having a shorter integration time for the SPAD pixel array, more data that corresponds with only the near-field signals may be obtained. If enough histograms are obtained in a short period of time, it may be assumed that the cross-talk and other ambient effects are relatively constant across the multiple histograms and that any differences between the histograms are due to the target moving. In this way, the near-field signal can be determined. If desired, taking multiple near-field measurements may be combined with generating multiple histograms for different portions of the SPAD pixel array (e.g., sections 66 and 68 of
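The motion-based separation described above can be sketched by differencing consecutive captures. This is a simplified model under the stated assumption that cross-talk and ambient contributions are constant across the rapid captures; the bin values are hypothetical.

```python
def frame_differences(histograms):
    """Per-bin differences between consecutive short-integration captures.

    If the captures are close enough in time that cross-talk and ambient
    effects are constant, those static terms cancel in each difference
    and the remainder can be attributed to target motion.
    """
    return [[curr - prev for prev, curr in zip(h0, h1)]
            for h0, h1 in zip(histograms, histograms[1:])]

# Three hypothetical rapid captures: bin 1 grows as a target approaches,
# while bin 0 (static cross-talk) cancels out of every difference.
diffs = frame_differences([[50, 5], [50, 9], [50, 14]])
```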
Although
Alternatively or additionally to splitting the near-field target signal into two different histograms that may be weighted to more accurately determine the near-field target signal, the histogram bins may have varied size. An example of this arrangement is shown in
As shown in
Time 70 (the demarcation between the fine and coarse bins) may be moved as desired based on the target being measured. For example, time 70 may be moved to ensure that the near-field signal is entirely within the time prior to time 70. Alternatively or additionally, time 70 may be moved past far-field signal if there is an application that requires more precise proximity detection or that requires the detection of multiple far-field targets. However, these examples are merely illustrative. In general, time 70 may be moved to any desired time relative to the near-field and far-field signals.
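The variable bin sizing above can be sketched by building bin edges in two passes: fine edges before the demarcation time and coarse edges after it. All units, widths, and arrival times below are hypothetical.

```python
import bisect

def variable_bin_histogram(arrival_times, boundary, fine_width,
                           coarse_width, t_max):
    """Build a histogram with fine bins before `boundary` (time 70 in
    the text) and coarse bins after it."""
    edges = []
    t = 0.0
    while t < boundary:          # fine bins cover the near-field region
        edges.append(t)
        t += fine_width
    t = boundary
    while t <= t_max:            # coarse bins cover the far-field region
        edges.append(t)
        t += coarse_width
    counts = [0] * (len(edges) - 1)
    for arrival in arrival_times:
        i = bisect.bisect_right(edges, arrival) - 1
        if 0 <= i < len(counts):
            counts[i] += 1
    return edges, counts
```

Moving `boundary` later, as the text describes, simply extends the run of fine bins further into the far-field region.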
If desired, finer bins may be used for the near-field measurements (as described in
As shown in
In particular, the signals that reach circuitry 53 first pass through OR circuitry 76, and the signals that pass through OR circuitry 76 are converted from the time domain to the digital domain via time-to-digital converters (TDCs) 78. Because the signals leaving TDCs 78 are in the digital domain, digital multiplexer (MUX) 80 may then route the digital signals leaving each TDC 78 to one of histogram 1 (82) or histogram 2 (84). As an example, histogram 1 (82) may correspond with histogram 56A of
By having each OR circuit 76 receive signals from a subset of the SPAD pixel array, such as from 12 SPAD pixels, from fewer than 16 SPAD pixels, from fewer than 32 SPAD pixels, or from any other desired subset of the pixel array, more target information may pass to TDCs 78 and eventually to histograms 82 and 84 than in conventional ambient light sensors. This may allow the control circuitry to better determine an amount of cross-talk in the near-field signals, as signals taken at the same time but at different locations in the SPAD array can be compared.
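The OR circuitry above can be modeled as grouping the SPAD array and reducing each group to a single event per cycle. This is a behavioral sketch only; the 12-pixel group size matches the example in the text, and the pixel states below are hypothetical.

```python
def or_tree(spad_fired, group_size=12):
    """Model the OR circuitry feeding each TDC.

    The SPAD array is split into groups (12 pixels per OR gate here);
    each OR output fires if any pixel in its group fired during the
    current cycle, and that single event is what the downstream TDC
    timestamps.
    """
    return [any(spad_fired[start:start + group_size])
            for start in range(0, len(spad_fired), group_size)]

# 24 hypothetical pixels: one firing pixel in the first group only.
fired = [False] * 24
fired[3] = True
tdc_inputs = or_tree(fired)   # [True, False]
```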
Although
In use, control circuitry in device 10 may dynamically search for a far-field signal in the output of light detector 45. If a far-field signal is found, a single histogram may be used to determine the far-field signal, due to the lack of cross-talk interference as described above in connection with
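The dynamic far-field search can be sketched as a simple scan of the histogram beyond the cross-talk-affected region. The function name, bin index, and peak threshold are illustrative assumptions, not the device's actual logic.

```python
def select_mode(histogram, far_field_start, min_peak):
    """Dynamic far-field search sketch.

    Scan the portion of the histogram beyond `far_field_start` for a bin
    exceeding `min_peak`. If such a far-field return exists, a single
    histogram suffices (cross-talk does not corrupt far-field bins);
    otherwise the sensor falls back to the near-field techniques
    described above (split histograms, repeated short captures).
    """
    far_peak = max(histogram[far_field_start:], default=0)
    return "far-field" if far_peak >= min_peak else "near-field"
```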
An alternative configuration for circuitry 53 is shown in
A second alternative configuration for circuitry 53 is shown in
For example, the counter may assign values to a group of 64 SPAD pixels (although any desired group size may be used) that have generated the most signals in response to an external object. Within that group of 64 pixels, each pixel may receive a value based on its individual contribution. Control circuitry within device 10 may then weight the output of TDCs 90 based on the value assigned to the pixel from which the output originated. In this way, circuitry 53 may keep track of the outputs of SPAD pixels having different characteristics, and the control circuitry may apply weights to those outputs based on the SPAD pixel characteristics. This process may be dynamic, as counter 94 may assign values to different groups of pixels and to the individual pixels within those groups of pixels based on the signal generated by the pixels (e.g., as an external target moves, different pixels will generate more signal). This may allow for more accurate measurements of either or both of near-field and far-field target measurements.
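The counter-based weighting above can be sketched as a two-step ranking: pick the most active group of pixels, then weight each selected pixel by its share of the group's events. The function name and event counts are hypothetical.

```python
def rank_and_weight(event_counts, group_size=64):
    """Counter-based weighting sketch.

    Select the `group_size` pixels that generated the most events in
    response to the external object, then give each selected pixel a
    weight proportional to its share of the group's total, mirroring
    the per-pixel values assigned by the counter.
    """
    ranked = sorted(range(len(event_counts)),
                    key=lambda i: event_counts[i], reverse=True)
    selected = ranked[:group_size]
    total = sum(event_counts[i] for i in selected) or 1
    return {i: event_counts[i] / total for i in selected}
```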
An illustrative example of using a proximity sensor with an array of multiple groups of SPAD pixels is shown in
SPAD pixels 106 in each micropixel 104 may generate signals in response to light that has been reflected off of an external target. These signals may be output over lines 108 to circuitry 120. Circuitry 120 may be a portion of proximity sensor 100 (e.g., processing circuitry within proximity sensor 100), or may be a portion of control circuitry or processing circuitry, such as control circuitry 16 of
The signals produced by each micropixel 104 are converted from the time domain to the digital domain via time-to-digital converters (TDC) 110. After being converted to the digital domain, the signals may be weighted using weight circuits 114. Weight circuits 114 may weight the signals based on outputs from weight calculation circuit 116. Weight calculation circuit 116 may base the appropriate weights on outputs from counter 112. Counter 112 may be similar to counter 94 of
For example, the counter may assign values to a group of 16 SPAD pixels (e.g., each micropixel 104, although any desired group size may be used) that have generated the most signals in response to an external object. Within that group of 16 pixels, each pixel may receive a value based on its individual contribution.
Weight calculation circuit 116 may determine the appropriate weights based on the values assigned by counter 112. Weight circuits 114 may then apply the weights to the outputs of TDCs 110.
Because the signals leaving weight circuits 114 are in the digital domain, TDC select multiplexers (MUXs) 118 may then route the digital signals leaving weight circuits 114 to histograms 122. TDC select MUXs 118 may route the digital signals based on outputs of TDC select controller 120, which may determine the appropriate signal routing based on outputs from counter 112 and information from TDC select register 121, which may be a look-up table or other register.
Although
Micropixel dynamic saturation control 124 may also be included in circuitry 120. In particular, micropixel dynamic saturation control 124 may selectively activate or deactivate SPADs 106 within micropixels 104 to change the saturation of each micropixel 104. For example, if circuitry 120 determines that one of micropixels 104 is more susceptible to cross-talk, some or all of SPADs 106 within that micropixel 104 may be deactivated (e.g., turned off to prevent the SPADs from generating measurements in response to light). In this way, micropixel dynamic saturation control 124 may adjust micropixels 104 as needed.
In use, control circuitry in device 10 may dynamically search for a far-field signal in the output of light detector 102. If a far-field signal is found, a single histogram may be used to determine the far-field signal, due to the lack of cross-talk interference as described above in connection with
Using counter 112 of
As shown in
Although
The foregoing is merely illustrative of the principles of this invention and various modifications can be made by those skilled in the art without departing from the scope and spirit of the invention. The foregoing embodiments may be implemented individually or in any combination.
This application claims the benefit of U.S. Provisional Patent Application No. 63/309,989, filed Feb. 14, 2022, which is hereby incorporated by reference herein in its entirety.