Optical interferometry proximity sensor with temperature variation compensation

Information

  • Patent Grant
  • 12169125
  • Patent Number
    12,169,125
  • Date Filed
    Wednesday, July 12, 2023
  • Date Issued
    Tuesday, December 17, 2024
Abstract
An optical proximity sensor includes a first vertical cavity surface-emitting laser configured for self-mixing interferometry to determine distance to and/or velocity of an object. The optical proximity sensor also includes a second vertical cavity surface-emitting laser configured for self-mixing interferometry to determine whether any variation in a fixed distance has occurred. The optical proximity sensor leverages output from the second vertical cavity surface-emitting laser to calibrate output from the first vertical cavity surface-emitting laser to eliminate and/or mitigate environmental effects, such as temperature changes.
Description
FIELD

Embodiments described herein relate to optical sensors, and, in particular, to optical interferometry proximity sensors configured to determine a distance to, and/or velocity of, an object.


BACKGROUND

An electronic device can include a system or sensor—referred to herein as a “proximity sensor”—to measure or estimate a distance separating that electronic device in free space from an object or surface, such as a user of the electronic device.


However, conventional proximity sensors are often highly susceptible to changes in temperature. More specifically, thermal expansion or contraction of a conventional proximity sensor often results in changes to one or more electrical and/or optical properties of the proximity sensor. As such, electronic devices requiring accurate output from conventional proximity sensors are typically burdened with a requirement to incorporate additional components or systems to compensate for effects of temperature, thereby increasing design complexity, component cost, and power consumption.


SUMMARY

Embodiments described herein reference an optical proximity sensor including two discrete vertical cavity surface-emitting lasers. Both vertical cavity surface-emitting lasers are configured for self-mixing interferometry.


The first vertical cavity surface-emitting laser is configured to illuminate an object to determine a distance to and/or a velocity of that object based on self-mixing interferometry. The second vertical cavity surface-emitting laser is configured to illuminate a surface a known distance away (e.g., an interior surface of a housing enclosing the optical proximity sensor).


In this manner, the optical proximity sensor leverages output from the second vertical cavity surface-emitting laser to calibrate output of the first vertical cavity surface-emitting laser. As a result, calibrated measurements obtained from the first vertical cavity surface-emitting laser are substantially independent of any effects of temperature or other environmental conditions.





BRIEF DESCRIPTION OF THE DRAWINGS

Reference will now be made to representative embodiments illustrated in the accompanying figures. It should be understood that the following descriptions are not intended to limit this disclosure to one included embodiment. To the contrary, the disclosure provided herein is intended to cover alternatives, modifications, and equivalents as may be included within the spirit and scope of the described embodiments, and as defined by the appended claims.



FIG. 1 depicts a schematic representation of an electronic device incorporating a proximity sensor, such as described herein, to determine a distance to, and/or a velocity of, an object near the electronic device.



FIG. 2A depicts a schematic representation of another electronic device incorporating a proximity sensor, such as described herein, to determine a distance to, and/or a velocity of, an object near the electronic device.



FIG. 2B depicts a schematic representation of another electronic device incorporating a proximity sensor, such as described herein, to determine a distance to, and/or a velocity of, a movable surface of the electronic device.



FIG. 3A depicts a schematic representation of an electronic device incorporating a proximity sensor, such as described herein, to determine a distance to, and/or a velocity of, a user's finger relative to an input region of the electronic device.



FIG. 3B depicts a schematic representation of an electronic device incorporating a proximity sensor, such as described herein, to determine a distance to, and/or a velocity of, a user's finger relative to a touch-input region of the electronic device.



FIG. 3C depicts a schematic representation of an electronic device incorporating a proximity sensor, such as described herein, to determine a distance to, and/or a velocity of, a movable surface of the electronic device against which a user of the electronic device can exert a force.



FIG. 3D depicts another schematic representation of an electronic device incorporating a proximity sensor, such as described herein, to determine a distance to, and/or a velocity of, a movable surface of the electronic device against which a user of the electronic device can exert a force.



FIG. 4 depicts a system diagram of a proximity sensor, such as described herein.



FIG. 5 depicts a simplified cross section view of a proximity sensor, such as described herein.



FIG. 6 depicts a simplified cross section view of a proximity sensor, such as described herein.



FIG. 7 depicts a simplified cross section view of a proximity sensor, such as described herein.



FIG. 8 depicts a simplified cross section view of a proximity sensor, such as described herein.



FIG. 9 depicts a simplified cross section view of a proximity sensor, such as described herein.



FIG. 10 depicts a simplified cross section view of a proximity sensor, such as described herein.



FIG. 11 is a flowchart depicting example operations of a method of operating a proximity sensor, such as described herein.



FIG. 12 is a flowchart depicting example operations of a method of operating a proximity sensor, such as described herein.





The use of the same or similar reference numerals in different figures indicates similar, related, or identical items.


The use of cross-hatching or shading in the accompanying figures is generally provided to clarify the boundaries between adjacent elements and to facilitate legibility of the figures. Accordingly, neither the presence nor the absence of cross-hatching or shading conveys or indicates any preference or requirement for particular materials, material properties, element proportions, element dimensions, commonalities of similarly illustrated elements, or any other characteristic, attribute, or property for any element illustrated in the accompanying figures.


Similarly, certain accompanying figures include vectors, rays, traces and/or other visual representations of one or more example paths—which may include reflections, refractions, diffractions, and so on, through one or more mediums—that may be taken by one or more photons originating from one or more light sources shown in, or, in some cases, omitted from, the accompanying figures. It is understood that these simplified visual representations of light are provided merely to facilitate an understanding of the various embodiments described herein and, accordingly, may not necessarily be presented or illustrated to scale or with angular precision or accuracy, and, as such, are not intended to indicate any preference or requirement for an illustrated embodiment to receive, emit, reflect, refract, focus, and/or diffract light at any particular illustrated angle, orientation, polarization, color, or direction, to the exclusion of other embodiments described or referenced herein.


Additionally, it should be understood that the proportions and dimensions (either relative or absolute) of the various features and elements (and collections and groupings thereof) and the boundaries, separations, and positional relationships presented therebetween, are provided in the accompanying figures merely to facilitate an understanding of the various embodiments described herein and, accordingly, may not necessarily be presented or illustrated to scale, and are not intended to indicate any preference or requirement for an illustrated embodiment to the exclusion of embodiments described with reference thereto.


DETAILED DESCRIPTION

Embodiments described herein relate to an optical proximity sensor for an electronic device. These embodiments typically include a vertical cavity surface-emitting laser (“VCSEL”) oriented to radiate a beam of light toward an object near the optical proximity sensor in order to determine a distance to that object and/or a velocity of that object relative to the optical proximity sensor. A distance to the object and/or a velocity of the object can be determined based on a reflection of the radiated beam from an external surface of the object. The object can be internal to, or external to, the electronic device incorporating the optical proximity sensor. The beam of light may be coherent (e.g., with all photons having the same frequency and phase) or may have a wavelength modulated according to a particular pattern, referred to herein as “wavelength modulation.”


For simplicity of description, example embodiments are understood to reference a VCSEL configured to emit light in a spectral range that includes a non-visible frequency band (e.g., infrared or ultraviolet light). However, it may be appreciated that this is merely one example and that in other embodiments, more than one VCSEL can be used (e.g., an array of VCSELs disposed in any suitable pattern) or, additionally or alternatively, one or more VCSELs configured to emit light in a spectral range including a visible frequency band can be used. Further, although not required for all embodiments, the example VCSEL described in reference to many embodiments that follow is understood to be a Class 1 laser as defined by the American National Standards Institute; in other cases, higher power lasers may be used.


As noted above, a beam emitted from a VCSEL of an optical proximity sensor may be reflected from an object. At least a portion of the reflected light can be directed back into the VCSEL to return to (i.e., reenter) a quantum well layer of the VCSEL, interfering with the operation thereof and changing an electrical and/or optical property of the VCSEL. The change in the electrical and/or optical property of the VCSEL is related to (1) the distance from the optical proximity sensor to the surface of the object and (2) the wavelength of light emitted by the VCSEL.


It may be appreciated that, because the wavelength of light emitted by the VCSEL—whether modulated or fixed—is known, any measured interference (also referred to as “self-mixing” effects) can be correlated to the absolute distance separating the surface of the object and the optical proximity sensor, for example by counting interference mode hops or by quantifying a property of a beat frequency (e.g., via frequency domain analysis). This distance is referred to herein as the “measured distance” separating the optical proximity sensor and the surface of the object. As may be appreciated, this construction leverages an effect typically referred to as “self-mixing” interferometry or reflectometry.
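

As a purely illustrative sketch of the fringe-counting approach mentioned above, the following Python snippet estimates an absolute distance from the number of self-mixing fringes observed while the emission wavelength is swept by Δλ, using the standard relation N ≈ 2·d·Δλ/λ². The function name, the zero-crossing counting scheme, and the numeric values are assumptions made for illustration and are not taken from the embodiments described herein.

```python
import numpy as np

def distance_from_fringe_count(pd_samples, lam, delta_lambda):
    """Estimate absolute distance by counting self-mixing fringes observed
    while the emission wavelength is swept by delta_lambda.

    Uses the standard relation N ~ 2*d*delta_lambda/lam**2, i.e.
    d ~ N*lam**2 / (2*delta_lambda), where N is the fringe count.
    """
    pd_samples = np.asarray(pd_samples, dtype=float)
    ac = pd_samples - pd_samples.mean()       # subtract the DC level
    # Count rising zero crossings as a simple proxy for the fringe count N.
    n_fringes = np.count_nonzero((ac[:-1] < 0) & (ac[1:] >= 0))
    return n_fringes * lam**2 / (2.0 * delta_lambda)

# Example: ~40 fringes over a 0.6 nm sweep at 940 nm corresponds to
# d ~ 40 * (940 nm)^2 / (2 * 0.6 nm) ~ 29 mm.
```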


In many embodiments, a second VCSEL can also be used (referred to herein as an “auxiliary” VCSEL). The auxiliary VCSEL can be positioned near the first VCSEL (also referred to herein as the “primary” VCSEL) such that the two experience a substantially identical thermal environment. In these constructions, the auxiliary VCSEL can be oriented to emit a beam of light toward a surface a fixed and known distance away from the auxiliary VCSEL, such as an internal surface of an enclosure of the optical proximity sensor. In these examples, a distance measurement obtained from the auxiliary VCSEL can be used to calibrate, in substantially real time, distance and/or velocity measurements obtained from the primary VCSEL.


These foregoing and other embodiments are discussed below with reference to FIGS. 1-12. However, those skilled in the art will readily appreciate that the detailed description given herein with respect to these figures is for explanatory purposes only and should not be construed as limiting.



FIG. 1 depicts a schematic representation 100 of an electronic device 102 configured to measure a distance d to an object 104 and/or a velocity v of the object 104 relative to an optical proximity sensor 106 disposed within a housing of the electronic device 102.


The optical proximity sensor 106 can include one or more VCSEL light sources—including a primary VCSEL light source and an auxiliary VCSEL light source—formed onto a substrate or semiconductor die; for simplicity of illustration and description the illustrated embodiment omits many of these elements, which are described in greater detail in reference to other figures presented herein.


The VCSEL light source(s) of the optical proximity sensor 106 can be formed in a pattern or array, although this may not be required. The VCSEL light sources of the optical proximity sensor 106 can be formed from any number of suitable materials or combinations of materials. In one example embodiment, the VCSEL light sources of the optical proximity sensor 106 each include, without limitation or express requirement: a first distributed Bragg reflector layer; an oxide layer defining an emission aperture; a quantum well layer; a second distributed Bragg reflector layer; and so on. In other examples, additional or fewer layers may be required. For simplicity of description, two VCSEL light sources of the optical proximity sensor 106 are referenced below—a primary VCSEL light source configured to emit light toward the object 104 and an auxiliary VCSEL light source configured to emit light toward a surface (not shown) a fixed distance away from the auxiliary VCSEL light source.


In many cases, the primary VCSEL light source and the auxiliary VCSEL light source of the optical proximity sensor 106 are each disposed within the same protective enclosure, potting, or encapsulation (including any housing or enclosure of the electronic device 102) to prevent damage.


As noted above, the primary VCSEL light source of the optical proximity sensor 106 can be used to determine a distance d to and/or a velocity v of the object 104. For example, in one embodiment, wavelength modulation may be used by the optical proximity sensor 106 to simultaneously detect distance d to, and velocity v of, the object 104 relative to the optical proximity sensor 106. Wavelength modulation can be achieved by modulating a drive current supplied to the VCSEL.


One example wavelength modulation leverages a triangular waveform including an “up cycle” (in which current supplied to the primary VCSEL and, correspondingly, wavelength of the radiated beam emitted therefrom increase linearly at a particular rate) and a “down cycle” (in which current supplied to the primary VCSEL and wavelength of the radiated beam emitted therefrom decrease linearly at the same particular rate). In this example, the undulation in wavelength can effectively simulate motion of the object 104 toward and away from the optical proximity sensor 106, whether or not that object 104 is actually moving. In these examples, frequency content of power output from the primary VCSEL—affected by self-mixing interference effects—can be described by Equation 1, presented below.
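

The following Python sketch illustrates one way such a triangular drive waveform and the resulting wavelength undulation could be modeled; the modulation frequency, current limits, and current-to-wavelength tuning slope are hypothetical values chosen only for illustration.

```python
import numpy as np

# Illustrative parameters; these values are assumptions for the sketch and are
# not taken from the embodiments described above.
f_mod = 1e3                    # triangular modulation frequency, Hz
i_min, i_max = 2e-3, 4e-3      # drive-current limits, A
k_tune = 0.3e-9                # assumed current-to-wavelength tuning slope, m/A
lambda_0 = 940e-9              # nominal emission wavelength at i_min, m

t = np.linspace(0.0, 2.0 / f_mod, 2000)    # two modulation periods
phase = (t * f_mod) % 1.0                  # position within each period

# Triangular drive current: a linear "up cycle" over the first half of each
# period and a linear "down cycle" (same |dI/dt|) over the second half.
i_drive = np.where(phase < 0.5,
                   i_min + (i_max - i_min) * (2.0 * phase),
                   i_max - (i_max - i_min) * (2.0 * (phase - 0.5)))

# The emitted wavelength follows the drive current, so dlambda/dt alternates
# sign between the up and down cycles while keeping the same magnitude.
lam = lambda_0 + k_tune * (i_drive - i_min)
d_lambda_dt = np.gradient(lam, t)
```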


More specifically, in Equation 1 (below), the quantity f_t denotes a frequency at which power output from the primary VCSEL of the optical proximity sensor 106 is modulated at a particular time t when the primary VCSEL of the optical proximity sensor 106 is emitting light at a wavelength λ.


It may be appreciated that as the absolute distance d to the object changes (which is equal to half of the total round-trip distance, d_rt), light received by the primary VCSEL at time t will have a different wavelength than the light emitted by the primary VCSEL at the same time t, because the wavelengths of the two beams differ by an amount related to the rate of change of wavelength over time, dλ/dt, multiplied by the total round-trip flight time required for the previously-emitted light to traverse the round-trip distance d_rt from the primary VCSEL to the object and back.


In this manner, power output from the primary VCSEL follows the triangular waveform (e.g., a current injected into the VCSEL light source at a rate of dI/dt).


In addition, power output from the primary VCSEL has an interference signal superimposed upon that triangular waveform that corresponds to the effects of constructive or destructive interference that results from different wavelengths of light interfering with one another in specific ways.


More specifically, the superimposed interference corresponds to the number of constructive and destructive interference “mode” transitions (e.g., also referred to as a beat frequency) that occur between two specific wavelengths of light that are determined based on the absolute distance to—or, more specifically, the round-trip time required to travel to and from—the object and based on the rate of change in wavelength, dλ/dt, which, in turn, is based on the rate of change in current injected into the VCSEL light source, dI/dt.


Phrased in another manner, a wavelength of light emitted from the primary VCSEL at time t0 is different from a wavelength of light emitted from the primary VCSEL at time t1 by an amount determined by the rate of change in wavelength, dλ/dt.


In this manner, because different wavelengths of light are emitted at different times (based on the rate of change dλ/dt), different wavelengths of light will be received within the primary VCSEL based on the time at which that particular wavelength was emitted.


As such, generally and broadly, it may be appreciated that the frequency content of the power output from the primary VCSEL (e.g., the quantity f_t) is directly related to the absolute distance separating the primary VCSEL and the object 104. The auxiliary VCSEL may operate on the same principle.


Further, it may be appreciated that the quantity f_t may increase or decrease as a result of one or more Doppler effects resulting from the velocity v of the object 104. For example, if the object 104 is moving toward the optical proximity sensor (e.g., parallel to the direction of propagation of the emitted beam), the frequency f_t may increase. Alternatively, if the object 104 is moving away from the optical proximity sensor (e.g., parallel to the direction of propagation of the emitted beam), the frequency f_t may decrease.


Equation 1, relating the values referenced above, follows:

f_t = (dλ/dt) · (d_rt / λ²) ± 2v/λ        (Equation 1)


In a more general form, the direction of motion of the object 104—or more particularly, the angle θ of motion of the object 104 relative to the direction of propagation of the emitted beam—can also be considered. Equation 2, representing this more general form, follows:

f_t = (dλ/dt) · (d_rt / λ²) ± (2v/λ) · cos(θ)        (Equation 2)


Using either Equation 1 or Equation 2, it may be appreciated that the distance to the object 104, represented by d_rt/2, and the velocity v of that object 104 can be readily determined by monitoring one or more characteristics of the frequency content of the power consumed by the primary VCSEL during the up cycle and down cycle of a triangular modulation waveform. As described in reference to certain embodiments that follow, these measurements obtained from the primary VCSEL can be calibrated, adjusted, or otherwise modified in response to a similar measurement obtained from the auxiliary VCSEL.
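

As a hedged illustration of that up-cycle/down-cycle analysis, the Python sketch below inverts Equation 1 for a pair of beat frequencies, assuming the Doppler term subtracts during the up cycle and adds during the down cycle; the sign convention, function name, and numeric values are assumptions, not part of the embodiments described herein.

```python
def distance_and_velocity(f_up, f_down, d_lambda_dt, lam):
    """Invert Equation 1 for the up and down cycles of a triangular sweep.

    Assumes (sign convention chosen for illustration only):
        f_up   = (dlambda/dt) * d_rt / lam**2  -  2*v/lam
        f_down = (dlambda/dt) * d_rt / lam**2  +  2*v/lam
    so the distance term is the average of the two beat frequencies and the
    Doppler term is half of their difference.
    """
    f_distance = 0.5 * (f_up + f_down)        # round-trip-distance term
    f_doppler = 0.5 * (f_down - f_up)         # Doppler term, 2*v/lam

    d_rt = f_distance * lam**2 / d_lambda_dt  # round-trip distance
    d = d_rt / 2.0                            # one-way distance to the object
    v = f_doppler * lam / 2.0                 # velocity along the beam axis
    return d, v


# Example with purely illustrative numbers: a 0.6 nm sweep over 0.5 ms at 940 nm.
# For these inputs, d is roughly 10 mm and v roughly 3.3 mm/s.
d, v = distance_and_velocity(f_up=20e3, f_down=34e3,
                             d_lambda_dt=0.6e-9 / 0.5e-3, lam=940e-9)
```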


In many embodiments, the optical proximity sensor 106 is configured to leverage triangular waveform modulation to obtain both distance and velocity information from a VCSEL (or array of VCSELs), such as the primary VCSEL, although it may be appreciated that this is merely one example and that other constructions and modulation techniques may be used.


Notwithstanding the foregoing, it may be appreciated that environmental conditions such as temperature and/or barometric pressure can affect one or more electrical properties of the primary VCSEL which, in turn, can decrease accuracy or precision of any calculation(s) based thereon (e.g., determinations of an object's distance, velocity, acceleration, and so on).


For example, as temperature or pressure changes, mechanical expansion or contraction may change physical proportions of a Bragg reflector and/or a quantum well layer of the primary VCSEL, which in turn may result in a change in the wavelength or modulation of light emitted from that primary VCSEL. In another example, as temperature changes, electrical resistance of the primary VCSEL will likewise change, decreasing the reliability of any measurements or operations that may be based on, or may require, precise voltage, current, or resistance. In still further cases, changes in temperature or pressure can alter the refractive index(es) of one or more portions of the primary VCSEL, which in turn may result in a change in the wavelength of light emitted from the primary VCSEL.


Accordingly, and as noted above, the optical proximity sensor 106 also includes the auxiliary VCSEL, which is positioned next to (e.g., on the same die or substrate), and typically but not necessarily constructed in the same manner as, the primary VCSEL. In many embodiments, the auxiliary VCSEL is thermally coupled to the primary VCSEL via a thermal coupling such that the primary VCSEL and the auxiliary VCSEL experience the same thermal environment.


More broadly, as a result of the physical proximity, thermal coupling, and/or similar (or identical) construction, it may be appreciated that the primary VCSEL and the auxiliary VCSEL in these constructions experience substantially identical environmental conditions.


In certain of these embodiments, the auxiliary VCSEL of the optical proximity sensor 106 is oriented to radiate a coherent beam of light toward a fixed reflective surface near, or within, the optical proximity sensor 106 (not shown). In many examples, the reflective surface is disposed substantially perpendicular to a propagation direction of that beam. The reflective surface is separated from the auxiliary VCSEL by a fixed and known distance referred to herein as the “reference distance.”


In these examples, the beam radiated from the auxiliary VCSEL (which is typically modulated in the same manner as the primary VCSEL) reflects from the reflective surface, resulting in self-mixing interference. As with the primary VCSEL, the self-mixing interference effects experienced by the auxiliary VCSEL correspond to (1) the reference distance and (2) the wavelength or modulation of light emitted by the auxiliary VCSEL. It may be appreciated that because both of these values are known, the self-mixing interference effects experienced by the auxiliary VCSEL are substantially constant while environmental conditions of the optical proximity sensor remain the same.


As a result of this example construction, self-mixing effects experienced by the auxiliary VCSEL can be leveraged as a reference to cancel or mitigate effects of environmental conditions on the primary VCSEL, such as changes in temperature or barometric pressure.


In view of the foregoing, more generally and broadly, embodiments described herein relate to an optical proximity sensor—such as the optical proximity sensor 106—that leverages self-mixing interferometry to: (1) measure a variable distance and/or velocity to a surface or object; and (2) measure variances or changes, if any, in a reference distance. Thereafter, the optical proximity sensor or, more specifically, a processing unit, circuit, or other controller of the optical proximity sensor, adjusts the measured variable distance in proportion to a variance detected in the reference distance. More simply, an optical proximity sensor, such as described herein, uses a measurement of a reference distance to calibrate—in real time—measurements of a variable distance. This construction has the effect, in many embodiments, of substantially reducing variability due to environmental conditions, such as temperature.
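

A minimal sketch of one possible proportional correction follows, under the assumption that environmental drift scales the apparent distance reported by both channels by the same factor; this simple model and its numeric values are illustrative only and do not represent a required calibration algorithm.

```python
def calibrate_distance(d_measured, d_ref_measured, d_ref_nominal):
    """Scale the primary-channel distance by the drift observed on the
    reference channel.

    Assumes (for illustration) that an environmental change multiplies the
    apparent distance reported by both VCSELs by the same factor, so the
    ratio nominal/measured on the reference channel corrects the primary
    channel.
    """
    scale = d_ref_nominal / d_ref_measured
    return d_measured * scale


# Example: a fixed 2.00 mm reference reads 2.02 mm, implying roughly 1% of
# apparent stretch; the same correction is applied to the primary measurement.
d_calibrated = calibrate_distance(d_measured=15.3e-3,
                                  d_ref_measured=2.02e-3,
                                  d_ref_nominal=2.00e-3)
```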


Any stationary or portable electronic device can incorporate an optical proximity sensor, such as described herein. Example electronic devices include, but are not limited to: mobile phone devices; tablet devices; laptop devices; desktop computers; computing accessories; peripheral input devices; home or business networking devices; aerial, marine, submarine, or terrestrial vehicle control devices or networking devices; mobile entertainment devices; augmented reality devices; virtual reality devices; industrial control devices; digital wallet devices; home or business security devices; wearable devices; head-mounted devices; hand-held controllers; health or medical devices; implantable devices; clothing-embedded devices; fashion accessory devices; home or industrial appliances; media appliances; and so on.


Similarly, the optical proximity sensor 106 can be leveraged by an electronic device for a number of suitable purposes. Example purposes include, but are not limited to: detecting distance and velocity of a user's finger (or other object, such as a stylus) to an input surface or component of the electronic device; detecting distance and velocity of a user's body (or any other object) to an input surface or component of the electronic device; detecting deflection in a surface of a housing of the electronic device due to a deformation caused by an application of force (e.g., by a user or other object, such as a stylus); and the like.


Similarly, the optical proximity sensor 106 can be manufactured or constructed in a number of suitable ways. Examples include, but are not limited to: a primary VCSEL positioned on the same die as an auxiliary VCSEL and disposed within an opaque enclosure defining a single transparent aperture aligned above the primary VCSEL; a primary VCSEL positioned on the same die as an auxiliary VCSEL and disposed within a transparent enclosure having a reflective material disposed above the auxiliary VCSEL; a single VCSEL disposed within an enclosure having a partially reflective material disposed above the VCSEL; and the like.


Similarly, the optical proximity sensor 106 can include any number of suitable optical adapters, lenses, or beam-shaping elements. Examples include, but are not limited to: reflectors; mirrors; translucent lens windows; transparent lens windows; concave lenses; convex lenses; tilted lenses; microlenses; macro lenses; collimators; polarizers; color filters; infrared-cut filters; infrared-pass filters; fiber optic cables; and the like.


In many embodiments, the optical proximity sensor 106 includes a component enclosure formed from a plastic or acrylic material; other conductive or non-conductive/insulating materials including glass and metal may also be suitable. The component enclosure can be formed from a single material or, alternatively, can be formed from multiple layers or regions of different materials joined together in a suitable manner (e.g., by adhesive, welding, and the like).


The component enclosure of the optical proximity sensor 106 can be opaque or transparent, or may include transparent regions and opaque regions. The component enclosure can include one or more reflective areas or regions.


In many embodiments, the component enclosure includes a lens or window disposed in an aperture defined through, or formed within, the component enclosure. Typically, the lens or window is disposed directly above, and aligned with, at least one of the VCSEL light sources. As a result of this construction, a beam of light emitted/radiated from at least one of the VCSEL light sources can propagate outwardly from the component enclosure of the optical proximity sensor 106. As an additional result of this construction, one or more reflections of an emitted/radiated beam that may reflect from a surface of the object 104 can be received by at least one of the VCSEL light sources.


In many embodiments, the optical proximity sensor 106 also includes one or more photodiodes disposed adjacent to, or integrated in, each of the VCSEL light sources. In these examples, the optical proximity sensor 106 (or, more specifically, a circuit or processor of, or communicably coupled to, the optical proximity sensor 106) can monitor power output from a photodiode to determine one or more performance characteristics, such as modulation frequency of that power output of a particular VCSEL light source. (see, e.g., Equations 1-2). It may be appreciated, however, that this foregoing example is merely one example; monitoring and/or measuring power output by and/or power consumption of a VCSEL light source may be performed in a number of suitable ways.
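

The following Python sketch illustrates one plausible form of the frequency-domain analysis mentioned above, assuming a digitized photodiode record for a single half-cycle of the triangular sweep; the sample rate, baseline-removal step, and synthetic test signal are assumptions made for illustration.

```python
import numpy as np

def dominant_beat_frequency(pd_samples, sample_rate):
    """Estimate the self-mixing beat frequency from one sweep half-cycle.

    Removes the roughly linear baseline that follows the drive ramp, windows
    the residual, and returns the frequency of the strongest spectral peak.
    """
    pd_samples = np.asarray(pd_samples, dtype=float)
    n = np.arange(len(pd_samples))
    baseline = np.polyval(np.polyfit(n, pd_samples, 1), n)
    ac = (pd_samples - baseline) * np.hanning(len(pd_samples))
    spectrum = np.abs(np.fft.rfft(ac))
    freqs = np.fft.rfftfreq(len(ac), d=1.0 / sample_rate)
    spectrum[0] = 0.0                      # ignore any residual DC component
    return freqs[np.argmax(spectrum)]


# Illustrative synthetic half-cycle: a 100 kHz beat riding on a linear ramp.
fs = 2e6
t = np.arange(0, 0.5e-3, 1.0 / fs)
pd = 1e-3 * t / t[-1] + 1e-5 * np.cos(2 * np.pi * 100e3 * t)
f_beat = dominant_beat_frequency(pd, fs)   # approximately 100 kHz
```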


In some embodiments, the optical proximity sensor 106 includes a thermal mass coupled to each VCSEL of the VCSEL light sources to promote even temperature distribution between each VCSEL light source. In other cases, a thermally conductive layer can be disposed adjacent to or below each VCSEL light source.


In one specific implementation of the example introduced above, the optical proximity sensor 106 of the electronic device 102 includes a component enclosure that retains, encloses, and protects two VCSEL light sources. For this embodiment, a first VCSEL light source is referred to as the primary VCSEL and a second VCSEL light source is referred to as the auxiliary VCSEL light source.


As with other embodiments described herein, the primary VCSEL light source can be configured to emit a beam of laser light outwardly from the component enclosure of the optical proximity sensor 106. In the illustrated embodiment, the primary VCSEL light source is oriented to emit/radiate light in a direction generally perpendicular to an edge of a housing of the electronic device 102. It may be appreciated, however, that this is merely one example and that other emission or radiation directions are possible. (see, e.g., FIG. 2A)


As with other embodiments described herein, the primary VCSEL light source is further configured to receive a reflection of the emitted/radiated beam off the object 104. This reflection can result in self-mixing interference within the primary VCSEL light source which, in turn, can affect power output of the primary VCSEL light source. Accordingly, monitoring power output of the primary VCSEL light source (e.g., via monitoring an output of a photodiode, such as described above) can be used to determine and/or calculate the distance d and the velocity v of the object 104 relative to the electronic device 102. (see, e.g., Equations 1-2).


Also as with other embodiments described herein, the auxiliary VCSEL light source can be configured to emit a beam of light toward a reflective surface separated from the auxiliary VCSEL light source by a fixed reference distance.


In some cases, the reflective surface is an interior surface of the component enclosure of the optical proximity sensor 106. In other cases, the reflective surface is an exterior surface of the component enclosure of the optical proximity sensor 106. In yet other examples, the reflective surface is defined by a reflective or metallic material disposed onto (e.g., via physical vapor deposition or another suitable technique), or molded within (e.g., via insert molding or co-molding), a surface or body portion of the component enclosure of the optical proximity sensor 106.


As with other embodiments described herein, the auxiliary VCSEL light source is further configured to receive a reflection of the beam emitted therefrom off the reflective surface. This reflection can result in self-mixing interference within the auxiliary VCSEL light source which, in turn, can affect power consumption by and/or power output of the auxiliary VCSEL light source. As with the primary VCSEL light source a photodiode may be used to monitor power output by and/or power consumption of the auxiliary VCSEL light source which, in turn, can be used to determine and/or calculate a variation in the reference distance—if any. (see, e.g., Equations 1-2). Any detected or calculated variance in the reference distance can be used to calibrate, in real time, the output of the primary VCSEL light source.


The optical proximity sensor 106 and the electronic device 102 can be communicably or functionally coupled in any suitable manner. More specifically, the optical proximity sensor 106 can be configured to communicate distance and/or velocity information (which is calculated or otherwise determined based on self-mixing of the primary VCSEL light source calibrated based on self-mixing of the auxiliary VCSEL light source), to a processor or system of the electronic device 102 in any suitable manner, according to any protocol, in compliance with any suitable digital or analog form or format.


Furthermore, as noted above, the electronic device 102 can be any suitable electronic device including both stationary and portable electronic devices. In one embodiment, the electronic device 102 is a wearable electronic device, such as a smart watch. In this example, the electronic device 102 can leverage the optical proximity sensor 106 to determine a distance to a user (represented by the object 104) and a velocity of that user while that user is interacting with the electronic device 102. For example, the electronic device 102 can leverage a signal sent from the optical proximity sensor 106 to determine whether a user is wearing the smart watch or is directing the smart watch toward the user's face.


More specifically, in some embodiments, the electronic device 102 may be configured to perform a function upon determining that the user has crossed one or more thresholds, such as distance thresholds or velocity thresholds. Such thresholds may be variable or fixed and may be set by, and/or stored within, a memory of the electronic device 102. In some examples, the thresholds may be based on a user setting, an application setting, or an operating system setting or mode of operation. In other cases, such thresholds may be based, at least in part, on a particular application executed or instantiated by a processor of the electronic device 102. For example, a threshold set associated with a telephony application may be different from a threshold set associated with a gaming application. It may be appreciated that any suitable threshold or set of thresholds, stored or accessed in any suitable form or format, may be used to inform one or more behaviors of the electronic device 102 in response to a signal received from the optical proximity sensor 106.
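

A minimal sketch of such per-application threshold sets follows; the application names, threshold values, and flag semantics are hypothetical and intended only to illustrate the idea of consulting a stored threshold set before triggering a behavior.

```python
# Hypothetical per-application threshold sets (distance in meters, velocity in m/s).
THRESHOLDS = {
    "telephony": {"near_distance": 0.05, "approach_velocity": 0.2},
    "gaming":    {"near_distance": 0.10, "approach_velocity": 0.5},
}

def crossed_thresholds(app, distance, velocity):
    """Return which thresholds the current proximity reading has crossed."""
    limits = THRESHOLDS.get(app, THRESHOLDS["telephony"])
    return {
        "near": distance <= limits["near_distance"],
        "approaching": velocity >= limits["approach_velocity"],
    }

flags = crossed_thresholds("telephony", distance=0.03, velocity=0.25)
```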


In one specific embodiment, the electronic device 102 can disable a screen of the electronic device 102 upon determining that a user is more than a threshold distance away from the electronic device 102.


In another specific embodiment, the electronic device 102 can modify a display or power setting of the electronic device 102 based on the distance and velocity of the user. Examples include, but may not be limited to: decreasing a brightness of a display or a display region upon receiving a signal from the optical proximity sensor 106 that the user is covering the display; increasing a brightness of a display upon receiving a signal from the optical proximity sensor 106 that the user is covering the display; highlighting a user interface element (e.g., an item of a list, a button, and the like) of a graphical user interface upon receiving a signal from the optical proximity sensor 106 that the user is hovering a finger near the display; highlighting or otherwise modifying a user interface element of a graphical user interface upon receiving a signal from the optical proximity sensor 106 that the user is hovering a finger near an input component of the electronic device 102 (e.g., rotary input device, push-button input device, touch input device, and so on); and so on.


In another embodiment, the electronic device 102 may be a portable electronic device such as a cellular phone. In these examples, the electronic device 102 can leverage a velocity or distance signal received from the optical proximity sensor 106 to determine an appropriate time to disable or enable a touch-sensitive display of the electronic device 102 when a user of the electronic device raises the cellular phone to the user's ear.


In another embodiment, the electronic device 102 may leverage a velocity or distance signal received from the optical proximity sensor 106 to determine whether the electronic device 102 is falling or will imminently impact a surface.


In another embodiment, the electronic device 102 may be a vehicle accessory or attachment. In these examples, the electronic device 102 can leverage a velocity or distance signal received from the optical proximity sensor 106 to determine a distance to, and/or a velocity of, another vehicle, pedestrian, or a road hazard.


In another embodiment, the electronic device 102 may position an optical proximity sensor, such as the optical proximity sensor 106, within an input/output communication port or a power port of the electronic device 102. In these examples, the electronic device 102 can leverage a velocity or distance signal received from the optical proximity sensor 106 to determine whether a cable is properly seated in the port, whether a cable is removed too quickly or in a manner that may cause damage to the electronic device 102, and so on.


It may be appreciated that the foregoing example embodiments are not exhaustive and that an optical proximity sensor, such as described herein, can be leveraged by an electronic device in any suitable manner to determine distance and/or velocity of a known or unknown object or surface relative to the electronic device.


For example, in some embodiments, an electronic device can include more than one optical proximity sensor, such as described herein. In another example, an electronic device can include an array of optical proximity sensors arranged in a pattern, such as in a line.


In many cases, an optical proximity sensor, such as described herein, can be disposed within a housing of an electronic device and aligned with a transparent aperture defined by the housing, but this may not be required. For example, in some embodiments, an optical proximity sensor can be disposed behind a display. In other cases, an optical proximity sensor can be disposed entirely within an electronic device housing. In these examples, the optical proximity sensor can be used to detect deflections or deformations in a surface of the electronic device that can result from a user applying a purposeful force to that surface. For example, in one embodiment, an optical proximity sensor is positioned entirely within a housing of an electronic device, behind a display. In this example, when a user of the electronic device applies a force to the display, the display may deform or bend, shortening the distance between the display and the optical proximity sensor. The optical proximity sensor, in turn, can detect and measure this deflection which can be correlated by a processor of the electronic device—and/or a processor of the optical proximity sensor—into a magnitude of force input.


The foregoing examples are not exhaustive; it may be appreciated that generally and broadly an electronic device can leverage one or more optical proximity sensors, such as described herein, for any suitable purpose or function.


For example, FIG. 2A depicts a schematic representation 200a of an electronic device 202 configured to measure a distance d to an object 204 and/or a velocity of the object 204 relative to an optical proximity sensor 206, such as described herein. In this example embodiment, the optical proximity sensor 206 can include a beam-shaping lens that redirects light emitted from the optical proximity sensor 206 to an angle θ. In this manner and as a result of this construction, the electronic device 202 and the optical proximity sensor 206 can determine velocity in multiple directions or along multiple axes (e.g., vx and vy). In this manner, together (optionally) with one or more additional optical proximity sensors, the electronic device 202 can determine multi-axis velocity and distance.
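

As a hedged illustration of how two angled line-of-sight readings could be resolved into planar velocity components, the Python sketch below performs a least-squares solve over the beam geometry; the beam angles and velocity readings are illustrative assumptions rather than values from the embodiments described herein.

```python
import numpy as np

def planar_velocity(v_los, beam_angles_rad):
    """Resolve two or more line-of-sight (along-beam) velocity readings into
    planar components (vx, vy).

    Each reading satisfies v_los_i = vx*cos(theta_i) + vy*sin(theta_i), so the
    components follow from a least-squares solve over the beam geometry.
    """
    a = np.column_stack([np.cos(beam_angles_rad), np.sin(beam_angles_rad)])
    solution, *_ = np.linalg.lstsq(a, np.asarray(v_los, dtype=float), rcond=None)
    return solution[0], solution[1]

# Two illustrative beams, 0 and 30 degrees from a common reference axis.
vx, vy = planar_velocity([0.10, 0.15], np.deg2rad([0.0, 30.0]))
```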


Still other embodiments can be implemented in other manners. For example, FIG. 2B depicts a schematic representation 200b of an electronic device 202 configured to leverage an optical proximity sensor 206 to measure a distance d to a flexible surface 208 and a velocity v of one or more deformations or flexions of that flexible surface 208. In this manner, and as a result of this construction, flexion of the flexible surface 208 can be quantified by the electronic device 202.


The foregoing embodiment depicted in FIGS. 1-2B and the various alternatives thereof and variations thereto are presented, generally, for purposes of explanation, and to facilitate an understanding of various configurations and constructions of an optical proximity sensor, such as described herein. However, it will be apparent to one skilled in the art that some of the specific details presented herein may not be required in order to practice a particular described embodiment, or an equivalent thereof.


Thus, it is understood that the foregoing and following descriptions of specific embodiments are presented for the limited purposes of illustration and description. These descriptions are not targeted to be exhaustive or to limit the disclosure to the precise forms recited herein. To the contrary, it will be apparent to one of ordinary skill in the art that many modifications and variations are possible in view of the above teachings.


For example, as noted above, an optical proximity sensor, such as described herein, can be leveraged by an electronic device for, without limitation: determining a proximity of a user to the electronic device beyond a threshold; determining a distance separating a user and the electronic device; determining a distance separating a user's finger and the electronic device; determining a distance separating a user's finger and an input region (e.g., touch screen, force input sensor, physical input component, rotary input component) of the electronic device; determining a velocity with which a user's finger approaches the electronic device; determining a velocity or distance with which a surface of the electronic device deforms or deflects in response to an input force; determining a velocity or distance with which a housing portion of the electronic device deforms or deflects in response to an input force; and so on.


Expanding upon these and related examples, FIGS. 3A-3D are presented to depict various example use cases for an optical proximity sensor, such as described herein. These figures depict a wearable electronic device, such as a smart watch, but it may be appreciated that this is merely one example. An optical proximity sensor, such as described herein, can be incorporated into any suitable electronic device and leveraged for any suitable purpose.


For example, FIG. 3A depicts a wearable electronic device 300 including a housing 302 that can be worn by a user (e.g., via a wristband 304). In this example, an optical proximity sensor 306 can be included within the housing 302 and positioned relative to a periphery or bezel region of a display 308 that defines a graphical user interface 308a with which a user can interact. As a result of this construction, the optical proximity sensor 306 can be configured to and oriented to detect, measure, or otherwise determine a distance d and/or a velocity v of a user's finger 310 relative to an input component 312. The input component 312 can be any suitable input component including, but not limited to: a rotating input component (e.g., a crown); a press-button input component; a solid-state input component; and so on.


In this example embodiment, the wearable electronic device 300 can be configured to perform a first function upon determining that the user's finger 310 is approaching the input component 312, to perform a second function upon determining that the user's finger 310 is departing or moving away from the input component 312, to perform a third function upon determining that the user's finger 310 is hovering near or on the input component 312, and so on. It may be appreciated that these examples are not exhaustive and that the wearable electronic device 300 can perform any suitable function or operation based on the distance d and/or the velocity v of the user's finger 310 relative to the input component 312 as determined by the optical proximity sensor.


For example, in one embodiment, the wearable electronic device 300 can modify a position, characteristic, size, color, or other quality of a graphical user interface element 308b in response to a change in the distance d between the input component 312 and the user's finger 310.


In another example, FIG. 3B depicts a wearable electronic device 300 including a housing 302 configured to couple to a user via a wristband 304. An optical proximity sensor 306, such as described herein, can be disposed behind a display 308 that renders a graphical user interface 308a. In one example, the optical proximity sensor 306 is configured to emit light through an inter-pixel region of the display 308.


In this example, the optical proximity sensor 306 can be configured to and oriented to detect, measure, or otherwise determine a distance d and/or a velocity v of a user's finger 310 relative to the display 308 and/or to the graphical user interface 308a. The display 308 can be implemented as or with, without limitation: a touch-sensitive screen or display; a force-sensitive screen or display; a haptic-output surface; and so on.


In this example embodiment, as with other embodiments described herein, the wearable electronic device 300 can be configured to perform any suitable function or operation based on the distance d and/or the velocity v—and/or changes therein over time—of the user's finger 310 relative to the display 308 as determined by the optical proximity sensor.


In another example, FIG. 3C depicts a wearable electronic device 300 including a housing 302 that can be attached to a user via a wristband 304. An optical proximity sensor, such as described herein, can be disposed within the housing 302. In this example, the optical proximity sensor can be configured to and oriented to detect, measure, or otherwise determine a distance d and/or a velocity v of a deflection of a display 308 that results from a downward force applied by the user's finger 310 to a graphical user interface 308a rendered by the display 308. The display 308 can be configured in the same manner as described in reference to FIG. 3B, and this description is not repeated.


In this example embodiment, as with other embodiments described herein, the wearable electronic device 300 can be configured to perform any suitable function or operation based on the distance d and/or the velocity v—and/or changes therein over time—of the user's finger 310 relative to the display 308 as determined by the optical proximity sensor. In many examples, the wearable electronic device 300 can be configured to correlate or otherwise convert at least one of a distance d and/or a velocity v of the deflection of the display 308 into a magnitude of force input F.
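

One simple way such a correlation might be implemented is to treat the display stack as a linear spring, as sketched below; the stiffness value is a hypothetical placeholder that would, in practice, come from device characterization rather than from anything specified herein.

```python
def deflection_to_force(deflection_m, stiffness_n_per_m=2.0e4):
    """Convert a measured deflection into an estimated force, assuming the
    display stack behaves as a linear spring (F = k * x). The stiffness value
    is a placeholder for illustration only.
    """
    return stiffness_n_per_m * deflection_m

force_n = deflection_to_force(25e-6)   # a 25 micrometer deflection -> ~0.5 N
```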


In yet another example, FIG. 3D depicts a wearable electronic device 300 including a housing 302 that can be attached to a user via a wristband 304. An optical proximity sensor 306, such as described herein, can be disposed within the housing 302 adjacent to a sidewall of the housing 302. More specifically, in this example, the optical proximity sensor 306 can be configured to and oriented to detect, measure, or otherwise determine a distance d and/or a velocity v of a deflection of a housing sidewall or housing section (identified in the figure as the sidewall 314) that results from a force F applied by the user.


In this example embodiment, as with other embodiments described herein, the wearable electronic device 300 can be configured to perform any suitable function or operation based on the determined magnitude of the force F—and/or changes therein over time—applied by the user's finger 310 as determined by the optical proximity sensor.


The foregoing embodiment depicted in FIGS. 3A-3D and the various alternatives thereof and variations thereto are presented, generally, for purposes of explanation, and to facilitate an understanding of various possible means by which an optical proximity sensor, such as described herein, can be leveraged or otherwise used by an electronic device. However, it will be apparent to one skilled in the art that some of the specific details presented herein may not be required in order to practice a particular described embodiment, or an equivalent thereof.


Thus, it is understood that the foregoing and following descriptions of specific embodiments are presented for the limited purposes of illustration and description. These descriptions are not targeted to be exhaustive or to limit the disclosure to the precise forms recited herein. To the contrary, it will be apparent to one of ordinary skill in the art that many modifications and variations are possible in view of the above teachings.


For example, it may be appreciated that, generally and broadly, and in view of the foregoing examples, an optical proximity sensor, such as described herein, can be used in a number of suitable ways to detect distance and/or velocity of any suitable surface. FIG. 4 depicts a system diagram of an optical proximity sensor 400, such as described herein. In particular, the optical proximity sensor 400 includes a processor 402, a memory 404 (optional), and a power controller 406 each of which may be interconnected and/or communicably or conductively coupled in any suitable manner.


As described herein, the term “processor” refers to any software and/or hardware-implemented data processing device or circuit physically and/or structurally configured to instantiate one or more classes or objects that are purpose-configured to perform specific transformations of data including operations represented as code and/or instructions included in a program that can be stored within, and accessed from, a memory, such as the memory 404. This term is meant to encompass a single processor or processing unit, multiple processors, multiple processing units, analog or digital circuits, application-specific integrated circuits, or other suitably configured computing element or combination of elements.


The power controller 406 is coupled to a primary VCSEL light source 408 and an auxiliary VCSEL light source 410. Each of these components—along with other components of the optical proximity sensor 400 that may be required or preferred in particular embodiments or implementations—can be disposed in whole or in part within a component enclosure 412. The optical proximity sensor 400 also includes a window 414 configured to allow light emitted from the primary VCSEL light source 408 to pass through the component enclosure 412 to illuminate a surface of an object 416.


In these examples, the processor 402 can be configured to cause the power controller 406 to generate a triangular current waveform (or other suitable periodic or non-periodic waveform) to simultaneously inject current into the primary VCSEL light source 408 and the auxiliary VCSEL light source 410. As noted with respect to other embodiments described herein, the primary VCSEL light source 408 and the auxiliary VCSEL light source 410 can each be configured to operate in a manner that leverages effects of self-mixing.


For simplicity, the operation of injecting current, whether modulated or otherwise, into a VCSEL light source to emit light and to promote self-mixing, whether by a power controller or processor such as described herein, is referred to herein as "driving" a light source.


In the illustrated embodiment, the primary VCSEL light source 408 is aligned with the window 414 such that light emitted from the primary VCSEL light source 408 traverses the window 414 and illuminates a surface of the object 416 (which may be a user, an internal surface of an electronic device, and so on).


Reflections from the object 416 traverse the window 414 and reenter the primary VCSEL light source 408 to cause self-mixing interference that is directly related to the distance dvar separating the object 416 from the primary VCSEL light source 408 and, additionally, directly related to the velocity of the object 416 relative to the primary VCSEL light source 408. In some cases, the power controller 406 can be further configured to monitor power output from one or more photodiodes optically coupled to the primary VCSEL light source 408 in order to quantify and/or otherwise determine variations in power output of the primary VCSEL light source 408 that result from the self-mixing effects. (see, e.g., Equations 1-2).


Conversely, in the illustrated embodiment, the auxiliary VCSEL light source 410 is positioned within the component enclosure 412 such that light emitted from the auxiliary VCSEL light source 410 only illuminates an internal surface of the component enclosure 412.


Reflections from the internal surface of the component enclosure 412 reenter the auxiliary VCSEL light source 410 to cause self-mixing interference that is directly related to the distance dref separating the internal surface of the component enclosure 412 from the auxiliary VCSEL light source 410. In some cases, as with the primary VCSEL light source 408 described above, the power controller 406 can be further configured to monitor power output from one or more photodiodes optically coupled to the auxiliary VCSEL light source 410 in order to quantify and/or otherwise determine variations in power output of the auxiliary VCSEL light source 410 that result from the self-mixing effects. (see, e.g., Equations 1-2).


In these embodiments, the processor 402 can be configured to receive one or more signals from the power controller 406 that correspond to one or more power use and/or power output characteristics of the auxiliary VCSEL light source 410 and the primary VCSEL light source 408. Thereafter, the processor 402 can be configured to calibrate and/or otherwise adjust an output corresponding to the primary VCSEL light source 408 based on an output corresponding to the auxiliary VCSEL light source 410 that relates to the fixed reference distance dref.


It may be appreciated that the calibration operation described herein can be performed by the processor 402 at any suitable time with any suitable form or format of data output from the power controller 406. For example, in some implementations, the processor 402 can be configured to modify or calibrate raw power consumption data—whether digital or analog—corresponding to the primary VCSEL light source 408 whereas in other cases, the processor 402 can be configured to modify or calibrate distance or velocity calculations. In still further examples, more than one calibration operation can be performed in sequence or in parallel on any suitable data.


In this manner, and as a result of the foregoing described example system architecture, an optical proximity sensor, such as described herein, is generally and broadly configured to (1) determine, in real time or substantially real time, a distance and/or velocity calculation based on self-mixing interferometry and (2) calibrate those measurements based on a second self-mixing interferometry calculation that is based on a fixed reference distance.
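

Pulling these pieces together, the following self-contained Python sketch outlines the end-to-end flow described above: invert Equation 1 for both channels, then scale the primary-channel distance by the drift observed on the fixed reference channel. The sign convention and the proportional-drift model are assumptions made only for illustration.

```python
def calibrated_object_distance(f_up_pri, f_down_pri, f_up_aux, f_down_aux,
                               d_lambda_dt, lam, d_ref_nominal):
    """End-to-end sketch: invert Equation 1 for both the primary and auxiliary
    channels, then scale the primary-channel distance by the drift observed on
    the fixed reference channel. All conventions here are illustrative.
    """
    def one_way_distance(f_up, f_down):
        f_distance = 0.5 * (f_up + f_down)          # round-trip-distance term
        return f_distance * lam**2 / d_lambda_dt / 2.0

    d_var = one_way_distance(f_up_pri, f_down_pri)  # uncalibrated object distance
    d_ref = one_way_distance(f_up_aux, f_down_aux)  # measured reference distance
    return d_var * (d_ref_nominal / d_ref)          # proportional correction
```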


More broadly, the foregoing embodiment depicted in FIG. 4 and the various alternatives thereof and variations thereto are presented, generally, for purposes of explanation, and to facilitate an understanding of various optical proximity sensor system architectures, such as described herein. However, it will be apparent to one skilled in the art that some of the specific details presented herein may not be required in order to practice a particular described embodiment, or an equivalent thereof.


Thus, it is understood that the foregoing and following descriptions of specific embodiments are presented for the limited purposes of illustration and description. These descriptions are not targeted to be exhaustive or to limit the disclosure to the precise forms recited herein. To the contrary, it will be apparent to one of ordinary skill in the art that many modifications and variations are possible in view of the above teachings.


For example, it may be appreciated that an optical proximity sensor and, in particular, a primary and auxiliary VCSEL associated therewith can be arranged and constructed in a variety of ways. As such, FIGS. 5-10 are presented to illustrate various example embodiments of an optical proximity sensor, such as described herein.


In particular, FIG. 5 depicts a simplified cross section view of an optical proximity sensor 500, such as described herein. The optical proximity sensor 500 is formed, at least in part, on a substrate 502 to which a semiconductor die 504 can be conductively and mechanically coupled. The semiconductor die 504 includes two separate VCSEL regions, each of which can include one or more individual VCSELs. These regions are identified in the figure as the auxiliary VCSEL region 506 and the primary VCSEL region 508. As noted with respect to other embodiments described herein, the auxiliary VCSEL region 506 and the primary VCSEL region 508 may be physically proximate (or immediately adjacent) to one another specifically such that each experiences substantially the same environmental conditions (e.g., temperature).


The semiconductor die 504, together with the auxiliary VCSEL region 506 and the primary VCSEL region 508, is enclosed on the substrate 502 by a component enclosure 510. An optical adapter 512 is insert molded or otherwise disposed into the component enclosure 510 and aligned above a central region or emission axis of the primary VCSEL region 508. In many examples, the optical adapter 512 may serve as a lens, but this is not required.


As a result of this construction, a beam of light emitted from the primary VCSEL region 508 can traverse the optical adapter 512 to illuminate an object 514. As with other embodiments described herein, a portion of the light radiated/emitted from the primary VCSEL region 508 can reflect from the object 514 and return to the primary VCSEL region 508 to facilitate self-mixing interferometry detection and/or measurement of a distance dvar (and/or a velocity v) that separates an exterior surface of the component enclosure 510 from the surface of the object 514.


Conversely, a beam of light emitted from the auxiliary VCSEL region 506 does not traverse the optical adapter 512 and, instead, reflects directly from an interior surface of the component enclosure 510 to facilitate self-mixing interferometry detection and/or measurement of a fixed reference distance dref.


As noted with respect to other embodiments described herein, interferometric calculations and/or measurements to cancel or mitigate effects of temperature and/or other environmental conditions can be performed in whole or in part by a power controller 516 coupled to the primary VCSEL region 508 and the auxiliary VCSEL region 506. The power controller 516 may be configured in the same manner as described above in reference to FIG. 4; this description is not repeated.


In this example embodiment, the component enclosure 510 may be substantially opaque, although this may not be required. Similarly, in this example, the optical adapter 512 may be insert molded or otherwise formed from a material different from that of the component enclosure 510; this is also not expressly required.


For example, FIG. 6 depicts a simplified cross section view of an optical proximity sensor 600, such as described herein. As with other embodiments described herein, the optical proximity sensor 600 is formed, at least in part, on a substrate 602 to which a semiconductor die 604 can be conductively and mechanically coupled. The semiconductor die 604 includes two separate VCSEL regions, identified in the figure as the auxiliary VCSEL region 606 and the primary VCSEL region 608. These regions are defined physically proximate to one another in order to ensure substantially uniform environmental conditions.


The semiconductor die 604, together with the auxiliary VCSEL region 606 and the primary VCSEL region 608, is enclosed on the substrate 602 by a transparent component enclosure 610. The transparent component enclosure 610 includes a lensing region 612 aligned above a central region or emission axis of the primary VCSEL region 608. In many examples, the lensing region 612 may serve as a convex lens, but this is not required.


As a result of this construction, as with other embodiments described herein, a beam of light emitted from the primary VCSEL region 608 can traverse the lensing region 612 of the transparent component enclosure 610 to illuminate an object 614. As with other embodiments described herein, a portion of the light radiated/emitted from the primary VCSEL region 608 can reflect from the object 614 and return to the primary VCSEL region 608 to facilitate self-mixing interferometry detection and/or measurement of a distance dvar (and/or a velocity v) that separates an exterior surface of the transparent component enclosure 610 from the surface of the object 614.


Conversely, a beam of light emitted from the auxiliary VCSEL region 606 does not traverse the lensing region 612 of the transparent component enclosure 610 and, instead, reflects directly from an interior surface of the transparent component enclosure 610 to facilitate self-mixing interferometry detection and/or measurement of a fixed reference distance dref. In many embodiments, the transparent component enclosure 610 can optionally include a reflective region 610a to increase the quantity of light reflected back to the auxiliary VCSEL region 606.


In some cases, the reflective region 610a of the transparent component enclosure 610 can be formed by depositing a reflective ink or paint onto an interior or exterior surface of the transparent component enclosure 610. In other cases, the reflective region 610a of the transparent component enclosure 610 can be formed by inserting (e.g., via insert molding or co-molding) a reflective material, such as a metal or a high-reflectivity multilayer dielectric stack, into the transparent component enclosure 610. In other cases, the reflective region 610a of the transparent component enclosure 610 can be formed by adhering or otherwise attaching a reflective material, such as a metal, to the transparent component enclosure 610. In still other cases, the reflective region 610a of the transparent component enclosure 610 can be formed by introducing an optical index of refraction mismatch with the transparent component enclosure 610. It may be appreciated that these examples are not exhaustive and that other methods of increasing reflectivity of the transparent component enclosure 610 can be suitable for other embodiments.


As noted with respect to other embodiments described herein, interferometric calculations and/or measurements to cancel or mitigate effects of temperature and/or other environmental conditions can be performed in whole or in part by a power controller 616 coupled to the primary VCSEL region 608 and the auxiliary VCSEL region 606. The power controller 616 may be configured in the same manner as described above in reference to FIG. 4; this description is not repeated.


Still further constructions are possible. For example, FIG. 7 depicts a simplified cross section view of an optical proximity sensor 700, such as described herein. As with other embodiments described herein, the optical proximity sensor 700 is formed, at least in part, on a substrate 702 to which a semiconductor die 704 can be conductively and mechanically coupled. The semiconductor die 704 includes two separate VCSEL regions, identified in the figure as the auxiliary VCSEL region 706 and the primary VCSEL region 708. As with other embodiments described herein, the auxiliary VCSEL region 706 and the primary VCSEL region 708 are defined and/or disposed physically proximate to one another.


The semiconductor die 704 is enclosed on the substrate 702 by a transparent component enclosure 710. Similar to the embodiment depicted in FIG. 5, an optical adapter 712 can be insert molded or otherwise disposed into the transparent component enclosure 710 and aligned above a central region or emission axis of the primary VCSEL region 708. As noted above, the optical adapter 712 may serve as a lens, but this is not required.


As a result of this construction, and as with other embodiments described herein, a beam of light emitted from the primary VCSEL region 708 can traverse the optical adapter 712 of the transparent component enclosure 710 to illuminate an object 714. A reflection therefrom can return to the primary VCSEL region 708 to facilitate self-mixing interferometry detection and/or measurement of a distance dvar (and/or a velocity v) that separates an exterior surface of the transparent component enclosure 710 from the surface of the object 714.


Conversely, a beam of light emitted from the auxiliary VCSEL region 706 does not traverse the optical adapter 712 of the transparent component enclosure 710 and, instead, reflects directly from an exterior surface of the transparent component enclosure 710 to facilitate self-mixing interferometry detection and/or measurement of a fixed reference distance dref. In this embodiment, in contrast to the embodiment shown in FIG. 6 (in which light reflects from an interior surface of the enclosure), the transparent component enclosure 710 can optionally include a reflective region 710a to increase the quantity of light reflected back to the auxiliary VCSEL region 706. The reflective region 710a can be configured and/or coupled to the transparent component enclosure 710 in the same manner as described above in reference to FIG. 6; this description is not repeated.


As noted with respect to other embodiments described herein, interferometric calculations and/or measurements to cancel or mitigate effects of temperature and/or other environmental conditions can be performed in whole or in part by a power controller 716 coupled to the primary VCSEL region 708 and the auxiliary VCSEL region 706. The power controller 716 may be configured in the same manner as described above in reference to FIG. 4; this description is not repeated.


In another example embodiment, FIG. 8 depicts a simplified cross section view of an optical proximity sensor 800, such as described herein. As with other embodiments described herein, the optical proximity sensor 800 is formed, at least in part, on a substrate 802 to which a semiconductor die 804 is conductively and mechanically coupled. The semiconductor die 804 includes two separate VCSEL regions, identified in the figure as the auxiliary VCSEL region 806 and the primary VCSEL region 808. These regions are disposed physically proximate to one another.


The semiconductor die 804 is enclosed on the substrate 802 by a transparent component enclosure 810. Similar to the embodiment depicted in FIG. 6, the transparent component enclosure 810 can include a lensing region 812 aligned with an emission/radiation axis of the primary VCSEL region 808. In this embodiment, the optical proximity sensor 800 can be positioned behind, and/or otherwise coupled to, a transparent substrate 814. In one example embodiment, the transparent substrate 814 is a cover glass positioned over a display of an electronic device such as a smart watch or a cellular phone. In many cases, the optical proximity sensor 800 is coupled to the transparent substrate 814 with an optically clear adhesive.


As a result of this construction, a beam of light emitted from the primary VCSEL region 808 can illuminate an object 816 through the transparent substrate 814. A reflection therefrom can return to the primary VCSEL region 808 to facilitate self-mixing interferometry detection and/or measurement of a distance dvar (and/or a velocity v) that separates an exterior surface of the transparent component enclosure 810 from the surface of the object 816.


Conversely, as with other embodiments described herein, a beam of light emitted from the auxiliary VCSEL region 806 does not traverse the lensing region 812 of the transparent component enclosure 810 and, instead, reflects directly from an exterior surface of the transparent component enclosure 810 to facilitate self-mixing interferometry detection and/or measurement of a fixed reference distance dref. In this embodiment, in contrast to the embodiment shown in FIG. 6 (in which light reflects from an interior surface of the enclosure), the transparent component enclosure 810 can optionally include a reflective region 810a to increase the quantity of light reflected back to the auxiliary VCSEL region 806. The reflective region 810a can be configured and/or coupled to the transparent component enclosure 810 in the same manner as described above in reference to FIG. 6; this description is not repeated.


As noted with respect to other embodiments described herein, interferometric calculations and/or measurements to cancel or mitigate effects of temperature and/or other environmental conditions can be performed in whole or in part by a power controller 818 coupled to the primary VCSEL region 808 and the auxiliary VCSEL region 806. The power controller 818 may be configured in the same manner as described above in reference to FIG. 4; this description is not repeated.


Still further embodiments may not require an auxiliary VCSEL region. For example, FIG. 9 depicts a simplified cross section view of an optical proximity sensor 900. The optical proximity sensor 900 includes a substrate 902 coupled to a semiconductor die 904 defining a primary VCSEL region 906. The semiconductor die 904 and the primary VCSEL region 906 are enclosed against the substrate 902 by a transparent component enclosure 908 that can, similar to other embodiments described herein, optionally include a lensing region 910. In this example embodiment, a partially-transparent layer 912 is disposed, formed, or otherwise coupled to an interior surface of the transparent component enclosure 908. In this manner, the partially-transparent layer 912 can reflect a portion of light emitted/radiated from the primary VCSEL region 906 back to the primary VCSEL region 906. In addition, the partially-transparent layer 912 can transmit a portion of light emitted/radiated from the primary VCSEL region 906 toward an object 914.


As a result of this construction, a beam of light emitted from the primary VCSEL region 906 can illuminate the object 914 through the partially-transparent layer 912. As with other embodiments described herein, a reflection from the object 914 can return to the primary VCSEL region 906 to facilitate self-mixing interferometry detection and/or measurement of a distance dvar (and/or a velocity v) that separates an exterior surface of the transparent component enclosure 908 from the surface of the object 914.


In addition, the portion of light reflected from the partially-transparent layer 912 facilitates self-mixing interferometry detection and/or measurement of the fixed reference distance dref.
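Because a single primary VCSEL region receives reflections from both the partially-transparent layer and the object in this construction, the two contributions must be distinguishable in the monitored signal. The sketch below shows one possible way to separate them in the frequency domain; it is an illustrative assumption (not necessarily the approach contemplated herein) and presumes a wavelength-swept drive in which beat frequency scales with round-trip distance, with at least two distinct spectral peaks present, the lower attributable to the much shorter reference path.

    import numpy as np

    def split_reference_and_object(smi_signal: np.ndarray, sample_rate: float):
        """Return the two strongest beat frequencies in a self-mixing signal
        containing both a fixed-reference reflection and an object reflection
        (lower frequency attributed to the shorter reference path)."""
        windowed = smi_signal * np.hanning(smi_signal.size)
        spectrum = np.abs(np.fft.rfft(windowed))
        freqs = np.fft.rfftfreq(smi_signal.size, d=1.0 / sample_rate)
        # Simple local-maxima detection over the magnitude spectrum.
        interior = spectrum[1:-1]
        is_peak = (interior > spectrum[:-2]) & (interior > spectrum[2:])
        peak_bins = np.nonzero(is_peak)[0] + 1
        strongest = peak_bins[np.argsort(spectrum[peak_bins])[::-1][:2]]
        f_ref, f_obj = np.sort(freqs[strongest])
        return f_ref, f_obj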


As noted with respect to other embodiments described herein, interferometric calculations and/or measurements to cancel or mitigate effects of temperature and/or other environmental conditions can be performed in whole or in part by a power controller 916 coupled to the primary VCSEL region 906. The power controller 916 may be configured in the same manner as described above in reference to FIG. 4; this description is not repeated.


In still further embodiments, a partially-transparent layer may not be required. For example, FIG. 10 depicts an optical proximity sensor 1000, such as described herein. In this example, a substrate 1002 is coupled to a semiconductor die 1004 that defines a primary VCSEL region 1006. In this example, however, a component enclosure 1008 can be formed together with an optical adapter 1010 that can, optionally, perform one or more lensing functions. The component enclosure 1008 also includes a shelf 1012 that at least partially blocks (e.g., reflects) light emitted from the primary VCSEL region 1006. In this manner, interferometric calculations and/or measurements to cancel or mitigate effects of temperature and/or other environmental conditions and to determine a distance to and/or a velocity of an object 1014 can be performed in whole or in part by a power controller 1016 coupled to the primary VCSEL region 1006.


The foregoing embodiments depicted in FIGS. 5-10 and the various alternatives thereof and variations thereto are presented, generally, for purposes of explanation, and to facilitate an understanding of various possible constructions of an optical proximity sensor, such as described herein. However, it will be apparent to one skilled in the art that some of the specific details presented herein may not be required in order to practice a particular described embodiment, or an equivalent thereof.


Thus, it is understood that the foregoing and following descriptions of specific embodiments are presented for the limited purposes of illustration and description. These descriptions are not intended to be exhaustive or to limit the disclosure to the precise forms recited herein. To the contrary, it will be apparent to one of ordinary skill in the art that many modifications and variations are possible in view of the above teachings.



FIG. 11 is a flowchart depicting example operations of a method of operating a proximity sensor, such as described herein. The method 1100 includes operation 1102 in which a current or power waveform of a VCSEL is monitored. This method operation can be performed in whole or in part by a power controller or a processor such as described herein. The method 1100 further includes operation 1104 in which a target property is determined based on the monitored waveform of operation 1102. Example target properties include distance, velocity, and/or acceleration. In further examples, target properties can include one or more calculated quantities such as a change in distance over time, a direction or magnitude of velocity, and/or a magnitude of force applied to a surface to cause that surface to deflect or deform by a particular distance. The method 1100 further includes operation 1106 in which a target property value is updated in response to a change in the monitored waveform of operation 1102.
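For illustration only, the following sketch mirrors the flow of method 1100; read_waveform and waveform_to_property are hypothetical placeholders standing in for operation 1102 (monitoring the current or power waveform) and operation 1104 (converting that waveform into a target property), and the change threshold is an assumed parameter.

    class ProximityTracker:
        """Illustrative-only pass through operations 1102-1106 of method 1100."""

        def __init__(self, read_waveform, waveform_to_property, threshold=1e-6):
            self.read_waveform = read_waveform                # operation 1102
            self.waveform_to_property = waveform_to_property  # operation 1104
            self.threshold = threshold
            self.value = None

        def poll(self):
            """Re-measure the target property and update the stored value only
            when the monitored waveform produces a meaningful change."""
            waveform = self.read_waveform()
            new_value = self.waveform_to_property(waveform)
            if self.value is None or abs(new_value - self.value) > self.threshold:
                self.value = new_value                        # operation 1106
            return self.value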



FIG. 12 is a flowchart depicting example operations of a method of operating a proximity sensor, such as described herein. The method 1200 includes operation 1202 in which frequency content of an output signal (e.g., of a photodiode optically coupled to a VCSEL such as described herein) is monitored. Next, at operation 1204, distance and/or velocity information can be determined based on the monitored frequency content of operation 1202.
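As a non-limiting sketch of operations 1202 and 1204, the code below assumes a triangular optical frequency sweep (sweep rate sweep_rate_hz_per_s, wavelength lam_m), under which the up-slope and down-slope beat frequencies encode distance via their sum and velocity via their difference; these modulation details and all names are assumptions for illustration, not limitations of the method.

    import numpy as np

    C = 299_792_458.0  # speed of light in m/s

    def dominant_frequency(segment: np.ndarray, sample_rate: float) -> float:
        """Strongest non-DC frequency in one sweep segment of the SMI signal."""
        spectrum = np.abs(np.fft.rfft(segment * np.hanning(segment.size)))
        freqs = np.fft.rfftfreq(segment.size, d=1.0 / sample_rate)
        return float(freqs[np.argmax(spectrum[1:]) + 1])

    def distance_and_velocity(up_segment, down_segment, sample_rate,
                              sweep_rate_hz_per_s, lam_m):
        """Operation 1204: convert monitored frequency content into distance
        (meters) and velocity (meters per second) under the assumed sweep."""
        f_up = dominant_frequency(up_segment, sample_rate)
        f_down = dominant_frequency(down_segment, sample_rate)
        distance = C * (f_up + f_down) / (4.0 * sweep_rate_hz_per_s)
        velocity = lam_m * (f_down - f_up) / 4.0
        return distance, velocity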


One may appreciate that, although many embodiments are disclosed above, the operations and steps presented with respect to methods and techniques described herein are meant as exemplary and accordingly are not exhaustive. One may further appreciate that an alternate order of steps, or fewer or additional operations, may be required or desired for particular embodiments.


Although the disclosure above is described in terms of various exemplary embodiments and implementations, it should be understood that the various features, aspects, and functionality described in one or more of the individual embodiments are not limited in their applicability to the particular embodiment with which they are described, but instead can be applied, alone or in various combinations, to one or more of the other embodiments of the invention, whether or not such embodiments are described and whether or not such features are presented as being a part of a described embodiment. Thus, the breadth and scope of the present invention should not be limited by any of the above-described exemplary embodiments but are instead defined by the claims herein presented.

Claims
  • 1. A proximity sensor configured for detecting proximity of a user of a portable electronic device to the portable electronic device, the proximity sensor comprising: a substrate; a first laser light source disposed on a first region of the substrate; a second laser light source disposed on a second region of the substrate adjacent to, and coplanar with, the first region; an enclosure formed from a transparent material and positioned over the substrate to enclose the first laser light source and the second laser light source within the enclosure, the enclosure defining: a lensing region aligned over the first laser light source such that light emitted by the first laser light source is refracted by the lensing region before exiting the enclosure to illuminate an object external to the proximity sensor; and a planar reflection region aligned over the second laser light source such that light emitted by the second laser light source reflects from the planar reflection region and returns to the second laser light source to induce a self-mixing interferometric effect; and a controller configured to drive the first laser light source and the second laser light source and to determine a distance to and a velocity of the object based on self-mixing induced in the first laser light source, output of the first laser light source calibrated based on the self-mixing interferometric effect induced in the second laser light source.
  • 2. The proximity sensor of claim 1, wherein the portable electronic device is a wearable electronic device.
  • 3. The proximity sensor of claim 2, wherein the wearable electronic device is a head mounted wearable electronic device.
  • 4. The proximity sensor of claim 1, wherein the first laser light source and the second laser light source each comprise an array of laser diodes.
  • 5. The proximity sensor of claim 1, wherein the first laser light source and the second laser light source each comprise an array of vertical cavity surface emitting laser diodes.
  • 6. The proximity sensor of claim 1, wherein the controller is configured to determine a velocity of the object based, at least in part, on the power consumption of the first laser light source.
  • 7. The proximity sensor of claim 1, wherein the second laser light source is disposed a distance from the first laser light source such that the first laser light source and the second laser light source experience substantially the same thermal environment.
  • 8. The proximity sensor of claim 1, wherein the enclosure is formed from an acrylic material.
  • 9. The proximity sensor of claim 1, wherein the enclosure comprises a reflector disposed onto the planar reflection region.
  • 10. The proximity sensor of claim 9, wherein the reflector comprises a metallic material.
  • 11. The proximity sensor of claim 9, wherein the reflector is disposed on an interior surface of the enclosure.
  • 12. The proximity sensor of claim 9, wherein the reflector is disposed on an exterior surface of the enclosure such that light emitted from the second laser light source passes at least partially through the enclosure before being reflected from the reflector back to the second laser light source.
  • 13. The proximity sensor of claim 1, wherein the object is a body part of the user.
  • 14. The proximity sensor of claim 13, wherein the body part is a finger and the controller is configured to determine whether the finger is hovering relative to the portable electronic device.
  • 15. The proximity sensor of claim 13, wherein the body part is a finger and the controller is configured to determine whether the finger is moving relative to the portable electronic device.
  • 16. A portable electronic device comprising: a housing; and a proximity sensor within the housing and comprising: a substrate; a first laser light source disposed on the substrate; a second laser light source disposed on the substrate adjacent to the first laser light source; an enclosure formed from a transparent material and positioned over the substrate to enclose the first laser light source and the second laser light source within the enclosure, the enclosure defining a lensing region aligned over the first laser light source such that light emitted by the first laser light source is refracted by the lensing region before exiting the enclosure and the housing to illuminate an object external to the portable electronic device; and a reflector disposed on an interior surface of the enclosure and aligned over the second laser light source such that light emitted by the second laser light source reflects from the reflector and returns to the second laser light source to induce a self-mixing interferometric effect; and a controller configured to determine a distance to, and a velocity of, the object based on self-mixing induced in the first laser light source, output of the first laser light source calibrated based on the self-mixing interferometric effect induced in the second laser light source.
  • 17. The portable electronic device of claim 16, wherein the portable electronic device is a wearable electronic device and the object is a finger of a user of the portable electronic device.
  • 18. The portable electronic device of claim 17, wherein the wearable electronic device is a head mounted electronic device and the controller is configured to determine a position of a finger of the user relative to the housing of the head mounted electronic device.
  • 19. The portable electronic device of claim 16, comprising a display disposed within the housing; and the controller is configured to determine a position of a finger of a user of the portable electronic device relative to the display.
  • 20. A method of determining proximity of a body part of a user to a housing part of a wearable electronic device incorporating an optical proximity sensor, the method comprising: driving a first laser light source of the optical proximity sensor to emit light through a lensing portion of an at least partially transparent enclosure enclosing the first laser light source against a substrate of the optical proximity sensor such that light emitted from the first laser light source illuminates the body part; driving a second laser light source of the optical proximity sensor to emit light toward a reflector disposed on an internal surface of the at least partially transparent enclosure such that light emitted from the second laser light source illuminates the reflector, reflecting therefrom and inducing a self-mixing effect within the second laser light source; determining a velocity of the body part of the user based on self-mixing effects induced in the first laser light source; and calibrating the determined velocity based on self-mixing effects induced in the second laser light source.
CROSS-REFERENCE TO RELATED APPLICATION

This application is a continuation of U.S. patent application Ser. No. 16/230,967, filed Dec. 21, 2018, the contents of which are incorporated herein by reference as if fully disclosed herein.
