This relates generally to wearable devices, and, more particularly, to wearable devices such as head-mounted devices and other eyewear.
Head-mounted devices and other eyewear may use gaze tracking circuitry to track a user's gaze.
It can be challenging to design gaze tracking circuitry that performs satisfactorily. If care is not taken, the gaze tracking circuitry may produce inaccurate measurements or may exhibit other performance limitations such as excessive power consumption.
Ultrasonic transducers may be used to track the position of an object. For example, eyewear such as a pair of glasses or other head-mounted device may use ultrasonic transducers to track a user's eye position and gaze direction. To increase positioning accuracy over short-range distances, an array of ultrasonic transducers with different center frequencies may be formed on a common substrate. The closely spaced ultrasonic transducers with relatively small individual bandwidths may be used to simulate a single transducer with a larger bandwidth.
Control circuitry may adjust the phases and/or amplitudes of the transducers in the array to steer the ultrasonic signal beam towards the user's eye (and away from direct paths to receiving transducer arrays, if desired) and/or to receive a reflected ultrasonic signal beam from the direction of the user's eye. The control circuitry may use time delay measurement techniques, phase delay measurement techniques, and/or amplitude measurement techniques to determine a distance and/or direction to the user's eye using the ultrasonic transducer arrays.
The transducer arrays may include piezoelectric micromachined ultrasonic transducers, capacitive micromachined ultrasonic transducers, and/or other suitable types of ultrasonic transducers formed in a common substrate. The transducers may be provided with different center frequencies by forming cavities with different depths, cavities with different diameters, and/or cavities with different surface features.
Control circuitry may control each transducer in the array independently of one another, and/or the control circuitry may control subsets of transducers in the array (e.g., subsets with the same center frequency) independently of other sets of transducers in the array. One or more sets of transducers with the same center frequency may be placed adjacent to one another in the array to help disambiguate phase information. The remaining transducers in the array that are not used for phase disambiguation may be spaced farther apart from other transducers of the same center frequency.
Ultrasonic transducers may be used to track the position of one or more objects of interest. For example, eyewear such as a pair of glasses or other head-mounted device may use one or more ultrasonic transducers to track a user's eye position and gaze direction. The ultrasonic transducers may include capacitive micromachined ultrasonic transducers, piezoelectric micromachined ultrasonic transducers, and/or other suitable ultrasonic transducers for emitting and/or detecting acoustic signals. To increase positioning accuracy over short-range distances, multiple ultrasonic transducers with different center frequencies may be arranged in an array. An array of closely spaced ultrasonic transducers with relatively small individual bandwidths may be used to simulate a single transducer with a much larger bandwidth, without sacrificing performance. When the transducers in the array are properly phased, the array may be able to generate an ultrasonic signal pulse with a much shorter pulse length than could be produced with a single transducer alone. Being able to generate short ultrasonic signal pulses may be especially beneficial for determining short-range distances and for resolving two objects that are closely spaced together. For example, a phased transducer array may be able to resolve objects that are as close as one wavelength apart or other suitable distance.
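The bandwidth-combining idea above can be sketched numerically. The toy model below is built on illustrative assumptions only: Gaussian-windowed tones stand in for real transducer pulses, with ten 50 kHz-bandwidth emitters spanning 750 kHz to 1.25 MHz (none of these values come from this description). It shows that summing in-phase narrowband pulses yields a noticeably shorter composite pulse than any single narrowband transducer produces:

```python
import numpy as np

def narrowband_pulse(t, f_center, bandwidth):
    """Gaussian-windowed tone approximating one narrowband transducer's output."""
    sigma_t = 1.0 / (2.0 * np.pi * bandwidth)  # time-domain width from bandwidth
    return np.exp(-0.5 * (t / sigma_t) ** 2) * np.cos(2.0 * np.pi * f_center * t)

def pulse_duration(t, p, threshold=0.1):
    """Extent of the interval where |p| exceeds `threshold` of its peak."""
    above = t[np.abs(p) > threshold * np.abs(p).max()]
    return above[-1] - above[0]

fs = 50e6                                 # sample rate (assumed)
t = np.arange(-20e-6, 20e-6, 1.0 / fs)    # time axis centered on the pulse

# A single 1 MHz transducer with an assumed 50 kHz bandwidth, versus ten
# transducers spanning 750 kHz..1.25 MHz driven in phase.
single = narrowband_pulse(t, 1.0e6, 50e3)
centers = np.linspace(750e3, 1.25e6, 10)
combined = sum(narrowband_pulse(t, fc, 50e3) for fc in centers)

print(pulse_duration(t, single) > pulse_duration(t, combined))  # True
```

The composite pulse is shorter because the spread of center frequencies acts like a single wide passband: its duration scales roughly with the inverse of the total spanned bandwidth rather than with the inverse of one transducer's bandwidth.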
If desired, the ultrasonic transducers with different center frequencies may be operated independently of one another to independently measure signals at different frequencies. The independently measured signals can then be used to estimate time of flight and path length to the desired level of accuracy.
In arrangements where the ultrasonic transducers are used for gaze tracking in a pair of glasses or other eyewear, multiple arrays of ultrasonic transducers may be distributed at different locations around each of the user's eyes. Each array may include ultrasonic transducers with different center frequencies. One or more of the arrays may emit ultrasonic signals towards the user's eye. The ultrasonic signals may reflect off of the user's eye and may be detected by one or more of the other arrays. Control circuitry may gather data from the transducer arrays to determine the user's eye position and/or gaze direction (e.g., using time delay measurement techniques, phase delay measurement techniques, amplitude measurement techniques, and/or other suitable measurement techniques).
An illustrative system having a device with one or more ultrasonic transducers is shown in
Adjustable lens components 22 may form lenses that allow a viewer (e.g., a viewer having eyes 16) to view external objects such as object 18 in the surrounding environment. Glasses 14 may include one or more adjustable lens components 22, each aligned with a respective one of a user's eyes 16. As an example, lens components 22 may include a left lens 22 aligned with a viewer's left eye and may include a right lens 22 aligned with a viewer's right eye. This is, however, merely illustrative. If desired, glasses 14 may include adjustable lens components 22 for a single eye.
Adjustable lenses 22 may be corrective lenses that correct for vision defects. For example, eyes 16 may have vision defects such as myopia, hyperopia, presbyopia, astigmatism, higher-order aberrations, and/or other vision defects. Corrective lenses such as lenses 22 may be configured to correct for these vision defects. Lenses 22 may be adjustable to accommodate users with different vision defects and/or to accommodate different focal ranges. For example, lenses 22 may have a first set of optical characteristics for a first user having a first prescription and a second set of optical characteristics for a second user having a second prescription. Glasses 14 may be used purely for vision correction (e.g., glasses 14 may be a pair of spectacles) or glasses 14 may include displays that display virtual reality or augmented reality content (e.g., glasses 14 may be a head-mounted display). In virtual reality or augmented reality systems, adjustable lens components 22 may be used to move content between focal planes from the perspective of the user. Arrangements in which glasses 14 are spectacles that do not include displays are sometimes described herein as an illustrative example.
Glasses 14 may include control circuitry 26. Control circuitry 26 may include processing circuitry such as microprocessors, digital signal processors, microcontrollers, baseband processors, image processors, application-specific integrated circuits with processing circuitry, and/or other processing circuitry and may include random-access memory, read-only memory, flash storage, hard disk storage, and/or other storage (e.g., a non-transitory storage media for storing computer instructions for software that runs on control circuitry 26).
Glasses 14 may include input-output circuitry such as eye state sensors, range finders disposed to measure the distance to external object 18, touch sensors, buttons, microphones to gather voice input and other input, sensors, and other devices that gather input (e.g., user input from viewer 16) and may include light-emitting diodes, displays, speakers, and other devices for providing output (e.g., output for viewer 16). Glasses 14 may, if desired, include wireless circuitry and/or other circuitry to support communications with a computer or other external equipment.
Control circuitry 26 may also control the operation of optical elements such as adjustable lens components 22. Adjustable lens components 22, which may sometimes be referred to as adjustable lenses, adjustable lens systems, adjustable optical systems, adjustable lens devices, tunable lenses, etc., may include fluid-filled variable lenses, may include Alvarez lenses, and/or may contain electrically adjustable material such as liquid crystal material, volume Bragg gratings, or other electrically modulated material that may be adjusted to produce customized lenses. Each of components 22 may contain an array of electrodes that apply electric fields to portions of a layer of liquid crystal material or other voltage-modulated optical material with an electrically adjustable index of refraction (sometimes referred to as an adjustable lens power or adjustable phase profile). By adjusting the voltages of signals applied to the electrodes, the index of refraction profile of components 22 may be dynamically adjusted. This allows the size, shape, and location of the lenses formed within components 22 to be adjusted.
Glasses 14 may include gaze tracking circuitry such as gaze tracking circuitry 20. Gaze tracking circuitry 20 may include one or more sensors for tracking a user's eyes 16. For example, gaze tracking circuitry 20 may include one or more digital image sensors (e.g., visible image sensors and/or infrared image sensors that gather images of the user's eyes), ultrasonic sensors, light-based sensors such as lidar (light detection and ranging) sensors, and/or other suitable sensors for tracking the location of a user's eyes. As an example, gaze tracking circuitry 20 may be used by control circuitry 26 to gather eye information such as cornea location, pupil location, gaze direction, and/or other information about the eye(s) of the viewer. The locations of the viewer's pupils, and the locations of the viewer's pupils relative to specular glints from light sources with known positions or relative to the rest of the viewer's eyes, may be used to determine the locations of the centers of the viewer's eyes (i.e., the centers of the user's pupils) and the direction of view (gaze direction) of the viewer's eyes. If desired, gaze tracking circuitry 20 may also include a wavefront sensor that measures the aberrations of a user's eyes so that control circuitry 26 can adjust the optical properties of lens components 22 to correct the user-specific aberrations detected by the wavefront sensor.
Gaze tracking information may be used as a form of user input, may be used to determine where, within an image, image content resolution should be locally enhanced in a foveated imaging system (e.g., in arrangements where device 14 is a head-mounted display), and/or may be used to determine where, within one or both of lenses 22, vision correction should be locally enhanced in a foveated lens arrangement. In a foveated lens arrangement, control circuitry 26 may dynamically adjust lens components 22 so that the optical properties of portions of lens components 22 that align with a user's gaze are different than the optical properties of portions of lens components 22 that are outside of the user's gaze. For example, portions of lens components 22 that align with a user's gaze may be optically modulated to produce a first lens power, while the remaining portions of lens components 22 may be left optically unmodulated, may be optically modulated to produce a second lens power magnitude that is less than the first lens power magnitude, and/or may be optically modulated to produce a phase profile that is less spatially varied than the phase profile of portions of lens components 22 within the user's gaze. Control circuitry 26 may gather eye information from gaze tracking circuitry 20 and may adjust the optical properties of lens components 22 accordingly.
During operation, one or more of arrays 24 such as array 24A may be used to emit ultrasonic signals 32. The ultrasonic signals 32 may reflect off of a user's eye 16 (e.g., may reflect off of cornea 30 of the user's eye 16). One or more of arrays 24 such as array 24B may be used to detect ultrasonic signals 32 after the signals reflect off of the user's cornea 30. Using time-of-flight measurement techniques, control circuitry 26 may be used to determine the time that it takes for the emitted signal 32 to reflect back from eye 16, which may in turn be used to determine the distance to eye 16 (e.g., the distance to the point of specular reflection on cornea 30). As eye 16 rotates, control circuitry 26 may continue to monitor changes in distance to the point of specular reflection on the user's eye. For example, as the user's eye 16 moves in direction 36, ultrasonic signals 32′ from transducer array 24A may reflect off of cornea 30′ and may be detected by array 24B. The time-of-flight of signals 32′ may be different from the time-of-flight of signals 32, due to the change in distance to the point of specular reflection of the user's eye 16. Control circuitry 26 may monitor these changes in distance to determine the direction of the user's gaze during the operation of glasses 14. If desired, the same array 24 may be used to emit and detect signals 32. Arrangements in which multiple arrays 24 emit signals 32 and/or where multiple arrays 24 detect signals 32 may also be used. The example of
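The time-of-flight computation described above amounts to scaling the measured delay by the speed of sound. A minimal sketch, assuming propagation through air at roughly room temperature (the operating medium and temperature are assumptions):

```python
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 C (assumed operating medium)

def path_length_from_tof(tof):
    """Total transmitter -> reflection point -> receiver path length in meters."""
    return SPEED_OF_SOUND * tof

def round_trip_distance(tof):
    """Distance to the reflector when one array both emits and detects."""
    return path_length_from_tof(tof) / 2.0

# An eye surface 2 cm away yields a round trip of about 117 microseconds.
tof = 2 * 0.02 / SPEED_OF_SOUND
print(round(round_trip_distance(tof) * 100, 4))  # 2.0 (cm)
```

In the bistatic case of arrays 24A and 24B, `path_length_from_tof` gives the combined transmitter-to-cornea-to-receiver path length rather than a single distance, which is why monitoring changes in that path length as the eye rotates is still informative.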
If desired, one or more of arrays 24 may be a phased transducer array. In a phased ultrasonic transducer array, beam steering techniques may be used in which the ultrasonic signal phase and/or magnitude for each transducer in array 24 is adjusted to perform beam steering. Beam steering may be used to "illuminate" a particular area of interest with ultrasonic signals 32. Beam steering may also be used to avoid illuminating certain areas with ultrasonic signals 32 (e.g., to avoid directly illuminating other arrays 24 and/or to avoid illuminating certain parts of the user's face). For example, a phased ultrasonic transducer array 24 may be configured to emit a concentrated beam of ultrasonic signals 32 that strikes cornea 30 but does not strike the user's eyebrow. This type of beam steering arrangement may help improve gaze tracking accuracy by avoiding detecting significant reflections from surfaces around the user's eye 16.
The use of time-of-flight based measurement techniques is merely illustrative. If desired, other time-based, amplitude-based, and/or phase-based measurement schemes such as time difference of arrival measurement techniques, angle of arrival measurement techniques, triangulation methods, and/or other suitable measurement techniques may be used to determine a location of the user's eye 16 using ultrasonic sensor arrays 24. Arrangements in which other sensors such as visible light cameras, infrared light cameras, and/or proximity sensors (e.g., infrared proximity sensors or other proximity sensors) are used to gather eye location information, glint location information, gaze direction information, pupil shape information, and/or other eye information may also be used.
In the example of
In the example of
The examples of
The bandwidth of each individual transducer 38 may be smaller than the collective bandwidth spanned by all of the transducers 38 in array 24. The center frequencies of individual transducers 38 may be selected so that the collective bandwidth of the entire array 24 spans some or all of the desired frequency range (e.g., from 750 kHz to 1.25 MHz, from 700 kHz to 1 MHz, from 800 kHz to 1.2 MHz, from 900 kHz to 1.1 MHz, from 750 kHz to 1.4 MHz, or any other suitable frequency range). The desired frequency range may depend on the range of distances to be measured. For example, to measure distances to objects that are within a few centimeters (such as a user's eye 16), array 24 may span a frequency range of 750 kHz to 1.25 MHz (as an example).
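One simple way to select center frequencies consistent with the description above is to space them evenly across the band and check that adjacent passbands overlap, so the array leaves no spectral gaps. The band matches the 750 kHz to 1.25 MHz example above; the per-transducer bandwidth and transducer count are illustrative assumptions:

```python
import numpy as np

def center_frequencies(f_low, f_high, num_transducers):
    """Evenly spaced center frequencies spanning the desired band."""
    return np.linspace(f_low, f_high, num_transducers)

def covers_band(center_freqs, individual_bandwidth):
    """True if adjacent passbands overlap, so the array covers the band with no gaps."""
    spacing = np.diff(np.sort(center_freqs))
    return bool(np.all(spacing <= individual_bandwidth))

# 16 transducers spanning 750 kHz..1.25 MHz, each with an assumed 40 kHz bandwidth.
freqs = center_frequencies(750e3, 1.25e6, 16)
print(covers_band(freqs, 40e3))  # True: spacing is ~33 kHz, under 40 kHz
```

With fewer transducers or narrower individual bandwidths, `covers_band` returns False, signaling that portions of the desired band would go unmeasured.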
Substrate 42 may have any suitable dimensions. For example, lateral dimensions L1 and L2 of substrate 42 may be between 2 mm and 2.5 mm, between 1 mm and 1.5 mm, between 1 mm and 3 mm, between 2 mm and 4 mm, and/or other suitable length. Dimensions L1 and L2 may be equal (so that substrate 42 has a square footprint) or unequal (so that substrate 42 has a rectangular footprint), or the footprint of substrate 42 may have other shapes (e.g., circular, oval, round, triangular, etc.).
Transducers 38 may be arranged in an evenly spaced grid of rows and columns on substrate 42, or may be arranged with any other suitable pattern (e.g., unevenly spaced clusters, a random pattern, a non-grid pattern, etc.). The example of
In some arrangements, it may be desirable to maximize the amount of space between transducers 38 that share the same center frequency. Maximizing the spacing between commonly configured transducers 38 in array 24 may increase the accuracy of distance measurements made with array 24. Different rules regarding placement of the different subsets of transducers 38 on substrate 42 may be implemented to achieve the desired performance from array 24. As an example, the convex hull of a given set of transducers 38 that share the same center frequency may cover at least 50% of array 24, may cover at least 80% of array 24, or may cover other suitable portions of array 24. As another example, most pairs of transducers 38 that share the same center frequency may be separated by a transducer 38 of a different center frequency. These examples are merely illustrative. In general, transducers 38 may be placed in any suitable arrangement on substrate 42.
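The convex-hull rule mentioned above can be checked computationally. The sketch below uses a standard monotone-chain hull and the shoelace area formula; the substrate size and the positions of the same-frequency transducers are hypothetical:

```python
def convex_hull(points):
    """Monotone-chain convex hull; returns hull vertices in counterclockwise order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def polygon_area(vertices):
    """Shoelace formula for the area of a simple polygon."""
    area = 0.0
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0

# Hypothetical layout: a 2 mm square substrate with one same-frequency subset
# placed at the four corners.
array_area = 2.0 * 2.0
same_freq_positions = [(0.0, 0.0), (2.0, 0.0), (2.0, 2.0), (0.0, 2.0)]
coverage = polygon_area(convex_hull(same_freq_positions)) / array_area
print(coverage >= 0.5)  # True: the corner placement covers the whole footprint
```

A layout tool could run this check per center-frequency subset and reject placements whose hull covers less than the chosen threshold of the array footprint.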
Operation of transducers 38 may be controlled by control circuitry 26. Substrate 42 may include interconnects 40 for conveying signals between transducers 38 and control circuitry 26. For example, interconnects 40 may be used to convey driving signals from control circuitry 26 to transducers 38 and to convey sensor signals (e.g., sensor signals associated with ultrasonic waves that are detected by transducers 38) from transducers 38 to control circuitry 26.
If desired, transducers 38 in array 24 may be independently controlled from one another. For example, the frequency, phase, and pulse shape of the driving signal for a given transducer 38 may be different from other transducers 38 in array 24. Each individual transducer 38 in array 24 may be independently controlled with different driving signals, or there may be subsets of transducers 38 (e.g., a subset that share the same center frequency or other suitable subset) that are controlled with the same drive signals but that are independently controlled from other subsets of transducers 38. This is merely illustrative, however. If desired, transducers 38 may not be independently controlled and/or may be controlled with any other suitable driving scheme.
In some arrangements, transducers 38 may be driven by off-chip control circuitry. In this type of arrangement, interconnects 40 may include leads, contact pads, solder, and/or other conductive elements for conveying signals between array 24 and control circuitry 26 that is separate from array 24. In other arrangements, substrate 42 may be a multilayer substrate in which transducers 38 are stacked with a control circuitry layer (e.g., an application-specific integrated circuit layer) that includes control circuitry 26. With this type of integrated control circuitry, interconnects 40 may include metal vias and/or other conductive elements for conveying signals between transducers 38 and control circuitry 26 that is located in a different layer of substrate 42. These examples are merely illustrative. If desired, interconnects 40 may include metal vias for conveying signals between different layers of substrate 42 and may also include contact pads for conveying signals between array 24 and external circuitry.
In the example of
In the example of
The transducers of
In the example of
In the example of
The examples of
Distance A2 may be determined as a function of angle Y or angle X (e.g., A2=A1 sin(X) or A2=A1 cos(Y)). Distance A2 may also be determined as a function of the phase difference between the signal received by transducer 38-1 and the signal received by transducer 38-2 (e.g., A2=(Δφλ)/(2π), where Δφ is the phase difference between the signal received by transducer 38-1 and the signal received by transducer 38-2 and λ is the wavelength of the received signal 32). Control circuitry 26 may include phase measurement circuitry coupled to each transducer 38 to measure the phase of the received signals and identify the phase difference Δφ. The two equations for A2 may be set equal to each other (e.g., A1 sin(X)=(Δφλ)/(2π)) and rearranged to solve for angle X (e.g., X=sin⁻¹((Δφλ)/(2πA1))) or rearranged to solve for angle Y. As such, the angle of arrival may be determined (e.g., by control circuitry 26) based on the known (predetermined) distance between transducer 38-1 and transducer 38-2, the detected (measured) phase difference between the signal received by transducer 38-1 and the signal received by transducer 38-2, and the known wavelength or frequency of the received signals 32.
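The derivation above can be turned into a short numerical round trip: synthesize the phase difference that a known arrival angle would produce, then recover the angle with X=sin⁻¹((Δφλ)/(2πA1)). The transducer spacing and operating frequency are illustrative assumptions, not values from this description:

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air (assumed operating medium)

def angle_of_arrival(phase_diff, spacing, freq, c=SPEED_OF_SOUND):
    """Solve A1*sin(X) = (phase_diff * wavelength) / (2*pi) for the angle X."""
    wavelength = c / freq
    return np.arcsin(phase_diff * wavelength / (2.0 * np.pi * spacing))

# Assumed geometry: transducers 0.2 mm apart, operating at 1 MHz.
spacing = 0.2e-3
freq = 1.0e6
wavelength = SPEED_OF_SOUND / freq

# Phase difference that a 30 degree arrival would produce, then the recovery.
true_angle = np.radians(30.0)
phase_diff = 2.0 * np.pi * spacing * np.sin(true_angle) / wavelength
print(round(np.degrees(angle_of_arrival(phase_diff, spacing, freq)), 6))  # 30.0
```

Note that `arcsin` only returns angles in [−90°, +90°]; a real implementation would also need the disambiguation discussed next, since the measured phase difference itself wraps modulo 2π.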
Because phase measurements are restricted to the interval from −π to +π, phase wrapping may occur, which can lead to ambiguous phase difference measurements if care is not taken. Phase disambiguation may be achieved by performing non-uniform sampling and/or by clustering certain transducers together within array 24. This type of arrangement is illustrated in
As shown in
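One common way to exploit a closely spaced pair (unambiguous but coarse) alongside a widely spaced pair (precise but phase-wrapped) is a coarse/fine scheme: the coarse angle estimate selects the integer number of 2π wraps to add to the wide baseline's measured phase. The description above does not commit to a specific algorithm; the sketch below assumes this scheme, with illustrative spacings and a hypothetical noisy coarse estimate:

```python
import numpy as np

def disambiguate_phase(wrapped_phase, coarse_angle, spacing, wavelength):
    """Choose the 2*pi wrap count k so that the unwrapped phase of a widely
    spaced pair best matches the unambiguous angle from a clustered pair."""
    expected = 2.0 * np.pi * spacing * np.sin(coarse_angle) / wavelength
    k = np.round((expected - wrapped_phase) / (2.0 * np.pi))
    return wrapped_phase + 2.0 * np.pi * k

wavelength = 343.0 / 1.0e6   # ~0.343 mm at 1 MHz in air (assumed)
wide_spacing = 1.2e-3        # widely spaced same-frequency pair (assumed)
true_angle = np.radians(20.0)

# The true phase exceeds 2*pi, so the measured value wraps into (-pi, pi].
true_phase = 2.0 * np.pi * wide_spacing * np.sin(true_angle) / wavelength
wrapped = np.angle(np.exp(1j * true_phase))

# Slightly noisy coarse estimate, as would come from the clustered pair.
coarse = np.radians(18.5)

unwrapped = disambiguate_phase(wrapped, coarse, wide_spacing, wavelength)
print(abs(unwrapped - true_phase) < 1e-9)  # True: the wrap count was recovered
```

The coarse estimate only needs to be accurate enough to land within half a wrap of the truth; the wide baseline then supplies the fine angular resolution.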
The use of multiple transducers 38 in phased transducer array 24 allows beam steering arrangements to be implemented by controlling the relative phases and magnitudes (amplitudes) of the ultrasonic signals conveyed by the transducers. Control circuitry 26 may, for example, include phase shifter circuits and/or circuitry for adjusting the magnitude of the ultrasonic signals. The circuitry within control circuitry 26 that controls the phase and/or magnitude of drive signals for arrays 24 may sometimes be referred to as beam steering circuitry (e.g., beam steering circuitry that steers the beam of ultrasonic signals transmitted and/or received by phased transducer array 24).
Control circuitry 26 may adjust the relative phases and/or magnitudes of the transmitted signals that are provided to each transducer 38 in phased transducer array 24 and may adjust the relative phases and/or magnitudes of the received signals that are received by phased transducer array 24. Control circuitry 26 may, if desired, include phase detection circuitry for detecting the phases of the received signals that are received by phased transducer array 24. The term “beam” or “signal beam” may be used herein to refer to ultrasonic signals that are transmitted and/or received by phased transducer array 24 in a particular direction. The signal beam may exhibit a peak gain that is oriented in a particular pointing direction at a corresponding pointing angle (e.g., based on constructive and destructive interference from the combination of signals from each transducer in the phased transducer array). The term “transmit beam” may sometimes be used herein to refer to ultrasonic signals that are transmitted in a particular direction whereas the term “receive beam” may sometimes be used herein to refer to ultrasonic signals that are received from a particular direction.
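For a uniform linear array, the phases that point the peak gain in a chosen direction follow a standard progressive-phase rule. The sketch below is a generic phased-array calculation, not a description of any particular implementation of control circuitry 26; element count, spacing, and frequency are assumptions:

```python
import numpy as np

def steering_phases(num_elements, spacing, steer_angle, wavelength):
    """Per-element phase offsets that align wavefronts toward `steer_angle`
    (measured from broadside) for a uniform linear array."""
    n = np.arange(num_elements)
    return -2.0 * np.pi * n * spacing * np.sin(steer_angle) / wavelength

def array_gain(phases, spacing, look_angle, wavelength):
    """Magnitude of the summed far-field response in a given look direction."""
    n = np.arange(len(phases))
    geometric = 2.0 * np.pi * n * spacing * np.sin(look_angle) / wavelength
    return np.abs(np.sum(np.exp(1j * (phases + geometric))))

# Assumed geometry: 8 elements at half-wavelength spacing, 1 MHz in air.
wavelength = 343.0 / 1.0e6
spacing = wavelength / 2.0
phases = steering_phases(8, spacing, np.radians(15.0), wavelength)

on_axis = array_gain(phases, spacing, np.radians(15.0), wavelength)
off_axis = array_gain(phases, spacing, np.radians(-30.0), wavelength)
print(round(on_axis, 1), on_axis > off_axis)  # 8.0 True
```

In the steered direction the per-element phases cancel the geometric path differences exactly, so all eight contributions add coherently; off axis they partially cancel, which is the constructive/destructive interference described above.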
If, for example, control circuitry 26 provides control signals to transducers 38 to emit ultrasonic signals 32 having a first set of phases and/or magnitudes, the transmitted signals 32 may form a transmit beam that illuminates (i.e., covers) both the cornea 30 and sclera 28 of the user's eye 16, as shown in the example of
Control circuitry 26 may also operate array 24 to receive signals from a given range of angles. This may be achieved by adjusting the phase and magnitude of transducers 38 so that ultrasonic signals 32 are received from a particular direction. In other arrangements, this may be achieved post-measurement by changing the gain of signals received from a given region to amplify those signals relative to signals received from other regions.
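The post-measurement option mentioned above is commonly realized as delay-and-sum beamforming: each element's recording is shifted by its geometric delay toward a chosen look direction before summing, which amplifies signals arriving from that direction relative to others. The sketch below assumes that approach with hypothetical geometry (element spacing, sample rate, and pulse shape are not from this description):

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s (assumed)

def delay_and_sum(element_signals, fs, spacing, steer_angle):
    """Shift each element's recording by its geometric delay toward
    `steer_angle` (from broadside), then sum the aligned recordings."""
    num_elements, num_samples = element_signals.shape
    out = np.zeros(num_samples)
    for n in range(num_elements):
        delay = int(round(n * spacing * np.sin(steer_angle) / SPEED_OF_SOUND * fs))
        out += np.roll(element_signals[n], -delay)
    return out

# Simulate a short pulse arriving from 30 degrees at a 4-element array
# (assumed: elements one wavelength apart at 1 MHz, 20 MHz sampling).
fs = 20e6
spacing = 0.343e-3
angle = np.radians(30.0)
pulse = np.exp(-0.5 * ((np.arange(400) - 100) / 3.0) ** 2)
signals = np.zeros((4, 400))
for n in range(4):
    arrival = int(round(n * spacing * np.sin(angle) / SPEED_OF_SOUND * fs))
    signals[n] = np.roll(pulse, arrival)

steered = delay_and_sum(signals, fs, spacing, angle)
broadside = delay_and_sum(signals, fs, spacing, 0.0)
print(steered.max() > broadside.max())  # True: gain toward the true direction
```

Because the raw per-element recordings are retained, the same data can be re-beamformed toward many look directions after the fact, which is what makes the post-measurement gain adjustment described above possible.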
If desired, control circuitry 26 may actively adjust control signals for transducers 38 in real time to steer the transmit or receive beam in different desired directions over time. For example, gaze tracking circuitry 20 may include a camera that captures images of the user's eye, which in turn may be used to determine the location of a glint on the user's eye. Control circuitry 26 may control arrays 24 based on eye information gathered with the camera so that beam shaping and beam steering can be performed accordingly (e.g., so that signals 32 are steered towards the location of the glint on the eye and/or received from the location of the glint on the eye). If desired, the ultrasonic signal beam may also be steered away from other arrays 24 (e.g., to avoid a scenario in which a receiving array 24 detects a signal that travels directly to the receiving array 24 from a transmitting array 24 without reflecting off the user).
During operation, control circuitry 26 may operate one or more arrays 24 as a transmitting array and one or more arrays 24 as a receiving array. For example, control circuitry 26 may control the phases of transducers 38 in a first array 24 (e.g., a transmitting array) to transmit a short pulse towards the user's eye and may control the phases of transducers 38 in a second array 24 (e.g., a receiving array) in a similar manner (e.g., such that the receiving array would generate the same short pulse if used for transmission). Control circuitry 26 may also, if desired, control the phases of the transmitting array 24 and the receiving array 24 to direct the ultrasonic signal beam to (or receive the ultrasonic signal beam from) an area of interest and thereby limit detected reflections from other areas. Active phasing may be achieved digitally or in analog. In digital phasing arrangements, control circuitry 26 may use a system clock to shift the time and phase of the driving signal to transducers 38. If desired, each transducer 38 may have an associated analog-to-digital converter circuit and phasing can be achieved computationally post-measurement.
Control circuitry 26 may, if desired, measure the actual resonance frequency of transducers 38 and may adjust drive signals for transducers 38 to compensate for any detuning that may occur in transducers 38 (e.g., detuning that may occur as a result of manufacturing effects or environmental conditions). This is merely illustrative. If desired, control circuitry 26 may not calculate or compensate for resonance frequency errors.
If desired, control circuitry 26 may phase the entire array 24 of transducers 38, combining all of the different center frequencies to form a short ultrasonic signal pulse. In other arrangements, control circuitry 26 may phase only transducers 38 in array 24 with the same center frequency. For example, each set of transducers 38 with the same center frequency in array 24 may be phased to produce an ultrasonic signal that illuminates the desired area and/or to receive an ultrasonic signal beam from a known range of angles. Control circuitry 26 may determine the phase and/or amplitude of the received signal from each set of phased transducers 38 with the same center frequency in array 24. Based on the phase and/or amplitude of the received signal at the different frequencies covered by array 24, control circuitry 26 may determine the time delay and/or phase delay of the emitted signal (and thus the location of the user's eye).
If desired, control circuitry 26 may tune the transmission properties (e.g., phases and/or amplitudes) of array 24T to reduce the amplitude of signals directly transmitted from transmitting arrays 24T to receiving arrays 24R. In the event that direct transmission from transmitting arrays 24T to receiving arrays 24R remains too large, one or more additional transmitting arrays 24T may be used to transmit a signal (e.g., steered away from the user's eye) that partially or completely nullifies the direct transmission from transmitting arrays 24T to receiving arrays 24R, if desired.
Cornea 30 and sclera 28 may be approximately spherical. If desired, control circuitry 26 may determine the locations of reflection points C1 and/or C2 as a function of the sphere center and/or sphere radii of cornea 30 and/or sclera 28. For example, the location of reflection point C1 may be determined as a function of cornea center cs.
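Under the spherical model, the specular reflection point is the point on the surface that minimizes the transmit-to-receive path (Fermat's principle). A brute-force two-dimensional sketch, with an assumed cornea radius and hypothetical transducer positions (the description above does not specify how the reflection point is computed):

```python
import numpy as np

def specular_point_on_circle(tx, rx, center, radius, num_samples=3600):
    """Brute-force the point P on the circle that minimizes |T-P| + |P-R|,
    which by Fermat's principle is the specular reflection point."""
    angles = np.linspace(0.0, 2.0 * np.pi, num_samples, endpoint=False)
    points = center + radius * np.stack([np.cos(angles), np.sin(angles)], axis=1)
    paths = np.linalg.norm(points - tx, axis=1) + np.linalg.norm(points - rx, axis=1)
    best = np.argmin(paths)
    return points[best], paths[best]

# Hypothetical 2-D geometry (all values assumed): cornea cross-section of
# radius 7.8 mm centered at the origin, transmitter and receiver 20 mm above
# and symmetric about the vertical axis. Units are millimeters.
tx = np.array([-10.0, 20.0])
rx = np.array([10.0, 20.0])
point, path = specular_point_on_circle(tx, rx, np.array([0.0, 0.0]), 7.8)
print(point)  # approximately (0, 7.8): the top of the cornea, by symmetry
```

A closed-form solution as a function of the cornea center cs and radius exists for this geometry, but the brute-force version makes the underlying principle explicit and extends directly to three dimensions.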
The examples of
The foregoing is merely illustrative and various modifications can be made to the described embodiments. The foregoing embodiments may be implemented individually or in any combination.
This application claims the benefit of U.S. provisional patent application No. 63/014,651, filed Apr. 23, 2020, which is hereby incorporated by reference herein in its entirety.