This disclosure relates generally to optics, and in particular to Time-of-Flight.
Time-of-Flight (ToF) measurements are made by transmitting a signal (e.g. light) toward a target. A portion of the signal reflects or scatters from the target as a return signal and is sensed by a sensor. For direct Time-of-Flight (dToF) applications, the time between the transmission of the signal and the sensing of the return signal is measured to determine the distance between a ToF device and the target. For indirect Time-of-Flight (iToF) applications, the phase shift of the returning light is used to determine the distance between a ToF device and the target. The velocity of the target can also be determined from multiple ToF measurements.
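The dToF and iToF distance relationships described above can be sketched as follows. This is a minimal illustrative example, not part of the disclosure; the function names and the modulation-frequency parameter are assumptions for illustration only.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def dtof_distance(round_trip_time_s: float) -> float:
    """Direct ToF: the signal travels to the target and back, so the
    one-way distance is half the round-trip time times the speed of light."""
    return C * round_trip_time_s / 2.0

def itof_distance(phase_shift_rad: float, modulation_freq_hz: float) -> float:
    """Indirect ToF: the phase shift of the returning light relative to the
    emitted modulation encodes the round-trip distance, modulo an
    unambiguous range of C / (2 * f_mod)."""
    return (C * phase_shift_rad) / (4.0 * math.pi * modulation_freq_hz)
```

For example, a 100 ns round trip corresponds to a target roughly 15 m away, and under a 20 MHz modulation a half-cycle phase shift corresponds to roughly 3.7 m.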
Non-limiting and non-exhaustive embodiments of the invention are described with reference to the following figures, wherein like reference numerals refer to like parts throughout the various views unless otherwise specified.
Embodiments of a stacked Time-of-Flight module are described herein. In the following description, numerous specific details are set forth to provide a thorough understanding of the embodiments. One skilled in the relevant art will recognize, however, that the techniques described herein can be practiced without one or more of the specific details, or with other methods, components, materials, etc. In other instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring certain aspects.
Reference throughout this specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of the phrases “in one embodiment” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
In some implementations of the disclosure, the term “near-eye” may be defined as including an element that is configured to be placed within 50 mm of an eye of a user while a near-eye device is being utilized. Therefore, a “near-eye optical element” or a “near-eye system” would include one or more elements configured to be placed within 50 mm of the eye of the user.
In aspects of this disclosure, visible light may be defined as having a wavelength range of approximately 380 nm-700 nm. Non-visible light may be defined as light having wavelengths that are outside the visible light range, such as ultraviolet light and infrared light. Infrared light having a wavelength range of approximately 700 nm-1 mm includes near-infrared light. In aspects of this disclosure, near-infrared light may be defined as having a wavelength range of approximately 700 nm-1.6 μm.
In aspects of this disclosure, the term “transparent” may be defined as having greater than 90% transmission of light. In some aspects, the term “transparent” may be defined as a material having greater than 90% transmission of visible light.
Existing ToF modules have a light source, a light source driver, and a sensor. These components are disposed on a substrate such as a printed circuit board (PCB) or a flex circuit board and thus are mounted in the same plane (e.g. side-by-side on the substrate). This mounting configuration results in a relatively large x-y footprint for the ToF module.
In some contexts, the x-y footprint of existing ToF modules is too large for engineering and/or aesthetic reasons. Head mounted devices are an example context that may benefit from ToF modules with a reduced footprint. A reduced footprint may assist in mounting the ToF module in a frame of the head mounted device, for example. In implementations of the disclosure, components of a ToF module are stacked to reduce the footprint of the ToF module. An example method of fabricating stacked ToF modules is also disclosed. These and other embodiments are described in more detail below.
ToF module 200 also includes a light sensor 240 having a photosensitive region 243 and a non-photosensitive region 241. Light sensor 240 may include an image sensor. The image sensor may be a complementary metal-oxide-semiconductor (CMOS) image sensor. The photosensitive region 243 may include an array of light sensors (e.g. photodiodes) arranged in rows and columns. In an implementation, photosensitive region 243 includes an array of single-photon avalanche diodes (SPADs). The non-photosensitive region 241 may include readout circuitry and/or control circuitry to initiate and process images captured by photosensitive region 243. Light sensor 240 is configured to sense returning light 235. Returning light 235 is the pulsed illumination light 231 that is reflected or scattered from target 290 after being emitted by light source 230. Although not specifically illustrated, ToF module 200 may include a receiving lens or receiving lens assembly configured to focus returning light 235 to the photosensitive region 243 of light sensor 240.
Driver module 220 includes electrical circuitry configured to selectively drive light source 230 to emit pulsed illumination light 231. The electrical circuitry may include switches, transistors, and amplifiers, for example.
Light source 230 may include one or more illuminators to generate pulsed illumination light 231. Pulsed illumination light 231 may be infrared light. Pulsed illumination light 231 may be near-infrared light. Pulsed illumination light 231 may include a plurality of pulses where the pulse-widths are on the scale of nanoseconds. In an implementation, light source 230 includes an array of illuminators arranged in rows and columns. The illuminators may include lasers or LEDs. The illuminators may include vertical-cavity surface-emitting lasers (VCSELs). In an implementation, the pulsed illumination light 231 is narrow-band near-infrared light having a linewidth of less than 10 nm. Pulsed illumination light 231 may be centered around 850 nm or 940 nm, in some implementations. Pulsed illumination light 231 may be collimated light. Although not specifically illustrated, ToF module 200 may include a transmitting lens or transmitting lens assembly configured to focus the pulsed illumination light 231.
In operation, driver module 220 drives light source 230 to emit pulsed illumination light 231. A pulse in pulsed illumination light 231 is reflected and/or scattered from target 290 as returning light 235. Returning light 235 is measured by light sensor 240 when it encounters light sensor 240. When light sensor 240 includes an image sensor, each pixel in the image sensor will generate a ToF signal that can be used to determine a distance and/or velocity of target 290.
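The per-pixel operation described above can be sketched as a simple computation. This is an illustrative sketch, not part of the disclosure; it assumes a dToF sensor that reports a round-trip time per pixel, and the function names and frame interval are hypothetical.

```python
C = 299_792_458.0  # speed of light, m/s

def pixel_distances(round_trip_times_s):
    """Convert each pixel's measured round-trip time into a one-way
    distance (dToF): distance = C * t / 2."""
    return [C * t / 2.0 for t in round_trip_times_s]

def radial_velocities(dist_frame_0, dist_frame_1, dt_s):
    """Estimate per-pixel radial velocity from two distance frames
    captured dt_s seconds apart (positive = moving away)."""
    return [(d1 - d0) / dt_s for d0, d1 in zip(dist_frame_0, dist_frame_1)]
```

Two such distance frames taken in succession suffice to estimate target velocity, consistent with determining velocity from multiple ToF measurements as noted above.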
Light sensor 340 includes a photosensitive region 343 and a non-photosensitive region 341. Light sensor 340 may include an image sensor. The image sensor may be a complementary metal-oxide-semiconductor (CMOS) image sensor. The photosensitive region 343 may include an array of light sensors (e.g. photodiodes) arranged in rows and columns. In an implementation, photosensitive region 343 includes an array of single-photon avalanche diodes (SPADs). The non-photosensitive region 341 may include readout circuitry and/or control circuitry to initiate and process images captured by photosensitive region 343. Light sensor 340 is configured to sense returning light 235. Returning light 235 is the pulsed illumination light 231 that is reflected or scattered from target 290 after being emitted by light source 230. Although not specifically illustrated, ToF module 300 may include a receiving lens or receiving lens assembly configured to focus returning light 235 to the photosensitive region 343 of light sensor 340.
Driver module 320 includes electrical circuitry configured to selectively drive light source 230 to emit pulsed illumination light 231. The electrical circuitry may include switches, transistors, and amplifiers, for example. Although not specifically illustrated, ToF module 300 may include a transmitting lens or transmitting lens assembly configured to focus the pulsed illumination light 231.
In operation, driver module 320 drives light source 230 to emit pulsed illumination light 231. A pulse in pulsed illumination light 231 is reflected and/or scattered from target 290 as returning light 235. Returning light 235 is measured by light sensor 340 when it encounters light sensor 340. When light sensor 340 includes an image sensor, each pixel in the image sensor will generate a ToF signal that can be used to determine a distance and/or velocity of target 290.
Light sensor 440 includes a photosensitive region 443 and a non-photosensitive region 441. Light sensor 440 may include an image sensor. The image sensor may be a complementary metal-oxide-semiconductor (CMOS) image sensor. The photosensitive region 443 may include an array of light sensors (e.g. photodiodes) arranged in rows and columns. In an implementation, photosensitive region 443 includes an array of single-photon avalanche diodes (SPADs). The non-photosensitive region 441 may include readout circuitry and/or control circuitry to initiate and process images captured by photosensitive region 443. Light sensor 440 is configured to sense returning light 235. Returning light 235 is the pulsed illumination light 231 that is reflected or scattered from target 290 after being emitted by light source 230. Although not specifically illustrated, ToF module 400 may include a receiving lens or receiving lens assembly configured to focus returning light 235 to the photosensitive region 443 of light sensor 440.
Driver module 420 includes electrical circuitry configured to selectively drive light source 230 to emit pulsed illumination light 231. The electrical circuitry may include switches, transistors, and amplifiers, for example. Although not specifically illustrated, ToF module 400 may include a transmitting lens or transmitting lens assembly configured to focus the pulsed illumination light 231.
In operation, driver module 420 drives light source 230 to emit pulsed illumination light 231. A pulse in pulsed illumination light 231 is reflected and/or scattered from target 290 as returning light 235. Returning light 235 is measured by light sensor 440 when it encounters light sensor 440. When light sensor 440 includes an image sensor, each pixel in the image sensor will generate a ToF signal that can be used to determine a distance and/or velocity of target 290.
Light sensor 340 is configured to sense returning light 235. Returning light 235 is the pulsed illumination light 231 that is reflected or scattered from target 290 after being emitted by light source 230. Although not specifically illustrated, ToF module 500 may include a receiving lens or receiving lens assembly configured to focus returning light 235 to the photosensitive region 343 of light sensor 340.
Driver module 520 includes electrical circuitry configured to selectively drive light source 230 to emit pulsed illumination light 231. The electrical circuitry may include switches, transistors, and amplifiers, for example. Although not specifically illustrated, ToF module 500 may include a transmitting lens or transmitting lens assembly configured to focus the pulsed illumination light 231.
In operation, driver module 520 drives light source 230 to emit pulsed illumination light 231. A pulse in pulsed illumination light 231 is reflected and/or scattered from target 290 as returning light 235. Returning light 235 is measured by light sensor 340 when it encounters light sensor 340. When light sensor 340 includes an image sensor, each pixel in the image sensor will generate a ToF signal that can be used to determine a distance and/or velocity of target 290.
In an implementation, driver module 620 is a die included in a Fan Out Wafer Level Package (FOWLP) that includes a molding portion 670. In this implementation, one or more through-mold vias (TMVs) are included in molding portion 670 of the FOWLP and the TMVs are electrically coupled to electrical traces 613 in substrate 610. A Redistribution Layer (RDL) 680 may then be formed over the FOWLP, where RDL 680 is electrically coupled to one or more TMVs 675.
In an implementation, mold compound 670 is disposed in cavity 617 of the substrate 610 where the mold compound 670 is disposed between driver module 620 and substrate 610 and between light sensor 340 and substrate 610.
Light sensor 340 is disposed over at least a portion of RDL 680.
One or more wire bonds 649 electrically couple light sensor 340 to RDL 680. The one or more TMVs 675 and RDL 680 combine to electrically couple the light sensor 340 to the electrical traces 613 of substrate 610. One or more wire bonds 619 may electrically couple electrical traces 613 with light sensor 340.
In operation, driver module 620 drives light source 230 to emit pulsed illumination light 231. A pulse in pulsed illumination light 231 is reflected and/or scattered from target 290 as returning light 235. Returning light 235 is measured by light sensor 340 when it encounters light sensor 340. When light sensor 340 includes an image sensor, each pixel in the image sensor will generate a ToF signal that can be used to determine a distance and/or velocity of target 290.
Driver module 780 includes electrical circuitry configured to selectively drive light source 785 to emit pulsed illumination light. The electrical circuitry may include switches, transistors, and amplifiers, for example.
Light source 785 of ToF module 700 may include one or more illuminators to generate pulsed illumination light 793. Pulsed illumination light 793 may be infrared light. Pulsed illumination light 793 may be near-infrared light. Pulsed illumination light 793 may include a plurality of pulses where the pulse-widths are on the scale of nanoseconds. In an implementation, light source 785 includes an array of illuminators arranged in rows and columns. The illuminators may include lasers or LEDs. The illuminators may include vertical-cavity surface-emitting lasers (VCSELs). In an implementation, the pulsed illumination light 793 is narrow-band near-infrared light having a linewidth of less than 10 nm. Pulsed illumination light 793 may be centered around 850 nm or 940 nm, in some implementations. Pulsed illumination light 793 may be collimated light. Although not specifically illustrated, ToF module 700 may include a transmitting lens or transmitting lens assembly configured to focus the pulsed illumination light 793.
In operation, driver module 780 drives light source 785 to emit pulsed illumination light 793. A pulse in pulsed illumination light 793 is reflected and/or scattered from target 794 as returning light 795. Returning light 795 is measured by light sensor 740 when it encounters light sensor 740. When light sensor 740 includes an image sensor, each pixel in the image sensor will generate a ToF signal that can be used to determine a distance and/or velocity of target 794.
HMD 800 includes frame 814 coupled to arms 811A and 811B. Lens assemblies 821A and 821B are mounted to frame 814. Lens assemblies 821A and 821B may include a prescription lens matched to a particular user of HMD 800. The illustrated HMD 800 is configured to be worn on or about a head of a wearer of HMD 800.
Lens assemblies 821A and 821B may appear transparent to a user to facilitate augmented reality or mixed reality to enable a user to view scene light from the environment around them while also receiving image light directed to their eye(s) by, for example, waveguides 850. Lens assemblies 821A and 821B may include two or more optical layers for different functionalities such as display, eye-tracking, and optical power. In some embodiments, image light from display 830A or 830B is only directed into one eye of the wearer of HMD 800. In an embodiment, both displays 830A and 830B are used to direct image light into waveguides 850A and 850B, respectively. The implementations of the disclosure may also be used in head mounted devices (e.g. smartglasses) that do not necessarily include a display but are configured to be worn on or about a head of a wearer.
Frame 814 and arms 811 may include supporting hardware of HMD 800 such as processing logic 807, a wired and/or wireless data interface for sending and receiving data, graphic processors, and one or more memories for storing data and computer-executable instructions. Processing logic 807 may include circuitry, logic, instructions stored in a machine-readable storage medium, ASIC circuitry, FPGA circuitry, and/or one or more processors. Processing logic 807 may be electrically connected to any of the ToF modules disclosed herein. In one embodiment, HMD 800 may be configured to receive wired power. In one embodiment, HMD 800 is configured to be powered by one or more batteries. In one embodiment, HMD 800 may be configured to receive wired data including video data via a wired communication channel. In one embodiment, HMD 800 is configured to receive wireless data including video data via a wireless communication channel. Processing logic 807 may be communicatively coupled to a network 880 to provide data to network 880 and/or access data within network 880. The communication channel between processing logic 807 and network 880 may be wired or wireless.
Embodiments of the invention may include or be implemented in conjunction with an artificial reality system. Artificial reality is a form of reality that has been adjusted in some manner before presentation to a user, which may include, e.g., a virtual reality (VR), an augmented reality (AR), a mixed reality (MR), a hybrid reality, or some combination and/or derivatives thereof. Artificial reality content may include completely generated content or generated content combined with captured (e.g., real-world) content. The artificial reality content may include video, audio, haptic feedback, or some combination thereof, and any of which may be presented in a single channel or in multiple channels (such as stereo video that produces a three-dimensional effect to the viewer). Additionally, in some embodiments, artificial reality may also be associated with applications, products, accessories, services, or some combination thereof, that are used to, e.g., create content in an artificial reality and/or are otherwise used in (e.g., perform activities in) an artificial reality. The artificial reality system that provides the artificial reality content may be implemented on various platforms, including a head-mounted display (HMD) connected to a host computer system, a standalone HMD, a mobile device or computing system, or any other hardware platform capable of providing artificial reality content to one or more viewers.
The term “processing logic” (e.g. processing logic 807) in this disclosure may include one or more processors, microprocessors, multi-core processors, Application-specific integrated circuits (ASIC), and/or Field Programmable Gate Arrays (FPGAs) to execute operations disclosed herein. In some embodiments, memories (not illustrated) are integrated into the processing logic to store instructions to execute operations and/or store data. Processing logic may also include analog or digital circuitry to perform the operations in accordance with embodiments of the disclosure.
A “memory” or “memories” described in this disclosure may include one or more volatile or non-volatile memory architectures. The “memory” or “memories” may be removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules, or other data. Example memory technologies may include RAM, ROM, EEPROM, flash memory, CD-ROM, digital versatile disks (DVD), high-definition multimedia/data storage disks, or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information for access by a computing device.
Networks may include any network or network system such as, but not limited to, the following: a peer-to-peer network; a Local Area Network (LAN); a Wide Area Network (WAN); a public network, such as the Internet; a private network; a cellular network; a wireless network; a wired network; a wireless and wired combination network; and a satellite network.
Communication channels may include or be routed through one or more wired or wireless communication utilizing IEEE 802.11 protocols, short-range wireless protocols, SPI (Serial Peripheral Interface), I2C (Inter-Integrated Circuit), USB (Universal Serial Bus), CAN (Controller Area Network), cellular data protocols (e.g. 3G, 4G, LTE, 5G), optical communication networks, Internet Service Providers (ISPs), a peer-to-peer network, a Local Area Network (LAN), a Wide Area Network (WAN), a public network (e.g. “the Internet”), a private network, a satellite network, or otherwise.
A computing device may include a desktop computer, a laptop computer, a tablet, a phablet, a smartphone, a feature phone, a server computer, or otherwise. A server computer may be located remotely in a data center or be located locally.
The processes explained above are described in terms of computer software and hardware. The techniques described may constitute machine-executable instructions embodied within a tangible or non-transitory machine (e.g., computer) readable storage medium, that when executed by a machine will cause the machine to perform the operations described. Additionally, the processes may be embodied within hardware, such as an application specific integrated circuit (“ASIC”) or otherwise.
A tangible non-transitory machine-readable storage medium includes any mechanism that provides (i.e., stores) information in a form accessible by a machine (e.g., a computer, network device, personal digital assistant, manufacturing tool, any device with a set of one or more processors, etc.). For example, a machine-readable storage medium includes recordable/non-recordable media (e.g., read only memory (ROM), random access memory (RAM), magnetic disk storage media, optical storage media, flash memory devices, etc.).
The above description of illustrated embodiments of the invention, including what is described in the Abstract, is not intended to be exhaustive or to limit the invention to the precise forms disclosed. While specific embodiments of, and examples for, the invention are described herein for illustrative purposes, various modifications are possible within the scope of the invention, as those skilled in the relevant art will recognize.
These modifications can be made to the invention in light of the above detailed description. The terms used in the following claims should not be construed to limit the invention to the specific embodiments disclosed in the specification. Rather, the scope of the invention is to be determined entirely by the following claims, which are to be construed in accordance with established doctrines of claim interpretation.
This application claims priority to both U.S. provisional Application No. 63/441,332 filed Jan. 26, 2023 and U.S. provisional Application No. 63/438,946 filed Jan. 13, 2023, which are hereby incorporated by reference.