This application relates to the field of sensing technologies, and in particular, to a time-of-flight (TOF) sensor module and an electronic device.
With the development of informatization, accurate and reliable information needs to be obtained before it can be used, and sensors are a primary means of obtaining such information. Currently, sensors have been widely used in many fields, for example, fields such as industrial production, cosmic exploration, ocean exploration, environmental protection, resource surveys, medical diagnosis, and biological engineering. Three-dimensional (three dimensional, 3D) sensors are a hot research topic in the sensor field.
Technologies applicable to the 3D sensors mainly include stereoscopic imaging, structured light, time-of-flight (time-of-flight, TOF) detection, and the like. TOF has advantages such as a long detection distance and a high resolution, and is an important technology used by the 3D sensors. Conventional TOF detection is mainly classified into a single-time full-projection TOF camera and a scanning device-based TOF scanner. The scanning device-based TOF scanner has a relatively high spatial resolution, but imposes a relatively high requirement on precision of the scanning angle and requires a complex scanning structure, making it difficult to miniaturize a TOF sensor module. The single-time full-projection TOF camera has advantages of a high detection speed and a large field of view (field of view, FOV), but is limited by the detection element array, power consumption, and the maximum quantity of configurable memories of the sensor: a maximum of 160×120 detection elements in the detector array can be started simultaneously, which limits the resolution of a formed image.
This application provides a TOF sensor module and an electronic device, to resolve a problem in a conventional technology that an image resolution is low due to a limitation by a maximum quantity of detection elements that can be simultaneously started.
According to a first aspect, this application provides a TOF sensor module. The TOF sensor module includes a light source, a beam adjustment assembly, and a detection assembly. The light source is configured to emit m first beams at each of M moments, and transmit the m first beams to the beam adjustment assembly. The beam adjustment assembly is configured to: after adjusting the received m first beams into S second beams, project the S second beams to S regions of a detection surface, where the S regions are in a one-to-one correspondence with the S second beams, and M projection points that are in a same region of the detection surface and to which projection is performed at the M moments respectively have different locations. The detection assembly is configured to receive S optical echo signals from the detection surface at each of the M moments, and convert the S optical echo signals into S electrical echo signals for storage at each moment, where the S optical echo signals are in a one-to-one correspondence with the S second beams, each optical echo signal is a signal obtained by reflecting a corresponding second beam by the detection surface, both m and M are integers greater than 1, and S is an integer greater than m.
Based on the TOF sensor module, the light source emits m first beams separately at different times, and M projection points that are in a same region of the detection surface and to which projection is performed at the M moments respectively have different locations. This is equivalent to performing M scans on each region of the detection surface. The detection assembly may simultaneously start S detection elements at each moment to receive S optical echo signals, and may receive a total of M×S optical echo signals at the M moments. In this way, image information can be determined based on the M×S optical echo signals, thereby helping increase a resolution of a formed image. If S=160×120, a determined image resolution may be M×160×120. When M=4×4, the determined image resolution may be 640×480. In other words, based on a capability (160×120 detection elements can be simultaneously started at most) of a conventional sensor, the TOF sensor module can form an image with a resolution of 640×480 or higher by reusing S detection elements at different times. This helps avoid a problem that a resolution of an image formed by the TOF sensor module is relatively low due to a limitation by a maximum quantity of detection elements that can be simultaneously started.
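The resolution arithmetic above can be replayed with a short sketch. The values are the example values from the text (S = 160×120 detection elements, M = 4×4 moments); the variable names are illustrative.

```python
# Super-resolution arithmetic: S detection elements are started simultaneously
# at each moment and reused at M = M1 x M2 moments with different projection
# point locations, so the formed image has M x S samples.
S_H, S_V = 160, 120   # detection elements per moment (horizontal x vertical)
M1, M2 = 4, 4         # M = M1 x M2 moments, one projection-point offset each

M = M1 * M2
total_signals = M * S_H * S_V         # optical echo signals over all M moments
res_h, res_v = M1 * S_H, M2 * S_V     # resolution of the formed image

print(res_h, res_v, total_signals)    # 640 480 307200
```

This reproduces the 640×480 figure stated above: the same S detection elements, reused at 16 moments, yield 16 times as many image samples.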
In a possible implementation, the detection assembly may include K detection elements, where K is an integer greater than or equal to S, and the detection assembly is configured to power on S detection elements of the K detection elements at each of the M moments. In other words, S detection elements of the K detection elements may be selected at each of the M moments. By powering on S detection elements in the detection assembly at each of the M moments, power consumption of the detection assembly can be reduced while a low image resolution caused by the limitation by the maximum quantity of detection elements that can be simultaneously started is addressed.
Further, optionally, the TOF sensor module further includes a processing circuit, and the processing circuit is configured to obtain, from the detection assembly, M×S electrical echo signals obtained at the M moments, and determine image information based on the M×S electrical echo signals. This helps avoid the limitation by the maximum quantity of detection elements that can be simultaneously started in the detection assembly, thereby increasing a determined image resolution.
In this application, S may be equal to m×n, and the beam adjustment assembly may be configured to adjust transmission directions of the received m first beams, split each of adjusted m first beams into n second beams to obtain m×n second beams, and project the m×n second beams to m×n regions of the detection surface, where the m×n regions are in a one-to-one correspondence with the m×n second beams, and n is an integer greater than 1. The m×n second beams are projected to the m×n regions of the detection surface, so that each region of the detection surface can be scanned, and a super-resolution effect can be achieved.
In a possible implementation, the detection assembly may include m×n detection elements, and the m×n detection elements are in a one-to-one correspondence with the m×n regions. Each of the m×n detection elements is configured to receive an optical echo signal from a corresponding region at each of the M moments, and convert the optical echo signal from the corresponding region into an electrical echo signal for storage at each moment. In this way, a resolution of an image of the TOF sensor module can be increased when the detection assembly includes a relatively small quantity of detection elements. In addition, the detection assembly includes a relatively small quantity of detection elements, thereby facilitating miniaturization of the TOF sensor module.
This application provides the following two possible TOF sensor modules as examples.
The light source includes M light source partitions, and each light source partition includes m emitters. m emitters in one of the M light source partitions are configured to emit m first beams at each of the M moments, where a light source partition used to emit m first beams at each of the M moments varies. In this way, the light source can emit m first beams at each of the M moments.
Further, optionally, the beam adjustment assembly includes a collimation assembly and a beam splitting assembly. The collimation assembly is configured to adjust an included angle between any two adjacent first beams of the m first beams into a first angle, and transmit the adjusted m first beams to the beam splitting assembly, where the first angle is determined based on a total field of view corresponding to the detection surface and a quantity m×n of second beams. The beam splitting assembly is configured to split each of the adjusted m first beams into n second beams.
In a possible implementation, if the total field of view corresponding to the detection surface is 64×48 degrees, when the quantity of second beams is equal to 160×120, the first angle is equal to (64/160)×(48/120)=0.4×0.4 degrees. To be specific, the first angle is 0.4 degrees in each of the horizontal direction and the vertical direction.
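The first-angle formula in this example can be checked numerically (a sketch with the numbers from the text; the variable names are illustrative):

```python
# First angle = total field of view / quantity of second beams, per direction.
fov_h, fov_v = 64.0, 48.0   # total field of view of the detection surface, degrees
n_h, n_v = 160, 120         # quantity of second beams per direction (m x n = 160 x 120)

first_angle_h = fov_h / n_h # 0.4 degrees in the horizontal direction
first_angle_v = fov_v / n_v # 0.4 degrees in the vertical direction
print(first_angle_h, first_angle_v)
```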
In a possible implementation, the M light source partitions may be an M1×M2 array. In a horizontal direction of the M1×M2 array, an included angle between first beams emitted by two adjacent emitters in two adjacent light source partitions is greater than or equal to an angle corresponding to an interval of M1 projection points on the detection surface. In a vertical direction of the M1×M2 array, an included angle between first beams emitted by two adjacent emitters in two adjacent light source partitions is greater than or equal to an angle corresponding to an interval of M2 projection points on the detection surface. Both M1 and M2 are integers greater than 1.
In a possible implementation, the detection assembly may include m×n detection elements. Each of the m×n detection elements is powered on at each of the M moments. Each of the m×n detection elements is configured to receive an optical echo signal from a corresponding region at each of the M moments, and convert the optical echo signal from the corresponding region into an electrical echo signal for storage at each moment.
The light source includes P emitters, where P is an integer greater than m. m emitters of the P emitters emit m first beams at preset intervals at each moment, where m emitters used to emit m first beams at each of the M moments vary. In this way, the light source can emit m first beams at each of the M moments.
In an optional implementation, the beam adjustment assembly includes a collimation assembly and a beam splitting assembly. The collimation assembly is configured to adjust an included angle between any two adjacent first beams of the m first beams into a second angle, and transmit the adjusted m first beams to the beam splitting assembly, where the second angle is determined based on a total field of view corresponding to the detection surface and a quantity m of started emitters. The beam splitting assembly is configured to split each of the adjusted m first beams into n second beams.
In a possible implementation, if the total field of view corresponding to the detection surface is 64×48 degrees, when the quantity m of started emitters is equal to m1×m2, the second angle is equal to (64/m1)×(48/m2) degrees. To be specific, the second angle is (64/m1) degrees in a horizontal direction, and is (48/m2) degrees in a vertical direction.
In a possible implementation, the detection assembly may power on each of m×n detection elements at each of the M moments. Each of the m×n detection elements is configured to receive an optical echo signal from a corresponding region at each of the M moments, and convert the optical echo signal from the corresponding region into an electrical echo signal for storage at each moment.
In a possible implementation, the M light source partitions are integrally molded. In this way, it can be ensured that the M light source partitions are on a same plane. In addition, a size of the integrally molded M light source partitions is relatively small, thereby facilitating miniaturization of the TOF sensor module.
According to a second aspect, this application provides a TOF sensor module. The TOF sensor module includes a light source, a beam adjustment assembly, and a detection assembly. The light source is configured to emit m first beams at each of M moments, and transmit the m first beams to the beam adjustment assembly, where both m and M are integers greater than 1. The beam adjustment assembly is configured to: after adjusting transmission directions of the received m first beams, project adjusted m first beams to a corresponding region of a detection surface, where a projection point that is on the detection surface and to which projection is performed at each of the M moments is located in a separate region. The detection assembly is configured to receive m optical echo signals from the corresponding region of the detection surface at each of the M moments, and convert the m optical echo signals into m electrical echo signals for storage at each moment, where the m optical echo signals are in a one-to-one correspondence with the m first beams, and each optical echo signal is a signal obtained by reflecting a corresponding first beam by the detection surface.
Based on the TOF sensor module, at each of the M moments, after m first beams emitted by the light source are adjusted by the beam adjustment assembly, the m first beams cover one region of the detection surface. The light source emits m first beams separately at different times, and a corresponding detection element in the detection assembly is selected, thereby avoiding the limitation that at most 160×120 detection elements can be simultaneously started.
In a possible implementation, the light source includes M light source partitions, each light source partition includes m emitters, and the M light source partitions are in a one-to-one correspondence with M regions. m emitters in one of the M light source partitions are configured to emit m first beams at each of the M moments, where a light source partition used to emit m first beams at each of the M moments varies.
In a possible implementation, the detection assembly includes M detection element regions, the M detection element regions are in a one-to-one correspondence with the M light source partitions, each detection element region includes a plurality of detection elements, and each detection element region is configured to receive optical echo signals obtained by reflecting, by the detection surface, beams emitted by a light source partition corresponding to the detection element region. The detection assembly is configured to power on, at each of the M moments, only each detection element in a detection element region of the M detection element regions that corresponds to the light source partition used to emit the m first beams. At each of the M moments, only one of the M detection element regions is powered on, and other detection element regions are not powered on. In this way, only some detection element regions may be enabled to operate, thereby helping reduce power consumption of the detection assembly.
In a possible implementation, the beam adjustment assembly is configured to adjust the transmission directions of the m first beams, and uniformly project the m first beams to the corresponding region. By uniformly projecting the m first beams to the corresponding region of the detection surface, the corresponding region of the detection surface can be uniformly scanned, thereby helping improve accuracy of determined image information.
In a possible implementation, the M light source partitions are integrally molded. In this way, it can be ensured that the M light source partitions are on a same plane. In addition, a size of the integrally molded M light source partitions is relatively small, thereby facilitating miniaturization of the TOF sensor module.
In a possible implementation, a light pipe is disposed between the beam adjustment assembly and each of the M light source partitions. The light pipe is configured to homogenize the received m first beams. This avoids placing the beam adjustment assembly close to the light source partitions, thereby improving utilization of the beam adjustment assembly, and helping reduce difficulty of assembling the TOF sensor module.
According to a third aspect, this application provides an electronic device. The electronic device may include the TOF sensor module described in the first aspect or the second aspect, and a fixing assembly. The fixing assembly is configured to fix the TOF sensor module.
To make the objectives, technical solutions, and advantages of this application clearer, the following further describes this application in detail with reference to the accompanying drawings.
In the following, some terms in this application are described, to help a person skilled in the art have a better understanding.
1. Spatial Resolution
The spatial resolution is the minimum distance between two adjacent objects that can be recognized in an image, and is an index representing the ability to distinguish details of a target object in the image. The spatial resolution is one of the important indexes for evaluating sensor performance, and is also an important basis for recognizing a shape and a size of an object. The spatial resolution is usually represented by a size of an image element, an image resolution, or a field of view. The image element is a grid cell formed by discretizing ground information, and is the smallest area that can be distinguished in a scanning image. The image resolution is represented by a line width that can be distinguished within a unit distance, or by a quantity of parallel lines at equal intervals. The instantaneous field of view (instantaneous field of view, IFOV) is the light receiving angle or observation view of a single detection element (for example, a pixel) in a sensor, is also referred to as the angular resolution of the sensor, and is measured in milliradians (mrad) or microradians (μrad). The field of view β is related to the wavelength λ and the aperture D of a collector: β=λ/(2D). A smaller field of view indicates a higher spatial resolution.
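The relation β = λ/(2D) can be evaluated numerically. The wavelength and aperture below are hypothetical values chosen only to illustrate the units, not parameters from this application:

```python
# Angular resolution beta = wavelength / (2 * aperture), in radians.
wavelength = 905e-9   # hypothetical laser wavelength: 905 nm, in meters
aperture = 5e-3       # hypothetical collector aperture D: 5 mm, in meters

beta = wavelength / (2 * aperture)  # radians
beta_mrad = beta * 1e3              # ~0.0905 mrad; a smaller beta means a higher spatial resolution
print(beta_mrad)
```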
2. Image Resolution
The image resolution indicates the amount of information stored in an image, and is the quantity of pixels in each inch of the image. The unit of the resolution is pixels per inch (pixels per inch, PPI). It should be understood that the field of view of each pixel is equal to the total field of view corresponding to a detection surface divided by the total quantity of pixels in a sensor.
3. Video Graphics Array (Video Graphics Array, VGA)
The VGA is a standard of a display resolution, and a corresponding resolution is 640×480.
4. Diffractive Optical Element (Diffractive Optical Element, DOE)
The DOE is also referred to as a binary optical device. Beam splitting by the DOE is implemented by using a diffraction principle. A plurality of diffraction orders may be generated after a beam passes through the DOE. Each order corresponds to one beam. The DOE can implement one-dimensional beam splitting, two-dimensional beam splitting, or the like by using a specific surface structure design. The two-dimensional beam splitting means that the DOE may separately perform beam splitting in each of two directions (for example, a horizontal direction and a vertical direction). For example, the DOE splits a 1×1 beam into 16×12 beams. This means that the DOE splits one beam into 16 beams in the horizontal direction, and splits one beam into 12 beams in the vertical direction.
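The two-dimensional beam splitting described above can be sketched as generating a grid of diffraction orders. The centering convention (orders indexed around the 0-order) is an assumption for illustration:

```python
def diffraction_orders(n_h, n_v):
    """Orders produced when a DOE splits one beam into n_h x n_v beams,
    indexed around the 0-order (centering convention assumed)."""
    return [(i - n_h // 2, j - n_v // 2)
            for j in range(n_v) for i in range(n_h)]

# The 16 x 12 example from the text: one beam becomes 192 beams,
# including the (0, 0) order (the 0-order diffracted light).
orders = diffraction_orders(16, 12)
print(len(orders))   # 192
```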
5. Single-Photon Avalanche Diode (Single-Photon Avalanche Diode, SPAD)
The SPAD is also referred to as a single-photon detector, and is a photoelectric detection avalanche diode with a single-photon detection capability. The SPAD has relatively high sensitivity, and is triggered upon detection of one photon. After being triggered, the SPAD usually requires approximately 10 ns to be restored to an initial state. Therefore, the SPAD may be configured to detect presence of photons, but cannot detect a quantity of photons. Usually, there are a plurality of SPADs in each detector in an image sensing system. For example,
In this application, a TOF sensor module may be applied to an electronic device, for example, a mobile phone; or may be applied to fields such as vehicle-mounted laser radars, self-driving, unmanned aerial vehicles, internet of vehicles, and security surveillance. The TOF sensor module transmits an electromagnetic wave and receives an electromagnetic wave (namely, an optical echo signal) scattered by a target object, and compares and analyzes the received optical echo signal and the transmitted electromagnetic wave, to extract information related to the target object, for example, a distance from the target object, an image of the target object, or a three-dimensional point cloud density of the target object.
In view of the foregoing problem, this application provides a TOF sensor module. A detection assembly in the TOF sensor module may receive optical echo signals from a detection surface at different times, and determine image information based on the optical echo signals received at different times, to increase a resolution of a formed image.
The following describes in detail the TOF sensor module provided in this application with reference to
For example, M=4, and
Based on the TOF sensor module, the light source is configured to emit m first beams separately at different times, and M projection points that are in a same region of the detection surface and to which projection is performed at the M moments respectively have different locations. This is equivalent to performing M scans on each region of the detection surface. The detection assembly may simultaneously start S detection elements at each moment to receive S optical echo signals, and may receive a total of M×S optical echo signals at the M moments. In this way, image information can be determined based on the M×S optical echo signals, thereby helping increase a resolution of a formed image. If S=160×120, a determined image resolution may be M×160×120. When M=4×4, the determined image resolution may be 640×480. In other words, based on a capability (160×120 detection elements can be simultaneously started at most) of a conventional sensor, the TOF sensor module can form an image with a resolution of 640×480 or higher by reusing S detection elements at different times. This helps avoid a problem that a resolution of an image formed by the TOF sensor module is low due to a limitation by a maximum quantity of detection elements that can be simultaneously started.
It should be noted that the image information in this application is depth image information, for example, a distance between the detection surface and the TOF sensor module, and an orientation, a height, a velocity, a posture, and a shape of a target on the detection surface.
The following separately describes functional components and structures shown in
1. Light Source
In this application, an emitter may be a laser, for example, a vertical cavity surface emitting laser (vertical cavity surface emitting laser, VCSEL) or an edge emitting laser (edge emitting laser, EEL). A size of a pitch (pitch, with reference to
Based on an arrangement manner and a lighting manner of emitters, the following provides two possible cases as examples.
Case 1: The light source includes M light source partitions, each light source partition includes m emitters, and the emitters are lit on a per-partition basis.
That the light source includes M light source partitions may be understood as that the light source is divided into M partitions.
Based on the case 1, an implementation of emitting, by the light source, m first beams at each of the M moments may be as follows: m emitters in one of the M light source partitions are configured to emit m first beams at each of the M moments, where a light source partition used to emit m first beams at each of the M moments varies.
This may also be understood as that, based on the case 1, m emitters in one light source partition are started at one moment, and the started m emitters are configured to separately emit first beams, to obtain the m first beams. It should be noted that the M light source partitions may be sequentially started at the M moments. To be specific, the 1st light source partition is started at the 1st moment, the 2nd light source partition is started at the 2nd moment, and so on. Alternatively, the M light source partitions may be randomly started. This is not limited in this application.
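The two starting orders allowed above (sequential or random) can be sketched as follows; the function name and structure are illustrative only:

```python
import random

def partition_schedule(M, sequential=True):
    """Order in which the M light source partitions are started, one per
    moment: sequentially (the 1st partition at the 1st moment, and so on)
    or in a random order, as the text permits either."""
    order = list(range(M))
    if not sequential:
        random.shuffle(order)
    return order

print(partition_schedule(4))   # [0, 1, 2, 3]
```

Either schedule starts every partition exactly once over the M moments, which is what guarantees that each region of the detection surface is scanned M times in total.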
In this application, the M light source partitions may be an M1×M2 array, where both M1 and M2 are integers greater than 1, and M1 and M2 may be equal or unequal. For example, the M light source partitions may be alternatively arranged in one row with M columns or one column with M rows. This is not limited in this application.
Case 2: The light source includes P emitters, and m emitters are selected from the P emitters at preset intervals.
Herein, “at preset intervals” may mean at an interval of one emitter or at an interval of two emitters. In the case of an emitter array, the quantity of emitters within an interval in the row direction may be the same as or different from the quantity of emitters within an interval in the column direction.
Based on the case 2, an implementation of emitting, by the light source, m first beams at each of the M moments may be as follows:
m emitters selected from the P emitters at preset intervals are configured to emit m first beams at each of the M moments, where m emitters used to emit m first beams at each of the M moments vary.
In a possible implementation, the “at preset intervals” may be starting one emitter at an interval of one emitter.
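Selecting m emitters from a P-emitter array at an interval of one emitter (that is, a step of 2 in each direction) can be sketched as follows; the array size, offsets, and function name are illustrative assumptions:

```python
def select_emitters(rows, cols, step_r=2, step_c=2, offset_r=0, offset_c=0):
    """Emitters started at one moment when one emitter is started at an
    interval of one emitter in each direction of a rows x cols array; the
    offsets choose which subset is lit, so the subset varies per moment."""
    return [(r, c)
            for r in range(offset_r, rows, step_r)
            for c in range(offset_c, cols, step_c)]

# Illustrative 32 x 24 emitter array: each moment lights 16 x 12 = 192 emitters.
subset = select_emitters(32, 24)
print(len(subset))   # 192
```

Cycling the offsets over the M moments lights every emitter of the array exactly once, which is the sense in which the m emitters "vary" per moment.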
2. Beam Adjustment Assembly
In this application, S may be equal to m×n. Further, optionally, the beam adjustment assembly may be configured to adjust transmission directions of the received m first beams, split each of adjusted m first beams into n second beams to obtain m×n second beams, and project the m×n second beams to m×n regions of the detection surface, where the m×n regions are in a one-to-one correspondence with the m×n second beams, and n is an integer greater than 1.
In an optional implementation, the beam adjustment assembly adjusts an included angle between any two adjacent first beams of the m first beams into a first angle, and splits each of the m first beams into n second beams. m emitters in one light source partition are started at each of the M moments, and M projection points that are in a same region of the detection surface and to which projection is performed at the M moments respectively have different locations; in other words, locations of projection points in a plurality of regions of the detection surface can be switched. The location switching of the projection points achieves an effect of scanning the detection surface (refer to
Based on the light source described in the case 1, in a possible implementation, the beam adjustment assembly may include a collimation assembly and a beam splitting assembly. The collimation assembly is configured to adjust the included angle between any two adjacent first beams of the m first beams that come from the light source into the first angle, and transmit the adjusted m first beams to the beam splitting assembly. The beam splitting assembly is configured to split each of the adjusted m first beams into n second beams, to obtain m×n second beams. The first angle is determined based on a total field of view corresponding to the detection surface and a quantity m×n of second beams. In other words, a magnitude of the first angle is related to the total field of view and a quantity of beams obtained through splitting by the beam splitting assembly. For example, in one dimension, if the total field of view corresponding to the detection surface is 64 degrees, when m×n=160, the first angle is equal to 64/160=0.4 degrees. For another example, in two dimensions, if the total field of view corresponding to the detection surface is 64×48 degrees, when m×n=160×120, the first angle is equal to (64/160)×(48/120)=0.4×0.4 degrees, where m×n=160×120 indicates that there are 160 second beams in the horizontal direction and 120 second beams in the vertical direction. Certainly, the collimation assembly may alternatively adjust included angles between two adjacent first beams in different directions into different angles. This is not limited in this application. It should be understood that the total field of view corresponding to the detection surface is usually approximately equal to the total field of view corresponding to the detector, which is equal to the field of view of each detection element in the detector multiplied by the total quantity of detection elements in the detector.
Further, optionally, the collimation assembly may collimate a divergent first beam emitted by each emitter into parallel light. In addition, because locations of the emitters in the vertical direction are different, the parallel light is aggregated, at different incident angles, to the plane on which the beam splitting assembly is located. Using the light source shown in
Based on the light source described in the case 2, in a possible implementation, the beam adjustment assembly may include a collimation assembly and a beam splitting assembly. The collimation assembly is configured to adjust the included angle between any two adjacent first beams of the m first beams into a second angle, and transmit the adjusted m first beams to the beam splitting assembly, where the second angle is determined based on a total field of view corresponding to the detection surface and a quantity m of started emitters. For example, if the total field of view corresponding to the detection surface is 64×48 degrees, when m=16×12, the second angle is equal to (64/16)×(48/12)=4×4 degrees. The beam splitting assembly is configured to split each of the adjusted m first beams into n second beams. It can be understood that, to form an image with a resolution of 320×240 when the total field of view corresponding to the detection surface is 64×48 degrees, an angle between beams corresponding to two adjacent projection points on the detection surface is 64/320=0.2 degrees. 0-order diffracted light of the m first beams is uniformly projected to 320×240 regions. To be specific, an interval between projection points of 0-order diffracted light of any two adjacent first beams of the m first beams is 19 projection points, and the included angle between any two adjacent first beams=the second angle=[(64/320)×(320/16)]×[(48/240)×(240/12)]=[0.2×20]×[0.2×20]=4×4 degrees. It should be understood that the interval of 19 projection points means that an interval between two projection points, on the detection surface, of 0-order diffracted light of two adjacent first beams is 19 projection points. In other words, a pitch between centers of the projection points corresponding to the two beams of 0-order diffracted light is 20 projection points (refer to
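The second-angle derivation above can be replayed numerically with the values from this example (the variable names are illustrative):

```python
# Second angle = (angle between adjacent projection points) x (projection-point
# pitch between the 0-order spots of adjacent first beams), per direction.
fov_h, fov_v = 64.0, 48.0   # total field of view, degrees
m1, m2 = 16, 12             # m = m1 x m2 started emitters
res_h, res_v = 320, 240     # target image resolution

point_angle_h = fov_h / res_h                    # 0.2 degrees per projection point
second_angle_h = point_angle_h * (res_h / m1)    # 0.2 * 20 = 4.0 degrees
second_angle_v = (fov_v / res_v) * (res_v / m2)  # 0.2 * 20 = 4.0 degrees

# The 0-order spots of adjacent first beams sit 20 projection points apart
# (an interval of 19 points between them):
pitch_in_points_h = res_h // m1                  # 20
print(second_angle_h, second_angle_v, pitch_in_points_h)
```

Note that the result equals 64/16 = 4 degrees directly, confirming that the second angle is simply the total field of view divided by the quantity of started emitters per direction.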
In a possible implementation, the collimation assembly may be a collimator, a collimation mirror, a microlens (microlens), or a combination of microlenses. Further, optionally, a focal length of the collimator, the collimation mirror, the microlens, or the combination of microlenses may be adjusted, to adjust an included angle between two adjacent first beams that come from the light source. The beam splitting assembly may be a DOE, a polarizing beam splitter (polarizing beam splitter, PBS), or a grating.
In this application, the focal length f of the collimator, the collimation mirror, the microlens (microlens), or the combination of microlenses may be adjusted or selected, to adjust the included angle between any two adjacent first beams of the m first beams that come from the light source into the first angle. For ease of description, the collimator is used as an example below. The included angle between two adjacent first beams is related to the focal length f of the collimator and the pitch of the emitters in the light source. This may be specifically as follows: f=pitch/tan(α), where α is the included angle between two adjacent first beams. For example, if a size of a pitch between EEL-based emitters (emitters) is 30 μm and the included angle between any two adjacent first beams of the m first beams is α=0.4 degrees, then f=pitch/tan(α)=30/tan(0.4°)≈4.3 mm. This may also be understood as follows: by adjusting/selecting the focal length of the collimator as 4.3 mm, the included angle between any two adjacent first beams of the m first beams that come from the light source may be adjusted to 0.4 degrees.
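The focal-length example can be checked with a quick computation (30 μm pitch, α = 0.4 degrees, from the example above):

```python
import math

# f = pitch / tan(alpha): focal length needed so that adjacent emitters at
# the given pitch produce beams separated by the included angle alpha.
pitch_um = 30.0   # emitter pitch from the example, micrometers
alpha_deg = 0.4   # desired included angle between adjacent first beams, degrees

f_um = pitch_um / math.tan(math.radians(alpha_deg))
f_mm = f_um / 1000.0
print(round(f_mm, 1))   # 4.3
```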
In a possible implementation, if the beam splitting assembly is a DOE, a second beam is diffracted light of a first beam. A projection point, on the detection surface, of 0-order diffracted light of a first beam is a direct projection point of the first beam. Projection points of ±1-order diffracted light, ±2-order diffracted light, ±3-order diffracted light, and the like on the detection surface are equivalent to projection points obtained by separately copying the projection point of the 0-order diffracted light to corresponding regions. With reference to
3. Detection Assembly
In this application, the detection assembly may include K detection elements, and may power on S detection elements of the K detection elements at each of the M moments (refer to
In a possible implementation, the detection assembly may include m×n detection elements, and the detection element usually includes an SPAD and a TDC. With reference to
In this application, the detection assembly includes a detector, and the detector may include a detection element array, for example, an SPAD array, a PIN-type photodiode (also referred to as a PIN junction diode) array, or an avalanche photodiode (avalanche photodiode, APD) array.
In this application, the TOF sensor module may further include a processing circuit, and the processing circuit is configured to obtain, from the detection assembly, M×S electrical echo signals obtained at the M moments, and determine image information based on the M×S electrical echo signals. In this way, the processing circuit determines the image information based on M×S electrical echo signals, thereby helping increase a resolution of a formed image.
Further, optionally, the processing circuit may be integrated in the detection assembly, or may be a processor in an electronic device or a laser radar in which the TOF sensor module is located, for example, a central processing unit (central processing unit, CPU) in a mobile phone. If the processing circuit is integrated in the detection assembly, the detection assembly may send a stored electrical echo signal to the processing circuit, and the processing circuit may determine image information based on the received electrical echo signal. If the processing circuit is the processor in the electronic device or the laser radar in which the TOF sensor module is located, the detection assembly may send a stored electrical echo signal to the processor, and the processor may determine image information based on the received electrical echo signal.
In this application, the TOF sensor module may further include a receiving assembly. The receiving assembly is configured to receive an optical echo signal from the detection surface, and transmit the optical echo signal to the detection assembly. In a possible implementation, the receiving assembly may be a lens group.
Further, optionally, the TOF sensor module may further include a light filter, and the light filter may be located before the receiving assembly, or may be located between the receiving assembly and the detection assembly, to reduce impact of ambient light on the detection assembly.
Based on the foregoing content, the following provides two specific examples of the TOF sensor module with reference to a specific hardware structure, to help further understand the structure of the TOF sensor module.
In the following two examples, for ease of description of the solution, for example, the beam adjustment assembly includes a collimator and a DOE, the total field of view corresponding to the detection surface is 64×48 degrees, and 160×120 detection elements can be simultaneously started at most in the detection assembly.
When 160×120 detection elements can be simultaneously started at most in the detection assembly, to implement VGA for a formed image (that is, a resolution of the image is 640×480), in a possible case, the light source includes 4×4=16 light source partitions, and each light source partition includes 10×10 emitters. 10×10 first beams emitted by 10×10 emitters in each light source partition pass through the collimator and then are projected to the DOE. A quantity of effective beams obtained through splitting by the DOE is 16×12 (16 and 12 are quantities of effective beams obtained through splitting by the DOE in a horizontal direction and a vertical direction respectively). A quantity of second beams that come from one light source partition is as follows: m×n=(16×10)×(12×10)=160×120. To be specific, when m emitters of a single light source partition are started, 160×120 second beams may be generated. The 160×120 second beams are projected to 160×120 regions of the detection surface. (4×160)×(4×120)=640×480 second beams may be generated for the 4×4 light source partitions, and 4×4 second beams may be projected to each region of the detection surface, so that a resolution of a formed image may be 640×480.
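The beam-count arithmetic of this example can be checked as follows; this is a sketch using the partition, emitter, and DOE-order counts given above, with illustrative variable names:

```python
# Example 1 numbers from the text.
partitions = (4, 4)      # light source partitions (horizontal, vertical)
emitters = (10, 10)      # emitters per light source partition
doe_orders = (16, 12)    # effective beams per first beam from the DOE

# Second beams produced when a single partition is started: 160x120.
per_partition = (doe_orders[0] * emitters[0], doe_orders[1] * emitters[1])

# Total projection points after all 4x4 partitions have been started: VGA.
total = (partitions[0] * per_partition[0], partitions[1] * per_partition[1])
print(per_partition, total)
```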
It should be noted that any one or more of a quantity of light source partitions in the light source, a quantity of emitters included in a light source partition, and a quantity of effective beams obtained through splitting by the DOE may be further changed, so that a resolution of an image formed by the TOF sensor module can meet the VGA resolution. It should be understood that a resolution of 640×480 or higher can be achieved by increasing the quantity of light source partitions in the light source and/or the quantity of emitters in a light source partition and/or the quantity of effective beams obtained through splitting by the DOE.
The collimator is configured to adjust transmission directions of the received 10×10 first beams, to adjust an included angle between any two adjacent first beams of the 10×10 first beams in each direction into a first angle. The first angle is determined based on a total field of view and a quantity m×n of second beams. Specifically, the first angle=(64/160)×(48/120)=0.4×0.4. To be specific, an included angle between any two adjacent first beams of 10×10 first beams adjusted by the collimator in each of a horizontal direction and a vertical direction is 0.4 degrees. Because an included angle between two adjacent first beams after adjustment is 0.4 degrees, 10 emitters with different heights in one light source partition can cover a field of view of 10×0.4=4 degrees, and four light source partitions can cover a field of view of 4×4=16 degrees.
To adjust an included angle between two adjacent first beams to the first angle, a focal length of the collimator is as follows: f=pitch/tan(α)=pitch/tan(0.4). When the emitter is an EEL-based emitter (emitter) with a pitch of 30 μm, f=30/tan(0.4)=4.3 mm.
Further, the collimator transmits the adjusted 10×10 first beams to the DOE. The DOE is configured to split each of the adjusted 10×10 first beams into 16×12 second beams, to obtain 160×120 second beams, and project the 160×120 second beams to same locations in 160×120 regions of a detection surface. A total of 4×4 light source partitions are started at 16 moments, and 4×4 scans (four in each of the horizontal direction and the vertical direction) can be implemented in each of the 160×120 regions of the detection surface. To be specific, each light source partition is projected to 160×120 projection points on the detection surface, and 4×4 light source partitions are projected to (160×4)×(120×4) projection points on the detection surface. To form a VGA image when a total field of view corresponding to the detection surface is 64×48 degrees, an angle between beams corresponding to two adjacent projection points on the detection surface is 64/640=0.1 degrees, in other words, an included angle between two adjacent second beams corresponding to two adjacent projection points on the detection surface is 0.1 degrees. With reference to
With reference to
In this application, a pitch Δ1 between two adjacent emitters of two adjacent light source partitions (refer to
It should be noted that, ±1-order diffracted light that comes from the (i+1)th light source partition is located on the left or right of 0-order diffracted light that comes from the ith light source partition. For example, −1-order diffracted light that comes from the 2nd light source partition is located on the left of 0-order diffracted light that comes from the 1st light source partition. For another example, −1-order diffracted light that comes from the 3rd light source partition is located on the right of 0-order diffracted light that comes from the 2nd light source partition. ±2-order diffracted light that comes from the (i+1)th light source partition is located on the left or right of ±1-order diffracted light that comes from the ith light source partition, and so on, so that projection points on the detection surface can be closely connected, to form a schematic diagram of arrangement of projection points in
The detection assembly may include 160×120 detection elements, and one detection element corresponds to one region. In
Based on the example 1, non-scanning three-dimensional detection with a VGA resolution or a higher resolution can be implemented based on a capability of a conventional detector. By starting light source partitions of the light source through switching, second beams can be projected to different regions of the detection surface, thereby implementing resolution superposition, and achieving TOF detection with the VGA resolution or a million-pixel-level resolution.
When 160×120 detection elements can be simultaneously started at most in the detection assembly, to form an image with a resolution of 320×240, in a possible case, the light source includes 32×24 emitters, and the preset interval is one emitter, that is, one emitter is started at an interval of one emitter (refer to
The collimator is configured to adjust transmission directions of the received 16×12 first beams, to adjust an included angle between any two adjacent first beams of the 16×12 first beams in each direction into a second angle. The second angle is determined based on a total field of view corresponding to a detection surface and a quantity m of started emitters. Specifically, the second angle=(64/16)×(48/12)=4×4. To be specific, an included angle between any two adjacent first beams of 16×12 first beams obtained after adjustment by the collimator in each of two directions is 4 degrees. It can be understood that the collimator transmits the adjusted first beams to the DOE, and in this case, an included angle between two adjacent second beams from a same moment in each of a horizontal direction and a vertical direction is 4 degrees. An angle corresponding to two adjacent projection points is (64/320)×(48/240)=0.2×0.2. Therefore, an interval between direct projection points of first beams that come from two adjacent emitters at a same moment is 19 projection points (in other words, a pitch between the direct projection points of the two first beams is 20 projection points). Refer to
In the example 2, to implement uniform arrangement of projection points on the detection surface, a focal length of the collimator may be adjusted to achieve an interval Δ3 between direct projection points of first beams of two adjacent emitters (for example, two adjacent emitters 1 in
Further, the collimator transmits the adjusted 16×12 first beams to the DOE. The DOE is configured to split each of the adjusted 16×12 first beams into 10×10 second beams, to obtain 160×120 second beams, and project the 160×120 second beams to same locations in 160×120 regions of the detection surface, for example, the upper left corner of each region of the detection surface in
The detection assembly may include 160×120 detection elements. Each detection element may receive two optical echo signals from a corresponding region of the detection surface, to obtain an image with a resolution of (2×160)×(2×120)=320×240.
Based on the example 2, m emitters are started at each of M moments at preset intervals (for example, at equal intervals), and a resolution of a formed image can reach 320×240 or higher without a special design for emitter arrangement.
In a possible implementation, m emitters in one of the M light source partitions emit m first beams at each moment, where a different light source partition is used to emit the m first beams at each of the M moments. In
Further, optionally, the M light source partitions included in the light source are integrally molded. In this way, it can be ensured that the M light source partitions are on a same plane. The light source provided with reference to
With reference to
The beam adjustment assembly is configured to: after adjusting transmission directions of the received m first beams, project the adjusted m first beams to a corresponding region of a detection surface. Projection points formed on the detection surface at each of the M moments are located in a separate region. This may also be understood as that second beams that come from one light source partition are all projected to a corresponding region of the detection surface. In
In a possible implementation, the beam adjustment assembly is configured to adjust the transmission directions of the m first beams, and uniformly project the m first beams to the corresponding region. The following provides four implementations of uniformly projecting the m first beams to the ith region as examples.
Implementation 1: The beam adjustment assembly includes a collimator.
The collimator is configured to adjust an included angle between any two adjacent first beams of the m first beams into a third angle. The third angle is determined based on a field of view corresponding to the ith region and the quantity m of first beams. The field of view corresponding to the ith region=a total field of view of the detection surface/M. The third angle=the total field of view of the detection surface/M/m.
Implementation 2: The beam adjustment assembly includes a beam tuner (DOE tuner), also referred to as a beam scaler.
The beam scaler is configured to expand the m first beams that come from the ith light source partition of the light source into one uniform beam, and project the uniform beam to the ith region.
Implementation 3: The beam adjustment assembly includes a collimator and a DOE.
The collimator is configured to adjust an included angle between any two adjacent first beams of the m first beams that come from the ith light source partition of the light source into a third angle. The DOE is configured to split each adjusted first beam into n second beams. The third angle is determined based on a field of view corresponding to the ith region and a quantity m×n of second beams. The field of view corresponding to the ith region=a total field of view of the detection surface/M. The third angle=the total field of view of the detection surface/M/(m×n).
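The third-angle formulas of implementations 1 and 3 can be sketched as follows. The total field of view, M, m, and n below are assumed example numbers chosen for illustration, not values fixed by the text:

```python
# Assumed example values: 64-degree total field of view in one direction,
# M = 16 regions, m = 100 first beams, n = 100 second beams per first beam.
total_fov = 64
M = 16
m = 100
n = 100

region_fov = total_fov / M                 # field of view of the i-th region
third_angle_impl1 = region_fov / m         # implementation 1: collimator only
third_angle_impl3 = region_fov / (m * n)   # implementation 3: collimator + DOE
print(region_fov, third_angle_impl1, third_angle_impl3)
```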
This may also be understood as that the collimator is configured to transmit, to the DOE or a diffuser at a specific full angle or in parallel at a specific angle, the m first beams that come from the ith light source partition of the light source. The DOE or the diffuser properly diffuses or shapes the received first beams, and transmits the first beams to the ith region of the detection surface. A beam transmitted to the ith region may be in a circular or square shape, or may be a dense lattice. This is not limited in this application.
Implementation 4: The beam adjustment assembly includes a collimator and a beam tuner.
The collimator is configured to transmit, to the beam tuner in parallel at a third angle, the m first beams that come from the ith light source partition of the light source. The beam tuner diffuses collimated light, and transmits diffused collimated light to the ith region of the detection surface.
It should be noted that the collimator is closer to the light source and farther away from the detection surface than the DOE; or the collimator is farther away from the light source and closer to the detection surface than the DOE.
The detection assembly is configured to: receive m optical echo signals from a corresponding region of the detection surface at each of the M moments, to be specific, the detection assembly receives m optical echo signals from the 1st region at the 1st moment, receives m optical echo signals from the 2nd region at the 2nd moment, and so on, receives m optical echo signals from the Mth region at the Mth moment; and converts the m optical echo signals into m electrical echo signals for storage at each moment, where the m optical echo signals are in a one-to-one correspondence with the m first beams, and an optical echo signal is a signal obtained by reflecting a corresponding first beam by the detection surface.
In a possible implementation, the detection assembly may include M detection element regions, and the M detection element regions of the detection assembly are in a one-to-one correspondence with the M light source partitions. Each detection element region includes a plurality of detection elements, and each detection element region is configured to receive optical echo signals obtained by reflecting, by the detection surface, beams emitted by a light source partition corresponding to the detection element region. The detection assembly is configured to power on, at each of the M moments, only each detection element in a detection element region of the M detection element regions that corresponds to the light source partition used to emit the m first beams. In other words, if the ith light source partition of the light source emits m first beams, the ith detection element region corresponding to the detection assembly is selected. The ith detection element region of the detection assembly is configured to receive an optical echo signal that comes from a corresponding region of the detection surface. In this way, the m first beams emitted by m emitters of the ith light source partition are adjusted by the beam adjustment assembly and then uniformly cover the ith region of the detection surface. In addition, the ith detection element region corresponding to the detection assembly is selected, and the ith detection element region can receive m optical echo signals from the ith region of the detection surface, thereby helping avoid a problem in a conventional technology that an image resolution is low because 160×120 detection elements can be simultaneously started at most.
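The time-multiplexed emission and detection described above can be sketched as a loop over the M moments. This is a hypothetical illustration only: the interface names (emit_partition, read_region, scan_frame) are assumptions for the sketch, not part of the module.

```python
# At moment i, only the i-th light source partition emits and only the i-th
# detection element region is powered on; the partial results form one frame.
M = 16  # example number of light source partitions / moments

def scan_frame(emit_partition, read_region):
    """emit_partition(i) fires partition i; read_region(i) returns its echoes."""
    frame = []
    for i in range(M):
        emit_partition(i)             # start only the m emitters of partition i
        frame.append(read_region(i))  # power on only detection element region i
    return frame

# Toy stand-ins for the hardware interfaces.
fired = []
echoes = scan_frame(lambda i: fired.append(i),
                    lambda i: f"echoes-from-region-{i}")
print(len(echoes))
```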
Based on the TOF sensor module, at each of the M moments, after m first beams emitted by the light source are adjusted by the beam adjustment assembly, the m first beams cover one region of the detection surface. The light source emits m first beams separately at different times, and a corresponding detection element in the detection assembly is selected, thereby resolving a problem that 160×120 detection elements can be simultaneously started at most.
It should be noted that the interval Δ4 between light source partitions can be effectively controlled by using the M integrally molded light source partitions. A relatively small interval Δ4 between light source partitions facilitates continuous connection between projection points on the detection surface. The following describes the beneficial effect in detail with reference to specific examples.
It should be noted that, when the interval Δ4 between light source partitions is relatively large, a full angle of emergent light of the DOE/diffuser needs to be designed to be relatively large, and a global search/vector design manner needs to be used. In this case, a calculation amount is relatively large, and it is relatively difficult to control diffraction efficiency and uniformity.
In a possible implementation, the beam adjustment assembly may include beam adjustment partitions. For example, the beam adjustment assembly may be divided into M beam adjustment partitions, and the M light source partitions are in a one-to-one correspondence with the M beam adjustment partitions. For ease of description of the solution, an example in which the beam adjustment partition is a DOE/diffuser is used for description below. In other words, the light source partitions may be in a one-to-one correspondence with DOE/diffuser partitions, that is, one light source partition corresponds to one DOE/diffuser partition.
In a possible implementation, the DOE/diffuser may be located on a side, of the light source, that is close to the detection surface; or the light source and the DOE/diffuser are bonded to form a wafer (wafer)-level integrated device.
It should be noted that the DOE is configured to shape a received beam and deflect an angle of departure. Different light source partitions correspond to different angles of departure. With reference to
Compensation for the interval Δ4 between light source partitions further needs to be considered in a DOE design. As shown in
Based on the structure and the functional principle of the TOF sensor module described above, this application further provides an electronic device. The electronic device may include the TOF sensor module and a fixing structure, and the fixing structure is configured to fix the TOF sensor module. Certainly, the electronic device may further include other components, for example, a processor, a memory, a wireless communication apparatus, a sensor, a touchscreen, and a display.
In this application, the electronic device may be a mobile phone, a tablet computer, a wearable device (for example, a smartwatch), or the like. An example embodiment of the electronic device includes but is not limited to a device using iOS®, Android®, Microsoft®, or another operating system.
The processor 111 may include one or more processing units. For example, the processor 111 may include an application processor (application processor, AP), a graphics processing unit (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a digital signal processor (digital signal processor, DSP), and the like. Different processing units may be independent components, or may be integrated into one or more processors.
For the TOF sensor module 112, refer to the foregoing descriptions. Details are not described herein again.
The display 113 may be configured to display an image and the like. The display 113 may include a display panel. The display panel may be a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light emitting diode (active-matrix organic light emitting diode, AMOLED), a flexible light-emitting diode (flex light-emitting diode, FLED), miniLED, microLED, micro-OLED, a quantum dot light emitting diode (quantum dot light emitting diodes, QLED), or the like. In some embodiments, the electronic device 100 may include one or H displays 113, where H is a positive integer greater than 1.
The fixing assembly 114 is configured to fix the TOF sensor module to the electronic device. For example, the fixing assembly may be a bracket, and the TOF sensor module may be fixed to the electronic device by using the bracket; or the fixing assembly may be a mechanical part formed by another component in the electronic device (for example, a middle frame in a mobile phone); or the fixing assembly may be various adhesives or connectors (for example, solders and screws).
In embodiments of this application, if there is no special description or logical conflict, and terms and/or descriptions in different embodiments are consistent and may be mutually referenced, technical features in the different embodiments may be combined to form a new embodiment based on an internal logical relationship.
In this application, the term “and/or” describes an association relationship between associated objects and may indicate that three relationships may exist. For example, A and/or B may indicate the following cases: Only A exists, both A and B exist, and only B exists, where A and B may be singular or plural. “At least one of the following items (pieces)” or a similar expression thereof means any combination of these items, including a single item (piece) or any combination of a plurality of items (pieces). For example, at least one of a, b, and c (pieces) may represent a, b, c, “a and b”, “a and c”, “b and c”, or “a, b, and c”, where a, b, and c may be singular or plural. In the text descriptions of this application, the character “/” usually indicates an “or” relationship between associated objects. It can be understood that, in this application, “uniform” does not mean absolute uniformity, but a specific engineering error may be allowed.
It may be understood that various numerals used in this application are merely differentiated for ease of description, but are not used to limit the scope of embodiments of this application. Sequence numbers of the foregoing processes do not mean execution sequences. The execution sequences of the processes should be determined based on functions and internal logic of the processes. The terms “first”, “second”, and the like are intended to distinguish between similar objects, but do not necessarily describe a particular order or sequence. Moreover, the terms “include”, “have”, and any other variant thereof are intended to cover a non-exclusive inclusion, for example, including a series of steps or units. Methods, systems, products, or devices are not necessarily limited to those explicitly listed steps or units, but may include other steps or units that are not explicitly listed or that are inherent to such processes, methods, products, or devices.
Although this application is described with reference to specific features and the embodiments thereof, it is clear that various modifications and combinations may be made to them without departing from the spirit and scope of this application. Correspondingly, the specification and accompanying drawings are merely example descriptions of the solutions defined by the appended claims, and are considered as covering any or all modifications, variations, combinations, or equivalents within the scope of this application.
Obviously, a person skilled in the art can make various modifications and variations to this application without departing from the spirit and scope of this application. This application is intended to cover these modifications and variations to embodiments of this application provided that they fall within the scope of protection defined by the following claims in this application and their equivalent technologies.
Number | Date | Country | Kind
---|---|---|---
202010077087.8 | Jan 2020 | CN | national
202010873829.8 | Aug 2020 | CN | national
This application is a continuation of International Application No. PCT/CN2020/134053, filed on Dec. 4, 2020, which claims priority to Chinese Patent Application No. 202010077087.8, filed on Jan. 23, 2020 and Chinese Patent Application No. 202010873829.8, filed on Aug. 26, 2020. All of the aforementioned patent applications are hereby incorporated by reference in their entireties.
 | Number | Date | Country
---|---|---|---
Parent | PCT/CN2020/134053 | Dec 2020 | US
Child | 17870941 | | US