This disclosure generally relates to a time of flight sensor and, more particularly, to a time of flight sensor that pre-stores a deviation compensation and a deviation correction associated with temperature variation for calibrating the detection deviation caused by different operating temperatures, and to a temperature compensation method thereof.
Referring to
Referring to
However, the light source 11 generally has temperature dependency. Under different operating temperatures, different modulation parameters are generated, caused by, for example, the variation of the clock duty Tduty and the drift of the frequency Tfreq, as shown
It is possible to compensate this deviation by using a light source having lower temperature dependency or by embedding a temperature sensor to measure the operating temperature, but these methods do not provide effective compensation for a time of flight sensor adopting a modulated light source.
Accordingly, it is necessary to provide a time of flight sensor that can effectively compensate the temperature dependency of its light source.
The present disclosure provides a time of flight sensor and a temperature compensation method thereof that use a reference pixel to compensate the temperature dependency of the detection result of an active pixel.
The present disclosure provides a time of flight sensor including a light source, a light sensor and a processor. The light source is arranged in a first accommodation space, and configured to illuminate light according to a light source driving signal. The light sensor includes a first pixel, arranged in the first accommodation space, and a second pixel, arranged in a second accommodation space adjacent to the first accommodation space, and configured to generate output signals according to a sampling signal. The processor is configured to calibrate, at an operating temperature, a current distance using a predetermined detection compensation and a predetermined detection correction obtained according to a second reference phase-distance relationship associated with the second pixel at a reference temperature.
The present disclosure further provides a time of flight sensor including a light source, a light sensor and a processor. The light source is arranged in a first accommodation space, and configured to illuminate light according to a light source driving signal. The light sensor includes a first pixel, arranged in the first accommodation space, and a second pixel, arranged in a second accommodation space adjacent to the first accommodation space, and configured to generate output signals according to a sampling signal. The processor is configured to, at an operating temperature, calculate an operation phase-distance relationship associated with the first pixel, calculate a temperature compensation and a temperature correction according to a predetermined first reference phase-distance relationship associated with the first pixel at a reference temperature and the operation phase-distance relationship, and calibrate a current phase using the temperature compensation and the temperature correction.
The present disclosure further provides a time of flight sensor including an encapsulation, a light source, a first pixel and a second pixel. The encapsulation has a first accommodation space, a second accommodation space and a top cover, wherein the first accommodation space has a first opening, and the second accommodation space has a second opening. The light source is arranged in the first accommodation space and exposed by the first opening.
The first pixel is arranged in the first accommodation space and is not exposed by the first opening, and is configured to receive reflected light projected by the light source and reflected by the top cover. The second pixel is arranged in the second accommodation space, and configured to receive reflected light formed by emission light projected by the light source to penetrate the first opening of the first accommodation space, reflected by an external object, and then penetrating the second opening of the second accommodation space.
Other objects, advantages, and novel features of the present disclosure will become more apparent from the following detailed description when taken in conjunction with the accompanying drawings.
It should be noted that, wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts.
The time of flight (TOF) sensor of the present disclosure adopts an additional reference pixel that is used to previously record a temperature compensation and a temperature correction in the memory. The temperature compensation and the temperature correction are used to calibrate a phase and a distance obtained in actual operation. There are two stages before the actual operation. The first stage is a setting stage before shipment, in which a reference phase-distance relationship associated with the reference pixel is recorded at a reference temperature. The second stage is a booting stage before each actual operation, in which an operation phase-distance relationship associated with the reference pixel is recorded at an operating temperature. The temperature compensation and the temperature correction are calculated according to the reference phase-distance relationship and the operation phase-distance relationship to calibrate a temperature variation between the reference temperature and the operating temperature. In this way, the deviation caused by the temperature variation is effectively eliminated.
Referring to
The light source 31 emits light of an identifiable spectrum to illuminate an object outside the encapsulation 39. The light source 31 is a coherent light source or a non-coherent light source, e.g., a light emitting diode or a laser diode.
The light sensor 33 includes, e.g., a CMOS sensor which has a first pixel 331 and a second pixel 332, wherein the first pixel 331 and the second pixel 332 respectively include at least one photodiode or at least one single photon avalanche diode (SPAD) used to detect optical energy and output electrical signals. It should be mentioned that although
The encapsulation 39 has a first accommodation space 391 and a second accommodation space 393. The first accommodation space 391 is connected to a first opening and covered by a top cover surrounding the first opening, and is used to accommodate the first pixel 331 and the light source 31, wherein the first pixel 331 receives reflected light from the top cover (e.g., located upon the first pixel 331) of the encapsulation 39 illuminated by the light source 31. As the first pixel 331 and the light source 31 are both disposed in the first accommodation space 391 and close to each other, a time of flight of emission light from being emitted from the light source 31 till propagating to the first pixel 331 is considered to be substantially zero. The second accommodation space 393 is connected to a second opening, and is used to accommodate the second pixel 332.
More specifically, a part of emission light from the light source 31 is reflected inside the first accommodation space 391 to be received by the first pixel 331, and another part of emission light is reflected, after penetrating the first opening of the first accommodation space 391, by an external object outside the encapsulation 39 and then received by the second pixel 332 after entering the second opening of the second accommodation space 393.
The encapsulation 39 preferably has an isolation wall 395, e.g., extending downward from the top cover to a surface of the base layer as shown in
In some aspects, the first opening of the first accommodation space 391 is further arranged with a filter to block light outside the emission spectrum of the light source 31 to reduce the interference from ambient light to the light detection of the first pixel 331; and the second opening of the second accommodation space 393 is further arranged with a filter to block light outside the emission spectrum of the light source 31 to reduce the interference from ambient light to the light detection of the second pixel 332.
Referring to
The time control circuit further generates a sampling signal Sd1 to the light sensor 33. The sampling controller 47 of the light sensor 33 reads charges (e.g., by a correlated double sampling, but not limited thereto) in the first pixel 331 and the second pixel 332 according to the sampling signal Sd1 to generate output signals SO1 and SO2, respectively. The time control circuit further controls a time delay between the sampling signal Sd1 and the light source driving signal Sd2_d and Sd2_nd. That is, the delay in the present disclosure refers to whether the light source driving signal has a time delay with respect to the sampling signal. The processor 49 calculates the compensation, the correction, the phase and the distance according to the output signals SO1 and SO2. An example will be illustrated hereinafter.
In the present disclosure, the processor 49 is selected from an application specific integrated circuit (ASIC) and a digital signal processor (DSP), and is arranged, for example, in the light sensor 33 to perform the calculation using software, firmware and/or hardware.
In one aspect, the time control circuit includes, for example, a time controller 41, a first delay circuit 431 and a second delay circuit 433. The time controller 41 generates a timing signal to the first delay circuit 431 and the second delay circuit 433. For example referring to
In one aspect, the sampling controller 47 or other circuit of the light sensor 33 further generates an inverse sampling signal Sd1_inv (e.g., referring to
It is appreciated that although the timing signal Ssyn, the sampling signal Sd1 as well as the light source driving signal Sd2_nd and Sd2_d are illustrated by square waves in the drawings, the present disclosure is not limited thereto. In other aspects, the timing signal Ssyn, the sampling signal Sd1 as well as the light source driving signal Sd2_nd and Sd2_d respectively have other different waveforms according to different applications.
In the present disclosure, the TOF sensor 300 preferably includes a memory (e.g., volatile memory and/or non-volatile memory) used to previously (e.g., in a setting stage before shipment) record and store (1) a first reference phase-distance relationship associated with the first pixel 331 at a reference temperature; and (2) a detection compensation and a detection correction that are obtained according to a second reference phase-distance relationship associated with the second pixel 332 at the reference temperature. The memory further records and stores before actual operation (e.g., in a booting stage) (3) a temperature compensation and a temperature correction calculated according to an operation phase-distance relationship (obtained at an operating temperature) associated with the first pixel 331 and the first reference phase-distance relationship. These recorded parameters (1)-(3) are used to compensate and calibrate a detected phase and a detected distance during actual operation. An example will be given below for illustration purposes.
Referring to
At the reference temperature, when the sampling signal Sd1 and the light source driving signal Sd2_nd and Sd2_d have the first time delay Td0, the first pixel 331 generates a first reference output signal SO1_nd as shown in
For example, when the first pixel 331 is a single pixel, the first reference output signal SO1_nd and the second reference output signal SO1_d are respectively an output signal of the single pixel; whereas, when the first pixel 331 includes multiple pixels, the first reference output signal SO1_nd and the second reference output signal SO1_d are respectively a summation or an average of output signals of the multiple pixels, wherein the summation and the average are implemented by the circuit of the first pixel 331.
The processor 49 of the light sensor 33 or an external processor (e.g., an external computer may be used before shipment) obtains a first reference phase-distance relationship, e.g., a line LRr shown in
In one aspect, the line LRr is determined by using two reference points RPr1 and RPr2. For example, the processor 49 or the external processor obtains a first reference phase according to the first reference output signal SO1_nd, and obtains a first distance according to the first time delay Td0 so as to obtain a first reference point RPr1 at a phase-distance plane as shown in
Finally, a first reference point RPr1 is obtained according to the calculated first reference phase (i.e. as a longitudinal axis value of the first reference point RPr1) and the calculated first distance (i.e. as a transverse axis value of the first reference point RPr1).
Similarly, the processor 49 or the external processor obtains a second reference phase (e.g., using an equation: area of B2/(area of A2+area of B2)) according to the second reference output signal SO1_d, and obtains a second distance (e.g., using an equation: velocity of light×Tdelay/2) according to the second time delay Tdelay so as to obtain a second reference point RPr2 at the phase-distance plane as shown in
Next, the processor 49 or the external processor obtains the first reference phase-distance relationship LRr on the phase-distance plane according to a line connecting the first reference point RPr1 and the second reference point RPr2. The first reference phase-distance relationship LRr is then recorded in the memory.
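For illustration purposes only, the two-point determination of the line LRr described above can be sketched as follows. The function names and the example area/delay values are hypothetical and not part of the disclosure; the phase is taken as the area ratio B/(A+B), the distance of a time delay as velocity of light × Tdelay/2, and the first time delay Td0 is taken as zero here:

```python
C = 299_792_458.0  # velocity of light, m/s

def phase_from_areas(area_a: float, area_b: float) -> float:
    """Reference phase as the area ratio B/(A+B) of the sampled output signal."""
    return area_b / (area_a + area_b)

def distance_from_delay(t_delay: float) -> float:
    """Equivalent distance of a time delay: velocity of light x Tdelay / 2."""
    return C * t_delay / 2.0

def fit_line(point1, point2):
    """Line (slope, intercept) through two (distance, phase) points
    on the phase-distance plane."""
    (d1, p1), (d2, p2) = point1, point2
    slope = (p2 - p1) / (d2 - d1)
    return slope, p1 - slope * d1

# RPr1: first time delay Td0 (taken as 0 s), phase from the areas A1 and B1 of SO1_nd.
rpr1 = (distance_from_delay(0.0), phase_from_areas(3.0, 1.0))
# RPr2: second time delay Tdelay (10 ns here), phase from the areas A2 and B2 of SO1_d.
rpr2 = (distance_from_delay(10e-9), phase_from_areas(1.0, 1.0))
slope_lrr, intercept_lrr = fit_line(rpr1, rpr2)  # the relationship LRr to record
```

In this sketch the recorded relationship LRr is simply the (slope, intercept) pair, which is what the later calibration steps consume.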
At the reference temperature, when the sampling signal Sd1 and the light source driving signal Sd2_nd and Sd2_d have the first time delay Td0, the second pixel 332 generates a first detection output signal SO2_nd as shown in
Similarly, the second pixel 332 is a single pixel or includes multiple pixels, and used to directly output the first detection output signal SO2_nd and the second detection output signal SO2_d, or output a summation or an average of output signals of the multiple pixels to form the first detection output signal SO2_nd and the second detection output signal SO2_d.
Next, the processor 49 of the light sensor 33 or an external processor obtains a second reference phase-distance relationship according to the first detection output signal SO2_nd and the second detection output signal SO2_d, e.g., a line LAr shown in
Similarly, the line LAr is determined by using two detected points APr1 and APr2. For example, the processor 49 or the external processor obtains a first detected phase (e.g., a longitudinal axis value of first detected point APr1) according to the first detection output signal SO2_nd, and obtains a first detected distance (e.g., a transverse axis value of first detected point APr1) according to the first time delay Td0 and the predetermined time of flight Ttof (e.g., Toverall=Td0+Ttof) so as to obtain a first detected point APr1 at a phase-distance plane as shown in
Similarly, the processor 49 or the external processor obtains a second detected phase (e.g., a longitudinal axis value of second detected point APr2) according to the second detection output signal SO2_d, and obtains a second detected distance (e.g., a transverse axis value of second detected point APr2) according to the second time delay Tdelay and the predetermined time of flight Ttof (e.g., Toverall=Tdelay+Ttof) so as to obtain a second detected point APr2 at the phase-distance plane as shown in
For example, the detected distances are obtained using the equation: velocity of light×Toverall/2. These calculations have been described above, and thus are not repeated herein.
Next, the processor 49 or the external processor obtains a second reference phase-distance relationship LAr according to a line connecting the first detected point APr1 and the second detected point APr2 on the phase-distance plane. In one aspect, the second reference phase-distance relationship LAr is recorded in a memory. In another aspect, the processor 49 or the external processor obtains a detection compensation (e.g., a phase-axis intercept of LAr) and a detection correction (e.g., the correction to calibrate a slope of LAr to a slope of ideal line, e.g., dotted line shown in
After the above parameters are recorded or stored in the memory of the TOF sensor 300 before shipment, the setting stage is accomplished. It is appreciated that although the above descriptions use two points to determine LRr and LAr as an example, the present disclosure is not limited thereto. In other aspects, by setting multiple time delays Tdelay, it is possible to use multiple reference points to determine LRr and multiple detected points to determine LAr.
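Under one plausible reading of the setting stage described above, the detection compensation and the detection correction can be read off the line LAr as its phase-axis intercept and a factor calibrating its slope to the slope of the ideal line. This is an illustrative sketch only; the function name and the characterization of the ideal line by its slope are assumptions:

```python
def detection_parameters(slope_lar: float, intercept_lar: float,
                         slope_ideal: float):
    """Detection compensation: the phase-axis intercept of LAr.
    Detection correction: the factor that calibrates the slope of LAr
    to the slope of the ideal line (dotted line in the drawings)."""
    detection_compensation = intercept_lar
    detection_correction = slope_ideal / slope_lar
    return detection_compensation, detection_correction
```

Both values would then be recorded in the memory during the setting stage, in place of (or in addition to) the line LAr itself.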
Referring to
Similarly, the time control circuit controls the sampling signal Sd1 and the light source driving signal Sd2_nd and Sd2_d to sequentially have the first time delay Td0 and the second time delay Tdelay according to
At the operating temperature, the first pixel 331 generates a first operation output signal, which is similar to SO1_nd in
The processor 49 of the light sensor 33 obtains an operation phase-distance relationship, e.g., a line LRo in
Similarly, the line LRo is determined by using two reference operation points RPO1 and RPO2. For example, the processor 49 obtains a first operation phase (e.g., a longitudinal axis value of first reference operation point RPO1) according to the first operation output signal, and obtains a first distance (e.g., a transverse axis value of first reference operation point RPO1) according to the first time delay Td0 so as to obtain a first reference operation point RPO1 at a phase-distance plane as shown in
Next, the processor 49 obtains the operation phase-distance relationship LRo according to a line connecting the first reference operation point RPO1 and the second reference operation point RPO2. The operation phase-distance relationship LRo is then recorded in the memory.
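One plausible way to derive the temperature compensation Δoffset and the temperature correction Δslope from the recorded reference line LRr and operation line LRo, consistent with the (current phase − Δoffset)/Δslope calibration used later, is as an intercept shift and a slope ratio. The exact definitions are an assumption of this sketch, and the function name is hypothetical:

```python
def temperature_parameters(slope_lrr: float, intercept_lrr: float,
                           slope_lro: float, intercept_lro: float):
    """Temperature compensation (Δoffset): shift of the phase-axis intercept
    from the reference line LRr to the operation line LRo.
    Temperature correction (Δslope): ratio of the operation slope to the
    reference slope."""
    delta_offset = intercept_lro - intercept_lrr
    delta_slope = slope_lro / slope_lrr
    return delta_offset, delta_slope
```

At the reference temperature, LRo coincides with LRr, so Δoffset is 0 and Δslope is 1 and the calibration leaves the phase unchanged.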
Referring to
When the sampling signal Sd1 and the light source driving signal Sd2_nd and Sd2_d have the first time delay Td0, the second pixel 332 generates an operation detected signal, similar to SO2_nd as shown in
Using the same method mentioned above, the processor 49 calculates a current phase (e.g., a longitudinal axis value of current operation point APO) according to a ratio of areas of the operation detected signal, and calculates a current distance (e.g., a transverse axis value of current operation point APO) according to Toverall=Ttof so as to obtain a current operation point APO. In this stage, the object distance is a value to be measured. As shown in
Compensated distance = velocity of light × detection correction × [(current phase − Δoffset)/Δslope − detection compensation]/2
That is, the processor 49 calibrates the current distance according to the recorded temperature compensation, temperature correction, detection compensation and detection correction, wherein the current distance is calculated according to a time of flight currently detected. In other words, if the object is arranged just at the predetermined distance (i.e. the distance D for obtaining LRr and LAr before shipment), an operation line LAo shown in
In this case, by using the temperature compensation Δoffset and the temperature correction Δslope (obtained according to
In another aspect, the processor 49 calculates a current phase and a current distance according to the operation detected signal generated by the second pixel 332 when the sampling signal Sd1 and the light source driving signal Sd2_nd and Sd2_d have the second time delay Tdelay. That is, in the present disclosure the time delay used for calculating the current phase and the current distance is not particularly limited.
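The compensated-distance formula given above can be sketched directly as follows. This is a minimal illustration; the function name and the example parameter values are hypothetical, and the detection correction is assumed to carry the phase-to-time scaling so that the result is a distance:

```python
C = 299_792_458.0  # velocity of light, m/s

def compensated_distance(current_phase: float, delta_offset: float,
                         delta_slope: float, detection_compensation: float,
                         detection_correction: float) -> float:
    """Compensated distance = velocity of light x detection correction
    x [(current phase - Δoffset)/Δslope - detection compensation] / 2."""
    return C * detection_correction * (
        (current_phase - delta_offset) / delta_slope - detection_compensation
    ) / 2.0
```

The inner term (current phase − Δoffset)/Δslope maps the phase measured at the operating temperature back toward the reference temperature, and the detection compensation and detection correction then remove the deviation recorded at the setting stage.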
Referring to
Details of these steps of the temperature compensation method have been illustrated above, wherein the Steps S112-S114 are executed, for example, in a setting stage before shipment; the Steps S115-S118 are executed, for example, in a booting stage before actual operation; the Step S111 is executed in both the setting stage and the booting stage; and the Step S119 is executed according to a current phase measured according to a current time of flight in actual operation.
In addition, the above steps are mainly used to eliminate the phase deviation caused by temperature variation. If it is desired to eliminate the distance deviation caused by other factors, the temperature compensation method of the present disclosure preferably further includes the steps of: generating a first detection output signal SO2_nd by the second pixel 332 at the reference temperature and the first time delay Td0; generating a second detection output signal SO2_d by the second pixel 332 at the reference temperature and the second time delay Tdelay; obtaining a second reference phase-distance relationship LAr according to the first detection output signal SO2_nd and the second detection output signal SO2_d; calculating and recording a detection compensation and a detection correction according to the second reference phase-distance relationship LAr; and calibrating a current distance, obtained according to an operation detected signal, using the detection compensation and the detection correction. Details of these steps are also described above, and thus are not repeated herein.
In the present disclosure, it is assumed that the operating temperature does not change apparently within a short time, and thus the operation phase-distance relationship LRo is described as being obtained in the booting stage. However, if the operating temperature can change significantly within a short time, the operation phase-distance relationship LRo may be calculated and recorded at any time according to the requirement of the user, as long as it is recorded before the actual operation.
As mentioned above, the object distance detected by a conventional TOF sensor deviates due to environmental temperature change, and the conventional methods for compensating this deviation are not effective for a TOF sensor with a modulated light source. Therefore, the present disclosure further provides a time of flight sensor for compensating the temperature deviation (as shown in
Although the disclosure has been explained in relation to its preferred embodiment, it is not intended to limit the disclosure. It is to be understood that many other possible modifications and variations can be made by those skilled in the art without departing from the spirit and scope of the disclosure as hereinafter claimed.
The present application is a continuation application of U.S. application Ser. No. 18/129,078, filed on Mar. 31, 2023, which is a continuation application of U.S. application Ser. No. 16/936,777, filed on Jul. 23, 2020, the disclosure of which is hereby incorporated by reference herein in its entirety. To the extent any amendments, characterizations, or other assertions previously made (in this or in any related patent applications or patents, including any parent, sibling, or child) with respect to any art, prior or otherwise, could be construed as a disclaimer of any subject matter supported by the present disclosure of this application, Applicant hereby rescinds and retracts such disclaimer. Applicant also respectfully submits that any prior art previously considered in any related patent applications or patents, including any parent, sibling, or child, may need to be re-visited.
| | Number | Date | Country |
|---|---|---|---|
| Parent | 18129078 | Mar 2023 | US |
| Child | 18823748 | | US |
| Parent | 16936777 | Jul 2020 | US |
| Child | 18129078 | | US |