The present application claims priority to Chinese Patent Application No. 202011189912.X, titled “DETECTION METHOD AND DETECTION SYSTEM”, Chinese Patent Application No. 202011185790.7, titled “DETECTION METHOD AND DETECTION SYSTEM USING SAME”, and Chinese Patent Application No. 202011187367.0, titled “DETECTION METHOD AND DETECTION SYSTEM USING SAME”, filed on Oct. 30, 2020 with the Chinese Patent Office, all of which are incorporated herein by reference in their entireties.
The present disclosure relates to the technical field of lidar detection, and in particular to a detection method and a detection system.
With the development of lidar technology, the time-of-flight (TOF) technique has received increasing attention. The principle of TOF is as follows. A light pulse is continuously emitted toward an object, the light returned from the object is received by a sensor, and the distance to the object is obtained by measuring the round-trip flight time of the light pulse. The commonly used methods include the direct time-of-flight method and the indirect time-of-flight method. In the direct time-of-flight method, the distance to the object is obtained from the direct time difference between the emitted light and the return light. In the indirect time-of-flight method, a phase difference between the emitted light and the return light is measured, the flight time is derived from the phase difference, and the distance to the object is then calculated. Currently, the light source of the detection system is generally implemented by a vertical-cavity surface-emitting laser (VCSEL), with multiple laser emitting units arranged in an array. The laser source emits a detection light, which is reflected by the detected object and enters the detection unit, where it is converted into photogenerated charges. The target distance or target image of the detected object is then obtained by a subsequent signal processing circuit to complete the detection.
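As an illustration of the direct time-of-flight principle described above, the measured round-trip time maps to a one-way distance as d = c·t/2. A minimal sketch (the function name is our own, not from the disclosure):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def direct_tof_distance(round_trip_time_s):
    """Direct TOF: the light travels to the object and back,
    so the one-way distance is half of c * t."""
    return C * round_trip_time_s / 2.0

# A 20 ns round trip corresponds to roughly 3 m.
d = direct_tof_distance(20e-9)
```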
In a normal distance measurement process, the distance to the detected object can be obtained without contacting the object by the direct or indirect time-of-flight method. In the case of multi-point detection, the shape and contour of the object may be obtained through multi-point measurement settings, and a three-dimensional image is output by post-processing. In an actual detection process, especially in distance measurement, the emitted light of the array light source is required to have a certain diffusion angle to ensure a sufficient view field. However, when the view field is thus ensured, the light projected by the light source has a large diffusion angle, and the detection objects in the view field become more complex. For example, in an autonomous driving scenario, the view field contains roads, obstacles, people and other complex components. In a sweeping robot application, it contains the ground, wall corners, obstacles and so on. In the view field of a security camera, it contains the ground, corners, people, and so on. The above are only examples of several application scenarios, which are not limiting. In such cases, the following may occur. Suppose there are an obstacle A and a detected object B in the view field. The light source emits a detection laser light, and the emitted light is partially reflected by the detected object B back to the detector. However, another part of the emitted light reflected by the obstacle A does not return directly to the detector, but is first reflected toward the detected object B and returns to the detector after being reflected by the detected object B. This phenomenon is especially serious in a case that there are highly reflective objects in the view field, or in a case that the detector carrier is near a corner.
This phenomenon causes false distance values in the output of the detector array. The interference it causes to the detection result is known as multipath interference, and it severely limits the application of the detector and the realization of accurate detection. In patent No. CN205621076U (titled “DIMENSIONAL MARKING SYSTEM WITH MULTIPATH INTERFERENCE MITIGATION”), a method for mitigating this multipath interference is proposed, implemented by an adaptive adjustment structure in the light beam path, for example, an adjustable lens. By adaptively detecting basic information of the view field and adjusting the projection beam of the emitted light, the diffusion angle of the projected light is limited, so that the object or target of interest is projected emphatically, thereby obtaining accurate detection results and eliminating the influence of the multipath phenomenon. This method has certain practicability, but is restricted in scenarios with multiple objects in the view field, and the diffusion angle is required to be adjusted during image acquisition or processing, making the whole solution more complex. Therefore, it is an urgent problem in the existing technology how to achieve a diffusion angle sufficient to ensure an adequate view field while outputting the emitted light only to specific regions, so as to project a constrained light beam onto the object of interest and thereby weaken or eliminate the multipath phenomenon.
A detection method and a detection system are provided in the present disclosure, by which targeted projection can be performed on different regions to weaken or eliminate the multipath interference mentioned in the section of BACKGROUND.
Technical solutions in embodiments of the present disclosure are provided as follows.
In a first aspect, a detection method is provided according to an embodiment of the present disclosure. The detection method is performed by a detection system including a light emitting module, a processing module and a light receiving module. The light emitting module includes N emitting regions, where N is an integer greater than or equal to 2, and lights respectively emitted from every two adjacent emitting regions among the emitting regions have different polarization angles. The light receiving module includes N receiving regions respectively corresponding to the N emitting regions in the light emitting module, and every two adjacent receiving regions among the receiving regions receive a light returned from a detected object at different polarization angles. The detection method includes: receiving, by at least some of the receiving regions in the light receiving module in at least part of a time period, the return light reflected by the detected object with different polarization angles that is outputted by at least two emitting regions; generating a photogenerated electrical signal by the light receiving module under excitation of the return light with at least one of the polarization angles after being partially or fully filtered, corresponding to at least a part of a region of the return light with different polarization angles; and processing, by the processing module, the photogenerated electrical signal generated by the light receiving module under the excitation of the return light after being filtered, to obtain final target information of the detected object.
In an embodiment, the detection system further includes a polarization module arranged upstream of the light receiving module in the direction of the return light, and the polarization module includes N polarization filtering regions respectively corresponding to the N emitting regions in the light emitting module. The method further includes: receiving, by at least one of the polarization filtering regions of the polarization module in the at least part of the time period, the return light with different polarization angles reflected by the detected object that is outputted by the at least two emitting regions, where the at least one polarization filtering region is used to partially or fully filter the return light with the at least one polarization angle, and the light receiving module receives the return light filtered by the polarization module and outputs the excited photogenerated electrical signal.
In an embodiment, the emitting lights of the N emitting regions correspond to regions in a view field, and the emitting lights of at least two adjacent emitting regions have an overlapping region.
In an embodiment, the emitting lights of the N emitting regions correspond to multiple regions in a view field, and in at least part of the time period, one of the detected targets in one of the regions in the view field reflects at least part of the emitting light with a first polarization angle to another detected target in another region.
In an embodiment, some receiving regions among the N receiving regions receive a light with at least two different polarization angles returned by the detected object.
In an embodiment, in at least part of the time period, the N receiving regions receive an echo obtained by one of the detected targets in one of the regions in the view field reflecting at least part of the emitting light with a first polarization angle to another detected target in another region.
In an embodiment, the light receiving module includes one or more image planes, and the light receiving module is provided with a photogenerated electrical signal conversion section configured at one of the image planes of the return light, and k image planes exist before the return light reaches the light receiving module, where k is an integer greater than or equal to 1.
In an embodiment, the polarization module is located at at least one of the k image planes.
In an embodiment, a proportion of an energy of the return light of the filtered polarization angle to a total return light energy of the polarization angle is not less than 20%.
In an embodiment, N is an even number greater than or equal to 2.
In an embodiment, N is 4, and two diagonally arranged emitting regions emit the light with polarization angles differing by 45° or 90°, and the light receiving module is provided with 4 receiving regions at polarization angles corresponding to the emitting regions.
In an embodiment, the polarization module is provided with 4 polarization filtering regions at the polarization angles corresponding to the emitting regions.
In a second aspect, a detection system is provided according to an embodiment of the present disclosure, to perform the detection method described in the first aspect above. The detection system includes a light emitting module, a processing module, and a light receiving module. The light emitting module includes N emitting regions, where N is an integer greater than or equal to 2, and lights respectively emitted from every two adjacent emitting regions among the emitting regions have different polarization angles. The light receiving module includes N receiving regions respectively corresponding to the N emitting regions in the light emitting module, and every two adjacent receiving regions among the receiving regions receive a light returned from a detected object at different polarization angles. At least some of the receiving regions in the light receiving module are configured to: in at least part of a time period, receive the return light reflected by the detected object with different polarization angles that is outputted by at least two emitting regions. The light receiving module is configured to generate a photogenerated electrical signal under excitation of the return light with at least one of the polarization angles after being partially or fully filtered, corresponding to at least a part of a region of the return light with different polarization angles. The processing module is configured to process the photogenerated electrical signal generated by the light receiving module under the excitation of the return light after being filtered, to obtain final target information of the detected object.
According to the detection method provided in the embodiment of the present disclosure, the detection method is performed by a detection system including a light emitting module, a processing module, and a light receiving module. The light emitting module includes N emitting regions, where N is an integer greater than or equal to 2, and lights respectively emitted from every two adjacent emitting regions among the emitting regions have different polarization angles. The light receiving module includes N receiving regions respectively corresponding to the N emitting regions in the light emitting module, and every two adjacent receiving regions among the receiving regions receive a light returned from a detected object at different polarization angles. The detection method includes: receiving, by at least some of the receiving regions in the light receiving module in at least part of a time period, the return light reflected by the detected object with different polarization angles that is outputted by at least two emitting regions; generating a photogenerated electrical signal by the light receiving module under excitation of the return light with at least one of the polarization angles after being partially or fully filtered, corresponding to at least a part of a region of the return light with different polarization angles; and processing, by the processing module, the photogenerated electrical signal generated by the light receiving module under the excitation of the return light after being filtered, to obtain final target information of the detected object.
In this way, when the detection method is applied to object distance acquisition, the emitting light source of the detector is divided into N different regions, and the lights emitted from every two adjacent emitting regions have different polarization angles. This is equivalent to dividing the emitting VCSEL array such that every two adjacent emitting regions have different polarization angles. In this case, the lights outputted from two adjacent light emitting regions have different polarization angles, so the light emitted from each region has identifiable characteristics. By setting N regions, the projection view field of the emitting light of each region is reduced to 1/N of the whole view field, while the N emitting regions together form the complete view field. Therefore, with the proposed solution, targeted projection can be achieved on the premise of ensuring the detection range of the view field. With a polarization filter structure arranged in the light path between the return light and the receiving end, or on the receiving end itself, the reflected return light corresponding to the directed emitted light can be identified, thus achieving key object detection in key regions and reducing or even eliminating the multipath interference.
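The partitioning described above can be illustrated for the N = 4 case, in which every two adjacent regions differ and the diagonal pairs differ by 90°. The concrete angle values below are an assumed example, not mandated by the disclosure:

```python
def polarization_layout_n4():
    """One possible 2x2 layout of polarization angles (in degrees) for
    N = 4 emitting regions: every two edge-adjacent regions differ, and
    the two diagonal pairs differ by 90 degrees (assumed example)."""
    return [[0, 45],
            [135, 90]]

def angle_diff(a, b):
    """Difference between two polarization angles, taken mod 180 degrees
    since a polarization direction has no sign."""
    d = abs(a - b) % 180
    return min(d, 180 - d)

grid = polarization_layout_n4()
```

For this layout, `angle_diff` confirms that all edge-adjacent pairs differ by 45° while both diagonal pairs differ by 90°, matching the N = 4 embodiment.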
Details of one or more embodiments of the present disclosure are presented in the drawings and description below. Other features, objects and advantages of the present disclosure are apparent from the specification, drawings and claims.
In order to illustrate technical solutions of embodiments of the present disclosure more clearly, the drawings used for the embodiments are briefly introduced in the following. It should be understood that the drawings show only some embodiments of the present disclosure, and should not be regarded as a limitation of the scope. Other drawings may be obtained by those skilled in the art from these drawings without any creative work.
In order to make objects, technical solutions and advantages of the embodiments of the present disclosure clearer, the technical solutions in the embodiments of the present disclosure are clearly and completely described below with reference to the drawings in the embodiments of the present disclosure. Apparently, the described embodiments are some but not all embodiments of the present disclosure. Components of the embodiments generally described and illustrated in the drawings herein may be arranged and designed in a variety of different configurations.
Therefore, the following detailed description for the embodiments of the present disclosure provided in the drawings is not intended to limit the scope of the present disclosure as claimed, but is merely representative of selected embodiments of the present disclosure. Based on the embodiments in the present disclosure, all other embodiments obtained by those skilled in the art without creative work shall fall in the protection scope of the present disclosure.
It should be noted that, similar numerals and letters refer to similar items in the following drawings. Therefore, if an item is defined in a drawing, the item is not required to be further defined and explained in subsequent drawings.
φ = ArcTan((A1−A3)/(A2−A4))    (1)
The ratio of the difference between A1 and A3 to the difference between A2 and A4 is equal to the tangent of the phase angle. ArcTan here denotes a bivariate inverse tangent function that maps the result to the appropriate quadrant; with this convention, the phase is 90° or 270° in the case of A2=A4 and A1>A3 or A3>A1, respectively.
The distance to the object is determined by the following equation.
d = c·φ/(4π·f)    (2)
Further, the modulation frequency of the emitted laser is required to be known to perform the distance measurement, where c represents the speed of light, φ represents the phase angle (measured in radians), and f represents the modulation frequency. With the above solution, distance detection for the detected object in the view field can be achieved. However, part of the light arrives at the object via other paths, which results in interference to the distance measurement signal as shown in
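Equations (1) and (2) can be sketched directly in code. The four correlation samples A1..A4 and the 20 MHz modulation frequency below are assumed example values, not taken from the disclosure:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def phase_from_samples(a1, a2, a3, a4):
    """Equation (1): tan(phi) = (A1 - A3) / (A2 - A4).
    atan2 resolves the correct quadrant and the A2 = A4 case."""
    return math.atan2(a1 - a3, a2 - a4) % (2 * math.pi)

def distance_from_phase(phi, f_mod):
    """Equation (2): d = c * phi / (4 * pi * f)."""
    return C * phi / (4 * math.pi * f_mod)

# Example: samples produced by a return with phase pi/2
# (A1 - A3 positive, A2 - A4 zero).
phi = phase_from_samples(1.0, 0.5, 0.0, 0.5)
d = distance_from_phase(phi, 20e6)  # unambiguous up to c / (2 f)
```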
In the solution mentioned in the section of BACKGROUND, key regions are pre-identified to perform key projection. In this way, multipath interference can be avoided by not projecting onto interference regions, so that a detection result free of the multipath interference is obtained. However, this solution has problems of efficiency and complexity. Therefore, it is helpful to design a solution for targeted emission to reduce or eliminate the multipath interference. A partitioned emission solution is given by previous studies to solve this problem. In the filed Chinese patent application No. 202011040936.9, titled “DEVICE AND METHOD FOR MEASURING DISTANCE BY TIME OF FLIGHT”, a solution is proposed in which the emitting end is partitioned to emit in time-division, and the distance information of the detected object in the complete view field is obtained by synthesis. This overcomes the problem of the limited view field in the solution shown in the section of BACKGROUND, and provides a feasible way to eliminate the multipath interference under large view field requirements by entering the mode through manual selection, self-adaption, or the like. In order to further ensure efficient and continuous elimination and reduction of the multipath interference, further studies are made in the present disclosure. In the distance measurement process of an ordinary TOF sensor, if both the emitting end and the receiving end are divided into N regions (where N is an integer greater than or equal to 2), these regions are in fact conjugate in a group of imaging systems, that is, the light emitted from a region A at the emitting end forms a real image on a region A at the receiving end through the receiving system (the receiving system herein includes but is not limited to lens imaging, pinhole imaging, etc.).
In other words, in the entire view field, the emitting end is divided into N different regions, and the lights emitted from these N emitting regions correspond to N regions in the view field. In the normal detection process, due to the correspondence of imaging, the receiving end correspondingly receives the reflected return light from each different region of the view field, that is, the receiving end is correspondingly divided into N regions. Regarding the introduction of the multipath interference, the following description is further given with reference to
The polarization processing at the receiving end is performed to differentiate the multipath effect introduced at the emitting end. The pixel-level processing is performed by adding a line grating on the pixel, where parameters such as the grating material, period, and line width depend on the wavelength of the signal light. The line grating is mainly used to fully or partially filter the light whose polarization direction does not match that of the light emitted for the corresponding region. That is, at least one region of the light receiving module receives return light with different polarization angles from at least two emitting regions, and the grating can fully or partially filter the interference light caused by the multipath phenomenon whose polarization angle does not correspond to the region.
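The filtering behavior of such a polarization region can be approximated by Malus's law for an ideal linear polarizer. This is a simplification: a real line-grating (wire-grid) filter has a finite extinction ratio, so the suppression is partial rather than total.

```python
import math

def transmitted_fraction(light_angle_deg, filter_angle_deg):
    """Malus's law: an ideal linear polarizer passes cos^2 of the
    angle between the light's polarization and the filter axis."""
    delta = math.radians(light_angle_deg - filter_angle_deg)
    return math.cos(delta) ** 2

# A receiving region aligned with its own emitting region (0 deg)
# passes the intended return fully, suppresses 90-deg crosstalk
# almost completely, and passes half of 45-deg light.
own = transmitted_fraction(0, 0)
cross = transmitted_fraction(90, 0)
half = transmitted_fraction(45, 0)
```

This shows why adjacent regions differing by 90° give the strongest rejection, while a 45° offset still attenuates half of the crosstalk energy.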
In
In a target system as shown in
It can be seen from the results of the above table that, with the solution of the present disclosure, the proportion of the return light that is required to be filtered out, that is, the return light with the polarization angle generating crosstalk for other regions due to the multipath interference, to the total return light of that polarization angle is not less than 50%, thereby weakening or eliminating the multipath interference.
It is assumed that 10% of the light from each region generates crosstalk for other regions and is reflected back to the corresponding region at the receiving end. By the polarization receiving end as shown in
When a flight time delay of 20 ns is assumed, the regional delay results of different regions under the multipath influence are obtained, as shown in Table 3 below.
Further, statistics are performed on the measurement results of the whole system to obtain the light path difference statistics caused by the multipath effect, as shown in Table 4 below.
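Errors of the kind tabulated in Tables 3 and 4 can be reproduced with a simple phasor model of an indirect-TOF pixel: the direct return and a delayed multipath echo add as phasors at the modulation frequency, shifting the measured phase and hence the reported distance. The 20 MHz modulation frequency, 10% crosstalk amplitude, and 20 ns extra delay below are assumed example values consistent with the text, not figures from the tables:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def measured_distance(f_mod, t_direct, t_multi, alpha):
    """Distance reported by an indirect-TOF pixel when a multipath
    echo of relative amplitude alpha and delay t_multi is added to
    the direct return of delay t_direct (phasor superposition)."""
    phi_d = 2 * math.pi * f_mod * t_direct
    phi_m = 2 * math.pi * f_mod * t_multi
    re = math.cos(phi_d) + alpha * math.cos(phi_m)
    im = math.sin(phi_d) + alpha * math.sin(phi_m)
    phi = math.atan2(im, re) % (2 * math.pi)
    return C * phi / (4 * math.pi * f_mod)  # equation (2)

f = 20e6                   # assumed modulation frequency
t0 = 20e-9                 # direct round trip: 20 ns -> ~3 m
d_true = measured_distance(f, t0, t0, 0.0)
d_raw = measured_distance(f, t0, t0 + 20e-9, 0.10)       # 10% crosstalk
d_filtered = measured_distance(f, t0, t0 + 20e-9, 0.05)  # 50% of it filtered
```

In this model, halving the crosstalk amplitude by polarization filtering roughly halves the distance error, consistent with the qualitative conclusion drawn from the tables.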
In a case that no crosstalk is generated, the signals respectively received in the four regions A, B, C, and D at the receiving end, or in four different receiving regions corresponding to the return light filtered by a polarization section of a separate image plane, are shown in
It can be seen from the above results that, with this solution, the error caused by the multipath phenomenon is significantly reduced after at least 50% of the multipath interference light of at least one polarization angle is filtered in at least one region. In practice, the multipath interference is not limited to occurring in one plane, and the advantages brought by the design of the present disclosure are similar in other cases. The accounting and the effect are given for one case, and the multipath phenomenon is not limited to this assumption. Since the multipath interference light remaining after the polarization filtering is different from the actual return light, it may be identified by subsequent processing, which is not limited herein.
In summary, the solution of the present disclosure has at least the following technical effects.
It should be noted that, relational terms such as “first” and “second” herein are only used to distinguish one entity or operation from another entity or operation, and do not necessarily require or imply there is such actual relationship or sequence between these entities or operations. Moreover, terms “comprising”, “including” or any other variations thereof are intended to encompass a non-exclusive inclusion, such that a process, a method, an article or a device including a series of elements includes not only those elements, but also includes other elements that are not explicitly listed or inherent to such a process, method, article or device. Without further limitation, an element defined by a phrase “including a . . . ” does not preclude the presence of additional identical elements in a process, method, article or device including the element.
It should further be noted that the terms “module,” “unit,” and “component” as used in this specification are intended to denote a computer-related entity, which may be implemented by hardware, software, a combination of hardware and software, or software in execution. For example, the component may be but is not limited to, a process running on a processor, a processor, an object, an executable code, an executed thread, a program, or a computer. As an illustration, both the application running on the server and the server may be components. One or more components may reside in a process and/or an executed thread, and the components may be located in a computer or distributed between two or more computers.
Preferred embodiments of the present disclosure are given in the above description, and are not intended to limit the present disclosure. For those skilled in the art, the present disclosure may have various modifications and changes. Any modifications, equivalents and improvements made in the spirit and principle of the present disclosure should be included in the protection scope of the present disclosure.
Number | Date | Country | Kind |
---|---|---|---|
202011185790.7 | Oct 2020 | CN | national |
202011187367.0 | Oct 2020 | CN | national |
202011189912.X | Oct 2020 | CN | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/CN2021/126590 | 10/27/2021 | WO |