This application is based upon and claims the benefit of priority from the prior Japanese Patent Application No. 2022-043107, filed on Mar. 17, 2022, the entire contents of which are incorporated herein by reference.
An embodiment of the present invention relates to a distance measuring device and a distance measuring method.
A distance measuring device using the time-of-flight (ToF) method, which measures a distance to an object in a non-contact manner by radiating an optical signal to the object and receiving a reflected optical signal from the object, is widely used. In a distance measuring device using the ToF method, an optical signal emitted from a light emission unit is scanned in a one-dimensional direction or a two-dimensional direction, whereby distances to an object located within a predetermined area are measured, and a distance image is generated. Each of the pixels of the distance image corresponds to the reflected optical signal received by an individual one of light receiving elements.
Each light receiving element corresponding to one of the pixels receives a reflected optical signal from the direction corresponding to that light receiving element, and, in some cases, a plurality of objects are located in that direction. In this case, distance information of the plurality of objects is contained in one pixel, and there is a possibility that the individual objects can hardly be identified.
To address this issue, it is conceivable that, in a case where distance information of a plurality of objects is included in one pixel, the pixel is divided so that distance information is obtained for each divided pixel area. However, the orientations, shapes, and sizes of a plurality of objects located in the same direction are not necessarily the same. Therefore, if the division direction of the pixel is not appropriate, the position information of each object cannot be correctly measured.
According to one embodiment, a distance measuring device has:
a plurality of light receiving elements each of which receives a reflected optical signal reflected by an object; and
an image processor that generates a distance image in accordance with distances to the object, based on signal intensities and light reception timings of the reflected optical signal received by the plurality of light receiving elements,
wherein the image processor is configured to:
detect a direction of the object, based on at least either the signal intensities of the reflected optical signal received by the light receiving elements or the distances to the object measured based on the reflected optical signal; and
divide at least one or some of the pixels included in the distance image, based on the detected direction of the object.
Hereinafter, embodiments of a distance measuring device and a distance measuring method will be described with reference to the drawings. Hereinafter, main components of a distance measuring device will be mainly described, but the distance measuring device may have components and functions that are not illustrated or described. The following description does not exclude components and functions that are not illustrated or described.
(Basic Configuration of Distance Measuring Device)
At least a part of the distance measuring device 1 of
The light emission unit 2 emits an optical signal. The optical signal is, for example, a laser light signal having a predetermined frequency band and a predetermined pulse width. The laser light is coherent light having a uniform phase and frequency. The light emission unit 2 intermittently emits an optical signal in a pulse form at a predetermined cycle. The cycle at which the light emission unit 2 emits the optical signal is a time interval equal to or longer than a time required for a distance measurement unit 42 to measure a distance on the basis of one pulse of the optical signal.
The light emission unit 2 includes an oscillator 11, a light emission control unit 12, a light emitting element 13, a first drive unit 14, and a second drive unit 15. The oscillator 11 generates an oscillation signal in accordance with the cycle at which the optical signal is emitted. The first drive unit 14 intermittently supplies power to the light emitting element 13 in synchronism with the oscillation signal. The light emitting element 13 intermittently emits an optical signal on the basis of the power from the first drive unit 14. The light emitting element 13 may be a laser element that emits a single laser light beam or a laser unit that simultaneously emits a plurality of laser light beams. The light emission control unit 12 controls the second drive unit 15 in synchronism with the oscillation signal. The second drive unit 15 supplies a drive signal synchronized with the oscillation signal to the light control unit 3 in accordance with an instruction from the light emission control unit 12.
The light control unit 3 controls a traveling direction of the optical signal emitted from the light emitting element 13. In addition, the light control unit 3 controls the traveling direction of the optical signal on the basis of the drive signal from the second drive unit 15.
The light control unit 3 includes a first lens 21, a beam splitter 22, a second lens 23, and a scanning mirror 24.
The first lens 21 condenses the optical signal emitted from the light emission unit 2 and guides the optical signal to the beam splitter 22. The beam splitter 22 branches the optical signal from the first lens 21 in two directions and guides the optical signal to the second lens 23 and the scanning mirror 24. The second lens 23 guides the split light from the beam splitter 22 to the light receiving unit 4. The reason for guiding the optical signal to the light receiving unit 4 is to detect a light emission timing by the light receiving unit 4.
The scanning mirror 24 rotationally drives a mirror surface in synchronism with the drive signal from the second drive unit 15 in the light emission unit 2. This controls a reflection direction of the split light (optical signal) having passed through the beam splitter 22 and then entering the mirror surface of the scanning mirror 24. By rotationally driving the mirror surface of the scanning mirror 24 at a constant cycle on the basis of the drive signal from the second drive unit 15, the optical signal emitted from the light control unit 3 can be scanned in at least a one-dimensional direction within a predetermined range. By providing shafts for rotationally driving the mirror surface in two directions, it is also possible to scan the optical signal emitted from the light control unit 3 in a two-dimensional direction within a predetermined range.
In a case where an object 20 is present within a scanning area of the optical signal emitted from the distance measuring device 1, the optical signal is reflected by the object 20. At least a part of the reflected light reflected by the object 20 is received by the light receiving unit 4.
The light receiving unit 4 includes a photodetector 31, an amplifier 32, a third lens 33, a light receiving element 34, and an A/D converter 35. The photodetector 31 receives the light branched by the beam splitter 22 and converts the light into an electric signal. The photodetector 31 can detect the light emission timing of the optical signal. The amplifier 32 amplifies the electric signal that is output from the photodetector 31.
The third lens 33 forms an image of the laser light reflected by the object 20 on the light receiving element 34. The light receiving element 34 receives the laser light and converts it into an electric signal. The light receiving element 34 is, for example, a single-photon avalanche diode (SPAD). A SPAD operates an avalanche photodiode (APD) in Geiger mode and can output an electric signal obtained by photoelectrically converting a single received photon. In practice, a plurality of light receiving elements 34 are arranged in a one-dimensional or two-dimensional direction. One SPAD may constitute one pixel, or two or more SPADs may constitute one pixel. The unit of one pixel is also called a silicon photomultiplier (SiPM).
The A/D converter 35 samples the electric signal output from the light receiving element 34 at a predetermined sampling rate to perform A/D conversion, and generates a digital signal. Instead of the A/D converter 35, a time-to-digital converter (TDC) may be provided.
The signal processing unit 5 includes a histogram generation unit 41 and the distance measurement unit 42. On the basis of the digital signal generated by the A/D converter 35, the histogram generation unit 41 generates a histogram that is a temporal distribution of a signal intensity of a reflected optical signal received by the light receiving unit 4.
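As an illustrative sketch of this step (the function name, units, and bin parameters are assumptions of this illustration, not taken from the embodiment), the histogram generation unit 41 can be modeled as accumulating the A/D-converted samples into time bins:

```python
import numpy as np

def build_histogram(samples, sample_period_ns, bin_width_ns=1.0):
    """Accumulate A/D-converted samples of the received signal into a
    histogram of signal intensity versus time of arrival."""
    times_ns = np.arange(len(samples)) * sample_period_ns
    n_bins = int(np.ceil(times_ns[-1] / bin_width_ns)) + 1
    hist = np.zeros(n_bins)
    # Each sample contributes its intensity to the bin covering its time.
    np.add.at(hist, (times_ns // bin_width_ns).astype(int), samples)
    return hist
```

With a TDC instead of an A/D converter, the same histogram would instead count photon-arrival timestamps per bin.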
The distance measurement unit 42 determines a timing at which a time frequency of the histogram is at its maximum to be a light reception timing of the reflected optical signal, and measures the distance to the object on the basis of a time difference between the light reception timing and the timing at which the light emission unit 2 emits the optical signal. More specifically, the distance measurement unit 42 measures the distance to the object 20 on the basis of following Equation 1.
Distance = light speed × (light reception timing of reflected light − light emission timing of optical signal) / 2   (Equation 1)
The “light reception timing of reflected light” in Equation 1 is more precisely the light reception timing which is obtained from the histogram and at which the signal intensity of the reflected optical signal is the maximum.
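The peak search and Equation 1 together can be sketched as follows (the function names and the nanosecond time base are assumptions of this illustration, not part of the embodiment):

```python
import numpy as np

C_M_PER_S = 299_792_458.0  # speed of light in vacuum

def peak_time_ns(hist, bin_width_ns):
    """Light reception timing: the time of the bin at which the histogram
    (signal intensity of the reflected optical signal) is maximal."""
    return int(np.argmax(hist)) * bin_width_ns

def tof_distance_m(t_emit_ns, t_receive_ns):
    """Equation 1: distance = light speed x (reception - emission) / 2."""
    return C_M_PER_S * (t_receive_ns - t_emit_ns) * 1e-9 / 2.0
```

A round trip of 100 ns, for example, corresponds to a distance of roughly 15 m.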
The image processing unit 6 generates a distance image on the basis of the distances to the object measured by the distance measurement unit 42. The distance image includes a plurality of pixels, and each pixel represents distance information based on the reflected optical signal received by the corresponding light receiving element 34. The distance measuring device 1 according to the present embodiment is characterized by a processing operation of the image processing unit 6, and details thereof will be described later.
The control unit 7 controls the light emission unit 2, the light receiving unit 4, the signal processing unit 5, and the image processing unit 6. Specifically, the control unit 7 controls the timing at which the light emission unit 2 emits an optical signal, and also controls to cause the light receiving unit 4 to receive a reflected optical signal and to generate a digital signal. The control unit 7 may further control a time resolution with which the A/D converter 35 in the light receiving unit 4 performs A/D conversion. In addition, the control unit 7 controls to cause the histogram generation unit 41 in the signal processing unit 5 to generate a histogram and controls to cause the distance measurement unit 42 to measure the distance to the object. The control unit 7 further controls to cause the image processing unit 6 to generate a distance image.
At least a part of the distance measuring device 1 in
In the layout diagram of
The distance measuring device 1 of
In addition, the light receiving element 34 also receives an ambient optical signal such as sunlight. Since the signal level of the ambient optical signal is usually much smaller than the signal level of the reflected optical signal, the ambient optical signal can be removed by a filtering process or the like. However, since the signal level of the reflected optical signal from a distant object is accordingly smaller, it may be difficult to distinguish that signal from the ambient optical signal.
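One simple way to remove such an ambient floor, sketched here purely as an assumption (the embodiment does not specify the filtering process), is to estimate the floor from the histogram itself and clip:

```python
import numpy as np

def suppress_ambient(hist):
    """Estimate the ambient-light floor as the median bin value
    (reflected echoes occupy only a few bins) and subtract it."""
    floor = np.median(hist)
    return np.maximum(np.asarray(hist, dtype=float) - floor, 0.0)
```

For a distant object whose echo barely exceeds the floor, this kind of thresholding can also remove the echo itself, which is the difficulty noted above.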
The light receiving element 34 receives the first echo signal and the second echo signal from the first object A and the second object B located in the same direction as illustrated in
As described above, in the example of
A distance measuring device 1 according to a first embodiment has a block configuration similar to that in
Distance information for each pixel measured by the distance measurement unit 42 is input to the object direction detection unit 61. The object direction detection unit 61 detects a direction of the object on the basis of at least one of the signal intensity of the reflected optical signal received by the light receiving element 34 and a distance to the object measured on the basis of the reflected optical signal.
The pixel division unit 62 divides at least one or some pixels constituting the distance image on the basis of the direction of the object detected by the object direction detection unit 61. The pixel division unit 62 divides a pixel including distance information of a plurality of objects.
The object direction detection unit 61 detects the direction of at least one of the two or more objects on the basis of at least either the signal intensity of the reflected optical signal from the two or more objects received by the light receiving element 34 or the distances to the two or more objects measured on the basis of the reflected optical signal. The pixel division unit 62 divides a pixel including distance information of two or more objects on the basis of the direction of at least one of the two or more objects.
The pixel division unit 62 may perform pixel division depending on the positions of individual objects in a pixel including distance information of two or more objects. For example, as will be described later, in the case where the distance information of another object is included in an upper right part of one pixel, the upper right part may be divided off, and the distance information of that object may be allocated to it.
As described above, in the distance measuring device 1 according to the first embodiment, in the case where distance information of a plurality of objects is included in pixels constituting a distance image, a direction of an overlapping object is detected, and the pixels are divided in accordance with the direction of the object. Consequently, the pixels can be divided depending on how the objects overlap with each other, and the resolution of the distance image can be improved.
In the present embodiment, instead of dividing all the pixels constituting the distance image, the pixel division is performed only on the pixels including the distance information of a plurality of objects, so that the resolution of the distance image can be increased without excessively increasing the data amount of the distance image.
In a second embodiment, in the case where distance information of a plurality of objects is included in one pixel, an area of each object in the one pixel is calculated.
A distance measuring device 1 according to a second embodiment has a block configuration similar to that in
The object direction detection unit 61 in
The signal intensity detection unit 63 detects a signal intensity of the reflected optical signal received by the light receiving element 34. The signal intensity corresponds to luminance information of each pixel signal constituting the distance image.
The object area calculation unit 64 calculates, on the basis of a signal intensity detected by the signal intensity detection unit 63, an area of an object in the pixel corresponding to the light receiving element 34 that received the reflected optical signal.
The pixel division unit 62 divides the pixel on the basis of the direction of the object detected by the object direction detection unit 61 and the area of the object in the pixel calculated by the object area calculation unit 64.
In the case where distance information of a plurality of objects is included in one pixel, the object area calculation unit 64 calculates an area of a specific object on the basis of a result of calculating each of areas of the plurality of objects included in the one pixel or calculates the area of the specific object included in the one pixel without considering areas of objects other than the specific object included in the one pixel.
More specifically, the object area calculation unit 64 calculates the areas of the plurality of objects included in the one pixel on the basis of area percentages of the plurality of objects in the one pixel, or calculates the area of the specific object included in the one pixel without considering an area of any object other than the specific object included in the one pixel. In the case where distance information of a plurality of objects is included in one pixel, the object area calculation unit 64 calculates the area of each of the plurality of objects included in the one pixel by comparing the signal intensities of the reflected optical signal from the plurality of objects.
The object area calculation unit 64 may include a first calculation unit, a second calculation unit, and a third calculation unit.
For example, the first calculation unit calculates the proportion of the signal intensity of the light receiving element 34 corresponding to a second pixel in the case where the first object A is included in a part of the second pixel, to the signal intensity of the light receiving element 34 corresponding to a first pixel in the case where the distance information of the first object A is included in the entire area of the first pixel. The first calculation unit calculates the first term of the numerator on the right-hand side of Equation 5 to be described later.
For example, the second calculation unit calculates the proportion of the signal intensity of the light receiving element 34 corresponding to the second pixel in the case where the second object B is included in a part of the second pixel, to the signal intensity of the light receiving element 34 corresponding to a third pixel in the case where the second object B is included in the entire area of the third pixel. The second calculation unit calculates the second term in the numerator on the right-hand side of Equation 5 to be described later.
For example, the third calculation unit calculates an area proportion of at least one of the first object A and the second object B in the second pixel on the basis of the proportion calculated by the first calculation unit and the proportion calculated by the second calculation unit. The third calculation unit calculates Equation 5 to be described later.
The object area calculation unit 64 may further include a fourth calculation unit. For example, the fourth calculation unit calculates a value obtained by subtracting the proportion calculated by the second calculation unit from 1. The third calculation unit calculates an area proportion of the first object A in the second pixel on the basis of the proportion calculated by the first calculation unit and the proportion calculated by the fourth calculation unit. The fourth calculation unit performs an alternative calculation for the second term in the numerator on the right-hand side of Equation 5 to be described later.
The object area calculation unit 64 calculates an area proportion p3_A_area of the first object A in the third pixel px3 on the basis of following Equation 2. The area proportion p3_A_area is an area proportion of the first object A in the third pixel px3 when the area of the third pixel px3 is 1.
In Equation 2, the signal intensity of the reflected optical signal from the first object A in the third pixel px3 is p3_A_1st_return, the average value of the signal intensities of the fourth pixel px4 and the fifth pixel px5 is p4_p5_avr, the signal intensity of the reflected optical signal from the second object B in the third pixel px3 is p3_B_2nd_return, and the average value of the signal intensities of the first pixel px1 and the second pixel px2 is p1_p2_avr.
The first term of the numerator on the right-hand side of Equation 2 is a proportion of the signal intensity of the reflected optical signal from the first object A of the third pixel px3, to the average value of the signal intensities of the reflected optical signal from the first object A of the fourth pixel px4 and the fifth pixel px5. That is, the first term of the numerator on the right-hand side of Equation 2 represents the area proportion of the first object A in the third pixel px3.
The term inside the parentheses in the numerator on the right-hand side of Equation 2 is a value obtained by subtracting from 1 a proportion of the signal intensity of the reflected optical signal from the second object B of the third pixel px3, to the average value of the signal intensities of the reflected optical signal from the second object B of the first pixel px1 and the second pixel px2. That is, the term inside the parentheses in the numerator on the right-hand side of Equation 2 is a value obtained by subtracting from 1 the area proportion of the second object B in the third pixel px3, and represents the area proportion of the first object A in the third pixel px3.
As described above, Equation 2 calculates an average value of the area proportion of the first object A in the third pixel px3 obtained from the reflected optical signal of the first object A and the area proportion of the first object A in the third pixel px3 obtained from the reflected optical signal of the second object B.
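Reading the term-by-term description above literally, Equation 2 can be reconstructed as the following sketch (the reconstruction is inferred from the description, not quoted from the original equation; variable names follow the text):

```python
def p3_a_area(p3_A_1st_return, p4_p5_avr, p3_B_2nd_return, p1_p2_avr):
    """Equation 2 as described: average of two estimates of the area
    proportion of the first object A in the third pixel px3."""
    est_from_a = p3_A_1st_return / p4_p5_avr        # first term of numerator
    est_from_b = 1.0 - p3_B_2nd_return / p1_p2_avr  # parenthesized term
    return (est_from_a + est_from_b) / 2.0
```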
Instead of calculating the area proportion p3_A_area of the first object A in the third pixel px3 on the basis of Equation 2, a comparative example can be considered in which the area proportion p3_A_area of the first object A in the third pixel px3 is calculated on the basis of following Equation 3 or Equation 4.
Equation 3 is the first term of the numerator on the right-hand side of Equation 2 and is the proportion of the signal intensity of the reflected optical signal from the first object A of the third pixel px3, to the average value of the signal intensities of the reflected optical signal from the first object A of the fourth pixel px4 and the fifth pixel px5. Equation 4 is the term inside the parentheses in the numerator on the right-hand side of Equation 2 and is the value obtained by subtracting from 1 the proportion of the signal intensity of the reflected optical signal from the second object B of the third pixel px3, to the average value of the signal intensities of the reflected optical signal from the second object B of the first pixel px1 and the second pixel px2.
Equation 3 obtains the area proportion p3_A_area of the first object A in the third pixel px3 by comparing the signal intensity of the third pixel px3 with the signal intensities of the fourth pixel px4 and the fifth pixel px5, but does not consider the signal intensity of the first pixel px1 or the second pixel px2. Equation 4 obtains the area proportion p3_A_area of the first object A in the third pixel px3 by comparing the signal intensity of the third pixel px3 with the signal intensities of the first pixel px1 and the second pixel px2, but does not consider the signal intensity of the fourth pixel px4 or the fifth pixel px5.
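For comparison, the two single-sided estimates described above as Equation 3 and Equation 4 can be sketched likewise (again inferred from the description, not quoted):

```python
def p3_a_area_eq3(p3_A_1st_return, p4_p5_avr):
    """Equation 3: uses only the first object A's reflected intensity."""
    return p3_A_1st_return / p4_p5_avr

def p3_a_area_eq4(p3_B_2nd_return, p1_p2_avr):
    """Equation 4: uses only the second object B's reflected intensity."""
    return 1.0 - p3_B_2nd_return / p1_p2_avr
```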
In contrast, Equation 2 obtains the area proportion p3_A_area of the first object A in the third pixel px3, considering the result of comparing the signal intensity of the third pixel px3 with the signal intensities of the fourth pixel px4 and the fifth pixel px5 and the result of comparing the signal intensity of the third pixel px3 with the signal intensities of the first pixel px1 and the second pixel px2. Therefore, there is a high possibility that the area proportion p3_A_area of the first object A in the third pixel px3 is calculated more accurately by Equation 2 than by Equation 3 or Equation 4.
Note that, depending on the area percentages of the first object A and the second object B included in the third pixel px3, the area proportion of the first object A in the third pixel px3 can sometimes be calculated more accurately by Equation 3 or Equation 4 than by Equation 2. This will be described later.
In the example of
In Equation 2, the area proportion p3_A_area of the first object A in the third pixel px3 is calculated, but the area proportion p3_B_area of the second object B in the third pixel px3 may be calculated on the basis of following Equation 5.
Alternatively, the area proportion p3_B_area of the second object B in the third pixel px3 may be calculated by following Equation 6 or Equation 7.
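Equations 5 through 7 are not reproduced here; by symmetry with Equation 2 (an inference only, expressly not a quotation of the original equations), the corresponding estimate for the second object B would swap the roles of the two objects:

```python
def p3_b_area(p3_B_2nd_return, p1_p2_avr, p3_A_1st_return, p4_p5_avr):
    """Hypothetical symmetric counterpart of Equation 2 for the second
    object B in the third pixel px3 (inferred by symmetry only)."""
    est_from_b = p3_B_2nd_return / p1_p2_avr
    est_from_a = 1.0 - p3_A_1st_return / p4_p5_avr
    return (est_from_b + est_from_a) / 2.0
```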
In Equation 2, the average of the following two values is calculated: the proportion of the signal intensity of the reflected optical signal from the first object A of the third pixel px3, to the average value of the signal intensities of the reflected optical signal from the first object A of the fourth pixel px4 and the fifth pixel px5; and the value obtained by subtracting from 1 the proportion of the signal intensity of the reflected optical signal from the second object B of the third pixel px3, to the average value of the signal intensities of the reflected optical signal from the second object B of the first pixel px1 and the second pixel px2. However, instead of calculating the average of these two values, a root mean square (RMS) of these two values may be calculated. In addition, a final area proportion may be calculated in consideration of both the average value and the root mean square. The same applies to Equation 5: the root mean square may be calculated, or the final area proportion may be calculated in consideration of both the average value and the root mean square.
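The averaging and root-mean-square alternatives mentioned above differ only in how the two area estimates are combined; a minimal sketch (the function names are assumptions of this illustration):

```python
import math

def combine_avg(est_a, est_b):
    """Average of the two area estimates, as in Equation 2."""
    return (est_a + est_b) / 2.0

def combine_rms(est_a, est_b):
    """Root mean square (RMS) of the two area estimates."""
    return math.sqrt((est_a * est_a + est_b * est_b) / 2.0)
```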
As illustrated by the curves w1 and w2 in
In addition, when the area proportion of the first object A in the third pixel px3 is calculated on the basis of Equation 3 or Equation 4, the variation in the area of the first object A in the third pixel px3 becomes larger as the area percentage of the first object A in the first pixel px1 becomes larger.
As illustrated in
As described above, in the distance measuring device 1 according to the second embodiment, the object direction detection unit 61, the signal intensity detection unit 63, the object area calculation unit 64, and the pixel division unit 62 are provided in the image processing unit 6. With respect to a pixel including distance information of a plurality of objects, the object area calculation unit 64 calculates an area proportion of each object in the pixel on the basis of the signal intensities of the reflected optical signal received by the light receiving element 34. The pixel division unit 62 performs pixel division on the basis of a direction of the object detected by the object direction detection unit 61 and an area of the object in the pixel calculated by the object area calculation unit 64. As a result, the pixel division can be performed in consideration of an overlapping state of the plurality of objects in the pixel; therefore, the resolution of the pixel including the distance information of the plurality of objects in a distance image can be improved, and visibility of the distance image is improved.
Furthermore, in the second embodiment, with respect to the pixel including distance information of the plurality of objects, the area proportion of each object is calculated in consideration of the distance information of the objects included in the surrounding pixels, and pixel division is performed on the basis of the calculation result. As a result, the pixel division can be performed in accordance with the position and direction of the object.
At least a part of the distance measuring device 1 described in the above-described embodiments may be configured with hardware or software. In the case where software is used, a program that realizes at least some functions of the distance measuring device 1 may be stored in a recording medium such as a flexible disk or a compact disc read-only memory (CD-ROM) and be read and executed by a computer. The recording medium is not limited to a removable recording medium such as a magnetic disk or an optical disc, and may be a fixed recording medium such as a hard disk device or a memory.
In addition, a program that implements at least some of the functions of the distance measuring device 1 may be distributed via a communication line (including wireless communication) such as the Internet. Further, the program may be distributed, in an encrypted, modulated, or compressed state, via a wired line or a wireless line such as the Internet or in a state being stored in a recording medium.
While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the disclosures. Indeed, the novel methods and systems described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the methods and systems described herein may be made without departing from the spirit of the disclosures. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the disclosures.