The present invention relates to a distance measurement device that measures the distance to an object using light and that is suitable, for example, for use in a headlight of a vehicle.
To date, studies have been made on use of the light emitted from a headlight or the like as light for distance measurement. For example, International Publication No. WO2015/025497 describes a distance measurement system which projects illumination light to illuminate the front of a vehicle and receives reflected light thereof with an imaging unit to measure the distance to an object that exists in front of the vehicle. In this system, an illumination mode for illuminating the front and a distance measurement mode for measuring a distance are repeatedly executed in a time-division manner. In the distance measurement mode, light is emitted in a pulsed manner at a short time interval, and the distance to an object is measured by the TOF (Time Of Flight) method on the basis of the reception timing of the reflected light.
According to this configuration, after light is emitted in a pulsed manner in the distance measurement mode, the projection of light is stopped for distance measurement until the illumination mode is started. Therefore, the duty ratio during the period in which the illumination light is applied may decrease, and the amount of illumination light applied may be insufficient.
A distance measurement device according to a main aspect of the present invention includes: a light source configured to emit illumination light including visible light; a photodetector configured to receive reflected light of the illumination light from an object; and a signal processing circuit configured to reduce the emission of the illumination light in a predetermined period and measure a distance to the object on the basis of a timing when the reception of the reflected light at the photodetector is reduced due to the reduction of the illumination light.
In the distance measurement device according to this aspect, since the timing at which the reception of the reflected light is reduced due to the reduction of the illumination light is detected and the distance to the object is measured, the period in which the illumination light is reduced for distance measurement can be kept short. Thus, distance measurement can be performed smoothly while a sufficient amount of the illumination light is applied (in conformity with the eye-safety standard IEC 60825-1).
The above and other objects and novel features of the present invention will be fully clarified by the following description of the embodiments, when read in conjunction with the accompanying drawings.
It should be noted that the drawings are solely for description and do not limit the scope of the present invention in any way.
Hereinafter, embodiments of the present invention will be described with reference to the drawings. In each embodiment, a configuration example in the case where a distance measurement device is mounted on a vehicle is shown.
As shown in
In addition to the light sources 110 and the camera 120, the distance measurement device 100 includes a light emission control circuit 131, an imaging control circuit 132, a signal processing circuit 133, and a communication interface 134 as circuitry components.
The light source 110 includes a plurality of LEDs (light emitting diodes) 111. Each LED 111 emits white light. The LEDs 111 do not have to emit light of the same color, and may emit light of different colors. In this case, visible light of a predetermined color is generated by mixing the light of the respective colors. Although the three LEDs 111 are shown in
Moreover, instead of the LEDs 111, another light-emitting element such as a halogen lamp or a semiconductor laser may be used. In addition, a wavelength conversion element that generates visible light such as white light from light having a predetermined wavelength may be used. Furthermore, an optical system for guiding the light emitted from the LEDs 111, as the illumination light L1, to the front of the vehicle 10 may be provided to the light source 110. This optical system may include a lens or the like that converges the illumination light L1 into parallel light or light that slightly spreads from parallel light.
The camera 120 includes an imaging element 121 and an imaging lens 122. The imaging element 121 is a CMOS (complementary metal oxide semiconductor) image sensor, a CCD (Charge Coupled Device) image sensor, or the like. The imaging lens 122 condenses the reflected light R1 on a light receiving surface of the imaging element 121.
A filter that allows light in the wavelength band of the illumination light L1 emitted from the light source 110 to pass therethrough and that blocks light in the other wavelength bands may be disposed on the front side of the imaging element 121 (for example, between the imaging lens 122 and the imaging element 121). For example, in the case where the plurality of LEDs 111 emit light having different wavelengths, a filter that allows only light of any one of these wavelengths to pass therethrough may be disposed on the front side of the imaging element 121. Accordingly, unnecessary light other than the reflected light R1 can be inhibited from being incident on the imaging element 121.
The light emission control circuit 131 controls the LEDs 111 of the light source 110 on the basis of instructions from the signal processing circuit 133. The light emission control circuit 131 performs the same control on the two light sources 110. The imaging control circuit 132 controls the imaging element 121 of the camera 120 on the basis of instructions from the signal processing circuit 133.
The signal processing circuit 133 controls the LEDs 111 and the imaging element 121 via the light emission control circuit 131 and the imaging control circuit 132 in order to perform distance measurement while applying the illumination light L1. The signal processing circuit 133 includes an internal memory used as a work area in distance measurement. The signal processing circuit 133 transmits distance data acquired through distance measurement, to a circuitry on the vehicle 10 side via the communication interface 134.
Here, the signal processing circuit 133 measures a distance for each pixel of the imaging element 121 on the basis of a received state of the reflected light at each pixel, and transmits the measurement result (distance data) of each pixel to the circuitry on the vehicle 10 side.
As shown in
The signal processing circuit 133 shown in
Next, a distance data acquisition process performed by the signal processing circuit 133 will be described.
In order to acquire distance data of each pixel 121b, n consecutive sections are set on the time axis. Each section has the same time width, which is, for example, about 10 μsec.
The top chart of
The signal processing circuit 133 lowers the drive signal to the low level for a stop period ΔT1 at the timing when a fixed time elapses from the start of each section. The stop period ΔT1 is, for example, about 10 nsec. Since the stop period ΔT1 is set at a fixed time from the start of each section as described above, the cycle of the stop period ΔT1 is the same as the cycle of the sections.
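As an informal illustration only, the drive-signal timing described above can be sketched as follows. The section width and stop-period width use the example values given in the text (about 10 μsec and about 10 nsec), while the offset of the stop period ΔT1 from the section start is a hypothetical value.

```python
# Hypothetical drive-signal model: the stop period DT1 recurs once per
# section at a fixed offset, so its cycle equals the section cycle.

SECTION_WIDTH = 10_000e-9   # section time width, ~10 usec (example in text)
STOP_WIDTH = 10e-9          # stop period DT1, ~10 nsec (example in text)
STOP_OFFSET = 1_000e-9      # assumed fixed time from section start to DT1

def drive_level(t):
    """Return the drive-signal level at time t: 1 = emitting, 0 = stopped."""
    t_in_section = t % SECTION_WIDTH
    in_stop = STOP_OFFSET <= t_in_section < STOP_OFFSET + STOP_WIDTH
    return 0 if in_stop else 1
```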
When an object exists in the projection region of the illumination light L1, the reflected light R1 from the object is condensed on the imaging element 121. Here, the reflected light R1 is incident on the pixels 121b on which an image of the object is projected, at a delay time corresponding to the distance to the object. At this time, the reflected light R1 is missing at the reception timing of the imaging element 121 corresponding to the stop period ΔT1.
The second chart from the top of
In order to detect the timing of the missing period ΔT2, the signal processing circuit 133 controls exposure of each pixel 121b as follows.
The third chart from the top of
Here, the time width T is set, for example, to be the same as that of the exposure period Et. Accordingly, the time positions of the exposure periods Et with respect to the stop periods ΔT1 do not overlap each other between the sections. It should be noted that the time width T does not necessarily have to be set to be the same as that of the exposure period Et, and, for example, the time width T may be set to be shorter than that of the exposure period Et.
The exposure period Et in section 1 is set at the position on the time axis at which the missing period ΔT2 occurs when the object is at the minimum distance of a distance range (distance measurement range) for which distance measurement is to be performed. For example, the exposure period Et in section 1 is set at the time position delayed from the stop period ΔT1 in section 1 by a time corresponding to the minimum distance of the distance measurement range. The exposure period Et in section n is set at the time position delayed from the stop period ΔT1 in section n by a time corresponding to the maximum distance of the distance measurement range.
When the exposure period Et in each section is set as described above, the exposure period Et in one of the sections and the missing period ΔT2 of the reflected light R1 match each other. That is, when the object exists at the position corresponding to each pixel 121b, the exposure period Et and the missing period ΔT2 of the reflected light R1 match each other in the section in which the exposure period Et is set at the position, on the time axis, corresponding to the distance to the object, and the exposure period Et and the missing period ΔT2 of the reflected light R1 do not match each other in the other sections.
Here, in the section in which the exposure period Et matches the missing period ΔT2, the reflected light R1 is not incident on the pixel 121b, so that a detection signal of the pixel 121b based on the reflected light R1 is not generated. On the other hand, in each of the sections in which the exposure period Et does not match the missing period ΔT2, the reflected light R1 is incident on the pixel 121b, so that a detection signal of the pixel 121b based on the reflected light R1 is generated.
For example, in the example of
Here, the exposure period Et in section 3 is delayed by the time width 2T from the stop period ΔT1 in section 3. The time width 2T matches the delay time Dt of the reflected light R1. That is, the time width 2T corresponds to the distance to the object. Therefore, the time width 2T can be grasped by specifying section 3 in which a detection signal has not been obtained, and thus the distance to the object can be obtained.
In the present embodiment, a table in which a section and a distance are associated with each other is stored in advance in the signal processing circuit 133. Accordingly, it is not necessary to calculate the distance to the object on the basis of the time widths T, 2T, . . . , (n−1)T of the respective sections, and the process can be simplified. The signal processing circuit 133 specifies a section in which a detection signal has not been obtained, among sections 1 to n, and acquires the distance associated with the specified section from the table. The distance acquired thus is set in the distance data of the pixel 121b to be processed.
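A minimal sketch of how such a section-to-distance table could be precomputed is given below. The time width T, the number of sections, and the minimum distance are assumed example values; the conversion uses the round-trip relation distance = c × delay / 2.

```python
C = 299_792_458.0   # speed of light [m/s]
T = 10e-9           # assumed time-width step between sections [s]
N_SECTIONS = 100    # assumed number of sections n
D_MIN = 1.0         # assumed minimum distance of the measurement range [m]

def build_section_table():
    """Map each section index k (1..n) to a distance in meters.

    The exposure period Et in section k is delayed from the stop period
    DT1 by the minimum-distance delay plus (k - 1) * T; a round-trip
    delay Dt corresponds to a one-way distance of C * Dt / 2."""
    table = {}
    for k in range(1, N_SECTIONS + 1):
        round_trip_delay = 2.0 * D_MIN / C + (k - 1) * T
        table[k] = C * round_trip_delay / 2.0
    return table

section_table = build_section_table()
```

With such a precomputed table, specifying the section in which no detection signal is obtained directly yields the distance, as the text describes.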
The processing for sections 1 to n is repeatedly executed while the distance measurement operation continues. A rest period having a predetermined time length may be set when shifting to the next processing for sections 1 to n. In this case, the illumination light L1 may be continuously emitted in the rest period. Furthermore, in the rest period, a distance measurement process based on the detection signal of each pixel 121b acquired in the immediately preceding sections 1 to n may be performed. The total number of sections 1 to n is set on the basis of the maximum distance of the distance measurement range and the resolution of a distance value.
In control in section 1 to section n, the signal processing circuit 133 acquires the detection signal values of each pixel 121b in the respective sections, and stores the acquired detection signal values in the internal memory. Thereafter, the signal processing circuit 133 acquires the detection signal values in the respective sections for each pixel 121b from the memory (S11), and acquires a minimum value Sm of the acquired detection signal values (S12). Furthermore, the signal processing circuit 133 acquires an average value Sa from the detection signal values other than the minimum value Sm (S13), and determines whether the difference between the average value Sa and the minimum value Sm is greater than a threshold Sth (S14).
When the difference between the average value Sa and the minimum value Sm is greater than the threshold Sth (S14: YES), the signal processing circuit 133 determines the timing of exposure in the section in which the minimum value Sm is acquired as the timing when the reception of the reflected light R1 is missing, and acquires a distance value to the object (S15). Here, the table shown in
On the other hand, when the difference between the average value Sa and the minimum value Sm is equal to or less than the threshold Sth (S14: NO), the signal processing circuit 133 sets NULL indicating infinity, for the pixel 121b to be processed (S16). Then, one cycle of the distance measurement process is completed. When detection signal values are acquired for the next section 1 to section n, the signal processing circuit 133 returns the process to step S11 and executes the same process.
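Steps S11 to S16 for one pixel can be sketched as follows. The detection signal values are assumed to be already read from memory, `section_table` stands for the section-to-distance table of the embodiment, and NULL (infinity) is represented here by None.

```python
def measure_pixel(values, section_table, sth):
    """Distance for one pixel from the detection signal values of
    sections 1..n, or None (NULL, infinity)."""
    # S12: minimum detection signal value Sm and its section number
    sm = min(values)
    k_min = values.index(sm) + 1          # sections are numbered from 1
    # S13: average Sa of the detection signal values other than Sm
    others = values[:k_min - 1] + values[k_min:]
    sa = sum(others) / len(others)
    # S14/S15: Sm marks the missing timing of the reflected light only
    # if it clearly deviates from the other values
    if sa - sm > sth:
        return section_table[k_min]
    # S16: no clear missing timing -> NULL (infinity)
    return None
```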
The distance values for one frame acquired for each pixel 121b by the process of
According to Embodiment 1, the following effects are achieved.
As described with reference to
The imaging element 121 is used as a photodetector for receiving the reflected light R1, and the signal processing circuit 133 performs distance measurement based on the reception of the reflected light R1 being missing, for each pixel 121b of the imaging element 121. Accordingly, the distance to an object that exists in the irradiation region of the illumination light L1 can be acquired with high spatial resolution.
As described with reference to
By using the exposure control on each pixel 121b for distance measurement as described above, the distance value of each pixel can be acquired more smoothly by a simple process.
As shown in
In the process of
In Embodiment 1 described above, a combination of the stop period ΔT1 and the exposure period Et delayed from the stop period ΔT1 by a time corresponding to the distance value is set once per cycle. In Embodiment 2, on the other hand, this combination is set a plurality of times in one cycle. The detection signal values acquired in the plurality of exposure periods Et are then accumulated, and the accumulated values are compared to acquire a distance value.
That is, in Embodiment 2, exposure is performed on the pixel 121b a plurality of times at the same exposure timing, and the accumulated value of signal values acquired as a result of the respective exposures of the plurality of times is used as a value indicating the magnitude of a signal at the same exposure timing. Then, when the accumulated value is a minimum value and deviates from other accumulated values, distance measurement is performed with the exposure timing when the accumulated value is acquired, as the timing when the reflected light R1 is missing.
As shown in
A distance value is acquired for each pixel 121b on the basis of detection signal values of each pixel 121b acquired in processing sections 1 to k. Following the final processing section k, light emission for illumination is performed. In the light emission for illumination, the illumination light L1 is continuously emitted from the light source 110. In the period of light emission for illumination, a process of calculating a distance value is performed on the basis of the detection signal values acquired for each pixel 121b one cycle before. The signal values in processing sections 1 to k acquired one cycle before are stored in the internal memory of the signal processing circuit 133.
As shown in
In the exposure control in processing section 1, an exposure period Et is set at a fixed time position from the start timing of each of sections 1 to m. Similar to Embodiment 1 described above, in the exposure period Et, the pixel 121b accepts incidence of the reflected light R1. The exposure period Et is, for example, about 10 nsec. The exposure period Et in processing section 1 is set at the time position delayed from the stop period ΔT1 in processing section 1 by the time corresponding to the minimum distance of the distance measurement range.
As shown in
As described above, each exposure period Et in each processing section is set at the same time position with respect to the stop period ΔT1 in the same section, and is set at time positions different from each other, between different processing sections. In processing section 2, an exposure period Et is set at the time position delayed by the time width T with respect to the exposure period Et in processing section 1, and, in processing section 3, an exposure period Et is set at the time position further delayed by the time width T with respect to the exposure period Et in processing section 2. As described above, the exposure period Et shifts in the delay direction by the time width T each time the processing section changes. The stop period ΔT1 is uniformly set at the same time position in sections 1 to m of all the processing sections.
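The exposure scheduling just described, fixed within one processing section and shifted by the time width T between processing sections, can be expressed as a small helper; the minimum-distance delay and T are assumed parameters.

```python
def exposure_offset(processing_section, min_delay, t_width):
    """Delay of the exposure period Et from the stop period DT1 in the
    given processing section (numbered from 1): the offset is fixed
    within one processing section and shifts by the time width T
    between successive processing sections."""
    return min_delay + (processing_section - 1) * t_width
```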
In this case, when an object exists in the irradiation region of the illumination light L1, the reflected light R1 is incident on the pixel 121b of the imaging element 121 at the timing when a delay time corresponding to the distance to the object elapses from the light emission. In other words, the missing period of the reflected light R1 due to the stop period ΔT1 occurs in each section at a delay timing corresponding to the distance to the object.
Therefore, in the processing section in which the time position of the exposure period Et is delayed from the stop period ΔT1 by the delay time corresponding to the distance to the object, in each of sections 1 to m, the exposure period Et coincides with a period in which the reflected light R1 is missing due to the stop period ΔT1, and the reflected light R1 is not received by the pixel 121b. Thus, in this processing section, a detection signal is not outputted from the pixel 121b. On the other hand, in the other processing sections in which the time position of the exposure period Et does not match the time position corresponding to the delay time of the reflected light R1, in each of sections 1 to m, the reflected light R1 is received by the pixel 121b, and a detection signal based on the reflected light R1 is outputted from the pixel 121b.
In Embodiment 2, a processing section in which the exposure period Et coincides with the period in which the reflected light R1 is missing due to the stop period ΔT1 is detected, and the distance to the object is measured on the basis of the time position of the exposure period Et in this processing section. Specifically, the values of the detection signal outputted from the pixel 121b in sections 1 to m are accumulated for each processing section, and the processing section yielding the accumulated value that is the smallest and deviates from the other accumulated values is detected as the processing section in which the exposure period Et coincides with the period in which the reflected light R1 is missing due to the stop period ΔT1. The distance value to the object is then acquired for each pixel 121b on the basis of the time position of the exposure period Et in the detected processing section, that is, the time difference between the stop period ΔT1 and the exposure period Et.
In Embodiment 2 as well, similar to Embodiment 1, a table in which a processing section and a distance are associated with each other is stored in advance in the signal processing circuit 133. Accordingly, it is not necessary to calculate the distance to the object on the basis of the time difference between the stop period ΔT1 and the exposure period Et in each processing section, and the process can be simplified.
The signal processing circuit 133 acquires the detection signal values in each processing section one cycle before, which are stored in the internal memory, for each pixel 121b from the memory (S21), and accumulates the acquired detection signal values for each processing section (S22). Next, the signal processing circuit 133 obtains a minimum value TSm of the accumulated values acquired for the respective processing sections (S23), and further acquires an average value TSa from the accumulated values other than the minimum value TSm (S24). Then, the signal processing circuit 133 determines whether the difference between the acquired average value TSa and the acquired minimum value TSm is greater than a threshold TSth (S25).
When the difference between the average value TSa and the minimum value TSm is greater than the threshold TSth (S25: YES), the signal processing circuit 133 determines the timing of exposure in the processing section in which the minimum value TSm is acquired, as the timing when the reception of the reflected light R1 is missing, and acquires a distance value to the object (S26). Here, the table shown in
On the other hand, when the difference between the average value TSa and the minimum value TSm is equal to or less than the threshold TSth (S25: NO), the signal processing circuit 133 sets NULL indicating infinity, for the pixel 121b to be processed (S27). Then, one cycle of the distance measurement process is completed. When detection signal values are acquired for the next processing section 1 to processing section k, the signal processing circuit 133 returns the process to step S21 and executes the same process.
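Steps S21 to S27 for one pixel can be sketched as follows. `per_section_values` is assumed to hold, for each of processing sections 1 to k, the detection signal values of sections 1 to m read from the internal memory, and `section_table` stands for the processing-section-to-distance table.

```python
def measure_pixel_accumulated(per_section_values, section_table, tsth):
    """Distance for one pixel from the detection signal values of all
    processing sections, or None (NULL, infinity).

    per_section_values: one list per processing section 1..k, each
    holding the detection signal values of its sections 1..m."""
    # S22: accumulate the detection signal values per processing section
    totals = [sum(v) for v in per_section_values]
    # S23: minimum accumulated value TSm and its processing section
    tsm = min(totals)
    k_min = totals.index(tsm) + 1
    # S24: average TSa of the other accumulated values
    others = totals[:k_min - 1] + totals[k_min:]
    tsa = sum(others) / len(others)
    # S25/S26: distance from the table if TSm clearly deviates
    if tsa - tsm > tsth:
        return section_table[k_min]
    # S27: NULL (infinity)
    return None
```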
In this case as well, the distance values for one frame acquired for each pixel 121b by the process of
In the distance measurement device 100 according to Embodiment 2 as well, the same effects as those of Embodiment 1 can be achieved. Specifically, the period (stop period ΔT1) in which the illumination light L1 is stopped for distance measurement can be kept short, and thus distance measurement can be performed smoothly while a sufficient amount of the illumination light L1 is applied.
Moreover, in the configuration of Embodiment 2, the effect that the measurement accuracy of the distance value for each pixel 121b can be improved as compared to Embodiment 1 can be achieved as follows.
That is, in the configuration of Embodiment 2, the emission of the illumination light L1 is stopped at a fixed timing in each of sections 1 to m each having a predetermined cycle, the timing of exposure of each pixel 121b is fixed in each processing section including a plurality of sections 1 to m, but is changed between the processing sections, and the distance to the object is measured for each pixel 121b of the imaging element 121 on the basis of the values of the signal outputted from each pixel 121b in each processing section as a result of the exposure.
Specifically, the signal processing circuit 133 accumulates the signal values of each pixel 121b for each of processing sections 1 to k to obtain an accumulated value, and the distance to the object is measured for each pixel 121b on the basis of the accumulated value of each of processing sections 1 to k.
More specifically, the signal processing circuit 133 determines the processing section in which one accumulated value (minimum value TSm) that is the smallest and deviates from the other accumulated values is acquired, as the processing section in which the reception of the reflected light R1 is missing. In the configuration shown in
By comparing the accumulated values each obtained by accumulating a plurality of detection signal values as described above, erroneous detection of the processing section corresponding to the missing timing of the reflected light R1 can be more reliably prevented, so that the measurement accuracy of the distance value can be improved.
For example, in the case where the distance to the object is long, the amount of the reflected light R1 incident on the pixel 121b is significantly decreased. That is, the amount of the reflected light R1 incident on the pixel 121b is inversely proportional to the square of the distance to the object. Therefore, in the case where the distance to the object is long, the difference between the detection signal value detected in the exposure period Et whose time position coincides with the timing when the reflected light R1 is missing, and the detection signal value detected in each of the other exposure periods Et, becomes much smaller.
Therefore, in the case where a combination of the stop period ΔT1 and the exposure period Et delayed from the stop period ΔT1 by the time corresponding to the distance value is set once per cycle as in Embodiment 1 described above, the exposure period Et whose time position does not coincide with the timing in which the reflected light R1 is missing may be erroneously detected, due to the influence of unnecessary light, etc., as the exposure period Et at the timing when the reflected light R1 is missing.
On the other hand, in Embodiment 2, since the accumulated value of detection signal values acquired for the exposure periods Et of a plurality of times (m times) is used for obtaining a distance value, the difference between the accumulated value for the exposure period Et whose time position coincides with the timing when the reflected light R1 is missing and the accumulated value for each of the other exposure periods Et becomes greater. Accordingly, even in the case where the distance to the object is long, the processing section in which the reflected light R1 is not missing can be reliably prevented from being erroneously detected as the processing section in which the reflected light R1 is missing. As a result, the measurement accuracy of the distance value for each pixel 121b can be improved.
In the process of
In the process of
In Embodiments 1 and 2 described above, the distance measurement process in a state where the illumination light L1 is projected, such as during travelling at night, has been described. However, in Embodiment 3, a distance measurement process in a state where the illumination light L1 is not projected, such as during daytime travelling, will be described.
In Embodiment 3, the signal processing circuit 133 raises a drive signal to a high level for a projection period ΔT11 at a timing when a fixed time elapses from the start of each section. In this case, only in the projection period ΔT11, the illumination light L1 is projected to a projection region. When an object exists in the projection region, the reflected light R1 is received by pixels on which the object is projected. A reception period ΔT12 of the reflected light R1 is the timing delayed from the projection period ΔT11 by the delay time Dt corresponding to the distance to the object.
The signal processing circuit 133 sets an exposure period Et at the same timing as in
Therefore, in Embodiment 3, the time width 2T can be grasped by specifying section 3 in which the high-level detection signal is obtained, and thus the distance to the object can be obtained. In this case as well, similar to Embodiment 1 described above, the signal processing circuit 133 acquires the distance to the object by referring to a table in which a section and a distance are associated with each other.
In control in section 1 to section n, the signal processing circuit 133 acquires the detection signal values of each pixel 121b in the respective sections, and stores the acquired detection signal values in the internal memory. Thereafter, the signal processing circuit 133 acquires the detection signal values in the respective sections for each pixel 121b from the memory (S31), and acquires a maximum value Sm1 of the acquired detection signal values (S32). Furthermore, the signal processing circuit 133 acquires an average value Sa1 from the detection signal values other than the maximum value Sm1 (S33), and determines whether the difference between the maximum value Sm1 and the average value Sa1 is greater than a threshold Sth1 (S34).
When the difference between the maximum value Sm1 and the average value Sa1 is greater than the threshold Sth1 (S34: YES), the signal processing circuit 133 determines the timing of exposure in the section in which the maximum value Sm1 is acquired, as the timing when the reception of the reflected light R1 occurs, and acquires the distance value to the object (S35). Here, a distance value is acquired from a table similar to that of
On the other hand, when the difference between the maximum value Sm1 and the average value Sa1 is equal to or less than the threshold Sth1 (S34: NO), the signal processing circuit 133 sets NULL indicating infinity, for the pixel 121b to be processed (S36). Then, one cycle of the distance measurement process is completed. When detection signal values are acquired for the next section 1 to section n, the signal processing circuit 133 returns the process to step S31 and executes the same process.
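Steps S31 to S36 for one pixel can be sketched analogously to Embodiment 1, except that the maximum value Sm1 is sought instead of the minimum; `section_table` again stands for the section-to-distance table, and None represents NULL (infinity).

```python
def measure_pixel_daytime(values, section_table, sth1):
    """Daytime-mode distance for one pixel, or None (NULL, infinity)."""
    # S32: maximum detection signal value Sm1 and its section number
    sm1 = max(values)
    k_max = values.index(sm1) + 1         # sections are numbered from 1
    # S33: average Sa1 of the detection signal values other than Sm1
    others = values[:k_max - 1] + values[k_max:]
    sa1 = sum(others) / len(others)
    # S34/S35: Sm1 marks the reception timing of the reflected light
    # only if it clearly deviates from the other values
    if sm1 - sa1 > sth1:
        return section_table[k_max]
    # S36: NULL (infinity)
    return None
```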
According to Embodiment 3, distance measurement can be performed even in a situation where the illumination light L1 is not projected, such as during daytime.
In Embodiment 3 as well, similar to Embodiment 2 described above, distance measurement may be performed using a plurality of processing sections each including sections 1 to n. In this case, in the flowchart of
The distance measurement process (daytime mode) in Embodiment 3 and the distance measurement process (night mode) in Embodiments 1 and 2 described above may be automatically switched in accordance with the ambient brightness. In this case, whether the ambient brightness has changed to a situation for switching between these modes may be determined on the basis of detection signals acquired in sections 1 to n.
The left side of
The right side of
The left side of
The right side of
When the engine of the vehicle starts, the signal processing circuit 133 turns off the illumination light L1 (S41), and sets the distance measurement mode to the daytime mode (S42). Next, the signal processing circuit 133 determines whether the average value Sa1 acquired during execution of the daytime mode, that is, the average value of the detection signal values other than the maximum value among the detection signal values acquired in the respective sections, is smaller than the threshold Th2 (S43). When the average value Sa1 is equal to or greater than the threshold Th2 (S43: NO), the signal processing circuit 133 continues the daytime mode (S42).
On the other hand, when the average value Sa1 is smaller than the threshold Th2 (S43: YES), the signal processing circuit 133 performs a process of switching the distance measurement mode to the night mode. In this case, the signal processing circuit 133 turns on the illumination light L1 (S44), and sets the distance measurement mode to the night mode (S45). Thereafter, the signal processing circuit 133 determines whether the minimum value Sm acquired during execution of the night mode, that is, the minimum value among the detection signal values acquired in the respective sections, is greater than the threshold Th1 (S46). When the minimum value Sm is equal to or less than the threshold Th1 (S46: NO), the signal processing circuit 133 continues the night mode (S45). On the other hand, when the minimum value Sm is greater than the threshold Th1 (S46: YES), the signal processing circuit 133 returns the process to step S41 and performs a process of switching the distance measurement mode to the daytime mode.
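The mode switching of steps S41 to S46 can be sketched as a small state transition. The statistics are assumed to come from the latest sections of the respective modes (the average of the non-maximum values in the daytime mode, the minimum value in the night mode), and the function returns the next mode together with the illumination on/off state.

```python
def next_mode(mode, avg_non_max, min_value, th1, th2):
    """Return (next_mode, illumination_on) from the current statistics.

    Daytime -> night when the average of the non-maximum detection
    signal values falls below Th2 (ambient light is low); night ->
    daytime when the minimum value exceeds Th1 (ambient light is high)."""
    if mode == "daytime":
        if avg_non_max < th2:         # S43: ambient brightness dropped
            return "night", True      # S44/S45: illumination on, night mode
        return "daytime", False
    else:  # night mode
        if min_value > th1:           # S46: ambient brightness rose
            return "daytime", False   # S41/S42: illumination off, daytime mode
        return "night", True
```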
According to this configuration, the distance measurement mode can be automatically switched between the daytime mode and the night mode while automatically switching the illumination light L1 on/off in accordance with the ambient brightness. Therefore, the convenience of the driver can be improved.
Moreover, since a change in ambient brightness is determined on the basis of the detection signals acquired during execution of the daytime mode and the night mode, it is not necessary to additionally provide a sensor for detecting the ambient brightness. Therefore, the distance measurement mode can be switched between the daytime mode and the night mode with a simple configuration and processing.
However, this does not preclude additionally providing an illuminance sensor for detecting the ambient brightness.
In the flowchart of
Moreover, in step S43 of
<Modifications>
In Embodiments 1 and 2 described above, the emission of the illumination light L1 is reduced by stopping the emission of the illumination light L1. However, the emission of the illumination light L1 may be reduced by lowering the emission level of the illumination light L1 as compared to that during normal illumination operation. For example, in the period corresponding to the above-mentioned stop period ΔT1, the emission level of the illumination light L1 may be lowered to about 1/5 to 1/10 of that during normal illumination operation. In this case, for example, the reflected light is received even in the missing period ΔT2 of
In this case, the difference between the average value Sa and the minimum value Sm in step S14 of
In Embodiments 1 and 2 described above, the imaging element 121 is used as the photodetector, but the photodetector used for distance measurement is not limited thereto. For example, a photodetector 123 in which a plurality of sensors 123a (detection regions) are arranged in a matrix as shown in
It should be noted that the imaging element 121 has a higher resolution than the photodetector 123 of
In Embodiments 1 and 2 described above, by shifting the exposure period Et, the missing timing of the reflected light R1 is detected, and the distance to the object is measured on the basis of the detection result. However, the method for measuring the distance to the object is not necessarily limited thereto. For example, in the case where the photodetector 123 of
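The conversion from a detected missing timing to a distance follows the usual TOF relation d = c·Δt/2. The sketch below is illustrative only; the function name and the nanosecond timing values are hypothetical, not from the patent.

```python
# Illustrative only: converting a detected "missing" timing into a
# distance via the TOF relation d = c * dt / 2.  The names and timing
# values are hypothetical examples.

SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def distance_from_missing_timing(emission_dip_s, reception_dip_s):
    """Distance (m) from the delay between the moment the illumination
    light is reduced and the moment the received reflected light dips."""
    round_trip_s = reception_dip_s - emission_dip_s
    return SPEED_OF_LIGHT_M_PER_S * round_trip_s / 2.0
```

For example, a dip in the received light observed 100 ns after the emission is reduced corresponds to an object roughly 15 m away, which is why the reduction period for distance measurement can be kept very short relative to the illumination period.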
In Embodiments 1 and 2 described above, visible light is emitted as the illumination light L1 from the light sources 110. However, it is sufficient that the illumination light L1 emitted from the light sources includes visible light; for example, invisible light such as infrared light may be included in the illumination light L1 together with the visible light.
In Embodiments 1 and 2 described above, the distance measurement device 100 is mounted on the vehicle 10, but the apparatus on which the distance measurement device 100 is mounted is not limited thereto. For example, the distance measurement device 100 may be used for spotlights for crime prevention.
In addition to the above, various modifications can be made as appropriate to the embodiments of the present invention, without departing from the scope of the technological idea defined by the claims.
Number | Date | Country | Kind |
---|---|---|---|
JP2018-164954 | Sep 2018 | JP | national |
This application is a continuation of International Application No. PCT/JP2019/030024 filed on Jul. 31, 2019, entitled “DISTANCE MEASUREMENT DEVICE”, which claims priority under 35 U.S.C. Section 119 of Japanese Patent Application No. 2018-164954 filed on Sep. 3, 2018, entitled “DISTANCE MEASUREMENT DEVICE”. The disclosure of the above applications is incorporated herein by reference.
Number | Name | Date | Kind |
---|---|---|---|
20040233416 | Doemens | Nov 2004 | A1 |
20050062710 | Kasai | Mar 2005 | A1 |
20080151044 | Sawachi | Jun 2008 | A1 |
20080152214 | Sawachi | Jun 2008 | A1 |
20110304842 | Kao et al. | Dec 2011 | A1 |
20130228691 | Shah | Sep 2013 | A1 |
20160161611 | Ito et al. | Jun 2016 | A1 |
20160266253 | Kubota | Sep 2016 | A1 |
20170127036 | You | May 2017 | A1 |
20180053799 | Otani | Feb 2018 | A1 |
20180135980 | Nakamura | May 2018 | A1 |
20180247148 | Saitou | Aug 2018 | A1 |
20180259647 | Takano et al. | Sep 2018 | A1 |
20190072648 | Iwai | Mar 2019 | A1 |
20190079170 | Masuda | Mar 2019 | A1 |
Number | Date | Country |
---|---|---|
201703340 | Apr 2010 | CN |
201703340 | Jan 2011 | CN |
H07110432 | Oct 1993 | JP |
H07144889 | Nov 1993 | JP |
H07110432 | Apr 1995 | JP |
2002262308 | Mar 2001 | JP |
2002262308 | Sep 2002 | JP |
2013-183460 | Sep 2013 | JP |
2016-170114 | Sep 2016 | JP |
2017-138110 | Aug 2017 | JP |
2015025497 | Feb 2015 | WO |
2017085916 | May 2017 | WO |
Entry |
---|
International Search Report issued in corresponding International Patent Application No. PCT/JP2019/030024, dated Oct. 29, 2019, with English translation. |
Number | Date | Country | |
---|---|---|---|
20210166410 A1 | Jun 2021 | US |
Number | Date | Country | |
---|---|---|---|
Parent | PCT/JP2019/030024 | Jul 2019 | US |
Child | 17171907 | US |