This application claims priority pursuant to 35 U.S.C. § 119(a) to Japanese Patent Application Nos. 2019-053171, filed on Mar. 20, 2019, and 2019-162233, filed on Sep. 5, 2019, in the Japan Patent Office, the disclosures of which are incorporated by reference herein in their entirety.
This disclosure relates to a range finding device and a range finding method.
A range to an object existing in a range finding area can be detected by emitting a laser beam toward the range finding area, receiving light reflected from the object existing in the range finding area, and calculating a distance to the object using a time difference between a start time when the laser beam is emitted and a time when the light reflected from the object is received. This range finding method is known as the time-of-flight (TOF) method or system. A range finding device that measures the range or distance to objects using the TOF system is referred to as a TOF camera or range finding camera, and can also be referred to as an image capture apparatus.
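For reference, the TOF relation itself is simple: the range is half the round trip, d = c·Δt/2, where c is the speed of light and Δt is the measured time difference. The following Python sketch illustrates only this generic relation, not the implementation of any device described herein:

```python
# Illustrative sketch of the basic TOF range relation d = c * dt / 2.
# This is the generic formula, not the implementation of the device herein.

SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def tof_distance_m(emit_time_s: float, receive_time_s: float) -> float:
    """Range to the object from emission and reception timestamps."""
    round_trip_s = receive_time_s - emit_time_s  # time difference dt
    return SPEED_OF_LIGHT_M_PER_S * round_trip_s / 2.0  # halve: out and back
```

For example, a round trip of 20 ns gives tof_distance_m(0.0, 20e-9) ≈ 3.0 m.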
In a range finding device (TOF camera) using the TOF method, the light receiving level (the amount of received light) of the reflection light coming from the object is a critical factor. If the light receiving level is too low, the sensor output signal cannot be distinguished from the noise signal, and if the light receiving level is too high, the sensor output signal becomes saturated; in either case, the range or distance to the object cannot be computed accurately.
One technology includes a light-receiving unit having a light receiving surface, and a lighting unit that emits the light, in which the intensity of light emitted from the lighting unit is controlled so that the amount of light received at the light receiving surface stays at a normal level, thereby obtaining an accurate range finding value.
This image capture apparatus (TOF camera) performs the exposure control based on the measured light amount of the target range finding area. That is, when the light-receiving unit measures the exposed light amount, if the measured light amount is equal to or greater than a given value (e.g., normal value), the intensity of light emitted from the lighting unit is reduced, and if the measured light amount is lower than the given value, the intensity of light emitted from the lighting unit is increased. However, this image capture apparatus (TOF camera) may not cope with a situation in which the intensity of the reflection light varies with the distance from the image capture apparatus (TOF camera) to the target object, because the light emitted from the lighting unit must travel to the object and back before reaching the image capture apparatus (TOF camera).
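A minimal sketch of this single-threshold exposure feedback, assuming illustrative names and a simple linear step, is shown below; it represents the comparative behavior described above, not the disclosed device:

```python
# Minimal sketch of the comparative single-intensity exposure control:
# one global emission intensity is raised or lowered against one
# threshold. Names and the step size are illustrative assumptions.

def adjust_global_intensity(measured_light: float,
                            given_value: float,
                            intensity: float,
                            step: float = 0.1) -> float:
    """Nudge the single global emission intensity toward the normal level."""
    if measured_light >= given_value:
        return max(0.0, intensity - step)  # enough light: reduce emission
    return intensity + step                # too little light: increase emission
```

Because a single intensity applies to the whole range finding area at once, a scene containing both near and far objects cannot be served simultaneously, which is the limitation the per-region control described below addresses.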
In one aspect of the present invention, a range finding device is devised. The range finding device includes a light-emitting unit including a plurality of light-emitting regions arranged in a two-dimensional array, configured to emit light from each of the plurality of light-emitting regions; an optical element configured to guide the light emitted from the light-emitting unit to a range finding area; a light-receiving unit configured to receive light reflected from a target object existing in the range finding area when the light emitted from the light-emitting unit hits the target object and reflects from the target object, the light-receiving unit being divided into a plurality of light receiving regions; and circuitry configured to control a light emission amount of each of the plurality of light-emitting regions of the light-emitting unit respectively corresponding to the plurality of light receiving regions of the light-receiving unit; measure an amount of light received at each of the light receiving regions of the light-receiving unit; measure a distance to the target object for each of the light receiving regions of the light-receiving unit by measuring a time difference between a start time when the light is emitted from the light-emitting unit and a time when the light reflected from the target object is received by each of the light receiving regions of the light-receiving unit; cause each of the plurality of light-emitting regions of the light-emitting unit to emit the light with the same light amount at a preliminary light emission stage; control the light emission amount of each of the plurality of light-emitting regions of the light-emitting unit to be used for a main light emission stage corresponding to a range finding operation of the target object based on the amount of light received at each of the light receiving regions of the light-receiving unit measured at the preliminary light emission stage; and measure the distance to the target object for each of the light receiving regions of the light-receiving unit by performing the range finding operation at the main light emission stage.
In another aspect of the present invention, a method of finding a range to a target object is devised. The method includes emitting light from a light-emitting unit including a plurality of light-emitting regions arranged in a two-dimensional array; guiding the light emitted from the light-emitting unit to a range finding area using an optical element; receiving, using a light-receiving unit, light reflected from the target object existing in the range finding area when the light emitted from the light-emitting unit hits the target object and reflects from the target object, the light-receiving unit being divided into a plurality of light receiving regions; controlling a light emission amount of each of the plurality of light-emitting regions of the light-emitting unit respectively corresponding to the plurality of light receiving regions of the light-receiving unit; measuring an amount of light received at each of the light receiving regions of the light-receiving unit; measuring a distance to the target object for each of the light receiving regions of the light-receiving unit by measuring a time difference between a start time when the light is emitted from the light-emitting unit and a time when the light reflected from the target object is received by each of the light receiving regions of the light-receiving unit; causing each of the plurality of light-emitting regions of the light-emitting unit to emit the light with the same light amount at a preliminary light emission stage; controlling the light emission amount of each of the plurality of light-emitting regions of the light-emitting unit to be used for a main light emission stage corresponding to a range finding operation of the target object based on the amount of light received at each of the light receiving regions of the light-receiving unit measured at the preliminary light emission stage; and measuring the distance to the target object for each of the light receiving regions of the light-receiving unit by performing the range finding operation at the main light emission stage.
A more complete appreciation of the description and many of the attendant advantages and features thereof can be readily acquired and understood from the following detailed description with reference to the accompanying drawings.
The accompanying drawings are intended to depict embodiments of the present invention and should not be interpreted to limit the scope thereof. The accompanying drawings are not to be considered as drawn to scale unless explicitly noted.
A description is now given of exemplary embodiments of the present inventions.
It should be noted that although such terms as first, second, etc. may be used herein to describe various elements, components, regions, layers and/or units, it should be understood that such elements, components, regions, layers and/or units are not limited thereby because such terms are relative, that is, used only to distinguish one element, component, region, layer or unit from another region, layer or unit. Thus, for example, a first element, component, region, layer or unit discussed below could be termed a second element, component, region, layer or unit without departing from the teachings of the present inventions.
In addition, it should be noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the present inventions. Thus, for example, as used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. Moreover, the terms “includes” and/or “including,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
Hereinafter, a description is given of a range finding device and a range finding method according to an embodiment with reference to the drawings.
The light-emitting unit 20 includes, for example, a vertical cavity surface emitting laser (VCSEL) 21 as a light source. The VCSEL 21 includes a plurality of light-emitting regions arrayed two-dimensionally. The light emission control unit 50 controls the light emission operation of each of the light-emitting regions of the VCSEL 21. The light-emitting unit 20 further includes the lens 22 (an example of an optical element) that adjusts (e.g., enlarges) the angle of view of the light emitted from each of the light-emitting points or regions of the VCSEL 21 to a given required angle and then projects the light at the given angle of view.
The TOF camera 10 includes the determination unit 40. The determination unit 40 controls, via the light emission control unit 50, the emission timing and the emission amount of the light emitted from the light-emitting regions of the VCSEL 21.
The light-receiving unit 30 includes, for example, the light receiving lens 31 and the light receiving sensor 32. The light receiving lens 31 collects the light reflected from the object, and the light receiving sensor 32 receives the light collected by the light receiving lens 31 and performs the photoelectric conversion of the light. In this embodiment, the light receiving sensor 32 is, for example, a time-of-flight (TOF) sensor. The light receiving sensor 32 is configured by arranging light receiving regions (e.g., sensor elements) in a two-dimensional pattern corresponding to the two-dimensional pattern of the light-emitting regions of the VCSEL 21, and the detection signal of each sensor element is output individually. The sensor element is also referred to as the imaging element.
In this description, the TOF camera 10 includes the light-emitting unit having a plurality of light-emitting regions arranged in the two-dimensional array (e.g., the laser light source unit) and the imaging element, in which an imaging face of the imaging element is divided into a plurality of discrete imaging regions (i.e., a mesh pattern) by setting the number of discrete imaging regions to be the same as the number of light-emitting regions and using the same arrangement pattern for the light-emitting regions and the discrete imaging regions, and the range to the object is detected for each imaging region. In this description, the imaging region may also be referred to as the light receiving region.
The detection signal of the light receiving sensor 32 is input to the light measurement/control unit 60. The light measurement/control unit 60 performs statistical data processing of the sensor output values received from each detection region of the light receiving sensor 32 (e.g., the TOF sensor), and functions as a region light intensity measurement unit that measures the light receiving level (the amount of received light) for each imaging region. The light measurement/control unit 60 also controls the light reception time, light reception sensitivity, timing, and the like of the light receiving sensor 32, which are synchronized with the light emission control of the light-emitting regions of the VCSEL 21.
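As a concrete illustration of such per-region statistical processing, the sketch below splits a sensor frame into a mesh matching the arrangement of the light-emitting regions and computes one statistic per region; the mesh geometry and the choice of a mean are assumptions for illustration, not the disclosed processing:

```python
import numpy as np

# Illustrative per-region light-level measurement: the sensor frame is
# split into a mesh matching the arrangement of the VCSEL's light-emitting
# regions, and one statistic (here, the mean) is computed per region.
# The mesh size and the choice of statistic are assumptions.

def region_light_levels(frame: np.ndarray, mesh_rows: int, mesh_cols: int) -> np.ndarray:
    """Return a (mesh_rows, mesh_cols) array of mean sensor output per region."""
    h, w = frame.shape
    rh, cw = h // mesh_rows, w // mesh_cols
    levels = np.empty((mesh_rows, mesh_cols))
    for r in range(mesh_rows):
        for c in range(mesh_cols):
            levels[r, c] = frame[r * rh:(r + 1) * rh,
                                 c * cw:(c + 1) * cw].mean()
    return levels
```

For example, a 640×480 frame with an 8×8 mesh gives regions of 80×60 pixels each.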
The region light intensity measurement unit also measures the light receiving level (light receiving amount) for each imaging region during a preliminary light emission stage of the laser light by the VCSEL 21, and then the region light intensity control unit 41, described later, controls the light emission amount used when performing the range finding of the object (hereinafter, the main light emission stage) based on the amount of light received by each of the imaging regions during the preliminary light emission stage.
The determination unit 40 includes, for example, the region light intensity control unit 41. The region light intensity control unit 41 determines the light emission amount of each light-emitting region of the VCSEL 21 and the light reception sensitivity of the light receiving sensor 32 during the image capturing operation based on the statistical sensor output data of each light receiving region of the light receiving sensor 32. The determination unit 40 further includes, for example, a range finding unit 42. The range finding unit 42 measures or calculates a distance to a target object for each imaging region by measuring the time difference between a time when the laser beam is emitted from the VCSEL 21 and a time when the light reflected from the object is received by the light receiving sensor 32.
The range finding operation of the TOF camera 10 is equivalent to that of known TOF cameras using the general phase detection method. For example, the technology of the TOF camera described in JP-2018-077143-A can be used. Therefore, a detailed description of the range finding operation of the TOF camera 10 is omitted.
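For context, the general phase detection method is commonly implemented by sampling the correlation signal at four phase offsets (0°, 90°, 180°, 270°) of a modulated light wave; the sketch below shows that common formulation as background only, not the specific technology of JP-2018-077143-A:

```python
import math

# Common four-phase indirect-TOF formulation, shown for background only.
# a0..a3 are samples of the correlation signal at 0, 90, 180 and 270
# degrees of the modulation period; modulation_hz is the modulation
# frequency of the emitted light.

SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def phase_tof_distance_m(a0, a1, a2, a3, modulation_hz):
    phase = math.atan2(a3 - a1, a0 - a2)   # phase delay, wrapped to (-pi, pi]
    if phase < 0.0:
        phase += 2.0 * math.pi             # use the positive delay in [0, 2*pi)
    return SPEED_OF_LIGHT_M_PER_S * phase / (4.0 * math.pi * modulation_hz)
```

With a 20 MHz modulation, for instance, the unambiguous range of this formulation is c/(2f) ≈ 7.5 m.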
Hereinafter, a description is given of the two-dimensional arrangement of the light-emitting regions of the VCSEL 21 used as the light source unit of the light-emitting unit 20.
When a lateral direction of the light receiving surface of the light receiving sensor 32 is defined as the X direction, a longitudinal direction of the light receiving surface is defined as the Y direction, and a direction away from the light receiving surface is defined as the Z direction, the light receiving surface becomes an X-Y plane. When the range finding area is viewed from the light receiving sensor 32, the distance to the object from the discrete imaging regions on the X-Y plane of the light receiving sensor 32 may vary for each of the discrete imaging regions, unless the object existing in the range finding area is a planar object parallel to the X-Y plane.
When light set to a given uniform intensity is emitted from the TOF system toward the range finding area, the level of the reflection light becomes higher in an area where a nearby object is present, and the output of the light receiving sensor 32 saturates. Further, the level of the reflection light becomes lower in an area where a far-side object is present, and the output signal of the light receiving sensor 32 cannot be distinguished from the noise signal. That is, when the object is too close to or too far from the TOF camera 10, the range finding cannot be performed or the range finding precision deteriorates, and thereby the range of objects detectable by the range finding operation is restricted.
Hereinafter, a description is given of this situation in detail.
If the distance to the object is large and the output of the light receiving sensor 32 at the preliminary light emission stage is too small to perform the range finding operation, the light emission amount of the VCSEL 21 at the main light emission stage is increased. By contrast, if the distance to the object is small and the output of the light receiving sensor 32 saturates at the preliminary light emission stage, the light emission amount of the VCSEL 21 at the main light emission stage is reduced. The region light intensity control unit 41 controls the light emission control unit 50 to control the light emission amount at the main light emission stage.
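A minimal sketch of this per-region adjustment, assuming illustrative thresholds and scaling factors, is given below; the region_light_levels helper from the earlier sketch would supply its input:

```python
import numpy as np

# Hedged sketch of the per-region emission control described above:
# regions whose preliminary-emission output is too small receive more
# light at the main emission, and regions that saturated receive less.
# NOISE_FLOOR, SATURATION and the scaling factors are illustrative
# assumptions, not values from the disclosure.

NOISE_FLOOR = 50     # below this, output is indistinguishable from noise
SATURATION = 4000    # at or above this, the sensor output is saturated

def main_emission_amounts(preliminary_levels: np.ndarray,
                          base_amount: float) -> np.ndarray:
    """Map per-region preliminary light levels to main-emission amounts."""
    amounts = np.full(preliminary_levels.shape, base_amount)
    amounts[preliminary_levels <= NOISE_FLOOR] = base_amount * 2.0  # far object
    amounts[preliminary_levels >= SATURATION] = base_amount * 0.5   # near object
    return amounts
```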
Hereinafter, a description is given of an example sequence of the range finding operation.
In step S601, the preliminary light emission is performed prior to an image capturing operation. In step S601, the light emission amount of the VCSEL 21 is set to a uniform level for all of the light-emitting regions of the VCSEL 21.
In step S602, the reflection light is received and detected by the light receiving sensor 32 at the preliminary light emission stage. Since the light receiving sensor 32 is divided into the discrete light receiving regions as described above, a statistical value of each light receiving region is calculated from the sensor output values of the respective light receiving regions.
In step S603, the light emission amount of each region of the VCSEL 21 to be used for the main light emission stage (image capturing operation) is set based on the statistical value of each of the light receiving regions calculated in step S602. The light emission amount of the VCSEL 21 is set as described above.
In step S604, the main light emission is performed by emitting the light from the VCSEL 21 with the light emission amount set in step S603.
In step S605, the data of the light receiving sensor 32 is acquired in synchronization with the light emission timing.
In step S606, the range finding processing is performed based on the data of the light receiving sensor 32 obtained in step S605 to calculate distance information of the object existing within the angle of view set by the lens 22 (optical element). As described above, by setting the appropriate light emission amount for each of the light-emitting regions of the VCSEL 21, the range of objects detectable by the range finding operation can be extended.
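Taken together, steps S601 to S606 can be summarized by the following minimal sketch, which reuses the illustrative region_light_levels and main_emission_amounts helpers above; the hardware-facing operations (emit_uniform, emit_per_region, capture_frame) and the compute_distances function are hypothetical stand-ins passed in as callables, not APIs of the device:

```python
# End-to-end sketch of steps S601-S606. All hardware-facing callables and
# compute_distances are hypothetical stand-ins supplied by the caller.

def range_finding_sequence(emit_uniform, emit_per_region, capture_frame,
                           compute_distances, mesh_rows, mesh_cols,
                           base_amount):
    emit_uniform(base_amount)                   # S601: uniform preliminary emission
    levels = region_light_levels(capture_frame(), mesh_rows, mesh_cols)  # S602
    amounts = main_emission_amounts(levels, base_amount)  # S603: per-region amounts
    emit_per_region(amounts)                    # S604: main light emission
    frame = capture_frame()                     # S605: acquire sensor data
    return compute_distances(frame)             # S606: per-region range finding
```

Passing the hardware operations in as callables keeps the sketch self-contained and testable with simple stubs.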
In another example of the sequence, when the light is emitted at the main light emission stage, in step S706, the reflection light is detected by the light receiving sensor 32, a statistical value of each light receiving region of the light receiving sensor 32 is calculated from the respective sensor output values, and then a region where the range finding cannot be performed because the sensor output is too small is detected as a sensitivity-increase-required region.
In step S707, the light-receiving sensitivity of the light receiving sensor 32 is increased only for the sensitivity-increase-required region detected in step S706.
In step S708, the light-emission amount is set for each light-emitting region of the VCSEL 21.
In step S709, the light is emitted from each light-emitting region of the VCSEL 21 based on the light-emission amount set in step S708.
In step S710, the light is received at each light-receiving region of the light receiving sensor 32.
In step S711, it is confirmed whether or not a sensitivity-increase-required region still exists. If a sensitivity-increase-required region still exists (S711: YES), the sequence returns to step S707, the sensitivity of the sensitivity-increase-required region is increased, and the light emission and light reception are performed again. If no sensitivity-increase-required region exists (S711: NO), the sequence proceeds to step S712.
In step S712, the image capturing operation is performed, and then the data detected by the light receiving sensor 32 is stored.
In step S713, the range finding processing is performed based on the data of the light receiving sensor 32 obtained in step S712 to calculate the range or distance information of the object existing within the angle of view set by the lens 22 (optical element).
As described above, by setting the appropriate light emission amount for each light-emitting region of the VCSEL 21 and the appropriate sensitivity for each light-receiving region of the light receiving sensor 32, the range of objects detectable by the range finding operation can be further extended compared to the sequence of steps S601 to S606 described above.
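Under the same illustrative assumptions (the NOISE_FLOOR constant and region_light_levels helper from the sketches above, hypothetical hardware callables, and an added iteration cap as a safety guard that the described flow itself does not require), the sensitivity-adjustment loop of steps S706 to S712 might be sketched as:

```python
# Sketch of the loop of steps S706-S712: regions whose sensor output is
# too small to range get their receiver sensitivity raised, and emission
# and reception are retried until no such region remains. The iteration
# cap is an added assumption, not part of the described flow.

def sensitivity_adjust_capture(emit_per_region, capture_frame, amounts,
                               sensitivity, mesh_rows, mesh_cols,
                               max_rounds=8):
    for _ in range(max_rounds):
        levels = region_light_levels(capture_frame(), mesh_rows, mesh_cols)
        weak = levels <= NOISE_FLOOR        # S706/S711: regions too dim to range
        if not weak.any():
            break                           # S711: NO -> proceed to image capture
        sensitivity[weak] *= 2.0            # S707: raise sensitivity only there
                                            # (written to the sensor in a real device)
        emit_per_region(amounts)            # S708-S709: emit with the set amounts
        # S710: light is received on the next capture_frame() call above
    return capture_frame()                  # S712: image capture; data is stored
```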
According to the above-described embodiment, the range of objects detectable by the range finding operation can be extended to both far-side objects and nearby objects.
In the above-described embodiments, the VCSEL having the plurality of light-emitting regions arranged in the two-dimensional array is described as the laser light source unit, but the light source unit is not limited thereto. For example, any light source unit having a plurality of light-emitting regions arranged in a two-dimensional array can be used instead of the VCSEL.
Numerous additional modifications and variations are possible in light of the above teachings. It is therefore to be understood that, within the scope of the appended claims, the disclosure of this specification can be practiced otherwise than as specifically described herein. Any one of the above-described operations may be performed in various other ways, for example, in an order different from the one described above.
Each of the functions of the above-described embodiments can be implemented by one or more processing circuits or circuitry. Processing circuitry includes a programmed processor, as a processor includes circuitry. A processing circuit also includes devices such as an application specific integrated circuit (ASIC), digital signal processor (DSP), field programmable gate array (FPGA), system on a chip (SOC), graphics processing unit (GPU), and conventional circuit components arranged to perform the recited functions.
Foreign Patent Documents

Number | Date | Country
---|---|---
2010-219810 | Sep 2010 | JP |
2011-233963 | Nov 2011 | JP |
2012-198337 | Oct 2012 | JP |
2013-054236 | Mar 2013 | JP |
2013-066086 | Apr 2013 | JP |
2013-066163 | Apr 2013 | JP |
2013-182218 | Sep 2013 | JP |
2013-192088 | Sep 2013 | JP |
2013-198062 | Sep 2013 | JP |
2013-198070 | Sep 2013 | JP |
2013-214947 | Oct 2013 | JP |
2013-218278 | Oct 2013 | JP |
2014-056048 | Mar 2014 | JP |
2014-057156 | Mar 2014 | JP |
2014-123797 | Jul 2014 | JP |
2016-027744 | Feb 2016 | JP |
2016-040670 | Mar 2016 | JP |
2016-058840 | Apr 2016 | JP |
2016-114953 | Jun 2016 | JP |
2016-140030 | Aug 2016 | JP |
2017-022742 | Jan 2017 | JP |
2017-058684 | Mar 2017 | JP |
2017-111457 | Jun 2017 | JP |
2017-175616 | Sep 2017 | JP |
2018-074479 | May 2018 | JP |
2018-077143 | May 2018 | JP |
2018-152632 | Sep 2018 | JP |
2018-155664 | Oct 2018 | JP |
2018-163363 | Oct 2018 | JP |
2019-074758 | May 2019 | JP |
2019-097178 | Jun 2019 | JP |
2019-118090 | Jul 2019 | JP |
2019-159284 | Sep 2019 | JP |
2019-159344 | Sep 2019 | JP |