The present disclosure relates to a gating camera.
An object identification system that senses the position and the kind of an object existing in the vicinity of a vehicle is used for autonomous driving or for autonomous control of light distribution of a headlamp. The object identification system includes a sensor and an arithmetic processing device configured to analyze the output of the sensor. The sensor is selected from among cameras, LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging), millimeter-wave radars, ultrasonic sonars, and the like, giving consideration to the application, required precision, and cost.
It is not possible to obtain depth information from a typical monocular camera. Accordingly, it is difficult to separate multiple objects at different distances even when the multiple objects overlap.
As a camera capable of acquiring depth information, a time-of-flight (TOF) camera is known. The TOF camera is configured to project infrared light from a light emitting device, measure the time of flight until the reflected light returns to an image sensor, and generate a TOF image in which the time of flight is converted into distance information.
As an active sensor that serves as an alternative to the TOF camera, a gating camera (also referred to as a gated camera) has been proposed (Patent Literatures 1 and 2). The gating camera is configured to divide an imaging range into multiple ranges, and to capture an image for each range while changing the exposure timing and the exposure time. This allows a slice image to be acquired for each target range, and each slice image includes only an object included in the corresponding range.
With an active sensor such as a distance measurement sensor or a gating camera, there is a need to accurately calibrate a time difference between a light emission timing of a light emitting device and an exposure timing of a light receiving device. Patent Literatures 1 to 3 disclose techniques related to calibration.
The techniques disclosed in Patent Literatures 1 to 3 each presuppose a TOF sensor, that is, hardware for measuring the time of flight, and therefore cannot be applied to a gating camera.
Patent Literature 1 discloses a calibration method for a distance measurement system mounted on a small electronic apparatus. Specifically, the electronic apparatus is placed on a desk or the like, and a surface of the desk is used as a reflector. The application of the technique is limited to a small electronic apparatus, and the technique cannot be applied to a vehicle sensor that does not always have a reflector at the same distance.
In Patent Literature 2, a reflection unit that reflects light emitted from a light emitting unit toward a light receiving unit is built into an optical distance measuring device. In this technique, a part of the light emitted from the light emitting unit is shielded by the reflection unit, or only the light reflected from the reflection unit is incident on a part of the light receiving unit. That is, a part of the hardware is allocated for calibration and cannot be used during normal imaging, so a part of the hardware (or a part of the energy) is wasted.
Patent Literature 3 discloses a technique in which a part of light emitted from a light source is incident on an image sensor through a light guide portion and an optical fiber. As in Patent Literature 2, a part of the image sensor is allocated for calibration, and thus cannot be used during the normal imaging, and a part of hardware is wasted.
The present disclosure has been made in view of such a situation, and an exemplary object of an aspect thereof is to provide a gating camera capable of calibration.
An aspect of the present disclosure relates to a gating camera configured to divide a field of view in a depth direction into multiple ranges, and to generate multiple slice images that correspond to the multiple ranges. The gating camera includes a controller configured to generate an emission control signal and a first exposure control signal, an illumination apparatus configured to emit probe light in accordance with the emission control signal during normal imaging, an image sensor configured to perform exposure in accordance with the first exposure control signal, and a calibration light source configured to emit calibration light to the image sensor in accordance with the emission control signal during calibration. During the calibration, the controller sweeps a time difference between the emission control signal and the first exposure control signal, and acquires a time difference at which a pixel value of the image sensor increases.
According to an aspect of the present disclosure, calibration of a gating camera is achieved.
Description will be made regarding a summary of some exemplary embodiments of the present disclosure. The summary is provided as a prelude to the detailed description that follows; it is intended to simplify the concepts of one or more embodiments for the purpose of basic understanding, and is not intended to limit the scope of the invention or the disclosure. The summary is not an extensive overview of all possible embodiments and is not intended to limit essential components of the embodiments. For convenience, “an embodiment” may be used to refer to a single embodiment (example or modification) or multiple embodiments (examples or modifications) disclosed in the specification.
A gating camera according to an embodiment divides a field of view in a depth direction into multiple ranges, and generates multiple slice images that correspond to the multiple ranges. A controller sweeps a time difference between the emission control signal and the first exposure control signal, and monitors a change in the pixel value of the image sensor at each time difference during the calibration.
According to the configuration, a timing error can be calibrated in the gating camera having no hardware for measuring a flight time. Furthermore, by preparing a light source for calibration in addition to the light source used during the normal imaging, imaging using all the pixels of the image sensor can be performed during the normal imaging, and the probe light generated by the illumination apparatus is not shielded, and thus waste of hardware can be reduced.
In an embodiment, the controller may acquire a value of the time difference at which the pixel value becomes relatively large.
In an embodiment, the controller may generate a second exposure control signal during a period in which the image sensor cannot detect the calibration light during the calibration. The controller may acquire a time difference at which a value obtained by correcting a pixel value calculated according to the first exposure control signal with a pixel value calculated according to the second exposure control signal increases. Since the ambient light can be detected by the second exposure control signal and its influence can be reduced, the accuracy of the calibration can be improved. In particular, in the case of a vehicle sensor, the ambient light cannot be blocked during the calibration, and thus this configuration is effective.
In an embodiment, the second exposure control signal may be generated every time the time difference is switched. In a case where the ambient light varies with time, calibration accuracy can be improved.
In an embodiment, the second exposure control signal may be generated as a set with the first exposure control signal. That is, the influence of the ambient light can be further prevented by imaging the ambient light every time the exposure for the purpose of imaging the calibration light is performed.
In an embodiment, the image sensor may be a multi-tap image sensor, and may capture an image using a first tap in accordance with the first exposure control signal and capture an image using a second tap in accordance with the second exposure control signal.
In an embodiment, the illumination apparatus may include a laser diode, and the calibration light source may include a light emitting diode. An increase in cost can be prevented by using the light emitting diode instead of a laser diode as the calibration light source.
In an embodiment, the illumination apparatus and the calibration light source may share a drive circuit.
In an embodiment, the controller may monitor multiple pixel values of the image sensor, and may acquire a time difference for each pixel value. In a case where a timing error exists for each pixel of the image sensor, the time difference for each pixel can be calibrated.
In an embodiment, the controller may monitor a pixel value of a predetermined range of the image sensor, and may acquire a time difference at which the pixel value increases. This method is preferably employed in a case where the timing error for each pixel is negligible.
In an embodiment, the controller may monitor multiple pixel values of the image sensor, and may acquire a time difference at which a representative value based on the multiple pixel values increases.
Hereinafter, preferred embodiments will be described with reference to the drawings. The same or similar components, members, and processes shown in the drawings are denoted by the same reference numerals, and redundant description thereof will be omitted as appropriate. The embodiments have been described for exemplary purposes only, and are by no means intended to limit the disclosure and the invention. Also, it is not necessarily essential for the disclosure and invention that all the features or a combination thereof be provided as described in the embodiments.
The sensing system 10 mainly includes a gating camera 20. The gating camera 20 includes an illumination apparatus 22, an image sensor 24, a controller 26, a processing device 28, and a calibration light source 30. The imaging by the gating camera 20 is performed by dividing a field of view into N (N≥2) ranges RNG1 to RNGN in the depth direction. Adjacent ranges may overlap each other in the depth direction at a boundary therebetween.
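For illustration, the following is a minimal Python sketch of such a range division. The boundary distances, the number of ranges, and the overlap width are illustrative assumptions, not values taken from the present disclosure.

```python
# Hypothetical sketch: dividing a field of view into N depth ranges,
# with adjacent ranges overlapping slightly at their boundaries.
# All numeric values are illustrative assumptions.

def divide_ranges(d_near, d_far, n, overlap=0.0):
    """Return a list of (d_min, d_max) pairs for n depth ranges.

    Adjacent ranges overlap by `overlap` meters at their shared boundary.
    """
    step = (d_far - d_near) / n
    ranges = []
    for i in range(n):
        d_min = d_near + i * step - (overlap / 2 if i > 0 else 0.0)
        d_max = d_near + (i + 1) * step + (overlap / 2 if i < n - 1 else 0.0)
        ranges.append((d_min, d_max))
    return ranges

print(divide_ranges(0.0, 100.0, 4, overlap=2.0))
# [(0.0, 26.0), (24.0, 51.0), (49.0, 76.0), (74.0, 100.0)]
```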
The sensing system 10 is capable of calibration in addition to normal imaging. First, hardware and functions related to the normal imaging will be described.
The illumination apparatus 22 is used for the normal imaging, and emits probe light L1 in front of the vehicle in synchronization with an emission control signal S1 supplied from the controller 26. As the probe light L1, infrared light is preferably employed. However, the present invention is not restricted to such an arrangement. Also, as the probe light L1, visible light having a predetermined wavelength or ultraviolet light may be employed.
The image sensor 24 includes multiple pixels, is capable of exposure control in synchronization with an exposure control signal S2 supplied from the controller 26, and generates a raw image (RAW image) composed of the multiple pixels. The image sensor 24 is used for both the normal imaging and the calibration. The image sensor 24 is sensitive to the same wavelength as that of the probe light L1, and images reflected light (returned light) L2 reflected by the object OBJ. An image IMG_RAWi generated by the image sensor 24 with respect to the i-th range RNGi is referred to as a raw image or a primary image as necessary, so as to distinguish it from a slice image IMGsi, which is the final output of the gating camera 20.
The controller 26 generates the emission control signal S1 and the exposure control signal S2, and controls the emission timing (light emission timing) of the probe light L1 by the illumination apparatus 22 and the exposure timing by the image sensor 24. Specifically, the controller 26 is implemented as a combination of a processor (hardware) such as a central processing unit (CPU), a micro processing unit (MPU), a microcontroller, or the like, and a software program to be executed by the processor (hardware).
The image sensor 24 and the processing device 28 are connected via a serial interface or the like. The processing device 28 receives the raw image IMG_RAWi from the image sensor 24, and generates the slice image IMGsi.
Since the gating camera 20 images reflected light from a far-side object, a sufficient image may not be obtained by a single imaging operation (one set of light emission and exposure). Accordingly, the gating camera 20 may repeat imaging M times (M≥2) for each range RNGi. In this case, M raw images IMG_RAWi1 to IMG_RAWiM are generated for one range RNGi. The processing device 28 may synthesize the M raw images IMG_RAWi1 to IMG_RAWiM for one range RNGi to generate one slice image IMGsi.
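A minimal sketch of this synthesis step, assuming simple averaging of the M raw images (the disclosure does not fix a particular synthesis method; summation or weighted accumulation would work similarly):

```python
import numpy as np

# Hypothetical sketch: synthesizing M raw images for one range RNG_i
# into a single slice image by averaging. Averaging is one plausible
# choice of synthesis; it reduces shot noise relative to a single frame.

def synthesize_slice(raw_images):
    """raw_images: list of M 2-D arrays IMG_RAW_i1 .. IMG_RAW_iM."""
    stack = np.stack(raw_images, axis=0)
    return stack.mean(axis=0)

# Toy usage with simulated noisy frames.
raws = [np.random.poisson(5.0, size=(4, 4)).astype(float) for _ in range(8)]
slice_img = synthesize_slice(raws)  # one slice image IMGs_i
```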
It should be noted that the controller 26 and the processing device 28 may be configured with the same hardware, and may be implemented, for example, by the combination of the microcontroller and the software program.
The above is the configuration and the functions relating to the normal imaging. Next, the operation of the normal imaging by the gating camera 20 will be described.
A round-trip time TMINi, which is the period from the departure of light from the illumination apparatus 22 at a given time point, through its arrival at the distance dMINi, until the return of the reflected light to the image sensor 24, is given by TMINi = 2 × dMINi/c, where c is the speed of light.
Similarly, a round-trip time TMAXi, for the light that travels to the distance dMAXi and returns to the image sensor 24, is given by TMAXi = 2 × dMAXi/c.
When only an object OBJ included in the range RNGi is to be imaged, the controller 26 generates the exposure control signal S2 so as to start the exposure at a time point t2 = t0 + TMINi and to end the exposure at a time point t3 = t1 + TMAXi, where t0 and t1 represent the start and end time points of the light emission, respectively. This is a single exposure operation.
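The exposure window follows directly from the round-trip times TMINi and TMAXi. The following sketch computes it, with the light-emission time points t0 and t1 passed in as parameters; the numeric values in the usage example are illustrative assumptions:

```python
C = 299_792_458.0  # speed of light [m/s]

def exposure_window(d_min_i, d_max_i, t0, t1):
    """Exposure start/end time points for range RNG_i.

    t0, t1: start and end time points of the probe-light emission [s].
    Returns (t2, t3) with t2 = t0 + T_MIN_i and t3 = t1 + T_MAX_i.
    """
    t_min = 2.0 * d_min_i / C  # round trip to the near boundary
    t_max = 2.0 * d_max_i / C  # round trip to the far boundary
    return t0 + t_min, t1 + t_max

# Example: range from 25 m to 50 m, with a 10 ns light pulse starting at t=0.
t2, t3 = exposure_window(25.0, 50.0, 0.0, 10e-9)
print(t2, t3)  # ~1.668e-07 s, ~3.436e-07 s
```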
When the i-th range RNGi is imaged, the light emission and the exposure may be performed M times. In this case, preferably, the controller 26 may repeatedly execute the above exposure operation with a predetermined period τ2.
When the slice image IMG2 is captured, the image sensor is exposed by only the reflected light from the range RNG2, and thus the slice image IMG2 includes only an image of the object OBJ2. Similarly, when the slice image IMG3 is captured, the image sensor is exposed by only the reflected light from the range RNG3, and thus the slice image IMG3 includes only an image of the object OBJ3. As described above, with the gating camera 20, an object can be separately imaged for each range.
The above is the normal imaging by the gating camera 20. Next, a configuration and functions related to the calibration of the gating camera 20 will be described.
The calibration light source 30 is active during calibration, and emits calibration light L3 to the image sensor 24 according to the emission control signal S1 generated by the controller 26.
Description will be made assuming that the difference ΔT is known between a delay time Ta, from the assertion of the emission control signal S1 during the normal imaging until the light emission of the illumination apparatus 22, and a delay time Tb, from the assertion of the emission control signal S1 during the calibration until the light emission of the calibration light source 30.
During the calibration, the controller 26 sweeps a time difference τ between the emission control signal S1 and the exposure control signal S2, and monitors a change in the pixel value of one or more pixels (each referred to as a "pixel of interest") of the image sensor 24. For example, the controller 26 acquires the time difference τCAL at which a relatively large pixel value is obtained.
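A minimal sketch of this sweep follows. The function capture() is a hypothetical stand-in for the actual camera interface, which is not specified in the disclosure: it asserts S1 and S2 separated by τ and returns the value of the pixel of interest. A toy simulated capture is included so the sketch runs on its own.

```python
# Hypothetical sketch of the calibration sweep over the time difference tau.

def sweep(taus, capture):
    """Record the pixel-of-interest value at each time difference tau."""
    return {tau: capture(tau) for tau in taus}

# Simulated capture (illustrative assumption): the pixel responds only
# when the calibration light arrives inside the exposure window.
def simulated_capture(tau, arrival=15e-9, width=10e-9):
    return 255 if arrival <= tau <= arrival + width else 0

profile = sweep([5e-9 * k for k in range(10)], simulated_capture)
# e.g. {..., 1.5e-08: 255, 2e-08: 255, 2.5e-08: 255, 3e-08: 0, ...}
```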
The time difference τCAL may be determined by the controller 26 or the processing device 28.
Next, a calibration operation will be described based on some examples.
In Example 1, description will be made focusing on only one pixel (the pixel of interest) in the raw image IMG_RAW generated by the image sensor 24.
The position of the pixel of interest is not limited, and may be, for example, the center of the image sensor 24.
Here, for simplicity, description will be made assuming that the time difference τ between the emission control signal S1 and the exposure control signal S2 is swept in 5 stages (τ−2, τ−1, τ0, τ1, and τ2). In practice, the time difference τ can be varied in finer steps with a larger number of steps.
L3a represents a departure time of the calibration light L3 from the calibration light source 30, and L3b represents an arrival time of the calibration light L3 at the image sensor 24. The delay time Tb exists from the assertion of the emission control signal S1 to the light emission timing (departure time) of the calibration light source 30.
There is a propagation delay Tc of the calibration light L3 between the departure time (L3a) and the arrival time (L3b) of the calibration light L3. The propagation delay Tc is determined according to a distance between the calibration light source 30 and the image sensor 24.
IS represents the exposure period of the image sensor 24. A delay time Td also exists between the assertion of the exposure control signal S2 and the actual start of exposure of the image sensor 24.
If the influence of noise and ambient light is ignored, the pixel value Pa of the pixel of interest is zero when the arrival time L3b of the calibration light L3 is outside the exposure period IS of the image sensor 24, and increases when the arrival time L3b is included in the exposure period IS.
The method for determining the time difference τCAL is not limited in particular. For example, the time difference τ at which the pixel value Pa takes its maximum value may be used as τCAL. Alternatively, τCAL may be the time difference at which a value obtained by differentiating the pixel value Pa with respect to the time difference τ exceeds a predetermined value.
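The two determination methods mentioned above might be sketched as follows; the threshold and the five sweep values (matching the five-stage sweep of the earlier example) are illustrative assumptions:

```python
def tau_cal_by_max(taus, values):
    """tau at which the pixel value Pa takes its maximum."""
    return taus[values.index(max(values))]

def tau_cal_by_derivative(taus, values, threshold):
    """First tau at which the finite difference dPa/dtau exceeds `threshold`."""
    for i in range(1, len(taus)):
        slope = (values[i] - values[i - 1]) / (taus[i] - taus[i - 1])
        if slope > threshold:
            return taus[i]
    return None

taus = [-20e-9, -10e-9, 0.0, 10e-9, 20e-9]  # illustrative 5-stage sweep
vals = [0, 0, 40, 255, 30]                  # illustrative pixel values Pa
print(tau_cal_by_max(taus, vals))              # 1e-08
print(tau_cal_by_derivative(taus, vals, 1e9))  # 0.0
```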
The controller 26 can correct the time difference between the emission control signal S1 and the exposure control signal S2 during the normal imaging using the time difference τCAL.
As described above, a timing error can be calibrated in the gating camera having no hardware for measuring a flight time. Furthermore, by preparing a light source for calibration in addition to the light source used during the normal imaging, imaging using all the pixels of the image sensor 24 can be performed during the normal imaging, and the probe light L1 generated by the illumination apparatus 22 is not shielded, and thus waste of hardware can be reduced.
During the calibration, when ambient light that is not negligible relative to the calibration light L3 is incident on the image sensor 24, the calibration precision is reduced.
Accordingly, in Example 2, in addition to the exposure (first exposure) for detecting the calibration light L3, exposure (second exposure) for measuring only the ambient light is performed.
The controller 26 (or the processing device 28) corrects the pixel value Pa calculated according to the first exposure control signal S2a with a pixel value Pb calculated according to the second exposure control signal S2b. The controller 26 acquires the time difference τCAL at which the corrected pixel value Pa′ increases. Most simply, the corrected pixel value may be generated by subtracting Pb from Pa, that is, Pa′ = Pa − Pb.
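A minimal sketch of this subtraction-based correction; the clamp at zero is an illustrative assumption to keep noise from producing a negative pixel value:

```python
def corrected_pixel_value(pa, pb):
    """Correct the calibration-light reading Pa with the ambient-only
    reading Pb by subtraction (Pa' = Pa - Pb). The clamp at zero is an
    illustrative assumption, not part of the disclosure."""
    return max(pa - pb, 0)

print(corrected_pixel_value(180, 60))  # 120
```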
In practice, in many cases, the intensity of the ambient light changes with time.
Therefore, in order to measure the ambient light that changes with time, the second exposure control signal S2b may be generated every time the time difference τ is switched.
Furthermore, the light emission and the first exposure may be repeated M times for each time difference τj.
In this case, the second exposure is preferably executed every time the first exposure is executed. As a result, M pixel values Paj and M pixel values Pbj are generated. Each pixel value Paj is corrected using the corresponding pixel value Pbj, thereby generating M corrected pixel values Paj′. By processing the M corrected pixel values Paj′, a pixel value Pj is generated. The controller 26 acquires the time difference τj at which the pixel value Pj increases.
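A sketch of this per-set correction, assuming averaging as the "processing" of the M corrected values (summation would also work):

```python
# Hypothetical sketch: M sets of (first exposure, second exposure) per
# time difference tau_j. Each Pa_j is corrected by its paired Pb_j,
# and the M corrected values are averaged into P_j.

def pixel_value_for_tau(pairs):
    """pairs: list of M (Pa_j, Pb_j) tuples for one time difference tau_j."""
    corrected = [max(pa - pb, 0) for pa, pb in pairs]  # Pa_j' = Pa_j - Pb_j
    return sum(corrected) / len(corrected)             # P_j (average)

pairs = [(130, 55), (128, 60), (140, 58)]  # illustrative readings
print(pixel_value_for_tau(pairs))          # 75.0
```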
The image sensor 24 may be a multi-tap CMOS sensor having multiple floating diffusions for each pixel. In this case, the image sensor 24 may capture an image using a first tap in accordance with the first exposure control signal S2a and capture an image using a second tap in accordance with the second exposure control signal S2b.
Next, specific configuration examples of the illumination apparatus 22 and the calibration light source 30 will be described.
The calibration light source 30, like the illumination apparatus 22, includes a semiconductor light emitting element 30a and a drive circuit 30b thereof. The semiconductor light emitting element 30a only needs to irradiate the nearby image sensor 24, and thus neither high output nor high directivity is required. Accordingly, a light emitting diode is preferably employed.
It should be noted that a laser diode may be employed as the semiconductor light emitting element 30a.
The drive circuit 30b supplies a drive current ILED to the semiconductor light emitting element 30a in response to the emission control signal S1, causing the semiconductor light emitting element 30a to emit pulsed light. The configuration of the drive circuit 30b is not limited in particular, and a known LED driver can be used.
The illumination apparatus 22 and the calibration light source 30 may share a drive circuit.
The present invention has been described above based on the embodiments. It will be understood by those skilled in the art that the embodiments have been described for exemplary purposes only, that various modifications can be made to the combinations of the respective components and processes, and that such modifications are also within the scope of the present invention. Hereinafter, such modifications will be described.
(Modification 1)
In Modification 1, multiple adjacent pixels of the image sensor 24 are set as pixels of interest. The gating camera 20 may generate a representative value from the multiple pixel values, and may acquire the time difference τCAL at which the representative value increases. An average value, a total value, a maximum value, or the like of the multiple pixel values can be used as the representative value.
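A sketch of forming such a representative value over a small region of interest; the ROI coordinates in the usage example are illustrative assumptions:

```python
import numpy as np

def representative_value(raw_image, roi, mode="mean"):
    """Representative value over a region of interest of the raw image.

    raw_image: 2-D array; roi: (row0, row1, col0, col1) slice bounds.
    mode: "mean", "sum", or "max", as mentioned in the text.
    """
    r0, r1, c0, c1 = roi
    patch = raw_image[r0:r1, c0:c1]
    return {"mean": patch.mean, "sum": patch.sum, "max": patch.max}[mode]()

img = np.arange(64, dtype=float).reshape(8, 8)
print(representative_value(img, (3, 5, 3, 5), "mean"))  # 31.5
```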
(Modification 2)
The exposure timing of the image sensor 24 may have an in-plane variation. In this case, multiple pixels of interest may be set at positions distant from one another on the image sensor 24, and the time difference τCAL at which the pixel value increases may be acquired for each pixel of interest. Accordingly, the in-plane variation of the timing error of the image sensor 24 can be calibrated.
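A sketch of acquiring a per-pixel τCAL map from the sweep, which would allow the in-plane variation to be corrected pixel by pixel; the stacked-frame interface and the numeric values are assumptions:

```python
import numpy as np

# Hypothetical sketch: for every pixel, pick the tau at which that
# pixel's value is largest across the sweep.

def per_pixel_tau_cal(taus, frames):
    """taus: list of K time differences; frames: K 2-D raw images,
    one per tau. Returns a 2-D map of tau_CAL, one value per pixel."""
    stack = np.stack(frames, axis=0)  # shape (K, H, W)
    best = np.argmax(stack, axis=0)   # index of the peak per pixel
    return np.asarray(taus)[best]     # tau_CAL map, shape (H, W)

frames = [np.zeros((2, 2)) for _ in range(3)]
frames[1][0, 0] = 5.0  # pixel (0,0) peaks at the second tau
frames[2][1, 1] = 7.0  # pixel (1,1) peaks at the third tau
print(per_pixel_tau_cal([0.0, 1e-8, 2e-8], frames))
```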
(Application)
The gating camera 20 generates multiple slice images IMGs1 to IMGsN that correspond to the multiple ranges RNG1 to RNGN. The i-th slice image IMGsi includes only an image of an object included in the corresponding range RNGi.
The arithmetic processing device 40 is configured to identify the kind of an object based on the multiple slice images IMGs1 to IMGsN that correspond to the multiple ranges RNG1 to RNGN generated by the gating camera 20. The arithmetic processing device 40 is provided with a classifier 42 implemented based on a learned model generated by machine learning. Also, the arithmetic processing device 40 may include multiple classifiers 42 optimized for the respective ranges. The algorithm of the classifier 42 is not limited in particular. Examples of algorithms that can be employed include You Only Look Once (YOLO), Single Shot MultiBox Detector (SSD), Region-based Convolutional Neural Network (R-CNN), Spatial Pyramid Pooling (SPPnet), Faster R-CNN, Deconvolutional SSD (DSSD), and Mask R-CNN. Also, other algorithms that will be developed in the future may be employed.
The arithmetic processing device 40 may be implemented as a combination of a processor (hardware) such as a central processing unit (CPU), a micro processing unit (MPU), a microcontroller, or the like, and a software program to be executed by the processor (hardware). Also, the arithmetic processing device 40 may be configured as a combination of multiple processors. Alternatively, the arithmetic processing device 40 may be configured as hardware alone. The functions of the arithmetic processing device 40 and the processing device 28 may be implemented in the same processor.
The sensing system 10 may be built into a vehicle lamp 200.
The information on the object OBJ detected by the sensing system 10 may be used for light distribution control of the vehicle lamp 200. Specifically, the lamp ECU 210 generates a suitable light distribution pattern based on the information on the kind of the object OBJ and a position thereof generated by the sensing system 10. The lighting circuit 224 and the optical system 226 operate so as to provide the light distribution pattern generated by the lamp ECU 210. The arithmetic processing device 40 of the sensing system 10 may be provided outside the vehicle lamp 200, that is, on the vehicle side.
The information on the object OBJ detected by the sensing system 10 may be transmitted to the in-vehicle ECU 310. The in-vehicle ECU 310 may use the information for autonomous driving or driving support.
The embodiments have been described for exemplary purposes only, showing one aspect of the principles and applications of the present invention. Also, many modifications and variations can be made to the embodiments without departing from the spirit of the present invention as defined in the claims.
The present disclosure can be applied to a sensing technique.
Priority application: Japanese Patent Application No. 2020-218975, filed in December 2020 (JP, national).
International filing: PCT/JP2021/046824, filed on December 17, 2021 (WO).