The present disclosure relates to object sensing devices.
A car equipped with a radar and a camera is known. For example, in PATENT DOCUMENT 1, the radar is used to measure the distance between the equipped car and a vehicle traveling ahead of it, and the camera is used to recognize a lane or a road edge.
PATENT DOCUMENT 1: Japanese Patent Publication No. H11-212640
There is a demand for an object sensing device that can identify and sense what an object is, irrespective of the weather. Specifically, such an object sensing device is expected not only to detect the presence of an object but also to identify what kind of object is present, even in bad weather such as fog and rain.
In this case, recognition of an object might be improved by enhancing the visual recognition performance of the camera, for example by increasing the amount of light emitted by the headlights. However, if the amount of headlight illumination is simply increased, not only the amount of light returning from the object of interest but also the amount of light traveling back due to backscattering by fog particles increases, to the extent that the object of interest is no longer visible. The visibility of the object thus deteriorates, and the object cannot be recognized or identified.
Under such circumstances, the rate of detection of an object may be improved by detecting the object using a detector, such as a radar, that is not substantially affected by rain or fog. However, a radar has a significantly lower resolution than a camera. The radar can therefore sense the presence of an object, but cannot identify or recognize it. As a result, when the radar is applied to a vehicle, the vehicle stops or decelerates too frequently for objects for which it does not need to stop, leading to uncomfortable traveling.
The present disclosure describes implementations of an object sensing device that solve the above problem.
An object sensing device for sensing an object includes a radar configured to emit an electromagnetic wave to the object and generate a signal indicating a location of the object, a light source configured to emit light to the object, a sensor configured to obtain an image of the object, and a processor. The processor controls, based on the signal, timing of emission of the light by the light source, and exposure of the sensor.
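As a rough structural sketch of this arrangement, the following Python fragment shows how a processor might turn a radar range signal into an emission and exposure schedule. All names and the 3 m imaging depth are illustrative assumptions, not part of the disclosure.

```python
from dataclasses import dataclass

C = 299_792_458.0  # speed of light, m/s


@dataclass
class TimingPlan:
    gate_open_s: float   # delay from pulse emission to shutter opening
    gate_width_s: float  # how long the shutter stays open


def plan_timing(radar_distance_m: float, imaging_depth_m: float = 3.0) -> TimingPlan:
    """Derive exposure timing from the radar's range signal: the shutter
    opens after the round trip to the object and stays open long enough
    to span the chosen imaging depth."""
    return TimingPlan(gate_open_s=2.0 * radar_distance_m / C,
                      gate_width_s=2.0 * imaging_depth_m / C)


# Example: an object sensed 50 m ahead, imaged over a 3 m deep slice.
print(plan_timing(50.0))  # gate opens after ~333.6 ns, stays open ~20 ns
```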
Provided is an object sensing device that can obtain an image having a higher resolution in bad weather such as fog and rain.
Embodiments of the present disclosure will now be described with reference to the accompanying drawings. It should be noted that each of the embodiments described below is a specific preferred example of the present disclosure. The numerical values, shapes, materials, components, arrangements of components, and connections or couplings of components, etc., described in the embodiments below are therefore merely illustrative and are in no way intended to limit the present disclosure. Of the components in the embodiments below, those that are not recited in the independent claims indicating the broadest concept of the present disclosure are described as optional components.
Each figure is schematic and is not necessarily exactly to scale. Like parts are indicated by like reference characters throughout the drawings and will not be redundantly described or will be briefly described.
In the specification and drawings, the X axis, Y axis, and Z axis represent the three axes of a three-dimensional orthogonal coordinate system. In this embodiment, the Z axial direction is a vertical direction, and a direction perpendicular to the Z axis (a direction parallel to the X-Y plane) is a horizontal direction. The X and Y axes are orthogonal to each other and are both orthogonal to the Z axis.
The object 190 is typically another vehicle, but is not limited to this. The object 190 may be, for example, a pedestrian, a structure on a road, etc. The object 190 may be an obstacle, depending on the positional relationship between the own vehicle equipped with the object sensing device 100 and the object 190. In that case, based on the result of sensing by the object sensing device 100, warning to the driver of the own vehicle or braking of the own vehicle may be performed, for example.
The radar 110 is, for example, a millimeter wave radar. The radar 110 emits a pulsed millimeter wave to the object 190, and receives the reflected electromagnetic wave traveling back. The radar 110 outputs, to the sensing circuit 140, a signal indicating the times of emission and reception of the electromagnetic wave. Based on the times of emission and reception, the processor 150 generates a signal indicating a location of the object 190. In the case where the radar 110 emits an electromagnetic wave in only one direction, the location is a one-dimensional location of the object 190 with respect to the radar 110, i.e., the distance between the radar 110 and the object 190. In the case where the radar 110 performs a sector scan, i.e., the direction of emission sweeps through a sector in a horizontal plane with time, the location is a two-dimensional location of the object 190 with respect to the radar 110, i.e., a location of the object 190 in the horizontal plane.
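As a worked illustration of this time-of-flight relation (a sketch; the function names are illustrative, not from the disclosure):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def range_from_times(t_emit_s: float, t_receive_s: float) -> float:
    """One-dimensional location: the wave travels the radar-object
    distance twice, so d = c * (t_receive - t_emit) / 2."""
    return C * (t_receive_s - t_emit_s) / 2.0

def position_from_scan(distance_m: float, azimuth_rad: float) -> tuple:
    """Two-dimensional location during a sector scan: combine the measured
    range with the emission direction in the horizontal plane."""
    return (distance_m * math.sin(azimuth_rad),   # lateral offset
            distance_m * math.cos(azimuth_rad))   # forward distance

print(range_from_times(0.0, 1.0e-6))                # 1 us round trip -> ~150 m
print(position_from_scan(150.0, math.radians(10)))  # ~ (26.0 m, 147.7 m)
```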
The light source 120 emits pulsed light to the object 190. The intensity of the light emitted by the light source 120 may vary with time in a rectangular or triangular waveform. The light source 120 may be, for example, a laser device or a light emitting diode (LED). Laser devices herein also include laser diodes, which emit laser light. The light source 120, which typically emits a visible light beam, may also serve as a light source of a headlight of the own vehicle 210. Alternatively, the light source 120 may be dedicated to sensing and may emit near-infrared light. Laser devices, which can provide a high-speed response, are preferable as a pulsed light source. The light source 120 may be an LED, provided that it is driven by a circuit having high drive power.
The sensor 130 receives light only during a light reception period after a delay time has elapsed since emission of pulsed light, to image the object 190. The delay time corresponds to the distance between the radar 110 and the object 190. The light reception period corresponds to the length of the object 190 in the depth direction as viewed from the radar 110. The sensor 130 typically includes a two-dimensional array of imaging elements. The sensor 130 has a shutter, preferably a global shutter, whose shutter speed is relatively high.
The sensor 130 outputs a captured image to the low frequency removal circuit 160. The low frequency removal circuit 160 outputs an image enhanced by signal processing to the sensing circuit 170. The sensing circuit 170 senses an object, and outputs a result of the sensing.
The light source 120 emits pulsed light beams 330, 340 periodically to the object 190. The interval between the pulsed light beams 330 and 340 (also referred to as an “emission interval,” i.e., a time period from time t1 to time t4) is, for example, 10 μs. The present disclosure is not limited to this. The emission interval may be any suitable interval. For example, the emission interval of the light source 120 is in the range of 2-10 μs. As described below, the pulse width W of the pulsed light beams 330 and 340 is suitably selected, depending on the imaging range 230.
For example, in the case where images are captured at a rate of 30 frames per second, each frame has a duration of 33.3 ms. If the emission interval is 10 μs, the number of times pulsed light can be emitted per frame is of the order of 1000. The sensor 130 receives and accumulates photons generated by this large number of pulsed light emissions, and sums them to form an image. An example element with which the sensor 130 can implement this photon accumulation is an avalanche photodiode.
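The quoted per-frame pulse count follows directly from these two figures; a quick check:

```python
frame_rate_hz = 30
frame_duration_s = 1.0 / frame_rate_hz   # ~33.3 ms per frame
emission_interval_s = 10e-6              # 10 us between pulses

pulses_per_frame = frame_duration_s / emission_interval_s
print(pulses_per_frame)  # ~3333 pulses, i.e., of the order of 1000 per frame
```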
Assuming that the front edge of the pulsed light beam 330 is located at time t1, the sensor 130 performs imaging only for a light reception period (t3−t2) from time t2 to time t3. The time period from time t1 to time t2 is equal to 2d1/c. The time period from time t1 to time t3 is equal to 2d2/c. Therefore, the depth (d2−d1) of the imaging range 230 as viewed from the own vehicle 210 is expressed by (t3−t2)c/2. In other words, if the light reception period (t3−t2) is suitably set, an imaging range 230 suitable for the object 190 is obtained. Typically, the light reception period (t3−t2) is equal to the pulse width W of the pulsed light beams 330 and 340. If the processor 150 controls the light emission of the light source 120 and the exposure of the sensor 130 in this manner, the sensor 130 can selectively image only an object that is present in the imaging range 230. If the pulse width W is set to, for example, the length in the depth direction of a car, which is the object 190, the influence of bad weather such as fog and rain on imaging can be minimized.
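These relations can be checked numerically; the sketch below derives t2 and t3 from the edges d1 and d2 of the imaging range 230 (the 50 m and 53 m values are illustrative only):

```python
C = 299_792_458.0  # speed of light, m/s

def gate_from_range(d1_m: float, d2_m: float):
    """Per the relations above: t2 - t1 = 2*d1/c and t3 - t1 = 2*d2/c,
    so the gate duration (t3 - t2) fixes the imaged depth (d2 - d1)."""
    t2 = 2.0 * d1_m / C  # shutter opens: round trip to the near edge
    t3 = 2.0 * d2_m / C  # shutter closes: round trip to the far edge
    return t2, t3

t2, t3 = gate_from_range(50.0, 53.0)           # a 3 m deep slice 50 m ahead
print((t3 - t2) * 1e9, "ns gate")              # ~20 ns light reception period
print((t3 - t2) * C / 2.0, "m imaged depth")   # recovers d2 - d1 = 3 m
```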
For example, if the pulse width (i.e., the time period from time t2 to time t3) of the pulsed light beams 330 and 340 is 10 ns, the imaging range 230 (i.e., (d2−d1)) is 3 m, which corresponds to the depth of field. For example, if the pulse width is 50 ns, the imaging range 230 is 15 m. If it is assumed that a car is included as the object 190 in the imaging range 230, the pulse width (emission period) of the pulsed light beams 330 and 340 is preferably, for example, 10-50 ns. The present disclosure is not limited to this. The pulse width (emission period) of the pulsed light beams 330 and 340 may be any suitable time period. For example, the emission period of the light source 120 may be in the range of 10-100 ns.
The above structure allows the power of the light emitted by the light source 120 to be concentrated only at or near the object 190. The intensity of the signal reflected by the object 190 present in fog or rain can thus be increased, and therefore, the object 190 can be sensed even if it is located farther away. Imaging of the object 190 is also less affected by light from the light source 120 that has been reflected by fog or rain.
In the above embodiments, an operation mode may be implemented in which, when two objects located at different distances are sensed by the radar 110, the nearer object is imaged with priority and the further object is not imaged, for example. Alternatively, the nearer and further objects may be imaged alternately on a frame-by-frame basis, whereby each of the plurality of objects can be clearly imaged. Here, the sensor 130 performs imaging at a rate of, for example, 30 frames per second.
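A minimal sketch of this alternating mode, assuming even frames target the nearer object and odd frames the farther one (distances and names are illustrative):

```python
def target_for_frame(frame_index: int, near_m: float, far_m: float) -> float:
    """Distance on which to center the exposure gate in this frame."""
    return near_m if frame_index % 2 == 0 else far_m

for frame in range(4):
    print(frame, target_for_frame(frame, 30.0, 80.0), "m")
# 0 -> 30 m, 1 -> 80 m, 2 -> 30 m, 3 -> 80 m: at 30 frames per second,
# each of the two objects is effectively imaged at 15 frames per second.
```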
The processor 150 determines whether or not a vehicle is present in the same lane as the own vehicle 210, based on the distance between the own vehicle 210 and that vehicle and on the angle of that vehicle with respect to the forward direction of the own vehicle 210. For example, in the case where the distance 1012 is 10 m and the angle 1014 is 20 degrees, the lateral offset is 10 m × sin 20° = 3.4 m, which is greater than half of a typical lane width, and therefore, the processor 150 determines that the vehicle 1010 is not present in the same lane as the own vehicle 210 but is present in the adjacent lane 1016.
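A sketch of this lane test (the 1.75 m lane half-width is an assumed value, not from the disclosure):

```python
import math

LANE_HALF_WIDTH_M = 1.75  # assumed: half of a ~3.5 m lane

def in_same_lane(distance_m: float, angle_deg: float) -> bool:
    """Lateral offset of the sensed vehicle from the own vehicle's forward
    axis; an offset beyond the lane half-width means an adjacent lane."""
    lateral_m = abs(distance_m * math.sin(math.radians(angle_deg)))
    return lateral_m <= LANE_HALF_WIDTH_M

print(in_same_lane(10.0, 20.0))  # 10 m * sin 20 deg = 3.4 m -> False (adjacent lane)
print(in_same_lane(25.0, 0.0))   # directly ahead -> True
```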
When the radar 110 senses the vehicles 1010, 1020, and 1030, the processor 150 calculates the degree of risk of a collision with each of the vehicles 1010, 1020, and 1030 based on the distances 1012, 1022, and 1032, and on the angle 1014, an angle of 0° (the vehicle 1020 is present directly in front of the own vehicle 210), and the angle 1034. For example, the degree of risk is determined to be high for a vehicle traveling in the same lane 1026. In addition, the shorter the distance, the higher the determined degree of risk. Therefore, the degree of risk for the vehicle 1010 is higher than that for the vehicle 1030. In the case where the degree of risk is determined in accordance with the above rule, the following relative order of the degrees of risk is obtained: the degree of risk for the vehicle 1020 > the degree of risk for the vehicle 1010 > the degree of risk for the vehicle 1030.
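This ranking rule can be expressed as a sort key: same-lane vehicles outrank vehicles in other lanes, and a shorter distance outranks a longer one. In the sketch below, the distances for the vehicles 1020 and 1030 and the angle 1034 are illustrative assumptions; only the 10 m / 20° pair for the vehicle 1010 comes from the text.

```python
import math

def risk_key(distance_m: float, angle_deg: float,
             lane_half_width_m: float = 1.75):  # assumed lane half-width
    """Sort key for descending risk: same-lane vehicles first,
    then nearer vehicles first."""
    lateral_m = abs(distance_m * math.sin(math.radians(angle_deg)))
    same_lane = lateral_m <= lane_half_width_m
    return (not same_lane, distance_m)  # False sorts before True

vehicles = {"1010": (10.0, 20.0),   # from the text: 10 m at 20 degrees
            "1020": (25.0, 0.0),    # assumed distance, directly ahead
            "1030": (40.0, -15.0)}  # assumed distance and angle
ranked = sorted(vehicles, key=lambda v: risk_key(*vehicles[v]))
print(ranked)  # ['1020', '1010', '1030'] -- matches the order in the text
```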
Based on the resultant degrees of risk, the processor 150 controls the timing of light emission by the light source 120 and of exposure of the sensor 130 so as to obtain images of two of the vehicles 1010, 1020, and 1030 in different frames.
In one embodiment, when the radar 110 senses a plurality of objects, the processor 150 controls the exposure of the sensor 130 based on a position signal of the closest object. As a result, image processing can be performed on a vehicle having the highest degree of risk with priority.
In one embodiment, when three or more objects at different distances are sensed, the furthest object may be excluded from those to be imaged, depending on its distance, and only the closest and intermediate objects may be imaged alternately. For example, the vehicle 1030, which is located furthest away, may not be imaged, and only the vehicles 1010 and 1020 may be imaged.
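A sketch of this exclusion rule combined with the frame-by-frame alternation described earlier (distances are the illustrative values used above):

```python
def imaging_targets(distances_m: list, keep: int = 2) -> list:
    """Keep only the `keep` nearest objects; the furthest are not imaged."""
    return sorted(distances_m)[:keep]

targets = imaging_targets([10.0, 25.0, 40.0])  # e.g., vehicles 1010/1020/1030
for frame in range(4):
    print(frame, targets[frame % len(targets)], "m")
# Frames alternate between 10 m and 25 m; the 40 m object is skipped.
```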
The object sensing device 100 may be mounted on a mobile body. Preferably, the object sensing device 100 is mounted on a car. In one embodiment, the light source 120 illuminates the interior of the mobile body, and the sensor 130 is separated from the light source 120.
According to the above various embodiments, driving the light source 120 in a pulsed mode improves the intensity of the signal corresponding to light from the object 190 while making effective use of the same total amount of light. In addition, pulsed exposure of the sensor 130 reduces offset noise (background noise) due to fog or rain.
The elements (or acts) in the above embodiments may be combined arbitrarily without departing from the spirit and scope of the present invention.
What has been described above includes various examples of the present invention. It is, of course, not possible to describe every conceivable combination of components or methodologies for purposes of describing the present invention, but one of ordinary skill in the art will recognize that many further combinations and permutations are possible. Accordingly, the present invention is intended to embrace all such alterations, modifications, and variations that fall within the spirit and scope of the appended claims.
Number | Date | Country | Kind
---|---|---|---
2018-037087 | Mar 2018 | JP | national
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2018/041358 | 11/7/2018 | WO | 00