The present invention relates to a technology to determine the density of fog.
Patent Literature 1 describes a technology to decide whether fog is present around a vehicle.
In Patent Literature 1, while illuminating light is emitted by the headlights, a decision as to whether fog is present is made based on the luminosity of an area not illuminated by the illuminating light in an image captured of the area in front of the vehicle. Specifically, when no fog is present, the luminosity of the area not illuminated by the illuminating light is low. In contrast, when fog is present, the illuminating light is reflected by fog particles, so that the luminosity gradually increases from the center to the periphery of the image. In Patent Literature 1, this characteristic is used to decide whether fog is present.
In the technology described in Patent Literature 1, illumination from headlights is a prerequisite, so a decision as to whether fog is present cannot be made unless the surroundings are dark to a certain degree.
It is an object of the present invention to allow the density of fog to be determined regardless of the presence or absence of illumination from headlights.
A fog determination apparatus according to the present invention includes
In the present invention, the density of fog is determined based on a distance determined from point data obtained by an optical sensor, and on smoothness of luminance of image data. Therefore, the density of fog can be determined regardless of the presence or absence of illumination from headlights.
***Description of Configuration***
Referring to
The fog determination apparatus 10 is a computer, such as an electronic control unit (ECU), to be mounted on a mobile object 100.
In the first embodiment, the mobile object 100 is a vehicle. However, the mobile object 100 is not limited to a vehicle, and may be of another type, such as a ship or an airplane. The fog determination apparatus 10 may be implemented in a form integrated with, or inseparable from, the mobile object 100 or another component illustrated in the drawing, or may be implemented in a form detachable or separable from the mobile object 100 or another component illustrated in the drawing.
The fog determination apparatus 10 includes hardware of a processor 11, a memory 12, a storage 13, and a communication interface 14. The processor 11 is connected with other hardware components via signal lines and controls the other hardware components.
The processor 11 is an integrated circuit (IC) that performs processing. Specific examples of the processor 11 are a central processing unit (CPU), a digital signal processor (DSP), and a graphics processing unit (GPU).
The memory 12 is a storage device to temporarily store data. Specific examples of the memory 12 are a static random access memory (SRAM) and a dynamic random access memory (DRAM).
The storage 13 is a storage device to store data. A specific example of the storage 13 is a hard disk drive (HDD). Alternatively, the storage 13 may be a portable recording medium, such as a Secure Digital (SD, registered trademark) memory card, CompactFlash (CF, registered trademark), a NAND flash, a flexible disk, an optical disc, a compact disc, a Blu-ray (registered trademark) disc, or a digital versatile disc (DVD).
The communication interface 14 is an interface for communication with external devices. Specific examples of the communication interface 14 are an Ethernet (registered trademark) port, a Universal Serial Bus (USB) port, and a High-Definition Multimedia Interface (HDMI, registered trademark) port.
The communication interface 14 is connected with an optical sensor 41 and a camera 42 that are mounted on the mobile object 100.
The optical sensor 41 is a device that emits a light beam and receives reflected light of the emitted light beam reflected at a reflection point. A specific example of the optical sensor 41 is a LiDAR (Light Detection and Ranging) sensor. In the first embodiment, it is assumed that the optical sensor 41 emits a light beam forward in the direction of movement of the mobile object 100.
The camera 42 is a device that captures an image of an area around the mobile object 100 and generates image data. In the first embodiment, it is assumed that the camera 42 captures an image of an area forward in the direction of movement of the mobile object 100.
The fog determination apparatus 10 includes, as functional components, a point data acquisition unit 21, an image data acquisition unit 22, a luminance calculation unit 23, and a fog determination unit 24. The functions of the functional components of the fog determination apparatus 10 are realized by software.
The storage 13 stores programs for realizing the functions of the functional components of the fog determination apparatus 10. These programs are loaded into the memory 12 by the processor 11 and executed by the processor 11. This realizes the functions of the functional components of the fog determination apparatus 10.
The storage 13 realizes the function of a reference value storage unit 31.
***Description of Operation***
Referring to
The operation of the fog determination apparatus 10 according to the first embodiment corresponds to a fog determination method according to the first embodiment. The operation of the fog determination apparatus 10 according to the first embodiment also corresponds to processes of a fog determination program according to the first embodiment.
Referring to
(Step S11: Point Data Acquisition Process)
The point data acquisition unit 21 acquires, via the communication interface 14, a set of point data, each piece of point data indicating a reflection point obtained by the optical sensor 41 that receives reflected light of an emitted light beam reflected at the reflection point. The point data acquisition unit 21 writes the acquired set of point data in the memory 12. In the first embodiment, point data indicates a reflection point at which the intensity of reflected light of a given light beam emitted from the optical sensor 41 is greatest.
(Step S12: Image Data Acquisition Process)
The image data acquisition unit 22 acquires image data of an area around the reflection points indicated by the point data included in the set acquired in step S11.
Referring to
(Step S13: Luminance Calculation Process)
The luminance calculation unit 23 calculates smoothness of luminance of the image data acquired in step S12.
Referring to
Note that the smaller the sum of the absolute values, the higher the smoothness of the luminance. That is, there is less texture in the image data. As illustrated in
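The smoothness measure above can be sketched as follows. This is a minimal illustration, assuming that the "sum of the absolute values" is the sum of absolute luminance differences between horizontally adjacent pixels; the exact computation in the embodiment is given in the figure description and may differ.

```python
def luminance_smoothness(luma_rows):
    """Sum of absolute luminance differences between adjacent pixels.

    luma_rows: list of rows, each a list of luminance values (0-255).
    A SMALLER sum means SMOOTHER luminance, i.e. less texture
    in the image data (an assumed formulation, for illustration).
    """
    total = 0
    for row in luma_rows:
        for a, b in zip(row, row[1:]):
            total += abs(a - b)
    return total

# A flat (foggy-looking) patch yields a smaller sum than a textured patch.
flat = [[100, 101, 100], [100, 100, 101]]
textured = [[10, 200, 30], [250, 5, 240]]
assert luminance_smoothness(flat) < luminance_smoothness(textured)
```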
(Step S14: Fog Determination Process)
The fog determination unit 24 determines the density of fog based on a distance, which is determined from the point data included in the set acquired in step S11, from the optical sensor 41 to each of the reflection points and the smoothness of the luminance calculated in step S13.
Specifically, the fog determination unit 24 refers to reference values stored in the reference value storage unit 31. In the first embodiment, as illustrated in
The shorter the distance, the larger the value stored as the reference value. That is, the shorter the distance from the optical sensor 41 to a reflection point, the more readily fog is decided to be present even when the smoothness of the luminance is low. This is because, at longer distances, the luminance becomes smoother even for three-dimensional objects such as buildings and vehicles. The reference values depending on the distance may be obtained empirically.
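The determination in step S14 can be sketched as a comparison of the smoothness sum against a distance-dependent reference value. The table values below are illustrative placeholders, not values from this embodiment; only the monotonic relation (shorter distance, larger reference value) follows the description above.

```python
# (max distance in meters, reference value) - placeholder numbers.
REFERENCE_BY_DISTANCE = [
    (10.0, 400),   # shorter distance -> larger reference value:
    (30.0, 250),   # fog is decided even when smoothness is lower
    (60.0, 120),
]

def reference_value(distance_m):
    for max_dist, ref in REFERENCE_BY_DISTANCE:
        if distance_m <= max_dist:
            return ref
    return 50  # beyond the table: distant objects look smooth anyway

def fog_present(distance_m, smoothness_sum):
    # A smaller sum means smoother luminance; luminance smoother than
    # the distance-dependent reference is attributed to fog.
    return smoothness_sum < reference_value(distance_m)

assert fog_present(5.0, 300)       # short range, fairly smooth -> fog
assert not fog_present(50.0, 300)  # long range requires higher smoothness
```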
As described above, the fog determination apparatus 10 according to the first embodiment determines the density of fog based on the distance to a reflection point and smoothness of luminance of image data around the reflection point. This allows the density of fog to be determined regardless of the presence or absence of illumination from headlights. That is, the density of fog can be determined even in a state with a certain level of lightness, such as in the daytime.
***Other Configurations***
<First Variation>
In the first embodiment, the fog determination unit 24 decides whether fog with a density greater than or equal to a certain level is present. However, the fog determination unit 24 may determine which level of density of fog, out of a plurality of levels, corresponds to the state around the mobile object 100.
In this case, as illustrated in
Specifically, if the luminance is smoother in comparison with the reference value for fog with a visibility of 15 m, the fog determination unit 24 determines that fog with a visibility of 15 m is present. If the luminance is not smoother in comparison with the reference value for fog with a visibility of 15 m but is smoother in comparison with the reference value for fog with a visibility of 30 m, the fog determination unit 24 determines that fog with a visibility of 30 m is present. Similarly, if the luminance is not smoother in comparison with the reference value for fog with a visibility of 30 m but is smoother in comparison with the reference value for fog with a visibility of 50 m, the fog determination unit 24 determines that fog with a visibility of 50 m is present. If the luminance is not smoother in comparison with the reference value for fog with a visibility of 50 m, the fog determination unit 24 determines that no fog is present.
The greater the density of fog, the smaller the reference value depending on the distance. That is, the reference value depending on the distance requires higher smoothness of the luminance as the density of fog is greater.
<Second Variation>
In the first embodiment, the functional components are realized by software. As a second variation, however, the functional components may be realized by hardware. With regard to this second variation, differences from the first embodiment will be described.
Referring to
When the functional components are realized by hardware, the fog determination apparatus 10 includes an electronic circuit 15, in place of the processor 11, the memory 12, and the storage 13. The electronic circuit 15 is a dedicated circuit that realizes the functions of the functional components, the memory 12, and the storage 13.
The electronic circuit 15 is assumed to be a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, a logic IC, a gate array (GA), an application specific integrated circuit (ASIC), or a field-programmable gate array (FPGA).
The functional components may be realized by one electronic circuit 15, or the functional components may be distributed among and realized by a plurality of electronic circuits 15.
<Third Variation>
As a third variation, some of the functional components may be realized by hardware, and the rest of the functional components may be realized by software.
Each of the processor 11, the memory 12, the storage 13, and the electronic circuit 15 is referred to as processing circuitry. That is, the functions of the functional components are realized by the processing circuitry.
<Fourth Variation>
In the first embodiment, the fog determination apparatus 10 is realized by one computer such as an ECU. However, the fog determination apparatus 10 may be realized by a plurality of computers such as ECUs.
A second embodiment differs from the first embodiment in that a sensor threshold value of a sensor for identifying an obstacle is set depending on the density of fog that has been determined. In the second embodiment, this difference will be described, and description of the same portions will be omitted.
***Description of Configuration***
Referring to
The fog determination apparatus 10 differs from that of the first embodiment in that a recognition unit 25 and a threshold value setting unit 26 are included.
***Description of Operation***
Referring to
The operation of the fog determination apparatus 10 according to the second embodiment corresponds to a fog determination method according to the second embodiment. The operation of the fog determination apparatus 10 according to the second embodiment also corresponds to processes of a fog determination program according to the second embodiment.
Referring to
Step S21 is the process to determine the density of fog described in the first embodiment.
(Step S22: Threshold Value Setting Process)
The threshold value setting unit 26 sets the sensor threshold value of the sensor for identifying an obstacle depending on the density of fog determined in step S21.
Referring to
When tail lamps are identified using the camera, a boundary line that linearly separates tail lamps from other objects on a UV plane of YUV data is used as the sensor threshold value. Thus, the threshold value setting unit 26 sets this boundary line depending on the density of fog. The boundary line can be expressed as V = a·U + b. Thus, the threshold value setting unit 26 sets the values of the coefficients a and b depending on the density of fog.
As illustrated in
Note that
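The boundary-line threshold described above can be sketched as follows. The coefficient values, the two fog states, and the convention that tail-lamp pixels lie above the line are assumptions for illustration only; the embodiment determines the actual coefficients per fog density.

```python
# Per-fog-state boundary line V = a*U + b on the UV plane (placeholders).
BOUNDARY_BY_FOG = {
    "none":  (0.5, 140.0),  # (a, b) when no fog is present
    "dense": (0.5, 120.0),  # fog mutes chrominance, so the line is lowered
}

def is_tail_lamp(u, v, fog_state):
    """Classify a pixel's (U, V) chrominance against the boundary line."""
    a, b = BOUNDARY_BY_FOG[fog_state]
    return v > a * u + b  # above the boundary line -> tail lamp

# The same pixel can be rejected in clear air but accepted in fog.
assert not is_tail_lamp(20.0, 135.0, "none")
assert is_tail_lamp(20.0, 135.0, "dense")
```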
(Step S23: Recognition Process)
The recognition unit 25 recognizes an obstacle, using the sensor threshold value set in step S22.
In the example in
As described above, the fog determination apparatus 10 according to the second embodiment sets the sensor threshold value depending on the density of fog. This allows an obstacle to be appropriately recognized.
A third embodiment differs from the first and second embodiments in that a sensor to be used for identifying an obstacle is decided depending on the density of fog. In the third embodiment, this difference will be described, and description of the same portions will be omitted.
Note that an example in which a function is added to the first embodiment will be described here. However, the function may also be added to the second embodiment.
***Description of Configuration***
Referring to
The fog determination apparatus 10 differs from that of the first embodiment in that the recognition unit 25 and a sensor decision unit 27 are included. Another difference from the first embodiment is that the storage 13 realizes the function of a reliability storage unit 32.
***Description of Operation***
Referring to
The operation of the fog determination apparatus 10 according to the third embodiment corresponds to a fog determination method according to the third embodiment. The operation of the fog determination apparatus 10 according to the third embodiment also corresponds to processes of a fog determination program according to the third embodiment.
Referring to
Step S31 is the process to determine the density of fog described in the first embodiment.
(Step S32: Sensor Decision Process)
The sensor decision unit 27 decides the sensor to be used for identifying an obstacle depending on the density of fog determined in step S31.
Specifically, the reliability storage unit 32 stores reliabilities depending on the distance, separately for each level of density of fog and for each sensor mounted on the mobile object 100. As illustrated in
The sensor decision unit 27 refers to the reliability storage unit 32, and decides, as the sensor to be used for identifying an obstacle, a sensor having high reliability for the density of fog determined in step S31. The sensor decision unit 27 may decide the sensor to be used for identifying an obstacle separately for each distance range.
For example, the sensor decision unit 27 decides to use the LiDAR and the camera when no fog is present, and decides to use the millimeter wave radar and the camera when fog is present.
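The decision in step S32 can be sketched as a lookup in a per-fog-level reliability table. The reliability numbers below are illustrative placeholders, chosen only so that the sketch reproduces the example above (LiDAR and camera without fog, millimeter wave radar and camera with fog); the embodiment stores reliabilities that also depend on the distance.

```python
# Per-fog-level sensor reliabilities in [0, 1] (placeholder values).
RELIABILITY = {
    "none": {"lidar": 0.9, "millimeter_wave": 0.4, "camera": 0.9},
    "fog":  {"lidar": 0.3, "millimeter_wave": 0.9, "camera": 0.6},
}

def sensors_to_use(fog_level, threshold=0.5):
    """Return the sensors whose reliability meets the threshold, sorted."""
    table = RELIABILITY[fog_level]
    return sorted(s for s, r in table.items() if r >= threshold)

assert sensors_to_use("none") == ["camera", "lidar"]
assert sensors_to_use("fog") == ["camera", "millimeter_wave"]
```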
(Step S33: Recognition Process)
The recognition unit 25 recognizes an obstacle, using the sensor decided in step S32.
As described above, the fog determination apparatus 10 according to the third embodiment decides the sensor to be used for identifying an obstacle depending on the density of fog. This allows an obstacle to be appropriately recognized.
***Other Configurations***
<Fifth Variation>
In the third embodiment, the function is added to the first embodiment. However, the function may also be added to the second embodiment.
In this case, as illustrated in
Note that the processes of steps S41, S42, and S44 are the same as the processes of steps S31, S32, and S33 in
The embodiments and variations of the present invention have been described above. Any ones of these embodiments and variations may be implemented in combination. Alternatively, any one or ones of these embodiments and variations may be partially implemented. Note that the present invention is not limited to the above embodiments and variations, and various modifications can be made as necessary.
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2018/009396 | 3/12/2018 | WO |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2019/175920 | 9/19/2019 | WO | A |
Number | Name | Date | Kind |
---|---|---|---|
8473144 | Dolgov et al. | Jun 2013 | B1 |
10254388 | LaChapelle | Apr 2019 | B2 |
20030222983 | Nobori et al. | Dec 2003 | A1 |
20040054473 | Shimomura | Mar 2004 | A1 |
20080007429 | Kawasaki et al. | Jan 2008 | A1 |
20130039544 | Robert | Feb 2013 | A1 |
20130342692 | Li | Dec 2013 | A1 |
20140067187 | Ferguson et al. | Mar 2014 | A1 |
20140121880 | Dolgov et al. | May 2014 | A1 |
20140297094 | Dolgov et al. | Oct 2014 | A1 |
20160349358 | Noda | Dec 2016 | A1 |
20170154409 | Murakami et al. | Jun 2017 | A1 |
Number | Date | Country |
---|---|---|
5-296766 | Nov 1993 | JP |
2000-329852 | Nov 2000 | JP |
2004-56778 | Feb 2004 | JP |
2004-112144 | Apr 2004 | JP |
2006-227876 | Aug 2006 | JP |
2008-33872 | Feb 2008 | JP |
2009-42177 | Feb 2009 | JP |
2009-177311 | Aug 2009 | JP |
2010-15436 | Jan 2010 | JP |
2010-97430 | Apr 2010 | JP |
2013-192003 | Sep 2013 | JP |
2014-89691 | May 2014 | JP |
2016-223872 | Dec 2016 | JP |
2017-97906 | Jun 2017 | JP |
2017-102245 | Jun 2017 | JP |
Entry |
---|
International Search Report issued in PCT/JP2018/009396 (PCT/ISA/210), dated May 29, 2018. |
Mori et al., “Fog density judgment by in-vehicle camera images and millimeter-wave radar data”, The Institute of Electronics, Information and Communication Engineers, IEICE Technical Report, 106 (605), Mar. 15, 2007, pp. 85-90. |
Number | Date | Country | |
---|---|---|---|
20210019537 A1 | Jan 2021 | US |