The present invention relates to an automatic light system that automatically turns on or off lights of a vehicle in accordance with surrounding environment.
The Safety Standards for Road Transport Vehicles revised in October 2016 require that an "auto light function" be mounted in new passenger vehicles that go on sale in or after April 2020. The "auto light function" required by the standards automatically turns on the lights within 2 seconds in a case where the illuminance of the surrounding environment is less than 1000 [lx], and automatically turns them off within a range of 5 to 300 seconds in a case where the illuminance of the surrounding environment exceeds 7000 [lx].
Here, as a conventional automatic light system for an automobile, there is a system that detects the illuminance outside a vehicle by using a vehicle-mounted camera and automatically turns the lights on or off accordingly.
For example, Abstract of PTL 1 describes "An auto lighting system 1 includes imaging means 2 for capturing an image of the road ahead of a vehicle and lighting control means 3 for turning on or off lights such as headlights based on the image of the road ahead of the vehicle captured with the imaging means 2. The brightness ahead of the vehicle as perceived by the driver is recognized based on the image of the road ahead of the vehicle captured with the imaging means 2, and lighting control in which lights 101 are turned on or off is performed based on the brightness ahead of the vehicle.", and discloses a technique to automatically turn on or off lights according to the brightness in front of the own vehicle.
PTL 1: JP 2010-6172 A
Here, the conventional technique of PTL 1, as described in claim 8 or the like, has the following problems.
First, if a road region cannot be detected correctly during normal travel due to the influence of a road state, road color, a preceding vehicle, or the like, erroneous operation of automatic lighting may occur. For example, in a case where the steering angle is large, such as when traveling on a sharp curve, the road surface imaged by a vehicle-mounted camera changes significantly in a short period of time, and therefore the judgment of the brightness of the surrounding environment may become inappropriate, causing a delay in the timing of turning on the headlights.
Furthermore, in a case where the inter-vehicle distance to a preceding vehicle is short and the preceding vehicle occupies a large proportion of the image data from the vehicle-mounted camera, the road surface cannot be imaged sufficiently, and the light mode set on the basis of the image data may not be appropriate.
The present invention has been made in view of the above problems, and an object of the present invention is to provide an automatic light system that is capable of setting an appropriate light mode even in a case where a vehicle travels on a sharp curve or where an inter-vehicle distance to a preceding vehicle is short.
In order to solve the above problems, an automatic light system of the present invention automatically turns on or turns off lights of a vehicle in accordance with surrounding environment, the automatic light system including an imaging device that images an area in front of the vehicle and generates image data, an image processing unit that generates light mode information, with which turning on or turning off of the lights is controlled, on the basis of brightness in an upper region and lower region of the image data, and a light control unit that controls the lights on the basis of the light mode information, in which the image processing unit sets a higher weight to a measurement region in a movement direction of the vehicle from among a plurality of measurement regions set in the upper region and a lower weight to another measurement region to calculate the brightness in the upper region, and generates the light mode information on the basis of the brightness in the upper region.
With an automatic light system according to the present invention, an appropriate light mode can be set even in a case where a vehicle travels on a sharp curve or where an inter-vehicle distance to a preceding vehicle is short.
Hereinafter, embodiments of an automatic light system of the present invention will be described in detail on the basis of the drawings.
An automatic light system according to a first embodiment of the present invention will be described by using
The imaging unit 1 generates image data 10 of an area similar to an area that a driver of the vehicle 100 visually recognizes while driving, and the imaging unit 1 is, for example, a monocular camera set at a tip of the vehicle 100. The image data 10 imaged by the imaging unit is transmitted to the image processing unit 2 and recorded.
The image processing unit 2 also functions as an electronic control unit (ECU) for camera control, and controls the behavior of the imaging unit 1 by transmitting setting information such as the imaging timing and imaging time of the imaging unit 1. Furthermore, the image processing unit 2 generates light mode information corresponding to the image data 10 and transmits the light mode information to the light control unit 6, and details of this processing will be described later.
The vehicle information acquisition unit 3 acquires vehicle information of speed, a steering angle, or the like of the vehicle 100 via the CAN, or the like, described above, and transmits the vehicle information to the vehicle control unit 4.
The vehicle control unit 4 outputs an appropriate command to each unit on the basis of each kind of information (steering angle, speed, inter-vehicle distance, brake, or the like) acquired from the image processing unit 2, the vehicle information acquisition unit 3, the light control unit 6, and the traveling system control unit 8.
The light switching unit 5 generates desired light mode information according to operation by the driver and transmits the light mode information to the light control unit 6.
The light control unit 6 is also referred to as an electronic control unit (ECU) for lamp control, and receives light mode information from the image processing unit 2 or the light switching unit 5. Furthermore, from the vehicle control unit 4, the light control unit 6 receives light mode information based on a control state such as a traveling state of the vehicle 100. The light control unit 6 sets an appropriate light mode for the lighting unit 7 on the basis of any of the light mode information.
The lighting unit 7 is headlights or small lights of the vehicle 100, and is set to a light mode corresponding to the light mode information input to the light control unit 6.
For example, the headlights are set to any one of low beam, high beam, or light-off, and the small lights are set to either light-on or light-off. It should be noted that the lighting unit 7 may use a halogen lamp or a plurality of light emitting diodes (LEDs), and in a case where LEDs are used, an irradiated region may be finely divided and the irradiated region may be specified in detail.
The traveling system control unit 8 is connected to an engine, a motor, a steering mechanism, a brake mechanism, or the like, which is not illustrated, transfers steering operation or acceleration/deceleration operation of the driver to the vehicle control unit 4, and operates the steering mechanism, the brake mechanism, or the like, according to a steering command or an acceleration/deceleration command from the vehicle control unit 4.
The image data 10 of an area in front of a vehicle imaged by the imaging unit 1 is stored in the auxiliary storage device 23 of the image processing unit 2 via the imaging I/F 24, and the calculation device 21 performs processing that a program specifies on the stored image data 10. As a result, the calculation device 21 generates light mode information corresponding to a recognition result of brightness of an area in front of the vehicle 100, and transmits the light mode information to the light control unit 6 via the lighting I/F 25. The light control unit 6 in
Next, processing content of the image processing unit 2 of the present embodiment will be described in detail by using
<Processing in S31 “Acquire Image Data and Vehicle Information”>
In S31, the image processing unit 2 acquires the image data 10 from the imaging unit 1 via the imaging I/F 24, and acquires the vehicle information (speed, steering angle, or the like) from the vehicle information acquisition unit 3 via the vehicle control unit 4 and the vehicle I/F 27.
<Processing in S32 “Recognize Light in Surrounding Environment”>
Subsequently, in S32, the image processing unit 2 uses the image data 10 and the vehicle information acquired in S31 to recognize the light in the surrounding environment of the area in front of the own vehicle (S32). In the recognition of light in the surrounding environment here, a luminance value is estimated on the basis of an exposure characteristic and an exposure value (luminance, shutter speed, or gain of the acquired image), so that a luminance value equivalent to one measured by a luminance meter can be obtained.
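Although the exposure characteristic is specific to the imaging unit 1 and is not given in this text, a minimal sketch of this kind of exposure-based luminance estimation, assuming a linear sensor response and a hypothetical calibration constant K, is as follows.

```python
# Hedged sketch of exposure-based luminance estimation (not the embodiment's
# exact exposure characteristic): with a linear sensor response, the scene
# luminance is proportional to the mean pixel value divided by the exposure
# (shutter time x gain). K is a hypothetical calibration constant that maps
# the result to the scale of a luminance meter.
def estimate_luminance(mean_pixel_value: float,
                       shutter_time_s: float,
                       gain: float,
                       K: float = 1.0) -> float:
    exposure = shutter_time_s * gain
    if exposure <= 0.0:
        raise ValueError("exposure must be positive")
    return K * mean_pixel_value / exposure
```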
The processing in S32 will be described in detail by using the flowchart in
First, in S41, a judgment region is set in each of the upper part and the lower part of the image data 10.
Because the upper region 11 is a region for measuring light in the surrounding environment at a long distance on the basis of the brightness of the sky or the like, it is preferable to process the upper region 11 so that only the sky, and not a surrounding tree, a signboard, a building, or the like, is viewed. Therefore, in the present embodiment, a plurality of upper measurement regions A0 to A4, each having a height of about one-fifth of the vertical size of the image, is set in the upper part of the upper region 11; this reduces the influence of surrounding trees, signboards, buildings, and the like, and extracts only the brightness of the sky in front of the vehicle. However, if the vertical size of each upper measurement region is made too small (for example, one-tenth or one-twentieth), the accuracy in calculating the brightness of the sky may degrade, and therefore the height is set to about one-fifth of the vertical size of the entire image.
Meanwhile, because the lower region 12 is a region for measuring light in the surrounding environment at a short distance on the basis of the brightness of the road surface in an area 3 to 5 m in front of the vehicle 100, it is preferable to process the lower region 12 so that only the road a short distance ahead in the movement direction, and not a surrounding tree, a white line, or the like, is viewed. Therefore, in the present embodiment, a lower measurement region 120, which has a height of about one-seventh of the vertical size of the image and a width of about one-third of the horizontal size of the image, is set in the lower part of the lower region 12; this reduces the influence of surrounding trees, white lines, and the like, and extracts only the brightness of the road surface in front of the vehicle.
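As a concrete illustration of the region layout described above, a minimal sketch is given below; the horizontal placement of the regions and the equal split of the top strip into A0 to A4 are assumptions, since the figures are not reproduced here.

```python
# Hedged sketch: lay out the five upper measurement regions A0-A4 and the
# lower measurement region 120 from the image size, using the fractions
# described above (1/5 of the height for the upper strip, 1/7 of the height
# and 1/3 of the width for the lower region). Exact placement in the
# embodiment follows the figures, which are not reproduced here.
def layout_measurement_regions(img_w: int, img_h: int):
    upper_h = img_h // 5                      # about one-fifth of the vertical size
    strip_w = img_w // 5                      # split the top strip into A0..A4 (assumption)
    upper_regions = [
        (i * strip_w, 0, strip_w, upper_h)    # (x, y, w, h) of region Ai
        for i in range(5)
    ]
    lower_h = img_h // 7                      # about one-seventh of the vertical size
    lower_w = img_w // 3                      # about one-third of the horizontal size
    lower_region = ((img_w - lower_w) // 2,   # centered horizontally (assumption)
                    img_h - lower_h,
                    lower_w, lower_h)
    return upper_regions, lower_region
```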
Subsequently, in S42, a steering angle θ is acquired from the vehicle information, and the movement direction of the vehicle 100 is judged. In S43, a weight w that takes the deviation from the movement direction of the vehicle 100 into account is assigned to each of the upper measurement regions A0 to A4. In S44, the environment light is calculated with the weighting applied toward the movement direction of the vehicle 100.
Here, the processing in S42 and S43 will be specifically described by using
For example, as illustrated by an arrow in
When the upper measurement region corresponding to the movement direction of the vehicle 100 is identified, a weight that takes the deviation from the movement direction into account is assigned to each of the upper measurement regions: the maximum weight is assigned to the region corresponding to the movement direction, and smaller weights are assigned to regions farther from the movement direction. For example, in the example in
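As one concrete illustration of such a weight assignment, a minimal sketch is given below; the mapping from steering angle to region index and the linear falloff are assumptions for illustration, not the embodiment's Formula 2.

```python
# Hedged sketch: assign weights to the upper measurement regions A0-A4 so that
# the region in the movement direction gets the maximum weight and weights
# fall off with the deviation from that region. The steering-angle mapping and
# the linear falloff are assumptions, not the embodiment's exact Formula 2.
def region_weights(steering_angle_deg: float,
                   n_regions: int = 5,
                   max_angle_deg: float = 45.0) -> list[float]:
    # Map the steering angle to the index of the region the vehicle is heading toward.
    center = (n_regions - 1) / 2.0
    offset = max(-1.0, min(1.0, steering_angle_deg / max_angle_deg))
    target = center + offset * center
    # Weight decreases linearly with the distance from the target region (assumption),
    # with a small floor so that no region is ignored completely.
    return [max(0.1, 1.0 - abs(i - target) / n_regions) for i in range(n_regions)]
```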
In S44 in
In a case where the light in the surrounding environment at a long distance is estimated in S44, first, a normal average value of the brightness in each of the upper measurement regions A0 to A4 is obtained, the darkest region is excluded, and then a weighted average value, which serves as the brightness of the light in the surrounding environment at a long distance, is calculated. In this way, the influence of noise can be eliminated by excluding the data of the darkest region, and the accuracy in calculating the light in the surrounding environment at a long distance can be enhanced by obtaining a weighted average value that takes the amount of deviation from the movement direction into account.
Specific calculation of light in surrounding environment in a long distance in S44 is based on Formula 1.
In Formula 1, the left-hand side represents brightness of light in surrounding environment, and on the right-hand side, xi represents an average value of brightness of each of the measurement regions, wi represents a weight of each of the measurement regions, n represents the number of upper measurement regions, and mi represents a noise removal coefficient for excluding a darkest upper measurement region.
Furthermore, a weight wi in Formula 1 is defined by Formula 2, and a noise removal coefficient mi is defined by Formula 3.
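Because the formula images are not reproduced in this text, one plausible reading of Formula 1, consistent with the definitions above, is a weighted average over the upper measurement regions with the darkest region masked out by mi; the symbol B for the left-hand side is introduced here only for convenience.

```latex
% One plausible reading of Formula 1 (the formula image is not reproduced
% here): a weighted average of the per-region brightness values x_i with
% weights w_i, where m_i in {0, 1} excludes the darkest upper measurement
% region. B denotes the brightness on the left-hand side.
\[
  B = \frac{\displaystyle\sum_{i=1}^{n} m_i \, w_i \, x_i}
           {\displaystyle\sum_{i=1}^{n} m_i \, w_i}
\]
```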
Furthermore, as the example in
It should be noted that, although an example in which the number of the upper measurement regions is set to five is described in
Meanwhile, in a case where the light in the surrounding environment at a short distance is estimated on the basis of the lower region 12 of the image data 10 in S44, the brightness is calculated as the average luminance value of the lower measurement region 120.
After the brightness of the light in the surrounding environment is calculated for each of the upper and lower regions in S44, a light mode corresponding to the brightness of each region is individually judged in S45.
Here,
Similarly,
That is, in both cases of
Furthermore, a method other than the methods in
After that, whether the obtained average hue is within a predetermined hue range (180<H<270) is judged in S93, and if this is satisfied, the light in surrounding environment is judged to be blue, that is, estimated to be daytime and fine weather, and the light mode is set to “lights OFF” (S97). Furthermore, in a case where a condition in S93 is not satisfied, the light in surrounding environment is judged to be red or orange, that is, estimated to be early evening, and the light mode is set to “small lights ON” (S98).
Furthermore, whether brightness of the light in surrounding environment is less than the upper threshold th_u1 and equal to or more than the upper threshold th_u2 is judged in S94, and if this is satisfied, the light in surrounding environment is judged to be dim, that is, estimated to be rainy weather, and the light mode is set to “small lights ON” (S98). In a case where a condition in S94 is not satisfied, the light in surrounding environment is judged to be little, that is, estimated to be night time, and the light mode is set to “headlights ON” (S99).
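The upper-region judgment in S93, S94, and S97 to S99 can be summarized by the following sketch; the initial brightness check against th_u1 and the overall ordering are inferred from the description above rather than taken from the flowchart, which is not reproduced here.

```python
# Hedged sketch of the upper-region light mode judgment (S93, S94, S97-S99).
# The ordering of the checks is inferred from the text; the flowchart itself
# is not reproduced here. th_u1 > th_u2 are the upper-region brightness thresholds.
def judge_upper_light_mode(brightness: float, avg_hue: float,
                           th_u1: float, th_u2: float) -> str:
    if brightness >= th_u1:
        # Bright scene: use the average hue of the sky to separate daytime from early evening.
        if 180 < avg_hue < 270:          # bluish sky -> daytime, fine weather
            return "lights OFF"          # S97
        return "small lights ON"         # reddish or orange sky -> early evening (S98)
    if th_u2 <= brightness < th_u1:      # dim -> estimated rainy weather (S94)
        return "small lights ON"         # S98
    return "headlights ON"               # little light -> night time (S99)
```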
Meanwhile,
When a light mode is judged for each of the upper region 11 and the lower region 12 by any one of the methods described above and the processing in S45 is completed, all the processing in S32 in
<Processing in S33 "Determine Light Mode">
First, in S111 “Prevent hunting for light mode”, processing to prevent hunting (a phenomenon in which a light mode switches in succession in a short period of time) for a light mode is performed in consideration of vehicle speed of the vehicle 100.
As described above, the safety standards revised in October 2016 require automatic lighting within 2 seconds in a case where the surrounding illuminance is less than 1000 [lx], and therefore there is no problem even if the lighting necessity judgment cycle is fixed to 2 seconds. However, if the judgment cycle is fixed at 2 seconds, a problem regarding safety or the sense of the driver may arise: for example, even if the vehicle enters a tunnel while traveling on a highway, the headlights do not turn on until the vehicle has traveled a considerable distance.
Therefore, in the present embodiment, the judgment cycle is shortened during high-speed traveling to accelerate a response of the lighting unit 7 so that the driver does not feel unnatural.
In order to attain this, a judgment cycle t, which is substantially inversely proportional to speed of the vehicle 100, is calculated in S121 in
[Formula 5]
t = T · e^(−α(s/120))
Here, on the right-hand side, s represents the vehicle speed [m/sec.], T represents 2 [sec.], and α represents a variable for adjusting the shortest cycle, and t on the left-hand side represents the judgment cycle to be set.
If a judgment cycle t corresponding to speed is calculated in S121, whether a light mode has changed in the upper and lower regions within the judgment cycle t is determined in S122. Then, in a case where a light mode has changed, it is determined that judgment in the region is unstable, and a light mode in a previous judgment cycle is maintained (S123). Meanwhile, in a case where the light mode does not change within the judgment cycle t, it is determined that the judgment in the region is stable, and the light mode is updated to a current judgment cycle (S124). It should be noted that, because determination in S122 is performed independently for each of the upper and lower regions, one region may maintain a light mode in a previous cycle and a light mode of another region may be updated to a current cycle.
Next, in S125, the light modes of the upper and lower regions after the processing in S122 to S124 are compared to determine whether both are the same. If they are the same, that light mode is set as the output of S111 (S126). Meanwhile, if they are different, the light mode in the previous judgment cycle is set as the output of S111 (S127).
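A minimal sketch of this hunting prevention, assuming timestamped per-region mode histories as the input representation, is shown below; the data structures are illustrative and not the embodiment's actual implementation.

```python
import math

# Hedged sketch of the hunting prevention in S111 (S121-S127). The judgment
# cycle shrinks with vehicle speed (Formula 5); a region's light mode is
# accepted only if it did not change within that cycle, and the final mode is
# updated only when the upper and lower regions agree. The timestamped mode
# histories are assumptions for illustration.
def judgment_cycle(speed_mps: float, T: float = 2.0, alpha: float = 1.0) -> float:
    return T * math.exp(-alpha * speed_mps / 120.0)        # Formula 5

def stable_mode(history, now, cycle, prev_mode):
    """history: list of (timestamp, mode); keep prev_mode if the mode changed within the cycle."""
    recent = [m for ts, m in history if now - ts <= cycle]
    if recent and len(set(recent)) == 1:
        return recent[-1]                                   # S124: stable -> update
    return prev_mode                                        # S123: unstable -> keep previous

def prevent_hunting(hist_upper, hist_lower, now, speed_mps, prev_mode):
    cycle = judgment_cycle(speed_mps)
    upper = stable_mode(hist_upper, now, cycle, prev_mode)
    lower = stable_mode(hist_lower, now, cycle, prev_mode)
    return upper if upper == lower else prev_mode           # S125-S127
```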
Next, in S112 “Judge light mode during short inter-vehicle distance” in
In general, there is a relation between speed of the vehicle 100 and an inter-vehicle distance. If the speed is high, the inter-vehicle distance tends to be long, and if the speed is low, the inter-vehicle distance tends to be short. In a situation where there is no preceding vehicle and speed is high as illustrated in
In a case where the conditional statement 1 is not satisfied, it is estimated that a proportion of the preceding vehicle 13 that occupies the image data 10 is small, and light in surrounding environment can be accurately detected from the upper and lower regions, or that the surrounding environment is dark and the lighting unit 7 is already on, and therefore, the processing in S112 is not performed. Meanwhile, in a case where the conditional statement 1 is satisfied, there is a possibility that a proportion of the preceding vehicle 13 that occupies the image data 10 is large due to traffic congestion, and therefore a threshold for judgment of a light mode is corrected so as not to erroneously turn on the lighting unit 7 (S142).
When correcting the judgment threshold in S142, the speed of the vehicle 100 and the number of low-luminance pixels in the image data 10 are considered. The number of low-luminance pixels described here is the number of pixels included in the upper measurement regions A0 to A4 whose pixel value (8 bits) is equal to or less than a predetermined value. The reason for correcting the judgment threshold in consideration of the number of low-luminance pixels is that, in a case where the vehicle body color of the preceding vehicle 13 is white or a bright color having a high reflectance, the light mode judgment is not adversely influenced even if the vehicle 100 comes close to the preceding vehicle 13 due to traffic congestion or the like, whereas in a case where the vehicle body color of the preceding vehicle 13 is black or a dark color having a low reflectance, the suitability of the light mode judgment is greatly influenced by the inter-vehicle distance. For this reason, the inter-vehicle distance to a preceding vehicle 13 in black or a dark color is estimated from the number of low-luminance pixels in the upper measurement regions A0 to A4, and the upper thresholds th_u1 and th_u2 are corrected according to the predicted inter-vehicle distance (S142). The correction formula for the judgment threshold used here is shown in Formula 6.
Here, th_new represents the corrected judgment threshold, th represents the initial judgment threshold, s represents the vehicle speed, R represents a threshold adjustment range, D represents the number of low-luminance pixels in the upper measurement regions A0 to A4, and α represents an adjustment variable.
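Because Formula 6 itself is not reproduced in this text, the following sketch only illustrates the idea: the low-luminance pixels are counted with a hypothetical 8-bit cutoff, and the threshold is lowered by an exponential-saturation correction modeled on the shape of Formula 8; the cutoff, the normalization constant, and the way speed enters are assumptions.

```python
import math

# Hedged sketch of the threshold correction in S142. Formula 6 is not
# reproduced in the text, so the correction below is only a stand-in shaped
# like Formula 8: more dark pixels (a closer dark preceding vehicle) and
# higher speed lower the threshold so that the darkened upper regions do not
# erroneously turn the lights on.
def count_low_luminance_pixels(upper_region_pixels, low_value: int = 32) -> int:
    # "low_value" is a hypothetical 8-bit cutoff; the embodiment's value is not given.
    return sum(1 for p in upper_region_pixels if p <= low_value)

def corrected_threshold(th: float, R: float, D: int, s: float,
                        alpha: float = 1.0, D_ref: float = 10000.0) -> float:
    # D_ref normalizes the pixel count and the s/120 factor mirrors Formula 5
    # (both assumptions); the result stays within [th - R, th].
    return th - R * (1.0 - math.exp(-alpha * (D / D_ref) * (s / 120.0)))
```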
If the judgment threshold th is corrected in S142, a light mode for the upper region 11 is judged by a method similar to S45 in
Next, in S113 “Determine light mode” in
Because it is time to switch the lighting unit 7 from light-on to light-off when both conditional statements are satisfied, the time required to turn off the lighting unit 7 is calculated corresponding to the vehicle speed (S152). The light-off time calculated here can be calculated by using Formula 5, as in the calculation of the light-on time, but the initial value T of the light-off time is set to a value between 5 and 300 seconds in consideration of the automatic light-off time stipulated by the safety standards.
After the processing in S152, or when the conditional statement 2 is not satisfied (maintaining light-off, maintaining light-on, or light-off to light-on), it is judged whether current final judgment is “lights OFF” (S153). In a case where the final judgment is “lights OFF”, an elapsed time is counted up (S154), and when the elapsed time matches the light-off time set in S152 (S155), the light-off time is initialized (S156), and “lights OFF” is output (S157). With this arrangement, light-on can be maintained for the lighting unit 7 before a predetermined time elapses, and the lighting unit 7 can be turned off after the predetermined time elapses.
Meanwhile, in a case where the judgment in S153 is No, that is, in a case where the final judgment is either “small lights ON” or “headlights ON”, it is not necessary to turn off the lights for a while. Therefore, even in a case where light-off time has been set in S152, the light-off time is initialized (S158), and either “small lights ON” or “headlights ON” is output (S159).
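A minimal sketch of this delayed light-off behavior is given below; the state container, the choice of T = 300 seconds as the initial value, and the reuse of the Formula 5 shape for the light-off time are assumptions for illustration.

```python
import math

# Hedged sketch of the delayed light-off handling in S152-S159: when the final
# judgment switches to "lights OFF", the lights are kept on until a speed-
# dependent light-off time has elapsed; any other judgment resets the timer.
# The state container and T = 300 s are assumptions; T must stay within the
# 5-300 s range stipulated by the safety standards.
class LightOffDelay:
    def __init__(self, T: float = 300.0, alpha: float = 1.0):
        self.T, self.alpha = T, alpha
        self.elapsed = 0.0
        self.off_time = None

    def update(self, final_judgment: str, speed_mps: float, dt: float,
               prev_output: str) -> str:
        if final_judgment != "lights OFF":
            self.elapsed, self.off_time = 0.0, None     # S158: reset the timer
            return final_judgment                       # S159: pass the mode through
        if self.off_time is None:
            # S152: light-off time from the same shape as Formula 5, floored at 5 s.
            self.off_time = max(5.0, self.T * math.exp(-self.alpha * speed_mps / 120.0))
        self.elapsed += dt                              # S154: count up
        if self.elapsed >= self.off_time:               # S155
            self.elapsed, self.off_time = 0.0, None     # S156: initialize
            return "lights OFF"                         # S157
        return prev_output                              # keep the lights on until then
```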
When a light mode is determined after a series of processing (S113) in
<Processing in S34 “Transmit to Light Control Unit”>
In the processing S34 “Transmit to light control unit” in
According to the above-described automatic light system of the present embodiment, the brightness of the light in the surrounding environment can be accurately evaluated even in cases, such as traveling on a sharp curve or following a preceding vehicle at a short inter-vehicle distance, in which the brightness cannot be accurately evaluated with a conventional method. Therefore, appropriate automatic light control that does not feel unnatural to the driver can be attained.
It should be noted that, although an example in which the brightness of the light in the surrounding environment is measured by using the imaging unit 1 that images an area in front is described in the present embodiment, a rear camera of the vehicle 100 may be used to measure the brightness of the light in the surrounding environment and generate the light mode information.
Next, an automatic light system according to a second embodiment of the present invention will be described. Although a monocular camera is used as an imaging unit 1 in the first embodiment, the present embodiment is different from the first embodiment in that a stereo camera is used as an imaging unit 1, an inter-vehicle distance to a preceding vehicle is accurately measured by using the stereo camera, and a light mode is determined on the basis of the measured inter-vehicle distance. It should be noted that duplicate description will be omitted for a point in common with the first embodiment.
As illustrated in
To obtain a distance to a measurement object 171 such as a preceding vehicle by using this stereo camera, the principle of triangulation is used as illustrated in
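Although the figure is not reproduced here, the relation such a stereo arrangement relies on is the standard triangulation formula; the symbols for the baseline, focal length, and disparity below are introduced only for illustration.

```latex
% Standard stereo triangulation: with baseline B between the left and right
% cameras, focal length f, and disparity d of the measurement object 171
% between the two images, the distance D_s follows from similar triangles.
\[
  D_s = \frac{B \cdot f}{d}
\]
```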
In this way, in the present embodiment, a distance Ds to a preceding vehicle is measured with a stereo camera, and in a case where the distance Ds is shorter than a predetermined threshold, the preceding vehicle is determined to be close, and thresholds (upper thresholds th_u1 and th_u2) of brightness of light in surrounding environment used for determination of light-on or light-off are corrected.
Specific control according to the present embodiment will be described by using
The automatic light system of the second embodiment also acquires image data 10 and vehicle information (S31) and recognizes light in surrounding environment (S32), as in
The “light mode determination” processing of the present embodiment illustrated in
Meanwhile, in a case where the light mode judged in the previous judgment cycle is “lights OFF” in S192, a judgment threshold is recalculated by using Formula 8 in S193.
[Formula 8]
th_new = th − R·(1 − e^(−α·(Df − Ds)/Df))
Here, in Formula 8, th_new represents the recalculated judgment threshold, th represents the initial judgment threshold, R represents a threshold adjustment range, Ds represents the inter-vehicle distance, Df represents a preset distance range (the timing of the approaching-vehicle determination processing), and α represents an adjustment variable for the calculation.
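A small sketch of the recalculation in S193, written against the reading of Formula 8 given above, is shown below; because the original formula image is not reproduced, the exponent argument and the clamping behavior are assumptions.

```python
import math

# Hedged sketch of the threshold recalculation in S193. The exponent argument
# follows the reading of Formula 8 given above ((Df - Ds)/Df), which is an
# assumption. The closer the preceding vehicle (the smaller Ds), the larger
# the correction, so that the darkened upper region does not erroneously turn
# the lights on while the previous judgment is "lights OFF".
def recalculated_threshold(th: float, R: float, Ds: float, Df: float,
                           alpha: float = 1.0) -> float:
    if Ds >= Df:
        return th                       # preceding vehicle outside the preset range
    x = (Df - Ds) / Df                  # 0 (at the edge of the range) .. 1 (very close)
    return th - R * (1.0 - math.exp(-alpha * x))
```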
Subsequently, in S194, the light in the surrounding environment for the upper region when the preceding vehicle is close is recalculated by using Formula 9.
In Formula 9, the left-hand side represents the light in the surrounding environment for the upper region, and on the right-hand side, xi represents the average brightness of each measurement region, wi represents the weight assigned to each measurement region when a vehicle is approaching, and n represents the number of set measurement regions.
It should be noted that, in the present embodiment, the weight distribution wi of each measurement region is defined with Formula 10.
After that, in S195, a light mode for the upper region is selected by using the threshold recalculated in S193 and the light in the surrounding environment for the upper region calculated in S194, and in S196, after hunting prevention processing is performed by a method similar to the method in
In the processing in
Next, an automatic light system according to a third embodiment of the present invention will be described. It should be noted that duplicate description will be omitted for a point in common with the above-described embodiments.
The radar distance measurement unit 1A performs distance measurement by emitting a millimeter wave radar signal, Light Detection and Ranging (LIDAR) light, or the like, and receiving the reflection. It should be noted that the image processing unit 2a of the present embodiment is configured by adding a radar distance measurement I/F 27, for connection to the radar distance measurement unit 1A, to the configuration of the image processing unit 2 illustrated in
As described above, also in the configuration of the present embodiment using the radar distance measurement unit 1A, similarly to the second embodiment, a light mode can be set more appropriately by accurately measuring the inter-vehicle distance and taking it into account. As a result, it is possible to reduce waste of the battery and to perform light control that matches the sense of the driver.
Next, an automatic light system according to a fourth embodiment of the present invention will be described. The present embodiment is different from the above-described embodiments in that map information and own vehicle position information are used for determination of a light mode. It should be noted that duplicate description will be omitted for a point in common with the above-described embodiments.
This map linkage unit 3a forwards own vehicle information (speed, or the like), GPS satellite reception information, and other sensing information (for example, image data or radar distance measurement of the area in front of the own vehicle) to the auxiliary storage device 243 and stores the information there, and estimates the position and movement direction of the own vehicle on the basis of the information stored in the auxiliary storage device 243. Furthermore, the calculation device 241 transmits a control signal to a desired interface on the basis of a program stored in the main storage device 242. The position and movement direction of the own vehicle estimated by the calculation device 241 are transmitted to an image and display I/F 234 along with the map information.
Then, in the map linkage unit 3a, the position of the own vehicle is calculated (positioned) as absolute coordinates on the basis of the positioning signal, and the position of the own vehicle is also calculated as a relative change from the positioned position on the basis of the external information. Further, map matching processing is performed in which the calculated position of the own vehicle is corrected to a position on a road in the map by using the map database. As a result of this processing, the map linkage unit 3a can estimate the movement direction with high accuracy on the basis of the estimated position of the own vehicle on the road and the direction information.
In addition to the shape of the road in the movement direction of the vehicle 100 and the position of a tunnel, the positions of lights in the tunnel are input to the image processing unit 2b via the map linkage I/F 28. By using these, the weights for the plurality of upper measurement regions or the threshold for light mode judgment can be switched in advance, and the headlights can be turned on in advance at a place, such as a tunnel, where lighting is required. Therefore, compared with the above-described embodiments, switching of the light mode can be made more appropriate, and the comfort of the driver can be further enhanced.
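As a concrete illustration, a minimal sketch of this kind of map-linked pre-switching is shown below; the data structures, the look-ahead rule, and the feature fields are assumptions for illustration, not the embodiment's actual interface.

```python
# Hedged sketch of map-linked pre-switching: if the map data indicates a
# tunnel within a look-ahead distance along the movement direction, the
# headlights are requested in advance instead of waiting for the camera-based
# judgment. The feature dictionaries and the look-ahead rule are assumptions.
def map_linked_light_mode(camera_mode: str,
                          upcoming_features: list[dict],
                          speed_mps: float,
                          lookahead_s: float = 5.0) -> str:
    lookahead_m = speed_mps * lookahead_s
    for feature in upcoming_features:               # e.g. {"type": "tunnel", "distance_m": 120}
        if feature["type"] == "tunnel" and feature["distance_m"] <= lookahead_m:
            return "headlights ON"                  # turn on before entering the tunnel
    return camera_mode                              # otherwise defer to the camera judgment
```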
It should be noted that the present invention is not limited to the above-described embodiments, but also includes various variations. For example, the description of the embodiments, which has been provided above in detail, is intended to describe the present invention in an easily understandable manner and accordingly, the above-described embodiments are not necessarily limited to the one that includes all the configurations described above. Furthermore, it is possible to replace a part of the configuration of an embodiment with the configuration of another embodiment, and also possible to add, to the configuration of an embodiment, the configuration of another embodiment. Furthermore, it is also possible to add another configuration to a part of the configuration of each embodiment, delete a part of the configuration of each embodiment, and replace a part of the configuration of each embodiment with another configuration.
Number | Date | Country | Kind
---|---|---|---
JP2018-046236 | Mar 2018 | JP | national

Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2019/008272 | 3/4/2019 | WO | 00

Publishing Document | Publishing Date | Country | Kind
---|---|---|---
WO2019/176604 | 9/19/2019 | WO | A

Number | Name | Date | Kind
---|---|---|---
20060086892 | Chen | Apr 2006 | A1
20080007180 | Kesterson | Jan 2008 | A1
20080100225 | Fujie | May 2008 | A1
20090152449 | Goto | Jun 2009 | A1
20090323366 | Furusawa | Dec 2009 | A1
20100295450 | Oishi | Nov 2010 | A1
20130116857 | Mitsugi | May 2013 | A1
20170355300 | Kurata | Dec 2017 | A1

Number | Date | Country
---|---|---
2005-075304 | Mar 2005 | JP
2008-110715 | May 2008 | JP
2009-255722 | Nov 2009 | JP
2010-006172 | Jan 2010 | JP
2015-071403 | Apr 2015 | JP
WO-2012066609 | May 2012 | WO

Entry
---
International Search Report with English translation and Written Opinion issued in corresponding application No. PCT/JP2019/008272 dated Jun. 11, 2019.

Number | Date | Country
---|---|---
20210001767 A1 | Jan 2021 | US