The present invention relates to vehicle detecting devices, light distribution controlling devices, and vehicle detecting methods.
Adaptive driving beam (ADB) control is proposed that dynamically and adaptively controls light distribution patterns based on the circumstances surrounding the vehicle. ADB control detects, with a camera, the presence of a front vehicle that should not be illuminated with high-illuminance light and blocks the region of light that corresponds to the front vehicle (see, for example, Patent Literature 1). Blocking the region of light that corresponds to the front vehicle can reduce glare caused on the driver of the front vehicle and can also improve the visibility of the driver of the host vehicle.
In implementation of ADB control, front vehicles should desirably be detected with high accuracy. The present inventors have conducted diligent investigations into techniques for detecting front vehicles and found that the accuracy of detecting front vehicles can decrease when the host vehicle becomes tilted.
The present invention has been made in the light of such circumstances and is, in one aspect, directed to providing a technique for increasing the accuracy of detecting a front vehicle.
1. To address the aforementioned issue, one aspect of the present invention provides a vehicle detecting device. This vehicle detecting device performs a process of detecting a front vehicle using an image that is based on an imaging device provided in a vehicle, and in accordance with a change in an angle of tilt of the vehicle's right-left axis, modifies an imaging region of the imaging device so as to mitigate a shift in the imaging region caused by the change.
Another aspect of the present invention provides a light distribution controlling device. This light distribution controlling device, based on a detection result of the vehicle detecting device of the aspect above, sets a light distribution pattern that includes a shaded region to overlap the front vehicle.
Yet another aspect of the present invention provides a vehicle detecting method. This vehicle detecting method includes performing a process of detecting a front vehicle using an image that is based on an imaging device provided in a vehicle, and in accordance with a change in an angle of tilt of the vehicle's right-left axis, modifying an imaging region of the imaging device so as to mitigate a shift in the imaging region caused by the change.
2. To address the aforementioned issue, one aspect of the present invention provides a vehicle detecting device. This vehicle detecting device sets a processing region in an image that is based on an imaging device provided in a vehicle, performs a process of detecting a front vehicle from the processing region, and in accordance with a change in an angle of tilt of the vehicle's right-left axis, modifies the processing region so as to mitigate a shift in the processing region caused by the change. Another aspect of the present invention provides
a light distribution controlling device. This light distribution controlling device, based on a detection result of the vehicle detecting device of the aspect above, sets a light distribution pattern that includes a shaded region to overlap the front vehicle.
Yet another aspect of the present invention provides a vehicle detecting method. This vehicle detecting method includes setting a processing region in an image that is based on an imaging device provided in a vehicle, performing a process of detecting a front vehicle from the processing region, and in accordance with a change in an angle of tilt of the vehicle's right-left axis, modifying the processing region so as to mitigate a shift in the processing region caused by the change.
3. To address the aforementioned issue, one aspect of the present invention provides a vehicle detecting device. This vehicle detecting device, using a structuring element of a predetermined shape, performs a dilation process on a luminous point included in an image that is based on an imaging device provided in a vehicle; when there is another dilated luminous point to be coupled to the dilated luminous point, detects a front vehicle based on the two luminous points; and in accordance with a change in an angle of tilt of the vehicle's right-left axis, modifies the structuring element so as to absorb an offset between the luminous points in the image caused by the change.
Another aspect of the present invention provides a light distribution controlling device. This light distribution controlling device, based on a detection result of the vehicle detecting device of the aspect above, sets a light distribution pattern that includes a shaded region to overlap the front vehicle.
Yet another aspect of the present invention provides a vehicle detecting method. This vehicle detecting method includes, by using a structuring element of a predetermined shape, performing a dilation process on a luminous point included in an image that is based on an imaging device provided in a vehicle; when there is another dilated luminous point to be coupled to the dilated luminous point, detecting a front vehicle based on the two luminous points; and in accordance with a change in an angle of tilt of the vehicle's right-left axis, modifying the structuring element so as to absorb an offset between the luminous points in the image caused by the change.
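The dilation-and-coupling logic of this aspect can be sketched in Python. The pixel-set representation, the square structuring element, and the sizing constants (`base_halfwidth`, `gain`) are illustrative assumptions for the sketch, not taken from the source:

```python
def dilate(points, se_offsets):
    """Dilate a set of luminous-point pixels (y, x) by a structuring
    element given as a set of (dy, dx) offsets."""
    return {(y + dy, x + dx) for (y, x) in points for (dy, dx) in se_offsets}

def structuring_element(tilt_deg, base_halfwidth=1, gain=0.2):
    """Square structuring element whose half-width grows with the bank
    angle; `gain` pixels per degree is an illustrative constant."""
    r = base_halfwidth + int(gain * abs(tilt_deg))
    return {(dy, dx) for dy in range(-r, r + 1) for dx in range(-r, r + 1)}

def touches(a, b):
    """Two dilated luminous points are coupled when their dilations overlap."""
    return bool(a & b)
```

With the element widened at a larger bank angle, two lamp luminous points that have drifted apart in the tilted image still overlap after dilation and can be coupled into one front-vehicle candidate.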
4. To address the aforementioned issue, one aspect of the present invention provides a vehicle detecting device. This vehicle detecting device, when two luminous points included in an image that is based on an imaging device provided in a vehicle satisfy a predetermined pair determining condition, detects a front vehicle based on the two luminous points, and in accordance with a change in an angle of tilt of the vehicle's right-left axis, relaxes the pair determining condition so as to absorb an offset between the luminous points in the image caused by the change.
Another aspect of the present invention provides a light distribution controlling device. This light distribution controlling device, based on a detection result of the vehicle detecting device of the aspect above, sets a light distribution pattern that includes a shaded region to overlap the front vehicle.
Yet another aspect of the present invention provides a vehicle detecting method. This vehicle detecting method includes, when two luminous points included in an image that is based on an imaging device provided in a vehicle satisfy a predetermined pair determining condition, detecting a front vehicle based on the two luminous points, and in accordance with a change in an angle of tilt of the vehicle's right-left axis, relaxing the pair determining condition so as to absorb an offset between the luminous points in the image caused by the change.
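A minimal sketch of a relaxed pair determining condition follows. The source leaves the condition unspecified beyond being "known," so the vertical-offset criterion and the tolerance constants here are illustrative assumptions:

```python
def is_pair(p1, p2, tilt_deg, dy_base=4.0, gain=0.5):
    """Decide whether two luminous points (x, y) in image coordinates form a
    lamp pair. The vertical-offset tolerance is relaxed in proportion to the
    bank angle; `dy_base` and `gain` are illustrative constants."""
    (x1, y1), (x2, y2) = p1, p2
    dy_limit = dy_base + gain * abs(tilt_deg)  # relaxed as the tilt grows
    horizontally_separated = abs(x1 - x2) > 0
    return horizontally_separated and abs(y1 - y2) <= dy_limit
```

When the host vehicle banks, the two lamps of a front vehicle appear at different image heights; relaxing the tolerance lets such an offset pair still be detected as one vehicle.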
It is to be noted that any combination of the constituent elements above, or an embodiment obtained by converting what is expressed by the present invention between a method, an apparatus, a system, and so forth, is also effective as an aspect of the present invention.
Embodiments will now be described, by way of example only, with reference to the accompanying drawings, which are meant to be exemplary, not limiting, and wherein like elements are numbered alike in the several Figures.
Hereinafter, the present invention will be described based on some preferred embodiments and with reference to the drawings. The embodiments are illustrative in nature and are not intended to limit the invention. Not all the features and combinations thereof described according to the embodiments are necessarily essential to the invention. Identical or equivalent constituent elements, members, and processes illustrated in the drawings are given identical reference characters, and their duplicate description will be omitted as appropriate.
The scales and shapes of the components illustrated in the drawings are set merely for convenience in order to facilitate description and are not to be interpreted as limiting, unless specifically indicated otherwise. When terms such as “first” and “second” are used in the present specification or in the claims, these terms do not indicate the order or the levels of importance in any way and are merely used to distinguish one component from another component, unless specifically indicated otherwise. One or more of the members that are not important in describing the embodiments are omitted from the drawings.
The vehicle lamp system 1 includes a light distribution variable lamp 2, an imaging device 4, a vehicle detecting device 6, a light distribution controlling device 8, and an attitude sensor 10. These components are provided in a vehicle. A vehicle in which the vehicle lamp system 1 is provided according to the present embodiment is a straddle-type vehicle, such as a motorcycle. The light distribution variable lamp 2, the imaging device 4, the vehicle detecting device 6, the light distribution controlling device 8, and the attitude sensor 10 may all be contained within a single housing, or some of these members may be provided outside such a housing.
For example, the light distribution variable lamp 2, the imaging device 4, the vehicle detecting device 6, the light distribution controlling device 8, and the attitude sensor 10 are housed in a lamp room. The lamp room is defined by a lamp body having an opening that opens to the front side of the vehicle and a light-transmissive cover attached so as to cover the opening of the lamp body. The imaging device 4, the vehicle detecting device 6, the light distribution controlling device 8, and the attitude sensor 10 may be provided outside the lamp room, such as in the vehicle. In this case, the imaging device 4 may be an onboard camera. The vehicle detecting device 6 and the light distribution controlling device 8 may be, for example, entirely or partly constituted by a vehicle ECU. An imaging adjusting unit 30 and so forth, described later, of the vehicle detecting device 6 may be embedded in the imaging device 4.
The light distribution variable lamp 2 is capable of illuminating a region ahead of the host vehicle with a visible light beam L1 of a variable intensity distribution. The light distribution variable lamp 2 is enabled to vary the illuminance of light in a plurality of illuminated individual regions R arrayed ahead of the vehicle independently of each other. The plurality of individual regions R are arrayed, for example, in a matrix. The light distribution variable lamp 2 receives information instructing on a light distribution pattern PTN from the light distribution controlling device 8 and emits a visible light beam L1 having an intensity distribution corresponding to the light distribution pattern PTN. Thus, the light distribution pattern PTN is formed ahead of the host vehicle. A light distribution pattern PTN can be understood as a two-dimensional illuminance distribution of an illumination pattern 902 that the light distribution variable lamp 2 forms on an imaginary vertical screen 900 ahead of the host vehicle.
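Since a light distribution pattern PTN is a two-dimensional illuminance distribution over individual regions R arrayed in a matrix, it can be modeled as a grid of per-region illuminance levels. The helper names and values below are hypothetical, introduced only for illustration:

```python
def make_pattern(rows, cols, base_level=1.0):
    """A light distribution pattern as a grid of per-region illuminance
    levels, one entry per individual region R."""
    return [[base_level] * cols for _ in range(rows)]

def shade(pattern, top, left, bottom, right):
    """Zero out the individual regions overlapping a detected front vehicle,
    forming a shaded region in the pattern (bounds are inclusive)."""
    for r in range(top, bottom + 1):
        for c in range(left, right + 1):
            pattern[r][c] = 0.0
    return pattern
```

Sending such a grid to the lamp corresponds to instructing on a pattern PTN; the shaded entries are the regions whose light is blocked.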
There is no particular limitation on the configuration of the light distribution variable lamp 2, and the light distribution variable lamp 2 includes, for example, a plurality of light sources arrayed in a matrix and a lighting circuit that drives the light sources to turn them on independently of each other. Some preferred examples of such a light source include a semiconductor light source, such as a light-emitting diode (LED), a laser diode (LD), or an organic or inorganic electroluminescence (EL) light source. The light sources are mapped to the respective individual regions R, and each individual region R is individually illuminated with light from the corresponding light source. The light distribution variable lamp 2 has a resolution, that is, a light distribution resolving power of, for example, from 1,000 pixels to 2,000,000 pixels. The resolution of the light distribution variable lamp 2 means the number of the unit regions, of a light distribution pattern PTN, whose illuminance can be varied independently of each other.
In order to form an illuminance distribution corresponding to a light distribution pattern PTN, the light distribution variable lamp 2 may include, for example but not limited to, a pattern forming device of a matrix type, such as a digital mirror device (DMD) or a liquid-crystal device, or a pattern forming device of a scan optics type that scans the space ahead of the host vehicle with light from a light source. In an alternative configuration, illumination of light directed to the region ahead of the host vehicle may be blocked partly by a shading plate.
The imaging device 4 has a sensitivity to a visible light range and repeatedly captures an image of the region ahead of the host vehicle. The imaging device 4 captures an image of reflected light L2 of a visible light beam L1 reflected from an object located ahead of the vehicle. The imaging device 4 also captures an image of light from a front vehicle, including a leading vehicle and an oncoming vehicle. An image IMG that the imaging device 4 generates is sent to the vehicle detecting device 6.
An image IMG that the vehicle detecting device 6 acquires from the imaging device 4 may be RAW image data or image data subjected to predetermined image processing by the imaging device 4. When the vehicle detecting device 6 receives image data resulting from subjecting RAW image data generated by the imaging device 4 to image processing performed by a processing device other than the imaging device 4, this receiving also corresponds to the acquiring of an image IMG from the imaging device 4. In the following description, “an image IMG that is based on the imaging device 4” may mean either RAW image data or data subjected to image processing. Furthermore, the expression “image IMG” may be used without any distinction between the two types of image data.
The vehicle detecting device 6 detects a front vehicle using an image IMG that is based on the imaging device 4. The vehicle detecting device 6 sends its detection result to the light distribution controlling device 8. Herein, the vehicle detecting device 6 may also detect a target other than a front vehicle. The vehicle detecting device 6 can be constituted by a digital processor, and for example, the vehicle detecting device 6 may be constituted by a combination of a microcomputer including a CPU and a software program or by a field programmable gate array (FPGA), an application specific IC (ASIC), or the like. The vehicle detecting device 6 includes, in one example, an image processing unit 12, a detection unit 14, and an imaging adjusting unit 30. Each of these units operates as the integrated circuit constituting it executes a program stored in a memory. An operation of each of these units will be described later.
The light distribution controlling device 8 executes ADB control of dynamically and adaptively controlling the light distribution of the light distribution variable lamp 2 in accordance with a target present in the region ahead. The light distribution controlling device 8 according to the present embodiment sets a light distribution pattern PTN that includes a shaded region to overlap a front vehicle, based on a detection result of the vehicle detecting device 6. The light distribution controlling device 8 sends information instructing on a light distribution pattern PTN to the light distribution variable lamp 2. The light distribution controlling device 8 can be constituted by a digital processor, and for example, the light distribution controlling device 8 may be constituted by a combination of a microcomputer including a CPU and a software program or by a field programmable gate array (FPGA), an application specific IC (ASIC), or the like. The light distribution controlling device 8 includes, in one example, a pattern determining unit 18 and a lamp controlling unit 20. Each of these units operates as the integrated circuit constituting it executes a program stored in a memory. An operation of each of these units will be described later.
The attitude sensor 10 measures the tilt of the host vehicle in the right-left direction. That is, the attitude sensor 10 measures the angle of tilt of the vehicle's right-left axis with respect to a horizontal line or a vertical line (in other words, the angle of tilt of the vehicle's up-down axis with respect to a horizontal line or a vertical line). The attitude sensor 10 is constituted, for example but not limited to, by a known inertial measurement unit (IMU). The attitude sensor 10 sends its measurement result to the vehicle detecting device 6.
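For a stationary or steadily moving vehicle, the bank angle can be estimated from the lateral and vertical components of the gravity vector measured by the IMU's accelerometer. This is a static-vehicle approximation introduced for illustration: in a banked turn, centripetal acceleration shifts the apparent gravity, so a practical IMU fuses gyroscope data as well:

```python
import math

def bank_angle_deg(acc_y, acc_z):
    """Estimate the tilt of the vehicle's right-left axis (bank angle) from
    the lateral (y) and vertical (z) gravity components, in degrees.
    Static approximation; real systems also fuse gyro measurements."""
    return math.degrees(math.atan2(acc_y, acc_z))
```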
Next, an operation of the vehicle detecting device 6 and of the light distribution controlling device 8 will be described.
When a front vehicle 200 is in the imaging region 4a of the imaging device 4, an image IMG that includes the front vehicle 200 is generated by the imaging device 4 and sent to the vehicle detecting device 6. The imaging region 4a, of the imaging device 4, used for vehicle detection or light distribution control, that is, a sensing region of the imaging device 4 is typically set larger in the right-left direction of the host vehicle 100 than in its up-down direction. This is because the range in which a front vehicle 200 to be detected appears or the shape of a light distribution pattern PTN is larger in the right-left direction than in the up-down direction. Herein, the imaging region 4a covers a range smaller than the maximum imageable range of the imaging device 4.
The image IMG sent to the vehicle detecting device 6 is acquired by the image processing unit 12. The image processing unit 12 performs known image processing, such as binarization, on the image IMG. This image processing generates a luminous point image IMGa in which two luminous points 202 corresponding to lamps of the front vehicle 200 are extracted, as shown in
The detection unit 14 determines the presence of a front vehicle 200 using a luminous point image IMGa. When a luminous point image IMGa includes luminous points 202, the detection unit 14 detects a front vehicle 200 based on these luminous points 202. For example, the detection unit 14 compares luminous points 202 against a known pair determining condition, and when having identified two luminous points 202 that satisfy the pair determining condition, the detection unit 14 determines these two luminous points 202 as a front vehicle 200. The detection unit 14 sends its detection result to the light distribution controlling device 8. Herein, there is no particular limitation on the vehicle detecting method that the detection unit 14 performs. For example, without generating a luminous point image IMGa, the detection unit 14 may detect a front vehicle 200 directly from an image IMG using a known method including, for example, algorithm recognition or deep learning.
The detection result sent to the light distribution controlling device 8 is acquired by the pattern determining unit 18. When a front vehicle 200 is detected, the pattern determining unit 18 sets a shaded region 24 to overlap the front vehicle 200 in a light distribution pattern that serves as a base, as shown in
A light distribution pattern that serves as a base is selected in accordance with a light distribution mode that is set based, for example, on the driver's instruction input by operating the light switch (not shown), the traveling state of the host vehicle, or the environment surrounding the host vehicle. Light distribution modes include, for example, a high beam mode of forming a high beam light distribution pattern, a low beam mode of forming a low beam light distribution pattern, and a town mode of forming a light distribution pattern suitable for traveling in a city area.
The lamp controlling unit 20 instructs the light distribution variable lamp 2 to form the light distribution pattern PTN. The lamp controlling unit 20 is constituted, for example, by a known LED driver module (LDM). If the light sources of the light distribution variable lamp 2 are controlled by analog dimming, the lamp controlling unit 20 adjusts the direct current level of the driving current flowing in the light sources. Meanwhile, if the light sources are controlled by pulse width modulation (PWM) dimming, the lamp controlling unit 20 adjusts the mean level of the driving current by switching the current that flows in the light sources to adjust the ratio of the on periods. If the light distribution variable lamp 2 includes a DMD, the lamp controlling unit 20 may control the on/off switching of each mirror element constituting the DMD. If the light distribution variable lamp 2 includes a liquid-crystal device, the lamp controlling unit 20 may control the optical transmittance of the liquid-crystal device. Thus, the light distribution pattern PTN is formed ahead of the host vehicle.
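The PWM relationship described above reduces to a simple proportionality: switching a fixed peak current on for a fraction of each period scales the mean current by that fraction. A sketch, with hypothetical names:

```python
def pwm_mean_current(peak_current_a, duty_ratio):
    """Mean driving current under PWM dimming: the fixed peak current flows
    only during the on periods, so the mean scales linearly with the ratio
    of on time per period (the duty ratio)."""
    if not 0.0 <= duty_ratio <= 1.0:
        raise ValueError("duty ratio must be within [0, 1]")
    return peak_current_a * duty_ratio
```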
Next, how an imaging region 4a is adjusted when a host vehicle 100 becomes tilted sideways will be described.
An image IMG obtained when the imaging region 4a is tilted is an image that is tilted with respect to the ground surface. With an image IMG tilted with respect to the ground surface, a lamp or lamps of a front vehicle 200 may lie outside the image IMG. Therefore, a luminous point image IMGa obtained from such an image IMG may fail to include a luminous point 202, as shown in
Accordingly, the vehicle detecting device 6 according to the present embodiment, in accordance with a change in the angle of tilt (bank angle) of the vehicle's right-left axis relative to a horizontal line, modifies the imaging region 4a so as to mitigate a shift in the imaging region 4a caused by the change.
Specifically, the measurement result sent from the attitude sensor 10 to the vehicle detecting device 6 is acquired by the imaging adjusting unit 30. Based on the acquired measurement result, the imaging adjusting unit 30 can detect a change in the angle of tilt of the host vehicle 100. In response to detecting a change in the angle of tilt of the host vehicle 100, the imaging adjusting unit 30 modifies the imaging region 4a so as to mitigate a shift in the imaging region 4a with respect to the region ahead. In other words, with the portion of the region ahead that overlaps the imaging region 4a when the host vehicle 100 is not tilted set as a reference portion, the imaging adjusting unit 30 modifies the imaging region 4a so that the overlap (the area) between the reference portion and the imaging region 4a when the host vehicle 100 is tilted becomes larger than it would be if no such adjustment were made to the imaging region 4a. This adjustment makes a front vehicle 200 more likely to be included in the imaging region 4a even when the host vehicle 100 becomes tilted.
The imaging adjusting unit 30 modifies at least one of a region range of the imaging region 4a in the up-down direction of the vehicle or a region angle of the imaging region 4a. First, a case in which the imaging adjusting unit 30 modifies the region range of the imaging region 4a will be described.
When the host vehicle 100 becomes tilted sideways and the angle of tilt of its right-left axis becomes a second angle greater than a first angle with respect to a horizontal line, the imaging adjusting unit 30 sets the region range in the up-down direction of the vehicle to a second range R2 greater than a first range R1, as shown in
For example, the imaging adjusting unit 30 holds, in advance, a threshold of the angle of tilt of the vehicle's right-left axis, and when the angle of tilt of the host vehicle 100 has exceeded the threshold, the imaging adjusting unit 30 expands the region range of the imaging region 4a from the first range R1 to the second range R2. This threshold can be set as appropriate based on an experiment or simulation conducted by an engineer. Expanding the range of the imaging region 4a in the up-down direction from the first range R1 to the second range R2 corresponds to the mitigating of a shift in the imaging region 4a. Herein, as long as either at least a portion of the upper side of the imaging region 4a is expanded upward or at least a portion of the lower side of the imaging region 4a is expanded downward, such expanding corresponds to the shift mitigation.
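The threshold-based switching of the region range can be sketched as follows; the threshold and the pixel ranges standing in for the first range R1 and the second range R2 are illustrative values, not from the source:

```python
def region_range(tilt_deg, threshold_deg=10.0, r1=(120, 480), r2=(0, 600)):
    """Vertical pixel range of the imaging region: the default first range R1
    while the bank angle stays at or below the threshold, and the wider second
    range R2 once the threshold is exceeded. All constants are illustrative."""
    return r2 if abs(tilt_deg) > threshold_deg else r1
```

A continuous or multi-step variant would map the magnitude of the tilt to a progressively wider range instead of using a single threshold.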
As the imaging region 4a is expanded from the first range R1 to the second range R2, the likelihood that a front vehicle 200 lies outside the imaging region 4a can be reduced even when the host vehicle 100 becomes tilted sideways. Accordingly, it becomes more likely that a luminous point image IMGa includes two luminous points 202 corresponding to lamps of a front vehicle 200, as shown in
Herein, when the host vehicle 100 is tilted at a first angle greater than 0 degrees, vehicle detection may be performed with the use of the imaging region 4a of the first range R1, and when the host vehicle 100 becomes further tilted to a second angle greater than the first angle, vehicle detection may be performed with the use of the imaging region 4a of the second range R2. Furthermore, the imaging adjusting unit 30 may modify the region range in the up-down direction continuously or stepwise in three or more steps, in accordance with a change in the angle of tilt of the vehicle's right-left axis. In other words, the region range may be increased as the angle of tilt of the host vehicle 100 increases.
Next, a case in which the imaging adjusting unit 30 modifies the region angle of the imaging region 4a will be described.
When the angle of tilt of the right-left axis of the host vehicle 100 is a first angle with respect to a horizontal line, the region angle of the imaging region 4a is a third angle θ3. For example, the imaging region 4a of the third angle θ3 is held by default in the vehicle detecting device 6. When the host vehicle 100 is substantially not tilted, that is, when the angle of tilt of the vehicle's right-left axis is about 0 degrees (first angle≈0 degrees), the vehicle detecting device 6 detects a front vehicle 200 using the imaging region 4a of the third angle θ3 (θ3≈0 degrees), as shown in
When the host vehicle 100 becomes tilted sideways and the angle of tilt of its right-left axis becomes a second angle greater than the first angle with respect to a horizontal line, the imaging adjusting unit 30 sets the region angle of the imaging region 4a to a fourth angle θ4 greater than the third angle θ3, as shown in
For example, the imaging adjusting unit 30 holds, in advance, a threshold of the angle of tilt of the vehicle's right-left axis, and when the angle of tilt of the host vehicle 100 has exceeded the threshold, the imaging adjusting unit 30 modifies the region angle of the imaging region 4a from the third angle θ3 to the fourth angle θ4. This threshold can be set as appropriate based on an experiment or simulation conducted by an engineer. Increasing the angle of the imaging region 4a with respect to the vehicle's right-left axis from the third angle θ3 to the fourth angle θ4 corresponds to the mitigating of a shift in the imaging region 4a. Herein, as long as the angle of at least a portion of the imaging region 4a is modified, such modifying corresponds to the shift mitigation.
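Setting the region angle amounts to rotating the imaging region about its center, for example from a third angle θ3 of about 0 degrees to a larger fourth angle θ4 that follows the bank angle. A geometric sketch with hypothetical helper names:

```python
import math

def rotate_region(corners, angle_deg, center):
    """Rotate imaging-region corner points (x, y) about `center` by
    `angle_deg`, counterclockwise, so the region angle tracks the tilt."""
    a = math.radians(angle_deg)
    cx, cy = center
    rotated = []
    for x, y in corners:
        dx, dy = x - cx, y - cy
        rotated.append((cx + dx * math.cos(a) - dy * math.sin(a),
                        cy + dx * math.sin(a) + dy * math.cos(a)))
    return rotated
```

Because only the orientation changes, the shape and the area of the region are retained, in line with the region-angle adjustment described above.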
As the imaging region 4a is tilted from the third angle θ3 to the fourth angle θ4 with respect to the vehicle's right-left axis, the likelihood that a front vehicle 200 lies outside the imaging region 4a can be reduced even when the host vehicle 100 becomes tilted sideways. Accordingly, it becomes more likely that a luminous point image IMGa includes two luminous points 202 corresponding to lamps of a front vehicle 200, as shown in
Herein, when the host vehicle 100 is tilted at a first angle greater than 0 degrees, vehicle detection may be performed with the use of the imaging region 4a of the third angle θ3, and when the host vehicle 100 becomes further tilted to a second angle greater than the first angle, vehicle detection may be performed with the use of the imaging region 4a of the fourth angle θ4. Furthermore, the imaging adjusting unit 30 may modify the region angle continuously or stepwise in three or more steps, in accordance with a change in the angle of tilt of the vehicle's right-left axis. In other words, the region angle may be increased as the angle of tilt of the host vehicle 100 increases.
The imaging adjusting unit 30 may modify both the region range and the region angle of the imaging region 4a. This can further reduce the likelihood that a front vehicle 200 lies outside the imaging region 4a due to the tilting of the host vehicle 100. Accordingly, a front vehicle 200 can be detected with higher accuracy.
First, the vehicle detecting device 6 acquires the angle of tilt of the host vehicle 100 from the attitude sensor 10 (S101). Then, the vehicle detecting device 6 determines whether the angle of tilt of the right-left axis of the host vehicle 100 is greater than a threshold (S102). If the angle of tilt is greater than the threshold (Y at S102), the vehicle detecting device 6 adjusts at least one of the region range or the region angle of the imaging region 4a (S103), and proceeds to step S104. If the angle of tilt is not greater than the threshold (N at S102), the vehicle detecting device 6 proceeds to step S104 without adjusting the imaging region 4a.
Next, the vehicle detecting device 6 acquires an image IMG from the imaging device 4 (S104). Then, the vehicle detecting device 6 generates a luminous point image IMGa from the acquired image IMG (S105). The vehicle detecting device 6 performs a vehicle detecting process using the generated luminous point image IMGa (S106). Then, the vehicle detecting device 6 determines whether a front vehicle 200 is present based on the result of the vehicle detecting process (S107). If a front vehicle 200 is present (Y at S107), the light distribution controlling device 8 controls the light distribution variable lamp 2 so as to form a light distribution pattern PTN that includes a shaded region 24 (S108), and terminates the present routine. If no front vehicle 200 is present (N at S107), the light distribution controlling device 8 controls the light distribution variable lamp 2 so as to form a light distribution pattern PTN that does not include a shaded region 24 (S109), and terminates the present routine.
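One pass of the routine S101 through S109 can be sketched as a single function; the callables stand in for the units of the vehicle detecting device 6 and the light distribution controlling device 8 and are assumptions of this sketch:

```python
def control_cycle(tilt_deg, image, detect, adjust_region, form_pattern,
                  threshold_deg=10.0):
    """One cycle of the flow: S101 (tilt acquired as `tilt_deg`),
    S102/S103 (adjust the imaging region when the bank angle exceeds the
    threshold), S104-S107 (detect a front vehicle in the image), and
    S108/S109 (form a pattern with or without a shaded region)."""
    if abs(tilt_deg) > threshold_deg:          # S102 -> S103
        image = adjust_region(image)
    vehicle = detect(image)                    # S104-S107
    return form_pattern(shaded=vehicle is not None)  # S108 / S109
```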
As described above, the vehicle detecting device 6 according to the present embodiment performs a process of detecting a front vehicle 200 using an image IMG that is based on the imaging device 4 provided in the host vehicle 100. Furthermore, the vehicle detecting device 6, in accordance with a change in the angle of tilt of the vehicle's right-left axis with respect to a horizontal line, modifies the imaging region 4a so as to mitigate a shift in the imaging region 4a (imaging angle) of the imaging device 4 caused by the change. Adjusting the imaging region 4a dynamically in accordance with the tilt of the host vehicle 100 makes it possible to avoid missing a front vehicle 200. Accordingly, the accuracy of detecting a front vehicle 200 can be increased.
The vehicle detecting device 6 according to the present embodiment modifies at least one of the region range of the imaging region 4a in the up-down direction of the vehicle or the region angle of the imaging region 4a. In other words, the vehicle detecting device 6 adjusts at least one of the shape or the orientation of the imaging region 4a.
When modifying the region range, the vehicle detecting device 6 sets the region range to the first range R1 when the angle of tilt is the first angle and sets the region range to the second range R2 greater than the first range R1 when the angle of tilt is the second angle greater than the first angle. Since the imaging region 4a is kept relatively small when the angle of tilt of the host vehicle 100 is relatively small and made relatively large only when the angle of tilt is relatively large, the processing load on the vehicle detecting device 6 can be reduced, as compared to the case of performing vehicle detection using an imaging region 4a that is large by default regardless of the angle of tilt of the host vehicle 100. Furthermore, this can suppress an erroneous determination in which a luminous point 202 that does not originate from a front vehicle 200 is mistaken for one.
Meanwhile, when modifying the region angle, the vehicle detecting device 6 sets the region angle with respect to the vehicle's right-left axis to the third angle θ3 when the angle of tilt is the first angle and sets the region angle with respect to the vehicle's right-left axis to the fourth angle θ4 greater than the third angle θ3 when the angle of tilt is the second angle greater than the first angle. Modifying the region angle in accordance with the tilt of the host vehicle 100 while retaining the shape of the imaging region 4a in this manner can suppress erroneous determination.
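As a rough sketch of the adjustment described above, the following illustrates how an imaging-region setting could be selected from the bank angle. The threshold, the range values, and the function name are hypothetical stand-ins for illustration, not values taken from the embodiment.

```python
# Illustrative sketch only: the numeric values below are assumptions.
FIRST_RANGE_DEG = 20.0     # first range R1: vertical extent at small tilt
SECOND_RANGE_DEG = 35.0    # second range R2: expanded vertical extent
TILT_THRESHOLD_DEG = 10.0  # bank-angle threshold separating the two settings

def imaging_region_settings(tilt_deg: float) -> tuple[float, float]:
    """Return (vertical range, region angle) for a given bank angle.

    The imaging region is kept small and level while the vehicle is nearly
    upright, and is enlarged and rotated once the bank angle exceeds the
    threshold, so the region keeps covering the area ahead despite the tilt.
    """
    if abs(tilt_deg) <= TILT_THRESHOLD_DEG:
        return FIRST_RANGE_DEG, 0.0          # first range R1, third angle θ3 ≈ 0
    # second range R2; rotate toward the bank angle (fourth angle θ4)
    return SECOND_RANGE_DEG, abs(tilt_deg)

print(imaging_region_settings(3.0))   # (20.0, 0.0)
print(imaging_region_settings(25.0))  # (35.0, 25.0)
```

Keeping the smaller first range R1 as the default reflects the processing-load argument above: the larger region is used only when the tilt actually requires it.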
The host vehicle 100 provided with the vehicle detecting device 6 according to the present embodiment is a straddle-type vehicle. The vehicle's right-left axis of a straddle-type vehicle tends to become tilted to a greater extent than that of a typical four-wheeled automobile. Therefore, the vehicle detecting device 6 can exhibit its functionality effectively especially in straddle-type vehicles. Herein, the host vehicle 100 may be a vehicle other than a straddle-type vehicle, such as a four-wheeled automobile.
The light distribution controlling device 8 according to the present embodiment sets a light distribution pattern PTN that includes a shaded region 24 to overlap a front vehicle 200, based on a detection result of the vehicle detecting device 6. This configuration makes it possible to achieve, with higher accuracy, both the reduction of glare caused on the driver of a front vehicle 200 and the improvement in the visibility of the driver of the host vehicle 100. Accordingly, safety in vehicle driving can be improved.
Thus far, Embodiment 1 of the present invention has been described in detail. The embodiment described above merely illustrates a specific example for implementing the present invention. The contents of the embodiment do not limit the technical scope of the present invention, and a number of design changes, including modifications, additions, and deletions of constituent elements, can be made within the scope that does not depart from the spirit of the invention set forth in the claims. A new embodiment resulting from such a design change has the advantageous effects of the embodiment and of the modification combined. With regard to the embodiment described above, the expressions “according to the present embodiment,” “in the present embodiment,” and so on are added for emphasis to the content that can be subjected to such a design change as described above, but such a design change is also permitted on the content without these expressions. Any desired combination of the constituent elements described above is also valid as an aspect of the present invention. Hatching applied to a cross section in the drawings does not limit the material of the hatched object.
The invention according to Embodiment 1 described above may be identified through the items indicated below.
A vehicle detecting device (6) that:
The vehicle detecting device (6) according to First Item, wherein
The vehicle detecting device (6) according to First Item or Second Item, wherein
A light distribution controlling device (8) that:
A vehicle detecting method including:
The vehicle lamp system 1 includes a light distribution variable lamp 2, an imaging device 4, a vehicle detecting device 6, a light distribution controlling device 8, and an attitude sensor 10. These components are provided in a vehicle. The vehicle in which the vehicle lamp system 1 according to the present embodiment is provided is a straddle-type vehicle, such as a motorcycle. The light distribution variable lamp 2, the imaging device 4, the vehicle detecting device 6, the light distribution controlling device 8, and the attitude sensor 10 may all be contained within a single housing, or some of these members may be provided outside such a housing.
For example, the light distribution variable lamp 2, the imaging device 4, the vehicle detecting device 6, the light distribution controlling device 8, and the attitude sensor 10 are housed in a lamp room. The lamp room is defined by a lamp body having an opening that opens to the front side of the vehicle and a light-transmissive cover attached so as to cover the opening of the lamp body. The imaging device 4, the vehicle detecting device 6, the light distribution controlling device 8, and the attitude sensor 10 may be provided outside the lamp room, such as in the vehicle. In this case, the imaging device 4 may be an onboard camera. The vehicle detecting device 6 and the light distribution controlling device 8 may be, for example, entirely or partly constituted by a vehicle ECU.
The light distribution variable lamp 2 is capable of illuminating a region ahead of the host vehicle with a visible light beam L1 of a variable intensity distribution. The light distribution variable lamp 2 is enabled to vary the illuminance of light in a plurality of illuminated individual regions R arrayed ahead of the vehicle independently of each other. The plurality of individual regions R are arrayed, for example, in a matrix. The light distribution variable lamp 2 receives information instructing on a light distribution pattern PTN from the light distribution controlling device 8 and emits a visible light beam L1 having an intensity distribution corresponding to the light distribution pattern PTN. Thus, the light distribution pattern PTN is formed ahead of the host vehicle. A light distribution pattern PTN can be understood as a two-dimensional illuminance distribution of an illumination pattern 902 that the light distribution variable lamp 2 forms on an imaginary vertical screen 900 ahead of the host vehicle.
There is no particular limitation on the configuration of the light distribution variable lamp 2, and the light distribution variable lamp 2 includes, for example, a plurality of light sources arrayed in a matrix and a lighting circuit that drives the light sources to turn them on independently of each other. Some preferred examples of such a light source include a semiconductor light source, such as a light-emitting diode (LED), a laser diode (LD), or an organic or inorganic electroluminescence (EL) light source. The light sources are mapped to the respective individual regions R, and each individual region R is individually illuminated with light from the corresponding light source. The light distribution variable lamp 2 has a resolution, that is, a light distribution resolving power of, for example, from 1,000 pixels to 2,000,000 pixels. The resolution of the light distribution variable lamp 2 means the number of the unit regions, of a light distribution pattern PTN, whose illuminance can be varied independently of each other.
In order to form an illuminance distribution corresponding to a light distribution pattern PTN, the light distribution variable lamp 2 may include, for example but not limited to, a pattern forming device of a matrix type, such as a digital mirror device (DMD) or a liquid-crystal device, or a pattern forming device of a scan optics type that scans the space ahead of the host vehicle with light from a light source. In an alternative configuration, illumination of light directed to the region ahead of the host vehicle may be blocked partly by a shading plate.
The imaging device 4 has a sensitivity to a visible light range and repeatedly captures an image of the region ahead of the host vehicle. The imaging device 4 captures an image of reflected light L2 of a visible light beam L1 reflected from an object located ahead of the vehicle. The imaging device 4 also captures an image of light from a front vehicle, including a leading vehicle and an oncoming vehicle. An image IMG that the imaging device 4 generates is sent to the vehicle detecting device 6.
An image IMG that the vehicle detecting device 6 acquires from the imaging device 4 may be RAW image data or image data subjected to predetermined image processing by the imaging device 4. When the vehicle detecting device 6 receives image data resulting from subjecting RAW image data generated by the imaging device 4 to image processing performed by a processing device other than the imaging device 4, this receiving also corresponds to the acquiring of an image IMG from the imaging device 4. In the following description, “an image IMG that is based on the imaging device 4” may mean either of RAW image data and data subjected to image processing. Furthermore, the expression “image IMG” may be used without any distinction between the two types of image data.
The vehicle detecting device 6 detects a front vehicle using an image IMG that is based on the imaging device 4. The vehicle detecting device 6 sends its detection result to the light distribution controlling device 8. Herein, the vehicle detecting device 6 may detect a target aside from a front vehicle. The vehicle detecting device 6 can be constituted by a digital processor; for example, the vehicle detecting device 6 may be constituted by a combination of a microcomputer including a CPU and a software program, or by a field-programmable gate array (FPGA), an application-specific integrated circuit (ASIC), or the like. The vehicle detecting device 6 includes, in one example, a region setting unit 26, an image processing unit 12, a detection unit 14, and a region adjusting unit 28. Each of these units operates as the integrated circuit constituting it executes a program stored in a memory. An operation of each of these units will be described later.
The light distribution controlling device 8 executes ADB control of dynamically and adaptively controlling the light distribution of the light distribution variable lamp 2 in accordance with a target present in the region ahead. The light distribution controlling device 8 according to the present embodiment sets a light distribution pattern PTN that includes a shaded region to overlap a front vehicle, based on a detection result of the vehicle detecting device 6. The light distribution controlling device 8 sends information instructing on a light distribution pattern PTN to the light distribution variable lamp 2. The light distribution controlling device 8 can be constituted by a digital processor; for example, the light distribution controlling device 8 may be constituted by a combination of a microcomputer including a CPU and a software program, or by a field-programmable gate array (FPGA), an application-specific integrated circuit (ASIC), or the like. The light distribution controlling device 8 includes, in one example, a pattern determining unit 18 and a lamp controlling unit 20. Each of these units operates as the integrated circuit constituting it executes a program stored in a memory. An operation of each of these units will be described later.
The attitude sensor 10 measures the tilt of the host vehicle in the right-left direction. That is, the attitude sensor 10 measures the angle of tilt of the vehicle's right-left axis with respect to a horizontal line or a vertical line (in other words, the angle of tilt of the vehicle's up-down axis with respect to a horizontal line or a vertical line). The attitude sensor 10 is constituted, for example but not limited to, by a known inertial measurement unit (IMU). The attitude sensor 10 sends its measurement result to the vehicle detecting device 6.
Next, an operation of the vehicle detecting device 6 and of the light distribution controlling device 8 will be described.
When a front vehicle 200 is present in the region ahead of the host vehicle 100, an image IMG that includes the front vehicle 200 is generated by the imaging device 4 and sent to the vehicle detecting device 6. The image IMG sent to the vehicle detecting device 6 is acquired by the region setting unit 26. The region setting unit 26 sets a processing region ROI in the image IMG. A processing region ROI is a region in which, for example, a front vehicle 200 is predicted to appear, and can be set in advance based on an experiment or simulation conducted by an engineer. In one example, a processing region ROI is a rectangular region that extends in the width direction of the vehicle and lies above a horizontal line. The region setting unit 26 sends the image IMG having the processing region ROI set therein to the image processing unit 12.
The image processing unit 12 extracts the processing region ROI from the image IMG, as shown in
The detection unit 14 determines the presence of the front vehicle 200 using the luminous point image IMGa. Performing a vehicle detecting process on the luminous point image IMGa generated from the processing region ROI corresponds to performing a vehicle detecting process on the processing region ROI. When the luminous point image IMGa includes luminous points 202, the detection unit 14 detects a front vehicle 200 based on these luminous points 202. The detection unit 14 according to the present embodiment determines that luminous points 202 present in a processing region ROI are derived from a front vehicle 200, or in other words, determines that a front vehicle 200 is present at the position where the luminous points 202 are present. The detection unit 14 sends its detection result to the light distribution controlling device 8. Herein, there is no particular limitation on the vehicle detecting method that the detection unit 14 performs. For example, without generating a luminous point image IMGa, the detection unit 14 may detect a front vehicle 200 from a processing region ROI using a known method, such as algorithmic recognition or deep learning.
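The luminous-point-based determination described above can be illustrated as follows. The brightness threshold, the 2D-list image representation, and the function names are assumptions for illustration only, not details from the embodiment.

```python
# Illustrative sketch: threshold value and image format are assumptions.
THRESHOLD = 200  # pixels at or above this brightness count as luminous points

def luminous_point_image(roi: list[list[int]]) -> list[list[int]]:
    """Binarize a grayscale ROI: 1 where a luminous point is present, else 0."""
    return [[1 if px >= THRESHOLD else 0 for px in row] for row in roi]

def has_luminous_points(binary: list[list[int]]) -> bool:
    """A front vehicle is judged present when the ROI contains luminous points."""
    return any(any(row) for row in binary)

roi = [
    [10, 12, 240, 11],   # one bright point, e.g. a tail lamp
    [ 9, 11,  10, 235],  # another bright point
    [ 8, 10,  12, 11],
]
binary = luminous_point_image(roi)
print(has_luminous_points(binary))  # True
```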
The detection result sent to the light distribution controlling device 8 is acquired by the pattern determining unit 18. When a front vehicle 200 is detected, the pattern determining unit 18 sets a shaded region 24 to overlap the front vehicle 200 in a light distribution pattern that serves as a base, as shown in
A light distribution pattern that serves as a base is selected in accordance with a light distribution mode that is set based on, for example, the driver's instruction input by operating the light switch (not shown), the traveling state of the host vehicle, or the environment surrounding the host vehicle. Light distribution modes include, for example, a high beam mode of forming a high beam light distribution pattern, a low beam mode of forming a low beam light distribution pattern, and a town mode of forming a light distribution pattern suitable for traveling in a city area.
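Setting a shaded region in a base pattern can be sketched as follows. The matrix representation of the pattern and the coordinates of the vehicle's bounding box are illustrative assumptions.

```python
# Illustrative sketch: pattern layout and coordinates are assumptions.
def apply_shaded_region(pattern, top, bottom, left, right):
    """Zero the illuminance of the individual regions overlapping the vehicle."""
    shaded = [row[:] for row in pattern]  # copy so the base pattern is kept
    for r in range(top, bottom + 1):
        for c in range(left, right + 1):
            shaded[r][c] = 0
    return shaded

base = [[100] * 6 for _ in range(4)]         # e.g. a uniform high beam pattern
ptn = apply_shaded_region(base, 1, 2, 2, 4)  # bounding box of the front vehicle
print(ptn[1][3], ptn[0][0])  # 0 100
```

The illuminance outside the shaded region 24 is untouched, which corresponds to reducing glare for the front vehicle while keeping the host driver's visibility elsewhere.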
The lamp controlling unit 20 instructs the light distribution variable lamp 2 to form the light distribution pattern PTN. The lamp controlling unit 20 is constituted, for example, by a known LED driver module (LDM). If the light sources of the light distribution variable lamp 2 are controlled by analog dimming, the lamp controlling unit 20 adjusts the direct-current level of the driving current flowing in the light sources. Meanwhile, if the light sources are controlled by pulse width modulation (PWM) dimming, the lamp controlling unit 20 adjusts the mean level of the driving current by switching the current that flows in the light sources to adjust the ratio of the on periods. If the light distribution variable lamp 2 includes a DMD, the lamp controlling unit 20 may control the on/off switching of each mirror element constituting the DMD. If the light distribution variable lamp 2 includes a liquid-crystal device, the lamp controlling unit 20 may control the optical transmittance of the liquid-crystal device. Thus, the light distribution pattern PTN is formed ahead of the host vehicle.
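The two dimming schemes can be summarized numerically as follows; the current values and function names are assumptions used purely for illustration.

```python
# Illustrative sketch: current values are assumptions.
def analog_dim_current(max_current_a: float, level: float) -> float:
    """Analog dimming: scale the DC level of the driving current directly."""
    return max_current_a * level

def pwm_mean_current(peak_current_a: float, duty_ratio: float) -> float:
    """PWM dimming: the mean current equals the peak current times the
    ratio of the on periods (the duty ratio)."""
    return peak_current_a * duty_ratio

print(analog_dim_current(1.0, 0.4))  # 0.4 (amperes, DC)
print(pwm_mean_current(1.0, 0.4))    # 0.4 (amperes, mean via switching)
```

Both schemes reach the same mean current; they differ in whether the current level itself or the on-period ratio is adjusted.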
Next, how a processing region ROI is adjusted when a host vehicle 100 becomes tilted sideways will be described.
An image IMG obtained when the imaging region 4a is tilted sideways is an image that is tilted with respect to the ground surface. Therefore, a processing region ROI extracted from this image IMG is also tilted with respect to the ground surface, as shown in
Accordingly, the vehicle detecting device 6 according to the present embodiment, in accordance with a change in the angle of tilt (bank angle) of the vehicle's right-left axis relative to a horizontal line, modifies a processing region ROI so as to mitigate a shift in the processing region ROI caused by the change. In other words, the vehicle detecting device 6 sets, in an image IMG, a processing region ROI modified so as to mitigate the shift.
Specifically, the measurement result sent from the attitude sensor 10 to the vehicle detecting device 6 is acquired by the region adjusting unit 28. Based on the acquired measurement result, the region adjusting unit 28 can detect a change in the angle of tilt of the host vehicle 100. In response to detecting a change in the angle of tilt of the host vehicle 100, the region adjusting unit 28 modifies the processing region ROI so as to mitigate a shift in the processing region ROI with respect to the region ahead. In other words, with the portion of the region ahead that overlaps the processing region ROI when the host vehicle 100 is not tilted set as a reference portion, the region adjusting unit 28 modifies the processing region ROI so that the overlap (area) between the reference portion and the processing region ROI when the host vehicle 100 is tilted becomes larger than it would be without the adjustment. This adjustment makes a front vehicle 200 more likely to be included in the processing region ROI even when the host vehicle 100 becomes tilted.
The region adjusting unit 28 modifies at least one of a region range of the processing region ROI in the up-down direction of the vehicle or a region angle of the processing region ROI. First, a case in which the region adjusting unit 28 modifies the region range of the processing region ROI will be described.
When the host vehicle 100 becomes tilted sideways and the angle of tilt of its right-left axis becomes a second angle greater than the first angle with respect to a horizontal line, the region adjusting unit 28 sets the region range in the up-down direction of the vehicle to a second range R2 greater than the first range R1, as shown in
For example, the region adjusting unit 28 holds, in advance, a threshold of the angle of tilt of the vehicle's right-left axis, and when the angle of tilt of the host vehicle 100 has exceeded the threshold, the region adjusting unit 28 expands the region range of the processing region ROI from the first range R1 to the second range R2. This threshold can be set as appropriate based on an experiment or simulation conducted by an engineer. Expanding the range of the processing region ROI in the up-down direction from the first range R1 to the second range R2 corresponds to the mitigating of a shift in the processing region ROI. Herein, expanding at least a portion of the upper side of the processing region ROI upward, or at least a portion of its lower side downward, corresponds to this shift mitigation.
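The threshold-based expansion described above can be sketched as follows. The pixel bounds, the threshold, and the expansion amounts are hypothetical values, not taken from the embodiment.

```python
# Illustrative sketch: pixel coordinates and threshold are assumptions.
TILT_THRESHOLD_DEG = 10.0
EXPAND_UP_PX = 40    # extra extent added above the default ROI
EXPAND_DOWN_PX = 40  # extra extent added below the default ROI

def adjust_roi(top: int, bottom: int, tilt_deg: float) -> tuple[int, int]:
    """Return the (top, bottom) bounds of the ROI for the given bank angle.

    Below the threshold the default first range R1 is kept; above it, the
    ROI is expanded both upward and downward to the second range R2.
    (Image coordinates: smaller values are higher in the image.)
    """
    if abs(tilt_deg) <= TILT_THRESHOLD_DEG:
        return top, bottom
    return top - EXPAND_UP_PX, bottom + EXPAND_DOWN_PX

print(adjust_roi(200, 320, 4.0))   # (200, 320): first range R1
print(adjust_roi(200, 320, 18.0))  # (160, 360): second range R2
```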
The image processing unit 12 extracts the processing region ROI of the second range R2 from the image IMG, as shown in
Herein, when the host vehicle 100 is tilted at a first angle greater than 0 degrees, vehicle detection may be performed with the use of the processing region ROI of the first range R1, and when the host vehicle 100 becomes further tilted at a second angle greater than the first angle, vehicle detection may be performed with the use of the processing region ROI of the second range R2. Furthermore, the region adjusting unit 28 may modify the region range in the up-down direction continuously, or in three or more steps, in accordance with a change in the angle of tilt of the vehicle's right-left axis. In other words, the region range may be increased as the angle of tilt of the host vehicle 100 increases.
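A continuous variant of the range adjustment might look as follows; the limit values and the linear mapping are assumptions chosen only to illustrate a range that grows with the bank angle.

```python
# Illustrative sketch: limits and linear mapping are assumptions.
FIRST_RANGE_PX = 120   # first range R1 at 0 degrees of tilt
SECOND_RANGE_PX = 200  # second range R2 at the maximum expected tilt
MAX_TILT_DEG = 40.0

def continuous_roi_range(tilt_deg: float) -> float:
    """Interpolate the ROI's vertical range between R1 and R2 as the bank
    angle grows, clamping at R2 beyond the maximum expected tilt."""
    t = min(abs(tilt_deg) / MAX_TILT_DEG, 1.0)
    return FIRST_RANGE_PX + t * (SECOND_RANGE_PX - FIRST_RANGE_PX)

print(continuous_roi_range(0.0))   # 120.0
print(continuous_roi_range(20.0))  # 160.0
print(continuous_roi_range(60.0))  # 200.0 (clamped)
```

A stepwise variant with three or more steps would replace the interpolation with a lookup over tilt-angle intervals.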
Next, a case in which the region adjusting unit 28 modifies the region angle of the processing region ROI will be described.
When the angle of tilt of the right-left axis of the host vehicle 100 is a first angle with respect to a horizontal line, the region angle of the processing region ROI is a third angle θ3. For example, the processing region ROI of the third angle θ3 is held by default in the vehicle detecting device 6. When the host vehicle 100 is substantially not tilted, that is, when the angle of tilt of the vehicle's right-left axis is about 0 degrees (first angle≈0 degrees), the vehicle detecting device 6 detects a front vehicle 200 using the processing region ROI of the third angle θ3 (θ3≈0 degrees), as shown in
When the host vehicle 100 becomes tilted sideways and the angle of tilt of its right-left axis becomes a second angle greater than the first angle with respect to a horizontal line, the region adjusting unit 28 sets the region angle of the processing region ROI to a fourth angle θ4 greater than the third angle θ3, as shown in
For example, the region adjusting unit 28 holds, in advance, a threshold of the angle of tilt of the vehicle's right-left axis, and when the angle of tilt of the host vehicle 100 has exceeded the threshold, the region adjusting unit 28 modifies the region angle of the processing region ROI from the third angle θ3 to the fourth angle θ4. This threshold can be set as appropriate based on an experiment or simulation conducted by an engineer. Increasing the angle of the processing region ROI with respect to the vehicle's right-left axis from the third angle θ3 to the fourth angle θ4 corresponds to the mitigating of a shift in the processing region ROI. Herein, modifying the angle of at least a portion of the processing region ROI corresponds to this shift mitigation.
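Tilting the ROI while retaining its shape amounts to rotating its corner points about a fixed center. The corner coordinates and the rotation center below are illustrative assumptions.

```python
import math

# Illustrative sketch: corner coordinates and center are assumptions.
def rotate_roi(corners, angle_deg, cx, cy):
    """Rotate the ROI's corner points by angle_deg about (cx, cy), keeping
    the region's shape while changing its angle."""
    a = math.radians(angle_deg)
    cos_a, sin_a = math.cos(a), math.sin(a)
    return [
        (cx + (x - cx) * cos_a - (y - cy) * sin_a,
         cy + (x - cx) * sin_a + (y - cy) * cos_a)
        for x, y in corners
    ]

level_roi = [(0, 0), (100, 0), (100, 30), (0, 30)]  # third angle θ3 = 0
tilted_roi = rotate_roi(level_roi, 15.0, 50, 15)    # fourth angle θ4 = 15 degrees
```

Because only the orientation changes, the area processed per frame stays the same, which matches the remark above that this approach retains the ROI's shape.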
The image processing unit 12 extracts the processing region ROI of the fourth angle θ4 from the image IMG, as shown in
Herein, when the host vehicle 100 is tilted at a first angle greater than 0 degrees, vehicle detection may be performed with the use of the processing region ROI of the third angle θ3, and when the host vehicle 100 becomes further tilted at a second angle greater than the first angle, vehicle detection may be performed with the use of the processing region ROI of the fourth angle θ4. Furthermore, the region adjusting unit 28 may modify the region angle continuously, or in three or more steps, in accordance with a change in the angle of tilt of the vehicle's right-left axis. In other words, the region angle may be increased as the angle of tilt of the host vehicle 100 increases.
The region adjusting unit 28 may modify both the region range and the region angle of the processing region ROI. This can further reduce the likelihood that a front vehicle 200 lies outside the processing region ROI due to the tilting of the host vehicle 100. Accordingly, a front vehicle 200 can be detected with higher accuracy.
First, the vehicle detecting device 6 acquires an image IMG from the imaging device 4. The vehicle detecting device 6 also acquires the angle of tilt of the host vehicle 100 from the attitude sensor 10 (S101). Then, the vehicle detecting device 6 determines whether the angle of tilt of the right-left axis of the host vehicle 100 is greater than a threshold (S102). If the angle of tilt is greater than the threshold (Y at S102), the vehicle detecting device 6 adjusts at least one of the region range or the region angle of the processing region ROI (S103), and proceeds to step S104. If the angle of tilt is not greater than the threshold (N at S102), the vehicle detecting device 6 proceeds to step S104 without adjusting the processing region ROI.
Next, the vehicle detecting device 6 extracts the processing region ROI from the image IMG and generates a luminous point image IMGa from the processing region ROI (S104). The vehicle detecting device 6 performs a vehicle detecting process using the generated luminous point image IMGa (S105). Then, the vehicle detecting device 6 determines whether a front vehicle 200 is present based on the result of the vehicle detecting process (S106). If a front vehicle 200 is present (Y at S106), the light distribution controlling device 8 controls the light distribution variable lamp 2 so as to form a light distribution pattern PTN that includes a shaded region 24 (S107), and terminates the present routine. If no front vehicle 200 is present (N at S106), the light distribution controlling device 8 controls the light distribution variable lamp 2 so as to form a light distribution pattern PTN that does not include a shaded region 24 (S108), and terminates the present routine.
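The routine of steps S101 to S108 can be condensed as follows. The callables standing in for the detection steps (S104 to S106) and the lamp-control steps (S107/S108), as well as the threshold value, are hypothetical.

```python
# Illustrative sketch: threshold and callable stand-ins are assumptions.
TILT_THRESHOLD_DEG = 10.0

def run_cycle(image, tilt_deg, detect_vehicle, form_pattern):
    """One pass of the detection and light-distribution routine.

    detect_vehicle(image, roi_adjusted) -> bool stands in for S104-S106;
    form_pattern(shaded=...) stands in for S107/S108.
    """
    # S102/S103: adjust the ROI only when the bank angle exceeds the threshold
    roi_adjusted = abs(tilt_deg) > TILT_THRESHOLD_DEG
    # S104-S106: extract the ROI, generate the luminous point image, detect
    front_vehicle = detect_vehicle(image, roi_adjusted)
    # S107/S108: form a pattern with or without a shaded region
    form_pattern(shaded=front_vehicle)
    return roi_adjusted, front_vehicle
```

A single cycle thus couples the attitude-dependent ROI adjustment directly to whether the resulting pattern contains a shaded region 24.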
As described above, the vehicle detecting device 6 according to the present embodiment sets a processing region ROI in an image IMG that is based on the imaging device 4 provided in the host vehicle 100 and performs a process of detecting a front vehicle 200 from the processing region ROI. According to the present embodiment, the vehicle detecting process is performed after a processing region ROI has been converted to a luminous point image IMGa. As the target of vehicle detection is narrowed to a processing region ROI, the load on the vehicle detecting device 6 can be reduced, as compared to the case of detecting a vehicle from an entire image IMG.
Furthermore, the vehicle detecting device 6, in accordance with a change in the angle of tilt of the vehicle's right-left axis relative to a horizontal line, modifies the processing region ROI so as to mitigate a shift in the processing region ROI caused by the change. Adjusting the processing region ROI dynamically in accordance with the tilt of the host vehicle 100 in this manner makes it possible to avoid missing a front vehicle 200. Accordingly, the accuracy of detecting a front vehicle 200 can be increased. The vehicle detecting device 6 according to the present embodiment modifies at least one of the region range of the processing region ROI in the up-down direction of the vehicle or the region angle of the processing region ROI. In other words, the vehicle detecting device 6 adjusts at least one of the shape or the orientation of the processing region ROI.
When modifying the region range, the vehicle detecting device 6 sets the region range to the first range R1 when the angle of tilt is the first angle and sets the region range to the second range R2 greater than the first range R1 when the angle of tilt is the second angle greater than the first angle. As the processing region ROI is made relatively small when the angle of tilt of the host vehicle 100 is relatively small and is made relatively large when the angle of tilt is relatively large, the processing load on the vehicle detecting device 6 can be reduced, as compared to the case of performing vehicle detection using a processing region ROI that is large by default regardless of the angle of tilt of the host vehicle 100. Furthermore, the above can suppress an erroneous determination in which a luminous point 202 that does not originate from a front vehicle 200 is judged to have come from a front vehicle 200.
Meanwhile, when modifying the region angle, the vehicle detecting device 6 sets the region angle with respect to the vehicle's right-left axis to the third angle θ3 when the angle of tilt is the first angle and sets the region angle with respect to the vehicle's right-left axis to the fourth angle θ4 greater than the third angle θ3 when the angle of tilt is the second angle greater than the first angle. Modifying the region angle in accordance with the tilt of the host vehicle 100 while retaining the shape of the processing region ROI in this manner can suppress erroneous determination.
The host vehicle 100 provided with the vehicle detecting device 6 according to the present embodiment is a straddle-type vehicle. The vehicle's right-left axis of a straddle-type vehicle tends to become tilted to a greater extent than that of a typical four-wheeled automobile. Therefore, the vehicle detecting device 6 can exhibit its functionality effectively especially in straddle-type vehicles. Herein, the host vehicle 100 may be a vehicle other than a straddle-type vehicle, such as a four-wheeled automobile.
The light distribution controlling device 8 according to the present embodiment sets a light distribution pattern PTN that includes a shaded region 24 to overlap a front vehicle 200, based on a detection result of the vehicle detecting device 6. This configuration makes it possible to achieve, with higher accuracy, both the reduction of glare caused on the driver of a front vehicle 200 and the improvement in the visibility of the driver of the host vehicle 100. Accordingly, safety in vehicle driving can be improved.
Thus far, Embodiment 2 of the present invention has been described in detail. The embodiment described above merely illustrates a specific example for implementing the present invention. The contents of the embodiment do not limit the technical scope of the present invention, and a number of design changes, including modifications, additions, and deletions of constituent elements, can be made within the scope that does not depart from the spirit of the invention set forth in the claims. A new embodiment resulting from such a design change has the advantageous effects of the embodiment and of the modification combined. With regard to the embodiment described above, the expressions “according to the present embodiment,” “in the present embodiment,” and so on are added for emphasis to the content that can be subjected to such a design change as described above, but such a design change is also permitted on the content without these expressions. Any desired combination of the constituent elements described above is also valid as an aspect of the present invention. Hatching applied to a cross section in the drawings does not limit the material of the hatched object.
The invention according to Embodiment 2 described above may be identified through the items indicated below.
A vehicle detecting device (6) that:
The vehicle detecting device (6) according to First Item, wherein
The vehicle detecting device (6) according to First Item or Second Item, wherein
A light distribution controlling device (8) that:
A vehicle detecting method including:
The vehicle lamp system 1 includes a light distribution variable lamp 2, an imaging device 4, a vehicle detecting device 6, a light distribution controlling device 8, and an attitude sensor 10. These components are provided in a vehicle. The vehicle in which the vehicle lamp system 1 according to the present embodiment is provided is a straddle-type vehicle, such as a motorcycle. The light distribution variable lamp 2, the imaging device 4, the vehicle detecting device 6, the light distribution controlling device 8, and the attitude sensor 10 may all be contained within a single housing, or some of these members may be provided outside such a housing.
For example, the light distribution variable lamp 2, the imaging device 4, the vehicle detecting device 6, the light distribution controlling device 8, and the attitude sensor 10 are housed in a lamp room. The lamp room is defined by a lamp body having an opening that opens to the front side of the vehicle and a light-transmissive cover attached so as to cover the opening of the lamp body. The imaging device 4, the vehicle detecting device 6, the light distribution controlling device 8, and the attitude sensor 10 may be provided outside the lamp room, such as in the vehicle. In this case, the imaging device 4 may be an onboard camera. The vehicle detecting device 6 and the light distribution controlling device 8 may be, for example, entirely or partly constituted by a vehicle ECU.
The light distribution variable lamp 2 is capable of illuminating a region ahead of the host vehicle with a visible light beam L1 of a variable intensity distribution. The light distribution variable lamp 2 is enabled to vary the illuminance of light in a plurality of illuminated individual regions R arrayed ahead of the vehicle independently of each other. The plurality of individual regions R are arrayed, for example, in a matrix. The light distribution variable lamp 2 receives information instructing on a light distribution pattern PTN from the light distribution controlling device 8 and emits a visible light beam L1 having an intensity distribution corresponding to the light distribution pattern PTN. Thus, the light distribution pattern PTN is formed ahead of the host vehicle. A light distribution pattern PTN can be understood as a two-dimensional illuminance distribution of an illumination pattern 902 that the light distribution variable lamp 2 forms on an imaginary vertical screen 900 ahead of the host vehicle.
There is no particular limitation on the configuration of the light distribution variable lamp 2, and the light distribution variable lamp 2 includes, for example, a plurality of light sources arrayed in a matrix and a lighting circuit that drives the light sources to turn them on independently of each other. Some preferred examples of such a light source include a semiconductor light source, such as a light-emitting diode (LED), a laser diode (LD), or an organic or inorganic electroluminescence (EL) light source. The light sources are mapped to the respective individual regions R, and each individual region R is individually illuminated with light from the corresponding light source. The light distribution variable lamp 2 has a resolution, that is, a light distribution resolving power of, for example, from 1,000 pixels to 2,000,000 pixels. The resolution of the light distribution variable lamp 2 means the number of the unit regions, of a light distribution pattern PTN, whose illuminance can be varied independently of each other.
In order to form an illuminance distribution corresponding to a light distribution pattern PTN, the light distribution variable lamp 2 may include, for example but not limited to, a pattern forming device of a matrix type, such as a digital mirror device (DMD) or a liquid-crystal device, or a pattern forming device of a scan optics type that scans the space ahead of the host vehicle with light from a light source. In an alternative configuration, illumination of light directed to the region ahead of the host vehicle may be blocked partly by a shading plate.
The imaging device 4 has sensitivity in the visible light range and repeatedly captures an image of the region ahead of the host vehicle. The imaging device 4 captures an image of reflected light L2 of a visible light beam L1 reflected from an object located ahead of the vehicle. The imaging device 4 also captures an image of light from a front vehicle, including a leading vehicle and an oncoming vehicle. An image IMG that the imaging device 4 generates is sent to the vehicle detecting device 6.
An image IMG that the vehicle detecting device 6 acquires from the imaging device 4 may be RAW image data or image data subjected to predetermined image processing by the imaging device 4. When the vehicle detecting device 6 receives image data resulting from subjecting RAW image data generated by the imaging device 4 to image processing performed by a processing device other than the imaging device 4, this receiving also corresponds to the acquiring of an image IMG from the imaging device 4. In the following description, “an image IMG that is based on the imaging device 4” may mean either RAW image data or data subjected to image processing. Furthermore, the expression “image IMG” may be used without any distinction between the two types of image data.
The vehicle detecting device 6 detects a front vehicle using an image IMG that is based on the imaging device 4. The vehicle detecting device 6 sends its detection result to the light distribution controlling device 8. Herein, the vehicle detecting device 6 may also detect targets other than a front vehicle. The vehicle detecting device 6 can be constituted by a digital processor; for example, the vehicle detecting device 6 may be constituted by a combination of a microcomputer including a CPU and a software program, or by a field programmable gate array (FPGA), an application specific IC (ASIC), or the like. The vehicle detecting device 6 includes, in one example, an image processing unit 12, a detection unit 14, and an element adjusting unit 32. Each of these units operates as the integrated circuit constituting it executes a program stored in a memory. An operation of each of these units will be described later.
The light distribution controlling device 8 executes ADB control of dynamically and adaptively controlling the light distribution of the light distribution variable lamp 2 in accordance with a target present in the region ahead. The light distribution controlling device 8 according to the present embodiment sets a light distribution pattern PTN that includes a shaded region to overlap a front vehicle, based on a detection result of the vehicle detecting device 6. The light distribution controlling device 8 sends information instructing on a light distribution pattern PTN to the light distribution variable lamp 2. The light distribution controlling device 8 can be constituted by a digital processor; for example, the light distribution controlling device 8 may be constituted by a combination of a microcomputer including a CPU and a software program, or by a field programmable gate array (FPGA), an application specific IC (ASIC), or the like. The light distribution controlling device 8 includes, in one example, a pattern determining unit 18 and a lamp controlling unit 20. Each of these units operates as the integrated circuit constituting it executes a program stored in a memory. An operation of each of these units will be described later.
The attitude sensor 10 measures the tilt of the host vehicle in the right-left direction. That is, the attitude sensor 10 measures the angle of tilt of the vehicle's right-left axis with respect to a horizontal line or a vertical line (in other words, the angle of tilt of the vehicle's up-down axis with respect to a horizontal line or a vertical line). The attitude sensor 10 is constituted, for example but not limited to, by a known inertial measurement unit (IMU). The attitude sensor 10 sends its measurement result to the vehicle detecting device 6.
Next, an operation of the vehicle detecting device 6 and of the light distribution controlling device 8 will be described.
When a front vehicle 200 is present in the region ahead of the host vehicle 100, an image IMG that includes the front vehicle 200 is generated by the imaging device 4 and sent to the vehicle detecting device 6. The image IMG sent to the vehicle detecting device 6 is acquired by the image processing unit 12. The image processing unit 12 performs known image processing, such as binarization, on the image IMG. This image processing generates a luminous point image IMGa in which two luminous points 202 corresponding to lamps of the front vehicle 200 are extracted, as shown in
The detection unit 14 determines the presence of the front vehicle 200 using the luminous point image IMGa. The detection unit 14 stores therein a first structuring element 34 (kernel) of a predetermined shape set in advance, as shown in
In the horizontal dilation process, when any one of the pixels that the first structuring element 34 overlaps is a pixel of high pixel value, the pixel value of the pixel of interest is converted to this high pixel value. Thus, each luminous point 202 in a luminous point image IMGa is dilated horizontally, as shown in
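The horizontal dilation process described above can be illustrated with a minimal sketch. The pure-Python implementation, the function name `dilate`, and the binary pixel values are illustrative assumptions; an actual implementation would typically use an optimized image processing library.

```python
def dilate(img, kw, kh=1):
    """Binary dilation with a kw x kh rectangular structuring element
    anchored at its center: the pixel of interest takes the high pixel
    value (1) if any pixel the element overlaps is 1."""
    h, w = len(img), len(img[0])
    rx, ry = kw // 2, kh // 2
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            for dy in range(-ry, ry + 1):
                for dx in range(-rx, rx + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w and img[yy][xx]:
                        out[y][x] = 1
    return out

# Two luminous points five pixels apart; a 7-pixel-wide element
# dilates each point by 3 pixels per side, so the regions couple.
row = [[0, 0, 1, 0, 0, 0, 0, 1, 0, 0]]
print(dilate(row, kw=7))  # the whole row becomes one dilated region
```

In this sketch the merged run of 1s plays the role of the horizontally dilated region 36; a luminous point gap wider than the element's reach would leave two separate runs.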
When there are two dilated luminous points 202 to be coupled to each other through the horizontal dilation process performed with the use of the first structuring element 34, the detection unit 14 detects a front vehicle 200 based on these two luminous points 202. When two dilated luminous points 202 to be coupled to each other are present in a luminous point image IMGa, the detection unit 14 according to the present embodiment determines that these two luminous points 202 are derived from a front vehicle 200, or in other words, determines that a front vehicle 200 is present at the position where these two luminous points 202 are present. Normally, when an image of a leading vehicle's rear lamps or an oncoming vehicle's headlamps is captured, these lamps appear, in an image IMG, as two luminous points 202 that are aligned at the same height while horizontally spaced apart by a predetermined gap therebetween. Therefore, a front vehicle 200 can be detected based on the positional relationship of these two luminous points 202 within the image IMG.
The detection unit 14 also performs a horizontal erosion process on a luminous point image IMGa with the use of the first structuring element 34. In the horizontal erosion process, when any one of the pixels that the first structuring element 34 overlaps is a pixel of low pixel value, the pixel value of the pixel of interest is converted to this low pixel value. Thus, the horizontally dilated region 36 is eroded in the horizontal direction, and a horizontally eroded region 38 is generated, as shown in
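The horizontal erosion process can be sketched in the same spirit (again a pure-Python illustration with assumed names; pixels outside the image are simply ignored here, which is one of several possible border conventions):

```python
def erode(img, kw, kh=1):
    """Binary erosion with a kw x kh rectangular structuring element:
    the pixel of interest keeps the high value (1) only if no pixel
    the element overlaps has the low value (0)."""
    h, w = len(img), len(img[0])
    rx, ry = kw // 2, kh // 2
    out = [[1] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            for dy in range(-ry, ry + 1):
                for dx in range(-rx, rx + 1):
                    yy, xx = y + dy, x + dx
                    # pixels outside the image are ignored (treated as 1)
                    if 0 <= yy < h and 0 <= xx < w and not img[yy][xx]:
                        out[y][x] = 0
    return out

# A horizontally dilated region shrinks by one pixel on each side
# when eroded with a 3-pixel-wide element.
dilated = [[0, 1, 1, 1, 1, 1, 1, 1, 1, 0]]
print(erode(dilated, kw=3))
```

Dilation followed by erosion with the same element is the morphological closing operation: it couples nearby luminous points into one region while roughly restoring the original extent of the lit area.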
The detection result sent to the light distribution controlling device 8 is acquired by the pattern determining unit 18. When the front vehicle 200 is detected, the pattern determining unit 18 sets a light distribution pattern PTN that includes a shaded region to overlap the front vehicle 200. The shaded region is a portion of the light distribution pattern PTN where the luminance (illuminance) is zero, or a portion of the light distribution pattern PTN where the luminance is reduced to a level higher than zero but lower than the level held prior to shading. When no front vehicle 200 is detected, the pattern determining unit 18 sets a light distribution pattern PTN that does not include a shaded region.
In one example, the pattern determining unit 18 acquires, as the detection result of the vehicle detecting device 6, a luminous point image IMGa subjected to a horizontal erosion process. Furthermore, the pattern determining unit 18 stores therein a second structuring element 40 of a predetermined shape set in advance, as shown in
The pattern determining unit 18 sets a shaded region 24 in the light distribution pattern that serves as a base, based on the position of the vertically dilated region 42 in the luminous point image IMGa, as shown in
Herein, the pattern determining unit 18 may reverse the order of generating the inverted image and performing the image processing that uses the second structuring element 40. In other words, the pattern determining unit 18 generates an inverted image from a luminous point image IMGa. Then, the pattern determining unit 18 performs a vertical erosion process on the inverted image using the second structuring element 40. Thus, the region of high pixel value located over the horizontally eroded region 38 erodes in the vertical direction, and a vertically eroded region is generated. As the pattern determining unit 18 sets this vertically eroded region as a shaded region 24, the pattern determining unit 18 can determine a light distribution pattern PTN that includes the shaded region 24. Meanwhile, the pattern determining unit 18 may acquire only the position information (e.g., the angle relative to the host vehicle 100) of the front vehicle 200 as the detection result of the vehicle detecting device 6 and set a shaded region 24 based on this position information.
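As a simplified stand-in for the inversion and vertical dilation/erosion steps described above, the following sketch simply zeroes the base pattern's illuminance in the columns occupied by the eroded luminous-point mask. The names `shade_columns`, `pattern`, and `mask_row` are illustrative assumptions, not terms from the embodiment.

```python
def shade_columns(pattern, mask_row):
    """Zero the illuminance of the base light distribution pattern in
    every column where the horizontally eroded luminous-point mask is
    lit, producing a pattern with a shaded region over the detection."""
    return [[0 if mask_row[x] else v for x, v in enumerate(row)]
            for row in pattern]

base = [[255] * 6 for _ in range(3)]  # uniform high-beam base pattern
mask = [0, 1, 1, 1, 1, 0]             # columns of the eroded region
print(shade_columns(base, mask))      # middle columns are shaded to 0
```

A real implementation would bound the shaded region vertically as well, which is what the second structuring element 40 accomplishes in the embodiment.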
A light distribution pattern that serves as a base is selected in accordance with a light distribution mode that is set based, for example, on the driver's instruction input by operating the light switch (not shown), the traveling state of the host vehicle, or the environment surrounding the host vehicle. Light distribution modes include, for example, a high beam mode of forming a high beam light distribution pattern, a low beam mode of forming a low beam light distribution pattern, and a town mode of forming a light distribution pattern suitable for traveling in a city area.
The lamp controlling unit 20 instructs the light distribution variable lamp 2 to form the light distribution pattern PTN. The lamp controlling unit 20 is constituted, for example, by a known LED driver module (LDM). If the light sources of the light distribution variable lamp 2 are controlled by analog dimming, the lamp controlling unit 20 adjusts the direct current level of the driving current flowing in the light sources. Meanwhile, if the light sources are controlled by pulse width modulation (PWM) dimming, the lamp controlling unit 20 adjusts the mean level of the driving current by switching the current that flows in the light sources on and off to adjust the ratio of the on periods. If the light distribution variable lamp 2 includes a DMD, the lamp controlling unit 20 may control the on/off switching of each mirror element constituting the DMD. If the light distribution variable lamp 2 includes a liquid-crystal device, the lamp controlling unit 20 may control the optical transmittance of the liquid-crystal device. Thus, the light distribution pattern PTN is formed ahead of the host vehicle.
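For PWM dimming, the mean driving current follows directly from the peak current and the on-period ratio. A one-line illustration (the function name and the numeric values are arbitrary examples):

```python
def mean_current(peak_ma, duty):
    """PWM dimming: the mean driving current equals the peak current
    scaled by the duty cycle (the ratio of the on periods)."""
    assert 0.0 <= duty <= 1.0
    return peak_ma * duty

print(mean_current(350.0, 0.5))  # 175.0 mA mean from a 350 mA peak
```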
Next, how a structuring element is adjusted when the host vehicle 100 becomes tilted sideways will be described.
An image IMG obtained when the imaging region 4a is tilted sideways is an image that is tilted with respect to the ground surface. Therefore, in a luminous point image IMGa obtained from this image IMG, two luminous points 202 corresponding to lamps of the front vehicle 200 become offset from each other in the up-down direction, as shown in
Furthermore, the process of setting a shaded region 24 described above may be executed even when no front vehicle 200 can be detected. In this case, however, the reduction of glare caused on the driver of the front vehicle 200 may be insufficient. Specifically, when a horizontal erosion process is performed on a luminous point image IMGa including two horizontally dilated regions 36 with the use of the first structuring element 34, a luminous point image IMGa that includes two horizontally eroded regions 38 is generated, as shown in
Accordingly, the vehicle detecting device 6 according to the present embodiment, in accordance with a change in the angle of tilt (bank angle) of the vehicle's right-left axis with respect to a horizontal line, modifies the first structuring element 34 so as to absorb an offset between luminous points 202 in a luminous point image IMGa caused by the change.
Specifically, the measurement result sent from the attitude sensor 10 to the vehicle detecting device 6 is acquired by the element adjusting unit 32. Based on the acquired measurement result, the element adjusting unit 32 can detect a change in the angle of tilt of the host vehicle 100. The element adjusting unit 32, in response to detecting a change in the angle of tilt of the host vehicle 100, modifies the first structuring element 34 so as to increase the permitted offset amount between two luminous points 202 in the up-down direction in a luminous point image IMGa (i.e., the offset amount between two luminous points 202, in the up-down direction, that can be coupled to each other by the first structuring element 34). Thus, two luminous points 202 offset in the up-down direction due to the tilting of the host vehicle 100 can more easily be coupled by the first structuring element 34.
When the host vehicle 100 becomes tilted sideways and the angle of tilt of its right-left axis becomes a second angle greater than the first angle with respect to a horizontal line, the element adjusting unit 32 sets the dimension of the first structuring element 34 in the up-down direction of the vehicle to a second dimension H2 greater than the first dimension H1, as shown in
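One conceivable sizing rule, an assumption for illustration rather than a rule taken from the embodiment, grows the element's up-down dimension with the sine of the bank angle: two lamps spaced a given horizontal gap apart shift vertically by roughly that gap times the sine of the angle.

```python
import math

def kernel_height(bank_deg, kernel_width, h_min=1):
    """Up-down dimension (in pixels) of the first structuring element
    as a function of the bank angle; the element must span the vertical
    offset of two lamps spaced roughly kernel_width pixels apart."""
    offset = kernel_width * math.sin(math.radians(abs(bank_deg)))
    # keep the dimension odd so the anchor stays centered
    return max(h_min, 2 * math.ceil(offset) + 1)

print(kernel_height(0, 15))   # upright: minimal height (first dimension H1)
print(kernel_height(20, 15))  # banked: taller element (second dimension H2)
```

The rule is monotonic in the bank angle, which matches the stepwise or continuous expansion discussed below.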
The detection unit 14 performs a horizontal dilation process on a luminous point image IMGa using the first structuring element 34 of the second dimension H2. In this horizontal dilation process, for example, the pixel at the center of the first structuring element 34 in the right-left direction and also at the center in the up-down direction is mapped to each pixel of interest of the luminous point image IMGa. Then, if any one of the pixels that the first structuring element 34 overlaps is a pixel of high pixel value, the pixel value of the pixel of interest is converted to this high pixel value. Thus, two luminous points 202 become dilated in the right-left direction as well as in the up-down direction, as shown in
The detection unit 14 also performs a horizontal erosion process on a luminous point image IMGa using the first structuring element 34 of the second dimension H2. Thus, a luminous point image IMGa that includes a single horizontally eroded region 38 is generated, as shown in
Herein, when the host vehicle 100 is tilted at a first angle greater than 0 degrees, vehicle detection may be performed with the use of the first structuring element 34 of the first dimension H1, and when the host vehicle 100 becomes further tilted at a second angle greater than the first angle, vehicle detection may be performed with the use of the first structuring element 34 of the second dimension H2. Furthermore, the element adjusting unit 32 may expand the dimension of the first structuring element 34 in the up-down direction continuously, or stepwise in three or more steps, in accordance with a change in the angle of tilt of the vehicle's right-left axis. In other words, the up-down dimension of the first structuring element 34 may be increased as the angle of tilt of the host vehicle 100 increases.
First, the vehicle detecting device 6 acquires an image IMG from the imaging device 4. The vehicle detecting device 6 also acquires the angle of tilt of the host vehicle 100 from the attitude sensor 10 (S101). Then, the vehicle detecting device 6 generates a luminous point image IMGa from the image IMG (S102). Next, the vehicle detecting device 6 determines whether the angle of tilt of the right-left axis of the host vehicle 100 is greater than a threshold (S103). If the angle of tilt is greater than the threshold (Y at S103), the vehicle detecting device 6 adjusts the up-down dimension of the first structuring element 34 (S104), and proceeds to step S105. If the angle of tilt is not greater than the threshold (N at S103), the vehicle detecting device 6 proceeds to step S105 without adjusting the first structuring element 34.
Next, the vehicle detecting device 6 performs a vehicle detecting process on the luminous point image IMGa using the first structuring element 34 (S105). Then, the vehicle detecting device 6 determines whether a front vehicle 200 is present based on the result of the vehicle detecting process (S106). If a front vehicle 200 is present (Y at S106), the light distribution controlling device 8 controls the light distribution variable lamp 2 so as to form a light distribution pattern PTN that includes a shaded region 24 (S107), and terminates the present routine. If no front vehicle 200 is present (N at S106), the light distribution controlling device 8 controls the light distribution variable lamp 2 so as to form a light distribution pattern PTN that does not include a shaded region 24 (S108), and terminates the present routine.
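The routine of steps S101 to S108 above can be sketched as a single control-flow function. The threshold, the two kernel heights, and the caller-supplied `detect` stand-in for the vehicle detecting process are all illustrative assumptions.

```python
def run_cycle(luminous_image, bank_deg, detect, threshold_deg=10.0,
              base_height=1, tilted_height=5):
    """One iteration of the routine above. `detect` is a stand-in for
    the vehicle detecting process: (luminous_image, kernel_height) -> bool."""
    # S103/S104: enlarge the first structuring element's up-down
    # dimension only when the bank angle exceeds the threshold
    kh = tilted_height if abs(bank_deg) > threshold_deg else base_height
    # S105/S106: vehicle detecting process with the chosen kernel height
    front_vehicle_present = detect(luminous_image, kh)
    # S107/S108: shaded pattern only when a front vehicle is present
    return "shaded" if front_vehicle_present else "unshaded"

# A dummy detector that succeeds only with an enlarged kernel mimics
# two vertically offset luminous points that need a taller element.
detector = lambda img, kh: kh >= 3
print(run_cycle([[0]], bank_deg=2, detect=detector))   # unshaded
print(run_cycle([[0]], bank_deg=25, detect=detector))  # shaded
```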
As described above, the vehicle detecting device 6 according to the present embodiment performs, using a first structuring element 34 of a predetermined shape, a dilation process on a luminous point 202 included in an image IMG (a luminous point image IMGa according to the present embodiment) that is based on the imaging device 4 provided in the host vehicle 100, and when there is another dilated luminous point 202 to be coupled to the dilated luminous point 202, the vehicle detecting device 6 detects a front vehicle 200 based on these two luminous points 202. Furthermore, the vehicle detecting device 6, in accordance with a change in the angle of tilt of the vehicle's right-left axis with respect to a horizontal line, modifies the first structuring element 34 so as to absorb an offset between two luminous points 202 in the image IMG caused by the change. As the first structuring element 34 for coupling two luminous points 202 is adjusted dynamically in accordance with the tilt of the host vehicle 100 in this manner, failing to detect a front vehicle 200 can be suppressed. Accordingly, the accuracy of detecting a front vehicle 200 can be increased.
The vehicle detecting device 6 according to the present embodiment sets the dimension of the first structuring element 34 in the up-down direction of the vehicle to the first dimension H1 when the angle of tilt of the host vehicle 100 is the first angle, and sets the dimension of the first structuring element 34 in the up-down direction of the vehicle to the second dimension H2 greater than the first dimension H1 when the angle of tilt is the second angle greater than the first angle. As the first structuring element 34 is made relatively small when the angle of tilt of the host vehicle 100 is relatively small and the first structuring element 34 is made relatively large when the angle of tilt is relatively large as described above, the processing load on the vehicle detecting device 6 can be reduced, as compared to the case of performing vehicle detection using a first structuring element 34 that is large by default regardless of the angle of tilt of the host vehicle 100. Furthermore, the above can suppress an erroneous determination in which a luminous point 202 that is not derived from a front vehicle 200 is determined to have come from a front vehicle 200.
The host vehicle 100 provided with the vehicle detecting device 6 according to the present embodiment is a straddle-type vehicle. The right-left axis of a straddle-type vehicle tends to become tilted to a greater extent than that of a typical four-wheeled automobile. Therefore, the vehicle detecting device 6 can exhibit its functionality especially effectively in straddle-type vehicles. Herein, the host vehicle 100 may be a vehicle other than a straddle-type vehicle, such as a four-wheeled automobile.
The light distribution controlling device 8 according to the present embodiment sets a light distribution pattern PTN that includes a shaded region 24 to overlap a front vehicle 200, based on a detection result of the vehicle detecting device 6. This configuration makes it possible to achieve, with higher accuracy, both the reduction of glare caused on the driver of a front vehicle 200 and the improvement in the visibility of the driver of the host vehicle 100. Accordingly, safety in vehicle driving can be improved.
Thus far, Embodiment 3 according to the present invention has been described in detail. The embodiment described above merely illustrates a specific example for implementing the present invention. The contents of the embodiment do not limit the technical scope of the present invention, and a number of design changes, including modifications, additions, and deletions of constituent elements, can be made within the scope that does not depart from the spirit of the invention set forth in the claims. A new embodiment resulting from such a design change has the advantageous effects of the embodiment and of the modification combined. With regard to the embodiment described above, the expressions “according to the present embodiment,” “in the present embodiment,” and so on are added for emphasis to the content that can be subjected to such a design change as described above, but such a design change is also permitted on the content without these expressions. Any desired combination of the constituent elements described above is also valid as an aspect of the present invention. Hatching added along a section in the drawings does not limit the material of the object with the hatching.
The invention according to Embodiment 3 described above may be identified by the items indicated below.
A vehicle detecting device (6) that:
The vehicle detecting device (6) according to First Item, wherein
The vehicle detecting device (6) according to First Item or Second Item, wherein
A light distribution controlling device (8) that:
A vehicle detecting method including:
The vehicle lamp system 1 includes a light distribution variable lamp 2, an imaging device 4, a vehicle detecting device 6, a light distribution controlling device 8, and an attitude sensor 10. These components are provided in a vehicle. A vehicle in which the vehicle lamp system 1 is provided according to the present embodiment is a straddle-type vehicle, such as a motorcycle. The light distribution variable lamp 2, the imaging device 4, the vehicle detecting device 6, the light distribution controlling device 8, and the attitude sensor 10 may all be contained within a single housing, or some of these members may be provided outside such a housing.
For example, the light distribution variable lamp 2, the imaging device 4, the vehicle detecting device 6, the light distribution controlling device 8, and the attitude sensor 10 are housed in a lamp room. The lamp room is defined by a lamp body having an opening that opens to the front side of the vehicle and a light-transmissive cover attached so as to cover the opening of the lamp body. The imaging device 4, the vehicle detecting device 6, the light distribution controlling device 8, and the attitude sensor 10 may be provided outside the lamp room, such as in the vehicle. In this case, the imaging device 4 may be an onboard camera. The vehicle detecting device 6 and the light distribution controlling device 8 may be, for example, entirely or partly constituted by a vehicle ECU.
The light distribution variable lamp 2 is capable of illuminating a region ahead of the host vehicle with a visible light beam L1 of a variable intensity distribution. The light distribution variable lamp 2 can vary the illuminance of light in a plurality of illuminated individual regions R arrayed ahead of the vehicle independently of each other. The plurality of individual regions R are arrayed, for example, in a matrix. The light distribution variable lamp 2 receives information instructing on a light distribution pattern PTN from the light distribution controlling device 8 and emits a visible light beam L1 having an intensity distribution corresponding to the light distribution pattern PTN. Thus, the light distribution pattern PTN is formed ahead of the host vehicle. A light distribution pattern PTN can be understood as a two-dimensional illuminance distribution of an illumination pattern 902 that the light distribution variable lamp 2 forms on an imaginary vertical screen 900 ahead of the host vehicle.
There is no particular limitation on the configuration of the light distribution variable lamp 2, and the light distribution variable lamp 2 includes, for example, a plurality of light sources arrayed in a matrix and a lighting circuit that drives the light sources to turn them on independently of each other. Some preferred examples of such a light source include a semiconductor light source, such as a light-emitting diode (LED), a laser diode (LD), or an organic or inorganic electroluminescence (EL) light source. The light sources are mapped to the respective individual regions R, and each individual region R is individually illuminated with light from the corresponding light source. The light distribution variable lamp 2 has a resolution, that is, a light distribution resolving power of, for example, from 1,000 pixels to 2,000,000 pixels. The resolution of the light distribution variable lamp 2 means the number of the unit regions, of a light distribution pattern PTN, whose illuminance can be varied independently of each other.
In order to form an illuminance distribution corresponding to a light distribution pattern PTN, the light distribution variable lamp 2 may include, for example but not limited to, a pattern forming device of a matrix type, such as a digital mirror device (DMD) or a liquid-crystal device, or a pattern forming device of a scan optics type that scans the space ahead of the host vehicle with light from a light source. In an alternative configuration, illumination of light directed to the region ahead of the host vehicle may be blocked partly by a shading plate.
The imaging device 4 has sensitivity in the visible light range and repeatedly captures an image of the region ahead of the host vehicle. The imaging device 4 captures an image of reflected light L2 of a visible light beam L1 reflected from an object located ahead of the vehicle. The imaging device 4 also captures an image of light from a front vehicle, including a leading vehicle and an oncoming vehicle. An image IMG that the imaging device 4 generates is sent to the vehicle detecting device 6.
An image IMG that the vehicle detecting device 6 acquires from the imaging device 4 may be RAW image data or image data subjected to predetermined image processing by the imaging device 4. When the vehicle detecting device 6 receives image data resulting from subjecting RAW image data generated by the imaging device 4 to image processing performed by a processing device other than the imaging device 4, this receiving also corresponds to the acquiring of an image IMG from the imaging device 4. In the following description, “an image IMG that is based on the imaging device 4” may mean either RAW image data or data subjected to image processing. Furthermore, the expression “image IMG” may be used without any distinction between the two types of image data.
The vehicle detecting device 6 detects a front vehicle using an image IMG that is based on the imaging device 4. The vehicle detecting device 6 sends its detection result to the light distribution controlling device 8. Herein, the vehicle detecting device 6 may also detect targets other than a front vehicle. The vehicle detecting device 6 can be constituted by a digital processor; for example, the vehicle detecting device 6 may be constituted by a combination of a microcomputer including a CPU and a software program, or by a field programmable gate array (FPGA), an application specific IC (ASIC), or the like. The vehicle detecting device 6 includes, in one example, an image processing unit 12, a detection unit 14, and a condition adjusting unit 16. Each of these units operates as the integrated circuit constituting it executes a program stored in a memory. An operation of each of these units will be described later.
The light distribution controlling device 8 executes ADB control of dynamically and adaptively controlling the light distribution of the light distribution variable lamp 2 in accordance with a target present in the region ahead. The light distribution controlling device 8 according to the present embodiment sets a light distribution pattern PTN that includes a shaded region to overlap a front vehicle, based on a detection result of the vehicle detecting device 6. The light distribution controlling device 8 sends information instructing on a light distribution pattern PTN to the light distribution variable lamp 2. The light distribution controlling device 8 can be constituted by a digital processor; for example, the light distribution controlling device 8 may be constituted by a combination of a microcomputer including a CPU and a software program, or by a field programmable gate array (FPGA), an application specific IC (ASIC), or the like. The light distribution controlling device 8 includes, in one example, a pattern determining unit 18 and a lamp controlling unit 20. Each of these units operates as the integrated circuit constituting it executes a program stored in a memory. An operation of each of these units will be described later.
The attitude sensor 10 measures the tilt of the host vehicle in the right-left direction. That is, the attitude sensor 10 measures the angle of tilt of the vehicle's right-left axis with respect to a horizontal line or a vertical line (in other words, the angle of tilt of the vehicle's up-down axis with respect to a horizontal line or a vertical line). The attitude sensor 10 is constituted, for example but not limited to, by a known inertial measurement unit (IMU). The attitude sensor 10 sends its measurement result to the vehicle detecting device 6.
Next, an operation of the vehicle detecting device 6 and of the light distribution controlling device 8 will be described.
When a front vehicle 200 is present in the region ahead of the host vehicle 100, an image IMG that includes the front vehicle 200 is generated by the imaging device 4 and sent to the vehicle detecting device 6. The image IMG sent to the vehicle detecting device 6 is acquired by the image processing unit 12. The image processing unit 12 performs known image processing, such as binarization, on the image IMG. This image processing generates a luminous point image IMGa in which two luminous points 202 corresponding to lamps of the front vehicle 200 are extracted, as shown in
The detection unit 14 determines the presence of the front vehicle 200 using the luminous point image IMGa. The detection unit 14 stores therein a predetermined pair determining condition. Then, as shown in
The detection unit 14 according to the present embodiment holds therein information about a rectangular frame 22 elongated in the horizontal direction of the luminous point image IMGa, as a determination element of the pair determining condition. The pair determining condition is satisfied when two luminous points 202 are contained within the frame 22 in a luminous point image IMGa. In other words, when two luminous points 202 are contained within the frame 22, these two luminous points 202 are determined to be of a front vehicle 200. Normally, when an image of a leading vehicle's rear lamps or an oncoming vehicle's headlamps is captured, these lamps appear, in an image IMG, as two luminous points 202 that are present at the same height while spaced apart horizontally by a predetermined gap therebetween. Therefore, a front vehicle 200 can be detected based on the positional relationship of two luminous points 202 within an image IMG.
Specifically, the detection unit 14 superposes the frame 22 on each luminous point 202 in the luminous point image IMGa so that the luminous point 202 is contained within the frame 22. Then, the detection unit 14 moves each frame 22 in the right-left direction through raster processing. When two luminous points 202 are contained within a single frame 22, the detection unit 14 determines these two luminous points 202 as a front vehicle 200.
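The frame-based pairing scan described above can be sketched as follows. This is a minimal illustration assuming the luminous points are (x, y) pixel coordinates extracted from the luminous point image; the function names and frame dimensions are hypothetical, not taken from the embodiment.

```python
# Illustrative sketch of the frame-based pair determination: two luminous
# points are determined to be a pair when both fit within one rectangular
# frame elongated in the horizontal direction.

def is_pair(p1, p2, frame_w, frame_h):
    """True when both luminous points fit within one rectangular frame."""
    return (abs(p1[0] - p2[0]) <= frame_w
            and abs(p1[1] - p2[1]) <= frame_h)

def detect_pairs(points, frame_w, frame_h):
    """Superpose a frame on each point and scan for a second point in it."""
    pairs = []
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            if is_pair(points[i], points[j], frame_w, frame_h):
                pairs.append((points[i], points[j]))
    return pairs
```

For instance, two tail lamps at nearly the same height would be paired, while a distant street lamp far outside any frame would be left unpaired.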
The detection result sent to the light distribution controlling device 8 is acquired by the pattern determining unit 18. When a front vehicle 200 has been detected, the pattern determining unit 18 sets a shaded region 24 to overlap the front vehicle 200 in a light distribution pattern that serves as a base, as shown in
A light distribution pattern that serves as a base is selected in accordance with a light distribution mode that is set based, for example, on the driver's instruction input by operating the light switch (not shown), the traveling state of the host vehicle, or the environment surrounding the host vehicle. Light distribution modes include, for example, a high beam mode of forming a high beam light distribution pattern, a low beam mode of forming a low beam light distribution pattern, and a town mode of forming a light distribution pattern suitable for traveling in a city area.
The lamp controlling unit 20 instructs the light distribution variable lamp 2 to form the light distribution pattern PTN. The lamp controlling unit 20 is constituted, for example, by a known LED driver module (LDM). If the light sources of the light distribution variable lamp 2 are controlled by analog dimming, the lamp controlling unit 20 adjusts the direct current level of the driving current flowing in the light sources. Meanwhile, if the light sources are controlled by pulse width modulation (PWM) dimming, the lamp controlling unit 20 adjusts the mean level of the driving current by switching the current that flows in the light sources to adjust the ratio of the on periods. If the light distribution variable lamp 2 includes a DMD, the lamp controlling unit 20 may control the on/off switching of each mirror element constituting the DMD. If the lamp controlling unit 20 includes a liquid-crystal device, the lamp controlling unit 20 may control the optical transmittance of the liquid-crystal device. Thus, the light distribution pattern PTN is formed ahead of the host vehicle.
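Under PWM dimming, the mean-level adjustment described above reduces to scaling the peak driving current by the ratio of the on period to the switching period. A minimal sketch, with illustrative names and unit-agnostic time values:

```python
def mean_drive_current(peak_current_a, on_time, period):
    """Mean LED driving current under PWM dimming: the peak current
    scaled by the duty ratio (on period / switching period)."""
    return peak_current_a * (on_time / period)
```

For example, a 1 A peak current switched on for a quarter of each period yields a 0.25 A mean current.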
Next, how the pair determining condition is adjusted when the host vehicle 100 becomes tilted sideways will be described.
An image IMG obtained when the imaging region 4a is tilted sideways is an image that is tilted with respect to the ground surface. Therefore, in a luminous point image IMGa obtained from this image IMG, two luminous points 202 corresponding to lamps of the front vehicle 200 become offset from each other in the up-down direction, as shown in
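The size of this up-down offset can be estimated from the bank angle: two lamps spaced a given gap apart horizontally appear offset vertically by roughly the gap multiplied by the sine of the angle. A sketch, assuming hypothetical pixel units:

```python
import math

def vertical_offset(gap_px, bank_angle_deg):
    """Approximate up-down offset, in pixels, between two lamp luminous
    points spaced gap_px apart when the image is tilted by the bank angle."""
    return gap_px * math.sin(math.radians(bank_angle_deg))
```

A 30-degree bank thus offsets lamps 100 px apart by about 50 px vertically, which an untilted frame would fail to contain.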
Accordingly, the vehicle detecting device 6 according to the present embodiment, in accordance with a change in the angle of tilt (bank angle) of the vehicle's right-left axis with respect to a horizontal line, relaxes the pair determining condition so as to absorb an offset between luminous points 202 in a luminous point image IMGa caused by the change.
Specifically, the measurement result sent from the attitude sensor 10 to the vehicle detecting device 6 is acquired by the condition adjusting unit 16. Based on the acquired measurement result, the condition adjusting unit 16 can detect a change in the angle of tilt of the host vehicle 100. The condition adjusting unit 16, in response to detecting a change in the angle of tilt of the host vehicle 100, relaxes, among the constituent elements of the pair determining condition, the element concerning the up-down direction of the luminous point image IMGa. In other words, the condition adjusting unit 16 increases the permitted offset amount (the offset amount permitted to satisfy the pair determining condition) with respect to the offset between two luminous points 202 in the up-down direction in a luminous point image IMGa. Thus, two luminous points 202 offset in the up-down direction due to the tilting of the host vehicle 100 become more likely to satisfy the pair determining condition.
When a frame 22 is used, the pair determining condition is adjusted to be relaxed in the following manner.
When the host vehicle 100 becomes tilted sideways and the angle of tilt of its right-left axis becomes a second angle greater than the first angle with respect to a horizontal line, the condition adjusting unit 16 sets the dimension of the frame 22 in the up-down direction of the vehicle to a second dimension H2 greater than the first dimension H1, as shown in
The detection unit 14 superposes the frame 22 of the second dimension H2 on each luminous point 202 in the luminous point image IMGa, as shown in
Herein, when the host vehicle 100 is tilted at a first angle greater than 0 degrees, vehicle detection may be performed with the use of the frame 22 of the first dimension H1, and when the host vehicle 100 becomes further tilted at a second angle greater than the first angle, vehicle detection may be performed with the use of the frame 22 of the second dimension H2. Furthermore, the condition adjusting unit 16 may relax the element concerning the up-down direction of the image continuously or stepwise in three or more steps, in accordance with a change in the angle of tilt of the vehicle's right-left axis. In other words, the up-down dimension of the frame 22 may be increased as the angle of tilt of the host vehicle 100 increases.
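The two-step and the continuous relaxation of the frame's up-down dimension could be sketched as follows. The thresholds, dimensions, and gain are illustrative assumptions, not values from the embodiment.

```python
def frame_height_two_step(tilt_deg, h1=10.0, h2=20.0, second_angle_deg=15.0):
    """Two-step relaxation: the first dimension H1 below the second
    angle, the larger second dimension H2 at or beyond it."""
    return h2 if abs(tilt_deg) >= second_angle_deg else h1

def frame_height_continuous(tilt_deg, h1=10.0, gain_px_per_deg=1.5, h_max=40.0):
    """Continuous variant: the up-down dimension grows with the tilt,
    capped so the condition never becomes arbitrarily loose."""
    return min(h1 + gain_px_per_deg * abs(tilt_deg), h_max)
```

Capping the continuous variant keeps the relaxed condition from admitting unrelated luminous points at extreme bank angles.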
First, the vehicle detecting device 6 acquires an image IMG from the imaging device 4. The vehicle detecting device 6 also acquires the angle of tilt of the host vehicle 100 from the attitude sensor 10 (S101). Then, the vehicle detecting device 6 generates a luminous point image IMGa from the image IMG (S102). Next, the vehicle detecting device 6 determines whether the angle of tilt of the right-left axis of the host vehicle 100 is greater than a threshold (S103). If the angle of tilt is greater than the threshold (Y at S103), the vehicle detecting device 6 relaxes the pair determining condition (S104), and proceeds to step S105. If the angle of tilt is not greater than the threshold (N at S103), the vehicle detecting device 6 proceeds to step S105 without relaxing the pair determining condition.
Next, the vehicle detecting device 6 performs a vehicle detecting process on the luminous point image IMGa using the pair determining condition (S105). Then, the vehicle detecting device 6 determines whether a front vehicle 200 is present based on the result of the vehicle detecting process (S106). If a front vehicle 200 is present (Y at S106), the light distribution controlling device 8 controls the light distribution variable lamp 2 so as to form a light distribution pattern PTN that includes a shaded region 24 (S107), and terminates the present routine. If no front vehicle 200 is present (N at S106), the light distribution controlling device 8 controls the light distribution variable lamp 2 so as to form a light distribution pattern PTN that does not include a shaded region 24 (S108), and terminates the present routine.
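The routine of steps S101 through S108 can be sketched end to end as follows. The helpers stand in for the image processing and lamp control described above and are hypothetical simplifications; the point list plays the role of the luminous point image IMGa.

```python
def detect_pairs(points, frame_w, frame_h):
    """Pair two luminous points that fit within one frame (S105)."""
    return [(p, q) for i, p in enumerate(points) for q in points[i + 1:]
            if abs(p[0] - q[0]) <= frame_w and abs(p[1] - q[1]) <= frame_h]

def adb_routine(points, tilt_deg, threshold_deg=5.0,
                base_h=10, relaxed_h=20, frame_w=120):
    """One pass of the routine; points stand in for the luminous point
    image obtained in S101-S102."""
    # S103-S104: relax the pair determining condition when tilted
    h = relaxed_h if abs(tilt_deg) > threshold_deg else base_h
    pairs = detect_pairs(points, frame_w, h)  # S105
    # S106-S108: shade the pattern only when a front vehicle was detected
    return "shaded" if pairs else "unshaded"
```

With the same two luminous points offset 15 px vertically, the routine pairs them when the measured tilt exceeds the threshold but misses them when it does not, which is precisely the gap the relaxation closes.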
As described above, when two luminous points 202 included in an image IMG (a luminous point image IMGa according to the present embodiment) that is based on the imaging device 4 provided in the host vehicle 100 satisfy a predetermined pair determining condition, the vehicle detecting device 6 according to the present embodiment detects a front vehicle 200 based on these two luminous points 202. Furthermore, the vehicle detecting device 6, in accordance with a change in the angle of tilt of the vehicle's right-left axis with respect to a horizontal line, relaxes the pair determining condition so as to absorb an offset between luminous points 202 in the image IMG caused by the change. Adjusting the pair determining condition of luminous points 202 dynamically in accordance with the tilt of the host vehicle 100 as described above makes it possible to avoid missing luminous points 202 derived from a front vehicle 200. Accordingly, the accuracy of detecting a front vehicle 200 can be increased.
The pair determining condition according to the present embodiment is satisfied when two luminous points 202 are contained within a rectangular frame 22 in an image IMG (a luminous point image IMGa according to the present embodiment). Then, the vehicle detecting device 6 sets the dimension of the frame 22 in the up-down direction of the vehicle to the first dimension H1 when the angle of tilt of the host vehicle 100 is the first angle and sets the dimension of the frame 22 in the up-down direction of the vehicle to the second dimension H2 greater than the first dimension H1 when the angle of tilt is the second angle greater than the first angle.
In this manner, according to the present embodiment, a relatively strict pair determining condition is set when the angle of tilt of the host vehicle 100 is relatively small, and a relatively relaxed pair determining condition is set when the angle of tilt is relatively large. Thus, the processing load on the detection unit 14 can be reduced, as compared to the case of performing vehicle detection using a pair determining condition that is relaxed by default regardless of the angle of tilt of the host vehicle 100. Furthermore, the above can suppress the erroneous determination in which a luminous point 202 that is not of a front vehicle 200 is determined to have come from a front vehicle 200.
The host vehicle 100 provided with the vehicle detecting device 6 according to the present embodiment is a straddle-type vehicle. The vehicle's right-left axis of a straddle-type vehicle tends to become tilted to a greater extent than that of a typical four-wheeled automobile. Therefore, the vehicle detecting device 6 can exhibit its functionality effectively especially in straddle-type vehicles. Herein, the host vehicle 100 may be a vehicle other than a straddle-type vehicle, such as a four-wheeled automobile.
The light distribution controlling device 8 according to the present embodiment sets a light distribution pattern PTN that includes a shaded region 24 to overlap a front vehicle 200, based on a detection result of the vehicle detecting device 6. This configuration makes it possible to achieve, with higher accuracy, both the reduction of glare caused on the driver of a front vehicle 200 and the improvement in the visibility of the driver of the host vehicle 100. Accordingly, safety in vehicle driving can be improved.
Thus far, Embodiment 4 according to the present invention has been described in detail. The embodiment described above merely illustrates a specific example for implementing the present invention. The contents of the embodiment do not limit the technical scope of the present invention, and a number of design changes, including modifications, additions, and deletions of constituent elements, can be made within the scope that does not depart from the spirit of the invention defined by the claims. A new embodiment resulting from adding a design change has the advantageous effects of the embodiment as well as the advantageous effects of the modification. With regard to the embodiment described above, the expressions “according to the present embodiment,” “in the present embodiment,” and so on are added for emphasis to the content that can be subjected to such a design change as described above, but such a design change is also permitted on the content without these expressions. Any desired combination of the constituent elements described above is also valid as an aspect of the present invention. Hatching added along a section in the drawings does not limit the material of the subject with the hatching.
The invention according to Embodiment 4 described above may be identified by the items indicated below.
A vehicle detecting device (6) that:
The vehicle detecting device (6) according to First Item, wherein
The vehicle detecting device (6) according to First Item or Second Item, wherein
A light distribution controlling device (8) that:
A vehicle detecting method including:
Number | Date | Country | Kind |
---|---|---|---|
2021-212077 | Dec 2021 | JP | national |
2021-212078 | Dec 2021 | JP | national |
2021-212079 | Dec 2021 | JP | national |
2021-213023 | Dec 2021 | JP | national |
This application is the continuation of International Patent Application No. PCT/JP2022/044932, filed on Dec. 6, 2022, which claims the benefit of priority from Japanese Patent Application No. 2021-212077, filed on Dec. 27, 2021, Japanese Patent Application No. 2021-212078, filed on Dec. 27, 2021, Japanese Patent Application No. 2021-212079, filed on Dec. 27, 2021, and Japanese Patent Application No. 2021-213023, filed on Dec. 27, 2021, the entire content of each of which is incorporated herein by reference.
Number | Date | Country | |
---|---|---|---|
Parent | PCT/JP2022/044932 | Dec 2022 | WO |
Child | 18750408 | US |