The present invention relates to a driving assistance apparatus and a driving assistance method to assist driving of a driver of a subject vehicle when the subject vehicle merges or changes lanes from a lane in which the subject vehicle travels to a lane in which a non-subject vehicle travels.
When a subject vehicle merges onto an expressway or the like or changes lanes to an adjacent lane during travel, a driver of the subject vehicle is required to pay attention to movement of a non-subject vehicle traveling in a lane of a road into which the subject vehicle merges or in a lane to which the subject vehicle changes lanes.
Technology has been disclosed that identifies a merging part allowing a subject vehicle to merge into a lane in which a non-subject vehicle travels, and presents, to a driver, a location on the road surface at which steering is to be performed to allow the subject vehicle to merge at the merging part (see Patent Document 1, for example). Technology has also been disclosed that presents a merging point and the time to reach the merging point to an occupant of a subject vehicle (see Patent Document 2, for example).
Patent Document 1: Japanese Patent Application Laid-Open No. 2008-151507
Patent Document 2: Japanese Patent Application Laid-Open No. 2017-102739
When a subject vehicle merges or changes lanes, it is useful for a driver of the subject vehicle to know a location where the subject vehicle is currently to travel to allow the subject vehicle to smoothly merge or change lanes. Nevertheless, Patent Documents 1 and 2 are both silent on the location where the subject vehicle is currently to travel to allow the subject vehicle to smoothly merge at the merging part. Driving assistance to the driver when the subject vehicle merges or changes lanes thus has room for improvement.
The present invention has been conceived to solve such a problem, and it is an object of the present invention to provide a driving assistance apparatus and a driving assistance method allowing for appropriate driving assistance to a driver when a subject vehicle merges or changes lanes.
To solve the above-mentioned problem, a driving assistance apparatus according to the present invention includes: a subject vehicle information acquisition unit to acquire subject vehicle information including a current location and a speed of a subject vehicle; a non-subject vehicle information acquisition unit to acquire non-subject vehicle information including a current location and a speed of a non-subject vehicle; an object generation unit to generate, based on the subject vehicle information acquired by the subject vehicle information acquisition unit and the non-subject vehicle information acquired by the non-subject vehicle information acquisition unit, a travel location object indicating at least one of a location where the subject vehicle is currently to travel and a location where the subject vehicle is currently not to travel when the subject vehicle merges or changes lanes from a lane in which the subject vehicle travels to a lane in which the non-subject vehicle travels; and a display controller to perform control to display, in accordance with travel of the subject vehicle, the travel location object generated by the object generation unit superimposed on scenery around the subject vehicle.
A driving assistance method according to the present invention includes: acquiring subject vehicle information including a current location and a speed of a subject vehicle; acquiring non-subject vehicle information including a current location and a speed of a non-subject vehicle; generating, based on the acquired subject vehicle information and the acquired non-subject vehicle information, a travel location object indicating at least one of a location where the subject vehicle is currently to travel and a location where the subject vehicle is currently not to travel when the subject vehicle merges or changes lanes from a lane in which the subject vehicle travels to a lane in which the non-subject vehicle travels; and performing control to display, in accordance with travel of the subject vehicle, the generated travel location object superimposed on scenery around the subject vehicle.
According to the present invention, the driving assistance apparatus includes: the subject vehicle information acquisition unit to acquire the subject vehicle information including the current location and the speed of the subject vehicle; the non-subject vehicle information acquisition unit to acquire the non-subject vehicle information including the current location and the speed of the non-subject vehicle; the object generation unit to generate, based on the subject vehicle information acquired by the subject vehicle information acquisition unit and the non-subject vehicle information acquired by the non-subject vehicle information acquisition unit, the travel location object indicating at least one of the location where the subject vehicle is currently to travel and the location where the subject vehicle is currently not to travel when the subject vehicle merges or changes lanes from the lane in which the subject vehicle travels to the lane in which the non-subject vehicle travels; and the display controller to perform control to display, in accordance with the travel of the subject vehicle, the travel location object generated by the object generation unit superimposed on the scenery around the subject vehicle, allowing for appropriate driving assistance to a driver when the subject vehicle merges or changes lanes.
The driving assistance method includes: acquiring the subject vehicle information including the current location and the speed of the subject vehicle; acquiring the non-subject vehicle information including the current location and the speed of the non-subject vehicle; generating, based on the acquired subject vehicle information and the acquired non-subject vehicle information, the travel location object indicating at least one of the location where the subject vehicle is currently to travel and the location where the subject vehicle is currently not to travel when the subject vehicle merges or changes lanes from the lane in which the subject vehicle travels to the lane in which the non-subject vehicle travels; and performing control to display, in accordance with the travel of the subject vehicle, the generated travel location object superimposed on the scenery around the subject vehicle, allowing for appropriate driving assistance to the driver when the subject vehicle merges or changes lanes.
The objects, features, aspects, and advantages of the present invention will become more apparent from the following detailed description and the accompanying drawings.
Embodiments of the present invention will be described below based on the drawings.
<Configuration>
As shown in the drawings, the driving assistance apparatus 1 includes a subject vehicle information acquisition unit 2, a non-subject vehicle information acquisition unit 3, an object generation unit 4, and a display controller 5.
Merging and changing lanes will be described herein.
Merging refers to movement of a subject vehicle 6 from a lane in which the subject vehicle 6 travels into a lane in which non-subject vehicles 7 to 9 travel where the two lanes join, for example, at an entrance to an expressway, as illustrated in the drawings. Changing lanes refers to movement of the subject vehicle 6 from the lane in which the subject vehicle 6 travels to an adjacent lane in which a non-subject vehicle travels.
Another configuration of a driving assistance apparatus including the driving assistance apparatus 1 shown in the drawings will be described below.
As shown in the drawings, the driving assistance apparatus 10 includes the subject vehicle information acquisition unit 2, the non-subject vehicle information acquisition unit 3, the object generation unit 4, the display controller 5, a map information acquisition unit 13, and an overall controller 14, and is connected to an image capturing device 15, a map information storage 16, and a display device 17.
The subject vehicle information acquisition unit 2 acquires the subject vehicle information including the current location and the speed of the subject vehicle. The current location of the subject vehicle is, for example, an absolute location of the subject vehicle included in a global positioning system (GPS) signal. A more accurate current location of the subject vehicle may be acquired based on the absolute location of the subject vehicle included in the GPS signal and the speed, a movement distance, a steering direction, and the like of the subject vehicle. Assume that, in this case, information on the movement distance and the steering direction of the subject vehicle is included in the subject vehicle information acquired by the subject vehicle information acquisition unit 2.
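As one concrete illustration of this kind of position refinement (not taken from the embodiments above; the function names and numerical values are illustrative assumptions), the following Python sketch dead-reckons the location of the subject vehicle from its speed and steering and blends the result with an absolute GPS fix.

```python
import math

def dead_reckon(x, y, heading_rad, speed_mps, yaw_rate_rps, dt):
    """Advance the estimated position using speed and steering (yaw rate)."""
    heading_rad += yaw_rate_rps * dt               # update the travel direction
    x += speed_mps * dt * math.cos(heading_rad)    # move along the heading
    y += speed_mps * dt * math.sin(heading_rad)
    return x, y, heading_rad

def fuse_with_gps(est_xy, gps_xy, gps_weight=0.2):
    """Blend the dead-reckoned estimate with the absolute GPS fix."""
    ex, ey = est_xy
    gx, gy = gps_xy
    return ((1 - gps_weight) * ex + gps_weight * gx,
            (1 - gps_weight) * ey + gps_weight * gy)

# Example: 2 s of travel at 20 m/s with a slight left turn, then a GPS update.
x, y, h = 0.0, 0.0, 0.0
for _ in range(20):
    x, y, h = dead_reckon(x, y, h, speed_mps=20.0, yaw_rate_rps=0.05, dt=0.1)
x, y = fuse_with_gps((x, y), gps_xy=(39.5, 2.2))
```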
The non-subject vehicle information acquisition unit 3 includes a non-subject vehicle location calculation unit 11 and a non-subject vehicle speed calculation unit 12. The non-subject vehicle location calculation unit 11 acquires an image captured by the image capturing device 15, and performs image processing on the image to calculate a location of the non-subject vehicle relative to the subject vehicle. The non-subject vehicle speed calculation unit 12 acquires the image captured by the image capturing device 15, and performs image processing on the image to calculate a speed of the non-subject vehicle relative to the subject vehicle. The image capturing device 15 is installed in the subject vehicle to capture an image around the subject vehicle. Specifically, the image capturing device 15 captures the image so that the image includes a lane in which the subject vehicle travels and a lane to which the subject vehicle merges or changes lanes.
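The embodiments do not specify how the relative location and speed are computed from the image; the following Python sketch shows one possible flat-road, pinhole-camera approximation, in which the camera parameters, pixel values, and function names are all illustrative assumptions rather than part of the apparatus.

```python
def distance_from_bottom_row(v_bottom_px, v_horizon_px, focal_px, cam_height_m):
    """Forward distance to a vehicle whose lower edge appears at pixel row
    v_bottom_px, assuming a flat road and an ideal pinhole camera."""
    dv = v_bottom_px - v_horizon_px
    if dv <= 0:
        raise ValueError("vehicle bottom must appear below the horizon row")
    return cam_height_m * focal_px / dv

def relative_speed(d_prev_m, d_curr_m, frame_dt_s):
    """Relative speed of the non-subject vehicle; negative means it is closing in."""
    return (d_curr_m - d_prev_m) / frame_dt_s

# Example with assumed camera parameters (focal length 1000 px, height 1.4 m).
d0 = distance_from_bottom_row(620, 540, focal_px=1000, cam_height_m=1.4)
d1 = distance_from_bottom_row(628, 540, focal_px=1000, cam_height_m=1.4)
v_rel = relative_speed(d0, d1, frame_dt_s=0.1)
```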
A case where the current location and the speed of the non-subject vehicle are calculated from the image captured by the image capturing device 15 is described in the example herein.
The map information acquisition unit 13 acquires, based on the current location of the subject vehicle acquired by the subject vehicle information acquisition unit 2, map information at least including lane information from the map information storage 16. The lane information includes information on line markings. The map information storage 16 is configured, for example, by a storage, such as a hard disk drive (HDD) and semiconductor memory, and stores the map information at least including the lane information. The map information storage 16 may be installed in the subject vehicle or external to the subject vehicle. The map information storage 16 may be included in the driving assistance apparatus 10.
The overall controller 14 calculates, based on the current location and the speed of the subject vehicle acquired by the subject vehicle information acquisition unit 2 and the current location and the speed of the non-subject vehicle acquired by the non-subject vehicle information acquisition unit 3, a point where the subject vehicle can merge or change lanes. The point where the subject vehicle can merge or change lanes can be calculated using known technology as disclosed in Patent Document 1, for example. The overall controller 14 also calculates at least one of the location where the subject vehicle is currently to travel and the location where the subject vehicle is currently not to travel to reach the point where the subject vehicle can merge or change lanes. Furthermore, the overall controller 14 can perform image processing on the image captured by the image capturing device 15 to detect line markings constituting the lane in which the subject vehicle travels.
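As a rough sketch of the kind of calculation the overall controller 14 could perform (this is not the method of Patent Document 1 or of this document; the names, thresholds, and the constant-speed prediction are illustrative assumptions), the following Python code predicts the positions of the non-subject vehicles a few seconds ahead, looks for a sufficiently large gap, and returns the middle of that gap as the location the subject vehicle is currently to travel toward.

```python
from dataclasses import dataclass

@dataclass
class Vehicle:
    s_m: float      # longitudinal position along the target lane [m]
    v_mps: float    # speed [m/s]

def find_merge_target(subject: Vehicle, others: list[Vehicle],
                      horizon_s: float = 5.0, min_gap_m: float = 30.0):
    """Return the longitudinal position the subject vehicle should aim for at
    time `horizon_s`, or None if no sufficiently large gap exists.
    Every vehicle is assumed to keep its current speed."""
    # Predict everyone's position at the horizon.
    predicted = sorted(o.s_m + o.v_mps * horizon_s for o in others)
    subject_pred = subject.s_m + subject.v_mps * horizon_s
    # Examine the gap between each pair of consecutive non-subject vehicles.
    candidates = []
    for lead, follow in zip(predicted[1:], predicted[:-1]):
        if lead - follow >= min_gap_m:
            candidates.append((follow + lead) / 2.0)   # middle of the gap
    if not candidates:
        return None
    # Choose the gap centre closest to where the subject vehicle will be.
    return min(candidates, key=lambda s: abs(s - subject_pred))

# Example: two vehicles in the target lane, subject vehicle slightly behind the gap.
target = find_merge_target(Vehicle(0.0, 22.0),
                           [Vehicle(10.0, 25.0), Vehicle(60.0, 24.0)])
```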
The object generation unit 4 generates the travel location object indicating at least one of the location where the subject vehicle is currently to travel and the location where the subject vehicle is currently not to travel calculated by the overall controller 14. In this case, the object generation unit 4 determines the shape of the travel location object based on the image captured by the image capturing device 15.
The display controller 5 performs control to cause the display device 17 to display, in accordance with travel of the subject vehicle, the travel location object generated by the object generation unit 4 superimposed on the scenery around the subject vehicle. The display device 17 is, for example, a head-up display (HUD), a monitor installed in an instrument panel, a monitor installed in a center console, or the like. In a case where the display device 17 is the HUD, for example, the display controller 5 performs control to display the travel location object superimposed on actual scenery seen through a windshield. In a case where the display device 17 is the monitor, for example, the display controller 5 performs control to display the travel location object superimposed on the image captured by the image capturing device 15.
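The following Python sketch illustrates, under a simple level-road and pinhole-camera assumption, how a road-surface location could be mapped to display coordinates for such superimposition; the calibration values and function name are hypothetical, and a real HUD or monitor would use its own calibration.

```python
def project_ground_point(x_fwd_m, y_lat_m, cam_height_m, focal_px, cx_px, cy_px):
    """Project a point on the road surface, given in vehicle coordinates
    (x forward, y to the left), onto image/display pixel coordinates."""
    if x_fwd_m <= 0:
        raise ValueError("point must be in front of the viewpoint")
    u = cx_px - focal_px * y_lat_m / x_fwd_m        # horizontal pixel
    v = cy_px + focal_px * cam_height_m / x_fwd_m   # vertical pixel, below the horizon
    return u, v

# Example: place the travel location object 25 m ahead, 3.5 m to the right.
u, v = project_ground_point(25.0, -3.5, cam_height_m=1.2,
                            focal_px=1000.0, cx_px=640.0, cy_px=360.0)
```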
Functions of the subject vehicle information acquisition unit 2, the non-subject vehicle information acquisition unit 3, the object generation unit 4, the display controller 5, the non-subject vehicle location calculation unit 11, the non-subject vehicle speed calculation unit 12, the map information acquisition unit 13, and the overall controller 14 included in the driving assistance apparatus 10 are each achieved by a processing circuit. That is to say, the driving assistance apparatus 10 includes the processing circuit to acquire the subject vehicle information, acquire the non-subject vehicle information, generate the object, perform control to display the object, calculate the location of the non-subject vehicle, calculate the speed of the non-subject vehicle, acquire the map information, calculate the point where the subject vehicle can merge or change lanes, calculate at least one of the location where the subject vehicle is currently to travel and the location where the subject vehicle is currently not to travel, and detect the line markings. The processing circuit is a processor 18 (also referred to as a central processing unit, a processing unit, an arithmetic unit, a microprocessor, a microcomputer, and a digital signal processor (DSP)) to execute a program stored in memory 19.
The functions of the subject vehicle information acquisition unit 2, the non-subject vehicle information acquisition unit 3, the object generation unit 4, the display controller 5, the non-subject vehicle location calculation unit 11, the non-subject vehicle speed calculation unit 12, the map information acquisition unit 13, and the overall controller 14 included in the driving assistance apparatus 10 are each achieved by software, firmware, or a combination of software and firmware. The software or the firmware is described as the program and is stored in the memory 19. The processing circuit reads and executes the program stored in the memory 19 to achieve each of the functions of the respective units. That is to say, the driving assistance apparatus 10 includes the memory 19 to store the program resulting in performance of steps including: acquiring the subject vehicle information; acquiring the non-subject vehicle information; generating the object; performing control to display the object; calculating the location of the non-subject vehicle; calculating the speed of the non-subject vehicle; acquiring the map information; calculating the point where the subject vehicle can merge or change lanes; calculating at least one of the location where the subject vehicle is currently to travel and the location where the subject vehicle is currently not to travel; and detecting the line markings. It can be said that the program is to cause a computer to execute procedures or methods of the subject vehicle information acquisition unit 2, the non-subject vehicle information acquisition unit 3, the object generation unit 4, the display controller 5, the non-subject vehicle location calculation unit 11, the non-subject vehicle speed calculation unit 12, the map information acquisition unit 13, and the overall controller 14. The memory herein may be, for example, nonvolatile or volatile semiconductor memory, such as random access memory (RAM), read only memory (ROM), flash memory, erasable programmable read only memory (EPROM), and electrically erasable programmable read only memory (EEPROM), a magnetic disk, a flexible disk, an optical disc, a compact disc, a mini disc, a DVD, and the like, or any storage medium to be used in the future.
<Operation>
In a step S101, the subject vehicle information acquisition unit 2 acquires the subject vehicle information including the current location and the speed of the subject vehicle.
In a step S102, the non-subject vehicle information acquisition unit 3 acquires the non-subject vehicle information including the current location and the speed of the non-subject vehicle.
In a step S103, the object generation unit 4 generates, based on the subject vehicle information acquired by the subject vehicle information acquisition unit 2 and the non-subject vehicle information acquired by the non-subject vehicle information acquisition unit 3, the travel location object indicating at least one of the location where the subject vehicle is currently to travel and the location where the subject vehicle is currently not to travel when the subject vehicle merges or changes lanes from the lane in which the subject vehicle travels to the lane in which the non-subject vehicle travels.
In a step S104, the display controller 5 performs control to display, in accordance with travel of the subject vehicle, the travel location object generated by the object generation unit 4 superimposed on the scenery around the subject vehicle.
In a step S105, the display controller 5 judges whether to end display of the travel location object. When display of the travel location object is ended, processing ends. On the other hand, when display of the travel location object is not ended, processing returns to the step S101.
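A compact Python sketch of this control loop (steps S101 to S105) is shown below; the callables passed in are hypothetical placeholders standing in for the acquisition units, the object generation unit 4, and the display controller 5.

```python
def driving_assistance_loop(acquire_subject_info, acquire_non_subject_info,
                            generate_travel_location_object, display_superimposed,
                            should_end_display):
    """Loop corresponding to steps S101 to S105."""
    while True:
        subject = acquire_subject_info()                          # S101
        others = acquire_non_subject_info()                       # S102
        obj = generate_travel_location_object(subject, others)    # S103
        display_superimposed(obj)                                 # S104
        if should_end_display():                                  # S105
            break
```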
In a step S201, the subject vehicle information acquisition unit 2 acquires the subject vehicle information including the current location and the speed of the subject vehicle.
In a step S202, the non-subject vehicle information acquisition unit 3 and the overall controller 14 each acquire the image around the subject vehicle captured by the image capturing device 15. The image around the subject vehicle includes the lane in which the subject vehicle travels and the lane to which the subject vehicle merges or changes lanes.
In a step S203, the non-subject vehicle location calculation unit 11 performs image processing on the image acquired from the image capturing device 15 to calculate the location of the non-subject vehicle relative to the subject vehicle. The non-subject vehicle speed calculation unit 12 performs image processing on the image acquired from the image capturing device 15 to calculate the speed of the non-subject vehicle relative to the subject vehicle.
In a step S204, the overall controller 14 performs image processing on the image acquired from the image capturing device 15 to judge whether the line markings constituting the lane in which the subject vehicle travels have been detected. When the line markings have not been detected, processing proceeds to a step S205. On the other hand, when the line markings have been detected, processing proceeds to a step S206.
In the step S205, the map information acquisition unit 13 acquires, based on the location of the subject vehicle, the map information including the lane information from the map information storage 16 in accordance with instructions of the overall controller 14.
In the step S206, the map information acquisition unit 13 acquires, based on the location of the subject vehicle, the map information including the lane information from the map information storage 16 in accordance with instructions of the overall controller 14. In this case, the line markings constituting the lane in which the subject vehicle travels are obtained from both the line markings detected by the overall controller 14 from the image acquired from the image capturing device 15 and the line markings included in the lane information acquired from the map information storage 16, so that the accuracy of detection of the line markings can be further improved. Processing in the step S206 may be omitted.
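One possible way to combine the two sources of line markings in the steps S204 to S206 is sketched below; representing a line marking as a pair of lateral offsets and averaging the two sources are illustrative assumptions, not the method of this document.

```python
def select_line_markings(detected, from_map):
    """Combine line markings detected from the camera image with those in the
    map lane information. Each source is a pair of lateral offsets
    (left_m, right_m) of the markings of the subject vehicle's lane;
    None means the source is unavailable."""
    if detected is None and from_map is None:
        return None                      # no marking information at all
    if detected is None:
        return from_map                  # S205: map information only
    if from_map is None:
        return detected                  # camera detection only
    # S206: both available -- average them to improve accuracy.
    return tuple((a + b) / 2.0 for a, b in zip(detected, from_map))

markings = select_line_markings(detected=(-1.7, 1.8), from_map=(-1.75, 1.75))
```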
In a step S207, the object generation unit 4 generates the travel location object indicating at least one of the location where the subject vehicle is currently to travel and the location where the subject vehicle is currently not to travel calculated by the overall controller 14. In this case, the object generation unit 4 determines the shape of the travel location object based on the image captured by the image capturing device 15.
In a step S208, the display controller 5 performs control to cause the display device 17 to display, in accordance with travel of the subject vehicle, the travel location object generated by the object generation unit 4 superimposed on the scenery around the subject vehicle.
In a step S209, the overall controller 14 judges whether to end display of the travel location object. When display of the travel location object is ended, processing ends. On the other hand, when display of the travel location object is not ended, processing returns to the step S201.
<Display>
As illustrated in the drawings, the travel location object 20 indicating the location where the subject vehicle is currently to travel is displayed superimposed on the scenery around the subject vehicle with respect to a non-subject vehicle 21 traveling in the lane to which the subject vehicle merges or changes lanes. In some display examples, an acceleration and deceleration object 22 relating to acceleration and deceleration of the subject vehicle is displayed together with the travel location object 20.
As described above, according to Embodiment 1, the display device 17 displays the location where the subject vehicle is currently to travel when the subject vehicle merges or changes lanes. This allows for appropriate driving assistance to the driver when the subject vehicle merges or changes lanes.
A configuration of a driving assistance apparatus according to Embodiment 2 of the present invention is similar to that of the driving assistance apparatus 10 according to Embodiment 1, and thus detailed description thereof is omitted herein.
Steps S301 to S306 are similar to the steps S201 to S206 described in Embodiment 1, and description thereof is thus omitted. In the step S307, the object generation unit 4 generates a travel location object indicating a path from the current location of the subject vehicle to a point, calculated by the overall controller 14, where the subject vehicle can merge.
In the step S308, the object generation unit 4 generates a merging prediction object. Specifically, the overall controller 14 calculates the locations that the subject vehicle and the non-subject vehicle would respectively reach if the subject vehicle travels to a merging point at the current speed. The object generation unit 4 generates an object indicating an imaginary subject vehicle and an imaginary non-subject vehicle placed respectively at these calculated locations. This object indicating the imaginary subject vehicle and the imaginary non-subject vehicle corresponds to the merging prediction object.
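The following Python sketch shows one way the locations of the imaginary subject vehicle and the imaginary non-subject vehicles could be calculated under the stated constant-speed assumption; the one-dimensional lane coordinate and the function name are illustrative.

```python
def predict_positions_at_merge(subject_s_m, subject_v_mps,
                               others, merge_point_s_m):
    """Positions that the subject vehicle and each non-subject vehicle would
    reach if the subject vehicle keeps its current speed until the merging
    point. `others` is a list of (position_m, speed_mps) pairs along the lane;
    constant speeds are assumed for every vehicle."""
    if subject_v_mps <= 0:
        return None
    time_to_merge_s = (merge_point_s_m - subject_s_m) / subject_v_mps
    imaginary_subject = merge_point_s_m                       # by definition
    imaginary_others = [s + v * time_to_merge_s for s, v in others]
    return imaginary_subject, imaginary_others, time_to_merge_s

# Example: merging point 150 m ahead, one non-subject vehicle 40 m back at 28 m/s.
pred = predict_positions_at_merge(0.0, 25.0, [(-40.0, 28.0)], 150.0)
```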
In the step S309, the display controller 5 performs control to cause the display device 17 to display, in accordance with travel of the subject vehicle, the travel location object generated by the object generation unit 4 in the step S307 and the merging prediction object generated by the object generation unit 4 in the step S308 superimposed on the scenery around the subject vehicle.
In the step S310, the overall controller 14 judges whether to end display of the travel location object and the merging prediction object. When display of the travel location object and the merging prediction object is ended, processing ends. On the other hand, when display of the travel location object and the merging prediction object is not ended, processing returns to the step S301.
As illustrated in the drawings, a travel location object 25 indicating the path from the current location of the subject vehicle to the point where the subject vehicle can merge is displayed superimposed on the scenery, with respect to a non-subject vehicle 26 traveling in the lane into which the subject vehicle merges. As illustrated in the drawings, the imaginary subject vehicle 23 and the imaginary non-subject vehicle 24 are displayed at the locations that the subject vehicle and the non-subject vehicle would respectively reach if the subject vehicle travels to the merging point at the current speed.
As described above, according to Embodiment 2, along with the location where the subject vehicle is currently to travel when the subject vehicle merges or changes lanes, the imaginary subject vehicle 23 and the imaginary non-subject vehicle 24 are displayed at the locations that the subject vehicle and the non-subject vehicle would respectively reach if the subject vehicle travels to the merging point at the current speed. This allows for appropriate driving assistance to the driver when the subject vehicle merges or changes lanes.
A configuration of a driving assistance apparatus according to Embodiment 3 of the present invention is similar to that of the driving assistance apparatus 10 according to Embodiment 1, and thus detailed description thereof is omitted herein.
As illustrated in the drawings, when there are a plurality of points where the subject vehicle can merge, travel location objects 27 and 28 are displayed for the respective points with respect to non-subject vehicles 29 to 31 traveling in the lane into which the subject vehicle merges. As illustrated in the drawings, path objects 32 and 33 indicating paths from the current location of the subject vehicle to the respective points where the subject vehicle can merge may also be displayed.
As described above, according to Embodiment 3, when there are a plurality of points where the subject vehicle can merge, the travel location object is displayed for each of the points. This allows for appropriate driving assistance to the driver when the subject vehicle merges or changes lanes.
In Embodiment 4 of the present invention, a case where the display device 17 is an electronic mirror to display an image behind the subject vehicle will be described. A configuration of a driving assistance apparatus according to Embodiment 4 is similar to that of the driving assistance apparatus 10 according to Embodiment 1, and thus detailed description thereof is omitted herein.
As illustrated in the drawings, the electronic mirror displays an image behind a subject vehicle 34, and non-subject vehicles 35 and 36 traveling behind the subject vehicle 34 appear in the image. As illustrated on the right side of the drawing, a travel location object 37 indicating the location where the subject vehicle is currently to travel is displayed superimposed on the image behind the subject vehicle displayed by the electronic mirror.
As described above, according to Embodiment 4, the travel location object is displayed by the electronic mirror to display an image behind the subject vehicle. This allows for appropriate driving assistance to the driver when the subject vehicle merges or changes lanes.
A case where the display device 17 is the electronic mirror is described in Embodiment 4, but the display device 17 is not limited to the electronic mirror. For example, the display device 17 may have a configuration in which a transparent display panel is provided on the surface of a mirror to reflect the image behind the subject vehicle. In this case, the travel location object is displayed by the display panel.
In Embodiment 5 of the present invention, a case where the object generation unit 4 generates the travel location object responsive to the extent to which the subject vehicle can merge or change lanes will be described. A configuration of a driving assistance apparatus according to Embodiment 5 is similar to that of the driving assistance apparatus 10 according to Embodiment 1, and thus detailed description thereof is omitted herein.
As illustrated in the drawings, the lane to which the subject vehicle changes lanes is divided, responsive to the extent to which the subject vehicle can change lanes, into a lane changeable area 39, a lane changing caution area 40, and a lane unchangeable area 41, and the object generation unit 4 generates the travel location object so that these areas can be distinguished from one another. As illustrated in the drawings, the display controller 5 performs control to display the travel location object indicating the lane changeable area 39, the lane changing caution area 40, and the lane unchangeable area 41 superimposed on the scenery around the subject vehicle.
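As an illustration of how the extent to which the subject vehicle can change lanes might be graded into the three areas (a sketch only; the time-gap thresholds are assumed values, not taken from this document), consider the following Python code.

```python
def classify_lane_change_area(gap_to_rear_s, gap_to_front_s,
                              caution_s=1.5, safe_s=3.0):
    """Classify the extent to which the subject vehicle can change lanes,
    based on time gaps [s] to the nearest non-subject vehicles behind and
    ahead in the target lane."""
    smallest_gap = min(gap_to_rear_s, gap_to_front_s)
    if smallest_gap >= safe_s:
        return "lane changeable area"        # e.g. drawn in a permissive colour
    if smallest_gap >= caution_s:
        return "lane changing caution area"
    return "lane unchangeable area"

area = classify_lane_change_area(gap_to_rear_s=2.0, gap_to_front_s=4.0)
```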
The display controller 5 may perform control to display or not to display the travel location object responsive to a predetermined event. The predetermined event herein includes, for example, an instruction provided by the driver of the subject vehicle using a direction indicator, detection of a non-subject vehicle approaching the subject vehicle from behind, and a gesture made by the driver of the subject vehicle indicating that the subject vehicle is to change lanes.
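A minimal Python sketch of such event-based display control is shown below; the flag names are hypothetical and stand in for the event detections described above.

```python
def should_display_travel_location_object(turn_signal_on,
                                          rear_vehicle_approaching,
                                          lane_change_gesture_detected):
    """Decide whether to display the travel location object in response to
    the predetermined events."""
    return (turn_signal_on
            or rear_vehicle_approaching
            or lane_change_gesture_detected)

show = should_display_travel_location_object(True, False, False)
```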
The display controller 5 may perform control to display the travel location object only on a line marking on a side of changing lanes.
A case of changing lanes is described above, but the same applies to a case of merging. Embodiment 5 is applicable to Embodiments 1 to 4.
As described above, according to Embodiment 5, the travel location object is displayed responsive to the extent to which the subject vehicle can merge or change lanes. This allows for appropriate driving assistance to the driver when the subject vehicle merges or changes lanes.
The driving assistance apparatus described above is applicable not only to an in-vehicle navigation device, i.e., a car navigation device, but also to a navigation device, or a device other than a navigation device, constructed as a system by appropriately combining a portable navigation device (PND) mountable on a vehicle, a server provided external to the vehicle, and the like. In this case, the functions or the components of the driving assistance apparatus are distributed among the functions constructing the above-mentioned system.
Specifically, as one example, the functions of the driving assistance apparatus can be placed in the server. For example, as shown in the drawings, a server 42 may include the subject vehicle information acquisition unit 2, the non-subject vehicle information acquisition unit 3, the object generation unit 4, and the display controller 5, and may communicate with the image capturing device 15 and the display device 17 installed in the subject vehicle, so that a driving assistance system can be constructed.
As described above, even with a configuration in which the functions of the driving assistance apparatus are distributed to the functions constructing the system, an effect similar to that obtained in the above-mentioned embodiments can be obtained.
Software to perform operation in the above-mentioned embodiments may be incorporated, for example, into the server. A driving assistance method achieved by the server executing the software includes: acquiring subject vehicle information including a current location and a speed of a subject vehicle; acquiring non-subject vehicle information including a current location and a speed of a non-subject vehicle; generating, based on the acquired subject vehicle information and the acquired non-subject vehicle information, a travel location object indicating at least one of a location where the subject vehicle is currently to travel and a location where the subject vehicle is currently not to travel when the subject vehicle merges or changes lanes from a lane in which the subject vehicle travels to a lane in which the non-subject vehicle travels; and displaying, in accordance with travel of the subject vehicle, the generated travel location object superimposed on scenery around the subject vehicle.
An effect similar to that obtained in the above-mentioned embodiments can be obtained by incorporating the software to perform operation in the above-mentioned embodiments into the server, and operating the software.
Embodiments of the present invention can freely be combined with each other, and can be modified or omitted as appropriate within the scope of the invention.
While the invention has been described in detail, the foregoing description is in all aspects illustrative and not restrictive. It is understood that numerous modifications not having been described can be devised without departing from the scope of the present invention.
1 driving assistance apparatus, 2 subject vehicle information acquisition unit, 3 non-subject vehicle information acquisition unit, 4 object generation unit, 5 display controller, 6 subject vehicle, 7 to 9 non-subject vehicle, 10 driving assistance apparatus, 11 non-subject vehicle location calculation unit, 12 non-subject vehicle speed calculation unit, 13 map information acquisition unit, 14 overall controller, 15 image capturing device, 16 map information storage, 17 display device, 18 processor, 19 memory, 20 travel location object, 21 non-subject vehicle, 22 acceleration and deceleration object, 23 imaginary subject vehicle, 24 imaginary non-subject vehicle, 25 travel location object, 26 non-subject vehicle, 27 and 28 travel location object, 29 to 31 non-subject vehicle, 32 and 33 path object, 34 subject vehicle, 35 and 36 non-subject vehicle, 37 travel location object, 38 subject vehicle, 39 lane changeable area, 40 lane changing caution area, 41 lane unchangeable area, 42 server.
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2018/008970 | 3/8/2018 | WO | 00 |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2019/171528 | 9/12/2019 | WO | A |
Number | Name | Date | Kind |
---|---|---|---|
20090265061 | Watanabe | Oct 2009 | A1 |
20160082971 | Fuehrer | Mar 2016 | A1 |
20160300491 | Fukuda et al. | Oct 2016 | A1 |
20170106750 | Tauchi et al. | Apr 2017 | A1 |
20180148072 | Kamiya | May 2018 | A1 |
20180194363 | Sugiura et al. | Jul 2018 | A1 |
20180326996 | Fujisawa | Nov 2018 | A1 |
20190061766 | Nishiguchi | Feb 2019 | A1 |
20190071071 | Yamada | Mar 2019 | A1 |
Number | Date | Country |
---|---|---|
10-281795 | Oct 1998 | JP |
2001-134900 | May 2001 | JP |
2005-78414 | Mar 2005 | JP |
2007-147317 | Jun 2007 | JP |
2008-151507 | Jul 2008 | JP |
2008-222153 | Sep 2008 | JP |
2015-197706 | Nov 2015 | JP |
2017-102739 | Jun 2017 | JP |
WO 2015079623 | Jun 2015 | WO |
Entry |
---|
International Search Report, issued in PCT/JP2018/008970, dated May 22, 2018. |
Japanese Office Action for Japanese Application No. 2020-504583, dated Mar. 9, 2021, with English translation. |
Number | Date | Country | |
---|---|---|---|
20200286385 A1 | Sep 2020 | US |