This application is based on Japanese Patent Application No. 2014-82082 filed on Apr. 11, 2014, the disclosure of which is incorporated herein by reference.
The present disclosure relates to a recognition support system enabling a driver to recognize a situation at a position where a vehicle is about to turn.
A device has been known that senses an obstacle in a direction to which a vehicle turns and provides a warning based on the sensing result of the obstacle when a condition indicating that the vehicle approaches an intersection is satisfied (see Patent Literature 1).
Since the conventional device also provides the warning for a less dangerous obstacle located within the sensing range, the warning may bother the driver in such a case.
Patent Literature 1: JP 2001-84496 A
It is an object of the present disclosure to provide a recognition support system that accurately selects information presented to a driver at an intersection and reduces inconvenience to the driver.
According to an aspect of the present disclosure, a recognition support system includes: an object detection device detecting an object that is located in a periphery of a host vehicle; a direction sensing device sensing a direction to which a driver intends to turn the host vehicle; a range setting device setting a notification range based on the direction sensed by the direction sensing device; a selecting device defining the object, which is detected by the object detection device, as a specified object when the object moves independently from the host vehicle, and selecting the specified object, which is located within the notification range, to be presented to the driver based on a state of the specified object; and an information presenting device presenting information enabling the driver to recognize the specified object selected by the selecting device.
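The following is a minimal sketch of how the claimed devices could be wired together, assuming hypothetical Python class and method names (`RecognitionSupportSystem`, `detect`, `sense`, `set_range`, `select`, `present`) that do not appear in the disclosure and merely illustrate the data flow from detection to presentation.

```python
# Hypothetical wiring of the claimed devices; all names are illustrative.
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class DetectedObject:
    category: str                  # e.g. "pedestrian", "bicycle", "vehicle"
    position: Tuple[float, float]  # position relative to the host vehicle [m]
    speed: float                   # moving speed [m/s]


class RecognitionSupportSystem:
    def __init__(self, detector, direction_sensor, range_setter, selector, presenter):
        self.detector = detector                  # object detection device
        self.direction_sensor = direction_sensor  # direction sensing device
        self.range_setter = range_setter          # range setting device
        self.selector = selector                  # selecting device
        self.presenter = presenter                # information presenting device

    def step(self) -> None:
        objects: List[DetectedObject] = self.detector.detect()
        direction = self.direction_sensor.sense()            # "right" or "left"
        notification_range = self.range_setter.set_range(direction)
        selected = self.selector.select(objects, notification_range)
        self.presenter.present(selected)
```

Each constructor argument stands in for one claimed device; any concrete sensor or display implementation would be injected at this point.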
According to this structure, the recognition support system selects the range and the objects to be presented based on the turning direction, the category of the objects, and the state of the objects, so as to reduce the driver's inconvenience caused by unnecessary information. Further, since the recognition support system does not select only one object to be presented, but presents information on plural objects, the recognition support system allows the driver to finally determine how to handle the specified objects.
The above and other objects, features and advantages of the present disclosure will become more apparent from the following detailed description made with reference to the accompanying drawings, in which:
Embodiments of the present disclosure will be described with reference to drawings.
<Entire Structure>
As shown in the drawings, a recognition support system 1 includes a periphery recognition sensor group 2, a vehicle state detection sensor group 3, a control unit 4, and an information presentation unit 5.
The periphery recognition sensor group 2 includes an image sensor that captures images of a periphery of the vehicle to detect objects, a radar sensor that detects distances and relative speeds between the objects and the vehicle using millimeter waves and sonic waves, and a GPS sensor and a locator that detect a current position of the vehicle. The sensing range of the periphery recognition sensor group 2 is illustrated in the drawings.
The vehicle state detection sensor group 3 includes at least a direction indicator sensor that detects an operation state of a direction indicator, and a speed sensor that detects a speed of the vehicle.
The information presentation unit 5 includes a display that displays images and characters, and a lamp or a buzzer that visually or audibly warns the driver. In the present embodiment, as shown in the drawings, the information presentation unit 5 includes a recognition support information presenting unit 51.
The control unit 4 includes a known microcomputer having at least a CPU 41 and a memory 42, and executes at least an object detection processing, a recognition support activation processing, and a recognition support processing.
<Processing>
The object detection processing is repeatedly executed while the control unit 4 is turned on. In the object detection processing, at least a specified object, a pedestrian crossing and a guard rail located within the sensing range are detected using the detection result of the periphery recognition sensor group 2. The specified object is an object that moves independently from the host vehicle, and, in the present embodiment, the specified object is a pedestrian or a bicycle. In the object detection processing, moving speed of the specified object is also detected.
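As an illustration only, the result of the object detection processing could be held in data structures like the following; the field names, the units, and the modeling of a pedestrian crossing or a guard rail as a line segment are assumptions made for this sketch, not part of the disclosure.

```python
# Illustrative data model for the result of the object detection processing;
# field names and the segment-based modeling of crossings and guard rails are
# assumptions made for this sketch.
from dataclasses import dataclass, field
from typing import List, Tuple

Point = Tuple[float, float]
Segment = Tuple[Point, Point]


@dataclass
class SpecifiedObject:
    """An object that moves independently from the host vehicle
    (a pedestrian or a bicycle in the present embodiment)."""
    category: str    # "pedestrian" or "bicycle"
    position: Point  # position relative to the host vehicle [m]
    speed: float     # moving speed [m/s]


@dataclass
class DetectionResult:
    specified_objects: List[SpecifiedObject] = field(default_factory=list)
    pedestrian_crossings: List[Segment] = field(default_factory=list)
    guard_rails: List[Segment] = field(default_factory=list)
```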
The recognition support activation processing is repeatedly executed while the control unit 4 is turned on. The recognition support activation processing activates the recognition support processing when it is detected that the driver intends to turn the host vehicle, in which the recognition support system 1 is mounted, at an intersection, and when it is determined, based on the vehicle state detection sensor group 3, that the state of the host vehicle satisfies a predetermined activation condition. The activation condition is satisfied when a time period until the host vehicle reaches the intersection at which the driver intends to turn, or a distance from the host vehicle to the intersection, is estimated to be equal to or less than a threshold value. Since the activation condition is only remotely related to the main part of the present disclosure, a detailed description thereof will be omitted.
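A minimal sketch of the activation condition described above, assuming concrete threshold values (the disclosure does not specify them) and a simple time-to-intersection estimate obtained from the distance and the vehicle speed:

```python
# Sketch of the activation condition: the recognition support processing is
# activated when the estimated time until the host vehicle reaches the
# intersection, or the distance to it, is equal to or less than a threshold.
# Both threshold values are assumptions; the disclosure does not specify them.
TIME_THRESHOLD_S = 5.0       # assumed time threshold [s]
DISTANCE_THRESHOLD_M = 30.0  # assumed distance threshold [m]


def activation_condition_satisfied(distance_to_intersection_m: float,
                                   vehicle_speed_mps: float) -> bool:
    if distance_to_intersection_m <= DISTANCE_THRESHOLD_M:
        return True
    if vehicle_speed_mps > 0.0:
        time_to_intersection_s = distance_to_intersection_m / vehicle_speed_mps
        return time_to_intersection_s <= TIME_THRESHOLD_S
    return False
```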
The details of the recognition support processing, which is activated by the recognition support activation processing, will be described below along a flowchart.
When the recognition support processing is activated, at S110, the CPU 41 acquires, from the vehicle state detection sensor group 3, a speed of the host vehicle (i.e., a vehicle speed) and a turning direction to which the host vehicle is about to turn at the intersection.
At S120, a notification range, for which the recognition support processing is executed, is set based on the vehicle speed and the turning direction of the host vehicle acquired at S110. Specifically, for example, when the turning direction indicates right, the notification range is set as shown in the drawings.
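The disclosure defines the shape of the notification range by the drawings, which are not reproduced here. The sketch below therefore only illustrates the idea of S120 under the assumption that the range is a rectangle in host-vehicle coordinates whose extent depends on the vehicle speed and whose lateral bias depends on the turning direction; all dimensions are invented for illustration.

```python
# Illustrative version of S120; the rectangular shape, the dimensions, and the
# speed-dependent scaling are assumptions (the actual range is defined by the
# drawings). Coordinates: x forward, y to the left of the host vehicle.
from dataclasses import dataclass


@dataclass
class NotificationRange:
    x_min: float
    x_max: float
    y_min: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max


def set_notification_range(turning_direction: str, vehicle_speed_mps: float) -> NotificationRange:
    forward_m = 15.0 + 1.5 * vehicle_speed_mps  # extend the range at higher speed (assumption)
    if turning_direction == "right":
        # Biased toward the right side of the host vehicle (negative y).
        return NotificationRange(0.0, forward_m, -20.0, 5.0)
    # Otherwise biased toward the left side (positive y).
    return NotificationRange(0.0, forward_m, -5.0, 20.0)
```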
At S130, based on the result of the object detection processing, the specified object, the pedestrian crossing, the guard rail and the like located within the notification range, which is set at S120, are extracted.
At S140, it is determined whether there is a specified object close to the pedestrian crossing, which is located ahead of the host vehicle that is about to turn, or a specified object close to an entrance to a road located ahead of the host vehicle (hereinafter, "the pedestrian crossing and the like" includes the pedestrian crossing and the entrance to the road). In the present embodiment, a region close to the pedestrian crossing and the like means a region within a predetermined range (for example, equal to or less than 2 meters) from the pedestrian crossing and from both ends of the pedestrian crossing, or a corresponding region when there is no pedestrian crossing. When there is a specified object close to the pedestrian crossing and the like (S140: YES), the specified object is set as a first priority object at S150. Conversely, when there is no specified object close to the pedestrian crossing and the like (S140: NO), the processing of S150 is skipped and the processing proceeds to S160. Setting the specified object as the first priority object means that attention needs to be paid to a specified object that is crossing the pedestrian crossing and the like, and to a specified object that is located close to the pedestrian crossing and has a possibility of crossing. Examples of such situations are shown in the drawings.
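A sketch of the S140/S150 decision under the assumption that the pedestrian crossing is modeled as a line segment; the 2-meter range is the example value from the embodiment, while the geometry helper is an illustrative addition.

```python
# Sketch of S140/S150: a specified object within the predetermined range
# (2 meters in the embodiment) of the pedestrian crossing, including its ends,
# becomes a first priority object. The crossing is modeled as a line segment.
import math
from typing import Tuple

Point = Tuple[float, float]

CLOSE_TO_CROSSING_M = 2.0  # predetermined range from the embodiment


def distance_point_to_segment(p: Point, a: Point, b: Point) -> float:
    """Shortest distance from point p to the segment a-b."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    seg_len_sq = dx * dx + dy * dy
    if seg_len_sq == 0.0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len_sq))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))


def is_first_priority(obj_position: Point, crossing: Tuple[Point, Point]) -> bool:
    # Covers both the crossing itself and the regions around its two ends.
    return distance_point_to_segment(obj_position, crossing[0], crossing[1]) <= CLOSE_TO_CROSSING_M
```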
At S160, it is determined whether there is a specified object, other than the first priority object, that has a possibility of entering the pedestrian crossing within a predetermined time period. This determination is executed based on a position and a moving speed of the specified object.
When there is such a specified object (S160: YES), the specified object is set as a second priority object at S170. Conversely, when there is no such specified object (S160: NO), the processing of S170 is skipped and the processing proceeds to S180. Setting the specified object as the second priority object means that attention needs to be paid to a specified object that is not located close to the pedestrian crossing and the like, but has a possibility of crossing the pedestrian crossing at a time point when the host vehicle reaches the pedestrian crossing because the moving speed of the specified object is high. An example of such a situation is shown in the drawings.
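A sketch of the S160/S170 decision, assuming the possibility of entering the pedestrian crossing within the predetermined time period is estimated by dividing a straight-line distance to the crossing by the moving speed; the time window value is an assumption.

```python
# Sketch of S160/S170: a specified object that is not a first priority object
# but could reach the pedestrian crossing within a predetermined time period,
# given its current moving speed, becomes a second priority object. The time
# window and the straight-line distance estimate are assumptions.
PREDETERMINED_TIME_S = 4.0  # assumed time window [s]


def is_second_priority(distance_to_crossing_m: float, moving_speed_mps: float) -> bool:
    if moving_speed_mps <= 0.0:
        return False  # a stationary object cannot enter the crossing within the window
    return (distance_to_crossing_m / moving_speed_mps) <= PREDETERMINED_TIME_S
```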
At S180, it is determined whether there is a guard rail between the specified object and the host vehicle. When there is no guard rail between the specified object and the host vehicle (S180: NO), the specified object other than the first priority object and the second priority object is set as a third priority object at S190. Conversely, when there is a guard rail between the specified object and the host vehicle (S180: YES), the processing of S190 is skipped and the processing proceeds to S200. That is, when there is a guard rail between the specified object and the host vehicle, the specified object is less likely to approach the host vehicle across the guard rail. Therefore, such a specified object is excluded from the objects of which the driver should be aware, and the other specified objects are set as third priority objects. Examples of such situations are shown in the drawings.
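A sketch of the S180/S190 decision, assuming the guard rail is modeled as a line segment and "between the specified object and the host vehicle" is checked with a segment intersection test; this geometric modeling is an illustrative assumption.

```python
# Sketch of S180/S190: a remaining specified object becomes a third priority
# object only when no guard rail lies between it and the host vehicle. The
# guard rail is modeled as a line segment and the "between" check is a proper
# segment intersection test (degenerate touching cases are ignored).
from typing import Tuple

Point = Tuple[float, float]


def _cross(o: Point, a: Point, b: Point) -> float:
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])


def segments_intersect(p1: Point, p2: Point, q1: Point, q2: Point) -> bool:
    d1 = _cross(q1, q2, p1)
    d2 = _cross(q1, q2, p2)
    d3 = _cross(p1, p2, q1)
    d4 = _cross(p1, p2, q2)
    return (d1 * d2 < 0.0) and (d3 * d4 < 0.0)


def is_third_priority(obj_position: Point, host_position: Point,
                      guard_rail: Tuple[Point, Point]) -> bool:
    # Third priority only when the guard rail does NOT separate the object
    # from the host vehicle.
    return not segments_intersect(obj_position, host_position, guard_rail[0], guard_rail[1])
```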
At S200, the processing of presenting information of the first to third priority objects, using the recognition support information presenting unit 51, is executed, and the recognition support processing is finished.
In the information presentation processing, the first priority object has the highest priority and the third priority object has the lowest priority. A predetermined number of specified objects having higher priority (for example, three objects in the present embodiment) are selectively presented. Specific examples are shown in the drawings.
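A sketch of the selection performed for the presentation at S200, where at most a predetermined number of specified objects (three in the present embodiment) are presented in order of priority; the tuple-based object representation is an assumption.

```python
# Sketch of the selection at S200: order the specified objects by priority
# (first priority is most urgent) and present at most a predetermined number
# of them (three in the present embodiment).
from typing import List, Tuple

MAX_PRESENTED_OBJECTS = 3  # predetermined number from the embodiment


def select_objects_to_present(prioritized: List[Tuple[int, str]]) -> List[str]:
    """prioritized: list of (priority, object_id) pairs with priority 1, 2 or 3."""
    ordered = sorted(prioritized, key=lambda item: item[0])
    return [object_id for _, object_id in ordered[:MAX_PRESENTED_OBJECTS]]


if __name__ == "__main__":
    objects = [(2, "bicycle A"), (1, "pedestrian B"), (3, "pedestrian C"), (1, "pedestrian D")]
    print(select_objects_to_present(objects))  # ['pedestrian B', 'pedestrian D', 'bicycle A']
```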
<Effects>
As described above, the recognition support system 1 narrows the notification range based on the turning direction of the host vehicle, selects the objects to be presented depending on the category of the objects, sets the priority of the objects depending on the state of the objects, and thereby reduces the number of objects to be presented. As a result, the recognition support system 1 reduces the driver's inconvenience caused by unnecessary information. Further, since the recognition support system 1 does not select only one object to be presented, but presents plural objects having high priority, the recognition support system 1 allows the driver to finally determine how to handle the specified objects.
For example, in the case of a right turn, when the driver recognizes that there is a pedestrian or a bicycle in a pedestrian crossing located ahead of the host vehicle that is about to turn right, the driver immediately determines not to advance the host vehicle, regardless of the presence of an oncoming vehicle. Further, a situation in which the driver pays too much attention to the road ahead of the host vehicle and overlooks the oncoming vehicle is suppressed. Conversely, a situation in which the driver pays too much attention to the oncoming vehicle and overlooks the road located ahead of the host vehicle is also suppressed.
<Other Embodiments>
Although the embodiment of the present disclosure is described hereinabove, the present disclosure is not limited to the embodiment described above and may be implemented in various other ways.
(1) In the above embodiment, a case in which the periphery recognition sensor group 2 recognizes the objects well is described. When it is estimated that the periphery recognition sensor group 2 does not recognize the objects well due to some obstruction such as an oncoming vehicle, the situation may be notified to the driver to call the driver's attention.
(2) The members of the present disclosure are merely conceptual, and the present disclosure is not limited to these members. For example, a function of one member may be distributed among plural members, or functions of plural members may be combined into one member. Also, at least one of the members of the above embodiments may be replaced by a well-known member having a similar function. Furthermore, at least one of the members of the above embodiments may be added to the other embodiments, or may replace a member of the other embodiments.
The present disclosure may be implemented in forms other than the above recognition support system, such as a program that causes a computer to function as the devices providing the recognition support system.
It is noted that a flowchart or the processing of the flowchart in the present disclosure includes sections (also referred to as steps), each of which is represented, for instance, as S110. Further, each section can be divided into several sub-sections, while several sections can be combined into a single section. Furthermore, each of thus configured sections can be also referred to as a device, module, or means.
Although the present disclosure is described based on the above embodiments, the present disclosure is not limited to the embodiments and the structures. Various changes and modifications may be made in the present disclosure. Furthermore, various combinations and configurations, and other combinations and configurations including one, more than one, or less than one element, may be made in the present disclosure.
Foreign Application Priority Data

Number | Date | Country | Kind
---|---|---|---
2014-82082 | Apr 2014 | JP | national

PCT Filing Data

Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2015/001677 | Mar. 24, 2015 | WO | 00

PCT Publication Data

Publishing Document | Publishing Date | Country | Kind
---|---|---|---
WO 2015/155946 | Oct. 15, 2015 | WO | A
Foreign Patent Documents

Number | Date | Country
---|---|---
2001-084496 | Mar 2001 | JP
2003-044983 | Feb 2003 | JP
2006-317328 | Nov 2006 | JP
2009-251758 | Oct 2009 | JP
2011-044063 | Mar 2011 | JP
2012-014527 | Jan 2012 | JP
2012-014616 | Jan 2012 | JP
U.S. Publication Data

Number | Date | Country
---|---|---
20170036674 A1 | Feb 2017 | US