The present invention relates to a vehicular environment estimation device that estimates an environmental state around a vehicle.
As described in Japanese Patent No. 4062353, a device for estimating an environmental state around a vehicle is known which stores the positions and the like of obstacles in the vicinity of the vehicle and predicts the routes of those obstacles. This device finds mutually interfering routes from among a plurality of predicted routes, and decreases the prediction probability of the interfering routes when predicting the route of an obstacle.
However, with the above-described device, there are cases where it is difficult to appropriately estimate the actual environmental state around the vehicle. For example, when routes are predicted while other vehicles are detected by radar, it is difficult to predict the route of another vehicle that is traveling in a blind area of the own vehicle.
The invention has been made to solve such a problem, and an object of the invention is to provide a vehicular environment estimation device capable of accurately estimating the travel environment around the own vehicle on the basis of a predicted route of a mobile object that is moving in a blind area.
An aspect of the invention provides a vehicular environment estimation device. The vehicular environment estimation device includes a behavior detection means that detects a behavior of a mobile object in the vicinity of the own vehicle, and an estimation means that estimates an environment, which affects the traveling of the mobile object, on the basis of the behavior of the mobile object.
With this configuration, the behavior of the mobile object in the vicinity of the own vehicle is detected, and the environment that affects the traveling of the mobile object is estimated on the basis of the behavior of the mobile object. Therefore, it is possible to estimate a vehicle travel environment that cannot be recognized from the own vehicle but can be recognized from a mobile object in the vicinity of the own vehicle.
The vehicular environment estimation device may further include a behavior prediction means that supposes the environment, which affects the traveling of the mobile object, and predicts the behavior of the mobile object on the basis of the supposed environmental state, and a comparison means that compares the behavior of the mobile object predicted by the behavior prediction means with the behavior of the mobile object detected by the behavior detection means. The estimation means may estimate the environment, which affects the traveling of the mobile object, on the basis of the comparison result of the comparison means.
With this configuration, the environment that affects the traveling of the mobile object is supposed, and the behavior of the mobile object is predicted on the basis of the supposed environmental state. Then, the predicted behavior of the mobile object is compared with the detected behavior of the mobile object, and the environment that affects the traveling of the mobile object is estimated on the basis of the comparison result. Therefore, it is possible to estimate a vehicle travel environment, which affects the traveling of the mobile object, on the basis of the detected behavior of the mobile object.
Another aspect of the invention provides a vehicular environment estimation device. The vehicular environment estimation device includes a behavior detection means that detects a behavior of a mobile object in the vicinity of the own vehicle, and an estimation means that estimates an environment of a blind area of the own vehicle on the basis of the behavior of the mobile object.
With this configuration, the behavior of the mobile object in the vicinity of the own vehicle is detected, and the environment of the blind area of the own vehicle is estimated on the basis of the behavior of the mobile object. Therefore, it is possible to estimate the vehicle travel environment of the blind area that cannot be recognized from the own vehicle but can be recognized from the mobile object in the vicinity of the own vehicle.
The vehicular environment estimation device may further include a behavior prediction means that supposes the environment of the blind area of the own vehicle and predicts the behavior of the mobile object on the basis of the supposed environmental state, and a comparison means that compares the behavior of the mobile object predicted by the behavior prediction means with the behavior of the mobile object detected by the behavior detection means. The estimation means may estimate the environment of the blind area of the own vehicle on the basis of the comparison result of the comparison means.
With this configuration, the environment of the blind area of the own vehicle is supposed, and the behavior of the mobile object is predicted on the basis of the supposed environmental state. Then, the predicted behavior of the mobile object is compared with the detected behavior of the mobile object, and the environment of the blind area of the own vehicle is estimated on the basis of the comparison result. Therefore, it is possible to estimate the vehicle travel environment of the blind area of the own vehicle on the basis of the detected behavior of the mobile object.
In the vehicular environment estimation device, the estimation means may predict the behavior of the mobile object, which is present in the blind area, as the environment of the blind area of the own vehicle.
With this configuration, the behavior of the mobile object which is present in the blind area is predicted as the environment of the blind area of the own vehicle. Therefore, it is possible to accurately predict the behavior of the mobile object which is present in the blind area of the own vehicle.
The vehicular environment estimation device may further include an abnormal behavior determination means that, when the behavior detection means detects a plurality of behaviors of the mobile objects, and the estimation means estimates the environment of the blind area of the own vehicle on the basis of the plurality of behaviors of the mobile objects, determines that a mobile object which does not behave in accordance with the estimated environment of the blind area of the own vehicle behaves abnormally.
With this configuration, when the environment of the blind area of the own vehicle is estimated on the basis of a plurality of behaviors of the mobile objects, it is determined that a mobile object which does not behave in accordance with the estimated environment of the blind area of the own vehicle behaves abnormally. Therefore, it is possible to specify a mobile object which behaves abnormally with respect to the estimated environment of the blind area.
In the vehicular environment estimation device, the estimation means may estimate the display state of a traffic signal in front of the mobile object on the basis of the behavior of the mobile object as the environment, which affects the traveling of the mobile object, or the environment of the blind area of the own vehicle.
With this configuration, the display state of a traffic signal in front of the mobile object is estimated on the basis of the behavior of the mobile object. Therefore, it is possible to accurately estimate the display state of a traffic signal that cannot be recognized from the own vehicle but can be recognized from the mobile object in the vicinity of the own vehicle.
The vehicular environment estimation device may further include an assistance means that performs travel assistance for the own vehicle on the basis of the environment estimated by the estimation means.
According to the aspects of the invention, it is possible to accurately estimate the travel environment around the own vehicle on the basis of a predicted route or the like of a mobile object that is moving in a blind area.
Hereinafter, embodiments of the invention will be described in detail with reference to the accompanying drawings. In the following description, the same parts are represented by the same reference numerals, and overlapping descriptions will not be repeated.
A vehicular environment estimation device 1 of this embodiment is a device that is mounted in the own vehicle and estimates the travel environment of the vehicle, and is used, for example, in an automatic drive control system or a drive assistance system of a vehicle.
As shown in
The vehicular environment estimation device 1 includes a navigation system 3. The navigation system 3 functions as a position information acquisition means that acquires position information of the own vehicle. For the navigation system 3, a system is used which has a GPS (Global Positioning System) receiver and stores map data therein.
The vehicular environment estimation device 1 includes an ECU (Electronic Control Unit) 4. The ECU 4 controls the entire device, and is primarily formed by a computer having a CPU, a ROM, and a RAM. The ECU 4 includes an obstacle behavior detection section 41, an undetected obstacle setting section 42, a first detected obstacle route prediction section 43, a route evaluation section 44, and a second detected obstacle route prediction section 45. The obstacle behavior detection section 41, the undetected obstacle setting section 42, the first detected obstacle route prediction section 43, the route evaluation section 44, and the second detected obstacle route prediction section 45 may be configured to be executed by programs which are stored in the ECU 4 or may be provided in the ECU 4 as separate units.
The obstacle behavior detection section 41 functions as a behavior detection means that detects a behavior of a mobile object in the vicinity of the own vehicle on the basis of a detection signal of the obstacle detection section 2. For example, on the basis of the detection signal of the obstacle detection section 2, the position of another vehicle in the vicinity of the own vehicle is stored and recognized, or a transition of the position of that vehicle is recognized.
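The following is a minimal sketch, in Python, of how such a behavior might be derived from successive position fixes; the Observation type, the road-fixed coordinate frame, and the two-point velocity estimate are illustrative assumptions rather than details given in the text.

```python
# Minimal sketch (not taken from the patent text): deriving a detected
# vehicle's behavior from its two most recent position fixes.
import math
from dataclasses import dataclass


@dataclass
class Observation:
    t: float  # time stamp in seconds
    x: float  # position in an assumed road-fixed frame, metres
    y: float


def estimate_behavior(track: list[Observation]):
    """Return speed and velocity estimated from the two most recent fixes."""
    if len(track) < 2:
        return None
    prev, curr = track[-2], track[-1]
    dt = curr.t - prev.t
    vx = (curr.x - prev.x) / dt
    vy = (curr.y - prev.y) / dt
    return {"speed": math.hypot(vx, vy), "velocity": (vx, vy)}
```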
The undetected obstacle setting section 42 supposes a plurality of travel environments which have different settings regarding the presence/absence of undetected obstacles, the number of undetected obstacles, the states of undetected obstacles, and the like, and functions as an undetected obstacle setting means that sets the presence/absence of an undetected obstacle in a blind area where the own vehicle cannot recognize an obstacle. For example, at an intersection, the undetected obstacle setting section 42 supposes either that an undetected vehicle is present in the blind area where the own vehicle cannot detect an obstacle, or that no undetected vehicle is present in the blind area. At this time, a plurality of hypotheses are set with regard to attributes such as the number of obstacles in the blind area and the position and speed of each obstacle.
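As an illustration only, such hypotheses might be enumerated as follows; the BlindAreaHypothesis type and the discrete position and speed values are assumptions introduced for the sketch, not values given in the text.

```python
# Illustrative sketch: enumerate blind-area hypotheses (no hidden vehicle,
# or a hidden vehicle at one of several assumed positions and speeds).
from dataclasses import dataclass
from itertools import product
from typing import Optional


@dataclass(frozen=True)
class BlindAreaHypothesis:
    present: bool
    position_m: Optional[float] = None  # assumed distance to the intersection
    speed_mps: Optional[float] = None   # assumed speed of the hidden vehicle


def enumerate_hypotheses():
    hypotheses = [BlindAreaHypothesis(present=False)]
    for position, speed in product((10.0, 20.0, 30.0), (0.0, 5.0, 10.0)):
        hypotheses.append(BlindAreaHypothesis(True, position, speed))
    return hypotheses
```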
The first detected obstacle route prediction section 43 predicts the routes (first predicted routes) of a detected obstacle corresponding to the plurality of suppositions made by the undetected obstacle setting section 42. The first detected obstacle route prediction section 43 functions as a behavior prediction means that supposes the environment which affects the traveling of a detected mobile object, or the environment of the blind area of the own vehicle, and predicts the behavior or route of the mobile object on the basis of the supposed environmental state. For example, when it is supposed that an undetected obstacle is present, the route of the mobile object detected by the obstacle behavior detection section 41 is predicted for each of the environments in which the undetected obstacle is present. When a plurality of undetected obstacles are supposed, route prediction of the mobile object is carried out for each supposition regarding the presence of an undetected obstacle.
The route evaluation section 44 evaluates the route of the detected obstacle predicted by the first detected obstacle route prediction section 43. The route evaluation section 44 compares the behavior detection result of the detected obstacle detected by the obstacle behavior detection section 41 with the route prediction result of the detected obstacle predicted by the first detected obstacle route prediction section 43 to estimate a travel environment. The route evaluation section 44 functions as a comparison means that compares the behavior or route of the mobile object predicted by the first detected obstacle route prediction section 43 with the behavior of the mobile object detected by the obstacle behavior detection section 41. The route evaluation section 44 also functions as an estimation means that estimates the environment, which affects the traveling of the mobile object, or the environment of the blind area of the own vehicle on the basis of the comparison result.
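A hedged sketch of the resulting predict-compare-select cycle is given below; predict_route and route_similarity stand in for the prediction model and the comparison measure, which the text does not specify.

```python
# Sketch of the cycle attributed to the first detected obstacle route
# prediction section 43 and the route evaluation section 44.
def estimate_environment(hypotheses, observed_route, predict_route, route_similarity):
    """Return the hypothesis whose predicted route best matches the observed one."""
    best_hypothesis, best_score = None, float("-inf")
    for hypothesis in hypotheses:
        predicted = predict_route(hypothesis)               # first predicted route
        score = route_similarity(predicted, observed_route)
        if score > best_score:
            best_hypothesis, best_score = hypothesis, score
    return best_hypothesis, best_score
```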
The second detected obstacle route prediction section 45 is a route prediction means that predicts the route of a mobile object detected by the obstacle behavior detection section 41. For example, the route (second predicted route) of the mobile object detected by the obstacle behavior detection section 41 is predicted on the basis of the evaluation result of the route evaluation section 44.
The vehicular environment estimation device 1 includes a travel control section 5. The travel control section 5 controls the traveling of the own vehicle in accordance with a control signal output from the ECU 4. For example, an engine control ECU, a brake control ECU, and a steering control ECU correspond to the travel control section 5.
Next, the operation of the vehicular environment estimation device 1 of this embodiment will be described.
First, as shown in Step S10 (hereinafter, Step S10 is simply referred to as "S10"; the same applies to the steps subsequent to Step S10) of
Next, the process progresses to S12, and obstacle behavior detection processing is carried out. The obstacle behavior detection processing is carried out to detect the behavior of an obstacle or a mobile object, such as another vehicle, on the basis of the detection signal of the obstacle detection section 2. For example, as shown in
Next, the process progresses to S14 of
Specifically, as shown in
Next, the process progresses to S16 of
For example, as shown in
Next, the process progresses to S18 of
For example, the route of the vehicle B predicted by the first detected obstacle route prediction processing of S16 is compared with the route of the vehicle B detected by the obstacle behavior detection processing of S12. A higher evaluation is given as the predicted route of the vehicle B is closer to the detected route of the vehicle B. Then, from among the routes of the vehicle B predicted by the first detected obstacle route prediction processing of S16, the route which is closest to the route of the vehicle B detected by the obstacle behavior detection processing of S12 is selected as a predicted route. The vehicle travel environment which affects the traveling of the vehicle B, or the vehicle travel environment of the blind area S of the own vehicle A, is estimated on the basis of the selected predicted route of the vehicle B. For example, when a route on which the vehicle B travels in a straight line and reduces speed is selected as the predicted route of the vehicle B, it is estimated that the vehicle C which is traveling toward the intersection is present in the blind area S.
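One way such a comparison could be scored, offered purely as an illustration, is the negative mean point-wise distance between the predicted and detected routes sampled at the same time steps (higher is better):

```python
# Illustrative similarity measure between a predicted and a detected route.
import math


def route_similarity(predicted, detected):
    """Both routes are lists of (x, y) points sampled at the same time steps."""
    n = min(len(predicted), len(detected))
    if n == 0:
        return float("-inf")
    mean_gap = sum(math.dist(predicted[i], detected[i]) for i in range(n)) / n
    return -mean_gap
```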
Next, the process progresses to S20 of
For example, referring to
Next, the process progresses to S22 of
As described above, according to the vehicular environment estimation device 1 of this embodiment, the behavior of the vehicle B in the vicinity of the own vehicle A is detected, and the environment which affects the traveling of the vehicle B is estimated on the basis of the behavior of the vehicle B. Therefore, it is possible to estimate the vehicle travel environment that cannot be recognized from the own vehicle A but can be recognized from the vehicle B in the vicinity of the own vehicle.
As described above, the environment which affects the traveling of the vehicle B is estimated, instead of the environment which directly affects the own vehicle A. Therefore, it is possible to predict the route of the vehicle B and to predict changes in the vehicle travel environment of the own vehicle A in advance, thereby carrying out safe and smooth drive control.
In the vehicular environment estimation device 1 of this embodiment, the environment which affects the traveling of the vehicle B is supposed, and the behavior of the vehicle B is predicted on the basis of the supposed environmental state. The predicted behavior of the vehicle B is compared with the detected behavior of the vehicle B, and the environment which affects the traveling of the vehicle B is estimated on the basis of the comparison result. Therefore, it is possible to estimate the vehicle travel environment, which affects the traveling of the vehicle B, on the basis of the behavior of the vehicle B.
According to the vehicular environment estimation device 1 of this embodiment, the behavior of the vehicle B in the vicinity of the own vehicle A is detected, and the environment of the blind area S of the own vehicle A is estimated on the basis of the behavior of the vehicle B. Therefore, it is possible to estimate the vehicle travel environment of the blind area S that cannot be recognized from the own vehicle A but can be recognized from the vehicle B in the vicinity of the own vehicle.
In the vehicular environment estimation device 1 of this embodiment, the environment of the blind area S of the own vehicle A is supposed, and the behavior of the vehicle B is predicted on the basis of the supposed environmental state. The predicted behavior of the vehicle B is compared with the detected behavior of the vehicle B, and the environment of the blind area S of the own vehicle A is estimated on the basis of the comparison result. Therefore, it is possible to estimate the vehicle travel environment of the blind area S of the own vehicle A on the basis of the detected behavior of the vehicle B.
Next, a vehicular environment estimation device according to a second embodiment of the invention will be described.
A vehicular environment estimation device 1a of this embodiment is a device that is mounted in the own vehicle and estimates the travel environment of the vehicle. The vehicular environment estimation device 1a has substantially the same configuration as the vehicular environment estimation device 1 of the first embodiment, and is different from it in that an undetected obstacle route prediction section 46 is provided.
The ECU 4 includes an undetected obstacle route prediction section 46. The undetected obstacle route prediction section 46 may be configured to be executed by a program stored in the ECU 4, or may be provided as a separate unit from the obstacle behavior detection section 41 and the like in the ECU 4.
The undetected obstacle route prediction section 46 predicts a route of an undetected obstacle that cannot be directly detected by the obstacle detection section 2. For example, the undetected obstacle route prediction section 46 predicts a behavior of a mobile object, which is present in the blind area, on the basis of the environment of the blind area of the own vehicle. The route prediction result of an undetected obstacle, such as a mobile object, is used for drive control of the vehicle.
Next, the operation of the vehicular environment estimation device 1a of this embodiment will be described.
First, as shown in S30 of
Next, the process progresses to S32, and obstacle behavior detection processing is carried out. The obstacle behavior detection processing is carried out to detect the behavior of an obstacle or a mobile object, such as another vehicle, on the basis of the detection signal of the obstacle detection section 2. The obstacle behavior detection processing is carried out in the same manner as S12 of
Next, the process progresses to S34, and undetected obstacle setting processing is carried out. The undetected obstacle setting processing is carried out to suppose a plurality of travel environments which have different settings regarding the presence/absence of undetected obstacles, the number of undetected obstacles, the states of undetected obstacles, and the like. During the undetected obstacle setting processing, the presence/absence of an obstacle which cannot be detected by the obstacle detection section 2 is supposed, and an undetectable obstacle is set in a predetermined region. The undetected obstacle setting processing is carried out in the same manner as S14 of
Next, the process progresses to S36, and first detected obstacle route prediction processing is carried out. The first detected obstacle route prediction processing is carried out to predict the routes (first predicted routes) of a detected obstacle corresponding to a plurality of suppositions by the undetected obstacle setting processing of S34. During the first detected obstacle route prediction processing, the behavior or route of a mobile object is predicted on the basis of the travel environment, which is supposed through S34. The first detected obstacle route prediction processing is carried out in the same manner as S16 of
Next, the process progresses to S38, and route evaluation processing is carried out. The route evaluation processing is carried out to evaluate the routes of the detected obstacle predicted by the first detected obstacle route prediction processing of S36. During the route evaluation processing, the behavior detection result of the detected obstacle detected by the obstacle behavior detection processing of S32 is compared with the route prediction result of the detected obstacle predicted by the first detected obstacle route prediction processing of S36, thereby estimating the travel environment. The route evaluation processing is carried out in the same manner as S18 of
Next, the process progresses to S40, and second detected obstacle route prediction processing is carried out. The second detected obstacle route prediction processing is carried out to predict the route of the mobile object detected by the obstacle behavior detection processing of S32. During the second detected obstacle route prediction processing, the route (second predicted route) of the mobile object detected by the obstacle behavior detection processing of S32 is predicted on the basis of the evaluation result by the route evaluation processing of S38. The second detected obstacle route prediction processing is carried out in the same manner as S20 of
Next, the process progresses to S42, and undetected obstacle route prediction processing is carried out. The undetected obstacle route prediction processing is carried out to predict the route of an undetected obstacle. During the undetected obstacle route prediction processing, for example, the route of an undetected obstacle is predicted on the basis of the predicted route of the obstacle predicted by the second detected obstacle route prediction processing of S40.
For example, as shown in
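A sketch of how the route of the hidden vehicle might then be extrapolated is shown below; it assumes the hypothesis format of the earlier sketch and a constant-velocity motion model, neither of which is prescribed by the text.

```python
# Illustrative extrapolation of a hidden vehicle's route from a selected
# blind-area hypothesis (see the BlindAreaHypothesis sketch above).
def predict_hidden_vehicle_route(hypothesis, horizon_s=3.0, step_s=0.5):
    """Return (time, distance-to-intersection) samples for the hidden vehicle."""
    if not hypothesis.present:
        return []
    route = []
    t = 0.0
    while t <= horizon_s + 1e-9:
        distance = max(hypothesis.position_m - hypothesis.speed_mps * t, 0.0)
        route.append((t, distance))
        t += step_s
    return route
```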
Next, the process progresses to S44 of
As described above, according to the vehicular environment estimation device 1a of this embodiment, in addition to the advantages of the vehicular environment estimation device 1, it is possible to accurately predict the behavior of a mobile object, which is in the blind area S, as the environment of the blind area S of the own vehicle A.
Next, a vehicular environment estimation device according to a third embodiment of the invention will be described.
A vehicular environment estimation device 1b of this embodiment is a device that is mounted in the own vehicle and estimates the travel environment of the vehicle. The vehicular environment estimation device 1b has substantially the same configuration as the vehicular environment estimation device 1 of the first embodiment, and is different from it in that an abnormality determination section 47 is provided.
The ECU 4 includes an abnormality determination section 47. The abnormality determination section 47 may be configured to be executed by a program stored in the ECU 4, or may be provided as a separate unit from the obstacle behavior detection section 41 and the like in the ECU 4.
The abnormality determination section 47 determines whether or not the behavior of a detected obstacle which is directly detected by the obstacle detection section 2 is abnormal. For example, when a plurality of mobile objects are detected by the obstacle behavior detection section 41, the presence or route of an undetected obstacle which is present in the blind area is estimated on the basis of the behaviors of those mobile objects. At this time, when the undetected obstacle state implied by one mobile object is recognized to differ from that implied by the other mobile objects, it is determined that the behavior of that mobile object is abnormal.
Next, the operation of the vehicular environment estimation device 1b of this embodiment will be described.
First, as shown in S50 of
Next, the process progresses to S52, and obstacle behavior detection processing is carried out. The obstacle behavior detection processing is carried out to detect the behavior of an obstacle or a mobile object, such as another vehicle, on the basis of the detection signal of the obstacle detection section 2. For example, as shown in
Next, the process progresses to S54, and undetected obstacle setting processing is carried out. The undetected obstacle setting processing is carried out to suppose a plurality of travel environments which have different settings regarding the presence/absence of undetected obstacles, the number of undetected obstacles, the states of undetected obstacles, and the like. During the undetected obstacle setting processing, the presence/absence of an obstacle which cannot be detected by the obstacle detection section 2 is supposed, and an undetectable obstacle is set in a predetermined region. The undetected obstacle setting processing is carried out in the same manner as S14 of
Next, the process progresses to S56, and first detected obstacle route prediction processing is carried out. The first detected obstacle route prediction processing is carried out to predict the routes (first predicted routes) of a detected obstacle corresponding to a plurality of suppositions by the undetected obstacle setting processing of S54. During the first detected obstacle route prediction processing, the behavior or route of a mobile object is predicted on the basis of the travel environment, which is supposed through S54. The first detected obstacle route prediction processing is carried out in the same manner as S16 of
Next, the process progresses to S58, and route evaluation processing is carried out. The route evaluation processing is carried out to evaluate the routes of the detected obstacle predicted by the first detected obstacle route prediction processing of S56. During the route evaluation processing, the behavior detection result of the detected obstacle detected by the obstacle behavior detection processing of S52 is compared with the route prediction result of the detected obstacle predicted by the first detected obstacle route prediction processing of S56, thereby estimating the travel environment. The route evaluation processing is carried out in the same manner as S18 of
Next, the process progresses to S60, and second detected obstacle route prediction processing is carried out. The second detected obstacle route prediction processing is carried out to predict the route of the mobile object detected by the obstacle behavior detection processing of S52. During the second detected obstacle route prediction processing, the route (second predicted route) of the mobile object detected by the obstacle behavior detection processing of S52 is predicted on the basis of the evaluation result by the route evaluation processing of S58. The second detected obstacle route prediction processing is carried out in the same manner as S20 of
Next, the process progresses to S62, and abnormality determination processing is carried out. The abnormality determination processing is carried out to determine abnormality with respect to the behaviors of the plurality of obstacles detected in S52. For example, when a plurality of obstacles are detected by the obstacle behavior detection processing of S52, if the undetected obstacle state implied by one mobile object differs from that implied by the other mobile objects by a predetermined value or more, it is determined that the behavior of that mobile object is abnormal.
Referring to
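Purely as an illustration, the abnormality determination might take the form below, where estimate_from_single_vehicle is a hypothetical helper that returns the blind-area state implied by one detected vehicle's behavior (a hashable label such as "hidden vehicle present" or "none").

```python
# Illustrative abnormality check: flag the detected vehicles whose implied
# blind-area estimate disagrees with the consensus of the others.
from collections import Counter


def find_abnormal_vehicles(vehicle_ids, estimate_from_single_vehicle):
    """Return the vehicles whose implied estimate differs from the consensus."""
    estimates = {vid: estimate_from_single_vehicle(vid) for vid in vehicle_ids}
    if not estimates:
        return []
    consensus, _ = Counter(estimates.values()).most_common(1)[0]
    return [vid for vid, estimate in estimates.items() if estimate != consensus]
```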
Next, the process progresses to S64 of
As described above, according to the vehicular environment estimation device 1b of this embodiment, in addition to the advantages of the vehicular environment estimation device 1 of the first embodiment, in estimating the environment of the blind area of the own vehicle on the basis of the behaviors of a plurality of detected obstacles, it is possible to determine that a detected obstacle which does not behave in accordance with the estimated environment of the blind area of the own vehicle behaves abnormally. That is, it is possible to specify a detected obstacle which behaves abnormally with respect to the estimated environment of the blind area.
Next, a vehicular environment estimation device according to a fourth embodiment of the invention will be described.
A vehicular environment estimation device 1c of this embodiment is a device that is mounted in the own vehicle and estimates the travel environment of the vehicle. The vehicular environment estimation device 1c of this embodiment estimates the lighting display state of an undetected or unacquired traffic signal on the basis of the behaviors of detected obstacles. The vehicular environment estimation device 1c has substantially the same configuration as the vehicular environment estimation device 1 of the first embodiment, and is different from it in that an undetected traffic signal display setting section 48 is provided instead of the undetected obstacle setting section 42.
The ECU 4 includes an undetected traffic signal display setting section 48. The undetected traffic signal display setting section 48 may be configured to be executed by a program stored in the ECU 4, or may be provided as a separate unit from the obstacle behavior detection section 41 and the like in the ECU 4.
The undetected traffic signal display setting section 48 sets the display of a traffic signal when a blind area is created by a heavy vehicle in front of the own vehicle so that a sensor cannot detect the display of the traffic signal, or when a communication failure occurs and the display information of the traffic signal cannot be acquired. The undetected traffic signal display setting section 48 functions as an undetected traffic signal display setting means that sets the display state of an undetected or unacquired traffic signal. For example, when the own vehicle cannot detect the lighting display state of a traffic signal because of a heavy vehicle in front of the vehicle at an intersection or the like, the display state of the traffic signal is supposed and set as green display, yellow display, red display, or arrow display.
Next, the operation of the vehicular environment estimation device 1c of this embodiment will be described.
First, as shown in S70 of
Next, the process progresses to S72, and obstacle behavior detection processing is carried out. The obstacle behavior detection processing is carried out to detect the behavior of an obstacle or a mobile object, such as another vehicle, on the basis of the detection signal of the obstacle detection section 2. The obstacle behavior detection processing is carried out in the same manner as S12 of
Next, the process progresses to S74, and undetected traffic signal display setting processing is carried out. During the undetected traffic signal display setting processing, when the display state of a traffic signal in front of the vehicle cannot be detected or acquired, the lighting display state of the traffic signal is supposed and set. For example, the lighting display state of the traffic signal is set as red lighting, yellow lighting, green lighting, or arrow lighting.
Next, the process progresses to S76, and first detected obstacle route prediction processing is carried out. The first detected obstacle route prediction processing is carried out to predict the routes (first predicted routes) of a detected obstacle corresponding to a plurality of suppositions by the undetected traffic signal display setting processing of S74. During the first detected obstacle route prediction processing, the behavior or route of a mobile object is predicted on the basis of traffic signal display, which is supposed through S74.
Specifically, when the traffic signal display is set as red display in S74, a route on which the mobile object (detected obstacle) stops or reduces speed is predicted. Meanwhile, when the traffic signal display is set as green display in S74, a route on which the mobile object travels at a predetermined speed is predicted.
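A hedged sketch of this reasoning follows; the mapping from display states to expected behaviors and the deceleration threshold are illustrative assumptions rather than values given in the text.

```python
# Illustrative signal-state estimation: keep the hypothesised display states
# whose expected behavior matches what the leading detected vehicle is doing.
def estimate_signal_states(observed_accel_mps2, braking_threshold=-0.5):
    """Return the hypothesised display states consistent with the observation."""
    expected = {
        "red": "decelerating",    # vehicle should stop or slow before the line
        "yellow": "decelerating",
        "green": "steady",        # vehicle should keep a roughly constant speed
    }
    observed = "decelerating" if observed_accel_mps2 <= braking_threshold else "steady"
    return [state for state, behavior in expected.items() if behavior == observed]
```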
Next, the process progresses to S78, and route evaluation processing is carried out. The route evaluation processing is carried out to evaluate the routes of the detected obstacle predicted by the first detected obstacle route prediction processing of S76. During the route evaluation processing, the behavior detection result of the detected obstacle detected by the obstacle behavior detection processing of S72 is compared with the route prediction result of the detected obstacle predicted by the first detected obstacle route prediction processing of S76, thereby estimating the travel environment.
For example, as shown in
Next, the process progresses to S80, and second detected obstacle route prediction processing is carried out. The second detected obstacle route prediction processing is carried out to predict the route of the obstacle detected in S72. During the second detected obstacle route prediction processing, the route (second predicted route) of the mobile object detected by the obstacle behavior detection processing of S72 is predicted on the basis of the evaluation result of the route evaluation processing of S78. For example, referring to
Next, the process progresses to S82 of
As described above, according to the vehicular environment estimation device 1c of this embodiment, in addition to the advantages of the vehicular environment estimation device 1 of the first embodiment, it is possible to estimate the display state of the traffic signal in front of the vehicle on the basis of the behavior of a detected obstacle. For this reason, it is possible to accurately estimate the display state of a traffic signal which cannot be recognized from the own vehicle but can be recognized from a mobile object in the vicinity of the own vehicle.
The foregoing embodiments are for illustration of the exemplary embodiments of the vehicular environment estimation device of the invention; however, the vehicular environment estimation device of the invention is not limited to those described in the embodiments. The vehicular environment estimation device of the invention may be modified from the vehicular environment estimation devices of the embodiments or may be applied to other systems without departing from the scope of the invention defined by the appended claims.
For example, during the route evaluation processing of S18 and the like in the foregoing embodiments, the state of the undetected obstacle supposed for the first predicted route that most conforms to the detection result (the route selected in S18) may be used, as it is, as the estimation result of the travel environment.
During the second detected obstacle route prediction processing of S20 and the like in the foregoing embodiments, the first predicted route selected in S18 (the route having the highest similarity to the detection result) may be set as the second predicted route. In addition, during the second detected obstacle route prediction processing of S20 and the like in the foregoing embodiments, the similarity of each first predicted route may be calculated at the time of the comparison in S18, and a plurality of first predicted routes may be combined in accordance with the similarities to obtain the second predicted route.
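A minimal sketch of such a combination is given below, assuming the similarities have already been converted into non-negative weights and that all first predicted routes are sampled at the same time steps.

```python
# Illustrative blending of first predicted routes into a second predicted
# route, weighted by similarity to the detected behavior.
def blend_routes(routes, weights):
    """Weighted average of equal-length (x, y) routes."""
    total = sum(weights)
    if total <= 0:
        raise ValueError("at least one route must carry positive weight")
    blended = []
    for points in zip(*routes):  # points at the same time step across routes
        x = sum(w * p[0] for w, p in zip(weights, points)) / total
        y = sum(w * p[1] for w, p in zip(weights, points)) / total
        blended.append((x, y))
    return blended
```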
During the undetected obstacle route prediction processing in the foregoing embodiments, route prediction may be carried out on the basis of a plurality of undetected obstacle states which are estimated at different times.
During the drive control processing in the foregoing embodiments, instead of drive control of the vehicle, a drive assistance operation, such as a warning or notification to the driver of the vehicle, may be carried out.
According to the invention, it is possible to accurately estimate the travel environment around the own vehicle on the basis of the predicted route of a mobile object, which is moving in the blind area.
| Number | Date | Country | Kind |
| --- | --- | --- | --- |
| 2009-120015 | May 2009 | JP | national |

|  | Number | Date | Country |
| --- | --- | --- | --- |
| Parent | 17453796 | Nov 2021 | US |
| Child | 18148906 |  | US |
| Parent | 15293674 | Oct 2016 | US |
| Child | 17453796 |  | US |
| Parent | 13320706 | Nov 2011 | US |
| Child | 15293674 |  | US |