The present invention relates to a sensor recognition integration device that integrates, with a low processing load, a plurality of pieces of object data (object information) from a plurality of sensors of different types.
In autonomous driving of an automobile, objects around the own vehicle are recognized, and driving is planned and determined according to those objects. There are various sensors for detecting an object, such as a radar, a camera, a sonar, and a laser radar. Since these sensors differ in detection range, detectable objects, detection accuracy, cost, and the like, it is necessary to combine a plurality of sensors according to the purpose and to integrate the object information detected or acquired by each sensor. However, as the number of objects to be handled increases, the processing cost of the integration grows, and the processing performance of the ECU must be raised accordingly; it is therefore necessary to reduce the load of the integration processing. Here, the “processing cost” refers to the processing time required for the integration.
PTL 1 is a prior art document that reduces the load of processing information from a plurality of sensors. The technique disclosed in FIG. 1 of PTL 1 includes: a first radar that has a first detection range and performs calculation in a long calculation time; a second radar that has a second detection range overlapping the first detection range and performs calculation in a short calculation time; a first determination means that determines the presence of a target on the basis of a calculation result of the first radar; a second determination means that determines the presence of a target on the basis of a calculation result of the second radar; a presence confirmation means that confirms the presence of a target from a determination result of the first determination means; and a designation means that, when a target whose presence has been confirmed by the presence confirmation means is present in the overlapping range of the first and second radars, inputs a calculation result of the second radar to the second determination means and causes the second determination means to perform the current presence determination following the confirmation of the previous target. In this target detection device, once the presence of a target has been confirmed in the overlapping range of the first and second radars, the presence or absence of the target in the next cycle can be confirmed even with low accuracy. Therefore, when such a target is present in the overlapping range, the designation means inputs the calculation result obtained by the second radar in the short calculation time to the second determination means, so that the target can be detected at high speed. Furthermore, when a target enters the overlapping region with the first radar from the non-overlapping detection range of the second radar, the calculation amount can be reduced by narrowing the detection range, that is, by setting a focused detection range near the boundary between the overlapping range and the non-overlapping range.
However, PTL 1 assumes sensors of the same type, such as radars, and when a target enters the overlapping region with the first radar from the non-overlapping detection range of the second radar, it merely narrows the detection range in the vicinity of the boundary between the overlapping range and the non-overlapping range. In a system configuration that handles a plurality of types of sensors, for example, PTL 1 therefore cannot reduce the load of the integration processing while maintaining accuracy by utilizing the features of the plurality of types of sensors.
The present invention has been made in view of the above circumstances, and an object of the present invention is to provide a sensor recognition integration device capable of reducing the load of integration processing so as to satisfy the minimum necessary accuracy required for vehicle travel control, and capable of improving processing performance of an ECU and suppressing an increase in cost.
In order to solve the above problem, a sensor recognition integration device according to the present invention is a sensor recognition integration device that integrates a plurality of pieces of object information related to an object around an own vehicle detected by a plurality of external recognition sensors, the sensor recognition integration device including: a prediction update unit that generates predicted object information obtained by predicting an action of the object on the basis of an action of the own vehicle estimated from a behavior of the own vehicle and the object information detected by the external recognition sensors; an association unit that calculates a relationship between the predicted object information and the plurality of pieces of object information; an integration processing mode determination unit that switches an integration processing mode for determining a method of integrating the plurality of pieces of object information on the basis of a positional relationship between a specific region in an overlapping region of detection regions of the plurality of external recognition sensors and the predicted object information; and an integration target information generation unit that integrates the plurality of pieces of object information associated with the predicted object information on the basis of the integration processing mode.
According to the present invention, it is possible to reduce the load of the integration processing so as to satisfy the minimum necessary accuracy required for the vehicle travel control according to the detection situation, and it is possible to obtain the effects of improving the processing performance of an ECU and suppressing an increase in cost.
Problems, configurations, and effects other than those described above will be clarified from the following description of embodiments.
Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings. In all the drawings for describing the embodiments of the invention, portions having the same functions are denoted by the same reference numerals, and repeated description thereof will be omitted.
The autonomous driving system according to the present embodiment includes an information acquisition device B000, an input communication network B005, the sensor recognition integration device B006, an autonomous driving plan determination device B007, and an actuator group B008. The information acquisition device B000 includes an own vehicle behavior recognition sensor B001, an external recognition sensor group B002, a positioning system B003, and a map unit B004. The sensor recognition integration device B006 includes an information storage unit B009, a sensor object information integration unit B010, and an own vehicle surrounding information integration unit B011.
The own vehicle behavior recognition sensor B001 outputs recognized information D001 to the input communication network B005.
The external recognition sensor group B002 outputs recognized information D002 to the input communication network B005.
The positioning system B003 outputs positioned information D003 to the input communication network B005.
The map unit B004 outputs acquired information D004 to the input communication network B005.
D001, D002, D003, and D004 are input to the input communication network B005, and the input communication network B005 outputs information D005a flowing through the communication network to the sensor recognition integration device B006. D001 is input to the input communication network B005, and the input communication network B005 outputs information D005b flowing through the communication network to the autonomous driving plan determination device B007.
The sensor recognition integration device B006 receives the information D005a from the input communication network B005 as input, and outputs an integration result D011, which is vehicle surrounding information, to the autonomous driving plan determination device B007 as output information.
The autonomous driving plan determination device B007 receives the information D005b from the input communication network B005 and the integration result D011 from the sensor recognition integration device B006 as input, and outputs a plan determination result D007 as command information to the actuator group B008 as output information.
The information D005a from the input communication network B005 is input to the information storage unit B009, and the information storage unit B009 outputs stored information D009a to the sensor object information integration unit B010. The information storage unit B009 outputs stored information D009b to the own vehicle surrounding information integration unit B011.
The sensor object information integration unit B010 receives the information D009a from the information storage unit B009 as input, and outputs an integration result D010 that is integrated object information to the own vehicle surrounding information integration unit B011 as output information.
The own vehicle surrounding information integration unit B011 receives the information D009b from the information storage unit B009 as input, and outputs an integration result D011 that is vehicle surrounding information to the autonomous driving plan determination device B007 as output information.
The own vehicle behavior recognition sensor B001 includes a gyro sensor, a wheel speed sensor, a steering angle sensor, an acceleration sensor, and the like mounted on the vehicle, and the recognized information D001 includes a yaw rate, a wheel speed, a steering angle, acceleration, and the like representing the behavior of the own vehicle.
The information D002 recognized by the external recognition sensor group B002 includes information (position information, speed information, and the like) obtained by detecting objects outside (around) the own vehicle, white lines on the road, signs, and the like. As the external recognition sensor group B002, a combination of a plurality of external recognition sensors (hereinafter simply referred to as sensors) such as a radar, a camera, and a sonar is used. In addition, V2X and C2X may be included, and the configuration of the sensors is not particularly limited.
The positioned information D003 from the positioning system B003 includes a result of estimating the position of the own vehicle. An example of the positioning system B003 is a satellite positioning system.
The information D004 acquired from the map unit B004 includes map information around the own vehicle. In addition, the acquired information D004 may include route information in cooperation with navigation.
The information D005a from the input communication network B005 includes all or some of the information D001, D002, D003, and D004. Further, the information D005b includes at least the information D001. As the input communication network B005, a controller area network (CAN), Ethernet (registered trademark), wireless communication, or the like, which is a network generally used in in-vehicle systems, is used.
The output information D011 from the sensor recognition integration device B006 includes data obtained by integrating, as vehicle surrounding information, vehicle behavior information, sensor object information, sensor road information, positioning information, and map information from the input communication network B005.
The command information D007 from the autonomous driving plan determination device B007 includes information obtained by planning and determining how to move the own vehicle based on the information from the input communication network B005 and the own vehicle surrounding information from the sensor recognition integration device B006.
The actuator group B008 operates the vehicle in accordance with the command information D007 from the autonomous driving plan determination device B007. The actuator group B008 includes various actuators such as an accelerator device, a brake device, and a steering device mounted on the vehicle.
The sensor recognition integration device B006 according to the present embodiment includes the information storage unit B009, the sensor object information integration unit B010, and the own vehicle surrounding information integration unit B011.
The information storage unit B009 stores the information from the input communication network B005, and outputs the information D009a and D009b in response to requests from the sensor object information integration unit B010 and the own vehicle surrounding information integration unit B011. These requests include time synchronization of the data of the information D002 from the plurality of sensors constituting the external recognition sensor group B002, standardization of the coordinate system, and the like. The sensor object information integration unit B010 acquires the information (sensor object information) D009a from the information storage unit B009, integrates the pieces of information of the same object detected by the plurality of sensors constituting the external recognition sensor group B002 into one object (described in detail later), and outputs the integrated object information D010 to the own vehicle surrounding information integration unit B011. Even when the positioning system B003 and the map unit B004 are not mounted, the integrated object information D010 can still be output from the sensor object information integration unit B010 in place of the output information D011 of the own vehicle surrounding information integration unit B011, so that a system is established with the sensor object information integration unit B010 alone; the positioning system B003 and the map unit B004 are therefore not indispensable to the operation of the present embodiment. The own vehicle surrounding information integration unit B011 acquires the integrated object information D010 from the sensor object information integration unit B010 and the information D009b (including the own vehicle behavior information, the sensor road information, the positioning information, and the map information) from the information storage unit B009, integrates them into the own vehicle surrounding information D011, and outputs it to the autonomous driving plan determination device B007. The own vehicle surrounding information D011 includes information indicating to which white line on the road and which lane on the map the integrated object information D010 from the sensor object information integration unit B010 belongs.
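The time synchronization performed in the information storage unit B009 is not detailed here; the following is a minimal sketch, assuming that each sensor object carries a detection time stamp and that its state is linearly interpolated between two detections to the common estimation time. The class, function names, and fields are illustrative assumptions, not part of the embodiment.

```python
from dataclasses import dataclass


@dataclass
class SensorObject:
    """Illustrative fields only; the embodiment does not specify the data layout."""
    obj_id: int
    t: float    # detection time [s]
    x: float    # relative position [m]
    vx: float   # relative speed [m/s]


def sync_to_time(prev: SensorObject, curr: SensorObject, t_query: float) -> SensorObject:
    """Linearly interpolate the state of one sensor object between two
    detections so that it refers to the common estimation time t_query."""
    if curr.t == prev.t:
        return curr
    w = (t_query - prev.t) / (curr.t - prev.t)
    return SensorObject(
        obj_id=curr.obj_id,
        t=t_query,
        x=prev.x + w * (curr.x - prev.x),
        vx=prev.vx + w * (curr.vx - prev.vx),
    )
```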
The sensor object information integration unit B010 includes a prediction update unit 100, an association unit 101, an integration processing mode determination unit 102, an integration target information generation unit 104, an integration update unit 105, and an integrated object information storage unit 106. The processing of the sensor object information integration unit B010 is executed repeatedly, and in each execution it is determined for which time the information is to be estimated. For the sake of explanation, it is assumed that, after the estimation of information at time t1 is executed, the estimation of information at time t2 = t1 + Δt is executed.
Each piece of sensor object information 207A and 207B has a sensor object ID assigned by tracking processing in the sensors constituting the external recognition sensor group B002, a relative position with respect to the own vehicle, a relative speed with respect to the own vehicle, and an error covariance. Information such as the object type, the detection time, and the reliability of the information may additionally be included.
The integration processing mode 209 specifies the mode that determines the integration method used in the integration target information generation unit 104.
Predicted object information 200 and integrated object information 205A and 205B (synonymous with the integrated object information D010 described above) hold, for each object, an object ID and state information such as a position and a speed.
Integration target information 204 includes, for the object ID of each object in the predicted object information 200, the information of the associated sensor objects after integration. The position and speed of the sensor object information included in the integration target information 204 do not necessarily coincide with those of the original sensor object information 207A; an integrated value utilizing the characteristics of the plurality of sensors constituting the external recognition sensor group B002 is calculated on the basis of the integration processing mode 209 from the integration processing mode determination unit 102.
Association information 201A and 201B indicates the correspondence between the predicted object information 200 and the plurality of pieces of sensor object information 207A and 207B.
As illustrated in the figure, the prediction update unit 100 receives the integrated object information 205B at time t1 from the integrated object information storage unit 106, and generates and outputs the predicted object information 200 at time t2.
The association unit 101 receives the plurality of pieces of sensor object information 207A from the external recognition sensor group B002 and the predicted object information 200 at time t2 from the prediction update unit 100 as input, and outputs the association information 201A indicating which piece of predicted object information is associated with which of the plurality of pieces of sensor object information at time t2. The sensor object information 207A is also output unchanged as the sensor object information 207B. Note that the sensor object information 207A and 207B needs to refer to the same time as the predicted object information 200 at time t2, and is therefore time-synchronized to time t2 in the information storage unit B009 described above.
The integration processing mode determination unit 102 calculates (switches) and outputs the integration processing mode 209 on the basis of the sensor object information 207B and the association information 201A from the association unit 101 and the sensor detection range 208 stored in advance in the information storage unit B009 described above.
The integration target information generation unit 104 receives the association information 201B at time t2 and the integration processing mode 209 from the integration processing mode determination unit 102 and the sensor object information 207B from the association unit 101 as input, calculates an integrated value from the coordinates and speed of the object information associated with each predicted object at time t2, and outputs the integrated value as the integration target information 204. The integration target information generation unit 104 includes a function of switching the integration method on the basis of the integration processing mode 209 from the integration processing mode determination unit 102 (described in detail later).
The integration update unit 105 receives the integration target information 204 from the integration target information generation unit 104 and the predicted object information 200 at time t2 from the prediction update unit 100, estimates the state (such as coordinates and speed) of each object at time t2, and outputs the estimated state as the integrated object information 205A. Even when the integration method is switched by the integration target information generation unit 104, the integrated object information 205A may be generated inside the integration update unit 105 so that the position information and the speed information of the object do not change suddenly. To change the integrated object information 205A, which is the object information after integration, smoothly (continuously) when the integration method (that is, the integration processing mode 209) is switched in the integration target information generation unit 104 and the position information or the speed information would otherwise change abruptly, the position information or the speed information can be changed from the value before the switch to the value after the switch by linear interpolation. The interpolation method is not limited to linear interpolation; spline interpolation may also be used.
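A minimal sketch of this smoothing is given below, assuming the transition is blended over a fixed number of processing cycles; the function name and the blend length are assumptions for illustration, not values from the embodiment.

```python
def blend_after_mode_switch(old_value: float, new_value: float,
                            cycles_since_switch: int, blend_cycles: int = 10) -> float:
    """Linearly interpolate from the pre-switch value to the post-switch value
    over blend_cycles processing cycles, so that the output position or speed
    does not jump when the integration processing mode 209 changes."""
    if cycles_since_switch >= blend_cycles:
        return new_value
    a = cycles_since_switch / blend_cycles
    return (1.0 - a) * old_value + a * new_value
```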
The integrated object information storage unit 106 stores the integrated object information 205A from the integration update unit 105, and outputs the integrated object information 205A to the prediction update unit 100 as the integrated object information 205B.
In the association determination (S512), it is determined whether the distance between the predicted object information 200 from the prediction update unit 100 and the position information, speed information, and the like of each sensor object included in the plurality of pieces of sensor object information 207A from the external recognition sensor group B002 is short enough, and whether to perform the association is determined accordingly. The result is output as the association information 201A. As the distance obtained at this time, the Euclidean distance in the space of the coordinates and speed of each sensor object may be used, or the Mahalanobis distance may be used on the basis of the coordinates, speed, and error covariance of each sensor object. Note that the Mahalanobis distance is a generally defined distance, and a detailed description thereof is omitted in the present embodiment.
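A minimal sketch of such an association gate is given below, assuming a two-dimensional [position, speed] residual; the chi-square gate value is an illustrative tuning choice, not a value from the embodiment.

```python
import numpy as np


def mahalanobis_gate(pred_state, meas_state, error_cov, gate=9.21):
    """Return (associate?, squared Mahalanobis distance) for a predicted object
    and a sensor object, using a [position, speed] residual. The gate value
    (chi-square, 2 degrees of freedom, ~99%) is an assumed choice."""
    d = np.asarray(meas_state, float) - np.asarray(pred_state, float)
    d2 = float(d @ np.linalg.inv(np.asarray(error_cov, float)) @ d)
    return d2 <= gate, d2


# Usage: position/speed residual with a diagonal error covariance.
ok, d2 = mahalanobis_gate([10.0, 5.0], [10.6, 4.8], [[0.25, 0.0], [0.0, 0.04]])
```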
After the setting in S571, S572, or S573, the processing returns to S551. The processing is then repeated until no unprocessed predicted object remains in S554, and when no unprocessed predicted object remains (S554: No), the processing of the integration processing mode determination unit 102 ends in S588.
Note that the selection processing mode (S573) means that the object information of a single sensor is adopted as it is. The selection processing mode (S573) may also be used as the low-processing-load integration processing mode (S572).
Here, the reason why the integration processing mode 209 (the high-processing-load integration processing mode or the low-processing-load integration processing mode) is set (switched) using, as a determination criterion, the position information of the predicted object information 200 and the distance 561 to the boundary in the overlapping region of the sensor detection ranges 208 is that the detection error of a sensor differs between the boundary of the overlapping region and the rest of the region; specifically, the detection error of a sensor tends to be relatively large at the boundary of the overlapping region of the sensor detection ranges 208.
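A minimal sketch of this switching rule follows, assuming a single distance threshold; the threshold value and the assignment of the selection processing mode to objects outside any overlapping region are assumptions for illustration.

```python
def select_integration_mode(dist_to_overlap_boundary, threshold_m=2.0):
    """Use the high-processing-load mode only near the boundary of the
    overlapping detection region, where sensor detection error tends to be
    large; use simpler processing elsewhere."""
    if dist_to_overlap_boundary is None:   # object outside any overlapping region
        return "SELECTION"                 # adopt a single sensor's object information
    if dist_to_overlap_boundary < threshold_m:
        return "HIGH_LOAD"                 # detailed integration near the boundary
    return "LOW_LOAD"                      # simple integration elsewhere
```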
For an object from a sensor that is not associated with any predicted object, the association determination may be performed between the objects of the respective sensors as illustrated in the figure.
As an example of the high-processing-load integration processing mode in the integration processing mode determination unit 102, in other words, as an example of the high-processing-load integration processing mode used in the integration processing of the integration target information generation unit 104, there is the integration method illustrated in the figure.
As another example of the high-processing-load integration processing mode in the integration processing mode determination unit 102, in other words, of the high-processing-load integration processing mode used in the integration processing of the integration target information generation unit 104, there is the integration method illustrated in the following figure.
The method for calculating, in S560 described above, the distance 561 between the position information of the predicted object information 200 and the boundary in the sensor detection range 208 will now be described.
In the integration target information generation unit 104, the region in which the sensor detection ranges overlap is targeted, and the predicted object information F15 positioned in the region F12 or F13 is to be integrated. Note that a plurality of objects from the sensors are associated with the predicted object information F15 by the association unit 101, and it is these associated sensor objects that are integrated. First, in order to determine whether the predicted object information F15 belongs to F12 or F13, the distance between the boundary portion of F13 and the position of the predicted object information F15 is calculated; a simple method is illustrated in the figure.
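One common way to compute such a distance, sketched below under the assumption that the boundary of the detection range is represented as line segments (the representation is not specified here), is the point-to-segment distance, minimized over all boundary segments.

```python
import math


def point_to_segment_distance(px, py, ax, ay, bx, by):
    """Distance from the predicted object position (px, py) to one boundary
    segment (a, b); taking the minimum over all boundary segments of the
    detection range approximates the distance 561."""
    abx, aby = bx - ax, by - ay
    denom = abx * abx + aby * aby
    if denom == 0.0:                       # degenerate segment
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * abx + (py - ay) * aby) / denom))
    cx, cy = ax + t * abx, ay + t * aby    # closest point on the segment
    return math.hypot(px - cx, py - cy)
```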
As described above, the present embodiment is the sensor recognition integration device B006 that integrates a plurality of pieces of object information (sensor object information) related to an object around an own vehicle detected by a plurality of external recognition sensors, the sensor recognition integration device B006 including: the prediction update unit 100 that generates predicted object information obtained by predicting an action of the object on the basis of an action of the own vehicle estimated from a behavior of the own vehicle and the object information detected by the external recognition sensors; the association unit 101 that calculates a relationship (association information) between the predicted object information and the plurality of pieces of object information; the integration processing mode determination unit 102 that switches an integration processing mode (a high-processing-load integration processing mode in which the processing of integrating the plurality of pieces of object information is relatively detailed, and a low-processing-load integration processing mode in which the processing is relatively simple) for determining a method of integrating the plurality of pieces of object information, on the basis of a positional relationship between a specific region (for example, a boundary portion) in an overlapping region of the detection regions of the plurality of external recognition sensors and the predicted object information; and the integration target information generation unit 104 that integrates the plurality of pieces of object information associated with the predicted object information on the basis of the integration processing mode.
According to the present embodiment, for example, the integration processing with a high processing load is allocated only to the boundary portion in the overlapping region of the sensors, and the integration processing with a low processing load is performed elsewhere, so that the overall ratio of the integration processing with a high processing load is minimized and a reduction in the processing load can be expected. In addition, regardless of whether the high-processing-load or the low-processing-load integration processing is used, the integration target information generation unit 104 takes the components with small errors from the plurality of sensors in consideration of the error covariance (error distribution) of the object, so that the accuracy of the system as a whole can be improved.
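A minimal sketch of such error-covariance-aware integration is given below, using inverse-covariance (information-filter style) weighting so that the low-error components of each sensor dominate the fused estimate; the formula and the example values are illustrative assumptions, not the integration method prescribed by the embodiment.

```python
import numpy as np


def fuse_estimates(states, covariances):
    """Weight each sensor's estimate by the inverse of its error covariance,
    so that the components with small errors dominate the fused result."""
    infos = [np.linalg.inv(np.asarray(P, float)) for P in covariances]
    P_fused = np.linalg.inv(sum(infos))
    x_fused = P_fused @ sum(I @ np.asarray(x, float) for I, x in zip(infos, states))
    return x_fused, P_fused


# Usage: a radar-like estimate (accurate speed) fused with a camera-like
# estimate (accurate position); the fused state inherits the better component.
x, P = fuse_estimates(
    [np.array([10.0, 5.0]), np.array([10.4, 4.7])],
    [np.array([[1.0, 0.0], [0.0, 0.01]]), np.array([[0.04, 0.0], [0.0, 1.0]])],
)
```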
According to the present embodiment, it is possible to reduce the load of the integration processing so as to satisfy the minimum necessary accuracy required for the vehicle travel control according to the detection situation, and it is possible to obtain the effects of improving the processing performance of an ECU and suppressing an increase in cost.
In the second embodiment, as in the first embodiment, the functional block diagram described above applies, except for the additions described below.
The travel route estimation unit B012 estimates a travel route F22 (see the figure) on which the own vehicle is expected to travel.
A scene illustrated in the figure will be described as an example.
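A minimal sketch of route-based switching follows, assuming the travel route F22 is represented as sampled (x, y) points and that only objects near the route receive the high-processing-load mode; the threshold and the route representation are assumptions for illustration.

```python
import math


def near_travel_route(obj_x, obj_y, route_points, threshold_m=3.0):
    """Give priority (the high-processing-load mode) only to objects near the
    estimated travel route, approximated here by the distance to the nearest
    sampled route point."""
    best = min(math.hypot(obj_x - rx, obj_y - ry) for rx, ry in route_points)
    return best < threshold_m


# Usage: a pedestrian 1 m from a straight route sampled every 5 m.
on_route = near_travel_route(10.0, 1.0, [(x, 0.0) for x in range(0, 50, 5)])
```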
By adopting the travel route F22 of the second embodiment, in addition to the effects of the first embodiment, the ratio of the integration processing with a high processing load can be further minimized, and the effect of reducing the processing load is improved. Since the travel route on which the own vehicle travels is emphasized, the pedestrian F21, which is likely to require an alarm or braking, is processed preferentially. In addition, in a case where a bicycle or a motorcycle passing on the left of the own vehicle is present, the same switching can be applied to the sensor detection range F26 illustrated in the figure.
In the third embodiment, as in the first embodiment, the autonomous driving system configuration described above applies.
F31 and F32 illustrated in the figure relate to the reliability of the respective sensors.
By adopting the concept of the reliability of the sensors (external recognition sensors) of the third embodiment, in addition to the effects of the first embodiment, the ratio of integration processing with a high processing load can be further minimized, and the effect of reducing the processing load is improved. In addition, by excluding information from a sensor having low reliability, an effect of preventing erroneous detection in the integration result and a decrease in accuracy is obtained.
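As a minimal sketch, assuming each sensor reports a scalar reliability and that low-reliability object information is simply excluded before integration; the reliability scale, the threshold, and the dict-based data layout are illustrative assumptions.

```python
def filter_by_reliability(sensor_objects, sensor_reliability, min_reliability=0.5):
    """Exclude object information from sensors whose reliability is below a
    threshold before it is passed to the integration processing."""
    return [obj for obj in sensor_objects
            if sensor_reliability.get(obj["sensor_id"], 0.0) >= min_reliability]


# Usage: the sonar report is dropped here because its reliability is low.
kept = filter_by_reliability(
    [{"sensor_id": "radar", "x": 10.2}, {"sensor_id": "sonar", "x": 9.1}],
    {"radar": 0.9, "sonar": 0.2},
)
```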
In addition, the first, second, and third embodiments may be combined, and the mode may be switched among the high processing load mode, the low processing load mode, and the selection processing mode based on a combination of the conditions as illustrated in the figure.
In the fourth embodiment, as in the first embodiment, the autonomous driving system configuration described above applies.
In the fourth embodiment, the integration processing mode 209 is switched (set) based on the tracking state of a predicted object. Here, the tracking state refers to the tracking time during which the predicted object can be continuously tracked without interruption; while the tracking continues, the tracked object is given the same tracking ID. In the case of initial detection, in which the tracking time of the predicted object is short, the mode is set to the high processing load mode, and when the tracking time has become sufficiently long, the mode is switched to the low processing load mode. The condition may also be that the tracking time is long and the distance of the object from the own vehicle is large. Further, when the existence probability of the predicted object (calculated, for example, from the tracking time of the predicted object) is determined to be low, the low processing load mode is set, and when the existence probability is determined to be high, the high processing load mode is set. The mode may also be switched among the high processing load mode, the low processing load mode, and the selection processing mode according to the existence probability of a sensor object detected by a sensor instead of the predicted object.
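A minimal sketch of one way to combine these conditions is given below; the thresholds and the priority among the conditions are illustrative assumptions.

```python
def mode_from_tracking_state(track_age_s, existence_prob,
                             stable_s=2.0, low_prob=0.3):
    """A freshly detected object (short tracking time) gets the detailed
    high-processing-load mode; an object tracked long enough, or one whose
    existence probability is low, gets the low-processing-load mode."""
    if existence_prob < low_prob:
        return "LOW_LOAD"          # unlikely to exist: no detailed processing
    if track_age_s < stable_s:
        return "HIGH_LOAD"         # initial detection: integrate in detail
    return "LOW_LOAD"              # tracked long enough: simple integration
```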
By adopting the concept of the tracking state of an object including a predicted object, a sensor object, or the like according to the fourth embodiment, in addition to the effects of the first embodiment, the ratio of integration processing with a high processing load can be further minimized, and the effect of reducing the processing load is improved.
In the fifth embodiment, as in the first embodiment, the functional block diagram described above applies, except for the differences described below.
The planned path refers to a target value of a travel path of an own vehicle planned by the autonomous driving plan determination device B007 based on an integration result D011 during autonomous driving. The planned path is converted into a lane-level planned path by the autonomous driving plan determination device B007 on the basis of navigation information from a map unit B004.
The planned path D009d is used in place of the travel route estimation result D012 of the second embodiment.
By adopting the planned path D009d for controlling the vehicle (own vehicle) of the fifth embodiment, in addition to the effects of the first embodiment, the ratio of integration processing with a high processing load can be further minimized, and the effect of reducing the processing load is improved. In addition, the integration processing mode 209 can be switched using the path with higher accuracy than that of the second embodiment, and erroneous mode switching is reduced and accuracy is increased.
In the sixth embodiment, as in the first embodiment, the autonomous driving system configuration described above applies.
In the sixth embodiment, when the tracking state of the predicted object is stable, the execution frequency of the high-processing-load mode is thinned out, and the low-processing-load mode is executed in the skipped processing cycles instead. That is, the processing cycle of the high-processing-load mode is made variable on the basis of the tracking state of the predicted object, and when the tracking state is stable, the processing cycle of the high-processing-load mode is lengthened. A stable tracking state includes a case where the object is traveling at a constant speed, a case where the tracking time so far is long, and a case where the object obtained from the sensor and to be integrated does not repeatedly alternate between being detected and not being detected. In addition, the processing cycle of an integration processing mode such as the high-processing-load mode may be made variable according to the tracking state of a sensor object detected by a sensor instead of the predicted object.
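A minimal sketch of this decimation follows, assuming a fixed decimation factor; the factor and the function name are assumptions for illustration.

```python
def effective_mode(cycle_index, tracking_stable, high_every_n=5):
    """When tracking is stable, execute the high-processing-load mode only
    every high_every_n-th processing cycle, and use the low-processing-load
    mode in the skipped cycles."""
    if not tracking_stable:
        return "HIGH_LOAD"
    return "HIGH_LOAD" if cycle_index % high_every_n == 0 else "LOW_LOAD"
```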
By setting the execution cycle of the high-load processing mode of the sixth embodiment to be long, in addition to the effects of the first embodiment, the ratio of the integration processing with a high processing load can be further minimized, and the effect of reducing the processing load is improved. In addition, since this is limited to cases where the tracking state of the object, such as the predicted object, is stable, an abrupt change in the position of the object can be suppressed by the integration update combined with the prediction update.
In the seventh embodiment, as in the first to sixth embodiments, the autonomous driving system configuration described above applies.
As illustrated in the figure, the integration processing mode 209 is switched in the same manner as in the above embodiments. In addition, the same switching may be applied in the processing of the association unit 101 described above.
According to the seventh embodiment, similarly to the first embodiment, the ratio of the high-processing-load integration processing can be minimized, and the effect of reducing the processing load is enhanced.
Note that the present invention is not limited to the above-described embodiments and includes various modifications. For example, the above-described embodiments have been described in detail for easy understanding of the present invention, and are not necessarily limited to those having all the described configurations. In addition, a part of the configuration of a certain embodiment can be replaced with the configuration of another embodiment, and the configuration of a certain embodiment can be added to the configuration of another embodiment. In addition, for a part of the configuration of each embodiment, it is possible to add, delete, and replace another configuration.
In addition, some or all of the above-described configurations, functions, processing units, processing means, and the like may be implemented by hardware, for example, by designing with an integrated circuit. In addition, each of the above-described configurations, functions, and the like may be implemented by software by a processor interpreting and executing a program for implementing each function. Information such as a program, a table, and a file for implementing each function can be stored in a storage device such as a memory, a hard disk, and a solid state drive (SSD), or a recording medium such as an IC card, an SD card, and a DVD.
In addition, the control lines and the information lines indicate what is considered to be necessary for the description, and not all the control lines and the information lines on the product are indicated. In practice, it may be considered that almost all the configurations are connected to each other.