The present invention relates to an outside sensing information processing device, mainly as a component of an advanced safety system or an automatic driving system of a vehicle, which senses the situation of the outside by also utilizing outside sensing information from sources other than the vehicle in which the device is mounted (hereinbelow, called the present vehicle), and detects or recognizes objects which exert an influence on the travel of the vehicle.
In recent years, vehicles equipped with an advanced safety system, which automatically performs brake operation and the like at the time of emergency to avoid a collision or reduce damage at the time of a collision, are becoming widespread. Vehicles (automatic driving cars) equipped with an automatic driving system capable of making the vehicle move autonomously have been realized at an experimental level or under limited conditions. In an advanced safety system or an automatic driving system, to recognize the situation of the outside of the vehicle, an outside sensor such as a camera, a radar, a laser range finder (lidar), or a sonar is usually mounted, together with a signal processing device which processes the sensing data obtained by the outside sensor and detects objects and situations in the circumferential area. However, depending on the situation of the circumferential area and the distance from the vehicle, there are objects which are difficult for an outside sensor of the vehicle to detect. To detect and recognize such objects, there is a method of using information obtained by an outside sensor provided for something other than the present vehicle, by communicating with a device mounted on a road or the like or with another vehicle. As a method of using information obtained by an outside sensor of another vehicle, patent literature 1 discloses a method of obtaining position information of an object detected by the other vehicle, grasping the positional relation between the present vehicle and the other vehicle by detecting the position of the other vehicle with an outside sensor of the present vehicle, converting the position information of the object detected by the other vehicle into position information for the present vehicle, and using the converted information.
Patent literature 1: Japanese Unexamined Patent Application Publication No. 2018-81609
However, the method disclosed in patent literature 1 requires the position of the other vehicle to be recognized by the outside sensor of the present vehicle. Since it is a precondition that the other vehicle can be observed from the present vehicle, information from a vehicle which cannot be observed from the present vehicle cannot be used. There is also a method in which the other vehicle transmits its own position on a map, as recognized by the other vehicle itself, to the present vehicle, so that the present vehicle obtains the position of the other vehicle. However, the precision of obtaining a position on a map varies among vehicles and, depending on the situation, the position of the other vehicle on the map may become inaccurate. Therefore, when the position on the map transmitted from the other vehicle is used as a reference as it is, the precision of the position information of an object detected by the other vehicle may be low, and depending on the position of the object, the operation of an advanced safety system or an automatic driving system performing warning and vehicle control may not be properly performed. For example, the calculation of the predicted time to collision with an obstacle becomes inaccurate, so that an unnecessary hard brake may be activated, or braking may be too weak and avoiding a collision becomes difficult. Further, since positions on a map are used, in the case where there is a deviation between the map of the present vehicle and the map of the other vehicle, when the position information of an object detected by the other vehicle is used with the transmitted position of the other vehicle as a reference and the object is set as it is in the map of the present vehicle, the object may be placed at a deviated location in the map of the present vehicle. As a result, the operation of an advanced safety system or an automatic driving system performing warning and vehicle control may not be properly performed.
The present invention has been achieved in consideration of the problems described above, and an object of the invention is to provide an outside sensing information processing device capable of suppressing adverse influences on the warning and vehicle control of an advanced safety system or an automatic driving system when position information of an object detected by another vehicle is used, by suppressing the position error when the existence of the object is reflected in the map of the present vehicle, verifying the precision of the position information of the other vehicle, and avoiding the use of the information from the other vehicle depending on the situation.
To achieve the object, an outside sensing information processing device according to the present invention, which detects the position of an object or an indication existing on the outside of a first moving object, is characterized by including: a receiving function of receiving configuration information of an environmental map serving as a position reference, which is extracted from sensing information of an outside sensor mounted in a second moving object or a stationary object; a matching function of matching the configuration information of the environmental map obtained by the receiving function with configuration information of an environmental map obtained by a function of the sensing information processing device itself; and a correcting function of correcting, by using the matching result of the matching function, position information of an object or an indication existing on the outside detected by the outside sensor mounted in the second moving object or the stationary object.
Further, an outside sensing information processing device according to the present invention, which detects the position of an object or an indication existing on the outside of a first moving object, is characterized by having: a receiving function of receiving configuration information of an environmental map serving as a position reference, which is extracted from sensing information obtained by an outside sensor mounted in a second moving object or a stationary object, and position information of the second moving object or the stationary object; a matching function of matching the configuration information of the environmental map obtained by the receiving function with configuration information of an environmental map obtained by a function of the sensing information processing device itself; a position calculating function of calculating the position of the second moving object or the stationary object by using the matching result of the matching function; and an error detecting function of detecting an error in the position of the second moving object or the stationary object as recognized by itself, by comparing the calculation result of the position calculating function with the position information of the second moving object or the stationary object obtained by the receiving function.
According to the present invention, for example, at the time of transmitting position information of an object or an indication detected by an outside sensor mounted in another vehicle (which may also be an outside sensor such as a camera installed on a road), the positions of feature points of stationary objects and indications, as configuration information of an environmental map serving as a position reference, are also detected and transmitted. When the present vehicle receives those pieces of information, the position of the object or indication on the environmental map of the present vehicle is calculated (corrected) on the basis of the relative positional relation between the position information of the target object or indication and the positions of the feature points, and the positions of the same feature points as grasped by the present vehicle.
The other vehicle also transmits its own position information, and the present vehicle recognizes the error of the position information grasped by the other vehicle from the received position information of the other vehicle and the relation between the position of a feature point detected by the other vehicle and the position of the same feature point detected by the present vehicle itself; when the error is large or unstable, the present vehicle avoids using the position information from the other vehicle.
As described above, in the present invention, the position of an object or an indication detected by the outside sensor mounted in another vehicle is calculated on the basis of a feature point detected by that outside sensor, so that the detection result of the outside sensor mounted in the other vehicle can be used even in a situation where the other vehicle or the like cannot be detected from the present vehicle. In addition, since the position of an object or an indication detected by the outside sensor mounted in the other vehicle is calculated using, as a reference, the position of the feature point grasped by the present vehicle, the position precision at the time of reflecting that position into the map of the present vehicle improves.
Furthermore, by comparing the position of the other vehicle detected by the other vehicle itself with the position of the outside sensor mounted in the other vehicle as calculated by the present vehicle, the position precision of the other vehicle is grasped. In this manner, another vehicle which is improper as a source of position information can be detected. Consequently, by limiting the use of the position information of detected objects or indications provided by that vehicle and of the position information detected by that vehicle itself, adverse influences on the operation of the advanced safety system and the automatic driving system can be suppressed.
The other objects, configurations, and effects will become apparent from the following description of embodiments.
Hereinafter, embodiments of an outside sensing information processing device (hereinbelow, also simply called an information processing device) of the present invention will be described with reference to the drawings.
A situation of applying a first embodiment of the present invention will be described by using
As illustrated in
A configuration example of hardware of the first embodiment of the present invention, that is, a configuration example of hardware in which the first and second outside sensing information processing devices 100A and 300A illustrated in
As illustrated in
The ECU 200 has therein a microcomputer 210, a main storing device 221, a flash memory 226, a CAN transceiver 231, a sensor interface 236, and a V2X module 241.
The microcomputer 210 is a device which executes software for realizing various functions; on the basis of program instructions, it transmits/receives information to/from the connected devices and processes the information.
The flash memory 226 is used to hold data necessary for the software executed by the microcomputer 210 and data necessary for the microcomputer 210 which has to be retained even when the power supply is disconnected.
The main storing device 221 is a storing device used by the microcomputer 210 to run the software and to temporarily hold data. As the main storing device 221, a RAM (Random Access Memory) is usually used.
The CAN transceiver 231 is a device mainly for adapting the electrical characteristics when the microcomputer 210 communicates via the CAN bus 251. The sensor interface 236 is an interface used by the microcomputer 210 to communicate with the outside sensor module 237 mounted in the vehicle and is, for example, a device for performing high-speed serial communication.
The outside sensor module 237 is a module in which an outside sensor such as an in-vehicle camera and a function of converting a signal obtained by the outside sensor to a signal adapted to the sensor interface 236 are integrated. The outside sensor module 237 may also have a recognition process function or the like.
In some cases, communication with the microcomputer 210 is performed via the CAN bus 251 without using the sensor interface 236, which is prepared for connecting the outside sensor module 237 to the microcomputer 210. Hereinafter, the outside sensor module 237 itself, including an outside sensor sensing (detecting) the situation of the outside, is also simply called an outside sensor.
The V2X module 241 is a device which allows the microcomputer 210 to communicate with other vehicles and roadside facilities; it performs wireless communication through the V2X antenna 242, an antenna for V2X communication connected on the outside of the ECU 200.
The GPS module 246 connected to the microcomputer 210 is a device which allows the microcomputer 210 to obtain the present time and position. The GPS module 246 receives radio waves from GPS satellites with an internal antenna and analyzes the reception signal with an internal LSI to obtain the position and time.
Since the functional differences between the information processing devices 100A and 300A of the embodiment stem from differences in the software executed by the microcomputer 210, in the processing capability of the microcomputer 210, in the capacities of the main storing device 221 and the flash memory 226 used to store and process the software and data, in the outside sensor module 237, and the like, the hardware configuration of the information processing devices 100A and 300A is as illustrated in
A configuration example of function blocks of the first embodiment of the present invention will be described by using
The information processing device (type FA) 300A has: a time managing unit B 331 which manages time for the entire device; a vehicle position detecting unit B 301 which has functions corresponding to a navigation system equipped with a GPS and the like and obtains the position and orientation on a map of the vehicle in which the device is mounted; an outside sensor connecting unit B 322 which obtains, by communication, sensing information from an outside sensor B 321 sensing the situation of the outside; a recognition processing unit B 341 which detects objects and indications such as figures and characters existing on the outside from the sensing information obtained from the outside sensor B 321 via the outside sensor connecting unit B 322 and recognizes their relative positions and orientations using the outside sensor B 321 as a reference; a reference feature point extraction processing unit B 311 which extracts, from the recognition result of the recognition processing unit B 341, feature points (hereinafter called reference feature points) of stationary objects and indications which can serve as configuration information of an environmental map of the circumferential area and be used as position references on the map; and a communication processing unit B 351 which communicates with the information processing device (type EA) 100A mounted in a different vehicle.
The time managing unit B 331 has a function of keeping the time error within one millisecond by, for example, synchronizing with the time information of the GPS at least once an hour. If the precision cannot be assured, for example because the time information of the GPS has not been obtainable for a long time and synchronization cannot be achieved, the time managing unit B 331 notifies each component of the information processing device (type FA) 300A of that fact and, via the communication processing unit B 351, also any other information processing device which receives information from the information processing device (type FA) 300A. In the case of using the GPS, the time managing unit B 331 may share the GPS signal receiving function with the vehicle position detecting unit B 301.
As the outside sensor B 321, a camera, a laser range finder, a millimeter-wave radar, an ultrasonic sensor (sonar), or the like is assumed, but another sensor may be used as long as it can sense the situation of the outside. When outputting sensing information, the outside sensor B 321 attaches the time at which the sensing was executed (the sensing time).
When integrating outside sensing results from various outside sensors, the sensing time is important for correcting the change of each object's position over time. A sensing time added by another information processing device is used on the precondition that its time precision is assured. Consequently, when the time precision is not assured, a function of notifying the other information processing device of that fact is necessary.
The position and orientation on the map of the vehicle obtained by the vehicle position detecting unit B 301, the time at which they were obtained, the result recognized by the recognition processing unit B 341, the time at which the sensing information used for the recognition was obtained, and the feature points detected by the reference feature point extraction processing unit B 311 are transmitted to another information processing device (that is, the information processing device (type EA) 100A) via the communication processing unit B 351.
The result recognized by the recognition processing unit B 341 includes the kind of the recognized object (when the kind cannot be determined, the kind may be "unknown") and position information of the object. Preferably, the position information includes a relative position using the outside sensor B 321 as a reference. However, as long as the information necessary to calculate the relative position is included in the information transmitted from the communication processing unit B 351, the position information may instead be a relative position on the coordinates managed by the environmental map of the information processing device (type FA) 300A, or a position expressed as longitude and latitude. For example, for position information given in a coordinate system whose origin is the reference point of the vehicle in which the information processing device (type FA) 300A is mounted and whose axes are aligned with the orientation of the vehicle, the relative position using the outside sensor B 321 as a reference can be calculated from the position and orientation of the outside sensor B 321 in that coordinate system. In the case where the position information is given as longitude and latitude, there exists information on the position and orientation used as the reference for calculating them (for example, the information obtained by the vehicle position detecting unit B 301 and the installation position and orientation of the outside sensor B 321 in the vehicle), so the relative position using the outside sensor B 321 as a reference can likewise be calculated from those pieces of information.
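As a rough illustration of this conversion, the following planar (yaw-only) sketch derives a sensor-relative position from a position expressed in the vehicle coordinate system, using the mounting position and orientation of the outside sensor; the function and variable names are hypothetical, and a real implementation would work in three dimensions.

```python
import numpy as np

def to_sensor_frame(p_vehicle, sensor_pos, sensor_yaw):
    """Convert a point expressed in the vehicle coordinate system into a
    position relative to the outside sensor (planar, yaw-only sketch).

    p_vehicle  -- (x, y) of the detected object in the vehicle frame
    sensor_pos -- (x, y) mounting position of the sensor in the vehicle frame
    sensor_yaw -- mounting orientation of the sensor in the vehicle frame [rad]
    """
    d = np.asarray(p_vehicle, dtype=float) - np.asarray(sensor_pos, dtype=float)
    c, s = np.cos(sensor_yaw), np.sin(sensor_yaw)
    # Apply the inverse rotation of the mounting yaw to express the offset
    # along the sensor's own axes.
    return np.array([c * d[0] + s * d[1], -s * d[0] + c * d[1]])
```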
The recognition processing unit B 341 has an outside recognition output B 391 as an output so that the recognition result can be used in the vehicle in which the information processing device (type FA) 300A is mounted. By using the outside recognition output B 391, that vehicle can use the processing result of the recognition processing unit B 341, for example, for control of an advanced drive assist function.
The information processing device (type EA) 100A has: a time managing unit A 131 which manages time for the entire device; a vehicle position detecting unit A 101 which has functions corresponding to a navigation system equipped with a GPS and the like and obtains the position and orientation on a map of the vehicle in which the device is mounted; an outside sensor connecting unit A 122 which obtains, by communication, sensing information from an outside sensor A 121 sensing the situation of the outside; a recognition processing unit A 141 which detects objects and indications such as figures and characters existing on the outside from the sensing information obtained from the outside sensor A 121 via the outside sensor connecting unit A 122 and recognizes their relative positions and orientations using the outside sensor A 121 as a reference; a reference feature point extraction processing unit A 111 which extracts, from the recognition result of the recognition processing unit A 141, reference feature points which can serve as configuration information of an environmental map of the circumferential area and be used as position references on the map; and a communication processing unit A 151 which communicates with the information processing device (type FA) 300A mounted in a different vehicle.
The time managing unit A 131 has a function of keeping the time error within one millisecond by, for example, synchronizing with the time information of the GPS at least once an hour and, if the precision cannot be assured, for example because the time information of the GPS has not been obtainable for a long time and synchronization cannot be achieved, notifying the components of the information processing device (type EA) 100A of that fact. In the case of using the GPS, the time managing unit A 131 may share the GPS signal receiving function with the vehicle position detecting unit A 101.
As the outside sensor A 121, a camera, a laser range finder, a millimeter-wave radar, an ultrasonic sensor (sonar), or the like is assumed, but another sensor may be used as long as it can sense the situation of the outside. When outputting sensing information, the outside sensor A 121 attaches the time at which the sensing was executed (the sensing time).
The information processing device (type EA) 100A further includes a reference feature point matching processing unit 161, a position correction processing unit A 171, and a recognition result integration processing unit 181 in order to use the information obtained from the information processing device (type FA) 300A via the communication processing unit A 151.
The reference feature point matching processing unit 161 performs association (a matching process) between the reference feature points extracted by the reference feature point extraction processing unit A 111 and the reference feature points extracted by the reference feature point extraction processing unit B 311 and received via the communication processing unit A 151, in consideration of the position and orientation obtained by the vehicle position detecting unit A 101 and the position and orientation obtained by the vehicle position detecting unit B 301 and received via the communication processing unit A 151.
In the association (matching process), the same objects or indications existing at almost the same position are matched first, and then the feature points of those objects or indications are matched in consideration of the positional relations among the feature points and the orientations of the outside sensors of the respective information processing devices.
For each matched feature point, the position on the environmental map of the circumferential area is calculated using the position and orientation information of the feature point, which was obtained by the recognition processing unit A 141 and extracted as a reference feature point by the reference feature point extraction processing unit A 111, and the position and orientation of the vehicle obtained by the vehicle position detecting unit A 101. This position is passed to the position correction processing unit A 171 as the position, on the environmental map of the circumferential area, of the reference feature point recognized by the information processing device (type EA) 100A.
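As an illustration of the matching step, the following sketch pairs each received reference feature point with the closest locally extracted one within a gate distance; the names and the simple greedy nearest-neighbor rule are assumptions, and a real implementation would also compare feature kinds and take the sensor orientations into account, as described above.

```python
import numpy as np

def match_reference_points(local_pts, remote_pts, gate=1.0):
    """Greedy nearest-neighbor association between locally extracted
    reference feature points and those received from another device.

    local_pts, remote_pts -- lists of (x, y) positions on the environmental map
    gate -- maximum distance for a pair to be accepted [m] (assumed value)
    Returns a list of (local_index, remote_index) pairs.
    """
    pairs, used = [], set()
    for j, remote in enumerate(remote_pts):
        dists = [np.linalg.norm(np.subtract(local, remote)) for local in local_pts]
        i = int(np.argmin(dists))
        if dists[i] <= gate and i not in used:
            pairs.append((i, j))
            used.add(i)
    return pairs
```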
The position information of the reference feature points obtained from the information processing device (type FA) 300A via the communication processing unit A 151 is also passed to the position correction processing unit A 171, as the position information of the reference feature points recognized by the information processing device (type FA) 300A. Preferably, the position information from the information processing device (type FA) 300A includes information on the relative position using the outside sensor B 321 as a reference.
The position correction processing unit A 171 corrects the position information, obtained via the communication processing unit A 151, of the reference feature points recognized on the information processing device (type FA) 300A side and of the recognized objects other than the reference feature points (in other words, objects or indications existing on the outside detected by the outside sensor B 321). For this correction, the position of a reference feature point as recognized by the information processing device (type EA) 100A, obtained from the reference feature point matching processing unit 161, and the position of the same reference feature point as recognized by the information processing device (type FA) 300A are used (the details will be described later). The corrected position information is sent to the recognition result integration processing unit 181 together with the information of the corresponding recognition object.
In the case where information which makes it possible to determine that the vehicle in which the information processing device (type FA) 300A is mounted has stopped (for example, vehicle velocity information) can be obtained from the information processing device (type FA) 300A, the following simplification is possible. After enough time has elapsed since the stop that vibrations of the vehicle can be considered to have subsided and the outside sensor B 321 can be regarded as stationary, the position on the environmental map is calculated once for each reference feature point obtained from the information processing device (type FA) 300A. After that, until the outside sensor B 321 starts moving again together with the vehicle (specifically, until the start of motion of the outside sensor B 321 is recognized), the association (matching process) of the reference feature points is held fixed and the matching process in the reference feature point matching processing unit 161 is omitted (stopped), thereby reducing the processing load of the reference feature point matching processing unit 161. Alternatively, instead of omitting the association of the reference feature points completely, the interval at which the association is performed can be made longer than in the normal case.
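A minimal sketch of this gating logic is shown below; the class and the settling-time threshold are assumptions, not part of the embodiment.

```python
import time

class MatchingGate:
    """Decides whether the reference feature point matching process should
    run: normally while the remote vehicle moves, once more after it has
    stopped and vibrations have subsided, and not at all while frozen."""

    SETTLE_TIME = 1.0  # assumed time [s] for vibrations to subside after a stop

    def __init__(self):
        self.stop_time = None
        self.frozen = False

    def should_match(self, remote_velocity):
        # In practice a small velocity threshold would be used instead of 0.0.
        if remote_velocity > 0.0:                  # remote vehicle is moving
            self.stop_time, self.frozen = None, False
            return True                            # run matching normally
        if self.stop_time is None:                 # vehicle has just stopped
            self.stop_time = time.monotonic()
        if not self.frozen and time.monotonic() - self.stop_time >= self.SETTLE_TIME:
            self.frozen = True                     # match once, then freeze
            return True
        return not self.frozen
```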
The recognition result integration processing unit 181 integrates the recognition objects recognized by the recognition processing unit A 141 and the recognition objects on the information processing device (type FA) 300A side obtained from the position correction processing unit A 171, and outputs the result as an outside recognition output A 191. Since the relative position of a recognition object recognized by the outside sensor A 121 is converted into a position on the map by the recognition processing unit A 141, the recognition result integration processing unit 181 also uses the output of the vehicle position detecting unit A 101.
The recognition result integration processing unit 181 also reflects the position of the vehicle in which the information processing device (type FA) 300A is mounted into the environmental map, using the position and orientation on the map of that vehicle obtained by the vehicle position detecting unit B 301 via the communication processing unit A 151. When possible, the recognition result integration processing unit 181 reflects that vehicle position into the environmental map using the position information corrected by the position correction processing unit A 171. For example, in the case where the relation between the position and orientation of the outside sensor B 321 and the vehicle position is provided by the information processing device (type FA) 300A and the position of a reference feature point is given as a relative position from the outside sensor B 321, the position of the vehicle in which the information processing device (type FA) 300A is mounted, corrected to the coordinates of the environmental map managed by the recognition result integration processing unit 181, is obtained by calculation.
At the time of generating the environmental map by integrating the various pieces of position information, the recognition result integration processing unit 181 also corrects positional deviations caused by variations in sensing time. For this process, when generating the environmental map, the recognition result integration processing unit 181 determines a reference time, performs interpolation or extrapolation on the basis of the time sequence of the position information of each object, and places each object at its estimated position at the reference time.
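As a rough sketch of this time alignment, the position of an object at the reference time can be estimated by linear interpolation or extrapolation from its last two time-stamped samples; the function name and the use of only two samples are assumptions made for brevity.

```python
def position_at(reference_time, samples):
    """Linearly interpolate or extrapolate an object's position at the
    reference time from time-stamped samples [(t, x, y), ...] sorted by t."""
    (t0, x0, y0), (t1, x1, y1) = samples[-2], samples[-1]
    a = (reference_time - t0) / (t1 - t0)  # a > 1 extrapolates beyond t1
    return (x0 + a * (x1 - x0), y0 + a * (y1 - y0))
```

For example, position_at(10.0, [(9.0, 0.0, 0.0), (9.5, 1.0, 0.0)]) extrapolates the object to (2.0, 0.0) at the reference time.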
To facilitate the correction of positional deviations caused by variations in time in the recognition result integration processing unit 181, it is also conceivable to obtain the velocity vector of an object detected by the outside sensor A 121, the outside sensor B 321, the recognition processing unit A 141, or the recognition processing unit B 341 and to pass that information to the recognition result integration processing unit 181 as well. When the position correction is performed on the basis of the velocity vector, more information is available for estimating the change of a detected object's position over time, so an improvement in the precision of the position estimated at the reference time can be expected. Further, when the position estimation precision over time improves, the improvement can also be used to reduce the frequency of receiving information from the information processing device (type FA) 300A.
Once an object which the velocity vector shows to be stationary has been reflected in the environmental map, excluding it from the position calculation process until its motion is detected again or a predetermined period elapses can reduce the computation load of the position correction processing unit A 171 and the recognition result integration processing unit 181.
The outside recognition output A 191, as the output of the recognition result integration processing unit 181, includes, on the basis of the generated environmental map, the position and kind of each object, its change in time sequence, and the velocity vector included in the input information, and can be used, for example, for the determination and control necessary for an advanced drive assist function or an automatic driving function.
To the communication processing unit A 151, a determination information input A 195 is supplied as an input carrying results of the determining process which forms part of the advanced drive assist function and the automatic driving function. For example, when hard braking is decided by the determining process, this input is used to transmit the fact that hard braking is being executed, together with the position and identification information (such as car registration plate information) of the vehicle in which the information processing device (type EA) 100A is mounted, from the communication processing unit A 151 to the information processing devices of surrounding vehicles. The information processing device of a vehicle which receives this information determines, on the basis of the relative position and the identification information, whether the information comes from a vehicle travelling ahead; if so, it requests the determining function serving as a component of its advanced drive assist function or automatic driving function to start braking, so that the risk of collision with the vehicle ahead can be reduced.
The feature points extracted as reference feature points by the reference feature point extraction processing unit B 311 of the information processing device (type FA) 300A and the reference feature point extraction processing unit A 111 of the information processing device (type EA) 100A are, for example, corners of various markings on the road (a stop line, a sign indicating an intersection, a traffic lane, and the like), the part of a specific sign plate closest to the ground, the center of the root of a pole on which a sign is mounted, and the like. Other points may also be employed as long as they serve as references for specifying positions on a map.
With reference to
To the intersection illustrated in
It is assumed that the precision of the vehicle position detecting unit B 301 of the information processing device (type FA) 300A is low and that there is an error in the orientation of the second vehicle 702. Under the influence of this error, the information processing device (type FA) 300A recognizes the intersection mark 751 as a position-deviated intersection mark 761, the intersection mark feature point 752 as a position-deviated intersection mark feature point 762, the second stop line 755 as a position-deviated second stop line 765, the root 756 of the second supporting pillar as a position-deviated root 766 of the second supporting pillar, and the bicycle 731 as a position-deviated bicycle 732. That is, the information processing device (type FA) 300A transmits information corresponding to a recognition result influenced by the error from the communication processing unit B 351 to the information processing devices of the vehicles existing in the circumferential area.
An environmental map 900 of the circumferential area recognized by the information processing device (type EA) 100A includes: a first vehicle 901 on the environmental map, based on the self-position information detected by the vehicle position detecting unit A 101; a second vehicle 902 on the environmental map, based on the position information provided by the information processing device (type FA) 300A; and an intersection mark 951 and an intersection mark feature point 952 on the environmental map, obtained by the recognizing process and the vehicle position detecting process of the information processing device (type EA) 100A. In addition, there is a bicycle 931 on the environmental map, based on position information corrected by the position correction processing unit A 171. In reality, these positions are the results of correcting, in the recognition result integration processing unit 181, the positional deviations caused by time differences.
A vector A 601 indicates the relative position from the outside sensor B 321 to the position-deviated intersection mark feature point 762, and a vector B 602 indicates the relative position from the outside sensor B 321 to the position-deviated bicycle 732. That is, the vector A 601 and the vector B 602 correspond to position vectors in a three-dimensional orthogonal coordinate system whose origin is the outside sensor B 321 and whose axes are fixed in a certain direction using the orientation of the outside sensor B 321 as a reference.
By subtracting the vector A 601 from the vector B 602, a vector X 604 indicating the relative position from the position-deviated intersection mark feature point 762 to the position-deviated bicycle 732 can be obtained.
Since the vector X 604 does not depend on the position and orientation of the outside sensor B 321, it can be regarded as having the same value as the vector indicating the relative position from the (not position-deviated) intersection mark feature point 752 to the actual bicycle 731. Consequently, by adding the vector X 604 to the position of the intersection mark feature point 952 on the environmental map, the position of the bicycle using the intersection mark feature point 952 on the environmental map as a reference point can be calculated. The vector X 604 is added after appropriate conversion into the coordinate system of the environmental map.
By using the calculated position as the position correction result of the position correction processing unit A 171, the position of the bicycle 931 on the environmental map can be determined. In this process, the vector A 601 and the position of the intersection mark feature point 952 on the environmental map serve as the correction parameters for correcting the position of the position-deviated bicycle 732 indicated by the vector B 602.
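A minimal numerical sketch of this correction follows; the coordinates are hypothetical, the vectors are planar for brevity, and the offsets are assumed to be already expressed along map-aligned axes (as with the longitude and latitude offsets of the transmission format described later), so no rotation is shown.

```python
import numpy as np

# Sensor-relative detections reported by the other vehicle (hypothetical values).
vector_a = np.array([12.0, 5.0])   # outside sensor B 321 -> intersection mark feature point
vector_b = np.array([20.0, -3.0])  # outside sensor B 321 -> bicycle

# Relative position from the feature point to the bicycle. This difference is
# independent of the other vehicle's (possibly erroneous) position and
# orientation estimate, and of the sensor's installation shift and vibration.
vector_x = vector_b - vector_a

# Position of the same feature point on the present vehicle's environmental map,
# as recognized by the present vehicle itself (hypothetical value).
feature_point_on_map = np.array([103.0, 250.0])

# Corrected position of the bicycle on the present vehicle's environmental map.
bicycle_on_map = feature_point_on_map + vector_x
print(bicycle_on_map)  # -> [111. 242.]
```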
Since the vector X 604 indicates the relative position from the position-deviated intersection mark feature point 762 to the position-deviated bicycle 732, it may also be calculated from the positions recognized by the information processing device (type FA) 300A instead of from the vector A 601 and the vector B 602. However, when the difference is calculated from the relative position information output by the same outside sensor B 321, as with the vector A 601 and the vector B 602, there is the advantage that the influence of an installation shift of the outside sensor in the vehicle, the influence of vibration of the vehicle, and the like cancel out. Moreover, a calculation using the vector A 601 and the vector B 602 does not depend on the position of the vehicle 702 in which the information processing device (type FA) 300A is mounted, so the position information of the vehicle 702 is not even necessary.
The correction has been described using, as an example, the (position-deviated) bicycle 732, which is located in a blind spot of the first vehicle 701 in which the information processing device (type EA) 100A is mounted due to the existence of the wall 791, corrected by using the position-deviated intersection mark feature point 762 and the like; obviously, the other recognition objects (the intersection mark 761, the intersection mark feature point 762, the second stop line 765, and the root 766 of the second supporting pillar) can be corrected similarly.
Although the third vehicle 703 travelling behind the first vehicle 701 is not reflected in the environmental map 900, when the third vehicle 703 has a function of notifying other vehicles of its own position, or when the first vehicle 701 has an outside sensor monitoring rearward, the existence of the third vehicle 703 can be reflected in the environmental map 900 from that information.
When the first vehicle 701 determines to activate an emergency brake on the basis of the environmental map 900, that information is transmitted (together with the determination information input A 195) from the communication processing unit A 151. When the third vehicle 703 receives the information and determines that it comes from the vehicle 701 ahead, the third vehicle 703 starts decelerating and can prepare for hard braking of the first vehicle 701.
In
The sensor ID 12 is the identification number of the outside sensor used for detecting the position. Since the only outside sensor of the information processing device (type FA) 300A is the outside sensor B 321, all of the sensor IDs 12 in
A longitude offset 13 and a latitude offset 14 indicate the relative position of a reference feature point or a detected object with respect to the outside sensor, using the longitude and latitude axes as references. Except at high latitudes, longitude and latitude can be regarded as crossing at right angles within a relatively small region, so these axes can be used. For a wide object, for example, the position corresponding to the center of its width is indicated.
An altitude offset 15 indicates the height of a reference feature point or a detected object using the road surface as a reference. For a tall object, for example, the height of the center of the object is indicated.
The object ID 16 carries, together with information indicating whether the entry is a reference feature point or another detected object, an identification number assigned to each group of indications (such as a figure) or to each object, and an identification number assigned to each feature point belonging to such a group of indications or object. As a concrete example, as illustrated by the intersection mark 761 in the bottom of
The kind 17 indicates the kind information determined for the indication or object recognized by the recognition processing unit B 341. Since kind information is useful for behavior prediction when making determinations in advanced drive assist or automatic driving, it is included in the transmission information. When the recognition processing unit B 341 detects the presence of some indication or object but cannot determine its kind, the kind may be "unknown". Even in the case of "unknown", when it can be discriminated whether the object is stationary or moving, moving/stationary determination information may be added, as in "unknown stationary object" and "unknown moving object".
The width 18 and the height 19 indicate the width and height of a detected indication or object. A feature point, including a reference feature point, indicates a "point", so its width and height are both zero. The width 18 and the height 19 correspond to the width and height of a rectangle surrounding the object as seen from the outside sensor detecting it, for example, like the bicycle 732 illustrated in
The reliability 20 indicates the reliability of each reference feature point, of the existence of a detected object, and of its position information. Since it is difficult to always perform the recognizing process accurately, the reliability is kept low until the recognition result becomes stable.
The reliability is also lowered in cases where the position of a feature point may be affected, for example, by blurring of part of an indication such as a figure or by deformation of the supporting pillar 727 of a sign. The information processing device (type EA) 100A preferentially uses information with high reliability among the received information, and refers to, as a reference feature point, only a received reference feature point with high reliability which corresponds to one that can also be extracted by the reference feature point extraction processing unit A 111.
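The transmitted entries described above (items 12 to 20) might be modeled as a record such as the following; the field types and value ranges are assumptions.

```python
from dataclasses import dataclass

@dataclass
class DetectionRecord:
    """One transmitted entry: a reference feature point or a detected object.
    The numbers in parentheses refer to the items described above."""
    sensor_id: int            # (12) outside sensor used for the detection
    longitude_offset: float   # (13) offset from the sensor along the longitude axis [m]
    latitude_offset: float    # (14) offset from the sensor along the latitude axis [m]
    altitude_offset: float    # (15) height above the road surface [m]
    object_id: int            # (16) group/object number and per-feature-point number
    is_reference_point: bool  # (16) whether the entry is a reference feature point
    kind: str                 # (17) e.g. "bicycle" or "unknown moving object"
    width: float              # (18) 0.0 for a feature point
    height: float             # (19) 0.0 for a feature point
    reliability: float        # (20) e.g. 0.0 (unstable) to 1.0 (stable), assumed scale
```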
When a plurality of reference feature points with high reliability is available, the position correction of a detected object is performed using one representative point or a plurality of points. When a plurality of reference feature points is used, a corrected position is calculated with each of the reference feature points, and the average of the calculated positions is used. The maximum number of reference feature points used may be limited, rather than using all of them, in consideration of the processing load of the information processing device (type EA) 100A.
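A minimal sketch of the multi-point variant, reusing the vector arithmetic shown earlier, is given below; the averaging follows the description above, while the names and the cap of four reference feature points are assumptions.

```python
import numpy as np

def correct_position(object_offset, ref_offsets, ref_map_positions, max_refs=4):
    """Average the corrected positions obtained from several reference feature
    points. All offsets are map-aligned, sensor-relative vectors.

    object_offset     -- sensor-relative position of the detected object
    ref_offsets       -- sensor-relative positions of the reference feature points
    ref_map_positions -- their positions on the present vehicle's environmental map
    """
    candidates = [np.asarray(m) + (np.asarray(object_offset) - np.asarray(r))
                  for r, m in list(zip(ref_offsets, ref_map_positions))[:max_refs]]
    return np.mean(candidates, axis=0)
```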
With respect to the position of a detected object, when possible, information on feature points corresponding to both ends of its width is used, like the third feature point 51 and the fourth feature point 52 illustrated in
It is also conceivable to add a velocity vector to the information of a detected object. In this case, for example, when the velocity vector is relative to the outside sensor B 321, the information processing device (type EA) 100A can handle it as a value indicating the change of the vector B 602 with time, and it can be used for the position correction accompanying time variation in the recognition result integration processing unit 181.
In the embodiment, for convenience, the description assumes that the first outside sensing information processing device 100A and the second outside sensing information processing device 300A are basically mounted in vehicles. However, they may be mounted in moving objects other than vehicles, and the second outside sensing information processing device may be mounted in a stationary object installed on a road or the like. When a device is mounted in a moving object other than a vehicle or in a stationary object, the "vehicle" may be read as the "object".
For example, as illustrated in
In this case, the information processing device (type EA) 100A can perform a process similar to the case where the vehicle in which the information processing device (type FA) 300A is mounted has stopped. Specifically, once the position on the environmental map has been calculated for a reference feature point obtained from the information processing device (type FA) 300A, the position on the environmental map is fixed and the association (matching process) of the reference feature point in the reference feature point matching processing unit 161 is omitted (stopped), so that the processing load of the reference feature point matching processing unit 161 can be reduced.
A configuration may also be employed in which the processing units illustrated as the inside of an information processing device in the embodiment are mounted as separate devices connected to one another. Further, the information processing device (type EA) 100A may communicate with a plurality of information processing devices (type FA) 300A at the same time, integrate the information, and output the result as the outside recognition output A 191.
In the case of communicating with a plurality of information processing devices (type FA) 300A (in other words, in the case of receiving sensing information from a plurality of outside sensors B 321), the information processing device (type EA) 100A preferentially uses, for the matching process and the like, the information (sensing information) from (the outside sensor mounted in) the vehicle closest to the intersection on each of the roads meeting at the intersection which the first vehicle 701, in which the information processing device (type EA) 100A is mounted, is approaching and facing. Information (sensing information) from (the outside sensor mounted in) a vehicle whose orientation is close to the direction orthogonal to the orientation of the first vehicle 701 is also preferentially used for the matching process and the like.
By using the information (sensing information) with such priorities, even when the computation load does not allow processing the information from all of the information processing devices (type FA) 300A and the communication destinations have to be limited, the possibility of obtaining position information of objects of high importance increases, for the following reasons: on each of the roads meeting at the intersection, the vehicle closest to the intersection is unlikely to have an obstacle in front of it, and blind areas caused by obstacles occur more easily on the roads crossing the road on which the first vehicle 701 travels than on that road itself.
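One way to realize such a priority ordering is sketched below; the names and the distance/orientation weighting are assumptions, since the embodiment only states the two preferences.

```python
import math

def priority_order(remote_vehicles, own_heading, intersection, max_sources=3):
    """Order remote information sources so that vehicles close to the
    intersection and oriented roughly orthogonally to the present vehicle
    come first, and keep only the top max_sources entries.

    remote_vehicles -- list of dicts with 'pos' (x, y) and 'heading' [rad]
    own_heading     -- heading of the present vehicle [rad]
    intersection    -- (x, y) of the intersection center
    """
    def score(v):
        dist = math.dist(v['pos'], intersection)
        # 0 when exactly orthogonal to the present vehicle, 1 when parallel.
        parallelism = abs(math.cos(v['heading'] - own_heading))
        return dist + 10.0 * parallelism  # assumed weighting of the two criteria

    return sorted(remote_vehicles, key=score)[:max_sources]
```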
When, in the information processing device (type EA) 100A, the recognition processing unit A 141, the reference feature point extraction processing unit A 111, and the vehicle position detecting unit A 101 are connected to the communication processing unit A 151, and a function of notifying another information processing device, from the communication processing unit A 151, of the state of assurance of the time precision of the time managing unit A 131 is added, the function of the information processing device (type FA) 300A can be added to the information processing device (type EA) 100A. Between vehicles each equipped with an information processing device of such a configuration, reference feature points and position information of objects or indications can be transmitted in both directions, and each of the vehicles can correct and use the position information of objects or indications from the other vehicle.
As described above, the information processing device (type EA) 100A of the first embodiment has: the receiving function (the communication processing unit A 151) of receiving configuration information (reference feature points) of an environmental map serving as a position reference, which is extracted from the sensing information of the outside sensor B 321 mounted in a second moving object or a stationary object (the second vehicle 702 or the intersection monitoring camera 811); the matching function (the reference feature point matching processing unit 161) of matching the configuration information (reference feature points) of the environmental map obtained by the receiving function with the configuration information (reference feature points) of an environmental map obtained by a function of the information processing device (type EA) 100A itself (in the embodiment, the outside sensor A 121, the reference feature point extraction processing unit A 111, and the like mounted in a first moving object (the first vehicle 701)); and the correcting function (the position correction processing unit A 171) of correcting, by using the matching result of the matching function, the position information of an object or an indication existing on the outside of the first moving object (the first vehicle 701) detected (sensed) by the outside sensor B 321 mounted in the second moving object or the stationary object (the second vehicle 702 or the intersection monitoring camera 811).
According to the first embodiment, for example, at the time of transmitting the position information of an object or an indication detected by the outside sensor mounted in another vehicle (the information processing device (type FA) 300A; the sensor may also be an outside sensor such as a camera installed on a road), the positions of the feature points of stationary objects and indications, as configuration information of an environmental map serving as a position reference, are also detected and transmitted. When the present vehicle (the information processing device (type EA) 100A) receives those pieces of information, the position of the object or indication on the environmental map of the present vehicle is calculated (corrected) on the basis of the relative positional relation between the position information of the target object or indication and the positions of the feature points (reference feature points), and the positions of the same feature points (reference feature points) as grasped by the present vehicle.
As described above, in the first embodiment, the position of an object or indication detected by the outside sensor B 321 mounted in another vehicle is calculated on the basis of a feature point detected by that outside sensor B 321. Consequently, even in a situation where the present vehicle cannot detect the other vehicle or the like, it can use the detection result of the outside sensor B 321 mounted in the other vehicle. Moreover, since the position of an object or indication detected by the outside sensor B 321 mounted in the other vehicle is calculated using the position of the feature point (reference feature point) grasped by the present vehicle as a reference, the position precision at the time of reflecting that position into the map of the present vehicle improves.
A configuration example of function blocks of a second embodiment of the present invention will be described with reference to
First, an information processing device (type FB) 300B will be described. The information processing device (type FB) 300B of the second embodiment is obtained by adding a reference feature point map B 318 to the configuration of the information processing device (type FA) 300A of the first embodiment, and changing the reference feature point extraction processing unit B 311 to a reference feature point selection processing unit B 316.
The reference feature point map B 318 is configuration information of an environmental map of the circumferential area, namely detailed map information (map data) holding the identification ID, location, and kind of each reference feature point which can be used as a position reference on the map. In the second embodiment, in which the reference feature point map B 318 exists, the reference feature point selection processing unit B 316 makes a determination from the position and orientation information of the vehicle obtained from the vehicle position detecting unit B 301, selects candidate reference feature points from the reference feature point map B 318, and provides them to the recognition processing unit B 341.
When performing the process of recognizing the sensing information obtained by the outside sensor B 321 via the outside sensor connecting unit B 322, the recognition processing unit B 341 recognizes reference feature points on the basis of the information provided from the reference feature point selection processing unit B 316. From the provided information, the recognition processing unit B 341 knows in which area to look for each reference feature point, so the efficiency of the reference feature point recognition process improves.
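A minimal sketch of this candidate selection is shown below; the map record layout, the sensing range, and the field of view are assumptions.

```python
import math

def select_candidates(ref_map, vehicle_pos, vehicle_heading,
                      sensing_range=50.0, fov=math.radians(120)):
    """Pick reference feature points from the map which should currently be
    observable by the outside sensor, given the vehicle's position and heading.

    ref_map -- list of dicts with 'id', 'pos' (x, y), and 'kind'
    """
    candidates = []
    for point in ref_map:
        dx = point['pos'][0] - vehicle_pos[0]
        dy = point['pos'][1] - vehicle_pos[1]
        if math.hypot(dx, dy) > sensing_range:
            continue
        bearing = math.atan2(dy, dx) - vehicle_heading
        bearing = math.atan2(math.sin(bearing), math.cos(bearing))  # wrap to [-pi, pi]
        if abs(bearing) <= fov / 2:  # inside the assumed field of view
            candidates.append(point)
    return candidates
```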
The position and orientation on the map of the vehicle obtained by the vehicle position detecting unit B 301, the time at which they were obtained, the result of recognizing objects and reference feature points by the recognition processing unit B 341, and the time at which the sensing information used for the recognition was obtained are transmitted to an information processing device (type EB) 100B via the communication processing unit B 351.
The function of the time managing unit B 331 is the same as that of the first embodiment.
Next, the configuration of the information processing device (type EB) 100B will be described. The information processing device (type EB) 100B of the second embodiment is obtained from the configuration of the information processing device (type EA) 100A of the first embodiment by adding a reference feature point map A 118, changing the reference feature point extraction processing unit A 111 into a reference feature point selection processing unit A 116, and adding a reference feature point selection processing unit AB 117.
The reference feature point map A 118 is configuration information of an environmental map of the circumferential area, namely detailed map information (map data) holding the identification ID, location, and kind of each reference feature point which can be used as a position reference on the map. In the second embodiment, in which the reference feature point map A 118 exists, the reference feature point selection processing unit A 116 makes a determination from the position and orientation information of the vehicle obtained from the vehicle position detecting unit A 101, selects candidate reference feature points from the reference feature point map A 118, and provides them to the recognition processing unit A 141. The reference feature point selection processing unit AB 117 makes a determination from the position and orientation information of the vehicle obtained by the vehicle position detecting unit B 301 and transmitted from the information processing device (type FB) 300B, selects candidate reference feature points from the reference feature point map A 118, and provides them to the reference feature point matching processing unit 161.
When performing the process of recognizing the sensing information obtained by the outside sensor A 121 via the outside sensor connecting unit A 122, the recognition processing unit A 141 recognizes reference feature points on the basis of the information provided from the reference feature point selection processing unit A 116. From the provided information, the recognition processing unit A 141 knows in which area to look for each reference feature point, so the efficiency of the reference feature point recognition process improves.
The recognition processing unit A 141 feeds back the result of recognizing a reference feature point to the vehicle position detecting unit A 101. This feedback allows a correction so that the position of the recognized reference feature point agrees with its position in the reference feature point map A 118; as a result, both the vehicle position and the recognition results of the recognition processing unit A 141 can be expressed in positions that use the reference feature point map A 118 as the reference.
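As a minimal sketch of this feedback, assuming a translation-only correction (a full implementation would also estimate rotation) and a hypothetical function name:

```python
def feed_back_pose_correction(vehicle_xy, recognized_fp_xy, map_fp_xy):
    """Shift the estimated vehicle position so that a recognized reference
    feature point coincides with its entry in reference feature point map A 118."""
    offset_x = map_fp_xy[0] - recognized_fp_xy[0]
    offset_y = map_fp_xy[1] - recognized_fp_xy[1]
    return (vehicle_xy[0] + offset_x, vehicle_xy[1] + offset_y)
```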
The reference feature point matching processing unit 161 performs association (a matching process) between the reference feature points recognized on the information processing device (type FB) 300B side and the candidate reference feature points provided by the reference feature point selection processing unit AB 117. The reference feature point matching processing unit 161 then passes each matched reference feature point to the position correction processing unit A 171, together with the position information of the reference feature point recognized on the information processing device (type FB) 300B side and its position in the reference feature point map A 118.
The position correction processing unit A 171 corrects the positions of detected objects or indications and feature points in a manner similar to the method described in the first embodiment.
When the same identification ID is assigned to the same reference feature point in both the reference feature point map A 118 and the reference feature point map B 318, the reference feature point matching processing unit 161 can perform the matching by identification ID alone, which makes the reference feature point matching process easier.
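A sketch of such a matching process, reusing the ReferenceFeaturePoint type from the earlier sketch, with the ID fast path described above and a hypothetical kind-and-distance fallback with an assumed gate distance:

```python
import math

def match_reference_points(recognized, candidates, gate_m=5.0):
    """Associate reference feature points recognized on the information
    processing device (type FB) 300B side with candidates from map A 118."""
    by_id = {c.feature_id: c for c in candidates}
    matches = []
    for r in recognized:
        if r.feature_id in by_id:          # same ID assigned in both maps
            matches.append((r, by_id[r.feature_id]))
            continue
        # Fallback: same kind, nearest position within a gate distance.
        same_kind = [c for c in candidates if c.kind == r.kind]
        if same_kind:
            nearest = min(same_kind,
                          key=lambda c: math.hypot(c.x - r.x, c.y - r.y))
            if math.hypot(nearest.x - r.x, nearest.y - r.y) <= gate_m:
                matches.append((r, nearest))
    return matches
```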
The function of each of the time managing unit A 131 and the recognition result integration processing unit 181 is the same as that of the first embodiment.
Using the example illustrated in the drawing: in the first embodiment, position correction is possible only when the reference feature points are sensed by the outside sensor A 121. On the other hand, in the configuration of the second embodiment, position correction becomes possible whenever the second stop line 755 and the second supporting pillar root 756 exist in the reference feature point map A 118, regardless of sensing by the outside sensor A 121 provided for the information processing device (type EB) 100B, so the second embodiment can be used in more cases than the first embodiment.
By adding the function of the information processing device (type FB) 300B to the information processing device (type EB) 100B and mounting the combined function in both the first vehicle 701 and the second vehicle 702, the reference feature points and the position information of objects or indications can be transmitted to both vehicles, and each vehicle can correct and use the position information of objects or indications from the other vehicle, as in the first embodiment. Also as in the first embodiment, the information processing device (type EB) 100B can communicate with a plurality of information processing devices (type FB) 300B.
As described above, the information processing device (type EB) 100B of the second embodiment has: the receiving function (the communication processing unit A 151) of receiving configuration information (reference feature points) of an environmental map serving as a position reference, extracted from sensing information of the outside sensor B 321 mounted in a second moving object or a stationary object (the second vehicle 702 or the intersection monitoring camera 811); the matching function (the reference feature point matching processing unit 161) of matching the configuration information (reference feature points) of the environmental map obtained by the receiving function to configuration information (reference feature points) of an environmental map obtained by a function of the information processing device (type EB) 100B itself (in the embodiment, the reference feature point map A 118 holding, as map data, configuration information of an environmental map serving as a position reference, the reference feature point selection processing unit AB 117 selecting reference feature points existing in the map data, and the like); and the correcting function (the position correction processing unit A 171) of correcting, by using the matching result of the matching function, position information of an object or an indication existing outside the first moving object (the first vehicle 701) detected (sensed) by the outside sensor B 321 mounted in the second moving object or the stationary object (the second vehicle 702 or the intersection monitoring camera 811).
Consequently, in addition to obtaining effects similar to those of the first embodiment, the number of scenes in which the device can be used increases, which improves convenience.
A configuration example of function blocks of a third embodiment of the present invention will be described with reference to the drawings.
In the third embodiment, the information processing device (type FB) 300B is the same as in the second embodiment.
In the information processing device (type FB) 300B, however, a vector C 603 indicating the relative position of the outside sensor B 321 with respect to the position reference point 705 of the second vehicle 702 in which the device is mounted is included in the transmission information from the communication processing unit B 351 (refer to the drawings).
The configuration of an information processing device (type EC) 100C will be described. The information processing device (type EC) 100C of the third embodiment is obtained by adding a position calculating unit 173 and a position error managing unit 175 to the configuration of the information processing device (type EB) 100B described in the second embodiment. The output of the reference feature point matching processing unit 161 is connected to the position calculating unit 173 rather than to the position correction processing unit A 171.
The position calculating unit 173 calculates the position of the position reference point 705 of the vehicle 702 in which the information processing device (type FB) 300B is mounted, on the basis of the output (matching result) of the reference feature point matching processing unit 161. The calculating method is described below.
From the information processing device (type FB) 300B, the vector A 601, indicating the relative position of the position-deviated intersection mark feature point 762 (the reference feature point) as seen from the outside sensor B 321, and the vector C 603, indicating the relative position of the outside sensor B 321 as seen from the position reference point 705 of the vehicle 702, are transmitted. It is assumed that the vector A 601 and the vector C 603 are calculated in the same reference coordinate system.
The position calculating unit 173 obtains the position of the position reference point 705 of the vehicle 702 in the coordinates of the reference feature point map A 118 by subtracting the vector A 601 and the vector C 603 from the map position of the intersection mark feature point 752, that is, the point in the reference feature point map A 118 that the reference feature point matching processing unit 161 associated with the position-deviated intersection mark feature point 762 serving as the reference feature point.
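Using the reference numerals of the embodiment as symbols, and under the stated assumption that all vectors share one coordinate system, the relation can be written as:

\[
\vec{p}_{752}^{\,\mathrm{map\,A}} \;=\; \vec{p}_{705} + \vec{C}_{603} + \vec{A}_{601}
\qquad\Longrightarrow\qquad
\vec{p}_{705} \;=\; \vec{p}_{752}^{\,\mathrm{map\,A}} - \vec{A}_{601} - \vec{C}_{603}
\]

where \(\vec{p}_{705}\) is the sought position of the position reference point 705, \(\vec{p}_{752}^{\,\mathrm{map\,A}}\) is the map position of the matched intersection mark feature point 752, \(\vec{C}_{603}\) runs from the position reference point 705 to the outside sensor B 321, and \(\vec{A}_{601}\) runs from the outside sensor B 321 to the reference feature point.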
For each vehicle in which the information processing device (type FB) 300B is mounted, the position error managing unit 175 compares the position and orientation information transmitted from the vehicle with the position and orientation information output from the position calculating unit 173, and manages the maximum value of the error, the range of its fluctuation in a recent predetermined period, its average value, and the stability of the average value in a recent predetermined period. When the maximum error or the magnitude of the fluctuation becomes a predetermined value or larger, or when the stability of the average value becomes a predetermined value or less, the position error managing unit 175 notifies the recognition result integration processing unit 181 not to use the information of that vehicle. In other words, the position error managing unit 175 checks the position error information it manages and outputs the result of the check to the recognition result integration processing unit 181, and the recognition result integration processing unit 181 selects, on the basis of that output, the vehicles whose information is used. The position error managing unit 175 also transmits the average value to the position correction processing unit A 171.
The position correction processing unit A 171 corrects the position of an object or the like obtained from the information processing device (type FB) 300B by using the average value of position errors obtained from the position error managing unit 175, and passes the corrected information to the recognition result integration processing unit 181. When information of an associated reference feature point is available, the position of an object or the like may instead be corrected in a manner similar to the second embodiment.
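The error bookkeeping and the correction by the average error can be sketched together as below. Window length, thresholds, and all names are illustrative assumptions, and the embodiment's check on the stability of the average value is omitted for brevity.

```python
import math
import statistics
from collections import deque

class PositionErrorManager:
    """A minimal sketch in the spirit of the position error managing unit 175."""

    def __init__(self, window=50, max_error_m=2.0, max_fluctuation_m=1.0):
        self.max_error_m = max_error_m
        self.max_fluctuation_m = max_fluctuation_m
        self.window = window
        self.errors = {}  # vehicle_id -> deque of recent (ex, ey) error samples

    def record(self, vehicle_id, reported_xy, calculated_xy):
        """Store the difference between the position reported by a vehicle and
        the position computed by the position calculating unit 173."""
        err = (reported_xy[0] - calculated_xy[0],
               reported_xy[1] - calculated_xy[1])
        self.errors.setdefault(vehicle_id, deque(maxlen=self.window)).append(err)

    def average_error(self, vehicle_id):
        errs = self.errors.get(vehicle_id)
        if not errs:
            return (0.0, 0.0)
        return (statistics.fmean(e[0] for e in errs),
                statistics.fmean(e[1] for e in errs))

    def is_usable(self, vehicle_id):
        """False when the maximum error or its fluctuation exceeds a threshold,
        i.e. the vehicle's information should not be used for integration."""
        errs = self.errors.get(vehicle_id)
        if not errs:
            return True
        magnitudes = [math.hypot(*e) for e in errs]
        return (max(magnitudes) < self.max_error_m and
                max(magnitudes) - min(magnitudes) < self.max_fluctuation_m)

def correct_object_position(object_xy, average_error_xy):
    """Shift a received object position by the held average error, in the
    spirit of the position correction processing unit A 171."""
    return (object_xy[0] - average_error_xy[0],
            object_xy[1] - average_error_xy[1])
```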
In the case where there is no associated reference feature point information, that is, in the case where the vector A 601 is not obtained in the illustrated example, the position of an object or the like can still be corrected by using the average value of position errors held by the position error managing unit 175.
The recognition result integration processing unit 181 basically performs the same operation as in the second embodiment. However, as described above, the function of not using information from a vehicle that the position error managing unit 175 has flagged as not to be used is added.
In the third embodiment, the position error managing unit 175 can manage a position error for every vehicle that provides information, so a vehicle whose position information is considered inaccurate can be detected, and position information included in the recognition information of an object or an indication obtained from an information processing device with insufficient precision can be ignored. Moreover, since the position error managing unit 175 holds the average position error of each vehicle, position correction can be performed using the held data even when no reference feature point is available at the timing when position correction is desired.
In a manner similar to the second embodiment, by adding the function of the information processing device (type FB) 300B to the information processing device (type EC) 100C and mounting the combined device in both the first vehicle 701 and the second vehicle 702, the reference feature points and the position information of objects or indications can be transmitted to both vehicles, and each vehicle can correct and use the position information of objects or indications from the other vehicle. Also as in the second embodiment, the information processing device (type EC) 100C can communicate with a plurality of information processing devices (type FB) 300B.
As described above, the information processing device (type EC) 100C of the third embodiment has, in a manner similar to the foregoing first and second embodiments, the receiving function (the communication processing unit A 151), the matching function (the reference feature point matching processing unit 161), and the correcting function (the position correction processing unit A 171).
The information processing device (type EC) 100C of the third embodiment has: the receiving function (the communication processing unit A 151) of receiving configuration information (reference feature points) of an environmental map serving as a position reference, extracted from sensing information of the outside sensor B 321 mounted in a second moving object or a stationary object (the second vehicle 702 or the intersection monitoring camera 811), together with position information of the second moving object or the stationary object; the matching function (the reference feature point matching processing unit 161) of matching the configuration information (reference feature points) of the environmental map obtained by the receiving function with configuration information (reference feature points) of an environmental map obtained by a function of the information processing device (type EC) 100C itself; the position calculating function (the position calculating unit 173) of calculating the position of the second moving object or the stationary object by using the matching result of the matching function; and the error detecting function (the position error managing unit 175) of detecting an error of the position of the second moving object or the stationary object as recognized by the second moving object or the stationary object itself, by comparing the calculation result of the position calculating function with the position information of the second moving object or the stationary object obtained by the receiving function.
The information processing device (type EC) 100C of the third embodiment also has the correcting function (the position correction processing unit A 171) of correcting, by using the error detection result, the position information of an object or an indication existing outside the first moving object (the first vehicle 701) detected by the outside sensor B 321 mounted in the second moving object or the stationary object (the second vehicle 702 or the intersection monitoring camera 811).
The information processing device (type EC) 100C of the third embodiment further has the error managing function (the position error managing unit 175) of managing the error detection result, and in addition has: the correcting function (the position correction processing unit A 171) of correcting position information of an object or an indication existing outside the first moving object (the first vehicle 701) detected by the outside sensor B 321 mounted in the second moving object or the stationary object (the second vehicle 702 or the intersection monitoring camera 811), by using the error of the position of the second moving object or the stationary object managed by the error managing function; and the selecting function (the recognition result integration processing unit 181) of checking the position error information of the second moving object or the stationary object managed by the error managing function and selecting, on the basis of that information, the second moving object or the stationary object whose information is used.
According to the third embodiment, in addition to the operation of the foregoing first and second embodiments, (the information processing device (type FB) 300B of) the another vehicle also transmits position information of the another vehicle itself, and (the information processing device (type EC) 100C of) the present vehicle checks the error of the position information grasped by the another vehicle itself, using the received position information of the another vehicle and the relation between the position of the feature point (reference feature point) detected by the another vehicle and the position of that feature point (reference feature point) grasped by the present vehicle; when the error is large or unstable, the present vehicle avoids using the position information from the another vehicle.
As described above, in the third embodiment, in addition to obtaining effects similar to those of the first and second embodiments, the position precision of the another vehicle is grasped by comparing the position of the another vehicle detected by the another vehicle itself with the position of the outside sensor B 321 mounted in the another vehicle as calculated by the present vehicle. In this manner, a vehicle whose position information is unsuitable for use can be detected. Consequently, by limiting the use of the position information of detected objects or indications provided by such a vehicle and of the position of that vehicle detected by the vehicle itself, the adverse influence on the operation of the advanced safety system and the automatic driving system can be suppressed.
A configuration example of function blocks of a fourth embodiment of the present invention will be described with reference to the drawings.
In the fourth embodiment, a part of the position information correcting function that in the third embodiment was integrated in the information processing device (type EC) 100C, the side receiving position information of an object or an indication, is moved to the side transmitting position information of an object or an indication (the information processing device (type FD) 300D described below).
An information processing device (type FD) 300D of the fourth embodiment is obtained by adding a position correction processing unit B 371 to the information processing device (type FC) 300C described in the third embodiment. The position correction processing unit B 371 corrects the position and orientation of the vehicle detected by the vehicle position detecting unit B 301, on the basis of the position deviation information of the vehicle in which the information processing device (type FD) 300D is mounted (managed by the position error managing unit 175), which is transmitted from the information processing device (type ED) 100D.
The information processing device (type ED) 100D is obtained by eliminating the position correction processing unit A 171 from the information processing device (type EC) 100C and enabling bidirectional transmission and reception of information between the position error managing unit 175 and the communication processing unit A 151, so that the position error information managed by the position error managing unit 175 can be transmitted to the information processing device (type FD) 300D side.
On the basis of the position error information detected on the information processing device (type ED) 100D side, the position and orientation of the vehicle detected by the vehicle position detecting unit B 301 are corrected on the information processing device (type FD) 300D side. Consequently, the object position detection error can also be suppressed, so the necessity of the position correcting process in the information processing device (type ED) 100D decreases. Furthermore, since the position error information becomes available on the information processing device (type FD) 300D side, when a large error is reported from a plurality of vehicles, the information processing device (type FD) 300D can determine that a fault in the vehicle position detecting unit B 301 is likely and can use the information for failure detection.
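A minimal sketch of both directions of this exchange, under the assumption of simple 2D translation errors and with hypothetical function names and thresholds:

```python
import math

def apply_reported_deviation(detected_xy, deviation_xy):
    """Type FD 300D side: correct the pose from the vehicle position detecting
    unit B 301 by the deviation reported from the type ED 100D side."""
    return (detected_xy[0] - deviation_xy[0], detected_xy[1] - deviation_xy[1])

def suspect_own_fault(deviations_by_vehicle, large_error_m=3.0, min_reports=3):
    """Type FD 300D side failure detection: if several communicating vehicles
    independently report a large deviation, a fault in the local vehicle
    position detecting unit B 301 is likely."""
    large = [d for d in deviations_by_vehicle.values()
             if math.hypot(d[0], d[1]) >= large_error_m]
    return len(large) >= min_reports
```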
Also in the configuration of the fourth embodiment, the information processing device (type FD) 300D may communicate with a plurality of vehicles in each of which the information processing device (type ED) 100D is mounted and, as a result, the vehicle position deviation information from a particular vehicle in which the information processing device (type ED) 100D is mounted may be ignored. It is therefore also conceivable to provide the position correcting function (the position correction processing unit A 171) to the information processing device (type ED) 100D as well and to perform position correction regardless of the correction on the information processing device (type FD) 300D side.
As described above, the information processing device (type ED) 100D of the fourth embodiment has, in a manner similar to the foregoing third embodiment, the receiving function (the communication processing unit A 151), the matching function (the reference feature point matching processing unit 161), the position calculating function (the position calculating unit 173), the error detecting function (the position error managing unit 175), the error managing function (the position error managing unit 175), and the like, and further has the transmitting function (the communication processing unit A 151) of transmitting the position error information of the second moving object or the stationary object (the second vehicle 702 or the intersection monitoring camera 811) managed by the error managing function to the second moving object or the stationary object.
Consequently, effects similar to those of the foregoing first, second, and third embodiments are obtained and, in addition, the configuration of the information processing device (type ED) 100D can be simplified and the reliability of the information processing device (type FD) 300D can be improved, among other benefits.
The present invention is not limited to the foregoing embodiments and includes various modifications. For example, the foregoing embodiments have been described in detail to make the present invention easy to understand, and the invention is not necessarily limited to a configuration having all of the components described. A part of the components of a certain embodiment can be replaced with a component of another embodiment, and a component of one embodiment can be added to the configuration of another embodiment. For a part of the configuration of each embodiment, addition of another configuration, deletion, or replacement can be performed.
A part or all of the configurations, functions, processing units, processing means, and the like may be realized by hardware, for example, by designing them as an integrated circuit. The configurations, functions, and the like may also be realized by software, with a processor interpreting and executing a program that realizes each function. Information such as programs, tables, and files realizing each function can be stored in a storage device such as a memory, a hard disk, or an SSD (Solid State Drive), or on a recording medium such as an IC card, an SD card, or a DVD.
The control lines and information lines illustrated are those considered necessary for the description; not all control lines and information lines necessary for a product are necessarily illustrated. In practice, almost all components may be considered to be mutually connected.
Number | Date | Country | Kind
---|---|---|---
2019-052098 | Mar 2019 | JP | national

Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2020/004481 | 2/6/2020 | WO | 00