The present invention relates to a technique for detecting a position of a vehicle.
As background art of the present technical field, Patent Literature 1 recites: “by detecting a current location of a vehicle using dead reckoning navigation to manage current location information about the vehicle, integrating an amount of movement in left and right directions using the dead reckoning navigation, and comparing the amount of movement with a lane width of a road to detect lane movement of the current location information, a current location of the vehicle is detected using the dead reckoning navigation by current location detecting means, and by detecting lane movement by lane movement detecting means to manage the current location information about the vehicle including a lane position by current location information managing means.”
Japanese Patent Laid-Open No. 2006-189325
The technique described in Patent Literature 1, however, has a problem that, since a current position of a vehicle is detected by using an integrated amount of movement of the vehicle, an error between the detected position of the vehicle and an actual position of the vehicle increases when a vehicle traveling distance increases.
The present invention has been made in view of the situation described above, and an object is to calculate a position of a vehicle on a road with a higher accuracy.
In order to achieve the above object, the present invention is an information processing device mounted in a vehicle, characterized by comprising a control portion acquiring photographed image data obtained by photographing an outside of the vehicle, calculating, when object image data that is image data of a predetermined object is included in the photographed image data, a relative position of the vehicle relative to the object on the basis of the object image data, and detecting a position of the vehicle on a road on the basis of the calculated relative position.
Further, the information processing device of the present invention is characterized in that the control portion judges whether the object image data is included in the photographed image data or not on the basis of a result of comparison between stored image data corresponding to the object image data and the photographed image data.
Further, the information processing device of the present invention is characterized by comprising a storage portion storing road information including information showing a position of the object and information showing a relationship between the object and a road; wherein the control portion calculates the position of the vehicle on the road on the basis of the calculated relative position and the road information stored in the storage portion.
Further, the information processing device of the present invention is characterized in that the control portion calculates a right angle direction separation distance that is a separation distance between the vehicle and the object in a direction crossing a traveling direction of the vehicle, as the relative position, and calculates the position of the vehicle on the road on the basis of the calculated right angle direction separation distance and the road information stored in the storage portion.
Further, the information processing device of the present invention is characterized in that the road information includes information about widths of lanes that the road has and information about a separation distance between the object and the road; and the control portion identifies a lane in which the vehicle is traveling on the basis of the calculated right angle direction separation distance and the road information stored in the storage portion.
Further, the information processing device of the present invention is characterized in that the object includes a road sign.
Further, the information processing device of the present invention is characterized by comprising an interface to which a photographing device having a photographing function is connectable; wherein the control portion receives and acquires the photographed image data from the photographing device via the interface.
In order to achieve the above object, a vehicle position detecting method of the present invention is characterized by comprising: acquiring photographed image data obtained by photographing an outside of a vehicle, by a control portion; when object image data that is image data of a predetermined object is included in the photographed image data, calculating a relative position of the vehicle relative to the object on the basis of the object image data, by the control portion; and detecting a position of the vehicle on a road on the basis of the calculated relative position, by the control portion.
Further, the vehicle position detecting method of the present invention is characterized by comprising: storing image data corresponding to the object image data; and judging whether the object image data is included in the photographed image data or not on the basis of a result of comparison between stored image data corresponding to the object image data and the photographed image data.
Further, the vehicle position detecting method of the present invention is characterized by comprising calculating the position of the vehicle on the road on the basis of the calculated relative position and road information including information showing a position of the object and information showing a relationship between the object and the road.
Further, the vehicle position detecting method of the present invention is characterized by comprising: calculating a right angle direction separation distance that is a separation distance between the vehicle and the object in a direction crossing a traveling direction of the vehicle, as the relative position; and calculating the position of the vehicle on the road on the basis of the calculated right angle direction separation distance and the road information stored in the storage portion.
Further, the vehicle position detecting method of the present invention is characterized by comprising: identifying a lane in which the vehicle is traveling on the basis of the calculated right angle direction separation distance and the road information including information about widths of lanes that the road has and information about a separation distance between the object and the road.
Further, in order to achieve the above object, the present invention is an information processing device communicably connected to an in-vehicle device mounted in a vehicle via a network, the information processing device being characterized by comprising a control portion acquiring photographed image data obtained by photographing an outside of the vehicle, from the in-vehicle device, calculating, when object image data that is image data of a predetermined object is included in the photographed image data, a relative position of the vehicle relative to the object on the basis of the object image data, detecting a position of the vehicle on a road on the basis of the calculated relative position and notifying the in-vehicle device of a detection result.
According to the present invention, it is possible to calculate a position of a vehicle on a road with a higher accuracy.
Embodiments of the present invention will be described below with reference to drawings.
The in-vehicle navigation device 1 is a device mounted in a vehicle and is provided with a function of performing own vehicle position detection of detecting a current position of the vehicle, a function of displaying a map and performing map display of displaying the current position of the vehicle on a displayed map, a function of performing route search of searching for a route to a destination, and a function of performing route guidance of displaying a map, displaying a route to a destination on the map and guiding a route to the destination.
Hereinafter, a vehicle mounted with the in-vehicle navigation device 1 will be expressed as an “own vehicle”.
As shown in
The control portion 10 is provided with a CPU, a ROM, a RAM, other peripheral circuits and the like and controls each portion of the in-vehicle navigation device 1. The control portion 10 controls each portion of the in-vehicle navigation device 1 by cooperation between hardware and software, for example, the CPU reading and executing a control program stored in the ROM.
The touch panel 11 is provided with a display panel 111 and a touch sensor 112. The display panel 111 is provided with a display device such as a liquid crystal display panel and an organic EL panel and displays various images in accordance with control of the control portion 10. The touch sensor 112 is arranged being overlapped on the display panel 111, and the touch sensor 112 detects a user's touch operation and outputs a signal indicating the touch operation to the control portion 10. The control portion 10 executes a process corresponding to the user's touch operation on the basis of the signal inputted from the touch sensor 112.
The storage portion 12 is provided with a nonvolatile memory and stores various data.
The storage portion 12 stores map data 121.
The map data 121 includes parcel data. The parcel data is data used in the map display and route guidance described above, and includes depiction data for display of a map such as road depiction data for depiction of shapes of roads, background depiction data for depiction of backgrounds such as landforms, and character string depiction data for depiction of character strings for administrative districts and the like. The road depiction data further includes node information having information about nodes corresponding to connection points in a road network, such as intersections, link information having information about links corresponding to roads formed among nodes, and information required for the route guidance.
Further, the map data 121 includes region data. The region data is data used in the route search described above, and includes information required for the route search, such as the node information having information about nodes corresponding to connection points in a road network, such as intersections, and the link information having information about links corresponding to roads formed among nodes.
Further, the map data 121 includes road information data 1211. The road information data 1211 will be described later.
The GPS unit 13 receives a GPS radio wave from a GPS satellite via a GPS antenna not shown and acquires a current position and a traveling direction of the own vehicle from a GPS signal superimposed on the GPS radio wave by calculation. The GPS unit 13 outputs an acquisition result to the control portion 10.
The relative bearing detecting unit 14 is provided with a gyro sensor and an acceleration sensor. The gyro sensor is configured, for example, with a vibration gyro and detects a relative orientation of the own vehicle (for example, an amount of turning in a yaw axis direction). The acceleration sensor detects acceleration acting on the own vehicle (for example, inclination of the own vehicle relative to the traveling direction). The relative bearing detecting unit 14 outputs detection results of the gyro sensor and the acceleration sensor to the control portion 10.
The vehicle speed acquiring portion 16 is connected, for example, to a sensor for detecting a vehicle speed pulse, and detects the vehicle speed of the own vehicle on the basis of a vehicle speed pulse inputted from the sensor. Further, for example, by communicating with an ECU (Engine Control Unit), the vehicle speed acquiring portion 16 acquires information about the vehicle speed from the ECU to detect the vehicle speed of the own vehicle. The vehicle speed acquiring portion 16 outputs a detection result to the control portion 10. The control portion 10 detects the vehicle speed of the own vehicle on the basis of the input from the vehicle speed acquiring portion 16.
In the case of performing the own vehicle position detection, the control portion 10 estimates a current position of the own vehicle on the basis of the inputs from the GPS unit 13 and the relative bearing detecting unit 14, the state of the own vehicle, such as the vehicle speed of the own vehicle detected on the basis of the input from the vehicle speed acquiring portion 16, and the map data 121, and appropriately corrects the estimated current position by a method to be described later to detect the current position of the own vehicle.
Further, in the case of performing the map display, the control portion 10 displays the detected current position of the own vehicle on a map displayed on the touch panel 11.
Further, in the case of performing the route search, the control portion 10 searches for a route from the detected current position to a destination set by the user on the basis of the map data 121.
Further, in the case of performing the route guidance, the control portion 10 displays the detected current position of the own vehicle on the map while showing the route to the destination on the map to guide the route.
An external device is connected to the interface 15, and the interface 15 communicates with the connected external device in accordance with a predetermined protocol, in accordance with control of the control portion 10. In the present embodiment, an in-vehicle camera 20 (a photographing device) is connected to the interface 15 as the external device.
The in-vehicle camera 20 is a stereo camera having two photographing portions for photographing a forward direction of the own vehicle. Lens mechanisms of the two photographing portions are arranged being separated from each other in a left-right direction, which is a direction orthogonal to a front-back direction of the own vehicle, on the inner side of front glass of the own vehicle. The two photographing portions synchronously execute photographing in a predetermined cycle. The in-vehicle camera 20 generates two pieces of photographed image data on the basis of photographing results of the two photographing portions and outputs the generated two pieces of photographed image data to the control portion 10 via the interface 15.
By the way, as described above, the in-vehicle navigation device 1 according to the present embodiment has the function of performing the own vehicle position detection of detecting a current position of the own vehicle.
As for the own vehicle position detection, there is a need for detecting a current position of an own vehicle on a road where the own vehicle is traveling with as high accuracy as possible. Especially, as for the own vehicle position detection, there is a need for, in a case where a road on which an own vehicle is traveling has a plurality of lanes, detecting in which lane the own vehicle is traveling among the plurality of lanes with as high accuracy as possible. By detecting the lane in which the own vehicle is traveling (hereinafter referred to as a “traveling lane”) with as high accuracy as possible, it is possible to accurately inform the user of the lane in which the own vehicle is traveling at the time of performing the map display and accurately inform the user of change of the lane for smoothly traveling on a retrieved route at the time of performing the route guidance, and, thereby, user convenience is improved. On the basis of the above, the in-vehicle navigation device 1 detects a lane in which the own vehicle is traveling by the following method.
Further,
The process described below using the flowchart of
That is, it is assumed that the travel lane has a plurality of lanes (in the example of
Further, it is assumed that the travel lane linearly extends without bending at least up to the position of the road sign (in the example of
Further, it is assumed that, on the left side of the leftmost lane relative to the traveling direction (in the example of
Further, it is assumed that the position of the road sign (in the example of
In the description below, a direction crossing the traveling direction of the own vehicle will be referred to as a “right angle direction” (in the example of
As shown in
As described above, the in-vehicle camera 20 synchronously photographs the forward direction of the own vehicle in a predetermined cycle by the two photographing portions and outputs the photographed image data based on a result of the photographing to the control portion 10. Therefore, the control portion 10 executes the processing of step SA1 in a cycle corresponding to the cycle of the in-vehicle camera 20 outputting the photographed image data, and executes the pieces of processing at and after step SA2 with execution of the processing of step SA1 (acquisition of the photographed image data) as a trigger.
Next, the control portion 10 analyzes the photographed image data acquired at step SA1, and judges whether object image data, which is image data of an image of a road sign showing maximum speed (speed limit, regulatory speed) (hereinafter, referred to as a “maximum speed road sign”), is included in the photographed image data or not (step SA2). The processing of step SA2 will be described below in detail.
The control portion 10 executes the processing of step SA2 using any one of the two pieces of photographed image data that are synchronously inputted from the in-vehicle camera 20 (the two photographing portions of the stereo camera).
In the present embodiment, the photographed image data is image data in which dots having information about colors (for example, information about color components of each of RGB colors represented by gradation values of a predetermined gradation) are arranged in a form of a dot matrix according to predetermined resolution.
Here, the map data 121 has, for each maximum speed value, image data of the corresponding maximum speed road sign to be used as a template in pattern matching (hereinafter referred to as “template image data”). The template image data corresponds to the “stored image data corresponding to the object image data”. At step SA2, the control portion 10 performs pattern matching using the template image data that the map data 121 has, and judges whether object image data is included in the photographed image data or not.
In the example of
At step SA2, in order to improve accuracy of the judgment about whether object image data is included in photographed image data or not and accuracy of calculation of a sign/vehicle distance to be described later, the control portion 10 may judge that object image data is included in photographed image data when the size of the object image data included in the photographed image data is equal to or larger than a predetermined threshold.
The method for judging whether object image data is included in photographed image data or not is not limited to the method using pattern matching but may be any method.
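For illustration only, the pattern-matching judgment of step SA2 could be sketched as follows in Python; OpenCV, the cv2.matchTemplate call, the threshold values and the function name are assumptions for this sketch rather than part of the embodiment, and in practice the template would also have to be matched at several scales because the apparent size of the sign changes with distance.

```python
# Illustrative sketch only: deciding whether a maximum speed road sign appears in a
# photographed frame by template matching (normalized cross-correlation) with OpenCV.
# The correlation threshold and the minimum matched size are assumed values.
import cv2

MATCH_THRESHOLD = 0.8   # assumed: correlation above which the sign is taken as present
MIN_SIGN_PIXELS = 24    # assumed: templates smaller than this are treated as unreliable

def contains_sign(photo_bgr, template_bgr):
    """Return (found, top_left, (width, height)) for the best match of the template."""
    photo = cv2.cvtColor(photo_bgr, cv2.COLOR_BGR2GRAY)
    tmpl = cv2.cvtColor(template_bgr, cv2.COLOR_BGR2GRAY)
    result = cv2.matchTemplate(photo, tmpl, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    h, w = tmpl.shape
    found = max_val >= MATCH_THRESHOLD and min(h, w) >= MIN_SIGN_PIXELS
    return found, max_loc, (w, h)
```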
If it is judged at step SA2 that object image data is not included in the photographed image data (step SA2: NO), the control portion 10 ends the process.
If it is judged at step SA2 that object image data is included in the photographed image data (step SA2: YES), the control portion 10 recognizes a photographed road sign (the maximum speed road sign) on the basis of the object image data (step SA3).
Specifically, at step SA3, the control portion 10 analyzes the object image data and acquires the type of a road sign corresponding to the object image data. For example, the control portion 10 identifies a character string and a figure included in the road sign corresponding to the object image data. Here, for each type of road sign, the map data 121 has information associating a character string and a figure included in the road sign with the type of the road sign. The control portion 10 acquires the type of the road sign corresponding to the identified character string and figure, on the basis of the information.
The method for identifying the type of a road sign is not limited to the method based on a character string and a figure included in the road sign but may be any method. For example, the method for identifying the type of a road sign may be a method of identifying the type of the road sign by taking into account the shape, color and the like of the road sign.
At the next step SA4, the control portion 10 calculates a separation distance between the road sign corresponding to the object image data and the own vehicle (hereinafter referred to as a “sign/vehicle distance”; in the example of
For example, the control portion 10 calculates the sign/vehicle distance by existing image processing utilizing a difference between positions of the pieces of object image data in the two pieces of photographed image data of the two photographing portions inputted from the in-vehicle camera 20 (a parallax).
The method for calculating the sign/vehicle distance is not limited to the exemplified method but may be any method. For example, the control portion 10 may calculate the sign/vehicle distance by predetermined means based on sizes of the pieces of object image data in the pieces of photographed image data.
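As a minimal sketch of the parallax-based calculation, assuming a calibrated stereo pair with a known focal length (in pixels) and baseline (in metres); the function name and parameters are illustrative and do not appear in the embodiment.

```python
# Illustrative sketch only: sign/vehicle distance from the horizontal parallax between
# the two photographing portions, using the pinhole stereo relation Z = f * B / d.
def sign_vehicle_distance_m(x_left_px, x_right_px, focal_length_px, baseline_m):
    disparity_px = x_left_px - x_right_px   # horizontal shift of the sign between the two images
    if disparity_px <= 0:
        raise ValueError("the sign must appear shifted between the two photographed images")
    return focal_length_px * baseline_m / disparity_px
```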
At the next step SA5, the control portion 10 calculates an angle between a virtual straight line extending in the traveling direction of the own vehicle (in the example of
For example, the control portion 10 calculates the sign/vehicle angle by existing image processing based on the sign/vehicle distance calculated at step SA4, positions of the pieces of object image data in the two pieces of photographed image data of the two photographing portions inputted from the in-vehicle camera 20, and a direction of white lines indicating boundaries among lanes in the pieces of photographed image data.
The method for calculating the sign/vehicle angle is not limited to the exemplified method but may be any method.
At the next step SA6, the control portion 10 calculates a distance between a current position of the own vehicle (in the example of
The control portion 10 calculates the right angle direction separation distance by the following formula M1 on the basis of the sign/vehicle distance calculated at step SA4 and the sign/vehicle angle calculated at step SA5.
Right angle direction separation distance=Sign/vehicle distance·sin(Sign/vehicle angle) (Formula M1):
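Expressed as code, formula M1 is simply the following (a sketch; the sign/vehicle angle is assumed to be given in radians):

```python
import math

def right_angle_direction_separation_m(sign_vehicle_distance_m, sign_vehicle_angle_rad):
    # Formula M1: the component of the sign/vehicle distance at a right angle to the
    # traveling direction of the own vehicle
    return sign_vehicle_distance_m * math.sin(sign_vehicle_angle_rad)
```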
Next, the control portion 10 detects a current position of the own vehicle on the basis of the inputs from the GPS unit 13, the relative bearing detecting unit 14 and the vehicle speed acquiring portion 16 (step SA7).
In the description below, the current position of the own vehicle detected on the basis of the inputs from the GPS unit 13 and the relative bearing detecting unit 14 will be expressed as an “estimated current position”. Since the estimated current position is calculated using the input from the GPS unit 13, an error due to the GPS may occur, and it is not appropriate to detect a traveling lane on the basis of the estimated current position. Further, the estimated current position indicates a current position of the own vehicle by longitude and latitude.
Next, the control portion 10 refers to the road information data 1211 (step SA8).
The road information data 1211 is a database having a record for each of road signs displayed on a map based on the map data 121 (road signs managed in the map data 121).
As shown in
The sign information J1 is information about a road sign and has a sign ID J11 for uniquely identifying the road sign, sign type information J12 showing the type of the road sign, and sign position information J13 showing a position of the road sign (a position indicated by longitude and latitude).
The corresponding road information J2 is information about the road on which the road sign is provided. Note that the road on which the road sign is provided means the one-side roadway on which vehicles are required to travel in conformity with the rule shown by the road sign.
The corresponding road information J2 has a link ID J21 of the road (identification information assigned to each link in the link information of the region data or parcel data described above), number-of-lanes information J22 showing the number of lanes of the road, and road separation information J23 showing a separation distance between the left end of the leftmost lane in the traveling direction among the lanes of the road on which the road sign is provided and the position of the road sign (hereinafter referred to as a “sign/road separation distance”). Further, the corresponding road information J2 has first lane width information J241 to n-th lane width information J24n showing the widths of the n lanes (n is an integer equal to or larger than “1”) that the road has, respectively. In the description below, the n lanes are expressed as a first lane, a second lane, . . . , an n-th lane in order from the leftmost lane in the traveling direction.
Information that each record of the road information data 1211 has corresponds to “road information”.
Further, the road separation information J23 corresponds to “information about a separation distance between an object and a road”.
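For illustration, one record of the road information data 1211 could be modelled as follows; the field names mirror the reference signs of the description, while the concrete types and units are assumptions made for the sketch.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class SignInformation:                 # sign information J1
    sign_id: str                       # sign ID J11
    sign_type: str                     # sign type information J12 (e.g. a maximum speed value)
    position: Tuple[float, float]      # sign position information J13 (longitude, latitude)

@dataclass
class CorrespondingRoadInformation:    # corresponding road information J2
    link_id: str                       # link ID J21
    number_of_lanes: int               # number-of-lanes information J22
    sign_road_separation_m: float      # road separation information J23 (sign/road separation distance)
    lane_widths_m: List[float]         # first lane width J241 ... n-th lane width J24n, from the left

@dataclass
class RoadInformationRecord:           # one record of the road information data 1211
    sign: SignInformation
    road: CorrespondingRoadInformation
```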
At the next step SA9, the control portion 10 identifies a record of a road sign corresponding to the object image data among the records that the road information data 1211 has. The processing of step SA9 will be described below in detail.
At step SA9, the control portion 10 extracts a record in which a position shown by the sign position information J13 of the road information data 1211 and the estimated current position detected at step SA7 are in a predetermined relationship, among the records that the road information data 1211 has.
That the position shown by the sign position information J13 of the road information data 1211 and the estimated current position detected at step SA7 are in a predetermined relationship means that the position shown by the sign position information J13 is within a photographing range of the in-vehicle camera 20 with the estimated current position as a starting point.
When one record is extracted, the control portion 10 identifies the extracted record as the record of the road sign corresponding to the object image data.
On the other hand, a case may occur where a plurality of records are extracted. In this case, the control portion 10 identifies such a record that the type of a road sign shown by the sign type information J12 corresponds to the type of the road sign corresponding to the object image data acquired at step SA3, among the extracted plurality of records, as the record of the road sign corresponding to the object image data.
Here, in general, road signs of the same type are arranged being separated by a predetermined distance or more. Therefore, by identifying a corresponding record by the above method, it is possible to appropriately identify a record of a road sign corresponding to object image data.
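A sketch of the identification at step SA9, building on the record sketch above; in_photographing_range is a hypothetical helper standing in for the check that the sign position lies within the photographing range taken from the estimated current position, and is not a function defined in the embodiment.

```python
def identify_sign_record(records, estimated_position, heading_deg, recognized_sign_type):
    # 1) extract records whose sign position is in the predetermined relationship with
    #    the estimated current position (within the camera's photographing range)
    candidates = [r for r in records
                  if in_photographing_range(r.sign.position, estimated_position, heading_deg)]  # hypothetical helper
    if len(candidates) == 1:
        return candidates[0]
    # 2) if several records remain, keep the one whose sign type matches the type
    #    recognized from the object image data at step SA3
    for r in candidates:
        if r.sign.sign_type == recognized_sign_type:
            return r
    return None
```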
At the next step SA10, the control portion 10 acquires road separation information J23 and first lane width information J241 to n-th lane width information J24n on the basis of the record identified at step SA9.
Next, the control portion 10 identifies a lane in which the own vehicle is traveling (a traveling lane) on the basis of the right angle direction separation distance calculated at step SA6, and the road separation information J23 and the first lane width information J241 to n-th lane width information J24n acquired at step SA10 (step SA11). The processing of step SA11 will be described below in detail.
Here, the lane in which the own vehicle is traveling can be identified by a relationship among the right angle direction separation distance, the sign/road separation distance and widths of the lanes that the road has.
That is, the right angle direction separation distance, the sign/road separation distance and the widths of the first lane to n-th lane that the road (the travel lane) has are in the following relationship: “Sign/road separation distance+Width of first lane+ . . . +Width of (m−1)th lane<Right angle direction separation distance<Sign/road separation distance+Width of first lane+ . . . +Width of m-th lane” (m is an integer equal to or larger than “1”). In this case, the lane in which the own vehicle is traveling (the traveling lane) is the m-th lane.
For example, in the case of
On the basis of the above, at step SA11, the control portion 10 identifies the lane in which the own vehicle is traveling (the traveling lane), on the basis of the relationship among the right angle direction separation distance, the sign/road separation distance and the width of each lane that the road has.
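A sketch of the comparison performed at step SA11, assuming all quantities are expressed in the same unit (e.g. metres):

```python
def identify_traveling_lane(right_angle_separation_m, sign_road_separation_m, lane_widths_m):
    """Return the 1-based lane number (first lane = leftmost in the traveling direction),
    or None if the separation distance does not fall inside any lane."""
    lower = sign_road_separation_m
    for m, width in enumerate(lane_widths_m, start=1):
        upper = lower + width
        # "sign/road separation + widths of first..(m-1)th < right angle separation
        #  < sign/road separation + widths of first..m-th"  ->  the m-th lane
        if lower < right_angle_separation_m < upper:
            return m
        lower = upper
    return None
```

For instance, with a sign/road separation distance of 1.0 m and two 3.5 m lanes, a right angle direction separation distance of 5.8 m falls between 4.5 m and 8.0 m, so the second lane would be identified (the values are chosen only for illustration).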
The operation of the in-vehicle navigation device 1 at the time of detecting (identifying) the lane in which the own vehicle is traveling has been described above.
Here, in the above description, the control portion 10 calculates a sign/vehicle angle, a sign/vehicle distance and a right angle direction separation distance.
If the sign/vehicle angle and the sign/vehicle distance are decided, a position of the own vehicle relative to a road sign is decided. Therefore, the sign/vehicle angle and the sign/vehicle distance correspond to a “relative position of an own vehicle (a vehicle) relative to a road sign (an object)”.
Similarly, if the right angle direction separation distance is decided, a position of the own vehicle in a right angle direction relative to the road sign is decided. Therefore, the right angle direction separation distance corresponds to the “relative position of an own vehicle (a vehicle) relative to a road sign (an object)”.
Further, in the embodiment described above, the control portion 10 detects a lane in which the own vehicle is traveling, using the calculated sign/vehicle angle, sign/vehicle distance and right angle direction separation distance. These relative positions, however, can be used in other methods at the time of detecting the own vehicle.
For example, the control portion 10 can detect a relative position of the own vehicle relative to a road sign based on the sign/vehicle angle and the sign/vehicle distance. Therefore, the control portion 10 can detect a position of the own vehicle on a map by acquiring a position of a road sign on the map. Then, for example, by correcting an estimated current position detected from an input from the GPS unit 13 or the like by the position of the own vehicle on the map detected on the basis of the sign/vehicle angle and sign/vehicle distance, the position of the own vehicle can be detected with a higher accuracy.
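As a sketch of that correction idea under added assumptions: positions are treated as planar map coordinates in metres, the heading of the traveling direction is taken as known (for example from the relative bearing detecting unit), and the road sign is assumed to lie to the left of the traveling direction; none of these specifics come from the embodiment itself.

```python
import math

def vehicle_position_from_sign(sign_xy, heading_rad, sign_vehicle_distance_m,
                               sign_vehicle_angle_rad):
    # direction from the own vehicle towards the sign in the map frame
    # (sign assumed on the left, so the sign/vehicle angle is added to the heading)
    bearing = heading_rad + sign_vehicle_angle_rad
    dx = sign_vehicle_distance_m * math.cos(bearing)
    dy = sign_vehicle_distance_m * math.sin(bearing)
    # step back from the sign's map position along that direction
    return (sign_xy[0] - dx, sign_xy[1] - dy)
```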
Further, in a self-driving system (including not only a complete self-driving system but also a system supporting self-driving in a predetermined case), a position of the own vehicle is required to be detected with a high accuracy, and it is possible to detect the position of the own vehicle with a higher accuracy by using the calculated sign/vehicle angle, sign/vehicle distance and right angle direction separation distance at the time of detecting the position of the own vehicle.
As described above, the in-vehicle navigation device 1 (the information processing device) according to the present embodiment is provided with the control portion 10 that acquires photographed image data obtained by photographing an outside of the own vehicle (the vehicle), and, when object image data, which is image data of a road sign (a predetermined object), is included in the photographed image data, calculates a relative position of the vehicle relative to the road sign (a combination of a sign/vehicle angle and a sign/vehicle distance, or a right angle direction separation distance) on the basis of the object image data, and detects a position of the vehicle on a road on the basis of the calculated relative position.
According to this configuration, a relative position of the own vehicle relative to a road sign is calculated on the basis of object image data included in photographed image data, and a position of the own vehicle on the road is detected on the basis of the calculated relative position. Therefore, for example, in comparison with the case of detecting a current position of a vehicle using an integrated amount of movement of the vehicle, an error of position detection accompanying increase in a traveling distance of the vehicle does not occur, and it is possible to calculate a position of the vehicle on a road with a high accuracy.
Further, in the present embodiment, the in-vehicle navigation device 1 is provided with the storage portion 12 that stores the road information data 1211 having road information including information showing positions of road signs and information showing relationships between the road signs and roads.
The control portion 10 calculates a position of the own vehicle on a road on the basis of the calculated relative position (the combination of the sign/vehicle angle and the sign/vehicle distance, or the right angle direction separation distance) and the road information data 1211 stored in the storage portion 12.
According to this configuration, the control portion 10 can detect a position of the own vehicle on a road with a high accuracy on the basis of a calculated relative position using the road information that the road information data 1211 has.
Further, in the present embodiment, the control portion 10 calculates a right angle direction separation distance, which is a separation distance between the own vehicle and a road sign in the right angle direction (a direction crossing a traveling direction of the own vehicle) as the relative position, and calculates a position of the own vehicle on a road on the basis of the calculated right angle direction separation distance and the road information data 1211.
According to this configuration, the control portion 10 detects a position of the own vehicle on a road with a high accuracy on the basis of a calculated right angle direction separation distance using the road information that the road information data 1211 has.
Further, in the present embodiment, the road information of the road information data 1211 includes the first lane width information J241 to the n-th lane width information J24n (information about widths of lanes a road has) and the road separation information J23 (information about a separation distance between an object and a road).
The control portion 10 identifies a lane in which the own vehicle is traveling on the basis of the calculated right angle direction separation distance and the road information data 1211.
According to this configuration, a position of the own vehicle on a road is detected with a high accuracy on the basis of the calculated right angle direction separation distance using the road information that the road information data 1211 has.
Next, a second embodiment will be described.
In the description below, the same components as the components described in the first embodiment will be given the same reference numerals, and detailed description thereof will be omitted.
Further, in the second embodiment, as for a shape of a travel lane and a relationship among the travel lane, a current position of the own vehicle and a position of a road sign, they are assumed to be similar to those according to the first embodiment.
As for the pieces of processing of steps SA4 to SA6 among the pieces of processing described using the flowchart of
As shown in
In the example of
Next, the control portion 10 monitors whether or not the own vehicle has traveled a predetermined distance or more after the timing of executing the processing of step SB1 (step SB2). The detection of step SB2 about whether the own vehicle has traveled a predetermined distance or more does not have to be strict detection. For example, in a situation that there is a strong possibility that the own vehicle has traveled the predetermined distance or more, from a relationship between vehicle speed and traveling time, a judgment that the own vehicle has traveled the predetermined distance or more may be made.
If the own vehicle has traveled the predetermined distance or more after the timing of executing the processing of step SB1 (step SB2: YES), the control portion 10 calculates a second sign/vehicle angle based on a current position of the own vehicle at that time point (hereinafter referred to as a “second current position”; in the example of
The second sign/vehicle angle is an angle between a virtual straight line extending in a traveling direction of the own vehicle (in the example of
At step SB3, the control portion 10 calculates the second sign/vehicle angle in a method similar to the method for calculating a sign/vehicle angle described in the first embodiment.
Next, the control portion 10 calculates a distance between the position of the own vehicle at the timing of executing the processing of step SB1 (in the example of
At step SB4, for example, the control portion 10 detects an estimated current position of the own vehicle at the timing of executing the processing of step SB1 and an estimated current position of the own vehicle at the time of executing the processing of step SB3 on the basis of inputs from the GPS unit 13 and the relative bearing detecting unit 14, and appropriately performs correction on which the situation of vehicle speed during traveling and the like are reflected to calculate a vehicle traveling distance. Further, for example, the control portion 10 calculates the vehicle traveling distance on the basis of an aspect of a change between an image of a predetermined object (which may be a road sign) in photographed image data based on photographing performed by the in-vehicle camera 20 at the timing of executing the processing of step SB1 and an image of the predetermined object in photographed image data based on photographing performed by the in-vehicle camera 20 at the timing of executing the processing of step SB3.
The method for calculating the vehicle traveling distance is not limited to the exemplified method but may be any method.
Next, the control portion 10 calculates a right angle direction separation distance (in the example of
Here, when a distance between the second current position (in the example of
tan(Sign/vehicle angle)=Right angle direction separation distance/(Vehicle traveling distance+Corresponding distance) (Formula M2):
tan(Second sign/vehicle angle)=Right angle direction separation distance/Corresponding distance (Formula M3):
Therefore, the right angle direction separation distance can be calculated by the following formula M4:
Right angle direction separation distance=(Vehicle traveling distance·tan(Sign/vehicle angle)·tan(Second sign/vehicle angle))/(tan(Second sign/vehicle angle)−tan(Sign/vehicle angle)) (Formula M4):
On the basis of the above, the following formulas are satisfied in the case of the example of
tan θ1=Right angle direction separation distance C/(Vehicle traveling distance E+Corresponding distance x) (Formula M2′):
tan θ2=Right angle direction separation distance C/Corresponding distance x (Formula M3′):
The right angle direction separation distance C can be calculated by the following formula M4′:
Right angle direction separation distance C=(Vehicle traveling distance E·tan θ1·tan θ2)/(tan θ2−tan θ1) (Formula M4′):
At step SB5, the control portion 10 calculates the right angle direction separation distance using the formula M4 described above.
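Formula M4 written out as code (a sketch; the angles are assumed to be in radians, and the second sign/vehicle angle must be larger than the first, otherwise the denominator vanishes):

```python
import math

def right_angle_separation_from_two_bearings_m(vehicle_traveling_distance_m,
                                               sign_vehicle_angle_rad,
                                               second_sign_vehicle_angle_rad):
    t1 = math.tan(sign_vehicle_angle_rad)          # from formula M2
    t2 = math.tan(second_sign_vehicle_angle_rad)   # from formula M3
    # Formula M4: C = E * tan(theta1) * tan(theta2) / (tan(theta2) - tan(theta1))
    return vehicle_traveling_distance_m * t1 * t2 / (t2 - t1)
```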
The operation performed at the time of the control portion 10 of the in-vehicle navigation device 1 according to the present embodiment detecting a position of the own vehicle (a lane in which the own vehicle is traveling) has been described above. By performing the process described in the present embodiment, it is possible to detect a position of the own vehicle with a higher accuracy similarly to the first embodiment.
Though calculation of angles with a current position of the own vehicle as a vertex (the sign/vehicle angle and the second sign/vehicle angle) is performed twice in the present embodiment, a configuration is also possible in which the calculation is executed three or more times according to travel of the own vehicle, and a relative position of the own vehicle relative to a road sign (the right angle direction separation distance) is calculated in a method corresponding to the method described above on the basis of each of the calculated angles. According to this configuration, it is possible to calculate the relative position with a higher accuracy.
Further, in the embodiment described above, the in-vehicle camera 20 photographs a forward direction of the own vehicle, and the control portion 10 calculates a relative position of the own vehicle relative to a road sign on the basis of photographed image data based on a result of the photographing of the forward direction of the own vehicle. On the other hand, if the in-vehicle camera 20 is provided at a position capable of photographing a side direction or backward direction of the own vehicle, the control portion 10 can calculate the relative position of the own vehicle relative to a road sign on the basis of photographed image data based on a result of the photographing of the side direction or backward direction of the own vehicle in the method described above.
Next, a third embodiment will be described.
In the description below, the same components as the components described in the first embodiment will be given the same reference numerals, and detailed description of the components will be omitted.
In the first and second embodiments described above, it is assumed that a road (a travel lane) does not bend at least from a current position of the own vehicle to a road sign. On the other hand, in the present embodiment, operation of the in-vehicle navigation device 1 when a road (a travel lane) from a current position of the own vehicle to a road sign bends will be described.
The in-vehicle navigation device 1 executes the process of the flowchart shown in
If judging that the road bends between the current position of the own vehicle and the road sign, the control portion 10 executes the process of the flowchart of
It is assumed that, at the starting point of the flowchart of
As shown in
Next, the control portion 10 refers to the road information data 1211 to identify a record of a road sign corresponding to the object image data in a method similar to the method described in the first embodiment, and acquires sign position information J13 that the identified record has (step SC2). As described above, the sign position information J13 is information showing the position of the road sign (a position indicated by longitude and latitude; it may also be coordinates in a predetermined coordinate system on which a map based on the map data 121 is developed).
Next, the control portion 10 calculates a current position of the own vehicle (in the example of
By the sign/vehicle distance and the sign/vehicle angle being decided, a relative position of the own vehicle relative to the road sign is decided. Therefore, by the position of the road sign being decided, the current position of the own vehicle is decided.
Next, the control portion 10 refers to the map data 121 to acquire information about a center line of the road (the travel lane) on which the own vehicle is traveling (hereinafter referred to as “center line information”) (step SC4).
In the present embodiment, a center line of a road refers to a line following, in the right angle direction, the center of the overall road width including the travel lanes in opposite traveling directions, and is the center line TS in the example of
At step SC4, the control portion 10 acquires unit straight line information about a unit straight line positioned in a side direction of the position of the own vehicle (in the example of
Next, the control portion 10 calculates, in a case of drawing a perpendicular line down from the current position of the own vehicle to the unit straight line shown by the unit straight line information acquired at step SC4, a length between the current position of the own vehicle and an intersection point between the perpendicular line and the unit straight line (in the example of
Next, the control portion 10 refers to the road information data 1211 to acquire first lane width information J241 to n-th lane width information J24n about the road on which the own vehicle is traveling (step SC6).
Next, the control portion 10 identifies a lane in which the own vehicle is traveling on the basis of the center line information (the unit straight line information) acquired at step SC4, the length of the perpendicular line calculated at step SC5 and the first lane width information J241 to the n-th lane width information J24n acquired at step SC6 (step SC7).
Here, on a road, lanes are provided side by side in a left direction relative to a traveling direction from a center line. Therefore, if a width of each lane provided on the road and a distance between the center line and a current position of the own vehicle are decided, a lane in which the own vehicle is positioned is decided.
On the basis of the above, at step SC7, the control portion 10 calculates a position of the intersection point between the perpendicular line and the center line on the map, and identifies the lane in which the own vehicle is traveling on the basis of a relationship among the position, the length of the perpendicular line and a width of the lane.
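A sketch of steps SC5 and SC7 under added assumptions: positions are planar (x, y) map coordinates in metres, the unit straight line is given by its two end points, lanes lie to the left of the center line as described above so that the n-th lane is nearest the center line, and the function names are illustrative only.

```python
import math

def perpendicular_length_m(p, a, b):
    """Length of the perpendicular dropped from the current position p onto the
    unit straight line through points a and b (all planar (x, y) coordinates)."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    foot = (ax + t * dx, ay + t * dy)      # intersection point of the perpendicular and the line
    return math.hypot(px - foot[0], py - foot[1])

def lane_from_center_line(distance_from_center_m, lane_widths_m):
    """lane_widths_m is ordered first lane (leftmost) .. n-th lane (nearest the center
    line); returns the 1-based lane number counted from the left, or None."""
    n = len(lane_widths_m)
    lower = 0.0
    # walk outwards from the center line, i.e. the n-th lane first
    for k, width in enumerate(reversed(lane_widths_m), start=1):
        upper = lower + width
        if lower < distance_from_center_m < upper:
            return n - k + 1
        lower = upper
    return None
```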
The operation performed at the time of the control portion 10 of the in-vehicle navigation device 1 according to the present embodiment detecting a position of the own vehicle (a lane in which the own vehicle is traveling) has been described above. By performing the process described in the present embodiment, it is possible to detect a position of the own vehicle with a higher accuracy similarly to the first and second embodiments.
Next, a fourth embodiment will be described.
In the description below, the same components as the components described in the first embodiment will be given the same reference numerals, and detailed description thereof will be omitted.
In the first to third embodiments described above, a device mounted in a vehicle executes a process for detecting a current location of the own vehicle. On the other hand, in the present embodiment, a control server 3 communicable with the device mounted in a vehicle via a network N executes the process.
In the present embodiment, the control server 3 functions as an “information processing device”.
As shown in
The in-vehicle device 1b is communicably connected to the control server 3 via the network N that is configured including the Internet. A configuration is also possible in which the in-vehicle device 1b is provided with a function of accessing the network N, and the in-vehicle device 1b directly accesses the network N. A configuration is also possible in which the in-vehicle device 1b and a terminal having a function of accessing the network N (for example, a mobile phone that a person in the vehicle possesses) are connected via near-field wireless communication or wired communication, or other communication systems, and the in-vehicle device 1b accesses the network N via the terminal.
The in-vehicle device 1b has a function of transmitting photographed image data inputted from the in-vehicle camera 20 to the control server 3 via the network N.
The control server 3 is provided with a server control portion 6 that is provided with a CPU, a ROM, a RAM, other peripheral circuits and the like and controls each portion of the control server 3 by cooperation between hardware and software, for example, by reading and executing a program.
The server control portion 6 functions as a “control portion”.
The server control portion 6 receives photographed image data from the in-vehicle device 1b, performs the processes corresponding to the flowchart of
The in-vehicle device 1b executes a corresponding process on the basis of the notification from the control server 3.
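Purely as an illustration of this division of roles, assuming an HTTP interface between the in-vehicle device 1b and the control server 3; Flask, the route name, the payload layout and run_position_detection are hypothetical choices for the sketch and are not part of the embodiment.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/vehicle-position", methods=["POST"])
def detect_vehicle_position():
    photographed_image_data = request.get_data()    # photographed image data sent by the in-vehicle device 1b
    # server-side execution of the same processing as the in-vehicle flowcharts:
    # sign recognition, relative position calculation and lane identification
    result = run_position_detection(photographed_image_data)   # hypothetical helper
    return jsonify(result)                          # notification of the detection result
```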
The fourth embodiment has been described above. Even in the configuration of the fourth embodiment, the in-vehicle device 1b mounted in a vehicle can acquire a relative position of the own vehicle relative to a road sign and execute a corresponding process on the basis of the acquired relative position.
The embodiments described above merely show aspects of the present invention and can be arbitrarily modified and applied within a scope of the present invention.
For example, in the embodiments described above, the in-vehicle navigation device 1 and the control server 3 detect a relative position of the own vehicle relative to a road sign as an object. The object, however, is not limited to a road sign but may be anything that can be photographed by the in-vehicle camera 20. For example, the object may be a traffic signal, a building, a signboard or the like. However, a road sign is appropriate as an object because a road sign has a characteristic that the position where it is provided is restricted to some extent by its relationship with a road, a characteristic of being managed with the map data 121, and a characteristic that its types are limited and there is a shape standard for each of the types.
Further,
Further, the processing units of the flowcharts described using the drawings are obtained by division according to the main pieces of processing content to cause the processes of the in-vehicle navigation device 1 and the control server 3 to be easily understood. The invention of the present application is not restricted by the way of division of the processing units or by the names of the processing units. The process of each device can be divided into more processing units according to processing content. Further, one processing unit can be divided so as to include more pieces of processing. Further, the processing orders of the above flowcharts are not limited to the shown examples as long as similar processing can be performed.
Further, though the in-vehicle navigation device 1 is configured to acquire photographed image data from the in-vehicle camera 20 which is an external device in the embodiments described above, a configuration is also possible in which the in-vehicle navigation device 1 has a photographing function.
Priority application: Japanese Patent Application No. 2015-056157, filed in Japan in March 2015 (national). Filing document: PCT/JP2016/058501, filed Mar. 17, 2016 (WO).