ROAD SHAPE ESTIMATING DEVICE, ROAD SHAPE ESTIMATING METHOD AND PROGRAM

Information

  • Publication Number
    20090248299
  • Date Filed
    March 26, 2009
  • Date Published
    October 01, 2009
Abstract
A road shape estimating device has a data obtaining processing unit for obtaining interpolation point data, a radius calculation processing unit for calculating a radius of curvature at each of shape interpolation points based on the interpolation point data, a corner detection processing unit for detecting a corner in a predetermined section, based on the radius of curvature at each of the shape interpolation points, and for setting a candidate start point and a candidate end point for the detected corner, and a corner connection processing unit for determining whether or not a predetermined connecting condition is met based on candidate start points and candidate end points of detected corners, and connecting, when the predetermined connecting condition is met, a predetermined corner among the plurality of corners and a corner adjacent the predetermined corner.
Description
INCORPORATION BY REFERENCE

The disclosure of Japanese Patent Application No. 2008-086520 filed on Mar. 28, 2008 including the specification, drawings and abstract is incorporated herein by reference in its entirety.


BACKGROUND OF THE INVENTION

The present invention relates to a road shape estimating device, a road shape estimating method and a program.


Description of the Related Art

Conventionally in a navigation device, the actual position, namely, the current position of a vehicle is detected, for example, by a global positioning system (GPS), map data are read from a data recording unit, a map screen is formed on a display unit, and a map or the like showing the vehicle position, which represents the current position, and the area surrounding the vehicle position (vicinity) is displayed on the map screen. Therefore, a driver can drive the vehicle according to the vehicle position and the like displayed on the map screen.


Further, when the driver inputs a destination and sets a search condition, route search processing is performed based on the search condition, and a route from the place of departure represented by the current position to the destination is searched for according to the map data. Then the route that has been located by search (“searched route”) is displayed together with the vehicle position on the map screen, and guidance on the searched route, namely, route guidance is performed. Therefore, the driver can drive the vehicle along the displayed searched route.


Incidentally, there is a vehicle control system that changes shift speed of the automatic transmission or output of the engine using information obtained by the navigation device, so as to control travel of the vehicle.


A road shape estimating device is provided in the vehicle control system, and a road shape is estimated with the road shape estimating device. The road shape estimating device reads road data from a database provided in the data recording unit of the navigation device so as to obtain interpolation point data regarding interpolation points set in the database for representing a road shape by a plurality of points, namely, shape interpolation points. The road shape estimating device then calculates the radius of curvature (hereinafter, simply referred to as "radius") of the road between shape interpolation points by means of a three-point calculation method, detects and sets a corner in the database based on each radius, and sets a candidate start point representing a start point of the corner and a candidate end point representing an end point of the corner. Further, the road shape estimating device corrects the positions of the candidate start point and the candidate end point based on a clothoid curve expressed by an approximate expression of the corner, estimates the road shape, and records the corrected positions of the candidate start point and the candidate end point as data representing the road shape in the recording unit (refer to, for example, Japanese Patent Application Publication No. JP-A-2005-214839).


SUMMARY OF THE INVENTION

In the conventional road shape estimating device, however, depending on the manner of setting shape interpolation points in the database, it is possible that a corner that should be regarded as one corner on the actual road is detected as a plurality of corners.


Consequently, the road shape cannot be estimated precisely.


It is an object of the present invention to solve the problem of the conventional road shape estimating device, and to provide a road shape estimating device, a road shape estimating method and a program which are capable of estimating a road shape precisely.


To achieve this object, a road shape estimating device according to the present invention has a data obtaining processing unit for obtaining interpolation point data of a plurality of shape interpolation points which are set along a road and represent a shape of the road, a radius calculation processing unit for calculating a radius of curvature at each of the shape interpolation points based on the interpolation point data in a predetermined section of the road, a corner detection processing unit for detecting a corner in the section based on the radius of curvature at each of the shape interpolation points and for setting a candidate start point and a candidate end point of the detected corner, and a corner connection processing unit for determining, when plural corners are detected in the section, whether or not a predetermined connecting condition is met based on the candidate start points and the candidate end points of the detected corners, and for connecting, when the predetermined connecting condition is met, a predetermined corner among the plurality of corners and a corner adjacent to the predetermined corner.


According to the present invention, based on the radius of curvature at each of the shape interpolation points in a predetermined section, a corner in the section is detected, and a candidate start point and a candidate end point of the detected corner are set. When a plurality of corners are detected in the section, whether or not a predetermined connecting condition is met is determined based on the candidate start points and the candidate end points of the detected corners, and when the predetermined connecting condition is met, a predetermined corner among the plurality of corners and a corner adjacent to the predetermined corner are connected. Thus, a corner that should be regarded as one corner on the actual road will not be regarded as a plurality of corners depending on the manner in which the shape interpolation points are set in the database. Therefore, the road shape can be estimated precisely.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1A is a block diagram of a vehicle control system of an embodiment of the present invention; and FIG. 1B is a functional block diagram of the CPU of FIG. 1A;



FIG. 2 is a main flowchart showing an operation of a road shape estimation processing unit in the embodiment of the present invention;



FIG. 3 is a flowchart showing a subroutine of radius calculation processing in the embodiment of the present invention;



FIG. 4 is a diagram explaining the principle of calculating a radius in the embodiment of the present invention;



FIG. 5 is a first explanatory diagram of corner detection processing in the embodiment of the present invention;



FIG. 6 is a second explanatory diagram of the corner detection processing in the embodiment of the present invention;



FIG. 7 is a third explanatory diagram of the corner detection processing in the embodiment of the present invention;



FIG. 8 is a fourth explanatory diagram of the corner detection processing in the embodiment of the present invention;



FIG. 9 is a fifth explanatory diagram of the corner detection processing in the embodiment of the present invention;



FIG. 10 is a sixth explanatory diagram of the corner detection processing in the embodiment of the present invention;



FIG. 11 is a seventh explanatory diagram of the corner detection processing in the embodiment of the present invention;



FIG. 12 is an eighth explanatory diagram of the corner detection processing in the embodiment of the present invention;



FIG. 13 is a ninth explanatory diagram of the corner detection processing in the embodiment of the present invention;



FIG. 14 is a tenth explanatory diagram of the corner detection processing in the embodiment of the present invention;



FIG. 15 is a first diagram showing a subroutine of corner connection processing in the embodiment of the present invention;



FIG. 16 is a second diagram showing the subroutine of the corner connection processing in the embodiment of the present invention;



FIG. 17 is a first explanatory diagram of the corner connection processing in the embodiment of the present invention;



FIG. 18 is a second explanatory diagram of the corner connection processing in the embodiment of the present invention;



FIG. 19 is a third explanatory diagram of the corner connection processing in the embodiment of the present invention;



FIG. 20 is a fourth explanatory diagram of the corner connection processing in the embodiment of the present invention;



FIG. 21 is a fifth explanatory diagram of the corner connection processing in the embodiment of the present invention;



FIG. 22 is a sixth explanatory diagram of the corner connection processing in the embodiment of the present invention;



FIG. 23 is a seventh explanatory diagram of the corner connection processing in the embodiment of the present invention;



FIG. 24 is an eighth explanatory diagram of the corner connection processing in the embodiment of the present invention;



FIG. 25 is a table showing a clothoid coefficient map in the embodiment of the present invention;



FIG. 26 is a flowchart showing a subroutine of clothoid curve calculation processing in the embodiment of the present invention; and



FIG. 27 is an explanatory diagram of fitting processing in the embodiment of the present invention.





DESCRIPTION OF THE PREFERRED EMBODIMENTS

Hereinafter, a preferred embodiment of the present invention will be explained with reference to the drawings.



FIG. 1A is a block diagram of a vehicle control system in the embodiment of the present invention, and FIG. 2 is a main flowchart of operation of a road shape estimation processing unit in the embodiment of the present invention.


In FIG. 1A, numeral 14 denotes an information terminal, for example, a navigation device as an on-vehicle device mounted in a vehicle, and the navigation device 14 includes a GPS sensor 15 as a current position detecting unit which detects the current position of the vehicle as a vehicle position and the direction of the vehicle as a vehicle direction, a data recording unit 16 as an information recording unit in which various information besides map data (not shown) is recorded, a navigation processing unit 17 performing various calculation processing such as navigation processing, an operation unit 34 as a first input unit with which a driver who is an operator performs a predetermined input by operating it, a display unit 35 as a first output unit for performing various types of display with images displayed on a screen (not shown) for informing the driver, an audio input unit 36 as a second input portion with which the driver performs a predetermined input by voice, an audio output unit 37 as a second output unit for performing audio output to inform the driver of various information, and a communication unit 38 as a transmission/reception unit functioning as a communication terminal. The GPS sensor 15, data recording unit 16, operation unit 34, display unit 35, audio input unit 36, audio output unit 37 and communication unit 38 are connected to the navigation processing unit 17. Further, a vehicle speed sensor 44 or the like as a vehicle speed detecting unit which detects a vehicle speed is connected to the navigation processing unit 17. The GPS sensor 15 detects time besides the vehicle position and the vehicle direction. Note that it is also possible to provide a direction sensor (not shown) independently from the GPS sensor 15 for detecting the vehicle direction with the direction sensor.


In the data recording unit 16, a database constituted of a map data file is provided, and map data are recorded in the database. The map data include road data regarding roads connecting intersections (including branch points), node data regarding nodes representing end points (start points and end points) of the roads, intersection data regarding the intersections, search data processed for searching, facility data regarding facilities, and the like, as well as feature data regarding features on the road. Further, the road data include data representing road links from start points to end points of the roads on the database, and data of shape interpolation points (hereinafter referred to as “interpolation point data”) as a plurality of set points which are set along roads for representing road shapes on the road links. The interpolation point data include the numbers, coordinates, and the like of the shape interpolation points. Note that each node represents a start point or end point on a road link, and hence is also a shape interpolation point.
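To make the relationship between road links and shape interpolation points concrete, the following is a minimal sketch of how such records might be held in memory; the class and field names are illustrative assumptions, not the actual structure of the map data file.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ShapeInterpolationPoint:
    # Sequential number of the point along the road link (a node is also a
    # shape interpolation point, so the first and last entries are nodes).
    number: int
    x: float  # coordinate (e.g. a planar easting); the coordinate system is an assumption
    y: float  # coordinate (e.g. a planar northing)

@dataclass
class RoadLink:
    start_node: int                        # node number at the start point of the link
    end_node: int                          # node number at the end point of the link
    points: List[ShapeInterpolationPoint]  # set along the road to represent its shape
```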


Furthermore, in the data recording unit 16, a database constituted of a statistical data file, a database constituted of a travel history data file, and the like are provided. Statistical data are recorded in the statistical data file, and travel history data are recorded in the travel history data file, both as actual data.


The data recording unit 16 includes a disk (not shown) such as a hard drive, CD, DVD or optical disk for recording the various aforementioned data, and further includes a head (not shown) such as a reading/writing head for reading or writing various data. Further, a memory card or the like can be used for the data recording unit 16.


The navigation processing unit 17 has a CPU 31 as an arithmetic unit serving as a control device for overall control of the navigation device 14, a RAM 32 used as a working memory when the CPU 31 performs various arithmetic processing, a ROM 33 in which various programs for performing a search for a route to a destination, route guidance, and the like, as well as a control program, are recorded, a flash memory (not shown) used for recording various data and programs, and the like.


As the operation unit 34, a keyboard (not shown), mouse, and/or the like provided independently from the display unit 35 can be used. Further, as the operation unit 34, it is possible to use a touch panel which allows a predetermined input operation by touching or clicking image operation units such as various keys, switches, buttons, and the like which are displayed as images on a screen formed on the display unit 35.


The display unit 35 displays, on various screens, the vehicle position, the vehicle direction, and the like, as well as a map, a searched route, guidance information along the searched route, traffic information, the distance to the next intersection on the searched route, and the traveling direction to be taken at the next intersection.


Further, the audio input unit 36 includes a microphone (not shown) or the like, and enables input of necessary information by voice. Furthermore, the audio output unit 37 includes an audio synthesis device (not shown) and a speaker (not shown), and provides route guidance along the searched route by audio output.


The communication unit 38 includes a beacon receiver (not shown) for receiving various information, such as current traffic information and general information, transmitted from a vehicle information and communication system center, an FM receiver (not shown) for receiving the various information as an FM multiplex broadcast via an FM broadcast station, and so on. Further, the communication unit 38 is able to receive data such as map data, statistical data, and travel history data, in addition to traffic information and general information, via a network (not shown) from an information center (not shown).


Note that the navigation processing unit 17, CPU 31, and so on function as a computer, independently or in combinations of two or more, and perform arithmetic processing based on various programs, data, and the like. Further, the data recording unit 16, RAM 32, ROM 33, flash memory, and so on form a storage device and a recording medium. As the arithmetic unit, an MPU or the like can be used instead of the CPU 31.


Note that numeral 10 denotes an automatic transmission, numeral 11 denotes an automatic transmission control device, numeral 51 denotes an engine control unit, and numeral 52 denotes an engine.


Next, basic operation of the navigation device 14 with the above structure will be explained.


First, when the operation unit 34 is operated by the driver and the navigation device 14 is started, a current position reading processing unit (not shown) of the CPU 31 performs current position reading processing so as to read the vehicle position and the vehicle direction detected by the GPS sensor 15. Then a matching processing unit (not shown) of the CPU 31 performs matching processing and identifies the vehicle position by determining on which of the road links the vehicle is located, based on the trace of the read vehicle position and the shapes, arrangements, and the like of the road links forming the roads in the surrounding area of the vehicle position.


Subsequently, a basic information obtaining processing unit (not shown) of the CPU 31 performs basic information obtaining processing to read and obtain the map data from the data recording unit 16. Note that the map data can be obtained from the information center or the like, and in this case, the basic information obtaining processing unit downloads the received map data into the flash memory.


A display processing unit (not shown) of the CPU 31 performs display processing to form various screens on the display unit 35. For example, a map display processing unit of the display processing unit performs map display processing to form a map screen on the display unit 35 so as to display a surrounding map and further display the vehicle position and the vehicle direction on the map screen.


Therefore, the driver can drive the vehicle according to the surrounding map, the vehicle position and the vehicle direction.


Further, when the driver inputs a destination through the operation unit 34, a destination setting processing unit (not shown) of the CPU 31 performs destination setting processing to set the destination. Note that a place of departure can be input and set as necessary. Further, it is possible to register a predetermined point in advance, and set the registered point as a destination. Subsequently, when the driver inputs a search condition by operating the operation unit 34, a search condition setting processing unit (not shown) of the CPU 31 performs search condition setting processing to set the search condition.


When the destination and the search condition are set in this manner, a route search processing unit (not shown) of the CPU 31 performs route search processing to read the vehicle position, vehicle direction, destination, search condition and the like, to read the search data and the like from the data recording unit 16, to search for a route from a place of departure represented by the vehicle position to the destination under the search condition based on the vehicle position, vehicle direction, destination, search data, and the like, and to output route data representing a searched route. At this time, in the route search processing, the route having the smallest sum of the link costs respectively associated with the road links is taken as the searched route. Note that the place of departure can also be a predetermined point set by the driver instead of the vehicle position, so as to search for a route from the predetermined point to the destination.
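As a rough illustration of the selection criterion just described (the searched route is the candidate with the smallest sum of link costs), the sketch below picks that route from a set of candidates; the data layout is a simplifying assumption, and the generation of the candidate routes themselves is not shown.

```python
def select_route(candidate_routes, link_cost):
    """candidate_routes: list of routes, each a list of road-link IDs.
    link_cost: dict mapping a road-link ID to its cost.
    Returns the route with the smallest total link cost (the 'searched route')."""
    return min(candidate_routes,
               key=lambda route: sum(link_cost[link] for link in route))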


Subsequently, a guidance processing unit (not shown) of the CPU 31 performs guidance processing to provide route guidance. For this purpose, a guidance display processing unit of the guidance processing unit performs guidance display processing to read the route data and display the searched route on the map screen based on the route data.


Note that when it is necessary, for example, to turn the vehicle to the right or left at an intersection that is a point serving as a subject of route guidance, the intersection is set as a guidance intersection for providing guidance as to the direction to be taken at the intersection. An audio output processing unit of the guidance processing unit then performs audio output processing to provide the route guidance by audio output before the vehicle reaches the guidance intersection.


Further, a guidance point enlarged view formation processing unit of the guidance processing unit performs guidance point enlarged view formation processing to form, in a predetermined region of the map screen, an enlarged view of the guidance intersection, namely, an intersection enlarged view, a type of guidance point enlarged view, before the vehicle reaches the guidance intersection, and thereby provides route guidance with the intersection enlarged view. For this purpose, when the vehicle reaches a location that is separated by a set distance in advance of (or on the vehicle position side of) the guidance intersection on the searched route, the intersection enlarged view is displayed. In this case, a map of the vicinity of the guidance intersection, the searched route, and a landmark, e.g. facility or the like, at the guidance intersection are displayed on the intersection enlarged view.


Now, in this embodiment, information obtained in the navigation device 14 is transmitted to the automatic transmission control device 11 so as to change the shift speed of the automatic transmission 10 in accordance with the road shape ahead of the vehicle position, and/or transmitted to the engine control device 51 so as to change the output of the engine 52 in accordance with the road shape ahead of the vehicle position, thereby controlling travel of the vehicle.


For this purpose, the CPU 31 functions as a road shape estimating device, and a road shape estimation processing unit 311 of the CPU 31 performs road shape estimation processing so as to estimate the road shape by a method which will be described later. A travel control processing unit (not shown) of the CPU 31 performs travel control processing to transmit a signal for changing the shift speed of the automatic transmission 10 and/or changing the output of the engine 52, according to the estimated road shape, to the automatic transmission control device 11, the engine control device 51, or the like.


Next, operation of the road shape estimation processing unit will be explained with reference to FIG. 2.


First, a data obtaining processing unit 3111 of the road shape estimation processing unit 311 executes data obtaining processing to set a predetermined region ahead of the vehicle position, including the road for which a road shape is to be estimated, as a road shape estimation range, reads road data from the data recording unit 16, and obtains node data within the road shape estimation range.


In this case, within the road shape estimation range, estimation of the road shape is performed sequentially for the road links between adjacent nodes, from the first node to the last node among the nodes represented by the node data. Note that in this case, each road link for which a road shape is to be estimated in the database is referred to as a shape estimation road.


Then, for performing estimation of a road shape for each shape estimation road (road link), the data obtaining processing unit 3111 obtains interpolation point data for shape interpolation points on the shape estimation road ahead of the vehicle position. Note that in this embodiment, the data obtaining processing unit obtains the node data and the interpolation point data by reading them from the data recording unit 16, but they can also be obtained by receiving via a network from the information center.


Next, a radius calculation processing unit 3112 of the road shape estimation processing unit 311 performs radius calculation processing to calculate a radius (radius of curvature) at each shape interpolation point of the shape estimation road by a three-point calculation method based on the interpolation point data of shape interpolation points on the shape estimation road ahead of the vehicle position. Subsequently, a corner detection processing unit 3113 of the road shape estimation processing unit 311 executes corner detection processing to detect a corner in the shape estimation road in the database, based on the radius, and sets, in the database, a shape interpolation point where the corner starts as a candidate start point, and a shape interpolation point where the corner ends as a candidate end point.


When a corner that should be regarded as one corner on the actual road is detected as a plurality of corners, depending on the manner of setting the shape interpolation points in the database, a corner connection processing unit of the road shape estimation processing unit 311 then executes corner connection processing so as to connect the detected corners.


Conversely, when corners that should be regarded as a plurality of corners on the actual road are detected as one corner, depending on the manner of setting the shape interpolation points in the database, first and second corner dividing processing units 3114, 3115 of the road shape estimation processing unit 311 perform first and second corner dividing processing to divide the detected corner. In this case, in the first corner dividing processing, the corner is divided when a predetermined shape interpolation point in the corner has a radius that is equal to or larger than a threshold radius for dividing a corner, and in the second corner dividing processing, the corner is divided when a segment between predetermined shape interpolation points in the corner has a length that is equal to or longer than a threshold length.
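The two dividing criteria just described can be summarized as follows; the threshold names and the flat-list inputs are illustrative assumptions, and in the device itself these checks are carried out by the first and second corner dividing processing units.

```python
def should_divide_corner(radii_in_corner, segment_lengths_in_corner,
                         radius_divide_threshold, length_threshold):
    """First criterion: some shape interpolation point inside the corner has a
    radius equal to or larger than the threshold radius for dividing a corner.
    Second criterion: some segment between shape interpolation points inside
    the corner is equal to or longer than the threshold length."""
    too_straight = any(r >= radius_divide_threshold for r in radii_in_corner)
    too_long = any(length >= length_threshold for length in segment_lengths_in_corner)
    return too_straight or too_long
```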


In this manner, when connection of corners or dividing of a corner is performed to set an appropriate corner based on shape interpolation points, a minimum radius calculation processing unit 3120 of the road shape estimation processing unit 311 calculates an appropriate minimum radius for estimating a clothoid coefficient for each corner.


Subsequently, a clothoid coefficient estimation processing unit 3116 of the road shape estimation processing unit 311 executes clothoid coefficient estimation processing to estimate a clothoid coefficient based on the calculated minimum radius. A clothoid curve calculation processing unit 3117 of the road shape estimation processing unit 311 performs clothoid curve calculation processing to form an equation approximating the corner based on the clothoid coefficient and to calculate a clothoid curve.


In this manner, when the clothoid curve representing the road shape of the corner is calculated, a fitting processing unit 3118 of the road shape estimation processing unit 311 executes fitting processing to correct the candidate start point and the candidate end point by matching the clothoid curve with the respective shape interpolation points on the corner, and thereby sets the start point and the end point of the corner to positions approximating a point where the corner starts and a point where the corner ends on the actual road.


Then a start point/end point recording unit 3119 of the road shape estimation processing unit 311 records the start point and the end point as data representing the road shape in the data recording unit 16. In this manner, the road shape is estimated.


Next, the flowchart of FIG. 2 will be explained.


In step S1, the radius calculation processing is performed.


In step S2, the corner detection processing is performed.


In step S3, the corner connection processing is performed.


In step S4, the first corner dividing processing is performed.


In step S5, the second corner dividing processing is performed.


In step S6, the minimum radius calculation processing is performed.


In step S7, the clothoid coefficient estimation processing is performed.


In step S8, the clothoid curve calculation processing is performed.


In step S9, the fitting processing is performed.


In step S10, the start point/end point recording processing is performed, and the processing is finished.
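Read as a whole, the main routine of FIG. 2 is a straight pipeline from the interpolation point data to the recorded start and end points. The sketch below strings the steps together with hypothetical placeholder function names purely to show the order of processing; it is not an implementation of the individual steps.

```python
def estimate_road_shape(interpolation_points):
    # The step functions below are hypothetical placeholders named after steps S1-S10.
    radii, angles = radius_calculation(interpolation_points)         # step S1
    corners = corner_detection(interpolation_points, radii, angles)  # step S2
    corners = corner_connection(corners, angles)                     # step S3
    corners = first_corner_dividing(corners, radii)                  # step S4
    corners = second_corner_dividing(corners)                        # step S5
    for corner in corners:
        r_min = minimum_radius_calculation(corner, radii)            # step S6
        a = clothoid_coefficient_estimation(r_min)                   # step S7
        curve = clothoid_curve_calculation(corner, a)                # step S8
        start, end = fitting(corner, curve)                          # step S9
        record_start_and_end_points(start, end)                      # step S10
```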


Next, the subroutine of the radius calculation processing in step S1 of FIG. 2 will be explained with reference to FIGS. 3 and 4.


As shown in FIG. 4, a shape interpolation point to be a target of the radius calculation processing is designated as target point ma, a shape interpolation point that is located immediately preceding the target point ma (on the vehicle position side) is designated as preceding adjacent point mb, and a shape interpolation point that is located immediately succeeding the target point ma (on the side of point ma opposite the vehicle position) is designated as succeeding adjacent point mc. The segment length of the segment on the near side of the target point ma (hereinafter referred to as the "front side segment"), which connects the target point ma and the preceding adjacent point mb, is designated as L1, and the segment length of the segment on the far side of the target point ma (hereinafter referred to as the "rear side segment"), which connects the target point ma and the succeeding adjacent point mc, is designated as L2. Further, the circle passing through the target point ma, the preceding adjacent point mb, and the succeeding adjacent point mc is designated as Cr, the center of the circle Cr is designated as Q, and the radius of the circle Cr is designated as R. Furthermore, at the target point ma, the angle which the rear side segment forms with respect to the front side segment is designated as direction angle θ.


The direction connecting the target point ma and the succeeding adjacent point mc is taken as the x-axis direction, and the direction at a right angle to the x-axis direction is taken as the y-axis direction. When the distance between the target point ma and the preceding adjacent point mb in the x-axis direction is designated as X, the distance between the target point ma and the center Q in the x-axis direction is designated as A, the distance between the preceding adjacent point mb and the center Q in the x-axis direction is designated as G, the distance between the target point ma and the preceding adjacent point mb in the y-axis direction is designated as Y, the distance between the preceding adjacent point mb and the center Q in the y-axis direction is designated as H, and the distance between the target point ma and the center Q in the y-axis direction is designated as F, the following equations hold true:





X = L1·cos θ

A = L2/2

G = X + A

Y = L1·sin θ

F = √(R² − A²)

H = F − Y


In this case, the sum of the square of the distance G and the square of the distance H is equal to the square of the radius R, and thus the relationship of equation (1) is satisfied.






R² = G² + H²   (1)


Next, by substituting the above respective values for the distances G and H in equation (1), the following equation (2) can be obtained.






R² = X² + 2XA + A² + R² − A² − 2Y·√(R² − A²) + Y²   (2)


Subsequently, by modifying the equation (2), equations (3) to (6) can be obtained.





2Y·√(R² − A²) = X² + Y² + 2XA   (3)

R² − A² = {(X² + Y² + 2XA)/(2Y)}²   (4)

R² = {(X² + Y² + 2XA)/(2Y)}² + A²   (5)

R² = (L1² + 2·L1·L2·cos θ + L2²)/(2·sin θ)²   (6)


From the equation (6), it can be seen that the radius R is a function of the segment lengths L1, L2 and the direction angle θ.


Accordingly, the radius calculation processing unit 3112 reads the interpolation point data, calculates the segment lengths L1, L2 based on respective coordinates of the target point ma, the preceding adjacent point mb and the succeeding adjacent point mc, and calculates the direction angle θ.


Subsequently, the radius calculation processing unit 3112 calculates the radius R of the circle Cr by aforementioned equation (6) based on the segment lengths L1, L2 and the direction angle θ. In this manner, the radius R at the target point ma can be calculated.
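As a concrete reading of equation (6), the following is a minimal sketch of the three-point radius calculation; the planar (x, y) coordinate assumption and the function names are illustrative, not taken from the specification.

```python
import math

def segment_length(p, q):
    """Planar distance between two shape interpolation points given as (x, y)."""
    return math.hypot(q[0] - p[0], q[1] - p[1])

def direction_angle(mb, ma, mc):
    """Deflection angle theta at the target point ma, i.e. the angle the rear side
    segment (ma -> mc) forms with respect to the front side segment (mb -> ma).
    The sign depends on the coordinate handedness; equation (6) uses only cos(theta)
    and sin(theta) squared, so the radius is unaffected by the sign."""
    heading_in = math.atan2(ma[1] - mb[1], ma[0] - mb[0])
    heading_out = math.atan2(mc[1] - ma[1], mc[0] - ma[0])
    theta = heading_out - heading_in
    # Normalize to the range (-pi, pi].
    while theta <= -math.pi:
        theta += 2.0 * math.pi
    while theta > math.pi:
        theta -= 2.0 * math.pi
    return theta

def radius_at(mb, ma, mc):
    """Radius R of the circle Cr through mb, ma and mc, from equation (6)."""
    l1 = segment_length(mb, ma)
    l2 = segment_length(ma, mc)
    theta = direction_angle(mb, ma, mc)
    if math.sin(theta) == 0.0:
        return math.inf  # the three points are collinear: no finite radius
    r_squared = (l1**2 + 2.0*l1*l2*math.cos(theta) + l2**2) / (2.0*math.sin(theta))**2
    return math.sqrt(r_squared)
```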


Next, the flowchart of FIG. 3 will be explained.


In step S1-1, the segment length L1 of the front side segment is calculated.


In step S1-2, the segment length L2 of the rear side segment is calculated.


In step S1-3, the direction angle θ is calculated.


In step S1-4, the radius R is calculated and the process returns to the main routine.


Note that in this embodiment, the direction angle θ and the radius R are calculated by the radius calculation processing unit 3112, but it is also possible to calculate the direction angle θ and the radius R in advance, record them in the database of the data recording unit 16 as part of the road data, and read them from the data recording unit 16.


Next, the corner detection processing in step S2 of FIG. 2 will be explained with reference to FIGS. 5-14.


In the diagrams, ri (i=1, 2, . . . ) is a shape estimation road set between two nodes (not shown), mi (i=1, 2, . . . ) are a plurality of shape interpolation points set in order from the near side in the traveling direction of the vehicle (arrow G direction) along the shape estimation road ri, and θi (i=1, 2, . . . ) is a direction angle at each shape interpolation point mi. Note that the direction angle θi is an angle which the rear side segment forms with respect to the front side segment. When the rear side segment is located on the right side in the traveling direction of the vehicle with respect to the front side segment, the direction angle θi takes a positive value. When the rear side segment is located on the left side in the traveling direction of the vehicle with respect to the front side segment, the direction angle θi takes a negative value.
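The sign of the direction angle θi can be determined with a cross product; in the sketch below, a negative z-component of the cross product of the incoming and outgoing segment vectors means the rear side segment lies to the right of the travel direction, assuming conventional x-east/y-north planar coordinates. This is an illustrative reading of the convention, not code from the specification.

```python
def direction_angle_sign(mb, ma, mc):
    """Returns +1 when the rear side segment bends to the right of the travel
    direction (positive direction angle in the text), -1 when it bends to the
    left, and 0 when the three points are collinear.  Assumes x-east/y-north
    planar coordinates, in which a right turn gives a negative cross product."""
    vin = (ma[0] - mb[0], ma[1] - mb[1])   # front side segment direction
    vout = (mc[0] - ma[0], mc[1] - ma[1])  # rear side segment direction
    cross_z = vin[0] * vout[1] - vin[1] * vout[0]
    if cross_z < 0:
        return +1  # bends right
    if cross_z > 0:
        return -1  # bends left
    return 0
```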


First, a candidate start point/candidate end point setting processing unit of the corner detection processing unit 3113 performs candidate start point/candidate end point setting processing to read the radius Ri (i=1, 2, . . . ) at each shape interpolation point mi, determines whether or not the radius Ri is equal to or smaller than a threshold Rth (1000 [m] in this embodiment) for corner detection which is set in advance, and extracts any shape interpolation point whose radius is equal to or smaller than the threshold Rth. In FIG. 5, the radii at the shape interpolation points m3 to m9 are determined to be equal to or smaller than the threshold Rth, and these shape interpolation points are extracted.


Subsequently, the candidate start point/candidate end point setting processing unit determines whether or not there are a plurality of consecutive shape interpolation points (hereinafter referred to as a “consecutive shape interpolation point group”) among the extracted shape interpolation points. When there is a consecutive shape interpolation point group, the unit takes the shape interpolation point on the nearest side (vehicle position side) in the consecutive shape interpolation point group as a candidate start point. In FIG. 5 and FIG. 6, the consecutive shape interpolation point group is formed by the shape interpolation points m3 to m9, and the shape interpolation point m3 is taken as the candidate start point s1.


Further, the candidate start point/candidate end point setting processing unit reads the direction angles θ calculated by the radius calculation processing unit 3112 for the extracted shape interpolation points sequentially from the near side, and determines whether or not the direction angle reverses from positive to negative or from negative to positive at a predetermined shape interpolation point (the direction of the direction angle reverses with respect to the immediately preceding shape interpolation point). When the direction angle reverses between positive and negative at the predetermined shape interpolation point, the unit determines whether or not the direction angle reverses at the immediately succeeding shape interpolation point. Then, when the direction angle reverses between positive and negative at the predetermined shape interpolation point but does not reverse at the immediately succeeding shape interpolation point, the predetermined shape interpolation point is taken as the candidate start point. When shape interpolation points m11 to m16 are set on a shape estimation road r2 having a shape as shown in FIG. 7, direction angles θ12 and θ13 take negative values and direction angles θ14 and θ15 take positive values at the shape interpolation points m12 to m15. That is, the direction angle θ14 reverses from negative to positive at the shape interpolation point m14, and the direction angle θ15 remains positive and does not reverse at the immediately succeeding shape interpolation point m15. Thus, the shape interpolation point m14 is taken as the candidate start point s1.


Next, the candidate start point/candidate end point setting processing unit determines whether or not there is a consecutive shape interpolation point group among the extracted shape interpolation points. When there is a consecutive shape interpolation point group, the unit takes the shape interpolation point on the farthest side (side opposite the vehicle position) in the consecutive shape interpolation point group as a candidate end point. In FIG. 5 and FIG. 8, the consecutive shape interpolation point group is formed by the shape interpolation points m3 to m9, and the shape interpolation point m9 is taken as the candidate end point e1.


Further, the candidate start point/candidate end point setting processing unit determines whether the direction angle reverses at a predetermined shape interpolation point. When the direction angle does not reverse at the predetermined shape interpolation point, the unit determines whether or not it reverses at the immediately succeeding shape interpolation point. If the direction angle does not reverse at the predetermined shape interpolation point but does reverse at the immediately succeeding shape interpolation point, the predetermined shape interpolation point is taken as a candidate end point. When shape interpolation points m21 to m26 are set on a shape estimation road r3 having a shape as shown in FIG. 9, direction angles θ22 and θ23 take positive values and direction angles θ24 and θ25 take negative values at the shape interpolation points m22 to m25. That is, the direction angle θ23 remains positive and does not reverse at the shape interpolation point m23, and the direction angle θ24 reverses from positive to negative at the immediately succeeding shape interpolation point m24. Thus, the shape interpolation point m23 is taken as a candidate end point e1.
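The consecutive-group rule illustrated in FIGS. 5, 6 and 8 can be sketched as follows. This simplified version assumes the radii are already available as a list ordered from the near side, handles only groups of two or more extracted points, and leaves out the reversal-based rules of FIGS. 7 and 9 as well as the single-point case described next; the names are assumptions.

```python
def candidate_points_from_groups(radii, r_threshold=1000.0):
    """radii: list of radii R_i indexed by shape interpolation point, near to far.
    Returns (start_indices, end_indices) for consecutive groups of points whose
    radius is equal to or smaller than the corner-detection threshold Rth."""
    extracted = [i for i, r in enumerate(radii) if r <= r_threshold]
    starts, ends = [], []
    group = []
    for i in extracted:
        if group and i != group[-1] + 1:  # a gap ends the consecutive group
            if len(group) > 1:
                starts.append(group[0])   # nearest point -> candidate start point
                ends.append(group[-1])    # farthest point -> candidate end point
            group = []
        group.append(i)
    if len(group) > 1:
        starts.append(group[0])
        ends.append(group[-1])
    return starts, ends
```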


In this manner, when the candidate start point and the candidate end point are set, a corner setting processing unit of the corner detection processing unit 3113 performs corner setting processing, and sets a corner between the candidate start point and the candidate end point. On a shape estimation road r4 having a shape as shown in FIG. 10, a shape interpolation point m2 is selected from among shape interpolation points m1 to m10 as the candidate start point s1, the shape interpolation point m9 is taken as the candidate end point e1, and a corner cn1 between the candidate start point s1 and the candidate end point e1 is set.


Note that when the radius of the predetermined shape interpolation point is equal to or smaller than the threshold Rth and both the radius of the immediately preceding shape interpolation point and the radius of the immediately succeeding shape interpolation point are larger than the threshold Rth, the candidate start point/candidate end point setting processing unit sets the predetermined shape interpolation point as a single point, combining a candidate start point and a candidate end point, and the corner setting processing unit sets a corner to the predetermined shape interpolation point. On a shape estimation road r5 having a shape as shown in FIG. 11, shape interpolation point m33 is selected from among shape interpolation points m31 to m35 and set as a single point f1, and a corner cn1 is set to the shape interpolation point m33.


Furthermore, when the direction angle reverses between positive and negative at the predetermined shape interpolation point and the direction angle reverses at the immediately succeeding shape interpolation point, the candidate start point/candidate end point setting processing unit sets the predetermined shape interpolation point as a single point.


On a shape estimation road r6 having a shape as shown in FIG. 12, direction angles θ42 to θ44, θ46 and θ47 at shape interpolation points m42 to m44, m46 and m47 have positive values, and a direction angle θ45 at a shape interpolation point m45 has a negative value. In this case, the direction angle θ45 reverses from positive to negative at the shape interpolation point m45, and the direction angle θ46 reverses from negative to positive at the immediately succeeding shape interpolation point m46. Thus, the shape interpolation point m45 is set as a single point f1 and a corner cn1 is set to the shape interpolation point m45.


Subsequently, the corner setting processing unit sets three corners cn1 to cn3, before the single point f1, at the single point f1, and after the single point f1. Note that, in this case, the corners are numbered sequentially from the near side, and the candidate start points and the candidate end points are numbered correspondingly to the corners.


In this manner, when a corner is set on a shape estimation road, the recording processing unit of the corner detection processing unit 3113 records the number of corners in the shape estimation road, the number of a candidate start point, the number of a candidate end point, and so on in the RAM 32 (FIG. 1).


Now, as described above, when the direction angle reverses from positive to negative or from negative to positive at a predetermined shape interpolation point, and the direction angle at an immediately succeeding shape interpolation point reverses, a corner is set to the predetermined shape interpolation point. If precision of the interpolation point data in the database is low and a single point is included in the shape interpolation point mi, a corner that should be regarded as one corner may be detected as a plurality of corners. For example, on a shape estimation road r7 having a shape as shown in FIG. 13, among shape interpolation points m1 to m11, direction angles θ3 to θ5, θ7 to θ9 of the shape interpolation points m3 to m5, m7 to m9, for example, have positive values, and the direction angle θ6 of the shape interpolation point m6 has a negative value. In this case, the direction angle θ6 reverses from positive to negative at the shape interpolation point m6, and the direction angle θ7 reverses from negative to positive at the immediately succeeding shape interpolation point m7. Thus, the shape interpolation point m6 is set as a single point f1.


In this case, the direction angle θ5 does not reverse at the shape interpolation point m5, but the direction angle θ6 reverses from positive to negative at the shape interpolation point m6. Thus, the shape interpolation point m5 is taken as a candidate end point e1 of a corner cn1 on the near side of the single point f1. The angle θ7 reverses from negative to positive at the shape interpolation point m7, and the direction angle θ8 does not reverse at the shape interpolation point m8. Thus, the shape interpolation point m7 is taken as a candidate start point s3 of a corner cn3 on the far side of the single point f1.


Therefore, as shown in FIG. 14, although one corner cn1 should have been detected, corners cn1 to cn3 were detected.


Consequently, the corners cn1 to cn3 are set, a shape interpolation point m2 is set as a candidate start point s1 of the corner cn1, the shape interpolation point m5 is set as the candidate end point e1 of the corner cn1, the shape interpolation point m6 is set as a candidate start point s2 and a candidate end point e2 of the corner cn2, the shape interpolation point m7 is set as the candidate start point s3 of the corner cn3, and the shape interpolation point m10 is set as a candidate end point e3 of the corner cn3.


Accordingly, in this embodiment, when a corner which is actually one corner is detected as plural corners because of the manner of setting the shape interpolation points mi in the database, the detected corners are connected by the corner connection processing unit 3114.


Next, the subroutine of the corner connection processing in step S3 of FIG. 2 will be explained. In this case, corner connection processing for connecting the corners cn1 to cn3 with respect to the shape estimation road r7 shown in FIG. 14 will be explained with reference to FIGS. 15-24.


First, a corner information obtaining processing unit of the corner connection processing unit 3114 performs corner information obtaining processing to obtain corner information for a first corner (hereinafter referred to as the "current corner"), which is a predetermined corner in a shape estimation road, and corner information for a second corner (hereinafter referred to as the "next corner"), which is a corner adjacent to the current corner, by reading them from the RAM 32 (FIG. 1). In this embodiment, the current corner means a corner selected in order, beginning with the nearest corner from among the corners on the shape estimation road, and the next corner refers to the corner immediately succeeding the current corner. In FIG. 17, when the corner cn1 is taken as the current corner and the corner cn2 is taken as the next corner, corner information for the corners cn1, cn2 is obtained.


Note that the corner information includes the numbers of the candidate start point and the candidate end point of each corner recorded in the RAM 32, in addition to the interpolation point data. In this embodiment, since the corner connection processing is performed for the shape estimation road r7 (FIG. 13), the number of each corner, the number of the candidate start point of each corner, the number of the candidate end point of each corner, and so on are obtained as the corner information.


Subsequently, a connecting condition determination processing unit of the corner connection processing unit 3114 executes connecting condition determination processing to read the direction angle at the candidate end point of the current corner and the direction angle at the candidate start point of the next corner, and determines whether or not a first connecting condition for performing the corner connection processing is met by determining whether or not the respective direction angles at the candidate end point of the current corner and the candidate start point of the next corner reverse between positive and negative. In FIG. 17, the direction angle θ5 at the candidate end point e1 has a positive value, and the direction angle θ6 at the candidate start point s2 has a negative value. Thus, the direction angle reverses and the first connecting condition is met.


Next, the connecting condition determination processing unit reads the interpolation point data from the data recording unit 16, and determines whether or not a second connecting condition is met by determining whether or not the number of shape interpolation points from the candidate start point to the candidate end point of the next corner is equal to or smaller than a threshold (2 in this embodiment), or whether or not connection processing has been performed for the current corner. Note that whether or not connection processing has been performed for the current corner is determined by whether or not a flag indicating that connection has been performed, which will be described later, is on.


In FIG. 17, the corner cn2 is formed only by the single point f1 and is constituted by only the shape interpolation point m6. Therefore, the number of shape interpolation points from the candidate start point s2 to the candidate end point e2 is one, which is equal to or smaller than the threshold of two, and thus the second connecting condition is met.


Subsequently, the connecting condition determination processing unit reads the interpolation point data, and determines whether or not a third connecting condition is met by determining whether or not the position of a shape interpolation point succeeding the candidate end point of the current corner and the position of a candidate start point of the next corner are the same. In FIG. 17, the position of the shape interpolation point m6 succeeding the candidate end point e1 and the position of the candidate start point s2 are the same, and thus the third connecting condition is met.


Next, the connecting condition determination processing unit determines whether or not a fourth connecting condition is met based on the distance from the candidate end point of the current corner to the candidate start point of the next corner. For this purpose, the connecting condition determination processing unit reads the interpolation point data and determines whether or not the distance from the candidate end point of the current corner to the candidate start point of the next corner is equal to or smaller than a threshold Lth set in advance; when that distance is equal to or smaller than the threshold Lth, the fourth connecting condition is met. In FIG. 18, when the distance LC1 from the shape interpolation point m5, which is the candidate end point of the corner cn1, to the shape interpolation point m6, which is the candidate start point of the corner cn2, is equal to or smaller than the threshold Lth, the fourth connecting condition is met.
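Taken together, the first to fourth connecting conditions amount to a short series of checks. The sketch below is one possible reading; the corner record layout, the helper names and the numeric distance threshold are assumptions for illustration only.

```python
import math

def corners_should_be_connected(current, nxt, angles, points,
                                count_threshold=2, distance_threshold_m=50.0):
    """current, nxt: dicts with 'start', 'end' (shape interpolation point indices)
    and 'connected' (flag).  angles: direction angle per point.  points: (x, y)
    coordinates.  The distance threshold Lth (50 m here) is an arbitrary
    illustrative value, not one taken from the specification."""
    # First condition: the direction angle reverses between the candidate end
    # point of the current corner and the candidate start point of the next corner.
    if angles[current['end']] * angles[nxt['start']] >= 0:
        return False
    # Second condition: the next corner has few shape interpolation points, or the
    # current corner is itself the result of a previous connection (flag on).
    n_points_next = nxt['end'] - nxt['start'] + 1
    if not (n_points_next <= count_threshold or current['connected']):
        return False
    # Third condition: the point succeeding the current corner's candidate end
    # point coincides with the next corner's candidate start point.
    if points[current['end'] + 1] != points[nxt['start']]:
        return False
    # Fourth condition: the gap between the two corners is short enough.
    gap = math.dist(points[current['end']], points[nxt['start']])
    return gap <= distance_threshold_m
```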


Note that the connecting condition determination processing unit can also determine whether or not the distance from the candidate end point of the current corner to the candidate start point of the next corner is short relative to the distances between the respective shape interpolation points of the current corner. In this case, the connecting condition determination processing unit reads the interpolation point data and determines whether or not the fourth connecting condition is met, i.e. whether or not the distance from the current corner to the next corner is equal to or shorter than a threshold Lthe, which is calculated based on the distances between the respective shape interpolation points of the current corner. When the standard deviation calculated based on the average value La of the distances between the shape interpolation points of the current corner is designated as σ, the threshold Lthe is set as






Lthe = La − σ
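A minimal sketch of this alternative threshold, computed from the spacing of the current corner's shape interpolation points; completing the formula as the average spacing minus one standard deviation is an assumption made here.

```python
import math
import statistics

def distance_threshold_from_corner(points_in_corner):
    """points_in_corner: (x, y) coordinates of the current corner's shape
    interpolation points, in order.  Returns Lthe = La - sigma, where La is the
    average spacing and sigma its standard deviation (the exact formula is assumed)."""
    spacings = [math.dist(p, q) for p, q in zip(points_in_corner, points_in_corner[1:])]
    la = statistics.mean(spacings)
    sigma = statistics.pstdev(spacings)  # population standard deviation
    return la - sigma
```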


Then, when the first to fourth connecting conditions are met, a corner connection execution processing unit of the corner connection processing unit 3114 performs corner connection execution processing to connect the current corner and the next corner. For this purpose, the corner connection execution processing unit erases the attribute of the candidate end point of the current corner and the attribute of the candidate start point of the next corner from the data recording unit 16. As shown in FIG. 18, the attribute of the candidate end point e1 at the shape interpolation point m5 and the attribute of the candidate start point s2 at the shape interpolation point m6 are erased. Consequently, as shown in FIGS. 19 and 20, a new corner cn1′ is set with the shape interpolation point m2 being a candidate start point s1′ and the shape interpolation point m6 being a candidate end point e1′. At this time, the flag is turned on indicating that the new corner cn1′ is a connected corner.
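In code, connecting the two corners reduces to merging their records into one; the sketch below reuses the assumed dictionary layout from the earlier condition check.

```python
def connect_corners(current, nxt):
    """Erase the current corner's candidate end point attribute and the next
    corner's candidate start point attribute by merging the two records into a
    single new corner, and mark it as a connected corner (flag on)."""
    return {
        'start': current['start'],  # candidate start point of the new corner
        'end': nxt['end'],          # candidate end point of the new corner
        'connected': True,          # flag: this corner results from a connection
    }
```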


Further, when one of the first to fourth connecting conditions is not met, the corner connection execution processing unit does not connect the current corner and the next corner, and takes the next corner as the current corner.


On the other hand, as described above, when the corner cn1 and the corner cn2 are connected and the new corner cn1′ is set, the new corner cn1′ is taken as the current corner and the corner cn3 is taken as a next corner cn2′, and the corner information obtaining processing, the connecting condition determination processing and the corner connection execution processing are performed for the corners cn1′, cn2′.


Specifically, the corner information obtaining processing unit obtains corner information of the current corner and corner information of the next corner in the shape estimation road by reading them from the RAM 32. In FIG. 20 and FIG. 21, the new corner cn1′ is taken as the current corner, the corner cn3 is taken as the next corner cn2′, the candidate start point s3 is taken as a candidate start point s2′, the candidate end point e3 is taken as a candidate end point e2′, and corner information of the corners cn1′, cn2′ is obtained.


Subsequently, the connecting condition determination processing unit determines whether or not the first connecting condition for performing the corner connection processing is met by determining whether or not the respective direction angles at the candidate end point of the current corner and the candidate start point of the next corner reverse between positive and negative. In FIG. 21, the direction angle θ6 at the candidate end point e1′ has a negative value, and the direction angle θ7 at the candidate start point s2′ has a positive value. Thus, the direction reverses and the first connecting condition is met.


Next, the connecting condition determination processing unit determines whether or not the second connecting condition is met by determining whether or not the number of shape interpolation points of the next corner is equal to or smaller than the threshold (2 in this embodiment) or by whether or not connection is performed for the current corner.


In FIG. 21, the number of shape interpolation points m7 to m10 of the corner cn2′ is four, which is larger than two, but the flag is on and the current corner cn1′ is a connected corner. Thus, the second connecting condition is met.


Subsequently, the connecting condition determination processing unit determines whether the third connecting condition is met or not by determining whether or not the position of a shape interpolation point succeeding the candidate end point of the current corner and the position of a candidate start point of the next corner are the same. In FIG. 21, the position of the shape interpolation point m7 succeeding the candidate end point e1′ and the position of the candidate start point s2′ are the same, and thus the third connecting condition is met.


Next, the connecting condition determination processing unit determines whether or not the fourth connecting condition is met by determining whether or not the distance from the candidate end point of the current corner to the candidate start point of the next corner is equal to or smaller than the threshold. In FIG. 22, when a distance LC2 between the shape interpolation points m6, m7 is equal to or smaller than the threshold Lth, the fourth connecting condition is met.


Then, when the first to fourth connecting conditions are met, the corner connection execution processing unit connects the current corner and the next corner. For this purpose, the corner connection execution processing unit erases the attribute of the candidate end point of the current corner and the attribute of the candidate start point of the next corner. As shown in FIGS. 22 and 23, the attribute of the candidate end point e1′ at the shape interpolation point m6 and the attribute of the candidate start point s2′ at the shape interpolation point m7 are erased. Consequently, as shown in FIG. 24, a new corner cn1″ is set with the shape interpolation point m2 being a candidate start point s1″ and the shape interpolation point m10 being a candidate end point e1″. At this time, the flag is turned on.


Thus, in this embodiment, if precision of the interpolation point data in the database is low and plural corners are detected on a shape estimation road, the plurality of corners are connected when the first to fourth connecting conditions are met. Therefore, a corner that should be regarded as one corner will not be regarded as a plurality of corners in the database, and thus the road shape can be precisely estimated.


Next, the flowcharts of FIGS. 15 and 16 will be explained.


In step S3-1, the flag is turned off.


In step S3-2, a loop by the number of corners is started.


In step S3-3, corner information of the current corner is obtained.


In step S3-4, corner information of the next corner is obtained.


In step S3-5, it is determined whether or not the respective direction angles at a candidate end point of the current corner and a candidate start point of the next corner reverse from positive to negative or from negative to positive. When the respective direction angles at the candidate end point of the current corner and the candidate start point of the next corner reverse, the process proceeds to S3-6. When they do not reverse, the process proceeds to step S3-10.


In step S3-6, it is determined whether or not the number of shape interpolation points of the next corner is equal to or smaller than the threshold or whether or not the flag is on. When the number of shape interpolation points of the next corner is equal to or smaller than the threshold or the flag is on, the process proceeds to step S3-7. When the number of shape interpolation points of the next corner is larger than the threshold and the flag is not on, the process proceeds to step S3-10.


In step S3-7, it is determined whether or not the position of a shape interpolation point succeeding the candidate end point of the current corner and the position of the candidate start point of the next corner are the same. When the position of the shape interpolation point succeeding the candidate end point of the current corner and the position of the candidate start point of the next corner are the same, the process proceeds to step S3-8. When they are not the same, the process proceeds to step S3-10.


In step S3-8, it is determined whether or not the distance from the candidate end point of the current corner to the candidate start point of the next corner is equal to or smaller than the threshold. When the distance from the candidate end point of the current corner to the candidate start point of the next corner is equal to or smaller than the threshold, the process proceeds to step S3-9. When the distance from the candidate end point of the current corner to the candidate start point of the next corner is larger than the threshold, the process proceeds to step S3-10.


In step S3-9, connection of the corners is performed.


In step S3-10, the next corner is taken as a current corner.


In step S3-11, the flag is turned on.


In step S3-12, the flag is turned off.


In step S3-13, the loop by the number of corners is finished, and the process returns to the main routine.
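The processing of steps S3-1 to S3-13 can be summarized as a short loop. The following Python sketch is only illustrative: it assumes that each corner is represented by the indices of its candidate start point and candidate end point in the list of shape interpolation points, that the signed direction angle at each point has already been computed, and that the point-count threshold (2) and the distance threshold Lth are passed in as parameters; none of these names or data structures are taken from the specification.

```python
import math
from dataclasses import dataclass

# Illustrative data structure; the field names are assumptions, not taken from the patent.
@dataclass
class Corner:
    start_idx: int   # index of the candidate start point in the shape interpolation point list
    end_idx: int     # index of the candidate end point in the shape interpolation point list

def distance(p, q):
    """Straight-line distance between two shape interpolation points (x, y)."""
    return math.hypot(q[0] - p[0], q[1] - p[1])

def connect_corners(points, corners, angles, n_threshold=2, dist_threshold=50.0):
    """Corner connection loop corresponding to steps S3-1 to S3-13.

    points  : list of (x, y) shape interpolation points
    corners : list of Corner records, in travel order
    angles  : precomputed signed direction angle at each shape interpolation point
    """
    flag = False                                 # S3-1: flag off
    i = 0
    while i < len(corners) - 1:                  # S3-2: loop by the number of corners
        cur, nxt = corners[i], corners[i + 1]    # S3-3, S3-4: corner information

        # S3-5: first condition - the sign of the direction angle reverses between
        # the candidate end point of the current corner and the candidate start
        # point of the next corner.
        cond1 = angles[cur.end_idx] * angles[nxt.start_idx] < 0

        # S3-6: second condition - the next corner has few interpolation points,
        # or the current corner is itself a connected corner (flag on).
        cond2 = (nxt.end_idx - nxt.start_idx + 1) <= n_threshold or flag

        # S3-7: third condition - the point succeeding the current candidate end
        # point coincides with the candidate start point of the next corner
        # (expressed here as adjacent indices).
        cond3 = cur.end_idx + 1 == nxt.start_idx

        # S3-8: fourth condition - the gap between the two corners is at most Lth.
        cond4 = distance(points[cur.end_idx], points[nxt.start_idx]) <= dist_threshold

        if cond1 and cond2 and cond3 and cond4:
            # S3-9, S3-11: erase the intermediate attributes by merging the index
            # ranges into one corner, and turn the flag on.
            corners[i] = Corner(cur.start_idx, nxt.end_idx)
            del corners[i + 1]
            flag = True
        else:
            # S3-10, S3-12: take the next corner as the current corner, flag off.
            i += 1
            flag = False
    return corners                               # S3-13: back to the main routine
```

As in FIGS. 20 and 21, a corner that has just been connected remains the current corner, which is why the merged record stays at position i in the sketch and is compared with the following corner on the next pass.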


Next, the clothoid coefficient estimation processing in step S7 will be explained with reference to FIG. 25 which is a clothoid coefficient map (Table) in the embodiment of the present invention.


In this case, minimum radii Rmin and square values A² of the clothoid coefficient A of a clothoid curve are recorded in correlation with each other as a clothoid coefficient map in the ROM 33 (FIG. 1). The clothoid coefficient estimation unit 3116 reads the minimum radius Rmin calculated in aforementioned step S6, refers to the clothoid coefficient map, and reads the value A² corresponding to the minimum radius Rmin. Subsequently, the clothoid coefficient estimation unit 3116 calculates the clothoid coefficient A from that value, thereby estimating A. Note that the clothoid coefficient map is created based on the relationship between the actual radius and the clothoid coefficient of a clothoid curve that is used when designing a road.
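As a rough illustration of this lookup, the sketch below uses a small placeholder map of minimum radii and A² values (the actual map values, which come from road-design practice, are not given in the specification) and returns A as the square root of the value read from the map.

```python
import bisect
import math

# Hypothetical clothoid coefficient map: minimum radius Rmin [m] -> A^2.
# These values are placeholders for illustration only.
CLOTHOID_MAP = [
    (30.0, 900.0),
    (60.0, 2500.0),
    (100.0, 6400.0),
    (200.0, 16900.0),
]

def estimate_clothoid_coefficient(r_min):
    """Look up A^2 for the given minimum radius and return A = sqrt(A^2)."""
    radii = [r for r, _ in CLOTHOID_MAP]
    # Pick the entry whose radius class contains r_min (nearest not-smaller entry,
    # clamped to the largest class); interpolating between entries would also work.
    idx = min(bisect.bisect_left(radii, r_min), len(CLOTHOID_MAP) - 1)
    a_squared = CLOTHOID_MAP[idx][1]
    return math.sqrt(a_squared)

# Example: a corner with a minimum radius of 80 m.
print(estimate_clothoid_coefficient(80.0))
```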


Next, the subroutine of the clothoid curve calculation processing in step S8 of FIG. 2 will be explained with reference to FIG. 26 which is a flowchart of the subroutine for clothoid curve calculation processing in the embodiment of the present invention.


The clothoid curve is a curve whose radius of curvature becomes smaller with distance from its origin. Generally, when a road is designed, a part of a clothoid curve is assigned to the start point or the end point of a corner. To express the clothoid curve in x-y coordinates, when the distance along the curve (curve length) from the start point (origin) of the clothoid curve to a predetermined point S is L, and the radius of curvature at the point S is Rc, the following equation (7) applies.






L·Rc = A²   (7)


Further, normally, the X coordinate and Y coordinate of the clothoid curve can be calculated by equations (8) and (9), respectively.









X = L × (1 - L²/(40·Rc²) + L⁴/(3456·Rc⁴) - L⁶/(599040·Rc⁶) + L⁸/(175472640·Rc⁸))   (8)

Y = (L²/(6·Rc)) × (1 - L²/(56·Rc²) + L⁴/(7040·Rc⁴) - L⁶/(1612800·Rc⁶) + L⁸/(588349440·Rc⁸))   (9)







In this case, it is conceivable to calculate the clothoid curve by aforementioned equations (8) and (9). However, since equations (8) and (9) are high-order polynomials and a large number of points must be plotted, such calculation would place a large load on the CPU 31.
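For reference only, a direct evaluation of the series of equations (8) and (9) would look like the following sketch (the function name and the use of the common ratio L²/Rc² are merely for convenience); this is the high-order calculation that the embodiment avoids.

```python
def clothoid_xy_series(L, Rc):
    """X and Y of a clothoid point at curve length L with radius Rc at that point,
    using the series expansions of equations (8) and (9)."""
    u = (L / Rc) ** 2   # common ratio L^2 / Rc^2 shared by both series
    x = L * (1 - u / 40 + u**2 / 3456 - u**3 / 599040 + u**4 / 175472640)
    y = (L**2 / (6 * Rc)) * (1 - u / 56 + u**2 / 7040 - u**3 / 1612800 + u**4 / 588349440)
    return x, y
```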


Accordingly, in this embodiment, the clothoid curve calculation unit 3117 calculates the clothoid curve based on the clothoid coefficient A obtained in the clothoid coefficient estimation processing. Specifically, the clothoid curve calculation unit 3117 approximates the X coordinate and the Y coordinate of the clothoid curve (the point S at curve length L(k)) by equations (10) and (11), which were obtained based on simulation.









X = X0 + L·cos φ   (10)

Y = Y0 + L·sin φ   (11)







Here, X0 and Y0 are the X coordinate and the Y coordinate of a start point of the clothoid curve. Further, angle φ is expressed by equation (12).





φ = α + 2k·L   (12)


where α is the direction angle at the start point of the clothoid curve, and the value k is expressed by the following equation (13).






k = 28/A²   (13)


Next, the flowchart of FIG. 26 will be explained.


In step S8-1, a loop is started by the curve length L(k) of the clothoid curve.


In step S8-2, the X coordinate of the point S of the curve length L(k) is calculated.


In step S8-3, the Y coordinate of the point S of the curve length L(k) is calculated.


In step S8-4, the loop is finished for the curve length L(k) of the clothoid curve, and the process returns to the main routine.
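A minimal sketch of this subroutine is given below, taking equations (10) to (13) as reconstructed above. The start point coordinates, the direction angle α, the clothoid coefficient A, the sampling step along the curve, and the total curve length are assumed inputs; the function simply plots one point per curve length L(k).

```python
import math

def approximate_clothoid(x0, y0, alpha, A, total_length, step=1.0):
    """Plot the clothoid approximation of equations (10)-(13).

    x0, y0 : coordinates of the start point of the clothoid curve
    alpha  : direction angle at the start point
    A      : clothoid coefficient estimated from the clothoid coefficient map
    Returns a list of (X, Y) points for curve lengths L(k) = step, 2*step, ...
    """
    k = 28.0 / (A ** 2)        # equation (13), as given in the specification
    points = []
    L = step
    while L <= total_length:   # S8-1: loop over the curve length L(k)
        phi = alpha + 2.0 * k * L          # equation (12)
        x = x0 + L * math.cos(phi)         # S8-2 / equation (10)
        y = y0 + L * math.sin(phi)         # S8-3 / equation (11)
        points.append((x, y))
        L += step
    return points              # S8-4: loop finished, back to the main routine
```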


Next, the fitting processing in step S9 of FIG. 2 will be explained with reference to FIG. 27. In FIG. 27, reference numeral r7 denotes a shape estimation road, reference numeral cn1 denotes a corner set by the corner connection processing and the first, second corner dividing processing, reference numerals m1 to m5 denote shape interpolation points forming the corner cn1, and reference numeral s1 denotes a candidate start point of the corner cn1.


In this case, a clothoid curve movement processing unit of the fitting processing unit 3118 executes clothoid curve movement processing to place the origin Qs of a clothoid curve Q on an extended line of the segment Sg1 connecting the shape interpolation points m1, m2, and to move the clothoid curve Q in the direction of arrow E so that the tangent line at the origin Qs coincides with the extended line of the segment Sg1.


At this time, an error calculation processing unit of the fitting processing unit 3118 calculates the sum of the errors in the position of the clothoid curve Q with respect to the shape interpolation points m2 to m5 following the shape interpolation point m2. In this case, the error with respect to each of the shape interpolation points m2 to m5 is calculated based on the distance on the X-axis and the distance on the Y-axis between that shape interpolation point and each point plotted when drawing the clothoid curve Q.


Then the clothoid curve movement processing unit fits the clothoid curve Q to the shape interpolation points m2 to m5 so that the sum of the errors is minimized, and thereby sets the position of the clothoid curve Q.


Subsequently, a start point/end point extraction processing unit of the fitting processing unit 3118 executes start point/end point extraction processing and, once the position of the clothoid curve Q has been set, sets the origin Qs of the clothoid curve Q as the start point Sa1 of the actual corner.


Then the clothoid curve movement processing unit, the error calculation processing unit and the start point/end point extraction processing unit perform similar processing on the end point side of the corner cn1 so as to set the position of the clothoid curve Q, and then set the origin of the clothoid curve Q as an end point of the actual corner.


In this manner, the candidate start point and the candidate end point are corrected with the clothoid curve Q, and the corrected candidate start point and candidate end point can be set as the start point and the end point of the actual corner, respectively. Thus, the candidate start point and the candidate end point of the corner cn1 can be brought close to the actual start point and end point of the corner.
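A rough sketch of this fitting step, under several simplifying assumptions, is shown below: the plotted clothoid points are given as offsets from the origin Qs, the movement in the direction of arrow E is reduced to a one-dimensional scan along the extension of the segment Sg1 with a fixed step, the error is the sum of the X-axis and Y-axis distances to the nearest plotted clothoid point, and the tangent alignment at the origin is assumed to hold by construction. All names and parameters are illustrative.

```python
import math

def fit_clothoid_origin(clothoid_pts, seg_start, seg_end, target_pts,
                        max_shift=50.0, step=0.5):
    """Slide the clothoid origin along the extension of segment Sg1 and keep the
    position whose summed X/Y error against the shape interpolation points is smallest.

    clothoid_pts : plotted clothoid points as (dx, dy) offsets from the origin Qs
    seg_start, seg_end : shape interpolation points m1, m2 defining segment Sg1
    target_pts   : shape interpolation points m2 to m5 to be matched
    """
    # Unit vector of the segment m1 -> m2 (direction of arrow E, assumed).
    dx, dy = seg_end[0] - seg_start[0], seg_end[1] - seg_start[1]
    norm = math.hypot(dx, dy)
    ux, uy = dx / norm, dy / norm

    best_error, best_origin = float("inf"), seg_end
    shift = 0.0
    while shift <= max_shift:
        ox, oy = seg_end[0] + ux * shift, seg_end[1] + uy * shift  # candidate origin Qs
        # Sum, over the shape interpolation points, of the X-axis plus Y-axis
        # distance to the nearest plotted clothoid point for this origin.
        error = 0.0
        for tx, ty in target_pts:
            error += min(abs((ox + cx) - tx) + abs((oy + cy) - ty)
                         for cx, cy in clothoid_pts)
        if error < best_error:
            best_error, best_origin = error, (ox, oy)
        shift += step
    return best_origin  # taken as the start point Sa1 of the actual corner
```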


Next, the start point/end point recording processing in step S10 of FIG. 2 will be explained. The start point/end point recording unit 3119 records the positions of the extracted candidate start point and candidate end point as data representing the road shape in the data recording unit 16.


Thus, in this embodiment, a road shape can be estimated precisely, and the start point and the end point of an actual corner are set based on the precisely estimated road shape. Therefore, it is possible to precisely control travel of the vehicle by changing the shift speed of the automatic transmission 10 according to the road shape and/or changing the output of the engine 52 according to the road shape.


In this embodiment, estimation of a road shape is performed by operation of the CPU 31 of the navigation device 14, but it is also possible to estimate the road shape by operation of a CPU (not shown) of the automatic transmission control device 11 or by operation of a CPU in the information center. Note that a vehicle control device, which is at a higher level than the automatic transmission control device 11 and the engine control device 51 and performs overall control of the entire vehicle, may also be provided; the road shape may likewise be estimated by operation of a CPU of this vehicle control device.


Further, in this embodiment, estimation of the road shape is performed for a shape estimation road between adjacent nodes. However, if a node lies within a corner on the actual road and the road shape is estimated separately for every shape estimation road between the respective nodes, the processing differs before and after the node in the corner. Accordingly, in another embodiment, road following data are added between the respective road links in the road data, and when the road link that is expected to be entered after passing the node can be identified, or when a searched route has been set by the route search processing, the road shape can be estimated for two or more consecutive shape estimation roads.


The invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein.

Claims
  • 1. A road shape estimating device comprising: a data obtaining processing unit for obtaining interpolation point data for a plurality of shape interpolation points which are set along a road and represent a shape of the road; a radius calculation processing unit for calculating a radius of curvature at each of the shape interpolation points, based on the interpolation point data for a predetermined section of the road; a corner detection processing unit for detecting a corner in the predetermined section based on the radius of curvature at each of the shape interpolation points and for setting a candidate start point and a candidate end point for the detected corner; and a corner connection processing unit for determining, when a plurality of corners are detected in the predetermined section, whether or not a predetermined connecting condition is met, based on the candidate start points and the candidate end points of the detected corners, and for connecting, when the predetermined connecting condition is met, a predetermined corner among the plurality of corners and a corner adjacent the predetermined corner.
  • 2. The road shape estimating device according to claim 1, wherein the corner connection processing unit determines whether or not the predetermined connecting condition is met based on a direction angle at the candidate end point of the predetermined corner and a direction angle at the candidate start point of the corner adjacent to the predetermined corner.
  • 3. The road shape estimating device according to claim 2, wherein the corner connection processing unit determines whether or not the predetermined connecting condition is met based on the number of shape interpolation points from the candidate start point to the candidate end point of the corner adjacent the predetermined corner.
  • 4. The road shape estimating device according to claim 3, wherein the corner connection processing unit determines whether or not the predetermined connecting condition is met based on a position of a shape interpolation point adjacent the candidate end point of the predetermined corner and a position of the candidate start point of the corner adjacent the predetermined corner.
  • 5. The road shape estimating device according to claim 4, wherein the corner connection processing unit determines whether or not the predetermined connecting condition is met based on a distance from the candidate end point of the predetermined corner to the candidate start point of the adjacent corner.
  • 6. A road shape estimating method, comprising the steps of: obtaining interpolation point data for a plurality of shape interpolation points which are set along a road and represent a shape of the road; calculating a radius of curvature at each of the shape interpolation points based on the interpolation point data for a predetermined section of the road; detecting a corner in the predetermined section based on the radius of curvature at each of the shape interpolation points and setting a candidate start point and a candidate end point for the detected corner; determining, when a plurality of corners are detected in the predetermined section, whether or not a predetermined connecting condition is met based on the candidate start points and the candidate end points of the detected corners; and connecting, when the predetermined connecting condition is met, a predetermined corner among the plurality of corners and a corner adjacent the predetermined corner.
  • 7. A computer-readable medium having, encoded thereon, a program that causes a computer to perform the steps of: obtaining interpolation point data for a plurality of shape interpolation points which are set along a road and represent a shape of the road; calculating a radius of curvature at each of the shape interpolation points based on the interpolation point data for a predetermined section of the road; detecting a corner in the predetermined section based on the radius of curvature at each of the shape interpolation points and setting a candidate start point and a candidate end point for the detected corner; determining, when a plurality of corners are detected in the predetermined section, whether or not a predetermined connecting condition is met based on the candidate start points and the candidate end points of the detected corners; and connecting, when the predetermined connecting condition is met, a predetermined corner among the plurality of corners and a corner adjacent the predetermined corner.
  • 8. The road shape estimating device according to claim 1, wherein the corner connection processing unit determines whether or not the predetermined connecting condition is met based on the number of shape interpolation points from the candidate start point to the candidate end point of the corner adjacent the predetermined corner.
  • 9. The road shape estimating device according to claim 8, wherein the corner connection processing unit determines whether or not the predetermined connecting condition is met based on a position of a shape interpolation point adjacent the candidate end point of the predetermined corner and a position of the candidate start point of the corner adjacent the predetermined corner.
  • 10. The road shape estimating device according to claim 9, wherein the corner connection processing unit determines whether or not the predetermined connecting condition is met based on a distance from the candidate end point of the predetermined corner to the candidate start point of the adjacent corner.
  • 11. The road shape estimating device according to claim 1, wherein the corner connection processing unit determines whether or not the predetermined connecting condition is met based on a position of a shape interpolation point adjacent the candidate end point of the predetermined corner and a position of the candidate start point of the corner adjacent the predetermined corner.
  • 12. The road shape estimating device according to claim 11, wherein the corner connection processing unit determines whether or not the predetermined connecting condition is met based on a distance from the candidate end point of the predetermined corner to the candidate start point of the adjacent corner.
  • 13. The road shape estimating device according to claim 2, wherein the corner connection processing unit determines whether or not the predetermined connecting condition is met based on a position of a shape interpolation point adjacent the candidate end point of the predetermined corner and a position of the candidate start point of the corner adjacent the predetermined corner.
  • 14. The road shape estimating device according to claim 13, wherein the corner connection processing unit determines whether or not the predetermined connecting condition is met based on a distance from the candidate end point of the predetermined corner to the candidate start point of the adjacent corner.
  • 15. The road shape estimating device according to claim 1, wherein the corner connection processing unit determines whether or not the predetermined connecting condition is met based on a distance from the candidate end point of the predetermined corner to the candidate start point of the adjacent corner.
  • 16. The road shape estimating device according to claim 2, wherein the corner connection processing unit determines whether or not the predetermined connecting condition is met based on a distance from the candidate end point of the predetermined corner to the candidate start point of the adjacent corner.
  • 17. The road shape estimating device according to claim 3, wherein the corner connection processing unit determines whether or not the predetermined connecting condition is met based on a distance from the candidate end point of the predetermined corner to the candidate start point of the adjacent corner.
Priority Claims (1)
Number: 2008-086520   Date: Mar 2008   Country: JP   Kind: national