The present invention relates to a map information creating device, a map information creating method, and a map information creating program. However, applications of the present invention are not limited to the map information creating device, the map information creating method, and the map information creating program stated above.
Conventionally, there has been disclosed a three-dimensional (3D) model deformation operation device that, in a deformation operation of a 3D model of plant facilities or the like, carries out a reliable deformation operation without affecting the model geometry of equipment that is not subjected to the deformation operation.
This 3D model deformation operation device is provided with the 3D model; a constraint table in which cutting propriety conditions of each element of the 3D model are registered; a deformation condition input unit that inputs deformation conditions of the 3D model; and a deformation operation unit having three functions. The first is an intersection checking function that checks, using data of the 3D model and the constraint table, whether the cutting plane input from the deformation condition input unit intersects an element. The second is a cutting plane changing function that changes the cutting plane when the intersection checking function determines that "the element intersects and cutting is not allowed." The third is a deformation operation function that is executed when the intersection checking function determines that "the element intersects and cutting is allowed" or that "there is no intersection," or after the plane has been changed to one allowed to be cut by the cutting plane changing function (for example, see Patent Document 1 below).
Moreover, an element dividing method has been disclosed that efficiently divides the 3D geometry of an object into hexahedron elements to shorten operation time. In this element dividing method, the 3D geometry of the object is first input as a geometrical data group combining plane elements, divided into plural areas as seen transparently from a predetermined direction, with their height data. A predetermined number of articulation points are then provided on the boundary and/or outline of each area so that each area, or the area inside the outline, is divided into quadrangular elements by a group of parallel lines passing through the articulation points. After the quadrangular elements are grouped by height data and the same attribute is imparted to those of the same group, each quadrangular element is extended by a predetermined amount along its height direction in accordance with its attribute and divided by a predetermined division number in the height direction to create hexahedron elements. Finally, the grouping of the hexahedron elements belonging to the respective areas is canceled to merge them into one group, so that a 3D FEM (finite element method) model is completed (for example, see Patent Document 2 below).
Patent Document 1: Japanese Patent Laid-Open Publication No. 2000-200296
Patent Document 2: Japanese Patent Laid-Open Publication No. H10-31759
However, since the amount of data of 3D map information containing a 3D object is huge, the foregoing conventional techniques have a problem in that, for example, they cannot sufficiently reduce the amount of data of the 3D map information and require a large-capacity memory.
Particularly in an on-vehicle or a portable navigation apparatus, because available memory capacity is limited, there is a problem that, for example, the 3D map information described above cannot be applied to such a navigation apparatus.
On the other hand, if simple 3D map information is used, it can be applied to the above navigation apparatus because the amount of data is not huge. However, there is a problem that, for example, the drawn map becomes rough and a realistic image corresponding to the geometry of an actual road or the like cannot be obtained. In particular, a curve, a slope, or the like of the road cannot be drawn realistically, so that, for example, a user cannot recognize it intuitively.
A map information creating device according to the invention of claim 1 includes a geometry data extracting unit that extracts geometry data from map information including a three-dimensional object indicating three-dimensional geometry configured by width, height, and length, the geometry data including a cross-section constituted of at least the width and the height of the three-dimensional object; and a creating unit that creates a three-dimensional object having geometry identical to that of the three-dimensional object based on the geometry data extracted by the geometry data extracting unit.
Moreover, a map information creating method according to the invention of claim 7 includes a geometry data extracting step of extracting geometry data from map information including a three-dimensional object indicating three-dimensional geometry configured by width, height, and length, the geometry data including a cross-section constituted of at least the width and the height of the three-dimensional object; and a creating step of creating a same-geometry object having geometry identical to that of the three-dimensional object based on the geometry data extracted at the geometry data extracting step.
Furthermore, a map information creating program according to the invention of claim 8 causes a computer to execute the map information creating method according to claim 7.
Exemplary embodiments of a map information creating device, a map information creating method, and a map information creating program according to the present invention will be explained in detail below with reference to the accompanying drawings.
(Hardware Configuration of Map Information Creating Device)
First, a hardware configuration of the map information creating device according to the embodiment of the present invention will be explained.
The CPU 101 performs overall control of the map information creating device. The graphics processor 120 controls drawing and displaying of map information. The ROM 102 stores programs such as a boot program and may also be used as a recording medium for data. The RAM 103 is used as a work area of the CPU 101 and the graphics processor 120 and may also be used as a recording medium for data. The HDD 104 controls reading/writing of data from/to the HD 105 in accordance with the control by the CPU 101. The HD 105 stores the data written under the control of the HDD 104.
The CD/DVD drive 106 controls reading/writing of data from/to the CD/DVD 107 in accordance with the control by the CPU 101. The CD/DVD 107 is a removable recording medium from which recorded data is read out under the control of the CD/DVD drive 106. A writable recording medium can also be used as the CD/DVD 107. Besides the CD/DVD 107, the removable recording medium may be a CD-ROM (CD-R, CD-RW), a DVD-ROM (DVD-R, DVD±RW, DVD-RAM), an MO, a memory card, or the like.
The video/voice I/F (interface) 108 is connected to the display 109 for video display and to the speaker 110 (or a headphone) for voice output. The display 109 displays various data such as a cursor, an icon, a menu, a window, or a toolbox, as well as characters and images. As the display 109, a CRT, a TFT liquid crystal display, a plasma display, or the like can be employed, for example. Voice is output from the speaker 110.
The input I/F 111 inputs data transmitted from the remote controller/touch panel 112 or from the input button 113, which is provided with a plurality of keys for inputting characters, numeric values, various instructions, and the like.
The communication I/F 114 is connected to the network 115 such as the Internet or the like wirelessly or through a communication line, and connected to other devices via the network 115. The communication I/F 114 manages the interface between the network 115 and the CPU 101, and controls I/O of the data to/from an external device. The network 115 includes a LAN, a WAN, a public network, a portable telephone network, or the like.
(Functional Configuration of Map Information Creating Device)
Next, a functional configuration of the map information creating device according to the embodiment of the present invention will be described.
The map information database 201 stores the map information. The map information stored in the map information database 201 is explained specifically.
The map information 300 includes a ground surface object 301 representing the ground surface, a ground surface structure object 302 representing a ground surface structure such as a building on the ground surface, and a 3D road object 303 representing an elevated road constructed on the ground surface. The 3D road object 303 forms a 3D geometry from line segments representing the road width, the height, and the length of the road. The 3D road object 303 is not limited to a road and may be applied to any structure that forms a 3D geometry, is linear in its length direction, and has a uniform drawn texture. Examples include a tunnel, a median strip, and the road-crossing portion of a footbridge.
Specifically, these objects 301 to 303 can be expressed using the coordinate system described above. For example, each peak of the objects 301 to 303 can be specified by a coordinate in the coordinate system, as can the line segments between the peaks, such as the road width, the height, and the length of the road. Additionally, a texture corresponding to each of the objects 301 to 303 is drawn on it, and the drawing position of the texture can also be specified by a coordinate in the coordinate system. Drawing cycle information of the texture repeatedly drawn on each of the objects 301 to 303 is stored as well. Since other specific contents of the map information 300 are well known, their description is omitted here.
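The storage described above can be sketched as follows in Python; the field names, coordinates, and values are illustrative assumptions, not taken from the embodiment, and serve only to show how peak coordinates, a texture drawing position, and drawing cycle information might sit together in one object record.

```python
from dataclasses import dataclass

# Hypothetical record for one map object (e.g., the 3D road object 303).
# Every peak and the texture drawing position are coordinates in the
# shared coordinate system; drawing_cycle holds the repetition count
# of the texture along the length direction.
@dataclass
class MapObject:
    object_id: int
    peaks: list          # [(x, y, z), ...] peak coordinates of the geometry
    texture_id: int
    texture_origin: tuple  # drawing position of the texture
    drawing_cycle: float   # texture repetitions along the length

road = MapObject(
    object_id=303,
    peaks=[(0.0, 0.0, 0.0), (8.0, 0.0, 0.0), (8.0, 1.0, 0.0), (0.0, 1.0, 0.0)],
    texture_id=501,
    texture_origin=(0.0, 1.0, 0.0),
    drawing_cycle=10.3,
)
assert len(road.peaks) == 4
```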
The road network database 202 shown in
The geometry data extracting unit 231 extracts an ID for identifying the 3D road object 303 illustrated in
The link-length information extracting unit 204 extracts link length information from the road network data 400. Specifically, a node coordinate information group of each link 401 and the 3D road object ID assigned to each link 401 are extracted. Note that the same 3D road object 303 may be assigned to a plurality of links 401.
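The link records above can be sketched as follows; the dictionary keys and coordinates are hypothetical illustrations of the road network data 400, showing that one 3D road object ID may be shared by several links and that a link length follows from the node coordinates.

```python
# Hypothetical link records: each link 401 carries its node coordinates
# and the ID of the 3D road object assigned to it. The same object ID
# may appear on many links.
links = [
    {"node_coords": [(0, 0, 0), (50, 0, 0)], "road_object_id": 303},
    {"node_coords": [(50, 0, 0), (120, 0, 5)], "road_object_id": 303},
]

def link_length(link):
    """Straight-line distance between the link's two node coordinates."""
    (x1, y1, z1), (x2, y2, z2) = link["node_coords"]
    return ((x2 - x1) ** 2 + (y2 - y1) ** 2 + (z2 - z1) ** 2) ** 0.5

assert link_length(links[0]) == 50.0
```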
The texture information extracting unit 232 extracts, from the 3D road object 303, texture information constituted by the texture drawn on a surface of the 3D road object 303, drawing cycle information of the texture, and information on a representative color of that surface. For example, for the 3D road object 303, a road surface texture is extracted in which the road surface and lane markings, such as a center line ruled on the road surface, are drawn on the top surface.
Since a road generally extends linearly, the road surface texture is drawn repeatedly in the length direction of the 3D road object 303. Accordingly, the amount of data can be reduced by extracting this repeating cycle (drawing cycle). The texture information may also cover textures drawn on the side surfaces or the undersurface. Additionally, the information extracted by the texture information extracting unit 232 includes the information on the representative color of the surface, which is used when drawing is carried out using a single color instead of the texture, or a combination of the color and the texture.
An extraction example of the 3D road object 303 using the map information extracting unit 203 and the link-length information extracting unit 204 will be explained.
As shown in
The creating unit 206 is provided with a geometry drawing unit 261, a texture drawing unit 262, and a detection unit 263. The geometry drawing unit 261 generates a 3D object with the same geometry as that of the 3D road object 303 by drawing the geometry data 310 extracted by the geometry data extracting unit 231 so that it appears extended in the direction perpendicular to the cross-section S. This drawing by extension can be performed using the peak coordinates of the cross-section S, and the length of the extension is determined based on the link length information, for example. The direction of the extension may also be inclined by the vertical interval between the node coordinates of the link length information as illustrated in
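The extension of the cross-section S can be sketched as follows; the coordinate convention (cross-section in the x-y plane, extension along z) and the sample width and height are assumptions for illustration only, not the embodiment's actual data.

```python
# Minimal sketch of extruding a cross-section S, given as peak
# coordinates in the x-y plane, perpendicular to itself by a length L:
# each peak of the near end face is copied to a far end face at z = L.
def extrude(cross_section, length):
    """Return (near_face, far_face) peak coordinates of the extended object."""
    near = [(x, y, 0.0) for (x, y) in cross_section]
    far = [(x, y, length) for (x, y) in cross_section]
    return near, far

S = [(0.0, 0.0), (8.0, 0.0), (8.0, 1.0), (0.0, 1.0)]  # width 8, height 1
near, far = extrude(S, 50.0)
assert far[0] == (0.0, 0.0, 50.0)
```

An inclined extension, as used for a slope, would additionally offset the y coordinate of the far face by the vertical interval between the link's node coordinates.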
The texture drawing unit 262 generates a 3D object having the same geometry and texture as those of the 3D road object 303 based on the texture information extracted by the texture information extracting unit 232. Specifically, the extracted texture is drawn on the surface of the 3D road object 303 repeatedly, for the number of repetitions indicated by the drawing cycle information P. For example, in the case of the road surface texture 501 illustrated in
When the drawing cycle information P is, for example, "10.3", which contains the fractional part "0.3" in addition to the integer value "10", the texture is drawn for the number of sheets of the integer value plus a partial sheet for the length corresponding to the fractional part.
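The split of the drawing cycle information P into full sheets and a trailing partial sheet can be sketched as follows; the function name is illustrative.

```python
import math

# Split the drawing cycle information P into the number of complete
# texture sheets and the fraction of one more sheet, as in the
# P = 10.3 example: 10 full sheets plus a 0.3-length partial sheet.
def texture_repetitions(p):
    full = math.floor(p)   # complete texture sheets
    partial = p - full     # fraction of one additional sheet
    return full, partial

full, partial = texture_repetitions(10.3)
assert full == 10
assert abs(partial - 0.3) < 1e-9
```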
The detection unit 263 detects whether first end face geometry data representing an end face of one 3D object generated by the creating unit 206 intersects with second end face geometry data representing an end face of another 3D object. Specifically, the detection unit 263 detects whether the end faces intersect by determining whether the peak coordinates of the first end face geometry data coincide with those of the second end face geometry data.
The detection unit 263 compares the coordinates of the peaks a, b, c, and d of the first end face geometry data 1011 with the coordinates of the peaks e, f, g, and h of the second end face geometry data 1012, respectively. When all of the pairs coincide, the first end face geometry data 1011 of the one 3D object 1001 and the second end face geometry data 1012 of the other 3D object 1002 are drawn in plane contact with each other, so that the two 3D objects 1001 and 1002 are connected without a gap.
Meanwhile, when any of the pairs does not coincide, the end face geometry data 1011 of the one 3D object 1001 and the end face geometry data 1012 of the other 3D object 1002 intersect with each other, and a gap 1000 is generated between the connected 3D objects 1001 and 1002, as illustrated in
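The peak-by-peak comparison above can be sketched as follows; the tolerance parameter and sample coordinates are assumptions for illustration.

```python
# The end faces are connected without a gap only when every peak pair
# (a-e, b-f, c-g, d-h) coincides; any mismatch means the faces
# intersect and complement processing is needed.
def faces_coincide(first_face, second_face, tol=1e-9):
    """first_face = [a, b, c, d], second_face = [e, f, g, h] peak coords."""
    return all(
        all(abs(p - q) <= tol for p, q in zip(peak1, peak2))
        for peak1, peak2 in zip(first_face, second_face)
    )

face1 = [(0, 0, 50), (8, 0, 50), (8, 1, 50), (0, 1, 50)]
face2 = [(0, 0, 50), (8, 0, 50), (8, 1, 50), (0, 1, 50)]
assert faces_coincide(face1, face2)
assert not faces_coincide(face1, [(0, 0, 51)] + face2[1:])
```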
Based on the detection result of the detection unit 263, the geometry drawing unit 261 then generates a complementary 3D object that fills the gap between the one 3D object 1001 and the other 3D object 1002, using the first and second end face geometry data 1011 and 1012.
To generate the complementary 3D object 1100, the two edges A and B in the height direction of the first end face geometry data 1011 are first extracted. Then, from the two edges C and D in the height direction of the second end face geometry data 1012, the edge C that does not overlap the one 3D object 1001 is extracted. The peaks a and b of the edge A and the peaks c and d of the edge B are then extended to the peaks e and f of the edge C, thereby drawing the complementary 3D object 1100 in the shape of a triangular prism.
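The assembly of the triangular prism from the three height-direction edges can be sketched as follows; the coordinates are invented for illustration and the function merely collects the two triangular faces implied by the extension of A and B toward C.

```python
# Edges A = (a, b) and B = (c, d) belong to the first end face; edge
# C = (e, f) belongs to the second end face. Extending A and B to C
# yields a triangular prism whose triangular faces are (a, c, e) at
# the top and (b, d, f) at the bottom.
def complementary_prism(edge_a, edge_b, edge_c):
    (a, b), (c, d), (e, f) = edge_a, edge_b, edge_c
    top_triangle = (a, c, e)
    bottom_triangle = (b, d, f)
    return top_triangle, bottom_triangle

top, bottom = complementary_prism(
    ((0, 1, 50), (0, 0, 50)),  # edge A: peaks a (top), b (bottom)
    ((8, 1, 50), (8, 0, 50)),  # edge B: peaks c, d
    ((4, 1, 52), (4, 0, 52)),  # edge C: peaks e, f
)
assert top == ((0, 1, 50), (8, 1, 50), (4, 1, 52))
```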
Note that the map information database 201 and the road network database 202 described above specifically achieve their functions using a recording medium such as the ROM 102, the RAM 103, the HD 105, or the CD/DVD 107 illustrated in
Next, a map information creating process according to a first embodiment will be explained.
The geometry drawing unit 261 then draws the geometry data 310 so that it appears extended in the direction perpendicular to the cross-section S of the extracted geometry data 310 (step S1203). Thereafter, the texture drawing unit 262 draws the road surface texture 501, for the number of repetitions indicated by the drawing cycle information P, on the surface of the 3D object generated by the extension with the same geometry as that of the 3D road object 303 (step S1204).
According to the first embodiment, by extending the geometry data 310, a 3D object that has the same geometry and the same road surface texture 501 as the 3D road object 303 stored in the map information database 201 can be generated with a small amount of data.
Next, a map information creating process according to a second embodiment will be explained.
As shown in
According to the second embodiment, since a 3D object with the same geometry as that of the 3D road object 303 can be generated by extending the geometry data 310 by the length L of the link 401, a 3D object can be generated that corresponds to the road network data 400 illustrated in
Next, a map information creating process according to a third embodiment will be explained.
As shown in
According to the third embodiment, by extending the geometry data 310 along the direction indicated by the vertical interval of the link 401, a connected section of 3D objects, such as a slope with a gradient, can be drawn without the gap 1000, enabling generation of an object whose geometry is adapted to the actual road surface.
Next, a texture drawing process according to a fourth embodiment will be explained.
As shown in
On the other hand, if the fractional part is contained (step S1502: YES), the texture drawing unit 262 draws the texture in the range corresponding to the fractional part of the drawing cycle information P on the object generated by the geometry drawing unit 261 (step S1503). Specifically, as illustrated in
According to the fourth embodiment, a partial texture corresponding to the fractional part of the drawing cycle information P can be drawn.
Next, complement processing according to a fifth embodiment will be explained.
In contrast, if the sets of end face geometry data 1011 and 1012 intersect (step S1601: YES), the edges A through C for drawing the complementary 3D object 1100 are determined (step S1602). Specifically, the two edges A and B in the height direction of the end face geometry data 1011 of one of the connected 3D objects 1001 are extracted, and from the two edges C and D in the height direction of the end face geometry data 1012 of the other 3D object 1002, the edge C that does not overlap the one 3D object 1001 is extracted. The edges A through C for drawing the complementary 3D object 1100 are thus determined.
The complementary 3D object 1100 is then drawn using the determined edges A through C (step S1603). Specifically, the peaks a and b of the edge A and the peaks c and d of the edge B are drawn so that they appear extended to the peaks e and f of the edge C, whereby the complementary 3D object 1100 in the shape of the triangular prism can be drawn.
According to the fifth embodiment, a connected section of 3D objects, such as a curve, can be drawn without the gap 1000, enabling generation of an object whose geometry is adapted to the actual road surface.
As described above, according to the map information creating device, the map information creating method, and the map information creating program of the embodiments of the present invention, the realistic 3D map information 300 can be generated with a small amount of data. Moreover, according to the present invention, a large-capacity memory is not necessary, so an inexpensive, small-capacity memory can be employed.
In particular, when the invention is applied to an on-vehicle or portable navigation apparatus, the map information 300 within the range seen from input viewpoint coordinates is extracted, so the required virtual 3D road object can be displayed three-dimensionally only when it is required for display. Moreover, since a general-purpose 3D object can be shared, the amount of data of the map information 300 can be reduced.
Furthermore, since the realistic 3D map information 300 is reproducible, a user can intuitively recognize that the map information 300 currently displayed on the display screen corresponds to the scenery actually viewed with the naked eye. The user is therefore not puzzled by any inconsistency between the displayed map information 300 and the scenery currently viewed, and can drive safely.
The map information creating method described in the embodiments can be realized by a computer, such as a personal computer, a workstation, or a built-in device, executing a program prepared in advance. This program is recorded on a computer-readable recording medium, such as a hard disk, a flexible disk, a CD, a DVD, an MO, a memory card, a RAM, or a ROM, and is executed by being read out from the recording medium by the computer. Additionally, this program may be distributed via a transmission medium such as a network including the Internet.
Number | Date | Country | Kind |
---|---|---|---|
2004-108250 | Mar 2004 | JP | national |
2004-381827 | Dec 2004 | JP | national |
Filing Document | Filing Date | Country | Kind | 371c Date |
---|---|---|---|---|
PCT/JP2005/004493 | 3/15/2005 | WO | 00 | 9/26/2006 |