1. Field of the Invention
The present invention relates to a three-dimensional map drawing system for drawing a three-dimensional map that expresses features three-dimensionally.
2. Description of the Related Art
There is known technology of displaying features, such as buildings and roads, three-dimensionally in the display of an electronic map, for example, on the screen of a navigation device or a computer. In such a three-dimensional map, the features are generally drawn by the projection method called perspective projection.
FIGS. 1a and 1b illustrate an example of drawing a three-dimensional map by perspective projection. As shown in these drawings, perspective projection draws features nearer to the point of view larger, so the scale is not uniform within the drawn map.
Japanese Patent Nos. JP4070507B and JP3428294B describe conventional technologies for display in three-dimensional maps. The technology disclosed in JP4070507B determines whether a traffic sign is hidden behind a building in perspective projection of three-dimensional map information and gives advance notice of the traffic sign hidden behind the building. The technology disclosed in JP3428294B displays a character representing the name only for polygons having an area at or above a predetermined level in perspective projection from a midair point of view.
The technology of drawing a three-dimensional map generally aims to reproduce the real world in the user's view and does not consider the problem that perspective projection changes the scale within the drawing. A varying scale is a significant problem that may greatly damage the meaning of the map; it means that the three-dimensional image drawn by perspective projection is useless as a map. In order to understand the accurate positional relationship between features, the user must refer to a planar map.
In order to solve the foregoing problem, the object of the invention is to provide a three-dimensional map while keeping the scale unchanged.
The following describes the configuration of a three-dimensional map drawing system for drawing a three-dimensional map that expresses a feature three-dimensionally according to the invention. The three-dimensional map drawing system of the invention includes a feature database, a drawing range input and a drawer. The feature database stores feature data as two-dimensional drawing data of a feature projected on a plane by parallel projection from an oblique direction inclined by a predetermined projection angle to the vertical direction. The system of the invention draws the feature not by perspective projection but by parallel projection. The feature data may be provided in the form of raster data or in the form of polygonal data. Polygonal data has the advantage of reducing the total data volume and providing a high-quality map, since polygonal data does not give a grainy image even when enlarged.
The drawing range input inputs specification of a drawing range in which the three-dimensional map is to be drawn, and the drawer reads feature data corresponding to the input specification from the feature database and makes a drawing. The drawing includes both displaying a map on a display of, for example, a computer or a navigation device and printing a map with, for example, a printer. The range to be drawn (hereinafter referred to as drawing range) may be specified by any of various methods, for example, a method of directly specifying the drawing range with the coordinates of latitude and longitude or a method of specifying the drawing range by setting a representative point of the drawing range and the scaling factor. The drawing range may be set manually by the user's input or may be set automatically by the drawing range input, based on, for example, the current position during route guidance.
In the map drawn by the three-dimensional map drawing system of the invention, each feature is expressed three-dimensionally by parallel projection.
FIG. 2b shows an example of a map drawn by parallel projection. For example, the edges of buildings BLD1, BLD2 and BLD3 are formed by parallel lines. In parallel projection, edges that are parallel in the actual features are also shown in parallel on the map. This means that a building actually having a fixed width is shown with a fixed width on the map. The same applies to the width of a road and the interval between buildings. Parallel projection keeps the scale in the left-right direction and in the depth direction unchanged, irrespective of the three-dimensional expression of features.
The present invention employs parallel projection to provide a three-dimensional map while keeping the scale unchanged. The parallel projection may make a drawing with the depth direction multiplied by a fixed scale factor, in order to match the sense of depth in the projection drawing with the actual depth. Even with multiplication by the fixed scale factor, parallel projection keeps the scale relatively unchanged in the left-right direction and in the depth direction.
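For illustration only, the scale-preserving property can be verified with a minimal sketch (Python is used here purely for illustration; the projection angle, the shift applied along +y on the drawing plane, and all coordinates are assumptions, not the actual implementation):

```python
import math

def parallel_project(x, y, z, angle_deg=45.0):
    """Oblique parallel projection onto the drawing plane (a sketch).

    Every point shifts by the same amount per unit of height, so
    parallel edges stay parallel and horizontal distances keep scale.
    """
    shift = z * math.tan(math.radians(angle_deg))
    return (x, y + shift)  # project along a fixed direction, here +y

# Two vertical building edges 10 m apart stay exactly 10 m apart:
a0, a1 = parallel_project(0, 0, 0), parallel_project(0, 0, 30)
b0, b1 = parallel_project(10, 0, 0), parallel_project(10, 0, 30)
assert b0[0] - a0[0] == 10 and b1[0] - a1[0] == 10
```

Because every point shifts by the same amount per unit of height, parallel edges stay parallel and horizontal distances keep their scale.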
The technology of parallel projection is widely used in the field of drawing but has not been applied to three-dimensional maps. As described previously, three-dimensional maps have aimed to faithfully draw the landscape as the user's view, so that introduction of parallel projection has not been considered at all. The inventors of the invention have introduced parallel projection by changing the point of view from faithfully drawing the landscape to placing importance on keeping the scale of the map unchanged.
The system of the invention employs parallel projection and has the following advantageous effects on the processing load of drawing a map. Perspective projection requires specification of a point of view, so a rendering process must be performed anew every time the drawing range or the point of view changes.
Parallel projection, on the other hand, can provide the drawing result with respect to a predetermined projection direction in advance, since it does not require specification of a point of view. The drawing result generated independently of the point of view can be used commonly, irrespective of the specified drawing range of the map. The system of the invention accordingly does not require the rendering process in the course of drawing a three-dimensional map, thus significantly reducing the processing load during map drawing.
According to one embodiment of the invention, the feature data may be obtained by parallel projection of a virtual feature that is given by expansion of the feature in a height direction by a factor that is greater than 1. The user generally looks up at a feature from the ground level. Obliquely downward parallel projection of the feature may cause the sense of height of the feature in the projection drawing to be lower than the sense of height in the look-up view of the actual feature. Projecting a virtual feature that is higher than the actual feature effectively relieves this feeling of strangeness in the sense of height. Expansion of the feature in the height direction by the factor keeps the scale in the left-right direction and in the depth direction unchanged, thus ensuring the advantageous effects of the invention.
According to another embodiment of the invention, the feature database may store feature data in a plurality of levels having different scales. Dividing the feature data into the plurality of levels provides detailed data of a narrower area in the lower level, while providing data of a wider area in the higher level with details of features omitted to reduce the data volume. The plurality of levels may be three or more levels. The drawer may draw the map by using feature data in the level corresponding to the drawing range.
In the application of multiple levels, it is preferable to generate feature data in each level by parallel projection in the same projection direction and at the same projection angle (hereinafter the projection direction and the projection angle are collectively referred to as “parallel projection parameters”). The projected position of a certain coordinate point is determined by the parallel projection parameters. Employing the same parallel projection parameters in the respective levels enables the projected positions of one identical coordinate point in the respective levels to be readily correlated. This allows relatively easy scaling of the map, while fixing a specific point, such as the center position in the drawing range.
According to another embodiment, the parallel projection parameters may be changed for each level. In the case of changing the level used for drawing, it is preferable to make a drawing in the next level after analysis of the position to which the specific point, such as the center position in the drawing range, is projected in the next level.
According to another embodiment of the invention, the feature database may store different types of feature data for one identical area with respect to different projection directions. The drawing range input may further input specification of a drawing direction of the three-dimensional map, and the drawer may make the drawing by using feature data with respect to a projection direction corresponding to the specified drawing direction.
This configuration enables three-dimensional maps to be drawn in views from various directions, thereby improving convenience. The three-dimensional map of a feature projected from only one direction has a blind spot where the feature is hidden behind a building drawn three-dimensionally. Providing the three-dimensional maps in different projection directions eliminates the blind spot. In application of the three-dimensional maps to route guidance, the three-dimensional maps in different projection directions are selectively used along the route to be guided. This enables smooth head-up display where the moving direction is upward on the display. In this embodiment, the drawing direction may be specified by the user or may be set to the user's moving direction in head-up display.
According to another embodiment, the system may further include a character database that stores character data used to specify a character representing the name of the feature. When there are feature data for a plurality of projection directions, the character database may be provided individually for each projection direction, or one character database may be correlated to the different types of feature data with respect to the different projection directions. In correlation to the different types of feature data, the character data may have a flag that specifies output or non-output of the character on the three-dimensional map according to the projection direction (hereinafter referred to as behind flag). The drawer may output the character specified to be output by the flag on the three-dimensional map. The output herein includes, for example, displaying on a display and printing with a printer.
In the three-dimensional map, a feature is visually recognized or is hidden in the blind spot according to the projection direction. It is thus preferable to control output or non-output of the character representing the feature, based on the visibility. In this embodiment, display or non-display of the character is stored in advance as the setting of the behind flag. This enables the output of the character to be adequately controlled without any complicated arithmetic operation or analysis.
The character data may include three-dimensional specification of an output position of the character including height information. The drawer may output the character at a position determined by applying to the specified output position the same parallel projection as used in generation of the feature data. This ensures a display reflecting the height information.
This configuration enables the character to be displayed not only on the ground level of the feature expressed three-dimensionally but also in the vicinity of the wall surface of the upper stories or above the feature, thus outputting an easy-to-read three-dimensional map. The height information may be specified in any of various forms, for example, an observed value in meters or the number of stories of the feature. The specification form of the height information may be set for each feature type. A different height may be specified according to the drawing direction.
According to another embodiment of the invention, the feature database may store the feature data in meshes that are predetermined two-dimensional areas. Each mesh may preferably be allowed to store feature data of a feature that is not located in the mesh.
In the general planar electronic map, data of a feature that is located across a plurality of meshes is stored as polygons divided into the meshes. In the three-dimensional electronic map, data of a feature that is located across a plurality of meshes is stored dividedly into the meshes when the feature is dividable, while being stored collectively in one mesh when the feature is undividable. In the general electronic map data, feature data stored in each mesh is generally data of a feature that is located partly or wholly in the mesh.
In parallel projection, however, there is the new problem that an upper part of a feature that is located in a certain mesh may be drawn in another mesh. As the measure to solve this problem, the system of the invention allows each mesh to store data of even a feature that is not located in the mesh. This configuration draws a three-dimensional map of even a feature that is to be drawn across a plurality of meshes, by parallel projection without any complicated processing.
According to another embodiment of the invention, data of a road included in the feature data may be obtained by parallel projection taking into account the height of each point on the road, i.e., undulation. The following configuration is preferable to draw route guidance information, such as the route or the current position, on the three-dimensional map by using the feature data.
A guidance information input may input route guidance information to be drawn, i.e., at least one of a route position and a current position, as three-dimensionally specified information including height. The drawer may draw the route by applying to the specified information the same parallel projection as used in generation of the feature data.
When only one of the road data and the route guidance information has height information, parallel projection may cause the route guidance information to be drawn out of the road. The system of this embodiment, however, ensures the same parallel projection for both the route guidance information and the road, thus allowing the route guidance information to be drawn without any deviation.
When road data is provided based on two-dimensional data that does not consider the undulation, the route guidance information may also be drawn based on two-dimensional data without parallel projection. In this case, it is preferable to provide features other than roads, for example, buildings, as data that does not consider the undulation.
The present invention is not limited to the three-dimensional map drawing system described above but may be configured by any of various other aspects. For example, another aspect of the invention may be a feature data generating method that causes feature data used for drawing a three-dimensional map to be generated by the computer.
The feature data generating method provides a three-dimensional map database that stores in advance a three-dimensional model representing a three-dimensional shape of a feature. The three-dimensional map database may be provided in the computer that generates the feature data, may be provided in the form stored in a medium, such as a DVD, or may be stored in a server accessible by the computer via a network.
The computer inputs specification of a target area as an object of generating the feature data, and inputs a three-dimensional model that is located in the target area and an adjacent area in a predetermined range adjoining to the target area from the three-dimensional map database into a memory. The computer subsequently generates two-dimensional drawing data of a feature projected on a plane by parallel projection, from the three-dimensional model input in the memory, and stores the generated two-dimensional drawing data in the memory. The computer then extracts data in the specified target area from the two-dimensional drawing data, generates the feature data from the extracted data, and outputs the generated feature data.
As described above, the target of parallel projection according to the invention is the three-dimensional model that is located in not only the target area but in the adjacent area in the predetermined range adjoining to the target area. There is a possibility that parallel projection causes a feature that is located in a certain mesh to be drawn out of the certain mesh in the projection drawing. There is also a possibility that parallel projection causes a feature that is not located in the target area to be partly drawn within the target area in the projection drawing. By taking into account these possibilities, the method of the invention includes the adjacent area in the predetermined range adjoining to the target area as the target of parallel projection. This configuration provides a projection drawing of even the feature that is not located in the target area.
The predetermined range may be set according to the size of the mesh and the parallel projection parameters. The predetermined range should be narrowed at the projection angle closer to the vertical axis and widened at the projection angle closer to the horizontal axis.
The predetermined range may not be set evenly around the target area. For example, projection from the easterly direction causes a projection drawing to be extended in the westerly direction. A feature that is located in a mesh to the west of the target area is accordingly not drawn in the target area in the projection drawing. By taking into account the projection direction, the predetermined range may be only the meshes in the direction having the possibility of drawing a feature, which is located in those meshes, in the target area, i.e., only the adjacent meshes on the side of the projection direction from the target area.
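The required width of the adjacent area follows from the horizontal displacement z·tan(projection angle) of a point of height z. The following hypothetical helper (the names and example values are assumptions) estimates how many rings of neighboring meshes on the side of the projection direction could spill features into the target area:

```python
import math

def adjacent_mesh_rings(max_feature_height_m, mesh_size_m, proj_angle_deg):
    """Number of neighboring mesh rings whose features could spill into
    the target mesh, given the displacement z * tan(projection angle)."""
    spill = max_feature_height_m * math.tan(math.radians(proj_angle_deg))
    return math.ceil(spill / mesh_size_m)

# e.g. 200 m towers, 500 m meshes, 45-degree projection -> 1 ring
print(adjacent_mesh_rings(200, 500, 45))  # -> 1
```

As the sketch suggests, the range narrows at projection angles closer to the vertical and widens at angles closer to the horizontal, consistent with the description above.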
The invention may also be configured by a three-dimensional map drawing method that causes the computer to draw a three-dimensional map, as well as by a computer program that causes the computer to draw a three-dimensional map. The invention may further be configured by a computer readable storage medium, in which such a computer program is stored. The storage medium may be a flexible disk, a CD-ROM, a magneto-optical disk, an IC card, a ROM cartridge, a punched card, a printed matter with a barcode or another code printed thereon, any of internal storage devices (memories such as RAM and ROM) and external storage devices of the computer, or any of various other computer readable media.
a and 1b illustrate an example of drawing a three-dimensional map by perspective projection;
a and 2b illustrate an example of drawing a three-dimensional map by parallel projection;
a through 15c illustrate an output example of a three-dimensional map according to an embodiment;
a through 21c illustrate an example of route guidance.
Some embodiments of the invention are described in the following sequence:
A. Device Configuration
B. Feature Data Structure
C. Feature Data Generating Method
D. Character Data Structure
E. Map Display Process
F. Route Guidance Process
The cell phone 300 has various functional blocks working under a main controller 304. The main controller 304 and the respective functional blocks are provided by installing software that implements the respective functions according to this embodiment, but part or all of such functional blocks may be provided by hardware configuration. A transmitter/receiver 301 makes communication with the server 200 via the network NE2. According to this embodiment, map data for displaying a three-dimensional map and commands are mainly transmitted and received by communication.
A command input 302 enters the user's instructions through the operations of a keyboard 300k. According to this embodiment, the instructions may include specifying the display area and the scaling of a three-dimensional map and setting the place of departure and the destination in route guidance. A GPS input 303 obtains the coordinate values of latitude and longitude, based on GPS (Global Positioning System) signals. In the route guidance, the direction of movement is also computed based on a variation of the latitude and the longitude.
A map information storage 305 is provided as a buffer to temporarily store map data provided from the server 200. When the map to be displayed changes continually as in the route guidance, map data that is not obtainable from the map information storage 305 is received from the server 200 to display the map. A map matching transformer 307 subjects the coordinate values of the route positions and the current positions to the required coordinate transformation, in order to display the found route and the current position with accuracy on the roads of the three-dimensional map displayed by parallel projection during the route guidance. The method of coordinate transformation will be described later.
A display controller 306 displays a three-dimensional map on a display 300d of the cell phone 300, based on data provided from the map information storage 305 and the map matching transformer 307. The server 200 has the illustrated functional blocks. These functional blocks are provided by installing software that implements the respective functions according to this embodiment, but part or all of such functions may be implemented by hardware configuration.
A map database 210 is provided as a database for displaying three-dimensional maps. According to this embodiment, map data including feature data 211, character data 212 and network data 213 are stored in the map database 210. The feature data 211 are used to display features, such as roads and buildings, three-dimensionally and are provided as two-dimensional polygonal data obtained by parallel projection of the three-dimensional model of the features. The character data 212 represent letters and characters to be displayed on the map, for example, the names of the features and the place names. The network data 213 represent the roads expressed by a set of nodes and links. The nodes are provided as data corresponding to the intersections of the roads and the end points of the roads. The links are lines interconnecting the nodes and are provided as data corresponding to the roads. According to this embodiment, the positions of the nodes and the links included in the network data 213 are specified by three-dimensional data of the latitude, the longitude and the height.
A transmitter/receiver 201 sends and receives data to and from the cell phone 300 via the network NE2. According to this embodiment, map data for displaying a three-dimensional map and commands are mainly transmitted and received. The transmitter/receiver 201 also makes communication with the data generation device 100 via a network NE1. According to this embodiment, generated map data are mainly sent and received by communication.
A database manager 202 controls reading and writing data from and into the map database 210. A route finder 203 uses the network data 213 in the map database 210 for route search. Dijkstra's algorithm may be employed for route search. The data generation device 100 has the illustrated functional blocks. These functional blocks are provided by installing software that implements the respective functions in the personal computer according to this embodiment, but part or all of such functions may be implemented by hardware configuration.
A transmitter/receiver 105 sends and receives data to and from the server 200 via the network NE1. A command input 101 enters the operator's instructions through the operations of, for example, a keyboard. According to this embodiment, the instructions include specifying a target area for generation of map data and specifying parallel projection parameters.
A 3D map database 104 is provided as a database for storing the three-dimensional model used to generate map data. Electronic data representing the three-dimensional shapes of features such as roads and buildings are stored in the 3D map database 104. The 3D map database 104 may utilize the three-dimensional model generally provided for displaying a three-dimensional map by perspective projection. A parallel projector 102 generates feature data by parallel projection drawing based on the 3D map database 104. The projection drawings are stored as parallel projection data 103 and are eventually stored via the transmitter/receiver 105 as the feature data 211 into the map database 210 of the server 200. The parallel projector 102 determines whether each feature is located in a blind spot of another feature in the course of parallel projection process and transfers the result of determination to a behind flag setter 106.
The behind flag setter 106 inputs character data representing the name of each feature from the 3D map database 104 and sets a behind flag that specifies whether the character string is to be displayed or hidden on the map, based on the result of determination received from the parallel projector 102. The behind flag is set to a value representing hidden characters when the feature is located in a blind spot of another feature, while being set to a value representing display characters when the feature is not located in a blind spot of another feature. According to this embodiment, feature data 211 are prepared with respect to a plurality of projection directions. The blind spot depends on the projection direction, so that the behind flag is set for each projection direction.
According to this embodiment, on the other hand, the building M3 is projected on a plane P2 in an oblique direction (direction of arrow A2 in the drawing) that is inclined by a predetermined projection angle to the vertical direction by parallel projection. The resulting data D2 represents the building M3 three-dimensionally as a building M2. Although the building M2 is expressed three-dimensionally, the data D2 is the projected two-dimensional drawing data. According to this embodiment, polygonal data for drawing the building M2 are specified by a sequence of points, such as coordinate values (u1,v1) and (u2,v2), in the uv coordinates in the projection plane. The side walls and the roof of the building M2 may be provided as separate polygonal data, or alternatively the whole building M2 may be provided as integral polygonal data. Windows W may be provided as textures to be attached to the wall surfaces of the building, i.e., raster data, or may alternatively be provided as separate polygonal data. The feature data of this embodiment is constructed by two-dimensional data obtained by projecting each feature by parallel projection in the oblique direction as described above.
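A minimal sketch of projecting one vertex in this manner is shown below. It uses the same geometry as the correction vector derived later in this description; the axis conventions (x easterly, y northerly, z height; projection direction measured from north at 0 degrees) and the function names are assumptions for illustration:

```python
import math

def project_vertex(x, y, z, proj_angle_deg, proj_dir_deg):
    """Parallel-project a 3D vertex onto the map plane (a sketch).

    proj_angle_deg is the inclination from the vertical; proj_dir_deg
    is the projection direction, 0 degrees at north, clockwise."""
    t = z * math.tan(math.radians(proj_angle_deg))
    ay = math.radians(proj_dir_deg)
    return (x - t * math.cos(ay), y + t * math.sin(ay))

# The roof corner of a 40 m building is displaced, its footprint is not:
print(project_vertex(100.0, 200.0, 0.0, 45, 90))   # ground corner
print(project_vertex(100.0, 200.0, 40.0, 45, 90))  # roof corner
```

Connecting the projected ground and roof corners of each wall yields exactly the kind of two-dimensional polygon described above.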
Parallel projection (arrow CH01) of the building BL01 located at this position causes the building BL01 to be drawn three-dimensionally as shown in meshes M03 and M04. According to this embodiment, the latitude and the longitude P02 (LAT02, LON02) at the lower left corner is identical with the latitude and the longitude P01 of the mesh M01. In other words, the meshes M03 and M04 are defined to have the latitudes and the longitudes of the respective apexes that are identical with the latitudes and the longitudes of the respective apexes of the meshes M01 and M02 on the plane. Alternatively the meshes M03 and M04 on the projection plane may be set independently of the meshes M01 and M02 on the plane.
As the result of parallel projection, the building BL01 is drawn by a part BL04 in the mesh M04, in addition to a part BL03 in the mesh M03. According to this embodiment, as shown by arrows CH03 and CH04, a polygon of one building BL01 is divided into the part BL03 belonging to the mesh M03 and the part BL04 belonging to the mesh M04, which are managed as separate polygonal data.
The rightmost drawings illustrate the structures of the respective polygonal data. The name, the position, the shape, the type, the character, and the attribute are stored as data of each polygon. According to this embodiment, the name is the name of the building BL01. The common name is assigned to the part BL03 belonging to the mesh M03 and the part BL04 belonging to the mesh M04, so that these two parts BL03 and BL04 are identifiable as polygons related to one identical building. Alternatively the name may be a proper name of each polygon. In this case, it is preferable to provide additional information interlinking the polygons related to one identical building.
The position is the latitude and the longitude expressed by the coordinates (LATb, LONb) where the building BL01 is located. The shape is data on the sequence of points forming the polygon in the relative two-dimensional coordinates uv defined in each mesh. The shape data regarding the part BL03, such as Pb1 (u1, v1) and Pb2 (u2, v2), are the uv coordinate values in the mesh M03 representing the positions of the apexes Pb1 and Pb2. The shape data regarding the part BL04, such as Pb3 (u3, v3) and Pb4 (u4, v4), are the uv coordinate values in the mesh M04 representing the positions of the apexes Pb3 and Pb4.
The type of the feature represented by the polygon is stored as the type. The character is data representing the name of the feature. Since the character data is provided separately from the feature data according to the embodiment, data representing the storage link of the character data (LINK in the illustrated example) is stored as the character of the feature data. The data representing the storage link may be a path, an address, or a URL (Uniform Resource Locator) of the character data relating to the building BL01. The attribute is additional information regarding the feature. For example, the height and the number of stories may be the attribute of the building, and the lane width and the road type such as national road may be the attribute of the road.
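For illustration, such a polygon record might be modeled as follows; the field names, types and sample values are purely illustrative and do not reflect the actual data format:

```python
from dataclasses import dataclass, field

@dataclass
class FeaturePolygon:
    """One polygon record, mirroring the fields described above."""
    name: str          # building name, shared by split parts of one feature
    position: tuple    # representative (lat, lon) of the feature
    shape: list        # sequence of points in per-mesh uv coordinates
    type: str          # feature type, e.g. "building" or "road"
    character: str     # storage link (path/address/URL) to the character data
    attribute: dict = field(default_factory=dict)  # height, stories, lanes, ...

bl03 = FeaturePolygon(
    name="BL01", position=(35.0, 135.0),
    shape=[(0.20, 0.95), (0.25, 0.90)], type="building",
    character="chars/BL01", attribute={"stories": 12},
)
```

In this sketch, the two parts of a split building would carry the same name and the same character link, which is how the parts remain identifiable as one feature.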
The upper right drawing IMG1 is the parallel projection drawing at the projection angle PA1. The lower right drawing IMG2 is the parallel projection drawing at the projection angle PA2. At the smaller projection angle Ang, the positional relationship between buildings is easily understandable like the plane map as shown in the drawing IMG1. At the larger projection angle Ang, on the other hand, the shape of each building is intuitively understandable as shown in the drawing IMG2. The projection angle may be set by taking into account such visual effects. Alternatively, a plurality of feature data at different projection angles may be provided to allow the user's selection.
According to this embodiment, the actual building is not directly projected by parallel projection but is projected by parallel projection after multiplication of a factor that is greater than 1 only in the height direction. As shown in the left drawing, parallel projection of a virtual building of a height C·h obtained by multiplying the actual height h of a building BLD by a factor C gives the right drawings IMG1 and IMG2.
The user generally looks up the building. Parallel projection of the overhead view may cause the sense of height from the projection drawing of the building to differ from the sense of height from the actual look-up view of the building. Parallel projection of the virtual building enlarged only in the height direction by multiplication of the factor C as described above, on the other hand, relieves such feeling of strangeness in sense of height.
The factor C may be set arbitrarily by taking into account the visual effects. As clearly understood from the comparison between the drawings IMG1 and IMG2, the sense of height of the building is also affected by the projection angle Ang. In parallel projection at a plurality of different projection angles Ang, the factor C may be changed according to the projection angle Ang. When the feeling of strangeness is negligible, parallel projection may be performed without multiplication of the factor.
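A sketch of how the factor might enter the projection follows: only the height is multiplied before projection, so the scale in the left-right and depth directions is untouched. The default value of 1.5 is a placeholder, not a value taken from the embodiment:

```python
import math

def project_with_height_factor(x, y, z, proj_angle_deg, factor_c=1.5):
    """Parallel projection after expanding the feature only in the
    height direction by factor_c (> 1). The horizontal coordinates
    x, y are untouched, so the map scale stays unchanged."""
    t = (z * factor_c) * math.tan(math.radians(proj_angle_deg))
    return (x, y + t)  # shift applied along +y for illustration
```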
The parallel projection parameters include the projection angle and the projection direction. The projection angle is the parameter representing the inclination of the projecting direction relative to the vertical direction, as described above.
The feature data are provided with respect to eight directions according to this embodiment, but may be provided with respect to four directions or with respect to sixteen or a greater number of directions. According to the results of the inventors' examination, when feature data are provided with respect to sixteen directions and the projection drawings in the respective directions are sequentially changed over, the resulting display causes the user to feel like looking at the area AR while moving around the area AR. From this standpoint, it may be preferable to provide the feature data with respect to sixteen directions.
According to this embodiment, the same parallel projection parameters are employed for parallel projection in all the levels LV1 to LV3. As illustrated, a specific area (hatched area) on a ground level GL is projected similarly in any of the levels LV1 to LV3. Even when the level is changed according to the scaling of the map display, the area in the current level corresponding to the display area in the previous level can be readily identified. This accordingly enables the smooth display by the relatively simple processing.
When different parallel projection parameters are employed for the respective levels, on the other hand, the low level LV1 and the middle level LV2 have different coordinate systems, i.e., the coordinate system u1, v1 and the coordinate system u2, v2. It is accordingly difficult to identify the area in each level corresponding to the hatched area on the ground level GL. When the display is changed from the low level LV1 to the middle level LV2 in this state, the process is required to specify the area on the ground level GL corresponding to the display area in the low level LV1 and subsequently identify the area in the middle level LV2 corresponding to the specified area.
When such processing load is acceptable, the parallel projection parameters may be changed for each level.
The following describes generation of feature data corresponding to a hatched mesh MP at the center of a five-by-five arrangement of meshes M11 to M55. If only the mesh MP were processed, the upper portion of a feature located in an adjacent mesh, such as a feature B34 in the mesh M34, would be missing from the projection drawing.
The procedure of this embodiment, on the other hand, reads three-dimensional feature data of the adjacent meshes (M22, M23, M24, M32, M34, M42, M43 and M44) adjoining the mesh MP as the processing object and the further adjacent meshes (M11 to M15, M21, M25, M31, M35, M41, M45 and M51 to M55) adjoining those meshes during processing of the mesh MP. The procedure then projects all the meshes M11 to M55 by parallel projection and cuts out a polygon corresponding to the mesh MP to generate feature data. This enables parallel projection of the feature B34 located in the adjacent mesh M34 during processing of the mesh MP, thus obtaining the feature data without the upper portion missing.
The procedure of the embodiment uses the meshes located within the two-mesh range from the mesh MP as the processing object as described above, but the range used for generation of feature data may be set arbitrarily. When the size of each mesh is sufficiently larger than the sizes of features and there is substantially no possibility that a feature located in a next-but-one mesh is projected in the mesh as the processing object, the range of parallel projection may be only the meshes directly adjacent to the mesh as the processing object, i.e., the one-mesh range. When the size of each mesh is relatively small compared with the sizes of features, on the other hand, the range of parallel projection may be the three-mesh range or a wider range.
The range of parallel projection may not be arranged evenly around the mesh MP as the processing object, but may be localized by taking into account the projection direction. For example, during parallel projection in the direction shown by the arrow Vpj34, features can be drawn in the mesh MP only from the meshes on the side of the projection direction, so the range may be limited to that side.
Similarly, during parallel projection in the projection direction shown by an arrow Vp, only the meshes on the side of the projection direction from the object mesh need be included in the range of parallel projection.
On the start of processing, the CPU inputs specification of a mesh as the processing object (step S100). This step corresponds to specification of the mesh MP described above.
The CPU subsequently inputs parallel projection parameters, i.e., projection direction and projection angle (step S101). The parallel projection parameters may be specified by the operator every time feature data is generated. Alternatively default parallel projection parameters may be set in advance in the data generation device 100.
The CPU then reads 3D map data with respect to the object mesh and peripheral meshes in a specified range around the object mesh (step S102). The procedure of this embodiment reads 3D map data with respect to the meshes within the two-mesh range from the object mesh MP, as described above.
The CPU then processes the read 3D feature data by parallel projection using the parallel projection parameters input at step S101 (step S103). This processing gives a projection drawing where the respective features are drawn three-dimensionally by parallel projection. According to this embodiment, the drawing result is temporarily stored as two-dimensional polygonal data in the memory of the data generation device 100. Alternatively the drawing result may be stored as raster data.
On completion of parallel projection, the CPU cuts out an area corresponding to the object mesh from the generated polygonal data (step S104). With respect to a polygon drawn across a plurality of meshes, only the part located in the object mesh is extracted and is set as new polygonal data, as described above.
The CPU then stores the cut-out area as feature data (step S105). The procedure sends the data as well as an instruction for storage into the feature data 211 to the server 200.
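The flow of steps S100 through S105 might be sketched as follows; the data layout (meshes as a dictionary of point features), the 100 m mesh size and the reduction of polygons to single roof points are simplifying assumptions made to keep the example self-contained:

```python
import math

def generate_feature_data(meshes, target, proj_angle_deg=45.0, rings=2):
    """Sketch of steps S100-S105. `meshes` maps (row, col) -> list of
    features [(x, y, height), ...]; `target` is a (row, col) key."""
    shift = math.tan(math.radians(proj_angle_deg))
    projected = []
    # S100-S103: read the object mesh and every mesh within `rings`
    # of it, then parallel-project the roof point of each feature
    for (r, c), features in meshes.items():
        if abs(r - target[0]) <= rings and abs(c - target[1]) <= rings:
            for (x, y, h) in features:
                projected.append((x, y + h * shift))
    # S104: keep only points landing inside the target mesh
    r0, c0 = target
    size = 100.0  # placeholder mesh size in metres
    in_mesh = [(x, y) for (x, y) in projected
               if c0 * size <= x < (c0 + 1) * size
               and r0 * size <= y < (r0 + 1) * size]
    return in_mesh  # S105: store as feature data for the target mesh

meshes = {(3, 3): [(350.0, 350.0, 20.0)], (2, 3): [(360.0, 250.0, 80.0)]}
print(generate_feature_data(meshes, (3, 3)))
```

The second feature in the example lies in an adjacent mesh, yet its projected point lands inside the target mesh; this is exactly why the neighboring meshes must be read before projection.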
A plurality of character records that record character information representing the names of the respective features are stored in the character data. The character records are also provided in the respective levels according to this embodiment. Information LINK on the storage location of the character record indicating the name of the feature is recorded in each record of the feature data. According to this embodiment, one character record is commonly used for feature data with respect to a plurality of directions in each level. The information LINK of the same content is accordingly stored in the feature BL03 in the level LV1. The arrow in the drawing shows mapping of one character record to a plurality of feature records.
Pieces of information, such as the name, the display content, the font, the color, the size, the behind flag, the position and the height, are stored in the character record. The name is the name of a feature corresponding to the character record and may be the name of a polygon representing the feature. When one feature is drawn by a plurality of polygons as described above, the common name allows one character record to be correlated to the plurality of polygons.
The display content is a character string representing the name of a feature. The font, the color and the size are information defining the display mode of the character string. The behind flag is a flag controlling approval or disapproval of character display, and is set corresponding to the directions of the feature data. In the illustrated example, the behind flag is set as “1,1,1,1,0,0,1,1”. This means that the character is to be displayed (setting=1) for the directions 1 to 4 and the directions 7 and 8 and is not to be displayed (setting=0) for the directions 5 and 6. The method of setting the behind flag will be described later.
The position represents the coordinates where the character is displayed and may be equal to the coordinates of the representative point of the corresponding feature, i.e., may be the same value as that of the “position” information of the feature data. The height represents the height where the character is to be displayed. The height may be expressed in a unit of length such as in meters, or may be expressed as the pixel value in display or the number of stories of the feature. Specifying the height information enables the character to be displayed at the higher position than the ground level of the feature and thereby ensures the easy-to-understand display of the relationship between the character and the feature. The height is a common value set for all the directions according to the embodiment, but may be separately set for the respective directions like the behind flag.
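For illustration, the character record and the behind flag lookup might be modeled as follows; the field names, the 1-based direction index and all sample values are assumptions:

```python
from dataclasses import dataclass

@dataclass
class CharacterRecord:
    """Sketch of one character record; field names are illustrative."""
    name: str
    display_content: str
    font: str
    color: str
    size: int
    behind_flags: str  # one flag per direction, e.g. "1,1,1,1,0,0,1,1"
    position: tuple    # (lat, lon) of the display point
    height: float      # metres above ground (other units possible)

def is_displayed(record: CharacterRecord, direction: int) -> bool:
    """direction is 1-based, matching the directions 1 to 8 above."""
    return record.behind_flags.split(",")[direction - 1] == "1"

rec = CharacterRecord("BL01", "XX BUILDING", "gothic", "#000", 12,
                      "1,1,1,1,0,0,1,1", (35.0, 135.0), 30.0)
assert is_displayed(rec, 4) and not is_displayed(rec, 5)
```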
The character data is provided for each level according to this embodiment but may be provided commonly for all the levels. In the latter case, a flag controlling display/non-display of character data may be set for each level. A format similar to that of the behind flag may be employed for this flag.
The lower right projection drawing PIC4 is the parallel projection drawing from the direction 4. As illustrated, the building BL01 is not in the blind spot of the building BL04, so the behind flag BF is set to “1”. The lower center projection drawing PIC5 is the parallel projection drawing from the direction 5. In this state, the building BL01 is in the blind spot of the building BL04. Displaying the name of the building BL01 in this state would confuse the user as to which building the name indicates. The character of the building BL01 is accordingly not to be displayed in this state, so the behind flag BF is set to “0” representing non-display. The lower left projection drawing PIC6 is the parallel projection drawing from the direction 6. In this state, the upper portion of the building BL01 is slightly observed as illustrated. The behind flag may be set to either display or non-display when the building is partly observable. In the illustrated example, only a small portion of the building is observed, so the behind flag BF is set to “0” representing non-display. Alternatively the behind flag BF may be set to “1”, since even a portion of the building BL01 is observable.
The location in the blind spot depends on the height of the building, as well as the planar positional relationship of the buildings. It is accordingly preferable to set the behind flag based on the result of parallel projection.
The behind flag may be set manually by the operator or may be set automatically based on the determination of whether each feature is in the blind spot of another feature in parallel projection in the feature data generating process described above.
In this process, the CPU first inputs specification of the display position, the direction and the range (step S300). The user may specify these parameters through the operation of the keyboard, or the current position obtained by GPS may be used as the display position. The CPU extracts map information corresponding to the specification from the map information obtained in the previous cycle of the map display process and stored in the cell phone 300 (step S301). The map information is the collective designation of various data required for displaying a map, such as feature data, character data and network data.
An example of such extraction is illustrated in the drawing. A hatched area in map information ME divided in meshes represents map information previously stored in the cell phone 300. An area IA represents a range corresponding to the user's specification. The part of the stored map information overlapping with the area IA, i.e., the part other than the meshes ME3 and ME4, is extracted in this illustrated example. The meshes ME3 and ME4 that do not overlap with the area IA may be deleted as unnecessary information or may be left within the allowable memory capacity of the cell phone 300.
When the extracted map information is insufficient to display the map (step S302), the CPU obtains map information corresponding to the insufficient part from the server 200 (step S303). In the illustrated example, meshes ME1 and ME2 are insufficient to display the area IA, so that map information corresponding to these meshes ME1 and ME2 is obtained.
After obtaining the map information, the CPU draws the feature (step S304). According to this embodiment, feature data is two-dimensional polygonal data generated by parallel projection, so that the three-dimensional map can be displayed by drawing a polygon according to the obtained feature data. The general method of drawing a three-dimensional map uses a three-dimensional model and generates a perspective projection drawing by rendering process. While this method has extremely heavy processing load for rendering, the method of this embodiment has the significant advantage of drawing a three-dimensional map by extremely light load.
The CPU then displays the character with the behind flag set to 1 in the map (step S305). Displaying the character may be performed simultaneously with drawing the feature (step S304). The display position of the character in the displayed map may be set by the following procedure.
Since the latitude and the longitude are known for each apex in each of the meshes constituting the feature data, the procedure first specifies a point in the mesh corresponding to the position information (latitude and longitude) included in the character record. The uv coordinate values specified in each mesh may be determined according to the latitude and the longitude included in the character record by interpolating the latitudes and the longitudes of the apexes of the mesh.
The procedure then shifts the display position of the character in the u-axis direction according to the height information. When the height information is specified by the pixel value in display, the specified value is used directly. When the height information is specified in meters or by the number of stories, the specified value may be converted into the pixel value by multiplying by a factor corresponding to the projection angle.
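A sketch of this two-step positioning is given below, under the simplifying assumptions that the mesh corners are axis-aligned in latitude and longitude (so the interpolation reduces to two linear interpolations) and that the height offset has already been converted into uv units as described above; the shift is applied along the u-axis, following the description:

```python
def character_screen_position(lat, lon, height_offset, mesh_corners):
    """Interpolate (lat, lon) into mesh uv coordinates, then shift the
    position along the u-axis by the (already converted) height offset.
    `mesh_corners` lists (lat, lon) at uv (0,0), (1,0), (0,1), (1,1)."""
    (lat00, lon00), (lat10, lon10), (lat01, lon01), _ = mesh_corners
    u = (lon - lon00) / (lon10 - lon00)  # assumes lon varies along u
    v = (lat - lat00) / (lat01 - lat00)  # assumes lat varies along v
    return (u + height_offset, v)        # height shift along u

pos = character_screen_position(
    35.05, 135.05, 0.02,
    [(35.0, 135.0), (35.0, 135.1), (35.1, 135.0), (35.1, 135.1)])
print(pos)  # approximately (0.52, 0.5)
```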
The procedure of this embodiment uses the behind flag to display the character only for the feature that is not in the blind spot. The behind flag is set with respect to each direction, so that display/non-display of the character may be changed according to the specified direction. The general method of rendering the three-dimensional model determines whether a feature is in the blind spot and controls display/non-display of the character in the course of rendering process. This causes extremely heavy processing load for controlling display/non-display of the character. The procedure of this embodiment, on the other hand, has significant advantage of controlling display/non-display of the character by extremely light load.
FIGS. 15a through 15c illustrate an output example of a three-dimensional map of a specific area corresponding to the photograph of FIG. 15a.
FIG. 15b shows an output example of a two-dimensional map. The buildings BL1 and BL2 of FIG. 15a are encircled by dotted lines.
FIG. 15c shows an output example of this embodiment. The buildings BL1 and BL2 are also encircled by the dotted lines in FIG. 15c.
The character strings “XX BUILDING” and “2ND ** BUILDING” are displayed in the illustrated example of FIG. 15c.
The user of the cell phone 300 first specifies the place of departure and the destination in route search (step S210). The place of departure may be the current position obtained by GPS. The destination may be set by any of various methods, such as the name of a feature, the postal address or the coordinate values of latitude and longitude. The cell phone 300 sends the results of specification to the server 200. The server 200 inputs the specification of the place of departure and the destination (step S200) and performs route search using the network data 213.
The cell phone 300 receives the search result (step S211) and performs route guidance by the following procedure. The cell phone 300 first inputs the user's current position and moving direction (step S220). The current position may be identified by GPS. The moving direction may be determined based on a positional change from the previous position to the current position. The cell phone 300 subsequently performs a display area determination process, which determines the map display area based on the current position and the moving direction (steps S221 and S222).
The process of determining the direction of the map is illustrated in the drawing. A rectangle on the center represents an area to be displayed, and the eight directions corresponding to those of the feature data are allocated around it.
As shown for the directions 1 and 8, the angle range allocated to each direction may be greater than 45 degrees, and there may be an overlapped area between the directions. The range shown by the dashed-dotted lines is the angle range of greater than 45 degrees. When such wider angle ranges are allocated to the directions 1 and 8, there is an overlapped area, such as the hatched area HA, between the directions 1 and 8. Such setting enables this overlapped area to be used as a hysteresis area in the process of determining the direction. For example, during a change of the moving direction from the direction 8 to the direction 1, the direction 8 is used even when the moving direction enters the overlapped area HA. During a change of the moving direction from the direction 1 to the direction 8, on the contrary, the direction 1 is used even when the moving direction enters the overlapped area HA. Setting the hysteresis advantageously prevents frequent changes of the displayed map when the moving direction frequently changes near the boundary between the direction 1 and the direction 8.
In the illustrated example, the overlapped area HA is set between the direction 1 and the direction 8. A similar overlapped area may be set between other directions. After determining the direction of the displayed map, the cell phone 300 subsequently determines the display area based on the current position and the determined direction (step S222).
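Before turning to the display area, the direction selection with the hysteresis described above might be sketched as follows; the nominal 45-degree sectors, the 10-degree overlap and the centering of the direction 1 at 0 degrees are placeholders, not the actual angle ranges:

```python
def pick_direction(bearing_deg, current_dir):
    """Pick one of eight map directions with hysteresis (a sketch).

    While the bearing stays within the widened sector of the current
    direction, keep it; otherwise snap to the nearest sector."""
    OVERLAP = 10.0  # degrees of overlap added on each sector edge
    center = (current_dir - 1) * 45.0  # assumed: direction 1 at 0 deg
    diff = (bearing_deg - center + 180.0) % 360.0 - 180.0
    if abs(diff) <= 22.5 + OVERLAP:
        return current_dir              # stay inside the hysteresis band
    return round(bearing_deg / 45.0) % 8 + 1

print(pick_direction(24.0, 1))  # inside the overlap -> stays 1
print(pick_direction(40.0, 1))  # outside the band   -> switches to 2
```

With this sketch, small oscillations of the moving direction near a sector boundary do not flip the displayed map, which is the behavior described for the area HA.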
The right drawing illustrates an example of determining the display area in the route guidance. It is assumed that the user moves from a position POS1 to a position POS2 and further to a position POS3 along a route PS shown by the broken line. At the position POS1, the moving direction DR1 is upward in the drawing, i.e., the direction 5 (refer to the drawing of step S221). The cell phone 300 accordingly uses the feature data of the direction 5 to set an area of XAr in width and YAr in length as a display area AR1. The width and the length of the area may be set manually by the user's specification or may be set automatically according to the user's moving speed. The moving speed may be calculated from a time change of the current position.
When the user moves to the position POS2, the moving direction DR2 slightly changes rightward. The moving direction DR2 is, however, still in the angle range of the direction 5. The cell phone 300 accordingly selects the direction 5 at the position POS2 and determines a display area AR2. As a result, during a move from the position POS1 to the position POS2, although the moving direction slightly changes rightward, the map display for guidance shifts in parallel in the direction 5.
When the user further moves to the position POS3, the moving direction DR3 changes more rightward. The moving direction DR3 is then out of the angle range of the direction 5 and enters the angle range of the direction 6. The cell phone 300 accordingly selects the direction 6 at the position POS3 and determines a display area AR3. The map display is then changed from the map in the direction 5 to the map in the direction 6 along the course from the position POS2 to the position POS3.
The description now goes back to the route guidance process.
In order to adequately display the route on the roads, the procedure of this embodiment determines the display position by parallel projection of the current position and the network data. This is the coordinate transformation process (step S230). The details of the coordinate transformation will be described later.
On completion of the coordinate transformation, the cell phone 300 performs a map display process according to the determined display area (step S300). The details of this process are similar to the map display process described above.
The cell phone 300 repeats the processing of and after step S220 to continue the route guidance until the user arrives at the destination (step S311).
According to this embodiment, the current position is given by three-dimensional position coordinates, e.g., a point P3D (X,Y,Z). This coordinate point corresponds to a two-dimensional position Cpg (latitude, longitude) and corresponds to a point P2D (X,Y) in the mesh M2D on the plane A2D where a two-dimensional map is drawn. Parallel projection projects the point P3D to a point Pp2 in the mesh Mp2 on the plane Ap. On the assumption that two-dimensional elements (X,Y) of the three-dimensional coordinates at the point P3D are coordinate values of parallel projection, the point P3D is projected to a point Pp1 in a mesh Mp1 that is different from the proper mesh Mp2 on the plane Ap. There is accordingly an error Vc from the proper point Pp2. The procedure of this embodiment performs coordinate transformation corresponding to the shift of the point P3D in the plane Ap by the error Vc, so as to enable adequate parallel projection of the point P3D.
The correction vector Vc is determinable by an affine transformation matrix as the combination of rotation and parallel translation. The process first obtains a transformation matrix corresponding to a vector Vc0 for parallel translation in the −X-direction with keeping the height of the point P3D. Since the magnitude of the vector Vc0 is given by the product of the height z of the point P3D and tan(Ap) where Ap represents a projection angle, the vector Vc0(Vc0x,Vc0y,Vc0z) is expressed as:
Vc0x=−z×tan(Ap);
Vc0y=0; and
Vc0z=0.
The correction vector Vc is obtained by rotating the vector Vc0 around the z-axis by a projection direction (−Ay). The correction vector Vc(Vcx,Vcy,Vcz) is accordingly expressed as:
Vcx=−z×tan(Ap)×cos(Ay);
Vcy=z×tan(Ap)×sin(Ay); and
Vcz=0.
Application of this correction vector Vc to the vertically projected point Pp1 of the point P3D determines the point Pp2. The correction vector Vc is substantially equivalent to a two-dimensional vector (Vcx,Vcy) and thereby enables correction in the projection plane of parallel projection.
The correction vector Vc is given on the assumption that the y-axis, the x-axis and the z-axis respectively represent the northerly direction, the easterly direction and the height direction, and that the projection direction Ay is expressed as the angle measured from the northerly direction (0 degrees) through the easterly, southerly and westerly directions.
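These formulas translate directly into code; the following fragment implements the correction vector exactly as derived above (x easterly, y northerly, z height; projection direction measured from north at 0 degrees):

```python
import math

def correction_vector(z, proj_angle_deg, proj_dir_deg):
    """Correction vector Vc from the formulas above."""
    ap = math.radians(proj_angle_deg)
    ay = math.radians(proj_dir_deg)
    return (-z * math.tan(ap) * math.cos(ay),
            z * math.tan(ap) * math.sin(ay))

def project_guidance_point(x, y, z, proj_angle_deg, proj_dir_deg):
    """Shift the vertically projected point Pp1 by Vc to obtain Pp2."""
    vcx, vcy = correction_vector(z, proj_angle_deg, proj_dir_deg)
    return (x + vcx, y + vcy)

# A current position 20 m above datum, 45-degree projection from north:
print(project_guidance_point(0.0, 0.0, 20.0, 45.0, 0.0))
# -> approximately (-20.0, 0.0)
```

Applying the same transformation to both the network data and the current position keeps the route guidance information on the roads, as described above.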
The cell phone 300 subsequently inputs the current position and the network data in the display area (step S303) and performs coordinate transformation of the current position (step S304). The cell phone 300 also performs coordinate transformation of the network data with respect to the whole network (steps S305 and S306). The coordinate transformation of the network data may be performed prior to the coordinate transformation of the current position or may be performed simultaneously. On completion of the coordinate transformation of the current position and the network data, the cell phone 300 terminates the coordinate transformation process. The map is displayed by using this transformation result (step S310 of the route guidance process).
FIGS. 21a through 21c illustrate an example of route guidance. The display for guidance sequentially changes with movement along the route from FIG. 21a to FIG. 21c.
FIG. 21b shows a display after the right turn. A map is drawn by using feature data in a projection direction different from that of FIG. 21a.
FIG. 21c shows a display after the right turn. A map is drawn by using feature data in a projection direction further different from that of FIG. 21b.
The foregoing describes the embodiment of the invention. The three-dimensional map display system may not have all the functions of the embodiment described above but may implement only part of the functions. The three-dimensional map display system may have additional functions.
The invention is not limited to the above embodiment but various modifications and variations may be made to the embodiment without departing from the scope of the invention. For example, the hardware configuration of the embodiment may be replaced with the software configuration and vice versa.
The invention is applicable to draw a three-dimensional map that expresses features three-dimensionally.
Foreign Application Priority Data: Japanese Patent Application No. 2010-053885, filed March 2010 (JP, national).
This application is a Continuation of International Patent Application No. PCT/JP2010/058151, filed on May 14, 2010, which claims priority to Japanese Patent Application No. 2010-053885, filed on Mar. 11, 2010, each of which is hereby incorporated by reference.
Related U.S. Application Data: parent application PCT/JP2010/058151, filed May 2010; child application No. 13/609,199.