The present disclosure relates to map rendering systems, such as electronic map display systems, and more specifically to a map rendering system that renders a set of indicators for out-of-view map features.
While digital maps may be commonly implemented in a wide variety of devices (e.g., mobile phones, car navigation systems, hand-held global positioning system (GPS) units, computers, and many websites), methods of displaying digital maps remain challenging due to limitations in display device screen sizes and resolutions. Unlike paper maps in which a user may unfold the entirety of a map and view any portion of the map at leisure, a map rendering system may only display a small portion of a map surface at a time. Moreover, the size of the portion that may be displayed may be limited by a desired viewing resolution or viewing magnification of the map image being rendered.
Because only a portion of a map surface may be viewed at one time, some map features may not be contained in a current viewing window or viewport of a map surface at a user-selected level of detail or magnification. However, map features that are external to a current viewing window, also referred to herein as out-of-view ("OOV") map features, may be important enough that a user may desire information on them.
A computer-implemented method for rendering a map on a display device includes determining a first viewing window of a map surface, the first viewing window defined by a set of viewing parameters including a position, a set of viewing boundaries, and a magnification. The method displays a first area of the map surface in the first viewing window based on the set of viewing parameters. The method receives a user context associated with at least one of a user selected map feature or a search request for a set of map features. The method generates an out-of-view indicator of an out-of-view map feature based on the received user context, wherein the out-of-view map feature is external to the displayed first area of the map surface. The method displays the out-of-view indicator within the first viewing window. The displayed out-of-view indicator includes a directional indicator and includes a second viewing window of a second area of the map surface that contains the out-of-view map feature, wherein the first area and the second area are not contiguous.
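The method summarized above may be sketched in simplified form. The following is an illustrative sketch only, with hypothetical names and a flat bounding-box model of the first area; it is not an implementation from the disclosure:

```python
def render_with_oov_indicators(first_area, features_of_interest):
    """first_area: (xmin, ymin, xmax, ymax) of the displayed map surface.
    Returns an OOV indicator (feature name plus a compass-style directional
    indicator) for each feature of interest outside the first area."""
    xmin, ymin, xmax, ymax = first_area
    indicators = []
    for name, (x, y) in features_of_interest:
        if xmin <= x <= xmax and ymin <= y <= ymax:
            continue  # feature is in view: no indicator needed
        dx = 0 if xmin <= x <= xmax else (1 if x > xmax else -1)
        dy = 0 if ymin <= y <= ymax else (1 if y > ymax else -1)
        direction = {(1, 0): "E", (-1, 0): "W", (0, 1): "N", (0, -1): "S",
                     (1, 1): "NE", (1, -1): "SE", (-1, 1): "NW", (-1, -1): "SW"}[(dx, dy)]
        indicators.append((name, direction))
    return indicators
```

A call with one in-view and two out-of-view features would yield indicators only for the latter two, each carrying a direction relative to the first area.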
The current application generally relates to techniques for providing information on a set of map features that is not contained within a viewable portion of a map surface that is currently displayed within a viewing window. Such an external map feature is referred to herein as an out-of-view ("OOV") map feature or out-of-viewport map feature. When an out-of-view map feature is determined to be important to a user or is determined to be of high priority, an out-of-view (OOV) indicator may be generated and displayed within the primary viewing window to display information to a user about the OOV map feature.
Map features that are external to a current viewing window may be important for many reasons. In one situation, a first map feature may have a relation or association to a second map feature (e.g., an OOV map feature) that is important to a user. For example, a current viewing window may represent an area about an origin of a user's route, where an out-of-view map feature may be a possible destination of interest. The user may be interested in plotting a route from the origin contained in the current viewing window to an OOV destination. In this situation, the user may require details of a map feature in the current viewing window as well as some general information about a second location/area that is not contained in the map area of a primary or current viewing window. Because the route may not fit into a current viewing window at a particular magnification, an OOV indicator may be displayed (e.g., along a displayed portion of the route) to indicate a continuation of the route off screen.
In another situation, a user may indicate an interest in an in-view map feature (e.g., via a mouse click on, or a mouse hover over, the map feature). The indication of interest may correspond to a possibility that the user may require information on an out-of-view map feature that is related or otherwise relevant to the in-view map feature. In this case, an OOV indicator may provide some high level detail about an OOV map feature that is related to the in-view map feature. In yet another situation, the OOV map feature may be important to a user context involving a search, where the search results may indicate points of interest that are not in the user's current viewing window. The above examples illustrate only some of the many situations in which an OOV indicator of an OOV map feature may be important or useful to a user.
Referring now to
The map database 12 may store any desired types or kinds of map data including raster image map data and vector image map data. However, the image rendering systems described herein are best suited for use with vector image data which defines or includes a series of vertices or vertex data points for each of numerous sets of image objects, elements or primitives within an image to be displayed. Generally speaking, each of the image objects defined by the vector data will have a plurality of vertices associated therewith and these vertices will be used to display a map related image object to a user via one or more of the client devices 16-22. As will also be understood, each of the client devices 16-22 includes an image rendering engine having one or more processors 30, one or more memories 32, a display device 34, and in many cases a rasterizer or graphics card 36 which are generally programmed and interconnected in known manners to implement or to render graphics (images) on the associated display device 34. The display device 34 for any particular client device 16-22 may be any type of electronic display device such as a liquid crystal display (LCD), a light emitting diode (LED) display, a plasma display, a cathode ray tube (CRT) display, or any other type of known or suitable electronic display.
Generally speaking, the map-related imaging system 10 of
Referring now to
During operation, the map logic of the map application 48 executes on the processor 30 to determine the particular image data needed for display to a user via the display device 34 using, for example, user input, GPS signals, prestored logic or programming, etc. The display or map logic of the application 48 interacts with the map database 12, using the communications routine 43, by communicating with the server 14 through the network interface 42 to obtain map data, preferably in the form of vector data or compressed vector data from the map database 12. This vector data is returned via the network interface 42 and may be decompressed and stored in the data memory 49 by the routine 43. In particular, the data downloaded from the map database 12 may be a compact, structured, or otherwise optimized version of the ultimate vector data to be used, and the map application 48 may operate to transform the downloaded vector data into specific vertex data points using the processor 30a. In one embodiment, the image data sent from the server 14 includes vector data generally defining data for each of a set of vertices associated with a number of different image elements or image objects to be displayed on the screen 34 and possibly one or more lookup tables which will be described in more detail below. If desired, the lookup tables may be sent in, or may be decoded to be in, or may be generated by the map application 48 to be in the form of vector texture maps which are known types of data files typically defining a particular texture or color field (pixel values) to be displayed as part of an image created using vector graphics. More particularly, the vector data for each image element or image object may include multiple vertices associated with one or more triangles making up the particular element or object of an image. Each such triangle includes three vertices (defined by vertex data points) and each vertex data point has vertex data associated therewith. 
In one embodiment, each vertex data point includes vertex location data defining a two-dimensional or a three-dimensional position or location of the vertex in a reference or virtual space, as well as an attribute reference. Each vertex data point may additionally include other information, such as an object type identifier that identifies the type of image object with which the vertex data point is associated. The attribute reference, referred to herein as a style reference or as a feature reference, references or points to a location or a set of locations in one or more of the lookup tables downloaded and stored in the data memory 49.
Style parameters may include a fill color (e.g., for area objects), an outline color, an outline width, an outline dashing pattern and an indication of whether to use rounded end caps (e.g., for road objects), an interior color, an interior width, an interior dashing pattern, and interior rounded end caps (e.g., for road objects), a text color and a text outline color (e.g., for text objects), an arrow color, an arrow width, an arrow dashing pattern (e.g., for arrow objects), a text box fill color and a set of text box outline properties (e.g., for text box objects) to name but a few. Of course, different ones of the vertex style attributes provided may be applicable or relevant to only a subset of image objects and thus the vertex style data points associated with a particular type of image object may only refer to a subset of the vertex attributes listed for each style.
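The relationship between a vertex data point and a style lookup table may be illustrated with a minimal sketch. The table contents, field names, and `resolve_style` helper below are hypothetical and shown only to clarify how a style reference indirects into shared style parameters:

```python
# Hypothetical style lookup table keyed by style reference; each entry
# carries only the style parameters relevant to its object type.
style_table = {
    7: {"fill_color": "#c8e6c9", "outline_color": "#2e7d32", "outline_width": 2},
    12: {"text_color": "#000000", "text_outline_color": "#ffffff"},
}

# A vertex data point: location, object type identifier, and style reference.
vertex = {"x": 10.5, "y": -3.2, "object_type": "area", "style_ref": 7}

def resolve_style(vertex, table):
    """Follow the vertex's style reference into the lookup table."""
    return table[vertex["style_ref"]]
```

Because many vertices may share one style reference, the style parameters are stored once in the table rather than repeated per vertex.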
The block 402 may determine a viewing window by at least a position, a size, and a zoom level of the viewing window. In three-dimensional map renderings, the viewing window may be further defined by a direction of view as well as a tilt angle (or angle of incidence of the viewing window with a map surface plane). The position of the viewing window may be a position of the viewing window with respect to a portion of a map surface (to be displayed). While any portion of the viewing window may relate to a position on a map surface, the center position of the viewing window may be used as a reference point herein when describing the position of the viewing window. For example, a center position of the viewing window may relate to or correspond with a point or area of a map surface that is or will be displayed on the viewing window. In this case, the center of the viewing window may correspond to a center of the displayed first map area.
The size of the viewing window may be defined by a physical set of dimensions of a display device or display screen. In some embodiments, a viewing window may only represent a portion of a total display screen. For example, a mapping application or other application (e.g., a computer operating system) may allocate only a portion of a total display screen for use as a viewing window of a digital map. Generally, a viewing window size may be denoted using pixel-based dimensions. For example, in a rectangular viewing window, the size of the viewing window may be defined by a length and width measurement such as 640×480 pixels. Instead of pixel lengths, the viewing window may also be defined in a common distance metric such as millimeters or inches. The viewing window size may also correspond to a set of boundaries of a displayed portion of a map surface. In a rectangular viewing window embodiment, the size of the viewing window may be defined by two 640 pixel borders and two 480 pixel borders, wherein the pixel lengths of the borders may be equivalent to distances of the displayed first map area depending on a scaling factor or magnification (to be discussed further below). Accordingly, the viewing window size may be described by the portion of a map surface that is displayed and defined by the set of viewing window boundaries.
A zoom level corresponds, in part, to a magnification which is used, in part, to define a displayable area of a map surface within a viewing window. A magnification of the viewing window may correspond with a scale at which the map surface is rendered or drawn. For example, where magnification or scale is expressed as a ratio such as 1:1000, one unit of measurement on the viewing window (e.g., a pixel length or an inch) may correspond exactly or approximately to 1,000 of the same units of the map surface. In particular, when the viewing window size is measured in inches, a 1:1000 scale may translate one inch of the viewing window to 1,000 inches of the mapped area. Some computerized maps allow users to zoom in or zoom out of a map surface, where a zoom level generally corresponds to a magnification of the viewing window that displays the map surface. Unlike a paper map that displays all map surface data in one fixed rendering, computer mapping applications may only display certain map features at a certain zoom level or magnification while excluding other map features. In these computer mapping applications, increasing a zoom level of a viewing window may not only enlarge features already displayed on a map, but may also cause the mapping application to draw additional features of the map. The number of map features displayed for a given magnification may be referred to herein as a density of map features. The density of map features and the magnification may be specific to a zoom level.
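The pairing of scale and feature density per zoom level may be sketched as a small table. The zoom numbers, scale denominators, and feature categories below are hypothetical placeholders chosen only to illustrate that density grows with zoom:

```python
# Hypothetical zoom-level table: each zoom level fixes a scale (1:N)
# and the kinds of map features drawn at that level.
ZOOM_LEVELS = {
    4:  {"scale": 1_000_000, "features": ["country", "ocean"]},
    10: {"scale": 50_000,    "features": ["country", "ocean", "city", "highway"]},
    16: {"scale": 1_000,     "features": ["country", "ocean", "city", "highway",
                                          "street", "building"]},
}

def display_to_map_distance(display_units, zoom):
    """At scale 1:N, one display unit covers N of the same units of map surface."""
    return display_units * ZOOM_LEVELS[zoom]["scale"]
```

At zoom 16 (scale 1:1000), two display units cover 2,000 map units, and more feature categories are drawn than at zoom 4.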
While a map database may contain an enormous amount of map data for use in rendering a digital map, only a small fraction of the map data may be useful to a digital map user for any given mapping session. In particular, viewing of a digital map often requires a certain level of map detail, which may be based on a density of map features shown as well as a magnification. As discussed above, the density of map features and the magnification or scale of a map may be defined by a zoom level. Accordingly, a map user may need to view a map surface at a certain zoom level for the display to be meaningful and useful. More particularly, while the zoom level may be adjusted so that any size area may be displayed to a user within a device screen, only certain zoom levels may be useful to a user for a particular user task or context. For example, at a sufficiently low zoom level, the entire United States may fit on any display device screen, but the details of a city like Chicago at such a low zoom level may be insufficient for a map user. Thus, the need to view a map at a particular zoom level may limit the amount of map area that can be displayed at any one time. Furthermore, a viewing window is generally limited in size by the physical limitations of the display device rendering the viewing window. The result of these limitations is that a given viewing window can only show a fraction of a meaningful map area at any one time.
By setting the appropriate viewing window position, viewing boundaries (or size), and zoom level, the viewing window may define a displayable portion of a map surface. In particular, the position of the viewing window may be centered about a center of the first map area, the size of the viewing window may define a set of boundaries of the displayed first map area, and the size of the first map area may be determined by the zoom level or magnification at which the displayed first map area is rendered. In an embodiment, the size of the viewing window may be fixed (e.g., limited by the physical size of a display device screen) while a magnification and position of the viewing window may be adjusted by a user.
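The way position, size, and magnification jointly define the displayed first map area may be sketched as follows. This is an illustrative computation with hypothetical names, using a flat coordinate model and magnification expressed as map units per pixel:

```python
def displayed_area(center, size_px, units_per_pixel):
    """Map-surface bounding box defined by the viewing window's position
    (center), boundaries (pixel size), and magnification (units per pixel)."""
    cx, cy = center
    w_px, h_px = size_px
    half_w = w_px * units_per_pixel / 2
    half_h = h_px * units_per_pixel / 2
    return (cx - half_w, cy - half_h, cx + half_w, cy + half_h)
```

For example, a 640x480 pixel window centered at the origin at 0.5 map units per pixel displays the area from (-160, -120) to (160, 120); changing only the magnification changes the size of the displayed area while the center stays fixed.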
In an embodiment, the block 402 may determine the viewing window by receiving an input from a user of a computer device. For example, the user may input a particular longitude, latitude, and altitude, as well as a zoom level. The size may be fixed by the display device or a program of the display device. In some embodiments, the determination of the viewing window may be made based on a pre-determined set of parameters for an initial rendering of a map location (e.g., a default location or area) or may be based on pre-stored settings that are in turn based on user preferences or a user profile. In another example, the determination may be made in response to a user input that initiates, for example, a pan action (e.g., a selection of an arrow indicating a pan in a particular direction, a swipe, etc.), a selection of a map feature or map point (e.g., via a touch screen press, a mouse click, etc.), etc. The block 404 may display a first area of the map surface in response to the received user input.
The block 406 may receive a user context that may be used in the block 408 to generate an OOV indicator. Generally, the user context may include a selection of an in-view map feature or a search for a map feature. A map feature may include any element of a map surface such as a road, a point location, an area, a building, etc. The block 406 may receive a selection of an in-view map feature in a number of manners known in the art. For example, a user may use a pointer device such as a mouse to direct a displayed pointer over the in-view map feature and click or select the in-view map feature. In an embodiment, the user may select a map feature by using the pointer device to hover a pointer over a map feature (without necessarily clicking on the map feature).
The user context may also include a search for a map feature that may include a location, an area, a road, or other map feature. The user context may include a result of a search or may include parameters that may be used to initiate a search. In a situation where the user context includes the parameters for a search, the block 406 may be configured to initiate the search (e.g., execute a function to search) based on the search parameters to generate a search result(s). The search result(s) for a map feature may be a single map feature or a plurality of map features. If the search is for a type of map feature, such as a search for restaurants, a plurality of map features may result.
The user context may include a search that is performed after rendering or displaying an initial viewing window (e.g., after the block 404 is initiated). The search parameters may be based on the information displayed in the initial viewing window. For example, after viewing the initial viewing window, a user may initiate a search for a map feature(s) that is prompted by what the user sees in the initial viewing window. In an embodiment, a search may be performed before the initial viewing window is rendered (e.g., before the block 404 is initiated). This may be the case in which a map session begins with a user initiated search before any viewing window is displayed.
The block 408 may be configured to generate an OOV indicator based on the user context. Generally, an OOV indicator may be used to address some of the problems of trying to fit relevant map features into a limited viewing window by providing high level details (e.g., a label and a direction) of an OOV map feature. The OOV indicator may be useful in showing relationships between one or more features of a map. The OOV indicator may also provide functionality for retrieving additional information on the OOV map feature (to be discussed further below).
The block 410 may display the out-of-view indicator within a viewing window.
The map within a map OOV indicator 704 of
In an embodiment, the density of map feature data for the second viewing window may be adjusted. As discussed above, the density of map feature data may also correspond with zoom level, where certain zoom levels may provide higher density map feature data for a given magnification. In an embodiment, the block 408 may generate the second viewing window with only a portion of a total amount of map feature data available for the zoom level. The block 408 may remove or add certain types of features based on a user or application setting. The modification of feature data for the second viewing window may be useful in reducing clutter and in improving the readability of the map of the second viewing window.
In an embodiment, the block 410 may generate the second viewing window 710 to include or not include certain map features that are in a vicinity of the OOV map feature 708. For example, the map features may include landmarks that are helpful in identifying the location of the OOV map feature 708. Some map features normally included with a map but do not provide useful information may be removed from the second map area 706 to reduce clutter and improve the visual aesthetic of the second viewing window 710.
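The decluttering of the second viewing window may be sketched as a simple type-based filter. The feature categories below are hypothetical examples; the disclosure leaves the exact categories to a user or application setting:

```python
# Hypothetical categories kept near the OOV map feature: landmarks and
# other features that help identify its location; everything else is
# treated as clutter and dropped from the second map area.
KEEP_TYPES = {"landmark", "major_road", "water"}

def declutter(features, keep_types=KEEP_TYPES):
    """features: list of (name, type) pairs in the vicinity of the
    OOV map feature; returns only the types worth rendering."""
    return [f for f in features if f[1] in keep_types]
```

A user or application setting could swap in a different `keep_types` set to raise or lower the density of the second viewing window.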
In an embodiment, the OOV indicator may be rendered along a vector originating from the center of the primary viewing window and directed toward the OOV map feature.
After an initial rendering of an OOV indicator, a user may adjust the viewing window to change a position (i.e., panning) or a zoom level of the viewing window. In an embodiment, the block 408 may regenerate the OOV indicator and the block 410 may re-render the OOV indicator based on the change in the viewing window. For example, panning the viewing window may cause the block 410 to modify the placement and/or rotation of the OOV indicator where the OOV indicator is positioned based on the vector discussed above.
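The vector-based placement, including its update after panning, may be sketched as follows. This is a hypothetical geometric sketch (flat coordinates, window half-extents in map units); the function and parameter names are not from the disclosure:

```python
import math

def indicator_placement(center, feature, half_w, half_h):
    """Place the OOV indicator where the center-to-feature vector crosses
    the viewing-window boundary; return the position on the boundary and
    the heading toward the OOV map feature in degrees."""
    dx, dy = feature[0] - center[0], feature[1] - center[1]
    # Scale the vector so its tip just touches the nearest window edge.
    t = min(half_w / abs(dx) if dx else math.inf,
            half_h / abs(dy) if dy else math.inf)
    pos = (center[0] + dx * t, center[1] + dy * t)
    heading = math.degrees(math.atan2(dy, dx))
    return pos, heading
```

Panning the viewing window changes `center`, so re-running the same computation repositions and, if needed, re-rotates the indicator, matching the regenerate-and-re-render behavior of the blocks 408 and 410 described above.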
In an embodiment, the mapping application may partition the primary viewing window so that a portion of the primary viewing window is designated for rendering one or more OOV indicators. This designated OOV section may be used in lieu of or in addition to placing an OOV indicator along the vector from the viewing window position (e.g., center) to the OOV map feature as described above. In an embodiment, the OOV indicator section may be used to display any and all OOV indicators for the viewing window.
The block 410 may display the out-of-view indicator within the first viewing window based on a type of OOV indicator. In an embodiment, the block may be configured to position the OOV indicator along the vector from the viewing window (center) to the OOV map feature or in the designated OOV indicator area based on the priority of directional information. For example, in situations in which direction is important, the vector placement may be used so that a user may obtain a better sense of direction to the OOV map feature. In situations in which the existence or the identification of an OOV map feature is more important than direction, the designated area may be used.
For a given viewing window (as defined by at least viewing window position, size, and zoom level), certain map features may only be partially displayed in a current viewing window. In this situation, a user may desire to view additional information about the partially displayed map features; however, a highly detailed OOV indicator may be overkill. The block 410 may display or render a simplified OOV indicator to provide some details of the partial OOV map feature. While the user may pan the viewing window (i.e., adjust the position of the viewing window) to display more or all of a partially displayed map feature, a less complicated OOV indicator may be used to show cursory information of the partially shown OOV map feature. The cursory information on the partially out-of-view map feature may be enough for a user to forego the need to pan towards the out-of-view map feature, thereby saving time and effort.
A particular situation in which partially out-of-view map features may need indicators is when a mapping application plots a route that does not entirely fit within a current viewing window (at a particular zoom level).
The user context received in the block 406 may be used to determine which OOV map features to generate OOV indicators for. As discussed above, the block 408 may generate the OOV indicator based on a selection of a map feature or a search for one or more map features. When the user context includes a selection of an in-view map feature, the block 408 may determine associations of the in-view map feature with other map features that may be important to the user. In an embodiment, the out-of-view map feature may be a map feature that corresponds with, is related to, or is otherwise associated with a selected feature of the first map area.
Generally, when the received user context is a selection of an in-view map feature, the block 406 may perform a search for map features that are associated with the selected map feature. The block 406 may use the data diagram of
The user context received at the block 406 may be a search for a set of map features, such as a category of map features. If the result of the search is a single map feature, the block 408 may generate an OOV indicator of the single map feature and the block 410 may display the OOV indicator. If the result of the search is a plurality of map features, the block 408 may determine which one or more of the search results to generate an OOV indicator of. The block 410 may be configured to only display a single OOV indicator. In this case, the block 408 may determine the single OOV map feature from the plurality of search results based on a priority of the resulting map features. As discussed above, a search result may be generated by a function of the mapping application and in some embodiments, the block 406 may initiate a search based on received search parameters. The block 406 may order or rank the search results by relevancy to the search parameters (e.g., relevancy to a set of search terms comprising the search parameters) and the block 408 may generate an OOV indicator of the highest ranked (highest relevancy) result of the search results.
In an embodiment, the block 408 may be configured to generate a plurality of OOV indicators. In this embodiment, the block 408 may still select map features based on a priority of the search results. The block 408 may generate OOV indicators of the top ranked map features until a threshold is reached. For example, where the block 408 may be configured to generate no more than three OOV indicators, the block 408 may generate OOV indicators of the top three ranked search results. In an embodiment, the number of top ranked results used to generate OOV indicators may be limited by the size of the designated area for the OOV indicators. Alternatively, the block 408 may be configured to generate OOV indicators if the priority of the search result is above a threshold priority. The block 408 may adjust the threshold priority to generate more or fewer indicators based on a user setting of the map application or based on a monitored or determined current processor capacity or current bandwidth of a connection to a map server. For example, the block 408 may produce more OOV indicators when the current processor capacity or current bandwidth of a connection to a map server is above a threshold. When the current processor capacity or current bandwidth of a connection to a map server is below a threshold, the block 408 may produce fewer OOV indicators or none at all.
The block 406 may rank or prioritize the search results in a number of manners. As discussed above, a user context search may provide a default ranking of the results based on relevancy to a set of search parameters. Alternatively or in conjunction with relevancy to the search terms, the search results may be prioritized based on user profile data. User profile data may include a set of parameters that identify user preferences during a map session. The user profile data may include at least a user location or a set of locations. The user profile may also indicate an area of interest to the user. In an embodiment, the search results may be ranked based on a proximity of a search result to one or more of the user locations or areas included in the user profile. The search results that are closer to one or more of these locations may be ranked higher than search results further from the locations. The search results may be ranked by other data as well.
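The ranking and top-N selection described in the two paragraphs above may be combined in a short sketch. The tuple layout, tie-breaking rule, and default limit below are illustrative assumptions, not details fixed by the disclosure:

```python
import math

def rank_results(results, user_locations, max_indicators=3):
    """results: list of (name, relevancy, (x, y)) search results.
    Rank by relevancy, breaking ties toward results nearer a location
    from the user profile, and keep the top-N for OOV indicator
    generation."""
    def nearest_distance(pos):
        return min(math.dist(pos, loc) for loc in user_locations)
    ranked = sorted(results, key=lambda r: (-r[1], nearest_distance(r[2])))
    return ranked[:max_indicators]
```

Other profile data (e.g., areas of interest) could be folded into the sort key in the same way, replacing or weighting the proximity term.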
The block 408 may generate different types of OOV indicators based on the user context. In an embodiment, the block 408 may generate an OOV indicator based on any of the parameters of the data diagram of
Generally, the amount of data required to generate an OOV indicator may be proportional to the complexity or level of detail of the type of OOV indicator being generated. For example, the map within a map OOV indicator of
In an embodiment, the block 408 may index the types of OOV indicators by the amount of data required to generate the OOV indicators, where more complicated and more detailed OOV indicators (such as the map within a map OOV indicator of
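The indexing of OOV indicator types by required data, and the selection of a type against an available budget, may be sketched as follows. The type names and cost units are hypothetical; a real budget might be derived from measured bandwidth or processor capacity:

```python
# Hypothetical index of OOV indicator types ordered by the amount of
# data required to generate them (arbitrary "cost" units).
INDICATOR_TYPES = [
    ("arrow", 1),            # directional indicator only
    ("labeled_arrow", 5),    # direction plus a feature label
    ("map_within_map", 50),  # second viewing window of the OOV area
]

def choose_indicator_type(available_budget):
    """Pick the richest indicator type whose data cost fits the budget;
    return None when even the simplest indicator cannot be afforded."""
    affordable = [t for t in INDICATOR_TYPES if t[1] <= available_budget]
    return affordable[-1][0] if affordable else None
```

Under this scheme, a constrained connection falls back to the simple directional arrow, while ample bandwidth allows the map within a map indicator.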
In an embodiment, the OOV indicator may be configured to be selectable or capable of being activated.
In an embodiment, a block 1216 may determine whether the activation of the OOV indicator requires a regeneration or redrawing of the OOV indicator. For example, the block 1216 may direct the process 1200 back to the block 1208 to re-generate the OOV indicator to provide more or less data. In an embodiment, the block 1208 may generate the OOV indicator as an operational item that, when activated or selected, can provide additional options, such as a menu or action buttons, and additional information. For example, in an embodiment, clicking the OOV indicator (block 1212) may bring up additional menu screens (via the blocks 1214, 1216, 1208, and 1210) for generating and displaying additional information on the OOV map feature. In an embodiment, the type of OOV indicator may be modified when the OOV indicator is clicked or selected (the block 1212), wherein each selection prompts the generation and display of a more complex and more information-rich OOV indicator type.
Generally, information on an out-of-view map feature may be obtained by panning (or repositioning) the viewing window about or towards the out-of-view map feature without using OOV indicators. However, panning may not provide any of the benefits of an OOV indicator as described herein. First, panning a viewing window requires a map user to be aware of the existence of the out-of-view map feature(s). Second, the user may need to apply additional work to re-position the viewing window to bring the out-of-view map features into view. Without a directional indicator, this may be difficult, even if the user knows of the existence of an OOV map feature of interest. Third, panning the viewing window may not avoid the problem that bringing map features into view may come at the expense of losing sight of previous map features that are also important to the user or to a user context. Thus, panning requires additional work and, occasionally, the need to memorize information in a current viewing window before the viewing window is panned and the previous information is no longer in view.
The OOV indicators described herein provide a user additional contextual information on OOV map features that are not shown in a current viewing window as defined by a given set of viewing window parameters (i.e., at least position, viewing boundaries, and zoom level). The out-of-view indicator alerts the user to the existence of an out-of-view map feature that may be relevant or important to a user context. The out-of-view indicator may indicate contextual information that the user may need to make decisions during an information search or a navigation task. For example, once the user is informed of the existence of an out-of-view map feature, the user may decide to pan or adjust the current viewing window towards the out-of-view map feature. As discussed above, the OOV indicators may provide functional shortcuts to the OOV map feature (e.g., auto panning to the OOV map feature). Alternatively, the out-of-view indicator may provide the user enough information about the out-of-view map feature such that the user may forego additional map application usage (including the need to pan towards the out-of-view indicator). Thus, being made aware of such out-of-view map features via the described OOV indicators may potentially add greater context for users of mapping applications, thereby allowing them to fully understand the environment that they are currently viewing through the mapping application and be more efficient in their information search.
Any suitable subset of the blocks may be implemented in any suitable order by a number of different devices (e.g., client or server) and remain consistent with the method and system described herein. Moreover, additional determination blocks may be added to refine the filtering of style parameters subject to interpolation processing.
Throughout this specification, plural instances may implement components, operations, or structures described as a single instance. Although individual operations of one or more methods are illustrated and described as separate operations, one or more of the individual operations may be performed concurrently, and nothing requires that the operations be performed in the order illustrated. Structures and functionality presented as separate components in example configurations may be implemented as a combined structure or component. Similarly, structures and functionality presented as a single component may be implemented as separate components. These and other variations, modifications, additions, and improvements fall within the scope of the subject matter herein.
For example, the network 25 may include but is not limited to any combination of a LAN, a MAN, a WAN, a mobile, a wired or wireless network, a private network, or a virtual private network. Moreover, while only four client devices are illustrated in
Additionally, certain embodiments are described herein as including logic or a number of components, modules, or mechanisms. Modules may constitute either software modules (e.g., code embodied on a machine-readable medium or in a transmission signal) or hardware modules. A hardware module is a tangible unit capable of performing certain operations and may be configured or arranged in a certain manner. In example embodiments, one or more computer systems (e.g., a standalone, client or server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.
In various embodiments, a hardware module may be implemented mechanically or electronically. For example, a hardware module may comprise dedicated circuitry or logic that is permanently configured (e.g., as a special-purpose processor, such as a field programmable gate array (FPGA) or an application-specific integrated circuit (ASIC)) to perform certain operations. A hardware module may also comprise programmable logic or circuitry (e.g., as encompassed within a general-purpose processor or other programmable processor) that is temporarily configured by software to perform certain operations. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.
Accordingly, the term “hardware module” should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein. Considering embodiments in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time. For example, where the hardware modules comprise a general-purpose processor configured using software, the general-purpose processor may be configured as respective different hardware modules at different times. Software may accordingly configure a processor, for example, to constitute a particular hardware module at one instance of time and to constitute a different hardware module at a different instance of time.
Hardware and software modules can provide information to, and receive information from, other hardware and/or software modules. Accordingly, the described hardware modules may be regarded as being communicatively coupled. Where multiple of such hardware or software modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) that connect the hardware or software modules. In embodiments in which multiple hardware modules or software are configured or instantiated at different times, communications between such hardware or software modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware or software modules have access. For example, one hardware or software module may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware or software module may then, at a later time, access the memory device to retrieve and process the stored output. Hardware and software modules may also initiate communications with input or output devices, and can operate on a resource (e.g., a collection of information).
The various operations of example methods described herein may be performed, at least partially, by one or more processors that are temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions. The modules referred to herein may, in some example embodiments, comprise processor-implemented modules.
Similarly, the methods or routines described herein may be at least partially processor-implemented. For example, at least some of the operations of a method may be performed by one or more processors or processor-implemented hardware modules. The performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the processor or processors may be located in a single location (e.g., within a home environment, an office environment, or a server farm), while in other embodiments the processors may be distributed across a number of locations.
The one or more processors may also operate to support performance of the relevant operations in a “cloud computing” environment or as a “software as a service” (SaaS). For example, at least some of the operations may be performed by a group of computers (as examples of machines including processors), these operations being accessible via a network (e.g., the Internet) and via one or more appropriate interfaces (e.g., application program interfaces (APIs)).
In some example embodiments, the one or more processors or processor-implemented modules may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other example embodiments, the one or more processors or processor-implemented modules may be distributed across a number of geographic locations.
Some portions of this specification are presented in terms of algorithms or symbolic representations of operations on data stored as bits or binary digital signals within a machine memory (e.g., a computer memory). These algorithms or symbolic representations are examples of techniques used by those of ordinary skill in the data processing arts to convey the substance of their work to others skilled in the art. As used herein, an “algorithm” or a “routine” is a self-consistent sequence of operations or similar processing leading to a desired result. In this context, algorithms, routines and operations involve physical manipulation of physical quantities. Typically, but not necessarily, such quantities may take the form of electrical, magnetic, or optical signals capable of being stored, accessed, transferred, combined, compared, or otherwise manipulated by a machine. It is convenient at times, principally for reasons of common usage, to refer to such signals using words such as “data,” “content,” “bits,” “values,” “elements,” “symbols,” “characters,” “terms,” “numbers,” “numerals,” or the like. These words, however, are merely convenient labels and are to be associated with appropriate physical quantities.
Unless specifically stated otherwise, discussions herein using words such as “processing,” “computing,” “calculating,” “determining,” “presenting,” “displaying,” or the like may refer to actions or processes of a machine (e.g., a computer) that manipulates or transforms data represented as physical (e.g., electronic, magnetic, or optical) quantities within one or more memories (e.g., volatile memory, non-volatile memory, or a combination thereof), registers, or other machine components that receive, store, transmit, or display information.
As used herein any reference to “one embodiment” or “an embodiment” means that a particular element, feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment.
Some embodiments may be described using the expressions “coupled” and “connected” along with their derivatives. For example, some embodiments may be described using the term “coupled” to indicate that two or more elements are in direct physical or electrical contact. The term “coupled,” however, may also mean that two or more elements are not in direct contact with each other, but yet still cooperate or interact with each other. The embodiments are not limited in this context.
As used herein, the terms “comprises,” “comprising,” “includes,” “including,” “has,” “having” or any other variation thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, article, or apparatus that comprises a list of elements is not necessarily limited to only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Further, unless expressly stated to the contrary, “or” refers to an inclusive or and not to an exclusive or. For example, a condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present), A is false (or not present) and B is true (or present), and both A and B are true (or present).
In addition, the terms “a” or “an” are employed to describe elements and components of the embodiments herein. This is done merely for convenience and to give a general sense of the description. This description should be read to include one or at least one, and the singular also includes the plural unless it is obvious that it is meant otherwise.
Still further, the figures depict preferred embodiments of a map rendering system for purposes of illustration only. One skilled in the art will readily recognize from the following discussion that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles described herein.
Upon reading this disclosure, those of skill in the art will appreciate still additional alternative structural and functional designs for a system and a process for rendering map or other types of images using the principles disclosed herein. Thus, while particular embodiments and applications have been illustrated and described, it is to be understood that the disclosed embodiments are not limited to the precise construction and components disclosed herein. Various modifications, changes and variations, which will be apparent to those skilled in the art, may be made in the arrangement, operation and details of the method and apparatus disclosed herein without departing from the spirit and scope defined in the appended claims.