This application relates generally to the field of data processing and, in an example embodiment, to presentation of maps for multiple floors of a building.
Electronic mapping applications or systems, in which a two-dimensional plan or map view of an area, such as a street map or other kind of geographical map, is presented, have largely supplanted the use of paper maps due to several factors, including, but not limited to, the increased availability of navigational devices, smart phones, tablet computers, and other mobile systems; the enhanced access to the Global Positioning System (GPS) and other communication systems via such devices; and the updateability of electronic maps on these devices via the Internet.
Over time, the use of electronic maps has expanded into intra-building environments, especially for large buildings with multiple floors, each of which may include a significant number of possible internal destinations and routes therebetween. Typically, each floor is presented in isolation as a separate map, depicting the various rooms, open areas, passageways, and other significant features of that floor.
The present disclosure is illustrated by way of example and not limitation in the figures of the accompanying drawings, in which like references indicate similar elements and in which:
The description that follows includes illustrative systems, methods, techniques, instruction sequences, and computing machine program products that exemplify illustrative embodiments. In the following description, for purposes of explanation, numerous specific details are set forth to provide an understanding of various embodiments of the subject matter. It will be evident, however, to those skilled in the art that embodiments of the subject matter may be practiced without these specific details. In general, well-known instruction instances, protocols, structures, and techniques have not been shown in detail.
The user devices 110 may be communicatively coupled to a mapping server 120 or similar system via a communication network 130, which may include any one or more networks or communication connections, such as, for example, a local area network (LAN) (e.g., Ethernet or WiFi®), a wide area network (WAN) (e.g., the Internet), a cellular network (e.g., third-generation (3G) or fourth-generation (4G) network), a Bluetooth® connection, or another communication network or connection.
The mapping server 120 may access mapping data from a mapping database 122 or other data storage device or system and provide the mapping data to the user devices 110 via the communication network 130. The mapping data may include building map data, such as floor maps for each floor of one or more buildings or other public and non-public venues, including, but not limited to, office buildings, apartment buildings, hotels, sports venues (e.g., stadiums, arenas, and so on), private residences, and the like. The mapping data may also include information associated with various features of each of the floor maps, such as, for example, information associated with various organizations (e.g., corporate groups, touring groups, fraternal associations, and so on), information regarding individuals (e.g., name, contact information, organizational information, personal preferences, and so forth), and/or any other information possibly corresponding to the floor maps. The mapping database 122 may also include mapping data for external areas surrounding the one or more buildings (e.g., geographical features, street and building locations and names, and the like). In other examples, the user device 110 may store the mapping data locally, thereby possibly rendering the communication network 130, the mapping server 120, and/or the mapping database 122 superfluous in some embodiments.
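The disclosure does not prescribe a storage format for the mapping data described above. The following is a minimal sketch of how the per-building records served by a system like the mapping server 120 might be organized; the class and field names (`BuildingMap`, `FloorMap`, `Feature`, `floor_for`) are hypothetical, not taken from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class Feature:
    # A room, cubicle, or other point of interest on a floor map.
    feature_id: str
    name: str
    x: float  # position within the floor map, in map units
    y: float

@dataclass
class FloorMap:
    floor_number: int
    features: list  # list of Feature objects on this floor

@dataclass
class BuildingMap:
    building_id: str
    floors: list  # FloorMap objects, ordered bottom to top

def floor_for(building, floor_number):
    """Look up a floor map by its floor number, or return None if absent."""
    for fm in building.floors:
        if fm.floor_number == floor_number:
            return fm
    return None
```

Associated information (organizational data, contact information, and so forth) could hang off `Feature` in the same fashion; the sketch keeps only the geometric core.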
The user input interface 202 may be configured to receive user input indications and selections for directing the user device 200 to perform the various operations and functions discussed in greater detail below. Examples of the user input interface 202 may include, for example, a keyboard, a mouse, a joystick, a touchscreen, and/or the like. In at least some embodiments, a user of the user device 200 may employ the user input interface 202 to select one of a plurality of displayed floor maps, select one or more features of a particular floor map, and so on.
The display interface 204 may be configured to present floor maps and other visual information to a display device (e.g., the display device 112 of
The communication network interface 206 may be configured to communicate with a mapping server (e.g., mapping server 120 of
The presentation module 208 may be configured to present one or more floor maps for multiple floors of one or more buildings on a display (e.g., display device 112 of
As depicted in
In the method 300, a first view of one or more floor maps for one or more floors of a building may be presented by the presentation module 208 for display (operation 302). In one example, the first view may include multiple floor maps of the building, arranged parallel to each other according to the positions of their corresponding floors in the building. Further, to facilitate visualization of at least a portion of each of the floor maps, the presentation module 208 may present the floor maps from a perspective, or angled, view at an angle between perpendicular and parallel to the planes of the floor maps. As a result, some of the information contained in at least some floor maps may be obscured by adjacent displayed floor maps.
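One way to realize the angled stacking described above is to foreshorten the vertical spacing between floor maps according to the viewing angle. The sketch below is an assumption about the geometry, not the disclosure's method: the angle is measured from the floor planes, so an edge-on view (0 degrees) separates the maps fully on screen, while a top-down plan view (90 degrees) collapses them onto one another.

```python
import math

def stacked_offsets(num_floors, floor_spacing, tilt_degrees):
    """Vertical screen offsets for floor maps stacked bottom-to-top.

    tilt_degrees is measured from the plane of the floor maps:
    0 = edge-on (full separation), 90 = plan view (maps coincide).
    """
    tilt = math.radians(tilt_degrees)
    # On-screen separation is the physical spacing foreshortened by cos(tilt).
    step = floor_spacing * math.cos(tilt)
    return [i * step for i in range(num_floors)]
```

An intermediate angle such as 45 degrees yields partial separation, which is what produces the partial obscuring of lower floor maps noted above.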
A user selection of one of the floor maps may then be received via the user input interface 202 (operation 304). As is discussed more fully below, such a user selection may be a graphical selection (e.g., a mouse click, a single or double tap of a touchscreen, and the like) on the display device 112 of the floor map, a selection of a graphical region on the display device 112 that corresponds to the floor map, or the like.
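Resolving which floor map a click or tap lands on can be done by hit-testing the input point against per-floor screen regions. This is a hedged sketch of one such mapping, assuming each floor's region is an axis-aligned rectangle; when regions overlap in the angled view, higher (front-most) floors are checked first so they win the test.

```python
def hit_test(point, regions):
    """Return the floor number whose screen region contains the input point.

    regions: dict mapping floor number -> (x0, y0, x1, y1) screen rectangle.
    Higher floor numbers are checked first, mirroring the on-screen stacking
    order in which upper floors are drawn in front of lower ones.
    """
    px, py = point
    for floor in sorted(regions, reverse=True):
        x0, y0, x1, y1 = regions[floor]
        if x0 <= px <= x1 and y0 <= py <= y1:
            return floor
    return None
```

The same routine serves both direct taps on a floor map and taps on a separate graphical region corresponding to a floor map.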
In response to the user selection, the presentation module 208, using the animation module 210, may present an animation from the first view of the floor maps to a second view of the selected floor map (operation 306). In one example, the presentation module 208 may present the selected floor map in a plan, or top, view. Further, the selected floor map may be presented for display in isolation, without any other floor maps being presented at that time, so that the selected floor map may be displayed without obstruction from another floor map. Further, the animation from the first view to the second view may facilitate spatial understanding regarding the relative position of the selected floor map within the building. In addition, the received user selection, or a separate user selection, may indicate a particular location or feature (e.g., a particular room or cubicle) of the selected floor map. In that example, the presentation module 208 may focus the presentation of the selected floor map on the selected location or feature. Further, the presentation module 208 may scan across the selected floor map and/or zoom in to the selected location or feature of the floor map.
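The animated transition from the first (angled, multi-floor) view to the second (plan, single-floor) view can be produced by interpolating camera parameters such as tilt, zoom, and pan across frames. The following sketch assumes a simple smoothstep easing; the disclosure does not specify an interpolation scheme, so this is illustrative only.

```python
def ease_in_out(t):
    """Smoothstep easing: 0 -> 0, 1 -> 1, with zero slope at both endpoints."""
    return t * t * (3.0 - 2.0 * t)

def animate_view(start, end, num_frames):
    """Interpolate camera parameters between two views.

    start, end: dicts with identical keys of numeric camera parameters
    (e.g. {"tilt": 45.0, "zoom": 1.0}).
    Returns a list of per-frame parameter dicts, inclusive of both endpoints.
    """
    frames = []
    for i in range(num_frames):
        t = ease_in_out(i / (num_frames - 1)) if num_frames > 1 else 1.0
        frames.append({k: start[k] + (end[k] - start[k]) * t for k in start})
    return frames
```

Scanning across the selected floor map and zooming to a selected feature fall out of the same mechanism by interpolating pan and zoom toward the feature's coordinates.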
While
The presentation module 208 may receive, via the user input interface 202, a user indication or selection of a first one of the floor maps (operation 404). In one example, the user indication may be a hovering of a cursor or other pointer over the floor map being selected. In other examples, the user selection may be a single touch of a touchscreen. In yet other examples, the user indication may be a cursor hovering over a graphical area or region of the display corresponding to the floor map to be selected. Many other types of user input indications may be employed in other embodiments.
In response to the user indication, the presentation module 208 may modify at least one other floor map (operation 406) so that the selected floor map may not be obscured or obstructed. In one example, the transparency of the at least one other floor map that is obscuring at least a portion of the selected or indicated floor map may be increased so that at least some previously-obscured features of the indicated floor map may be seen through the at least one other floor map.
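The transparency adjustment described above can be modeled as assigning a reduced opacity to every floor map stacked above the indicated floor, since in the angled view only those floors can obscure it. This sketch assumes fixed alpha values; the disclosure does not specify how much transparency is applied.

```python
def floor_alphas(num_floors, selected, base_alpha=1.0, see_through_alpha=0.25):
    """Per-floor opacity once a floor is indicated.

    Floors above the selected floor (index `selected`, bottom floor = 0),
    which would obscure it in the angled view, become mostly transparent;
    the selected floor and those below keep full opacity.
    """
    return [see_through_alpha if i > selected else base_alpha
            for i in range(num_floors)]
```

Restoring `base_alpha` for all floors when the indication ends returns the display to the original first view.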
Also shown in
In this particular example, the floor maps 501-504 may be presented so as to provide the user a view of the floor maps 501-504 directly from the front of the building. In other examples, the floor maps 501-504 may be presented so that the viewer is viewing the floor maps 501-504 from a side, rear, or corner of the building.
As a result of the arrangement of the floor maps 501-504 and the point of view provided to the user, portions of the first three floor maps 501-503 are partially hidden or obscured from the view of the user by one or more of the adjacent floor maps 502-504 positioned above the floor map 501-503 of interest. To select a particular floor map 501-504 for a more detailed viewing, the user may position or “hover” a cursor, tap a touchscreen, or perform some other input operation with respect to the particular floor map 501-504 or a region 510 corresponding thereto. For example, as shown in a graphical representation of a multi-floor map 600 in
In
In some examples, not all of the floor maps 501-504 associated with a particular building may be displayed simultaneously or concurrently. For example, in buildings with a large number of floors, a contiguous subset of the floors may be represented by floor maps on the display device 112 at any one time. The user may then manipulate a graphical slider bar or provide some other user input to change the specific floor maps being shown at any particular time, while the presentation module 208 provides visual cues to the user as to the position of the displayed floor maps relative to those floor maps that are not currently being displayed.
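Mapping a slider position to the contiguous subset of displayed floors reduces to sliding a fixed-size window over the building's floor indices. A minimal sketch, assuming a normalized slider value in [0, 1] and bottom floor index 0:

```python
def visible_window(total_floors, window_size, slider):
    """Map a slider position in [0, 1] to a contiguous range of floor indices.

    Returns (first, last) inclusive indices, clamped so the window always
    stays inside the building; bottom floor has index 0.
    """
    slider = min(max(slider, 0.0), 1.0)
    max_first = max(total_floors - window_size, 0)
    first = round(slider * max_first)
    last = min(first + window_size - 1, total_floors - 1)
    return first, last
```

The returned range also supplies the visual cue described above, since `first` and `last` locate the displayed window relative to the building as a whole.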
While the first floor map 501 of
As shown in
In some examples, the presentation module 208 may provide more detailed information (e.g., textual information) regarding a user-selected feature of the first floor map 801. For example,
Continuing from the example of
In some examples, the animation between
In various embodiments, the presentation module 208 and the animation module 210 may provide additional animation to indicate a route between two features of either the same floor map or different floor maps. For example, presuming the user has selected the feature 904, as depicted in
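A route between features on different floors can be represented as an ordered list of waypoints for the animation to trace. The sketch below assumes a deliberately simple connectivity model in which each floor has one known stairway or elevator location; real routing within and between floors would be considerably more involved.

```python
def build_route(start, end, stairs):
    """Sketch a multi-floor route as a list of (floor, x, y) waypoints.

    start, end: (floor, x, y) tuples.
    stairs: dict mapping floor number -> (x, y) location of the stairway
    or elevator on that floor (hypothetical single-connector model).
    """
    s_floor, sx, sy = start
    e_floor, ex, ey = end
    if s_floor == e_floor:
        return [start, end]
    route = [start, (s_floor,) + stairs[s_floor]]
    step = 1 if e_floor > s_floor else -1
    # Pass through the stairway location on each intermediate floor.
    for f in range(s_floor + step, e_floor, step):
        route.append((f,) + stairs[f])
    route.append((e_floor,) + stairs[e_floor])
    route.append(end)
    return route
```

An animation module could then trace these waypoints in order, switching the displayed floor map as the floor component of each waypoint changes.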
Various examples discussed thus far may also be incorporated within an external map that may depict the building relative to other buildings or structures of the surrounding environment. To that end,
In other examples, the street map 1200 of
Exemplifying the method 1300 of
In various embodiments, the user may select any of the other regions 1410 to facilitate the display of the corresponding floor map within the street map 1400. In some examples, the user may provide an additional user input to transition from the street map 1400 of
In at least some of the embodiments described above, multiple floor maps corresponding to the floors or levels of a building may be presented to a user. The multiple floor maps may be presented in a kind of “2.5-dimensional” view in which the floor maps are displayed according to their arrangement within a building. User inputs selecting or indicating a particular floor map may cause the selected floor map to be presented to the user in an unobstructed 2.5-dimensional mode or a two-dimensional mode, possibly with zooming and/or scanning operations to focus the attention of the user on a particular feature of interest to the user. In addition, transitions between the various views may be animated to facilitate a greater understanding of how the selected floor maps and included features are spatially related to the building and other floors. By facilitating this greater understanding, the technical effect of at least some of the various embodiments may include reduced consumption of communication bandwidth and/or reduced consumption of processing resources, including graphics processing resources, due to the user not requiring as many separate map views to navigate a particular building.
The machine is capable of executing a set of instructions 1524 (sequential or otherwise) that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
The example of the processing system 1500 includes a processor 1502 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), or both), a main memory 1504 (e.g., random access memory), and static memory 1506 (e.g., static random-access memory), which communicate with each other via bus 1508. The processing system 1500 may further include video display unit 1510 (e.g., a plasma display, a liquid crystal display (LCD), or a cathode ray tube (CRT)). The processing system 1500 also includes an alphanumeric input device 1512 (e.g., a keyboard), a user interface (UI) navigation device 1514 (e.g., a mouse), a disk drive unit 1516, a signal generation device 1518 (e.g., a speaker), and a network interface device 1520.
The disk drive unit 1516 (a type of non-volatile memory storage) includes a machine-readable medium 1522 on which is stored one or more sets of data structures and instructions 1524 (e.g., software) embodying or utilized by any one or more of the methodologies or functions described herein. The data structures and instructions 1524 may also reside, completely or at least partially, within the main memory 1504, the static memory 1506, and/or the processor 1502 during execution thereof by processing system 1500, with the main memory 1504, the static memory 1506, and the processor 1502 also constituting machine-readable, tangible media.
The data structures and instructions 1524 may further be transmitted or received over a computer network 1550 via network interface device 1520 utilizing any one of a number of well-known transfer protocols (e.g., HyperText Transfer Protocol (HTTP)).
Certain embodiments are described herein as including logic or a number of components, modules, or mechanisms. Modules may constitute either software modules (e.g., code embodied on a machine-readable medium or in a transmission signal) or hardware modules. A hardware module is a tangible unit capable of performing certain operations and may be configured or arranged in a certain manner. In example embodiments, one or more computer systems (e.g., the processing system 1500) or one or more hardware modules of a computer system (e.g., a processor 1502 or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.
In various embodiments, a hardware module may be implemented mechanically or electronically. For example, a hardware module may include dedicated circuitry or logic that is permanently configured (for example, as a special-purpose processor, such as a field-programmable gate array (FPGA) or an application-specific integrated circuit (ASIC)) to perform certain operations. A hardware module may also include programmable logic or circuitry (for example, as encompassed within a general-purpose processor 1502 or other programmable processor) that is temporarily configured by software to perform certain operations. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (for example, configured by software) may be driven by cost and time considerations.
Accordingly, the term “hardware module” should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired) or temporarily configured (e.g., programmed) to operate in a certain manner and/or to perform certain operations described herein. Considering embodiments in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time. For example, where the hardware modules include a general-purpose processor 1502 that is configured using software, the general-purpose processor 1502 may be configured as respective different hardware modules at different times. Software may accordingly configure the processor 1502, for example, to constitute a particular hardware module at one instance of time and to constitute a different hardware module at a different instance of time.
Modules can provide information to, and receive information from, other modules. For example, the described modules may be regarded as being communicatively coupled. Where multiples of such hardware modules exist contemporaneously, communications may be achieved through signal transmissions (such as, for example, over appropriate circuits and buses that connect the modules). In embodiments in which multiple modules are configured or instantiated at different times, communications between such modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple modules have access. For example, one module may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further module may then, at a later time, access the memory device to retrieve and process the stored output. Modules may also initiate communications with input or output devices, and can operate on a resource (for example, a collection of information).
The various operations of example methods described herein may be performed, at least partially, by one or more processors 1502 that are temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors 1502 may constitute processor-implemented modules that operate to perform one or more operations or functions. The modules referred to herein may, in some example embodiments, include processor-implemented modules.
Similarly, the methods described herein may be at least partially processor-implemented. For example, at least some of the operations of a method may be performed by one or more processors 1502 or processor-implemented modules. The performance of certain of the operations may be distributed among the one or more processors 1502, not only residing within a single machine but deployed across a number of machines. In some example embodiments, the processors 1502 may be located in a single location (e.g., within a home environment, within an office environment, or as a server farm), while in other embodiments, the processors 1502 may be distributed across a number of locations.
While the embodiments are described with reference to various implementations and exploitations, it will be understood that these embodiments are illustrative and that the scope of claims provided below is not limited to the embodiments described herein. In general, the techniques described herein may be implemented with facilities consistent with any hardware system or hardware systems defined herein. Many variations, modifications, additions, and improvements are possible.
Plural instances may be provided for components, operations, or structures described herein as a single instance. Finally, boundaries between various components, operations, and data stores are somewhat arbitrary, and particular operations are illustrated in the context of specific illustrative configurations. Other allocations of functionality are envisioned and may fall within the scope of the claims. In general, structures and functionality presented as separate components in the exemplary configurations may be implemented as a combined structure or component. Similarly, structures and functionality presented as a single component may be implemented as separate components. These and other variations, modifications, additions, and improvements fall within the scope of the claims and their equivalents.
This written description uses examples to disclose various embodiments, including the best mode thereof, and also to enable any person skilled in the art to practice the embodiments, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the embodiments is defined by the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if those examples include structural elements that do not differ from the literal language of the claims, or if the examples include equivalent structural elements with insubstantial differences from the literal language of the claims.
U.S. Patent Documents

Number | Name | Date | Kind
---|---|---|---
6201544 | Ezaki | Mar 2001 | B1 |
7075512 | Fabre | Jul 2006 | B1 |
7280105 | Cowperthwaite | Oct 2007 | B2 |
7383148 | Ahmed | Jun 2008 | B2 |
7469381 | Ording | Dec 2008 | B2 |
7512450 | Ahmed | Mar 2009 | B2 |
7548833 | Ahmed | Jun 2009 | B2 |
7567844 | Thomas | Jul 2009 | B2 |
7610910 | Ahmed | Nov 2009 | B2 |
7665670 | Ahmed | Feb 2010 | B2 |
7705863 | Rye et al. | Apr 2010 | B2 |
7728853 | Plocher et al. | Jun 2010 | B2 |
7760187 | Kennedy | Jul 2010 | B2 |
7954070 | Plocher | May 2011 | B2 |
8229176 | Seegers et al. | Jul 2012 | B2 |
8233008 | Jin | Jul 2012 | B2 |
8464181 | Bailiang | Jun 2013 | B1 |
8681153 | Houllier | Mar 2014 | B2 |
8862532 | Beaulieu | Oct 2014 | B2 |
9134886 | Bailiang | Sep 2015 | B2 |
9323420 | Bailiang | Apr 2016 | B2 |
20010016796 | Ata | Aug 2001 | A1 |
20050252984 | Ahmed | Nov 2005 | A1 |
20080104531 | Stambaugh | May 2008 | A1 |
20080172632 | Stambaugh | Jul 2008 | A1 |
20080177510 | Jin | Jul 2008 | A1 |
20100115455 | Kim | May 2010 | A1 |
20110214050 | Stambaugh | Sep 2011 | A1 |
20120293527 | Hoffknecht et al. | Nov 2012 | A1 |
20120297346 | Hoffknecht et al. | Nov 2012 | A1 |
20130030702 | Yamamoto | Jan 2013 | A1 |
20130226515 | Pershing | Aug 2013 | A1 |
20130325319 | Moore | Dec 2013 | A1 |
20130325343 | Blumenberg | Dec 2013 | A1 |
20140046627 | Pershing | Feb 2014 | A1 |
20140111520 | Cline | Apr 2014 | A1 |
20140245232 | Bailiang | Aug 2014 | A1 |
20140253538 | Bailiang | Sep 2014 | A1 |
20140278060 | Kordari | Sep 2014 | A1 |
20140365121 | Cline | Dec 2014 | A1 |
20150020008 | Pensack-Rinehart | Jan 2015 | A1 |
20150094952 | Moeglein | Apr 2015 | A1 |
20150193416 | Hagiwara | Jul 2015 | A1 |
20150193469 | Hagiwara | Jul 2015 | A1 |
Foreign Patent Documents

Number | Date | Country
---|---|---
103234539 | Aug 2013 | CN |
203204637 | Sep 2013 | CN |
103335659 | Oct 2013 | CN |
Other Publications

Fallon, M. F., et al., "Sensor fusion for flexible human-portable building-scale mapping", 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Oct. 7-12, 2012, Vilamoura, Algarve, Portugal, (2012), 4405-4412.

Iocchi, L., et al., "Building multi-level planar maps integrating LRF, stereo vision and IMU sensors", IEEE International Workshop on Safety, Security and Rescue Robotics, SSRR 2007, (2007), 1-6.
Publication Information

Number | Date | Country
---|---|---
20160102983 A1 | Apr 2016 | US