Systems and methods to provide search results based on time to obtain information

  • Patent Grant
  • Patent Number
    10,991,022
  • Date Filed
    Thursday, November 16, 2017
  • Date Issued
    Tuesday, April 27, 2021
Abstract
Systems and methods for providing search results based on time to obtain an item are discussed. In an example, a method can include receiving a search query, generating search results, calculating a time to obtain for each listing in the search results, and arranging the search results for display. The search query can include a current location of a mobile device. The search results can include a plurality of listings, with each listing including a location. The locations represent either a physical location proximate the current location or a delivery time to the current location. The search results can be arranged for display in various manners according to the calculated time to obtain for each result.
Description
BACKGROUND

The Internet and the World Wide Web have given rise to a wide variety of on-line retailers that operate virtual stores from which consumers can purchase products (i.e., merchandise, or goods) as well as services. Although the popularity of these on-line retail sites is clearly evidenced by their increasing sales, for a variety of reasons, some consumers may still prefer to purchase products and services in a more conventional manner—i.e., via a brick-and-mortar store. Even when purchasing products and services from a brick-and-mortar store, various web-based applications and tools may be used to perform preliminary research about products and services. Increasingly, brick-and-mortar stores are providing online and mobile-focused purchasing options to augment traditional in-store sales and to compete with purely online retailers.





BRIEF DESCRIPTION OF THE DRAWINGS

Some embodiments are illustrated by way of example and not limitation in the figures of the accompanying drawings:



FIG. 1A is a block diagram depicting a system for delivering search results, where the search results are arranged in ascending order of the distance and/or travel time between a user's location and the location of a product or service presented with a search result listing, according to an example embodiment.



FIG. 1B is a block diagram depicting a system for providing a network-based navigated shopping service, according to an example embodiment.



FIG. 2 is a block diagram illustrating an environment for operating a mobile device, according to an example embodiment.



FIG. 3 is a block diagram illustrating a mobile device, according to an example embodiment.



FIG. 4 is a block diagram illustrating a network-based system for delivering search results arranged by distance and/or time and/or navigated shopping services, according to an example embodiment.



FIG. 5 is a block diagram illustrating shopping modules, according to an example embodiment.



FIG. 6 is a flowchart illustrating a method for processing a search query, according to an example embodiment.



FIG. 7 illustrates an example user interface showing search results presented consistent with some embodiments of the invention.



FIG. 8 illustrates an example user interface of an application for a mobile device including an interactive map on which search results can be shown, according to some embodiments of the invention.



FIG. 9 illustrates a user interface showing a map and a multi-axis table presenting the search results arranged by a combination of their price and distance and/or travel time, according to an example embodiment of the invention.



FIGS. 10A-C illustrate example user interfaces presenting an interactive map on a tablet or mobile computing device for use in defining a corridor or route with which to identify or filter relevant search results, according to some embodiments of the invention.



FIG. 11 illustrates an example user interface displaying search results within a corridor on an interactive map, according to some embodiments of the invention.



FIG. 12 is a flowchart illustrating a method for navigated shopping, according to an example embodiment.



FIG. 13 is a flowchart illustrating a method for providing navigated shopping services, according to an example embodiment.



FIG. 14A is an illustration of a mobile device display, according to an example embodiment.



FIG. 14B is an illustration of a shopping list for use with a navigated shopping system, according to an example embodiment.



FIG. 15 is an illustration of a mobile device display depicting a shopping route provided by a navigated shopping service, according to an example embodiment.



FIG. 16 is a flowchart illustrating a method for providing location-based services, according to an example embodiment.



FIG. 17 is a user interface diagram depicting location-based services operating on a client device, according to an example embodiment.



FIG. 18 is a diagram depicting a location-based service operating on a client device, according to an example embodiment.



FIG. 19 is a diagrammatic representation of a machine in the example form of a computer system within which a set of instructions for causing the machine to perform any one or more of the methodologies discussed herein may be executed.





DEFINITIONS

Location—For the purposes of this specification and the associated claims, the term “location” is used to refer to a geographic location, such as a longitude/latitude combination or a street address. The term “location” is also used within this specification in reference to a physical location associated with a retail outlet (e.g., store).


Real-time—For the purposes of this specification and the associated claims, the term “real-time” is used to refer to calculations or operations performed on-the-fly as events occur or input is received by the operable system. However, the use of the term “real-time” is not intended to preclude operations that cause some latency between input and response, so long as the latency is an unintended consequence induced by the performance characteristics of the machine.


DETAILED DESCRIPTION

Example systems and methods for providing and using various location-enabled shopping functions are described. In an example, the systems and methods can provide data processing techniques for processing a search query to identify a set of search results relating to products and/or services that are being offered either online or via a brick-and-mortar retail store, and then presenting those search results in a manner that conveys to the viewer, simply and intuitively, the distance and/or travel time between a location of the user and a product or service presented within each individual search result associated with a brick-and-mortar store. In another example, the systems and methods can provide data processing techniques for establishing a search query, and then processing the search query to identify a set of search results that satisfy the search query, at least in part, because the search results are associated with brick-and-mortar stores that are in close proximity to a user-defined corridor, or commute route. In yet another example, the systems and methods can provide a navigated shopping service allowing a user to use a location-aware mobile device to shop en route to a desired destination. In an example, the navigated shopping service can work in coordination with registered merchants to provide convenient access to items on a user's shopping list with minimal disruption or deviation from a planned route.


In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the various aspects of different embodiments of the present invention. It will be evident, however, to one skilled in the art, that the present inventive subject matter may be practiced without all of the specific details.


Consistent with some embodiments of the invention, a web-based search engine processes search queries and returns, to a requesting client computing device, search results that are arranged based at least in part on the travel time required to travel from a first location (e.g., the user's current location) to the location of a store at which a product or service associated with a search result is being offered. Accordingly, the search results may be presented such that the closest store (in terms of travel time) that is offering a product satisfying the search query is presented first in the list of search results. With some embodiments, the user can toggle the presentation of search results between distance and time, such that the search results can be ordered based on distance (e.g., the geodesic distance, or distance as the crow flies) or, the more practical and useful measure, based on the travel time required to travel between the location of the user and the location of the store offering the product or service presented in a search result. With some embodiments, the user can specify a mode of transportation (e.g., walking, biking, automobile, public transportation, etc.) and the travel time will be derived based on the routes available when travelling via the selected mode. With some embodiments, various filtering criteria may be applied. For example, the user may request that only search results satisfying the query and associated with a store that is within a predefined travel time, or distance, be presented. Similarly, a user may select one or more specific stores (e.g., Apple® Store, Best Buy®, Walmart®, and so forth) to see only search results showing products/services being offered by those stores.
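As a rough illustration of the ordering and filtering described above, the following Python sketch sorts a set of result listings by either travel time or distance and applies optional travel-time and store filters. The Listing type, its field names, and the arrange_results function are hypothetical names chosen for this example; they are not part of the described system, and real travel times would come from a routing service rather than being stored on the listing.

```python
from dataclasses import dataclass
from typing import List, Optional, Set

@dataclass
class Listing:
    title: str
    price: float
    store_name: str
    distance_miles: float      # geodesic distance from the user's location
    travel_time_min: float     # travel time for the selected mode of transportation

def arrange_results(results: List[Listing],
                    sort_by: str = "travel_time",
                    max_travel_time_min: Optional[float] = None,
                    allowed_stores: Optional[Set[str]] = None) -> List[Listing]:
    """Filter by travel time and store, then order by travel time or distance."""
    filtered = results
    if max_travel_time_min is not None:
        filtered = [r for r in filtered if r.travel_time_min <= max_travel_time_min]
    if allowed_stores is not None:
        filtered = [r for r in filtered if r.store_name in allowed_stores]
    if sort_by == "travel_time":
        key = lambda r: r.travel_time_min
    else:
        key = lambda r: r.distance_miles
    return sorted(filtered, key=key)

# Example: nearest-in-time results first, limited to stores within 60 minutes.
results = [
    Listing("55-inch LED TV", 2499.00, "Best Buy", 2.1, 5),
    Listing("55-inch LED TV", 2399.00, "Walmart", 8.4, 22),
    Listing("55-inch LED TV", 2549.00, "Apple Store", 35.0, 75),
]
for r in arrange_results(results, max_travel_time_min=60):
    print(r.store_name, r.travel_time_min, "min", r.price)
```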


With some embodiments, parameters for the search query may be established by a user interacting with a map. In particular, with some embodiments, a user may interact with a map to specify any one or more of: the starting location to be used for deriving the distance or travel time to the products or services associated with the individual search results; a specific geographical region of interest, from which to display search results; or a corridor or commuting route, from which any point along the corridor can be used as the starting point to derive the distance to a store associated with a search result. The map may be presented as part of a web-based map application, a desktop computer application, or an application that is specific to a particular mobile computing platform (such as Apple's iOS or Google's Android operating systems).


For instance, with some embodiments, a user may simply select a location on a map (using a pointing device, such as a mouse, or a finger or stylus with a touch screen display) to indicate his current location, or a location at which the user will be at some future time and from which the distance or travel time to other locations should be derived. With some embodiments, the user may define a circular region by simply dragging his or her finger or stylus (or taking a similar action with a pointing device) to specify a diameter, or radius, of a circle making up a circular region (see example illustrated in FIG. 8) from which search results should be presented. In other embodiments, the user may simply draw any closed shape to specify the region to be searched for local results. In yet other embodiments, the user may trace or highlight a specific road or set of roads that comprise a route that the user frequently travels. This route can then be used to identify stores associated with search results, such that those stores closest in distance or travel time to any point along the route can be presented.


With some embodiments, the search results may be shown in a simple list, with each individual result including any one or more of: a price at which a product or service is being offered; a store name (virtual online, or physical, e.g., local brick and mortar); a brief description of the product or service being offered; a physical distance from a user's location to the store at which the product or service is being offered; a link to share the listing with someone; a link to a map showing directions from the user's current location, or some user-specified location, to the store; a travel time indicating how long (in time) it would take the user to travel via a particular mode to the location of the store at which the product is being offered; and a quantity of a product being offered at the store. With some embodiments, search results for online virtual stores are shown separately from the search results for locally available products and/or services, while in some embodiments, the search results are intermingled, and/or arranged according to some other specific aspect or attribute, such as price. With some embodiments, a user may filter the search results to only view products and/or services: located at stores within a threshold distance; located at stores within a threshold travel time (where the mode of travel can be specified); that have prices that are less than, or exceed, some threshold price; and so forth.
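The fields enumerated above can be captured in a simple record. The sketch below is one possible representation; the ResultListing type and its field names are hypothetical choices made for illustration, and online listings simply omit the location-dependent fields.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ResultListing:
    price: float
    store_name: str                   # virtual/online store or local brick-and-mortar store
    description: str
    distance_miles: Optional[float]   # physical distance to the store, if local
    travel_time_min: Optional[float]  # travel time for the chosen mode, if local
    quantity_in_stock: Optional[int]
    share_url: str = ""               # link to share the listing with someone
    directions_url: str = ""          # link to a map with directions to the store

# An online listing has no distance or travel time; a local listing carries both.
online = ResultListing(2399.00, "ExampleOnlineStore", "55-inch LED TV", None, None, None)
local = ResultListing(2499.00, "Best Buy", "55-inch LED TV", 2.1, 5, 3)
print(online, local, sep="\n")
```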


With some embodiments, the search results may be presented via a multi-axis table, with each axis representing a single attribute of interest. For example, with some embodiments, the search results may be represented as graphics or icons presented along an X and Y axis with the X axis representing distance or travel time, and the Y axis representing price (see example user interface of FIG. 9). With such an embodiment, the user can very quickly get a sense of which local store is both closest (in distance and/or travel time), and has a product at a reasonable price relative to other stores. Other beneficial aspects of the embodiments of the present invention will be readily apparent from the description of the figures that follows.
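The sketch below shows one way such a multi-axis presentation could be computed: each result is scaled onto X (travel time) and Y (price) chart coordinates so that the closest, least expensive options cluster toward the origin. The chart_positions function, its parameters, and the sample values are assumptions made for this illustration.

```python
from typing import List, Tuple

def chart_positions(results: List[Tuple[str, float, float]],
                    width: int = 100, height: int = 100) -> List[Tuple[str, int, int]]:
    """Scale each (store, travel_time_min, price) result onto an X/Y chart.
    X grows with travel time and Y grows with price, so the lower-left corner
    holds results that are both close and inexpensive."""
    times = [t for _, t, _ in results]
    prices = [p for _, _, p in results]
    t_span = (max(times) - min(times)) or 1
    p_span = (max(prices) - min(prices)) or 1
    return [(store,
             round((t - min(times)) / t_span * width),
             round((p - min(prices)) / p_span * height))
            for store, t, p in results]

points = chart_positions([("Best Buy", 5, 2499.0),
                          ("Walmart", 22, 2399.0),
                          ("Apple Store", 75, 2549.0)])
print(points)
```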


Consistent with some embodiments of the invention, an application executing on a client computing device allows a user to interact with a user interface presented via the client computing device to provide one or more search terms for use in a search query to be executed against a set of marketplace listings hosted at a publication server system. The search query is then communicated to a search engine operating in conjunction with the publication server system, and processed to identify search results that satisfy the user-provided search terms. In addition to communicating the search term(s) as part of the search query, the client computing device communicates to the search engine information identifying a corridor or route, such as the portions of one or more roads on which a user tends to travel as part of a daily commute. The search engine then processes the search query to identify the marketplace listings that satisfy the search query, and are associated with a brick-and-mortar store that is within some predefined distance of a point along the corridor or commute route.


With some embodiments, the information identifying the corridor or commute route is established by the user interacting with an interactive map that is part of the application that facilitates the search query. For example, with some embodiments, a user may select (e.g., with a finger or stylus on a touch sensitive display, or a pointing device with a conventional display) a starting location on a map, and then trace or drag his finger (or stylus, or pointing device) along a road, or set of roads, displayed in the map, thereby selecting or otherwise specifying a route frequently travelled by the user. With other embodiments, a software application may analyze location information generated by a location based device (e.g., a global positioning system (GPS) device) that is part of a mobile computing device. Accordingly, the corridor or commute route may be automatically determined by analyzing the historical location information generated by the mobile computing device.


With some embodiments, the user may specify a threshold distance that s/he is willing to travel from his or her commute path to visit a store and obtain a product/service associated with a search result. Accordingly, only those search results that are associated with products/services being offered at locations within the predefined threshold will be displayed or presented. For example, the search engine will use the information identifying the corridor or commute path and the threshold distance to identify only the relevant search results. Alternatively, the search engine, or an application on the client computing device, will filter the search results to only display the relevant results (i.e., those satisfying the distance threshold defined by the user). With some embodiments, the search results are presented on a map, for example, as pins or icons positioned on the map in the location of the store associated with a product/service being offered via the search result. In such instances, the pins or icons are generally interactive, such that, when selected, additional information about a relevant product/service and/or store is presented.
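One plausible way to implement the corridor filter described above is to approximate the route as a list of sampled (latitude, longitude) points and keep only stores whose great-circle distance to the nearest sample falls within the user's threshold. The haversine_miles and within_corridor helpers below are hypothetical; a production system would more likely use a routing service and true travel times rather than straight-line distance.

```python
import math
from typing import List, Tuple

def haversine_miles(a: Tuple[float, float], b: Tuple[float, float]) -> float:
    """Great-circle distance in miles between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (a[0], a[1], b[0], b[1]))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 3958.8 * 2 * math.asin(math.sqrt(h))

def within_corridor(store: Tuple[float, float],
                    route_points: List[Tuple[float, float]],
                    threshold_miles: float) -> bool:
    """True if the store lies within the threshold of any sampled route point."""
    return min(haversine_miles(store, p) for p in route_points) <= threshold_miles

# Example: a short commute sampled at a few (lat, lon) points.
route = [(37.33, -121.89), (37.35, -121.93), (37.37, -121.97)]
print(within_corridor((37.36, -121.95), route, threshold_miles=2.0))
```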


In accordance with an example embodiment, a network-based system can provide a platform to provide both a navigated shopping service and integrated payment capabilities. In certain examples, the navigated shopping service and the payment service can be provided by separate systems. In some examples, the user interacts with a network-based system (e.g., navigated shopping service and/or payment service) via a mobile device, such as a smartphone, a tablet computing device, or an in-vehicle intelligent navigation system, among others. In an example, the network-based system can include a publication module capable of delivering location-based offers to a user based on stored user profile data, local merchant campaigns, and current or future location data. In certain examples, using the navigated shopping service can include configuring automatic payments via an integrated or partner payment processing system, such as the PayPal™ payment system (from eBay Inc. of San Jose, Calif.).


Example System



FIG. 1A is a block diagram depicting a system 100 for delivering search results, according to an example embodiment. The system 100 can include a user 110, a network-based publication system 120 with a search engine, and one or more merchants 130 (and merchant systems). In an example, the user 110 can connect to the network-based publication system 120 via a mobile device 115 (e.g., desktop, laptop, smart phone, PDA, or similar electronic device capable of some form of data connectivity). The network-based publication system 120 will receive and process a query from the user's mobile device 115. Generally, location information specifying the physical or geographical location of the user 110 will be received with the query. For example, the mobile device 115 can include a GPS unit to inform the mobile device 115 of its location, such that the location information of the mobile device 115 can be shared with the network-based publication system 120. Other known techniques for deriving location information may be used with both mobile and non-mobile client computing devices, for example, such as desktop computers, etc. For instance, with some embodiments, the location information indicating the location of the user 110 may be explicitly specified by the user 110, for example, by the user 110 interacting with a map.


In an example, the merchant 130 can operate computer systems, such as an inventory system 132 or a POS system 134. The network-based publication system 120 can interact with any of the systems 132, 134 used by merchant 130 for operation of the merchant's retail or service business. In an example, the network-based publication system 120 can work with both POS system 134 and inventory system 132 to obtain access to inventory available at individual retail locations run by the merchant 130. This inventory information can be used in both generating product or service listings, and selecting and ordering search results served by the network-based publication system 120.



FIG. 1B is a block diagram depicting a system 100B for providing a network-based navigated shopping service. The system 100B can include a vehicle 150, a user 110, a network-based navigated shopping service system 125 (also referred to as network-based shopping system 125), a merchant 130, and a network-based payment service 140. In an example, the user 110 can connect to the network-based shopping system 125 via a mobile device 115A (e.g., smart phone, PDA, laptop, or similar mobile electronic device capable of some form of data connectivity). In another example, a user, such as user 110, can be within a vehicle 150 and connect to the network-based navigated shopping system 125 via a mobile device 115B. In certain examples, the mobile device 115B can include an in-vehicle programmable navigation system. For simplicity, mobile device 115B and mobile device 115A will be referred to collectively as mobile device 115 (unless specifically noted, analogous to mobile device 115 illustrated in FIG. 1A). In an example, the merchant 130 can operate computer systems, such as an inventory system 132 or a point of sale (POS) system 134, among others. The network-based shopping system 125 can interact with any of the systems 132, 134 used by merchant 130 for operation of the merchant's retail or service business. In an example, the network-based shopping system 125 can work with both the POS system 134 and inventory system 132 to obtain access to inventory available at individual retail locations run by the merchant 130 and match merchandise to items desired by users of the network-based shopping system 125. Additionally, the network-based payment service 140 can also interact with the merchant 130 via the POS system 134, enabling the network-based shopping system 125 to provide payment services offered by the network-based payment service 140 and make payments directly to the merchant 130. In certain examples, the network-based shopping system 125 and the payment service 140 can be integrated.


Example Operating Environment


With some embodiments, the user may explicitly indicate or specify his current location for use in deriving a distance or travel time to stores offering products/services. However, with some embodiments, location information of the user may be derived with a mobile computing device of the user. In some additional embodiments, the user may explicitly indicate or specify his daily travelled commute route (or some other route) for use in determining a set of search results based on the distance or travel time to stores offering products/services. However, with some embodiments, location information (and particularly, the commute or travel path) of the user may be derived with a mobile computing device of the user.



FIG. 2 is a block diagram illustrating an environment 200 for operating a mobile device 115, according to an example embodiment. The environment 200 is an example environment within which methods of serving search results can be operated. In another example, environment 200 can host a navigated shopping service, which can also include the methods of serving search results discussed herein. The environment 200 can include a mobile device 115, a communication connection 210, a network 220, servers 230, a communication satellite 270, a merchant server 280, and a database 290. The servers 230 can optionally include location based service application 240, location determination application 250, publication application 260 with search engine 261, and payment application 265. The database 290 can optionally include merchant databases 292, user profile database 294, and/or location history database 296. The mobile device 115 represents one example device that can be utilized by a user to receive offers and share context information associated with the user. The mobile device 115 may be any of a variety of types of devices (for example, a cellular telephone, a PDA, a Personal Navigation Device (PND), a handheld computer, a tablet computer, a notebook computer, or other type of movable device). The mobile device 115 may interface via a connection 210 with a communication network 220. Depending on the form of the mobile device 115, any of a variety of types of connections 210 and communication networks 220 may be used.


For example, the connection 210 may be a Code Division Multiple Access (CDMA) connection, a Global System for Mobile communications (GSM) connection, or other type of cellular connection. Such a connection 210 may implement any of a variety of types of data transfer technology, such as Single Carrier Radio Transmission Technology (1×RTT), Evolution-Data Optimized (EVDO) technology, General Packet Radio Service (GPRS) technology, Enhanced Data rates for GSM Evolution (EDGE) technology, or other data transfer technology (e.g., fourth generation wireless, 4G networks). When such technology is employed, the communication network 220 may include a cellular network that has a plurality of cell sites of overlapping geographic coverage, interconnected by cellular telephone exchanges. These cellular telephone exchanges may be coupled to a network backbone (for example, the public switched telephone network (PSTN), a packet-switched data network, or other types of networks).


In another example, the connection 210 may be a Wireless Fidelity (Wi-Fi, IEEE 802.11x type) connection, a Worldwide Interoperability for Microwave Access (WiMAX) connection, or another type of wireless data connection. In such an embodiment, the communication network 220 may include one or more wireless access points coupled to a local area network (LAN), a wide area network (WAN), the Internet, or other packet-switched data network.


In yet another example, the connection 210 may be a wired connection, for example an Ethernet link, and the communication network may be a LAN, a WAN, the Internet, or other packet-switched data network. Accordingly, a variety of different configurations are expressly contemplated.


A plurality of servers 230 may be coupled via interfaces to the communication network 220, for example, via wired or wireless interfaces. These servers 230 may be configured to provide various types of services to the mobile device 115. For example, one or more servers 230 may execute location based service (LBS) applications 240, which interoperate with software executing on the mobile device 115, to provide LBSs to a user. LBSs can use knowledge of the device's location, and/or the location of other devices and/or retail stores, etc., to provide location-specific information, recommendations, notifications, interactive capabilities, and/or other functionality to a user. With some embodiments, the LBS operates in conjunction with the publication application 260 and search engine 261, in particular, to provide search results that are arranged based on the distance or travel time between a mobile device 115 (or other computer device) and a retail store. For example, an LBS application 240 can provide location data to a network-based publication system 120, which can then be used to arrange a set of search results, based on distance and/or travel time between two locations. Knowledge of the mobile device's location, and/or the location of other devices, may be obtained through interoperation of the mobile device 115 with a location determination application 250 executing on one or more of the servers 230. Location information may also be provided by the mobile device 115, without use of a location determination application such as application 250. In certain examples, the mobile device 115 may have some limited location determination capabilities that are augmented by the location determination application 250.


Example Mobile Device



FIG. 3 is a block diagram illustrating the mobile device 115, according to an example embodiment. The mobile device 115 may include a processor 310. The processor 310 may be any of a variety of different types of commercially available processors suitable for mobile devices (for example, an XScale architecture microprocessor, a Microprocessor without Interlocked Pipeline Stages (MIPS) architecture processor, or another type of processor). A memory 320, such as a Random Access Memory (RAM), a Flash memory, or other type of memory, is typically accessible to the processor 310. The memory 320 may be adapted to store an operating system (OS) 330, as well as application programs 340, such as mobile location-enabled applications that may provide LBSs to a user. The processor 310 may be coupled, either directly or via appropriate intermediary hardware, to a display 350 and to one or more input/output (I/O) devices 360, such as a keypad, a touch panel sensor, a microphone, and the like. Similarly, in some embodiments, the processor 310 may be coupled to a transceiver 370 that interfaces with an antenna 390. The transceiver 370 may be configured to both transmit and receive cellular network signals, wireless data signals, or other types of signals via the antenna 390, depending on the nature of the mobile device 115. In this manner, the connection 210 with the communication network 220 may be established. Further, in some configurations, a GPS receiver 380 may also make use of the antenna 390 to receive GPS signals.


Additional detail regarding providing and receiving location-based services can be found in U.S. Pat. No. 7,848,765, titled “Location-Based Services,” granted to Phillips et al. and assigned to Where, Inc. of Boston, Mass., which is hereby incorporated by reference.


An example geo-location concept discussed within U.S. Pat. No. 7,848,765 is a geofence. A geofence can be defined as a perimeter or boundary around a physical location or mobile object (e.g., a user). A geofence can be as simple as a radius around a physical location defining a circular region around the location. However, a geofence can be any geometric shape or an arbitrary boundary drawn on a map. A geofence can be used to determine a geographical area of interest for the calculation of demographics, advertising, presenting search results, or similar purposes. Geofences can be used in conjunction with identifying and presenting search results, as described herein. For example, a geofence can be used to assist in determining whether a user (or mobile device associated with the user) is within a geographic area of a particular merchant. If the user is within a geofence established by the merchant or the publication system, the systems discussed herein can use that information to identify and present search results (e.g., via a mobile device associated with the user).
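For an arbitrarily shaped geofence of the kind described above, a standard point-in-polygon (ray-casting) test can decide whether a user's coordinates fall inside the boundary. The sketch below is a minimal illustration with made-up vertices; it is not drawn from the referenced patent, and a circular geofence would instead use a simple radius check.

```python
from typing import List, Tuple

def point_in_polygon(point: Tuple[float, float],
                     polygon: List[Tuple[float, float]]) -> bool:
    """Ray-casting test: is the (lat, lon) point inside the polygon's vertices?"""
    lat, lon = point
    inside = False
    j = len(polygon) - 1
    for i in range(len(polygon)):
        lat_i, lon_i = polygon[i]
        lat_j, lon_j = polygon[j]
        crosses = (lon_i > lon) != (lon_j > lon)
        if crosses and lat < (lat_j - lat_i) * (lon - lon_i) / (lon_j - lon_i) + lat_i:
            inside = not inside
        j = i
    return inside

# Example: a rough four-corner geofence around a merchant's neighborhood.
fence = [(37.30, -121.95), (37.30, -121.85), (37.40, -121.85), (37.40, -121.95)]
print(point_in_polygon((37.35, -121.90), fence))   # True: the user is inside the fence
```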


Example Platform Architecture



FIG. 4 is a block diagram illustrating a network-based system 400 for processing a search query, and presenting search results, as described more fully herein. The network-based system 400 can also provide navigated shopping services as described more fully herein. The block diagram depicts a network-based system 400 (in the exemplary form of a client-server system), within which an example embodiment can be deployed. A networked system 402 is shown, in the example form of a network-based location-aware publication, advertisement, or marketplace system, that provides server-side functionality, via a network 404 (e.g., the Internet or WAN) to one or more client machines 410, 412. FIG. 4 illustrates, for example, a web client 406 (e.g., a browser, such as the Internet Explorer browser developed by Microsoft Corporation of Redmond, Wash. State) and a programmatic client 408 executing on respective client machines 410 and 412. In an example, the client machines 410 and 412 can be in the form of a mobile device, such as mobile device 115.


An Application Programming Interface (API) server 414 and a web server 416 are coupled to, and provide programmatic and web interfaces respectively to, one or more application servers 418. The application servers 418 host one or more publication modules 420 (in certain examples, these can also include search engine modules, commerce modules, advertising modules, and marketplace modules, to name a few), payment modules 422, dynamic offer modules 432, and optionally shopping modules 434. The application servers 418 are, in turn, shown to be coupled to one or more database servers 424 that facilitate access to one or more databases 426. In some examples, the application server(s) 418 can access the database(s) 426 directly without the need for a database server(s) 424.


The publication modules 420 may provide a number of publication and search functions and services to users that access the networked system 402. The payment modules 422 may likewise provide a number of payment services and functions to users. The payment modules 422 may allow users to accumulate value (e.g., in a commercial currency, such as the U.S. dollar, or a proprietary currency, such as “points”) in accounts, and then later to redeem the accumulated value for products (e.g., goods or services) that are advertised or made available via the various publication modules 420, within retail locations, or within external online retail venues. The payment modules 422 may also be configured to present or facilitate a redemption of offers, generated by the location-aware (dynamic) offer modules 432, to a user during checkout (or prior to checkout, while the user is still actively shopping). The offer modules 432 may provide dynamic context sensitive offers (e.g., coupons or immediate discount deals on targeted products or services) to users of the networked system 402. The offer modules 432 can be configured to use all of the various communication mechanisms provided by the networked system 402 to present offer options to users. The offer options can be personalized based on current location, time of day, user profile data, past purchase history, or recent physical or online behaviors recorded by the network-based system 400, among other things (e.g., context information). While the publication modules 420, payment modules 422, and offer modules 432 are shown in FIG. 4 to all form part of the networked system 402, it will be appreciated that, in alternative embodiments, the payment modules 422 may form part of a payment service that is separate and distinct from the networked system 402. Additionally, in some examples, the offer modules 432 may be part of the payment service or may form an offer generation service separate and distinct from the networked system 402.


In certain examples, the shopping modules 434 may provide real-time navigation route updates listing physical retail outlets that have items on a shopping list uploaded by a user of the networked system 402. The shopping modules 434 can be configured to use all of the various communication mechanisms provided by the networked system 402 to transmit information to a client machine, such as client machine 412. The shopping modules 434 can also present special offers available from retail merchants, such as merchant 130, along a user's current navigation route. The offers can be personalized based on current location, time of day, user profile data, past purchase history, or recent physical or online behaviors recorded by the network-based system 400, among other things.


Further, while the system 400 shown in FIG. 4 employs a client-server architecture, the present invention is of course not limited to such an architecture, and could equally well find application in a distributed, or peer-to-peer, architecture system, for example. The various publication modules 420, payment modules 422, and offer modules 432 could also be implemented as standalone systems or software programs, which do not necessarily have networking capabilities.


The web client 406 accesses the various publication modules 420, payment modules 422, and offer modules 432 via the web interface supported by the web server 416. Similarly, the programmatic client 408 accesses the various services and functions provided by the publication modules 420, payment modules 422, and offer modules 432 via the programmatic interface provided by the API server 414. The programmatic client 408 may, for example, be a smartphone application that enables users to communicate search queries to the system 400 while leveraging user profile data and current location information provided by the smartphone or accessed over the network 404.



FIG. 4 also illustrates a third party application 428, executing on a third party server machine 440, as having programmatic access to the networked system 402 via the programmatic interface provided by the API server 414. For example, the third party application 428 may, utilizing information retrieved from the networked system 402, support one or more features or functions on a website hosted by the third party. The third party website may, for example, provide one or more promotional, marketplace or payment functions that are supported by the relevant applications of the networked system 402. Additionally, the third party website may provide merchants with access to the offer modules 432 for configuration purposes. In certain examples, merchants can use programmatic interfaces provided by the API server 414 to develop and implement rules-based pricing schemes that can be implemented via the publication modules 420, payment modules 422, offer modules 432, and optionally shopping modules 434.


Example Shopping Modules



FIG. 5 is a block diagram illustrating shopping modules 434, according to an example embodiment. In this example, the shopping modules 434 can include a routing module 505, a user profile module 510, a mapping engine 520, a location module 530, and a merchant module 540. In an example, the shopping modules 434 can access database 426 to store and/or retrieve payment transaction data, user profile data, and location data, as well as other information to enable navigated shopping, such as merchant inventory data, among other things.


The routing module 505 can use information obtained from a user, the location module 530, and the merchant module 540 to create a shopping route that allows a user to purchase items of interest along a navigation route. The routing module 505 can process merchant location information, inventory information, user location information, and navigation route information to develop shopping routes.


The user-profile module 510 can manage user preferences, shopping lists, and recent destinations, among other things. In an example, the user-profile module 510 can communicate with database 426 as well as directly with a client machine, such as client machine 412.


The merchant module 540 can manage communications with merchants registered to offer goods or services, such as merchant 130, via the networked system 402. The merchant module 540 can maintain location data and real-time inventory for participating merchants.


The location module 530 can monitor real-time location information on clients, such as by monitoring (or receiving) location information regarding client machines, such as client machine 410. The location module 530 can communicate real-time location information to the routing module 505 and the mapping engine 520 as needed.


The mapping engine 520 can work with the routing module 505, location module 530, user-profile module 510, and merchant module 540 to map a location associated with the user, a desired route, and merchant retail locations selling desired items.


Additional details regarding the functionality provided by the shopping modules 434 are detailed below in reference to FIG. 12 and FIG. 13.


Example Search Query Methods



FIG. 6 is a flowchart illustrating a method 600 for processing a search query, according to an example embodiment. In an example, the method 600 can include operations such as: receiving a search query at operation 605, identifying search results at operation 610, arranging search results at operation 615, and communicating the search results at operation 620. In an example, the method 600 can begin at operation 605 with the networked system 402 receiving a search query from a client, such as client machine 410. The search query can include location information associated with the client device, such as location information from a location-enabled mobile device (e.g., mobile device 115). At operation 610, the method 600 can continue with the networked system 402 identifying a set of search results satisfying the search query. In an example, the search results can include geographic information associated with each result. At operation 615, the method 600 can continue with the networked system 402 arranging the search results according to predetermined criteria. In an example, the networked system 402 can arrange the search results geographically based on a physical brick-and-mortar store (e.g., retail location) in reference to a location associated with a mobile device of the user requesting the results. In an example, the search results can be arranged based on travel time from the mobile device location. In another example, the search results can be arranged based on distance from the mobile device location. At operation 620, the method 600 can conclude with the networked system 402 communicating the search results to a user device, such as mobile device 115, for display. In an example, the search results can be displayed in a map view. In another example, the search results can be displayed in a list view, sorted according to the arrangements discussed above. Additional display options are discussed below in reference to the example user-interfaces.
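A minimal server-side sketch of operations 605 through 620 might look like the following. The in-memory LISTINGS table, the Result type, and handle_search_query are hypothetical stand-ins chosen for this example; in the described system the travel times would be derived from the device location and available routes rather than precomputed.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Result:
    title: str
    store: str
    travel_time_min: float   # travel time from the device location (precomputed here)

# A toy in-memory "index" standing in for the publication system's listing store.
LISTINGS = [
    Result("55-inch LED TV", "Best Buy", 5),
    Result("55-inch LED TV", "Walmart", 22),
    Result("Soundbar", "Target", 12),
]

def handle_search_query(query_terms: str,
                        device_location: Tuple[float, float]) -> List[Result]:
    """Sketch of operations 605-620: receive the query and location, identify
    matching listings, arrange them by travel time, and return them for display."""
    # Operation 610: identify results whose titles satisfy the query terms.
    matches = [r for r in LISTINGS if query_terms.lower() in r.title.lower()]
    # Operation 615: arrange by travel time relative to device_location.
    # (A real system would derive travel_time_min from device_location and a
    # routing service; here it is stored on the listing for illustration.)
    matches.sort(key=lambda r: r.travel_time_min)
    # Operation 620: communicate the arranged results to the client for display.
    return matches

print(handle_search_query("tv", (37.33, -121.89)))
```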


Example Search Query User Interfaces



FIG. 7 illustrates an example user interface showing search results presented consistent with some embodiments of the invention. In the example user interface of FIG. 7, the search results are grouped according to their type, for example, search results associated with virtual or online stores, and search results associated with local brick-and-mortar stores. In addition to showing information associated with a product satisfying the search query, the individual listings for the local brick-and-mortar stores are arranged according to the time needed for the user to travel from his current location to the store at which the product is being offered. For example, the top search result in the list for local brick-and-mortar stores indicates that the product (e.g., the TV) is being offered by Best Buy for $2499. The travel time to the particular Best Buy where the product is being offered is 5 min., as indicated by the graphic with reference number 700. In addition, the example user interface shows the address of the store, a telephone number for the store, and store hours. Also, the user can select the checkbox with reference number 705 to have the store location shown on a map. For example, with some embodiments, by checking or selecting one or more of the check boxes associated with each product or service listing, the user can indicate that he would like to view the location of the stores offering the product or service on a map.


As illustrated in FIG. 7, the user interface includes a slider mechanism 710 that serves as a filter, allowing the user to filter the search results by travel time. In this example, the user has positioned the slider mechanism so that only local search results that are within 60 min. of the user's current location are presented on the search results page.



FIG. 8 illustrates an example user interface of an application for a mobile device including an interactive map on which search results can be shown, according to some embodiments of the invention. As illustrated in FIG. 8, the user can indicate a geographical region of interest. In this example, the user has simply placed his finger in a first location on the map and then dragged his finger to generate a circle, which will serve as the geographical area of interest for purposes of filtering and presenting the search results. Assuming the user selects the center of the circle as his current location, the user will be presented with a quick visual presentation of the locations of products being offered in the search results. Selecting an icon representing a search result may cause a pop-up window to present more detailed information about the store and/or the product being offered via the search result for that store. Several navigational icons are shown at the bottom of the page allowing the user to very quickly switch to alternative user interfaces. For example, by selecting a particular icon, the user can have the search results presented in a list, as shown in FIG. 7. Another navigational icon allows the user to view a multi-axis display of the search results, as shown and described in FIG. 9.



FIG. 9 illustrates an example user interface showing a map, and having a multi-axis table presenting the search results arranged by a combination of their price and distance and/or travel time. In this example, the vertical axis, or Y axis, represents the price of the products or services. The horizontal axis, or X axis, represents the distance, or travel time, from the user's current location to the location at which the product or service can be obtained. As such, the user can very quickly and easily weigh the trade-offs between price and travel time to identify a local store that is both close in terms of travel time and offering the product at a reasonable price relative to the other stores.



FIG. 10A illustrates another example user interface presenting an interactive map on a tablet or mobile computing device for use in defining a corridor or route with which to identify or filter relevant search results, according to some embodiments of the invention. For example, as illustrated in FIG. 10A, a user may use a finger or stylus to designate or select a road, or set of roads, as a corridor or commute route of interest for purposes of a particular search for a product or service. As the user simply drags his finger along the route, the application will identify a road, and then highlight the road to indicate the selected corridor or route of interest.


After selecting or otherwise specifying a particular road, or set of roads, the user can specify a distance that he is willing to travel from the route to obtain the particular product or service being searched for. For instance, as illustrated in the example user interface of FIGS. 10B and 10C, the user can select one of the three icons or buttons labelled, “15 Mins”, “10 Mins”, or “5 Mins” to indicate the time the user is willing to travel away from the defined corridor or commute path to obtain a relevant product or service (FIG. 10B illustrating the 5 Mins selection and corresponding corridor, and FIG. 10C illustrating the 10 Mins selection and corresponding corridor). Of course, a wide variety of other techniques could be used to establish the threshold distance or travel time. With some embodiments, upon a threshold distance or travel time being selected, the map is updated to indicate the general geographical area that is within proximity of the corridor or commute path. For example, as illustrated in FIG. 10C, the corridor or commute path is shown to be a width consistent with the user's defined threshold travel time of 10 minutes.


Once the user has defined the corridor and the threshold distance or travel time, the search engine will process the search query, or filter previously identified search results, to identify only those search results that are relevant because they are associated with a store that is located within the geographical area defined by the corridor and threshold distance (or travel time). For example, as illustrated in the example user interface of FIG. 11, several icons (or pins) representing stores associated with a particular search result are shown to be positioned within the highlighted corridor. By selecting a particular pin or icon, the user can learn more about the product/service being offered at the particular store represented by the pin/icon, and/or about the store itself—such as the actual address of the store, telephone number of the store, store hours for the store, and distance (or travel time) from the defined corridor.


Example Navigated Shopping Methods



FIG. 12 is a flowchart illustrating a method 1200 for navigated shopping, according to an example embodiment. In an example, the method 1200 can include operations such as: creating a shopping list at operation 1205, selecting items at operation 1210, entering a destination at operation 1215, selecting items for purchase at operation 1225, authorizing payment for selected items at operation 1235, receiving a receipt for selected items at operation 1240, and receiving purchased items from merchant locations at operation 1245. Optionally, the method 1200 can also include operations such as: reviewing the route at operation 1220 and selecting purchase options at operation 1230. In an example, the method 1200 is performed via a mobile device, such as mobile device 115.


In an example, the method 1200 can begin at operation 1205 with a user creating a wish or shopping list via mobile device 115. In another example, the user can create a shopping list via a web browser running on client machine 410. In certain examples, the user may not create a wish list or shopping list in advance of a particular trip. In these examples, the mobile device 115 can be used to enter items on-the-fly (e.g., during a trip) at operation 1210. In the example where the user has a shopping list created, the method 1200 can continue at operation 1210 with the user selecting items from the shopping list via the mobile device 115.


At operation 1215, the method 1200 can continue with the user entering a destination into the mobile device 115 in order to obtain a navigation route (refer to FIG. 15 for an example route display on a mobile device, such as mobile device 115). The operations in method 1200 do not necessarily need to be performed in the illustrated order. For example, the user can enter a destination at operation 1215 prior to selecting items from a shopping list at operation 1210.


At operation 1220, the method 1200 can optionally continue with the navigation route being displayed for review on the mobile device 115. In certain examples, the items selected at operation 1210 can also be displayed if available from merchants along the navigation route.


At operation 1225, the method 1200 can continue with the user selecting items for purchase during the planned trip via the mobile device 115. Item selection can occur via a touch screen interface on the mobile device 115 (if available), with the user selecting items displayed on the map display. In another example, the mobile device 115 can display a list view of the items available at merchant locations within a pre-defined distance from the navigation route. In an example, the display of available items can change in real-time based on actual location (route taken) and available inventory from merchant locations. For example, if the user deviates from the planned navigation route, the network-based navigated shopping system 125 can recalculate the available items from merchants within a certain distance of the new route or current location.


At operation 1230, the method 1200 can optionally continue with the mobile device 115 presenting purchase options to the user. In an example, the purchase options can include store (merchant) pickup (typically the default), home delivery, or delay. The delay option allows the user to delay a decision regarding pickup, and the network-based navigated shopping system 125 will request instructions regarding delivery at a later time. In an example, if the user selects delay and then passes the merchant location on the navigation route, the network-based navigated shopping system 125 can automatically revert to home delivery or can be instructed to locate a different merchant location along the route. In an example, the purchase options can also include payment options. The payment options can include immediate payment (including method of payment), pay after delivery, pay at pickup, or some form of deferred payment, among others. In certain examples integrated into mobile device 115B, a vehicle registration number (e.g., license plate) can be used to validate payment upon merchant pickup. In other examples, the mobile device 115 can include biometric (e.g., finger print or voice recognition) authentication and authorization systems that can be used in conjunction with a navigated shopping service.
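The purchase options described above (store pickup, home delivery, or delay, together with a payment choice) could be modeled as a small record such as the one sketched below. The Fulfillment and Payment enums and the PurchaseOption type are illustrative assumptions, not names used by the navigated shopping service.

```python
from dataclasses import dataclass
from enum import Enum

class Fulfillment(Enum):
    STORE_PICKUP = "store_pickup"     # typically the default option
    HOME_DELIVERY = "home_delivery"
    DELAY = "delay"                   # decide later; the service prompts again downstream

class Payment(Enum):
    IMMEDIATE = "immediate"
    PAY_AT_PICKUP = "pay_at_pickup"
    PAY_AFTER_DELIVERY = "pay_after_delivery"
    DEFERRED = "deferred"

@dataclass
class PurchaseOption:
    item_id: str
    fulfillment: Fulfillment = Fulfillment.STORE_PICKUP
    payment: Payment = Payment.IMMEDIATE

# Example: defer the pickup decision; the service will request instructions later.
print(PurchaseOption(item_id="sku-123", fulfillment=Fulfillment.DELAY))
```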


At operation 1235, the method 1200 can continue with the user authorizing payment for the selected items via the mobile device 115. At operation 1240, the method 1200 continues with the mobile device 115 receiving a receipt for purchased items. In an example, the receipt received by the mobile device 115 can include pickup or delivery instructions for the selected items.


Finally, at operation 1245, the method 1200 can conclude with the user receiving the purchased items from the merchant locations. In some examples, the user can collect the purchased items from each of the merchant locations along the navigation route. In other examples, the purchased items (or some of the purchased items) can be delivered to the user's home. Alternatively, the purchased items can be delivered to the planned destination address. The mechanism of receipt of the purchased items may be dependent on the purchase option(s) selected in operation 1230.



FIG. 13 is a flowchart illustrating a method 1300 for providing navigated shopping services, according to an example embodiment. In an example, the method 1300 can include operations such as receiving a list of items at operation 1305, receiving a desired route at operation 1310, locating merchants along the route at operation 1315, transmitting merchant and available inventory information at operation 1320, receiving items selected for purchase at operation 1325, receiving payment authorization at operation 1330, transmitting purchase requests to merchants at operation 1335, receiving confirmations from merchants at operation 1340, and transmitting purchase confirmation at operation 1345. In an example, the method 1300 can be performed on the network-based navigated shopping system 125 or the networked system 402.


In an example, the method 1300 can begin at operation 1305 with the networked system 402 receiving a list of items for purchase via the navigated shopping service. At operation 1310, the method 1300 can continue with the networked system 402 receiving a desired route from a client machine, such as client machine 412. In an example, instead of a route, the networked system 402 can receive a desired destination, and the networked system 402 can develop the route (such as via the routing module 505).


At operation 1315, the method 1300 can continue with the networked system 402 locating merchant locations along the desired navigation route that sell one or more of the items received for purchase via the navigated shopping service. In an example, the networked system 402 can also determine whether the merchant location has the desired item or items in inventory. In certain examples, the networked system 402 can communicate with a real-time inventory service, such as MILO (from eBay Inc. of San Jose, Calif.), to determine available inventory.
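Operation 1315 can be sketched as a filter over merchant records: keep merchants that lie within a chosen offset of the route and stock at least one item on the shopping list. The MERCHANTS table, miles_between, and merchants_along_route below are hypothetical stand-ins; the described system would instead query registered merchants and a real-time inventory service.

```python
import math
from typing import Dict, List, Set, Tuple

def miles_between(a: Tuple[float, float], b: Tuple[float, float]) -> float:
    """Great-circle distance in miles between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, a + b)
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 3958.8 * 2 * math.asin(math.sqrt(h))

# Hypothetical merchant records: a location plus the item ids currently in stock.
MERCHANTS = [
    {"name": "Hardware Co.", "location": (37.34, -121.91), "inventory": {"paint", "ladder"}},
    {"name": "Grocer",       "location": (37.36, -121.94), "inventory": {"milk", "coffee"}},
]

def merchants_along_route(shopping_list: Set[str],
                          route_points: List[Tuple[float, float]],
                          max_offset_miles: float) -> List[Dict]:
    """Operation 1315 sketch: keep merchants near the route that stock a wanted item."""
    hits = []
    for m in MERCHANTS:
        offset = min(miles_between(m["location"], p) for p in route_points)
        wanted = shopping_list & m["inventory"]
        if offset <= max_offset_miles and wanted:
            hits.append({"name": m["name"], "offset_miles": round(offset, 1),
                         "items": sorted(wanted)})
    return hits

route = [(37.33, -121.89), (37.35, -121.93), (37.37, -121.97)]
print(merchants_along_route({"milk", "paint"}, route, max_offset_miles=2.0))
```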


At operation 1320, the method 1300 can continue with the networked system 402 transmitting merchant and available inventory information to a client machine, such as the client machine 412. In some examples, the networked system 402 also transmits route and map information to the client machine 412.


At operation 1325, the method 1300 can continue with the networked system 402 receiving items selected for purchase via the navigated shopping service. At operation 1330, the method 1300 can continue with the networked system 402 receiving payment authorization for the selected items. Additionally, the networked system 402 can also receive purchase options associated with each of the items selected for purchase (not specifically illustrated within method 1300). See the discussion above in reference to FIG. 12 for available purchase options that can be handled by the networked system 402.


At operation 1335, the method 1300 can continue with the networked system 402 transmitting purchase requests to the merchants associated with the purchased items. In an example, the networked system 402 can also transmit delivery instructions associated with each of the purchased items. For example, one of the purchased items may need to be delivered, while a different purchased item may be prepared for pickup.


At operation 1340, the method 1300 can continue with the networked system 402 receiving confirmations from the merchants regarding the purchased items. The confirmations can include an availability indicator and shipping information, among other things. At operation 1345, the method 1300 can conclude with the networked system 402 transmitting purchase confirmation information to the client machine 412. The confirmation sent to the client machine 412 can include a receipt, pickup instructions, delivery confirmation, and any other information needed for the user to receive the purchased items.


Example Navigated Shopping System User Interfaces



FIG. 14A is an illustration of a mobile device display 1400, according to an example embodiment. In an example, a user can download and install a navigated shopping application onto a mobile device, such as in-vehicle mobile device 115B. Once installed, the navigated shopping application can allow a user to create a wish list of items, such as the wish list illustrated in FIG. 14B. In certain examples, the navigated shopping system can provide web-based (e.g., browser) and/or smart phone application interfaces for wish list maintenance.



FIG. 15 is an illustration of a mobile device display 1500 depicting a shopping route provided by a navigated shopping service, according to an example embodiment. The mobile device display 1500 can include a map view 1505, a highlighted route 1510, retail locations 1515A-1515N (collectively referred to as retail location(s) 1515), and purchase price indicators 1520A-1520N (collectively referred to as purchase price indicator(s) 1520). In an example, the user can specify a distance off-route for display of retail locations 1515. In another example, the map view 1505 can be limited to displaying only the retail locations 1515 selected by the navigated shopping application.


Example Local Listings Method



FIG. 16 is a flowchart illustrating a method 1600 for providing location-based services, according to an example embodiment. In an example, the method 1600 can include operations such as: retrieving listings at operation 1605, determining a location of a mobile device at operation 1610, retrieving search results at operation 1615, generating a display of search results at operation 1620, and transmitting the display at operation 1625. At operation 1605, the method 1600 can begin with the networked system 402 receiving, from a client device, a request to view listings and retrieving listings matching the request. The request may be generated by a mobile application (e.g., a programmatic application, a web browser application) executing on the client device 115. The request may be a search query containing one or more search parameters by which to search listings stored within the networked system 402. Alternatively, the request may be a request to navigate among categories and sub-categories of stored listings. Listings may pertain to items or services offered for sale, or to other varieties of published information. In some embodiments, listings may pertain to events, such as garage or estate sales, which have a finite time duration and which may offer one or more items for sale.
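
For illustration, the two request styles described above (a parameterized search query and category navigation) might be handled as sketched below. This is a minimal Python sketch, not the disclosed implementation; the payload field names and the in-memory listing records are assumptions.

SEARCH_REQUEST = {"type": "search", "query": "chainsaw",
                  "filters": {"max_price": 150.00}}
BROWSE_REQUEST = {"type": "browse", "category": "Home & Garden"}

def retrieve_listings(request, listings):
    # listings: list of dicts with 'title', 'price', and 'category' keys (assumed schema).
    if request["type"] == "search":
        q = request["query"].lower()
        hits = [l for l in listings if q in l["title"].lower()]
        max_price = request.get("filters", {}).get("max_price")
        if max_price is not None:
            hits = [l for l in hits if l["price"] <= max_price]
        return hits
    # Category navigation: return the listings filed under the requested category.
    return [l for l in listings if l["category"] == request["category"]]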


At operation 1610, the method 1600 can continue with the networked system 402 determining, or in some examples receiving, a location associated with the client device 115. In some embodiments, the location may be transmitted with the request to view listings. In some embodiments, location-related data may be transmitted from the client device 115, and a location may be ascertained based on the location-related data. For example, a set of longitude and latitude coordinates may be transmitted, and from the coordinates, a location (e.g., cross-streets, city, state, region) may be determined. In other examples, the networked system 402 can ascertain the location of the mobile device 115 through other means, such as communication with a network service provider that tracks the locations of mobile devices, including the mobile device 115.
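
As one illustrative possibility, a transmitted latitude/longitude pair could be resolved to a coarse location by snapping it to the nearest entry in a lookup table, as in the Python sketch below; a production system would more likely query a reverse-geocoding service. The table contents are placeholders.

import math

REGIONS = [
    ("San Jose, CA",      37.3382, -121.8863),
    ("San Francisco, CA", 37.7749, -122.4194),
    ("Oakland, CA",       37.8044, -122.2712),
]

def coarse_location(lat, lon):
    # Return the nearest known region to the transmitted coordinates.
    def planar_distance(entry):
        _, rlat, rlon = entry
        return math.hypot(lat - rlat, (lon - rlon) * math.cos(math.radians(lat)))
    name, _, _ = min(REGIONS, key=planar_distance)
    return name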


At operation 1615, the method 1600 can continue with the networked system 402 using the location information to retrieve listings having locations identified as being within a predetermined geographic proximity to the client device 115. The listings may be stored with location information indicating where the item or service being offered is located.
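
A minimal sketch of operation 1615, assuming each stored listing carries latitude and longitude fields, is to compute the great-circle (haversine) distance from the client device to each listing and keep those inside the radius. The schema and radius handling in the following Python sketch are illustrative only.

import math

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two points, in kilometers.
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * 6371.0 * math.asin(math.sqrt(a))

def listings_within(listings, device_lat, device_lon, radius_km):
    # listings: iterable of dicts with 'lat' and 'lon' keys (assumed schema).
    nearby = []
    for listing in listings:
        d = haversine_km(device_lat, device_lon, listing["lat"], listing["lon"])
        if d <= radius_km:
            nearby.append((d, listing))
    return nearby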


At operation 1620, the method 1600 can continue with the networked system 402 generating an interface that depicts the retrieved listings relative to the location of the client device (see the discussion related to FIGS. 17 and 18 below for additional details). In some embodiments, the interface may be a map on which the listings and the client device are overlaid. In some embodiments, the interface may be a web page containing a list of listings ordered according to one of a variety of attributes, such as distance from the client device. At operation 1625, the method 1600 can conclude with the networked system 402 transmitting the interface to the client device 115 for presentation to the user.
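
Continuing the same illustrative sketch, operation 1620 might order the retrieved listings by distance and reduce them to simple rows that a web page or a map overlay could render. The field names and the 25-row cap below are assumptions, not the disclosed implementation.

def build_results_view(nearby, limit=25):
    # nearby: list of (distance_km, listing) pairs, e.g., from listings_within().
    rows = []
    for distance_km, listing in sorted(nearby, key=lambda pair: pair[0])[:limit]:
        rows.append({
            "title": listing.get("title", "(untitled)"),
            "price": listing.get("price"),
            "distance_km": round(distance_km, 1),
        })
    return rows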


Example Local Listings User Interfaces



FIG. 17 is a user interface diagram depicting location-based services operating on a client device, according to an example embodiment. Referring to FIG. 17, an example client device 115 is shown. The client device 115 may have a display screen area 1702 on which a map is shown. The map may be an interface belonging to an application capable of being selected for execution by a user of the client device 115. The map may display a region surrounding a determined location, such as location 1704, of the client device 115. For illustration purposes, the map illustrated in FIG. 17 is a rudimentary map. The map may illustrate a location of the client device 115 as determined by location generating circuitry, such as the GPS Receiver 380 (FIG. 3). One or more listings 1708A-1708N (collectively referred to as listing(s) 1708) that are within a predefined radius 1706 of the client device 115 may be displayed on the map. In some embodiments, the user may define a radius 1706 within which the user is interested in having listings presented. The radius 1706 may be defined as a distance (e.g., miles) or as an amount of time to travel to the location of the listing (e.g., minutes). In an example, the user interface can provide various methods for a user to input the radius 1706, such as dragging a circle on the display screen area 1702, a scroll bar, or a numerical input, among others.
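
Where the radius 1706 is entered as a travel time rather than a distance, one simple (and purely illustrative) way to interpret it is to multiply the time by an assumed average speed for the selected mode of transportation. The speeds below are placeholders, and a routing service would give more realistic estimates.

AVERAGE_SPEED_KMH = {
    "walking": 5.0,
    "bicycle": 15.0,
    "public_transportation": 25.0,
    "private_automobile": 45.0,
}

def radius_km_from_minutes(minutes, mode="private_automobile"):
    # Convert a travel-time radius (in minutes) into an approximate distance radius.
    speed = AVERAGE_SPEED_KMH.get(mode, AVERAGE_SPEED_KMH["private_automobile"])
    return speed * (minutes / 60.0)

# Example: a 15-minute radius on foot corresponds to roughly 1.25 km.
assert abs(radius_km_from_minutes(15, "walking") - 1.25) < 1e-9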


In some embodiments, any listings located within the radius 1706 of the current location of the user may be displayed. In some embodiments, listings corresponding to saved searches, categories or sub-categories indicated as of interest to the user, groups of interest to the user, specific items of interest to the user, and so forth may be presented. For example, if a user is looking for a chainsaw, the user may specify and save, within the networked system 402, certain criteria, such as one or more of a maximum price, a brand, a condition of the item, a minimum reputation score of a seller of the item, and a distance of the item from the location of the user. When the user travels within a predefined proximity of a listing 1708 meeting the user's saved criteria, the listing 1708 may be presented to the user as a local listing. In an example, the user interface on the mobile device 115 can automatically change to display a detailed description of the local listing selected from the listings 1708 based on proximity.
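
The saved-criteria test in the chainsaw example might be sketched as below. The criterion names (maximum price, brand, condition, minimum seller reputation, maximum distance) follow the paragraph above, but the field names and the dictionary representation are assumptions made only for illustration.

def matches_saved_criteria(listing, criteria, distance_km):
    # listing and criteria are dicts with assumed field names; a criterion that is
    # absent from `criteria` is simply not enforced.
    if "max_price" in criteria and listing.get("price", float("inf")) > criteria["max_price"]:
        return False
    if "brand" in criteria and listing.get("brand") != criteria["brand"]:
        return False
    if "condition" in criteria and listing.get("condition") != criteria["condition"]:
        return False
    if "min_seller_reputation" in criteria and \
            listing.get("seller_reputation", 0) < criteria["min_seller_reputation"]:
        return False
    if "max_distance_km" in criteria and distance_km > criteria["max_distance_km"]:
        return False
    return True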


In some embodiments, a subset of listings that are available for presentation may be presented to the user. By displaying only a subset of listings, the display screen area 1702 of the client device 115 may avoid becoming overcrowded. In other embodiments, if the number of available listings 1708 meeting the criteria set forth for presentation to the user exceeds a predetermined threshold, the map may be zoomed in to a finer granularity to display a more manageable number of listings 1708.
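
One illustrative way to implement the zoom-in behavior described above is to shrink the displayed radius until the number of visible listings falls below a threshold, as in the following sketch; the threshold and shrink factor are arbitrary placeholders rather than disclosed values.

def zoom_to_manageable(nearby, radius_km, max_on_screen=20, shrink=0.7):
    # nearby: list of (distance_km, listing) pairs.
    # Returns (radius_km, visible), where visible holds at most max_on_screen listings.
    visible = [pair for pair in nearby if pair[0] <= radius_km]
    while len(visible) > max_on_screen and radius_km > 0.1:
        radius_km *= shrink  # zoom in to a finer granularity
        visible = [pair for pair in visible if pair[0] <= radius_km]
    return radius_km, visible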


In some embodiments, icons representing certain listings on the map may be color coded, enlarged, or otherwise emphasized to highlight the listings to the user. Various factors may cause listings to be emphasized. For example, if a listing corresponds to an event, such as a garage sale or estate sale, the listing may have a finite time period before expiration. Such listings may be visually emphasized to notify the user that the listing corresponding to the event may be expiring soon. In other embodiments, icons representing certain listings may be emphasized if the listings meet certain saved criteria. For example, listings meeting the criteria of a user may be emphasized to alert the user that a desired item is located nearby.
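
A simple emphasis rule consistent with the paragraph above, written as an illustrative sketch only, is to highlight a listing when it is an event that expires within some window or when it matches the user's saved criteria; the 'ends_at' field and the 24-hour window are assumptions.

from datetime import datetime, timedelta

def should_emphasize(listing, matches_criteria, now=None, window_hours=24):
    # Emphasize event listings (e.g., garage or estate sales) that expire soon,
    # or listings that meet the user's saved criteria.
    now = now or datetime.utcnow()
    ends_at = listing.get("ends_at")  # assumed datetime field for event-style listings
    expiring_soon = ends_at is not None and now <= ends_at <= now + timedelta(hours=window_hours)
    return expiring_soon or matches_criteria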


In some embodiments, a notification may be presented to a user that a local listing, selected from listings 1708, is available. The notification may be a pop-up, a message, a sound, or other type of notification that alerts a user visually and/or audibly as to the availability of a listing 1708.



FIG. 18 is a diagram depicting a location-based service operating on a client device, according to an example embodiment. FIG. 18 illustrates the mobile device 115 (also illustrated in FIG. 17), in which a map depicting a location of the mobile device 115 and listings located within a radius 1706 of the mobile device 115 are shown on a display screen area 1702 of the mobile device 115. In the example embodiment of FIG. 18, a user may select one of the listings 1708 displayed on the map by, for example, using a touch-based gesture to select an icon representing the listing, such as listing 1708A. In some embodiments, information pertaining to the listing 1708A may be presented on the display screen area 1702. For example, listing information, such as a title and description of an item offered for sale, may be shown (e.g., interface elements 1802 and 1804). In some embodiments, a distance of the seller of the item from the current location of the user may be shown as well. One or more images of the item may also be shown. In some embodiments, information permitting a user to contact the seller may be provided, such as a telephone number, an email address, an instant message user name, and the like. In some embodiments, the application may facilitate communication between the user and the seller upon selection of a user interface element, such as a button or link. In some embodiments, an address and directions to the opposing party's location may be provided.


In some embodiments, the identity of the user (e.g., buyer) interested in a listing may be revealed to a user (e.g., seller) associated with a listing. In some embodiments, the identity may be revealed when the buyer attempts to communicate with the seller regarding a listing. In this respect, given that a listing may facilitate a local in-person transaction, the seller may be able to gather information about the potential buyer before the buyer and seller meet in person.


Although example embodiments disclosed herein illustrate listings 1708 that fall within a circular radius 1706 of a user's location, it is contemplated that listings within other predetermined geographic proximities to the user may be presented. For example, listings that fall within a corridor of travel of a user may be presented, such as illustrated in FIG. 11. The corridor of travel may be determined based on the location and direction of movement of a client device of a user. In some embodiments, listings that are in the directional path of a user may be presented, while in other embodiments, listings that are both in front of and behind the directional path of the user's travel may be presented.
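
The corridor-of-travel idea can be sketched, again for illustration only, by keeping listings whose bearing from the device lies within an angular tolerance of the device's heading; the half-angle and the dictionary fields are assumptions, and a real implementation would more likely work from the planned route geometry.

import math

def bearing_deg(lat1, lon1, lat2, lon2):
    # Initial bearing from point 1 to point 2, in degrees clockwise from north.
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlmb = math.radians(lon2 - lon1)
    y = math.sin(dlmb) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlmb)
    return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0

def in_forward_corridor(device, heading_deg, listing, half_angle_deg=30.0):
    # device and listing: dicts with 'lat' and 'lon' keys (assumed schema).
    # Keep listings roughly ahead of the direction of travel.
    b = bearing_deg(device["lat"], device["lon"], listing["lat"], listing["lon"])
    diff = abs((b - heading_deg + 180.0) % 360.0 - 180.0)
    return diff <= half_angle_deg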


Electronic Apparatus and System


Example embodiments may be implemented in digital electronic circuitry, or in computer hardware, firmware, software, or in combinations of them. Example embodiments may be implemented using a computer program product, for example, a computer program tangibly embodied in an information carrier, for example, in a machine-readable medium for execution by, or to control the operation of, data processing apparatus, for example, a programmable processor, a computer, or multiple computers.


A computer program can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, subroutine, or other unit suitable for use in a computing environment. A computer program can be deployed to be executed on one computer or on multiple computers at one site or distributed across multiple sites and interconnected by a communication network.


In example embodiments, operations may be performed by one or more programmable processors executing a computer program to perform functions by operating on input data and generating output. Method operations can also be performed by, and apparatus of example embodiments may be implemented as, special purpose logic circuitry (e.g., an FPGA or an ASIC).


The computing system can include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. In embodiments deploying a programmable computing system, it will be appreciated that both hardware and software architectures merit consideration. Specifically, it will be appreciated that the choice of whether to implement certain functionality in permanently configured hardware (e.g., an ASIC), in temporarily configured hardware (e.g., a combination of software and a programmable processor), or a combination of permanently and temporarily configured hardware may be a design choice. Below are set out hardware (e.g., machine) and software architectures that may be deployed, in various example embodiments.


Example Machine Architecture and Machine-Readable Medium



FIG. 19 is a block diagram of a machine in the example form of a computer system 1900 within which instructions 1924 may be executed for causing the machine to perform any one or more of the methodologies discussed herein. In alternative embodiments, the machine operates as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine may operate in the capacity of a server or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. The machine may be a personal computer (PC), a tablet PC, a set-top box (STB), a PDA, a cellular telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing instructions (sequential or otherwise) that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.


The example computer system 1900 includes a processor 1902 (e.g., a central processing unit (CPU), a graphics processing unit (GPU) or both), a main memory 1904 and a static memory 1906, which communicate with each other via a bus 1908. The computer system 1900 may further include a video display unit 1910 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)). The computer system 1900 also includes an alphanumeric input device 1912 (e.g., a keyboard), a cursor control device 1914 (e.g., user interface (UI) navigation device or computer mouse), a disk drive unit 1916, a signal generation device 1918 (e.g., a speaker) and a network interface device 1920.


Machine-Readable Medium


The disk drive unit 1916 includes a machine-readable medium 1922 on which is stored one or more sets of data structures and instructions 1924 (e.g., software) embodying or used by any one or more of the methodologies or functions described herein. The instructions 1924 may also reside, completely or at least partially, within the main memory 1904, static memory 1906, and/or within the processor 1902 during execution thereof by the computer system 1900, the main memory 1904 and the processor 1902 also constituting machine-readable media.


While the machine-readable medium 1922 is shown in an example embodiment to be a single medium, the term “machine-readable medium” may include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more instructions 1924 or data structures. The term “machine-readable medium” shall also be taken to include any tangible medium that is capable of storing, encoding or carrying instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the embodiments of the present invention, or that is capable of storing, encoding or carrying data structures used by or associated with such instructions. The term “machine-readable medium” shall accordingly be taken to include, but not be limited to, solid-state memories, and optical and magnetic media. Specific examples of machine-readable media include non-volatile memory, including by way of example, semiconductor memory devices (e.g., Erasable Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM)) and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks. A “machine-readable storage medium” shall also include devices that may be interpreted as transitory, such as register memory, processor cache, and RAM, among others. The definitions provided herein of machine-readable medium and machine-readable storage medium are applicable even if the machine-readable medium is further characterized as being “non-transitory.” For example, any addition of “non-transitory,” such as non-transitory machine-readable storage medium, is intended to continue to encompass register memory, processor cache and RAM, among other memory devices.


Transmission Medium


The instructions 1924 may further be transmitted or received over a communications network 1926 using a transmission medium. The instructions 1924 may be transmitted using the network interface device 1920 and any one of a number of well-known transfer protocols (e.g., HTTP). Examples of communication networks include a LAN, a WAN, the Internet, mobile telephone networks, Plain Old Telephone (POTS) networks, and wireless data networks (e.g., WiFi and WiMax networks). The term “transmission medium” shall be taken to include any intangible medium that is capable of storing, encoding or carrying instructions for execution by the machine, and includes digital or analog communications signals or other intangible media to facilitate communication of such software.


Although the present inventive subject matter has been described with reference to specific example embodiments, it will be evident that various modifications and changes may be made to these embodiments without departing from the broader spirit and scope of the inventive subject matter. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.


The accompanying drawings that form a part hereof, show by way of illustration, and not of limitation, specific embodiments in which the subject matter may be practiced. The embodiments illustrated are described in sufficient detail to enable those skilled in the art to practice the teachings disclosed herein. Other embodiments may be used and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. This Detailed Description, therefore, is not to be taken in a limiting sense, and the scope of various embodiments is defined only by the appended claims, along with the full range of equivalents to which such claims are entitled.


Such embodiments of the inventive subject matter may be referred to herein, individually and/or collectively, by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any single invention or inventive concept if more than one is in fact disclosed. Thus, although specific embodiments have been illustrated and described herein, it should be appreciated that any arrangement calculated to achieve the same purpose may be substituted for the specific embodiments shown. This disclosure is intended to cover any and all adaptations or variations of various embodiments. Combinations of the above embodiments, and other embodiments not specifically described herein, will be apparent to those of skill in the art upon reviewing the above description.


All publications, patents, and patent documents referred to in this document are incorporated by reference herein in their entirety, as though individually incorporated by reference. In the event of inconsistent usages between this document and those documents so incorporated by reference, the usage in the incorporated reference(s) should be considered supplementary to that of this document; for irreconcilable inconsistencies, the usage in this document controls.


In this document, the terms “a” or “an” are used, as is common in patent documents, to include one or more than one, independent of any other instances or usages of “at least one” or “one or more.” In this document, the term “or” is used to refer to a nonexclusive or, such that “A or B” includes “A but not B,” “B but not A,” and “A and B,” unless otherwise indicated. In the appended claims, the terms “including” and “in which” are used as the plain-English equivalents of the respective terms “comprising” and “wherein.” Also, in the following claims, the terms “including” and “comprising” are open-ended; that is, a system, device, article, or process that includes elements in addition to those listed after such a term in a claim are still deemed to fall within the scope of that claim. Moreover, in the following claims, the terms “first,” “second,” and “third,” and so forth are used merely as labels, and are not intended to impose numerical requirements on their objects.


The Abstract of the Disclosure is provided to comply with 37 C.F.R. § 1.72(b), requiring an abstract that will allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in a single embodiment for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separate embodiment.

Claims
  • 1. A method comprising: receiving, via a mobile device, a search query, the search query including a current location of the mobile device and a user-selected mode of transportation;identifying a set of search results based on the search query, the set of search results including a plurality of listings, each listing of the plurality of listings having an associated location;calculating, based on the current location of the mobile device, a travel time for each listing of the plurality of listings, the travel time for each listing of the plurality of listings being the amount of time to travel from the current location to the location associated with each of the plurality of listings using the user-selected mode of transportation; andproviding the set of search results for display at the mobile device in a sort order, the sort order based on the travel time of each of the plurality of listings.
  • 2. The method of claim 1, wherein the travel time represents one of the following: a time of travel between the current location and a location associated with each listing of the plurality of listings that is associated with a physical address; or a time of delivery to the current location associated with each listing of the plurality of listings that is associated with at least one online merchant.
  • 3. The method of claim 1, wherein identifying the set of search results includes identifying a first plurality of listings associated with at least one local merchant and identifying a second plurality of listings associated with at least one online merchant, and including both the first plurality of listings and the second plurality of listings in the plurality of listings.
  • 4. The method of claim 1, wherein providing the set of search results for display at the mobile device further comprises: providing, for display at the mobile device, a map view including a first indicator depicting the current location and a plurality of second indicators, each second indicator depicting a location associated with each listing of the plurality of listings that includes a physical location within the map view.
  • 5. The method of claim 4, further including receiving, via the mobile device, a user input indicating a maximum travel time; and wherein providing the map view includes filtering the plurality of second indicators based on the maximum travel time.
  • 6. The method of claim 5, wherein the user-selected mode of transportation is selected from one of a group of modes of transportation including: private automobile; public transportation; bicycle; or walking.
  • 7. The method of claim 1, wherein providing the set of search results for display at the mobile device includes providing, for display at the mobile device, a travel time timeline, the travel time timeline including a first plurality of listing indicators arranged according to travel time from the current location and a second plurality of listing indicators arranged according to shipping time to the current location.
  • 8. The method of claim 7, wherein providing the travel time timeline includes providing, for display at the mobile device, a map view correlated to a selection mechanism, the selection mechanism enabling selection of a maximum travel time along the travel time timeline.
  • 9. A machine-readable storage medium including instructions that, when executed by one or more processors, cause the one or more processors to perform operations comprising: receiving, via a mobile device, a search query, the search query including a current location of the mobile device and a mode of transportation;identifying a set of search results based on the search query, the set of search results including a plurality of listings, each listing of the plurality of listings having an associated location;calculating, based on the current location of the mobile device, a travel time for each listing of the plurality of listings, the travel time calculated based on a travel time from the current location to a location associated with each of the plurality of listings, the travel time based on the mode of transportation; andproviding the set of search results for display at the mobile device in a sort order, the sort order based on the travel time of each of the plurality of listings.
  • 10. The machine-readable storage medium of claim 9, wherein the travel time represents one of the following timeframes: a time of travel between the current location and a location associated with each listing of the plurality of listings that is associated with a physical address; ora time of delivery to the current location associated with each listing of the plurality of listings that is associated with at least one online merchant.
  • 11. The machine-readable storage medium of claim 9, wherein identifying the set of search results comprises identifying a first plurality of listings associated with at least one local merchant and identifying a second plurality of listings associated with at least one online merchant, and including both the first plurality of listings and the second plurality of listings in the plurality of listings.
  • 12. The machine-readable storage medium of claim 9, wherein the providing the set of search results for display at the mobile device further comprises: providing, for display at the mobile device, a map view including a first indicator depicting the current location and a plurality of second indicators, each second indicator depicting a location associated with each listing of the plurality of listings that includes a physical location within the map view.
  • 13. The machine-readable storage medium of claim 12, wherein the operations further comprise receiving a user input indicating a maximum travel time; and wherein providing the map view includes filtering the plurality of second indicators based on the maximum travel time.
  • 14. The machine-readable storage medium of claim 9, wherein the mode of transportation is selected from one of a group of modes of transportation including: private automobile; public transportation; bicycle; or walking.
  • 15. The machine-readable storage medium of claim 9, wherein providing the set of search results for display at the mobile device includes providing, for display at the mobile device, a travel time timeline including a first plurality of listing indicators arranged according to travel time from the current location and a second plurality of listing indicators arranged according to shipping time to the current location; wherein the first plurality of listing indicators represent a first portion of the plurality of listings, wherein each listing of the first portion of the plurality of listings is associated with a merchant having a physical location within a predefined distance of the current location; andwherein the second plurality of listing indicators represent a second portion of the plurality of listings, wherein each listing of the second portion of the plurality of listings is associated with an online merchant.
  • 16. The machine-readable storage medium of claim 15, wherein providing the travel time timeline includes providing, for display at the mobile device, a map view correlated to a selection mechanism, the selection mechanism enabling selection of a maximum travel time along the travel time timeline.
  • 17. A mobile device comprising: a display;a processor coupled to a memory device and the display, the memory device including instructions that, when executed by the processor, cause the mobile device to perform operations comprising:receiving a search query, the search query including a current location of the mobile device and a selection of a mode of transportation, from among a set of modes of transportation;identifying a set of search results based on the search query, the set of search results including a plurality of listings, each listing of the plurality of listings having an associated location;calculating, based on the current location of the mobile device, a travel time for each listing of the plurality of listings, the travel time calculated based on a travel time from the current location to a location associated with each of the plurality of listings, the travel time based on the selection of the mode of transportation; anddisplaying the set of search results at the display of the mobile device in a sort order, the sort order based on the travel time of each of the plurality of listings.
  • 18. The mobile device of claim 17, wherein the travel time represents one of the following timeframes: a time of travel between the current location and a location associated with each listing of the plurality of listings that is associated with a physical address; ora time of delivery to the current location associated with each listing of the plurality of listings that is associated with at least one online merchant.
  • 19. The mobile device of claim 17, wherein the instructions that cause the mobile device to identify search results further include instructions that cause the mobile device to identify a first plurality of listings associated with at least one local merchant and identify a second plurality of listings associated with at least one online merchant, and include both the first plurality of listings and the second plurality of listings in the plurality of listings.
  • 20. The mobile device of claim 17, wherein the instructions that cause the mobile device to display further include instructions that cause the mobile device to display a travel time timeline, the travel time timeline including a first plurality of listing indicators arranged according to travel time from the current location and a second plurality of listing indicators arranged according to shipping time to the current location; wherein the first plurality of listing indicators represent a first portion of the plurality of listings, wherein each listing of the first portion of the plurality of listings is associated with a merchant having a physical location within a predefined distance of the current location; and wherein the second plurality of listing indicators represent a second portion of the plurality of listings, wherein each listing of the second portion of the plurality of listings is associated with an online merchant.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 14/859,601 filed Sep. 21, 2015, which is a continuation of U.S. patent application Ser. No. 13/725,669, filed Dec. 21, 2012, which claims the benefit of U.S. Provisional Application No. 61/649,061, filed May 18, 2012, U.S. Provisional Application No. 61/649,091, filed May 18, 2012, U.S. Provisional Application No. 61/601,822, filed Feb. 22, 2012, and U.S. Provisional Application No. 61/614,930, filed Mar. 23, 2012. Each of the applications referenced above is incorporated herein by reference in its entirety.

US Referenced Citations (159)
Number Name Date Kind
5948040 DeLorme et al. Sep 1999 A
5963948 Shilcrat Oct 1999 A
6026375 Hall et al. Feb 2000 A
6259381 Small Jul 2001 B1
6263278 Nikiel et al. Jul 2001 B1
6879836 Nakamoto et al. Apr 2005 B2
6970871 Rayburn Nov 2005 B1
7006982 Sorensen Feb 2006 B2
7308356 Melaku et al. Dec 2007 B2
7427023 Suenbuel Sep 2008 B2
7600034 Nasu et al. Oct 2009 B2
7698062 McMullen et al. Apr 2010 B1
7756757 Oakes, III Jul 2010 B1
7797267 Horvitz Sep 2010 B2
7848765 Phillips et al. Dec 2010 B2
7933395 Bailly et al. Apr 2011 B1
7933797 Sorensen Apr 2011 B2
7974873 Simmons Jul 2011 B2
8069120 Buehler et al. Nov 2011 B2
8115656 Bevacqua et al. Feb 2012 B2
8195519 Bonner Jun 2012 B2
8249807 Barbeau et al. Aug 2012 B1
8295854 Osann, Jr. Oct 2012 B2
8538686 Gruen et al. Sep 2013 B2
8600659 Scherzinger Dec 2013 B1
8615359 Alam et al. Dec 2013 B2
8630897 Prada Gomez Jan 2014 B1
8700316 Su et al. Apr 2014 B2
8738292 Faaborg et al. May 2014 B1
8874366 Su et al. Oct 2014 B2
9129332 Oakes, III Sep 2015 B1
9141987 Perks et al. Sep 2015 B2
9141988 Howard et al. Sep 2015 B2
9171327 Nallu et al. Oct 2015 B2
9547812 Oya Jan 2017 B2
9547872 Howard Jan 2017 B2
9581463 Nallu et al. Feb 2017 B2
9679325 Nallu et al. Jun 2017 B2
9852460 Howard et al. Dec 2017 B2
9858607 Nallu et al. Jan 2018 B2
9885584 Nallu et al. Feb 2018 B2
10192255 Nallu et al. Jan 2019 B2
20010047315 Siegel Nov 2001 A1
20020010625 Smith et al. Jan 2002 A1
20020026380 Su Feb 2002 A1
20020130906 Miyaki Sep 2002 A1
20020133411 Nakamoto et al. Sep 2002 A1
20030229446 Boscamp et al. Dec 2003 A1
20040011866 Saad Jan 2004 A1
20040143386 Yoshihara et al. Jul 2004 A1
20050154653 Jongebloed Jul 2005 A1
20060010037 Angert et al. Jan 2006 A1
20060025923 Dotan et al. Feb 2006 A1
20060058948 Blass et al. Mar 2006 A1
20060074763 Schmid Apr 2006 A1
20060227047 Rosenberg Oct 2006 A1
20060287810 Sadri et al. Dec 2006 A1
20070124066 Kikuchi et al. May 2007 A1
20070150369 Zivin Jun 2007 A1
20070174133 Hearn Jul 2007 A1
20070210938 Deurwaarder et al. Sep 2007 A1
20070225902 Gretton et al. Sep 2007 A1
20070226082 Leal Sep 2007 A1
20080005055 Horvitz Jan 2008 A1
20080033633 Akiyoshi et al. Feb 2008 A1
20080082424 Walton Apr 2008 A1
20080086323 Petrie et al. Apr 2008 A1
20080086368 Bauman Apr 2008 A1
20080086509 Wallace Apr 2008 A1
20080126145 Rackley, III et al. May 2008 A1
20080162032 Wuersch et al. Jul 2008 A1
20080208446 Geelen et al. Aug 2008 A1
20080248815 Busch Oct 2008 A1
20080249870 Angell et al. Oct 2008 A1
20080301102 Liang Dec 2008 A1
20090006194 Sridharan et al. Jan 2009 A1
20090043665 Tirumalareddy et al. Feb 2009 A1
20090167553 Hong et al. Jul 2009 A1
20090222438 Strandell Sep 2009 A1
20090240518 Borom et al. Sep 2009 A1
20100027527 Higgins et al. Feb 2010 A1
20100036807 Lieske, Jr. Feb 2010 A1
20100179756 Higgins et al. Jul 2010 A1
20100203901 Dinoff et al. Aug 2010 A1
20100228574 Mundinger et al. Sep 2010 A1
20100250673 Laroia et al. Sep 2010 A1
20100286901 Geelen et al. Nov 2010 A1
20100292921 Zachariah et al. Nov 2010 A1
20100305984 Ben-Yitschak Dec 2010 A1
20110066364 Hale Mar 2011 A1
20110090078 Kim Apr 2011 A1
20110112762 Gruijters et al. May 2011 A1
20110125794 Hutschemaekers May 2011 A1
20110130956 Tracton et al. Jun 2011 A1
20110136480 Osann, Jr. Jun 2011 A1
20110145093 Paradise et al. Jun 2011 A1
20110145096 Jensen Jun 2011 A1
20110160996 Terai et al. Jun 2011 A1
20110167058 Van Os Jul 2011 A1
20110172909 Kahn et al. Jul 2011 A1
20110177845 Fasold Jul 2011 A1
20110179018 Lazaridis et al. Jul 2011 A1
20110218933 Hunsaker et al. Sep 2011 A1
20110231334 Jindel Sep 2011 A1
20110313880 Paul Dec 2011 A1
20110320318 Patel et al. Dec 2011 A1
20120004841 Schunder Jan 2012 A1
20120023034 Lynch et al. Jan 2012 A1
20120036014 Sunkada Feb 2012 A1
20120101818 Scott Apr 2012 A1
20120116922 Ku May 2012 A1
20120123673 Perks et al. May 2012 A1
20120126998 Morgan et al. May 2012 A1
20120130865 Rendler May 2012 A1
20120143489 Lee et al. Jun 2012 A1
20120209749 Hammad et al. Aug 2012 A1
20120253656 Brandt Oct 2012 A1
20120253660 Dillahunt et al. Oct 2012 A1
20120276928 Shutter Nov 2012 A1
20120290383 Busch Nov 2012 A1
20120299707 Centanni, Jr. Nov 2012 A1
20120302258 Pai et al. Nov 2012 A1
20120310527 Yariv et al. Dec 2012 A1
20120316989 Wong et al. Dec 2012 A1
20120323692 Shutter Dec 2012 A1
20130036117 Fisher Feb 2013 A1
20130054135 Backsen, Jr. Feb 2013 A1
20130082874 Zhang Apr 2013 A1
20130097246 Zifroni et al. Apr 2013 A1
20130185150 Crum Jul 2013 A1
20130198030 Linden et al. Aug 2013 A1
20130198031 Mitchell et al. Aug 2013 A1
20130218463 Howard et al. Aug 2013 A1
20130218912 Howard et al. Aug 2013 A1
20130226731 Macneille et al. Aug 2013 A1
20130253832 Nallu et al. Sep 2013 A1
20130282533 Foran-Owens et al. Oct 2013 A1
20130325667 Satyavolu et al. Dec 2013 A1
20140058886 Gopalakrishnan Feb 2014 A1
20140074743 Rademaker Mar 2014 A1
20140156410 Wuersch et al. Jun 2014 A1
20140172571 Fabrikant et al. Jun 2014 A1
20140188658 Li et al. Jul 2014 A1
20140207611 Cleary et al. Jul 2014 A1
20140336925 Akin Nov 2014 A1
20140358724 Nallu et al. Dec 2014 A1
20150039393 Jain Feb 2015 A1
20150066649 Kumar et al. Mar 2015 A1
20150073941 Burrows et al. Mar 2015 A1
20150134488 Maclaurin et al. May 2015 A1
20150330806 Nallu et al. Nov 2015 A1
20160003637 Andersen Jan 2016 A1
20160012509 Howard et al. Jan 2016 A1
20160041000 Nallu et al. Feb 2016 A1
20170122746 Howard et al. May 2017 A1
20170131113 Nallu et al. May 2017 A1
20180106638 Nallu et al. Apr 2018 A1
20180150893 Nallu et al. May 2018 A1
20200264008 Nallu et al. Aug 2020 A1
Foreign Referenced Citations (21)
Number Date Country
2014348748 Feb 2017 AU
101743455 Jun 2010 CN
104272333 Jan 2015 CN
108536771 Sep 2018 CN
108537575 Sep 2018 CN
108564435 Sep 2018 CN
212013000071 Sep 2014 DE
0510172 Jul 1995 EP
1991827 Nov 2008 EP
2010-54389 Mar 2010 JP
10-2005-0009092 Jan 2005 KR
10-2007-0093135 Sep 2007 KR
10-2009-0016101 Feb 2009 KR
10-2011-0012122 Feb 2011 KR
10-2011-0134938 Dec 2011 KR
10-2013-0116091 Oct 2013 KR
2012135143 Oct 2012 WO
2013095440 Jun 2013 WO
2013126763 Aug 2013 WO
2014085657 Jun 2014 WO
2015073567 May 2015 WO
Non-Patent Literature Citations (185)
Entry
Office Action received for Korean patent Application No. 10-2018-7016579, dated Jul. 3, 2019, 14 pages (7 pages of English Translation and 7 pages of Official copy).
Response to Office Action filed on May 31, 2019, for Korean Patent Application No. 10-2018-7016579, dated Mar. 29, 2019, 21 pages(11 pages of English Translation and 10 pages of Official Copy).
Response to Office Action Filed on Sep. 3, 2019, for Korean Patent Application No. 10-2018-7016579 dated Jul. 3, 2019, 33 pages (10 pages of English Translation & 23 pages of official copy).
Response to Final Office Action filed on May 24, 2019 for U.S. Appl No. 14/474,117, dated Apr. 15, 2019, 13 pages.
Applicant Initiated Interview Summary Received for U.S. Appl No. 15/406,283, dated Aug. 12, 2019, 3 pages.
Final Office Action received for U.S. Appl No. 15/406,283, dated Jul. 3, 2019, 16 pages.
Non-Final Office Action Received for U.S. Appl. No. 15/406,283, dated Sep. 12, 2019, 17 pages.
Response to Final Office Action Filed on Aug. 27, 2019, for U.S. Appl. No. 15/406,283, dated Jul. 3, 2019, 12 pages.
Response to Non-Final Office Action filed on May 28, 2019 for U.S. Appl. No. 15/406,283, dated Feb. 28, 2019, 12 pages.
Response to Office Action filed on May 23, 2019, for Canadian Patent Application No. 2,861,822, dated Nov. 30, 2018, 7 pages.
Office Action Received for Canadian Patent Application No. 2,929,782, dated Aug. 1, 2019, 7 pages.
Dennis, “Google Mapps With Hotel Pricing—Google Says What Is The Fuss?”, Google map, PhocusWire, Jun. 16, 2010, 9 pages.
Non-Final Office Action received for U.S. Appl. No. 14/474,117, dated Oct. 2, 2018, 23 pages.
Notice of Allowance received for U.S. Appl. No. 15/820,994, dated Sep. 17, 2018, 6 pages.
Response to Office Action filed on May 18, 2018 for Canadian Patent Application No. 2,929,782, dated Jan. 31, 2018, 17 pages.
Office Action received for Korean Patent Application No. 10-2018-7016579, dated Sep. 11, 2018, 15 pages (7 pages of English Translation and 8 pages of Official Copy).
Response to Office Action filed on Nov. 12, 2018 for Korean Patent Application No. 10-2018-7016579, dated Sep. 11, 2018, 31 pages (23 pages of English Translation of claims and 8 pages of Official Copy).
Office Action received for Canadian Patent Application No. 2,929,782, dated Oct. 24, 2018, 6 pages.
Office Action received for Canadian Patent Application No. 2,861,822, dated Nov. 30, 2018, 4 pages.
Response to Non-Final Office Action filed on Jan. 2, 2019 for U.S. Appl. No. 14/474,117, dated Oct. 2, 2018, 13 pages.
Final Office Action received for Korean Patent Application No. 10-2014-7026159, dated Dec. 28, 2016, 5 pages (2 pages of English Translation and 3 pages of Official Copy).
Office Action received for Korean Patent Application No. 10-2014-7026159, dated Apr. 28, 2017, 5 pages (2 pages of English Translation and 3 pages of Official Copy).
Office Action received for Korean Patent Application No. 10-2014-7026159, dated May 27, 2016, 10 pages (4 pages of English Translation and 6 pages of Official Copy).
Office Action received for Korean Patent Application No. 10-2014-7026159, dated Oct. 19, 2015, 8 pages (3 pages of English Translation and 5 pages of Official Copy).
Response to Office Action filed on Aug. 26, 2016 for Korean Patent Application No. 10-2014-7026159, dated May 27, 2016, 21 pages (including 5 pages of English Pending Claims).
Response to Office Action filed on Jan. 19, 2016 for Korean Patent Application No. 10-2014-7026159, dated Oct. 19, 2015, 23 pages (including 5 pages of English Pending Claims).
Response to Office Action filed on Mar. 29, 2017 for Korean Patent Application No. 10-2014-7026159, dated Dec. 28, 2016, 22 pages (including 5 pages of English Pending Claims).
Non-Final Office Action received for U.S. Appl. No. 13/691,291, dated Dec. 23, 2014, 13 pages.
Notice of Allowance received for U.S. Appl. No. 13/691,291, dated Jun. 24, 2015, 11 pages.
Response to Non-Final Office Action filed on Feb. 23, 2015, for U.S. Appl. No. 13/691,291, dated Dec. 23, 2014, 14 pages.
Applicant Initiated Interview Summary received for U.S. Appl. No. 13/725,669, dated Feb. 27, 2015, 3 pages.
Non-Final Office Action received for U.S. Appl. No. 13/725,669, dated Dec. 12, 2014, 10 pages.
Non-Final Office Action received for U.S. Appl. No. 13/725,669, dated Jun. 26, 2014, 9 pages.
Notice of Allowance received for U.S. Appl. No. 13/725,669, dated May 13, 2015, 9 pages.
Response to Non-Final Office Action filed on Mar. 12, 2015 for U.S. Appl. No. 13/725,669, dated Dec. 12, 2014, 12 pages.
Response to Non-Final Office Action filed on Oct. 27, 2014 for U.S. Appl. No. 13/725,669, dated Jun. 26, 2014, 12 pages.
Applicant Initiated Interview Summary received for U.S. Appl. No. 13/725,766, dated Apr. 29, 2015, 3 pages.
Applicant Initiated Interview Summary received for U.S. Appl. No. 13/725,766, dated Dec. 11, 2015, 3 pages.
Applicant Initiated Interview Summary received for U.S. Appl. No. 13/725,766, dated Jun. 23, 2016, 3 pages.
Corrected Notice of Allowability received for U.S. Appl. No. 13/725,766, dated Dec. 13, 2016, 2 pages.
Final Office Action received for U.S. Appl. No. 13/725,766, dated Apr. 1, 2015, 24 pages.
Final Office Action received for U.S. Appl. No. 13/725,766, dated May 3, 2016, 23 pages.
Non-Final Office Action received for U.S. Appl. No. 13/725,766, dated Nov. 20, 2014, 21 pages.
Non-Final Office Action received for U.S. Appl. No. 13/725,766, dated Oct. 29, 2015, 21 pages.
Notice of Allowance received for U.S. Appl. No. 13/725,766, dated Aug. 31, 2016, 9 pages.
Response to Final Office Action filed on Aug. 3, 2015 for U.S. Appl. No. 13/725,766, dated Apr. 1, 2015, 31 pages.
Response to Final Office Action filed on Aug. 3, 2016 for U.S. Appl. No. 13/725,766, dated May 3, 2016, 12 pages.
Response to Non-Final Office Action filed on Feb. 20, 2015 for U.S. Appl. No. 13/725,766, dated Nov. 20, 2014, 25 pages.
Response to Non-Final Office Action filed on Jan. 29, 2016 for U.S. Appl. No. 13/725,766, dated Oct. 29, 2015, 15 pages.
Non-Final Office Action received for U.S. Appl. No. 13/957,745, dated Jun. 11, 2015, 17 pages.
Non-Final Office Action received for U.S. Appl. No. 14/369,991, dated Dec. 23, 2014, 13 pages.
Non-Final Office Action received for U.S. Appl. No. 14/369,991, dated Jul. 18, 2016, 16 pages.
Non-Final Office Action received for U.S. Appl. No. 14/369,991, dated Sep. 22, 2015, 14 pages.
Notice of Allowance received for U.S. Appl. No. 14/369,991, dated Apr. 27, 2015, 8 pages.
Notice of Allowance received for U.S. Appl. No. 14/369,991, dated Feb. 17, 2016, 5 pages.
Notice of Allowance received for U.S. Appl. No. 14/369,991, dated Oct. 24, 2016, 5 pages.
Preliminary Amendment filed for U.S. Appl. No. 14/369,991, dated Jun. 30, 2014, 3 pages.
Response to Non-Final Office Action filed on Feb. 23, 2015 for U.S. Appl. No. 14/369,991, dated Dec. 23, 2014, 15 pages.
Response to Non-Final Office Action filed on Jan. 18, 2016 for U.S. Appl. No. 14/369,991, dated Sep. 22, 2015, 13 pages.
Response to Non-Final Office Action filed on Oct. 4, 2016, for U.S. Appl. No. 14/369,991, dated Jul. 18, 2016, 15 pages.
Non-Final Office Action received for U.S. Appl. No. 14/474,117, dated Jun. 16, 2015, 19 pages.
Final Office Action received for U.S. Appl. No. 14/809,543, dated Jun. 6, 2017, 15 pages.
Final Office Action received for U.S. Appl. No. 14/809,543, dated Nov. 17, 2016, 15 pages.
Non-Final Office Action received for U.S. Appl. No. 14/809,543, dated Apr. 12, 2016, 14 pages.
Non-Final Office Action received for U.S. Appl. No. 14/809,543, dated Feb. 16, 2017, 16 pages.
Preliminary Amendment filed for U.S. Appl. No. 14/809,543, dated Aug. 7, 2015, 7 pages.
Response to Final Office Action filed on Aug. 8, 2017 for U.S. Appl. No. 14/809,543, dated Jun. 6, 2017, 10 pages.
Response to Final Office Action filed on Oct. 25, 2017 for U.S. Appl. No. 14/809,543, dated Nov. 17, 2016, 10 pages.
Response to Non-Final Office Action filed on Aug. 9, 2016 for U.S. Appl. No. 14/809,543, dated Apr. 12, 2016, 13 pages.
Response to Non-Final Office Action filed on May 16, 2017 for U.S. Appl. No. 14/809,543, dated Feb. 16, 2017, 12 pages.
Office Action received for Korean Patent Application No. 10-2016-7015678, dated Apr. 9, 2018, 4 pages (With English Translation).
Office Action received for Korean Patent Application No. 10-2016-7015678, dated Dec. 28, 2017, 6 pages (3 pages of English Translation and 3 pages of Official Copy).
Office Action received for Korean Patent Application No. 10-2016-7015678, dated Jun. 16, 2017, 7 pages (5 pages of English Translation and 2 pages of Official Copy).
Office Action received for Korean Patent Application No. 10-2016-7015678, dated Nov. 4, 2016, 14 pages (6 pages of English Translation and 8 pages of Official Copy).
Response to Office Action filed on Mar. 5, 2018 for Korean Patent Application No. 10-2016-7015678, dated Dec. 28, 2017, 20 pages (9 pages of Official Copy and 11 pages of English Translation).
Response to Office Action filed on Aug. 11, 2017 for Korean Patent Application No. 10-2016-7015678, dated Jun. 16, 2017, 17 pages (with 5 pages of English Pending Claims).
Response to Office Action filed on Feb. 6, 2017 for Korean Patent Application No. 10-2016-7015678, dated Nov. 4, 2016, 30 pages (with English Pending Claims).
Applicant Initiated Interview Summary received for U.S. Appl. No. 13/691,390, dated Mar. 08, 2016, 3 pages.
Final Office Action received for U.S. Appl. No. 13/691,390, dated Nov. 6, 2015, 15 pages.
Non-Final Office Action received for U.S. Appl. No. 13/691,390, dated Jul. 10, 2015, 13 pages.
Non-Final Office Action received for U.S. Appl. No. 13/691,390, dated Jun. 9, 2016, 21 pages.
Response to Final Office Action filed on Mar. 7, 2016 for U.S. Appl. No. 13/691,390, dated Nov. 6, 2015, 11 pages.
Response to Non-Final Office Action filed on Oct. 13, 2015 for U.S. Appl. No. 13/691,390, dated Jul. 10, 2015, 12 pages.
Response to Restriction Requirement filed on May 26, 2015 for U.S. Appl. No. 13/691,390, dated Feb. 26, 2015, 10 pages.
Restriction Requirement received for U.S. Appl. No. 13/691,390, dated Feb. 26, 2015, 9 pages.
Applicant Initiated Interview Summary for U.S. Appl. No. 13/691,390, dated Oct. 19, 2015, 3 pages.
Decision on Pre-Appeal Brief Request received for U.S. Appl. No. 13/890,631, dated Nov. 17, 2015, 2 pages.
Final Office Action received for U.S. Appl. No. 13/890,631, dated Aug. 13, 2015, 9 pages.
Final Office Action received for U.S. Appl. No. 13/890,631, dated Sep. 22, 2016, 9 pages.
Non-Final Office Action received for U.S. Appl. No. 13/890,631, dated Dec. 18, 2014, 9 pages.
Non-Final Office Action received for U.S. Appl. No. 13/890,631, dated Feb. 25, 2016, 8 pages.
Pre-Appeal Brief Request filed on Oct. 13, 2015, for U.S. Appl. No. 13/890,631, 5 pages.
Response to Non-Final Office Action filed on Apr. 17, 2015 for U.S. Appl. No. 13/890,631 dated Dec. 18, 2014, 12 pages.
Response to Non-Final Office Action filed on May 25, 2016 for U.S. Appl. No. 13/890,631 dated Feb. 25, 2016, 18 pages.
Applicant Initiated Interview Summary received for U.S. Appl. No. 13/957,745, dated Jul. 15, 2015, 3 pages.
Final Office Action received for U.S. Appl. No. 13/957,745, dated Jan. 4, 2016, 18 pages.
Response to Non-Final Office Action filed on Sep. 28, 2015 for U.S. Appl. No. 13/957,745, dated Jun. 11, 2015, 38 pages_.
Final Office Action received for U.S. Appl. No. 14/474,117, dated Dec. 4, 2015, 23 pages.
Final Office Action received for U.S. Appl. No. 14/474,117, dated Feb. 9, 2018, 28 pages.
Non-Final Office Action received for U.S. Appl. No. 14/474,117, dated Dec. 19, 2016, 26 pages.
Non-Final Office Action received for U.S. Appl. No. 14/474,117, dated Jun. 26, 2017, 25 pages.
Response to Final Office Action filed on Apr. 9, 2018 for U.S. Appl. No. 14/474,117, dated Feb. 9, 2018, 17 Pages.
Response to Final Office Action filed on Mar. 4, 2016 for U.S. Appl. No. 14/474,117, dated Dec. 4, 2015, 20 pages.
Response to Non-Final Office Action filed on Mar. 20, 2017 for U.S. Appl. No. 14/474,117, dated Dec. 19, 2016, 20 pages.
Response to Non-Final Office Action filed on Sep. 16, 2015 for U.S. Appl. No. 14/474,117, dated Jun. 16, 2015, 13 pages.
Response to Non-Final Office Action filed on Sep. 26, 2017 for U.S. Appl. No. 14/474,117, dated Jun. 26, 2017, 21 pages.
Issue Fee received for U.S. Appl. No. 14/809,543, dated Nov. 22, 2017, 1 Page.
Notice of Allowance received for U.S. Appl. No. 14/809,543, dated Aug. 24, 2017, 5 pages.
Response to Final Office Action filed on Jan. 25, 2017 for U.S. Appl. No. 14/809,543, dated Nov. 17, 2016, 10 pages.
Applicant Initiated Interview Summary received for U.S. Appl. No. 14/859,601, dated May 31, 2017, 3 pages.
Final Office Action received for U.S. Appl. No. 14/859,601, dated May 16, 2017, 12 pages.
First Action Interview—Office Action Summary received for U.S. Appl. No. 14/859,601, dated Jan. 31, 2017, 8 pages.
First Action Interview—Pre-interview Communication received for U.S. Appl. No. 14/859,601, dated Oct. 14, 2016, 3 pages.
Notice of Allowance received for U.S. Appl. No. 14/859,601, dated Aug. 17, 2017, 8 pages.
Response to Final Office Action filed on May 31, 2017 for U.S. Appl. No. 14/859,601, dated May 16, 2017, 18 pages.
Response to First Action Interview—Office Action Summary filed on Mar. 31, 2017, for U.S. Appl. No. 14/859,601, dated Jan. 31, 2017, 13 pages.
Notice of Allowance received for U.S. Appl. No. 15/414,772, dated Sep. 28, 2017, 5 pages.
Preliminary Amendment U.S. Appl. No. 15/820,994, filed Nov. 30, 2017, 7 pages.
Preliminary Amendment for U.S. Appl. No. 15/846,995, filed Jan. 8, 2018, 7 pages.
Notice of Allowance received for Canada Patent Application No. 2,861,822, dated Feb. 5, 2018, 1 page.
International Preliminary Report on Patentability received for PCT Application No. PCT/US2014/065279, dated May 26, 2016, 9 pages.
Response to Office Action filed on Sep. 13, 2017 for Canadian Patent Application No. 2,861,822, dated Apr. 20, 2017, 2 pages.
Office Action received for Canadian Patent Application No. 2,893,135, dated Aug. 11, 2016, 3 pages.
Office Action received for Canadian Patent Application No. 2,929,782, dated Feb. 21, 2017, 5 pages.
Office Action received for Canadian Patent Application No. 2,929,782, dated Jan. 31, 2018, 5 pages.
Response to Office Action filed on Aug. 8, 2017 for Canadian Patent Application No. 2,929,782, dated Feb. 21, 2017, 26 pages.
Notice of Acceptance received for Australian Patent Application No. 2013222225, dated Jan. 5, 2016, 3 pages.
First Examiner Report received for Australian Patent Application No. 2013352130, dated Mar. 11, 2016, 5 pages.
Response to First Examiner Report filed on Aug. 26, 2016 for Australian Patent Application No. 2013352130, dated Mar. 11, 2016, 18 pages.
Second Examiner Report received for Australian Patent Application No. 2013352130, dated Sep. 23, 2016, 4 pages.
Notice of Allowance Received for Chinese Patent Application No. 201380010168.4, dated Jan. 8, 2018, 4 pages.
Response to Office Action filed on Sep. 14, 2017 for Chinese Patent Application No. 201380010168.4, dated Jun. 30, 2017, 12 pages (Only Official Copy).
First Examiner Report received for Australian Patent Application No. 2014348748, dated Nov. 12, 2016, 3 pages.
Response to First Examiner Report filed on Feb. 1, 2017, for Australian Patent Application No. 2014348748, dated Nov. 12, 2016, 21 pages.
Notice of Acceptance received for Australian Patent Application No. 2016202371, dated Dec. 7, 2017, 3 pages.
Response to First Examiner Report filed on Sep. 22, 2017 for Australian Patent Application No. 2016202371, dated Jun. 29, 2017, 12 pages.
Response to Second Examiner Report filed on Nov. 30, 2017 for Australian Patent Application No. 2016202371, dated Sep. 25, 2017, 15 pages.
Second Examiner Report received for Australian Patent Application No. 2016202371, dated Sep. 25, 2017, 4 pages.
International Preliminary Report on Patentability received for PCT Application No. PCT/US2013/072339, dated Jun. 11, 2015, 7 pages.
International Search Report received for PCT Application No. PCT/US2013/072339, dated Apr. 28, 2014, 4 pages.
Invitation to Pay Additional Fees and Partial Search Report received for PCT Application No. PCT/US2013/072339, dated Feb. 14, 2014, 2 pages.
Written Opinion received for PCT Application No. PCT/US2013/072339, dated Apr. 28, 2014, 5 pages.
TSO Mobile & Public Transportation, "Wondering how TSO Mobile & GPS Tracking Benefit Passengers and The Public Transportation Industry", Retrieved from the Internet: <URL: http://www.tsomobile.com/tso-mobile-public-transportation/>, Aug. 25, 2014, 4 pages.
Final Office Action received for U.S. Appl. No. 14/922,327, dated Jun. 29, 2016, 14 pages.
Non-Final Office Action received for U.S. Appl. No. 14/922,327, dated Feb. 1, 2016, 13 pages.
Notice of Allowance received for U.S. Appl. No. 14/922,327, dated Oct. 13, 2016, 5 pages.
Response to Final Office Action filed on Sep. 19, 2016 for U.S. Appl. No. 14/922,327, dated Jun. 29, 2016, 10 pages.
Response to Non-Final Office Action filed on May 31, 2016 for U.S. Appl. No. 14/922,327, dated Feb. 1, 2016, 14 pages.
Preliminary Amendment filed for U.S. Appl. No. 15/406,283, dated Jun. 1, 2017, 7 pages.
Non-Final Office Action received for U.S. Appl. No. 15/414,772, dated Apr. 20, 2017, 7 pages.
Preliminary Amendment filed for U.S. Appl. No. 15/414,772, dated Apr. 14, 2017, 8 pages.
Response to Non Final Office Action filed on Jul. 20, 2017 for U.S. Appl. No. 15/414,772, dated Apr. 20, 2017, 12 pages.
Office Action received for Canadian Patent Application No. 2,861,822, dated Apr. 20, 2017, 3 pages.
Office Action received for Canadian Patent Application No. 2,861,822, dated May 22, 2015, 5 pages.
Office Action received for Canadian Patent Application No. 2,861,822, dated May 31, 2016, 5 pages.
Response to Office Action filed on Nov. 20, 2015 for Canadian Patent Application No. 2,861,822, dated May 22, 2015, 23 pages.
Response to Office Action filed on Nov. 22, 2016 for Canadian Patent Application No. 2,861,822, dated May 31, 2016, 14 pages.
First Examiner Report received for Australian Patent Application No. 2013222225, dated Apr. 14, 2015, 3 pages.
Response to First Examiner Report filed on Sep. 30, 2015 for Australian Patent Application No. 2013222225, dated Apr. 14, 2015, 17 pages.
Second Examiner Report received for Australian Patent Application No. 2013222225, dated Nov. 3, 2015, 3 pages.
Office Action received for Chinese Patent Application No. 201380010168.4, dated Jun. 30, 2017, 25 pages.
Office Action received for Chinese Patent Application No. 201380010168.4, dated Oct. 17, 2016, 20 pages (7 pages of English Translation, 4 pages of English Pending Claims and 9 pages of Official Copy).
Response to Office Action filed on Mar. 1, 2017 for Chinese Patent Application No. 201380010168.4, dated Oct. 17, 2016, 14 pages (including 4 pages of English Pending Claims).
First Examiner Report received for Australian Patent Application No. 2016202371, dated Jun. 29, 2017, 4 pages.
Google Play, "Moovit-Real-time Transit Info.—Android Apps on Google Play", Retrieved from the Internet: <URL: https://play.google.com/store/apps/details?id=com.tranzmate&hl=en>, Aug. 2, 2017, 2 pages.
Owen, “The power of where: using IP geolocation to create instant local connections online”, Sep. 25, 2012, 7 pages.
International Preliminary Report on Patentability received for PCT Application No. PCT/US2013/027415, dated Sep. 4, 2014, 8 pages.
International Search Report received for PCT Application No. PCT/US2013/027415, dated May 3, 2013, 2 pages.
International Written Opinion received for PCT Application No. PCT/US2013/027415, dated May 3, 2013, 6 pages.
International Search Report received for PCT Patent Application No. PCT/US2014/065279, dated Feb. 18, 2015, 2 pages.
Written Opinion received for PCT Patent Application No. PCT/US2014/065279, dated Feb. 18, 2015, 7 pages.
Examiner Initiated Interview Summary received for U.S. Appl. No. 15/406,283, dated Mar. 5, 2019, 2 pages.
Final Office Action received for U.S. Appl. No. 14/474,117, dated Apr. 15, 2019, 24 pages.
Non-Final Office Action received for U.S. Appl. No. 15/406,283, dated Feb. 28, 2019, 14 pages.
Response to Office Action filed on Feb. 21, 2019, for Canadian Patent Application No. 2,929,782, dated Oct. 24, 2018, 16 pages.
Final Office Action received for Korean Patent Application No. 10-2018-7016579, dated Mar. 29, 2019, 7 pages (3 pages of English Translation and 4 pages of Official Copy).
Office Action received for Korean Patent Application No. 10-2018-7016579, dated Jan. 31, 2020, 8 pages (2 pages of English Translation and 6 pages of Official Copy).
Office Action received for Canadian Patent Application No. 2,861,822, dated Dec. 23, 2019, 5 pages.
Response to Office Action filed on Jan. 24, 2020 for Canadian Patent Application No. 2,929,782, dated Aug. 1, 2019, 4 pages.
Non-Final Office Action received for U.S. Appl. No. 14/474,117, dated Jan. 8, 2020, 25 pages.
Notice of Allowance received for U.S. Appl. No. 15/846,995, dated Feb. 6, 2020, 8 pages.
Woodward, "Gap will try a Variation of Order Online and Pick up in Store", Digital Commerce 360, Apr. 19, 2013, 7 pages.
Office Action received for Canadian Patent Application No. 2,929,782, dated Sep. 14, 2020, 10 pages.
Final Office Action received for U.S. Appl. No. 14/474,117, dated Jul. 10, 2020, 28 pages.
Preliminary Amendment filed on Aug. 19, 2020, for U.S. Appl. No. 16/868,239, 6 pages.
Related Publications (1)
Number: 20180075509 A1; Date: Mar. 2018; Country: US
Provisional Applications (4)
Number: 61/649,061; Date: May 2012; Country: US
Number: 61/649,091; Date: May 2012; Country: US
Number: 61/614,930; Date: Mar. 2012; Country: US
Number: 61/601,822; Date: Feb. 2012; Country: US
Continuations (2)
Parent: 14/859,601; Date: Sep. 2015; Country: US; Child: 15/815,358 (US)
Parent: 13/725,669; Date: Dec. 2012; Country: US; Child: 14/859,601 (US)