Identifying and locating users on a mobile network

Information

  • Patent Grant
  • Patent Number: 10,382,895
  • Date Filed: Friday, September 28, 2018
  • Date Issued: Tuesday, August 13, 2019
Abstract
A system and method of locating “friends” having mobile devices connected to a network and associated with a user account is disclosed. The method includes sending a request to a mobile device, the mobile device determining its present geographic location and responding to the requestor with this information. This information may be in the form of a coordinate location such as a GPS location or it may be in the form of a name that the mobile device owner assigned to a particular area (e.g., “home”). Having this location information, a user is able to view the location of the friend that is associated with the mobile device.
Description
BACKGROUND

1. Technical Field


The present disclosure relates to remotely communicating with a mobile device such as a mobile telephone or a media player and, more specifically, to causing a mobile device to perform a function through the transmission of one or more remote commands.


2. Introduction


Mobile devices have been adapted to a wide variety of applications, including computing, communication, and entertainment. Through recent improvements, mobile devices can now also determine their geographic location, either by using a built-in global positioning system (GPS) antenna or by extrapolating their location from the signals they receive through the network of fixed-location cellular antennas. Thus, a user may be able to use the mobile device to determine his or her location.


A mobile device user may wish to have friends or family members know of his or her location and, likewise, he or she may like to know the location of his or her friends or family members. Several known systems perform such services. However, one drawback of such services is that determining locations, particularly when using GPS devices, may consume a lot of power.


Balancing battery life and mobile device performance is a chief concern for mobile device makers, and location aware programs are a big part of those concerns. Specifically, applications that must make frequent requests of a GPS device consume a lot of power. Such applications include mapping programs, and social location aware applications such as FOURSQUARE and GOOGLE LATITUDE, which allow a user to share his location with a server so that authorized friends can view the user's location on their mobile devices. Frequently, such services require an application running on the user's mobile device to periodically activate the GPS device, learn the user's location, and update the server. Such repeated use of the GPS device drastically reduces the battery life of the mobile device.


SUMMARY

Additional features and advantages of the disclosure will be set forth in the description which follows, and in part will be obvious from the description, or can be learned by practice of the herein disclosed principles. The features and advantages of the disclosure can be realized and obtained by means of the instruments and combinations particularly pointed out in the appended claims. These and other features of the disclosure will become more fully apparent from the following description and appended claims, or can be learned by the practice of the principles set forth herein.


Disclosed are systems, methods, and non-transitory computer-readable storage media for determining the location of one or more mobile devices connected to a communications network. The present technology provides a system for allowing users to learn the location of other users who have given permission to have their location shared. In a preferred embodiment, a user can launch an application which allows the user to request permission from a friend to receive information describing the friend's location. The application can list the friends who have given the user permission to view their location information.


When a user desires to see the location of one or more friends, the application can request location information for each friend, or selected friends, from a system server. The server can receive and interpret the request to determine whether the application requires detailed location information or approximate location information. For example, if the application has requested location information for all friends, it would be interpreted as a request for only approximate information because, among other reasons, displaying all friends on a map on a computer screen only requires approximate locations. However, if the application recently received updated approximate information regarding a particular friend, but is now requesting additional location information on just that specific friend, it is likely that the application requires detailed location information.


The difference between detailed location information and approximate location information is based not only on a threshold of tolerated variance in the location information but also on the time since updated location information was received by the server and on the power the friend's device requires to learn an accurate location. For example, detailed location information might require an accuracy of +/−3 m, and with present technology such accuracy is most often obtained using a GPS device. Additionally, detailed location information might only be considered accurate for a duration of 1 minute or less. In contrast, approximate location information may only require a city level of accuracy (e.g., +/−1 km) and be deemed relevant for up to 15 minutes or more.


A request to locate a friend is processed by a central server. Upon receiving a request, the server may forward the request to the friend's device and wait for a response. Alternatively, the server may respond to the request without contacting the friend's device. For example, the server may have cached location information of the friend's device. Because location information is only relevant at certain accuracies and for a certain period of time, the server may compare the cached information with the request and/or any predetermined constraints before sending the cached location information rather than sending a request to the friend's device.
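To make the cache-versus-forward decision concrete, the following is a minimal sketch under assumed thresholds (the +/−3 m / 1 minute and +/−1 km / 15 minute figures mentioned above). The names CachedLocation and can_serve_from_cache, and the constants, are hypothetical illustrations, not part of the patent.

```python
# Minimal sketch (not the patented implementation) of serving a locate
# request from cached data only when the cached fix is fresh and accurate
# enough for the kind of request being made.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class CachedLocation:
    latitude: float
    longitude: float
    accuracy_m: float          # estimated error radius, in meters
    obtained_at: datetime

def can_serve_from_cache(cached, required_accuracy_m, max_age):
    """Return True if the cached fix can answer the request without
    contacting the friend's device."""
    if cached is None:
        return False
    fresh = datetime.utcnow() - cached.obtained_at <= max_age
    accurate = cached.accuracy_m <= required_accuracy_m
    return fresh and accurate

# Example thresholds: approximate ("city level") requests tolerate ~1 km and
# 15 minutes, while detailed requests demand a few meters and a 1-minute fix.
APPROXIMATE = (1_000.0, timedelta(minutes=15))
DETAILED = (3.0, timedelta(minutes=1))
```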





BRIEF DESCRIPTION OF THE DRAWINGS

In order to describe the manner in which the above-recited and other advantages and features of the disclosure can be obtained, a more particular description of the principles briefly described above will be rendered by reference to specific embodiments thereof which are illustrated in the appended drawings. Understanding that these drawings depict only exemplary embodiments of the disclosure and are not therefore to be considered to be limiting of its scope, the principles herein are described and explained with additional specificity and detail through the use of the accompanying drawings in which:



FIG. 1 illustrates an example system embodiment;



FIG. 2 illustrates an exemplary computing environment;



FIGS. 3a and 3b illustrate flow diagrams describing exemplary processes for locating a mobile device;



FIG. 4 illustrates a flow diagram describing an exemplary process for locating a mobile device and updating the location information;



FIG. 5 illustrates a flow diagram describing an exemplary process for sending an invitation to a mobile device user to share location information;



FIGS. 6-12 illustrate exemplary user interfaces depicting how a user may locate friends;



FIGS. 13-15 illustrate exemplary user interfaces depicting how a user may send to friends invitations to be located;



FIGS. 16-17 illustrate exemplary user interfaces depicting how a user may receive and respond to an invitation to be located;



FIGS. 18-20 illustrate exemplary user interfaces depicting how a user may change his or her location information; and



FIGS. 21-24 illustrate exemplary user interfaces depicting how an invitation to share location information until an expiration time may be configured and displayed.





DETAILED DESCRIPTION

Various embodiments of the disclosure are discussed in detail below. While specific implementations are discussed, it should be understood that this is done for illustration purposes only. A person skilled in the relevant art will recognize that other components and configurations may be used without departing from the spirit and scope of the disclosure.


The present disclosure addresses the need in the art for a mechanism for transmitting location information of a user's mobile device and locating friends and family members through their respective mobile devices. A system, method and non-transitory computer-readable media are disclosed which locate a mobile device by sending a command to the device to determine its present location and report it back to the requester. A brief introductory description of a basic general purpose system or computing device in FIG. 1 which can be employed to practice the concepts is disclosed herein. A more detailed description of the methods and systems will then follow.


With reference to FIG. 1, an exemplary system 100 includes a general-purpose computing device 100, including a processing unit (CPU or processor) 120 and a system bus 110 that couples various system components including the system memory 130 such as read only memory (ROM) 140 and random access memory (RAM) 150 to the processor 120. The system 100 can include a cache 122 of high speed memory connected directly with, in close proximity to, or integrated as part of the processor 120. The system 100 copies data from the memory 130 and/or the storage device 160 to the cache 122 for quick access by the processor 120. In this way, the cache 122 provides a performance boost that avoids processor 120 delays while waiting for data. These and other modules can control or be configured to control the processor 120 to perform various actions. Other system memory 130 may be available for use as well. The memory 130 can include multiple different types of memory with different performance characteristics. It can be appreciated that the disclosure may operate on a computing device 100 with more than one processor 120 or on a group or cluster of computing devices networked together to provide greater processing capability. The processor 120 can include any general purpose processor and a hardware module or software module, such as module 1 (162), module 2 (164), and module 3 (166) stored in storage device 160, configured to control the processor 120 as well as a special-purpose processor where software instructions are incorporated into the actual processor design. The processor 120 may essentially be a completely self-contained computing system, containing multiple cores or processors, a bus, memory controller, cache, etc. A multi-core processor may be symmetric or asymmetric.


The system bus 110 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. A basic input/output system (BIOS) stored in ROM 140 or the like, may provide the basic routine that helps to transfer information between elements within the computing device 100, such as during start-up. The computing device 100 further includes storage devices 160 such as a hard disk drive, a magnetic disk drive, an optical disk drive, tape drive or the like. The storage device 160 can include software modules 162, 164, 166 for controlling the processor 120. Other hardware or software modules are contemplated. The storage device 160 is connected to the system bus 110 by a drive interface. The drives and the associated computer readable storage media provide nonvolatile storage of computer readable instructions, data structures, program modules and other data for the computing device 100. In one aspect, a hardware module that performs a particular function includes the software component stored in a non-transitory computer-readable medium in connection with the necessary hardware components, such as the processor 120, bus 110, output device 170, and so forth, to carry out the function. The basic components are known to those of skill in the art and appropriate variations are contemplated depending on the type of device, such as whether the device 100 is a small, handheld computing device, a desktop computer, or a computer server.


Although the exemplary embodiment described herein employs a storage device 160, it should be appreciated by those skilled in the art that other types of computer readable media which can store data that are accessible by a computer, such as magnetic cassettes, flash memory cards, digital versatile disks, cartridges, random access memories (RAMs) 150, read only memory (ROM) 140, a cable or wireless signal containing a bit stream and the like, may also be used in the exemplary operating environment. Non-transitory computer-readable storage media expressly exclude media such as energy, carrier signals, electromagnetic waves, and signals per se.


To enable user interaction with the computing device 100, an input device 190 represents any number of input mechanisms, such as a microphone for speech, a touch-sensitive screen for gesture or graphical input, keyboard, mouse, motion input, speech and so forth. An output device 170 can also be one or more of a number of output mechanisms known to those of skill in the art. In some instances, multimodal systems enable a user to provide multiple types of input to communicate with the computing device 100. The communications interface 180 generally governs and manages the user input and system output. There is no restriction on operating on any particular hardware arrangement and therefore the basic features here may easily be substituted for improved hardware or firmware arrangements as they are developed.


For clarity of explanation, the illustrative system embodiment is presented as including individual functional blocks including functional blocks labeled as a “processor” or processor 120. The functions these blocks represent may be provided through the use of either shared or dedicated hardware, including, but not limited to, hardware capable of executing software and hardware, such as a processor 120, that is purpose-built to operate as an equivalent to software executing on a general purpose processor. For example the functions of one or more processors presented in FIG. 1 may be provided by a single shared processor or multiple processors. (Use of the term “processor” should not be construed to refer exclusively to hardware capable of executing software.) Illustrative embodiments may include microprocessor and/or digital signal processor (DSP) hardware, read-only memory (ROM) 140 for storing software performing the operations discussed below, and random access memory (RAM) 150 for storing results. Very large scale integration (VLSI) hardware embodiments, as well as custom VLSI circuitry in combination with a general purpose DSP circuit, may also be provided.


The logical operations of the various embodiments are implemented as: (1) a sequence of computer implemented steps, operations, or procedures running on a programmable circuit within a general use computer, (2) a sequence of computer implemented steps, operations, or procedures running on a specific-use programmable circuit; and/or (3) interconnected machine modules or program engines within the programmable circuits. The system 100 shown in FIG. 1 can practice all or part of the recited methods, can be a part of the recited systems, and/or can operate according to instructions in the recited non-transitory computer-readable storage media. Such logical operations can be implemented as modules configured to control the processor 120 to perform particular functions according to the programming of the module. For example, FIG. 1 illustrates three modules Mod 1 (162), Mod 2 (164), and Mod 3 (166) which are modules configured to control the processor 120. These modules may be stored on the storage device 160 and loaded into RAM 150 or memory 130 at runtime or may be stored as would be known in the art in other computer-readable memory locations.


Having disclosed some components of a computing system, the disclosure now turns to FIG. 2, which illustrates a general purpose mobile computing environment 200. A communication network 210 connects the devices and applications hosted in the computing environment 200. In this computing environment 200, different devices may communicate with and send commands to each other in various ways. The server 230, for example, may function as an intermediary between two or more user devices such as computer 220, mobile device 240, and mobile device 245. The server 230 may pass messages sent from one user device to another. For example, the server 230 may receive a request from device 240 (the “requesting device”) to locate another device 245 (the “requested device”). In response to such a request (preferably after appropriate authentication and authorization steps have been taken to ensure the request is authorized by the user of the requested device), the server 230 may send a request to the requested device 245 and receive a response containing information relating to its location. The requested device 245 may have obtained this location information based on signals it received from, for example, GPS satellites 260. Having received a response, the server 230 may then send the information to the requesting device 240. Alternatively, the server 230 does not send a request to the requested device 245 because it has recent location information relating to the requested device 245 cached. In such an embodiment, the server 230 may respond to a request by sending cached location information to the requesting device 240 without communicating with the requested device 245.


The devices 220, 240, and 245 preferably have one or more location aware applications that may run on them. Of these applications, some may have the functionality to send requests to other user devices to enable a requesting user to locate a friend's device. Upon receiving authorization to locate, a requesting device may then be able to send location requests to requested devices and receive responses containing the location of the requested device. Authorization is preferably managed at the server level, but may also be managed at the device level in addition or as an alternative.


Referring back to FIG. 2, the communication network 210 can be any type of network, including a local area network (“LAN”), such as an intranet, a wide area network (“WAN”), such as the internet, or any combination thereof. Further, the communication network 210 can be a public network, a private network, or a combination thereof. The communication network can also be implemented using any type or types of physical media, including wired communication paths and wireless communication paths associated with one or more service providers. Additionally, the communication network 210 can be configured to support the transmission of messages formatted using a variety of protocols.


A device such as a user station 220 may also be configured to operate in the computing environment 200. The user station 220 can be any general purpose computing device that can be configured to communicate with a web-enabled application, such as through a web browser. For example, the user station 220 can be a personal computing device such as a desktop or workstation, or a portable computing device, such as a laptop, a smart phone, or a post-pc device. The user station 220 can include some or all of the features, components, and peripherals of computing device 100 of FIG. 1.


User station 220 can further include a network connection to the communication network 210. The network connection can be implemented through a wired or wireless interface, and can support bi-directional communication between the user station 220 and one or more other computing devices over the communication network 210. Also, the user station 220 may include an interface application, such as a web browser or custom application, for communicating with a web-enabled application.


An application server 230 can also be configured to operate in the computing environment 200. The application server 230 can be any computing device that can be configured to host one or more applications. For example, the application server 230 can be a server, a workstation, or a personal computer. In some implementations, the application server 230 can be configured as a collection of computing devices, e.g., servers, sited in one or more locations. The application server 230 can include some or all of the features, components, and peripherals of computing device 100 of FIG. 1.


The application server 230 can also include a network connection to the communication network 210. The network connection can be implemented through a wired or wireless interface, and can support bi-directional communication between the application server 230 and one or more other computing devices over the communication network 210. Further, the application server 230 can be configured to host one or more applications. For example, the application server 230 can be configured to host a remote management application that facilitates communication with one or more mobile devices connected with the network 210. The mobile devices 240, 245 and the application server 230 can operate within a remote management framework to execute remote management functions. The application server 230 can be configured to host a notification service application configured to support bi-directional communication over the network 210 between multiple communication devices included in the computing system 200. For example, the notification service application can permit a variety of messages to be transmitted and received by multiple computing devices.


In some implementations, the notification service can include a defined namespace, in which a unique command collection topic can be created for each subscribing mobile device. A unique identifier can be used to associate a subscribing mobile device with the corresponding command collection topic, such as an assigned number or address. The unique identifier also can be embedded in a Uniform Resource Identifier (URI) that is associated with a subscribed command collection topic. Further, one or more command nodes can be created below a command collection topic, such that each command node corresponds to a particular remote command type. For example, a command collection topic can include a separate command node for a location command.


Through the use of separate command nodes, multiple commands can be transmitted to one or more mobile devices substantially simultaneously. In some implementations, if multiple commands are received in a command collection topic, server time stamps can be compared to determine an order of execution.


Through the notification service, a publisher, such as a remote management application, can publish a remote command message to a command collection topic that is associated with a particular mobile device. When a remote command message is published to the command collection topic, a notification message can be transmitted to the one or more subscribing mobile devices. The mobile device can then access the subscribed topic and retrieve one or more published messages. This communication between the publisher and the mobile device can be decoupled. Further, the remote command message can be published to the appropriate command node of the command collection topic. Additionally, a mobile device receiving a remote command message can publish a response to a result topic hosted by a notification service. A publisher such as a remote management application, can subscribe to the result topic and can receive any published response messages.
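As a rough illustration of this publish/subscribe arrangement, the sketch below models command collection topics keyed by a device identifier, one node per command type, server time stamps for ordering, and a result topic for responses. All class and method names are assumptions made for illustration; the patent does not specify this API.

```python
# Hedged sketch of the command-collection-topic idea: each subscribing device
# gets a topic addressed by a unique identifier embedded in a URI, with one
# command node per command type and a separate result topic for responses.
from collections import defaultdict
from datetime import datetime

class NotificationService:
    def __init__(self):
        # topic URI -> command node -> list of (server_timestamp, payload)
        self.topics = defaultdict(lambda: defaultdict(list))
        self.results = defaultdict(list)   # result topics for device responses

    def topic_uri(self, device_id: str) -> str:
        return f"/commands/{device_id}"    # unique identifier embedded in a URI

    def publish_command(self, device_id: str, command_type: str, payload: dict):
        node = self.topics[self.topic_uri(device_id)][command_type]
        node.append((datetime.utcnow(), payload))          # server time stamp

    def retrieve_commands(self, device_id: str):
        """Device pulls pending commands; server time stamps give the order."""
        topic = self.topics[self.topic_uri(device_id)]
        pending = [(ts, ctype, p) for ctype, msgs in topic.items() for ts, p in msgs]
        topic.clear()
        return sorted(pending, key=lambda item: item[0])    # oldest command first

    def publish_result(self, device_id: str, response: dict):
        # a device publishes its response to the result topic, which the
        # remote management application can subscribe to
        self.results[self.topic_uri(device_id)].append(response)
```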


Further, the computing environment 200 can include one or more mobile devices, such as mobile device 240 and mobile device 245. These mobile devices are preferably smart phones such as an Apple iPhone® or post-pc device such as an Apple iPad®. Each of the mobile devices included in the computing environment 200 can include a network interface configured to establish a connection to the communication network 210. For example, mobile device 240 can establish a cellular (e.g., GSM, EDGE, 3G, or 4G) network connection that provides data access to the communication network 210. Such a connection may be facilitated by one or more cellular towers 250 located within the range of the mobile devices 240 and 245 and connected to the network 210. Further, mobile device 245 can establish an IEEE 802.11 (i.e., WiFi or WLAN) network connection to the communication network 210. Such a connection may be facilitated by one or more wireless network routers 255 located within the range of the mobile devices 240 and 245 and connected to the network 210. Also, either one of these mobile devices 240, 245 or an additional device may connect to the network 210 through the IEEE 802.16 (i.e., wireless broadband or WiBB) standard. Again, the devices 240, 245 may employ the assistance of a cell tower 250 or wireless router 255 to connect to the communication network 210.


Each of the mobile devices, 240 and 245 also can be configured to communicate with the notification service application hosted by the application server 230 to publish and receive messages. Further, each of the mobile devices 240 and 245 can be configured to execute a remote management application or a remote management function responsive to a remote command received through the notification service application. In some embodiments, the remote management application can be integrated with the operating system of the mobile device.


A mobile device can execute a remote command to perform one or more associated functions. For example the remote commands can include locate commands, notification commands, and message commands. A message command can be used to present a text-based message on the display of a mobile device. A locate command can be used to cause a mobile device to transmit a message indicating its location at the time the locate command is executed. The locate command may also command the mobile device to use certain resources, such as an embedded GPS system, to determine its location.


Additionally, each of the mobile devices 240 and 245 can include an input interface, through which one or more inputs can be received. For example, the input interface can include one or more of a keyboard, a mouse, a joystick, a trackball, a touch pad, a keypad, a touch screen, a scroll wheel, general and special purpose buttons, a stylus, a video camera, and a microphone. Each of the mobile devices 240 and 245 can also include an output interface through which output can be presented, including one or more displays, one or more speakers, and a haptic interface. Further, a location interface, such as a Global Positioning System (GPS) processor, also can be included in one or more of the mobile devices 240 and 245 to receive and process signals sent from GPS satellites 260 for obtaining location information, e.g., an indication of current location. In some implementations, general or special purpose processors included in one or more of the mobile devices 240 and 245 can be configured to perform location estimation, such as through base station triangulation or through recognizing stationary geographic objects through a video interface.


Having disclosed some basic system components and concepts, the disclosure now turns to exemplary method embodiments 300a and 300b shown in FIGS. 3a and 3b respectively. For the sake of clarity, the methods are discussed in terms of an exemplary system 100 as shown in FIG. 1 configured to practice the methods and operating environment shown in FIG. 2. The steps outlined herein are exemplary and can be implemented in any combination thereof, including combinations that exclude, add, or modify certain steps.



FIG. 3a shows a flow diagram illustrating an exemplary process executed by a server for servicing a request by a requesting device to locate one or more mobile devices (requested devices), such as mobile devices 240 and 245 in FIG. 2, connected to a communication network, such as communication network 210 in FIG. 2. The process may be performed by a server such as application server 230 in FIG. 2.


In a preferred embodiment, the server 230 may maintain data associated with the members of one or more services. The maintained data may include certain identification information relating to each member such as, for example, the member's username and other personal identification information, unique identification information relating to the member's phone, and the identification of other members that have chosen to give permission to share their location information with this member. The information may also include recent location information of each member. This location information may be caused to be updated by certain applications/processes on the member's mobile device and/or at the request of a requesting device. For example, an application on a mobile device such as a mapping service or other location aware application may be requested by the user to determine the location of the device and, whenever such a determination is made, the device may provide this information to the application server. The server may then retain this information in storage 335a for a length of time that has been deemed to still be representative of that device's location (such as, for example, 15 minutes or less).
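A hypothetical data model for these member records and the cached location kept in storage 335a might look like the following sketch; the field names and the 15-minute "time of life" are illustrative assumptions, not taken from the patent.

```python
# Illustrative member record: who may see this member's location, plus the
# most recent cached fix and when it was obtained.
from dataclasses import dataclass, field
from datetime import datetime, timedelta
from typing import Optional, Set, Tuple

LOCATION_TIME_OF_LIFE = timedelta(minutes=15)   # example "time of life"

@dataclass
class MemberRecord:
    username: str
    device_id: str
    shares_location_with: Set[str] = field(default_factory=set)  # authorized friends
    last_location: Optional[Tuple[float, float, float]] = None   # (lat, lon, error_m)
    last_located_at: Optional[datetime] = None

    def update_location(self, lat: float, lon: float, error_m: float) -> None:
        # called when the device (or another location-aware app) reports a fix
        self.last_location = (lat, lon, error_m)
        self.last_located_at = datetime.utcnow()

    def location_is_recent(self) -> bool:
        if self.last_located_at is None:
            return False
        return datetime.utcnow() - self.last_located_at <= LOCATION_TIME_OF_LIFE
```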


In a preferred embodiment, a user/requester may have an application on his or her computer or mobile device that, when executed, initiates one or more locate requests to all of the devices whose members have agreed to share their location with the requester (the requester's “friends”). In such embodiments, the application may initially present to the user/requester the location of all of the friends on a map or in a list. The locate request 310a may be received by a server such as application server 230 in FIG. 2 for processing.


Upon receiving a location request from a mobile device 301a of a requesting user, the server may initially respond with the location data that it has cached in 335a. As mentioned above, in a preferred embodiment, the application server may maintain and/or cache information relating to members of services, including recent location information. Updates in location information preferably overwrite older location information. Thus, the server may first, in step 315a, determine whether it is in possession of recent location information. As mentioned before, the server may have a set “time of life” for the location information it maintains. When it has decided that the location information it has is recent, in step 330a, the server retrieves the last known location from storage 335a. Again, in some instances, such as when a person may be on the go, only very recent location information would be relevant. Thus, in some embodiments, the time of life of the information may be adjusted based on the device's recent location activity. One example is when the owner of the device has designated his or her location, such as home or work, where he or she typically remains for several hours at a time each day. Thus, if the server determines that it is in possession of location information of the requested mobile device deemed to be recent, it will provide that information to the requesting device in step 360a.


The server also preferably maintains this location information at a relatively low level of accuracy. The reason for this is similar to why the location is only deemed relevant for a short period of time: the more accurate the location information is, the more likely the person has since moved from that specific location, thereby rendering the location incorrect. Thus, maintaining recent location information at a lower level of accuracy increases the likelihood that the location is still correct, thereby avoiding additional communication with the user device.


Alternatively, the server may determine, in step 315a, that it does not have recent location information relating to the requested device. The server may, in step 320a, send a location request to the one or more requested devices (i.e., those devices associated with the friends). In this step, the server transmits a location request message to each requested device. The message sent by the server may take on any number of forms but has the effect of commanding the requested mobile device to obtain its current location information and transmit it back to the server in the form of a response message. In some alternative embodiments, the server only sends a location request message to the cellular network system, which may continually maintain recent location information relating to the requested device. Such location information may include, for example, the coordinates of the cell sites deemed closest to the requested device.
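Putting steps 315a, 320a, 330a, and 360a together, a server-side handler might look roughly like the sketch below, reusing the hypothetical MemberRecord above. Here storage, send_locate_command, and reply stand in for infrastructure the patent leaves unspecified.

```python
# Minimal sketch of the FIG. 3a flow: serve a recent cached location if one
# exists, otherwise forward a locate command to the requested device.
def handle_locate_request(requester_id, friend_id, storage,
                          send_locate_command, reply):
    record = storage.get(friend_id)                      # storage 335a
    if record is None or requester_id not in record.shares_location_with:
        reply(requester_id, {"friend": friend_id, "status": "not authorized"})
        return
    if record.location_is_recent():                      # step 315a
        reply(requester_id, {"friend": friend_id,        # steps 330a / 360a
                             "location": record.last_location,
                             "located_at": record.last_located_at})
    else:
        # step 320a; the device's response is handled asynchronously
        # (steps 340a-360a), updating storage 335a and then the requester
        send_locate_command(record.device_id)
```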


Some time after sending the request in step 320a, the server receives responses in step 340a. Depending on, for example, the location of the requested devices and the network traffic, the responses may arrive in any order and may take different amounts of time. The response messages from the devices preferably include information relating to the location of the responding device and the time at which the location was determined.


This location information may be determined by the device in any number of ways, including but not limited to those that have been discussed above. This information may even be obtained indirectly (i.e., not directly from the requested device), such as from the cellular communications network with which the device is communicating, for example by obtaining location information from the cell tower identified as being closest to the mobile device. Although this option may be of lower accuracy, it oftentimes may result in a quicker response and a savings in battery life for the requested device. Accordingly, the level of accuracy of the location information may vary, and the location information may therefore include accuracy information as well.


In some embodiments, the owner of the responding device may have the option to enter unique location identifiers or labels associated with a location. For example, a user may assign labels such as “home,” “work,” or “school” to such locations. The user's mobile device may preferably associate certain geographic coordinates with such a label and transmit location-based messages to the server including the associated label.
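One plausible way a device could map coordinates onto an owner-assigned label is sketched below; the label set, the 200 m radius, and the flat-earth distance approximation are all illustrative assumptions rather than anything prescribed by the patent.

```python
# Sketch: the owner associates labels with coordinates, and the device
# reports the label when its current fix falls within range of one.
import math

LABELS = {
    "home": (37.4419, -122.1430),   # made-up coordinates
    "work": (37.3318, -122.0312),
}

def label_for(lat, lon, radius_m=200):
    """Return the owner's label for this position, or None if none applies."""
    for name, (llat, llon) in LABELS.items():
        # crude equirectangular distance; adequate over a few hundred meters
        dx = (lon - llon) * 111_320 * math.cos(math.radians(lat))
        dy = (lat - llat) * 111_320
        if math.hypot(dx, dy) <= radius_m:
            return name
    return None
```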


Upon receiving this information, in step 350a, the server preferably updates the stored information 335a, if any, that it maintains relating to the device's last known location so that it may be made available to the next requester.


Having received a response from a requested device, in step 360a, the server may then send location information to the requesting device. This step may be performed for each response received by the server from the various requested devices. Although location information relating to some devices may have already been retrieved from cache 335a in step 330a, the server may additionally request and send updated information to the requesting device. In some embodiments, the server may additionally have a step (not shown) to compare the “known location information” that it had initially sent to the requesting device with the location information that it just received from the requested device to determine if sending the recently received location information would be any different. In other words, some embodiments would only send location information to the requesting device if the location of the requested device has changed. In such embodiments, a reduction in the amount of data that needs to be communicated may be realized.


In addition to temporal accuracy, the server may also have logic to determine how to handle a location request having a certain geographic location accuracy. FIG. 3b shows a flow diagram illustrating an exemplary process 300b executed by a server for servicing a request by a requesting device to locate one or more mobile devices within a certain level of accuracy.


In a preferred embodiment, in step 310b, the server receives a request to acquire location information relating to a requested device at a certain acceptable level of accuracy (accuracy y). In a preferred embodiment, the server typically only maintains, in storage 335b, location information relating to devices at one level of accuracy (accuracy x). After receiving the request, in step 315b, the server determines whether the accuracy of the location information it has in storage 335b is greater than or equal to the accuracy requested by the requesting device (i.e., accuracy x≥accuracy y). If so, the level of accuracy is deemed acceptable and, in step 330b, the server retrieves the stored location information and, in step 360b, sends the location information to the requesting device.


More typically, however, when the server receives a request for location information of a requested device, the requested accuracy (accuracy y) is greater than the accuracy of the information stored in 335b (accuracy x) (i.e., accuracy y>accuracy x). When this is determined in step 315b, the server sends a request to the requested device in step 320b. This request may be in several different forms. For example, the server may simply transmit the contents of the request to the requested device, containing the requested accuracy information, and leave it to the requested device (through its hardware, operating system, and applications) to determine how to respond to the request. Alternatively, the server may have sufficient information relating to the capabilities of the requested device (such as it having a GPS antenna of a certain accuracy) and the message sent is simply a command to determine its location using its GPS antenna and send this information to the server. The server, in step 340b, then receives the location information from the requested device. Again, this information may be in several different forms and may depend on the device information known by the server. For example, the response may include accuracy information provided by the requested device or may simply include the location and the means by which it was obtained. In the latter form, the server, preferably knowing the model features of the requested device, may then determine the accuracy provided by the requested device. Also, depending on the request sent by the server, the means information may not be provided in the response but may be inferred by the server to be the same as what was requested. Once the location information is received by the server, in step 350b, it updates its stored location information, 335b, and sends location information to the requesting device in step 360b.
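The accuracy comparison in steps 315b-360b can be sketched as follows, with accuracy expressed as an error radius in meters, so that a stored error radius no larger than the requested one corresponds to "accuracy x ≥ accuracy y". The function and parameter names are placeholders, not the patent's terminology.

```python
# Sketch of the FIG. 3b decision: answer from storage 335b when the stored
# fix is accurate enough, otherwise command the device for a better fix.
def handle_accuracy_request(friend_id, requested_error_m, storage,
                            send_deep_locate, reply_to_requester):
    record = storage.get(friend_id)              # assumed to exist and be authorized
    stored = record.last_location                # (lat, lon, error_m) or None
    if stored is not None and stored[2] <= requested_error_m:    # step 315b
        reply_to_requester(stored)                               # steps 330b / 360b
    else:
        # step 320b: ask the device for a fix within the requested error bound;
        # its response updates storage (step 350b) and is then forwarded to the
        # requester (step 360b).
        send_deep_locate(record.device_id, requested_error_m)
```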


Generally, the location information that is handled is of a low accuracy, such as at a city level or within a few miles of accuracy. As mentioned above, such information may be obtained by the server indirectly by, for example, knowing the geographic location of the cell phone tower or ISP to which the requested device is communicating. It is generally understood that mobile phones communicating with a cellular communications network periodically seek out cell sites having the strongest signal. In many cases, the strongest signals are measured by those cells that are the shortest distance away. Thus, in an area where there is a cell-phone tower every 4 miles, for example, the location of the mobile device may be extrapolated to be within 2 miles of the closest cell tower. A more accurate method of determining the location of a mobile device may be by determining the time difference of arrival (TDOA). The TDOA technique works based on trilateration by measuring the time of arrival of a mobile station radio signal at three or more separate cell sites. Such a method may be based on the availability of certain equipment supplied by the cellular network which may not be universally available and is therefore only an alternative embodiment. In either case, the location/accuracy determination may be performed by the communications network rather than by the mobile device. Such low accuracy information may preferably be transmitted by the server to the requesting device initially to give the device user a quick read on where his or her friends are located. The actions associated with obtaining such low accuracy information are herein referred to as a “shallow locate.”
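A "shallow locate" of this kind amounts to looking up the coordinates of the serving cell site and reporting half the typical tower spacing as the uncertainty, roughly as sketched below. The cell IDs, coordinates, and 4-mile spacing are made up for illustration.

```python
# Rough illustration of a shallow locate: approximate the device's position
# by its serving cell site and report a coarse error radius.
CELL_SITES = {
    "cell_417": (37.4440, -122.1610),   # illustrative coordinates only
    "cell_982": (37.4850, -122.2281),
}

def shallow_locate(serving_cell_id, tower_spacing_miles=4.0):
    """Approximate a device's position by the cell site it is camped on."""
    lat, lon = CELL_SITES[serving_cell_id]
    # with towers roughly every 4 miles, the device is assumed to be within
    # about half that spacing of the serving site
    return {"lat": lat, "lon": lon, "accuracy_miles": tower_spacing_miles / 2}
```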


Such low accuracy (i.e., less accuracy) location requests are only approximations but are preferably made initially, as they may result in the fastest response and require fewer resources from the requested device. On the other hand, a “deep locate request” may be requested by a user of a requesting device to obtain location information of a relatively higher accuracy (i.e., more accurate) from the requested device. For example, a “deep locate request” may command the requested device to use its GPS location resources to obtain location information having a level of accuracy that may be greater than that of some of the other location methods discussed above. While using a device feature such as GPS may be more accurate, the time and resources required to obtain signals from a sufficient number of GPS satellites and calculate the location oftentimes may take longer and require more energy. Thus, the “deep locate request” option is preferably reserved for specific requests made by the user of the requesting device.


This concept of a “shallow locate request” and a “deep locate request” is further illustrated from the perspective of the requesting device, such as a mobile device, in exemplary method 400 of FIG. 4. In a preferred embodiment, method 400 begins with step 410 when the application is started on a mobile device. Initially, in step 420, the device may request location information of all friends that are associated with the user. This initial request is preferably a “shallow locate request” that is sent out to all of the “friend” devices (i.e., devices whose owners have allowed the requester to obtain location information). This request is sent to the server where it may be passed on to the requested device or serviced by the server, or both, as discussed above. The requesting device may then, in step 430, receive responses containing the shallow locations of its user's friends. As the responses are received, the requesting device may display the locations of the friends to the user in step 440.


As individuals are often on the go, it is of value to the requesting user to have the location information of friends updated from time to time. The updating or refreshing of location information, performed in step 450, may be done automatically at predetermined intervals, such as every 15 seconds or 15 minutes, and/or may be done at the request of the user. These predetermined timing intervals may be applied consistently to every user or may be applied to each user individually based on the user's observed movement frequency in combination with heuristics drawn from observed general user-movement data (e.g., a shorter time interval for a user observed to be traveling on a highway but a longer time interval for a user who has “checked in” to a location such as a restaurant). As is shown in method 400, a refresh step 450 will operate to repeat a request for shallow location information of all of the user's friends.
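An adaptive refresh policy along these lines could be as simple as the sketch below; the speed threshold and interval values are assumptions chosen to match the highway/restaurant example, not values from the patent.

```python
# Sketch: pick a refresh interval based on how a friend appears to be moving.
from datetime import timedelta

def refresh_interval(speed_m_s=None, checked_in=False):
    if checked_in:
        return timedelta(minutes=15)      # e.g., checked in at a restaurant
    if speed_m_s is not None and speed_m_s > 20:
        return timedelta(seconds=15)      # e.g., traveling on a highway
    return timedelta(minutes=5)           # default middle ground
```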


In addition to requesting and obtaining shallow location information of all of the user's friends, the user may request and obtain more detailed or “deep” location information of one or more friends, beginning in step 460. To perform a “deep locate request,” in a preferred embodiment the user may select a friend that has been presented to the user after a shallow locate request. In this preferred embodiment, a deep locate request is sent to the server which will send a command to the requested device to provide more detailed location information. This request may include commanding the device to obtain accurate location information from its GPS system. Upon the receipt of the response in step 470, the requesting device may display the deep location of the friend to the user in step 480. The accuracy of the deep location may also be displayed to the requesting user.
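From the requesting device's side, the shallow/deep split of FIG. 4 might look roughly like the following; `server` stands in for whatever client API talks to the application server, which the patent does not define, and the method names are hypothetical.

```python
# Client-side sketch: coarse locations for everyone first, then a precise
# fix for one selected friend on demand.
def show_friends(server, user_id):
    coarse = server.shallow_locate_all(user_id)       # steps 420-430
    for friend, location in coarse.items():           # step 440
        print(friend, location)
    return coarse

def drill_into(server, user_id, friend_id):
    precise = server.deep_locate(user_id, friend_id)  # steps 460-470
    print(friend_id, precise["location"],
          "accuracy:", precise["accuracy_m"], "m")    # step 480
    return precise
```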


One way a user may gain authorization to obtain location information of a device associated with a friend is shown by method 500 in FIG. 5. In most embodiments, in order for a user to be able to locate a friend, that user must send an authorization request to the friend. A user may do this by, in step 510, selecting a friend to request authorization. In a preferred embodiment, the locating application may refer to or rely upon other applications on the user's device to maintain information of the user's friends. One example may be an address book application that stores contact information of persons known by the device user. These persons may include friends, family members, business contacts, and others whose contact information the user has obtained. In the case where the particular person is not in the user's address book, the user may be able to enter that person's contact information directly into the running application. Upon selecting/entering a contact to locate, in step 520, an authorization request is prepared and sent from the user's device.


Upon receiving a request from a user, the requested person (i.e., “friend”) is preferably presented with a message explaining the nature of the request and where he or she may either accept the request or reject the request. When the friend accepts the request in step 530, an acceptance response is sent from that friend's device in step 540. Upon receiving an accepting response, the server may update the information it maintains on either or both the requesting user and accepting friend such that when the user sends a location request, the server will process that request in step 550. In addition, a notice may be sent by the server back to the requesting user to indicate to the user and/or the user's device that the authorization request has been accepted. Accordingly, the user may now obtain location information relating to that friend. In a preferred embodiment, the friend may revoke the authorization given to the user at any time; thus, the friend maintains control over the privacy of his or her location information.
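The authorization bookkeeping of FIG. 5, including the friend's ability to revoke consent at any time, can be sketched as a small table of invitation states. The class and state names below are assumptions for illustration only.

```python
# Sketch: track invitation state per (requester, friend) pair; only an
# accepted invitation allows locate requests to be processed.
class AuthorizationTable:
    def __init__(self):
        self._status = {}   # (requester, friend) -> "pending" | "accepted" | "rejected"

    def invite(self, requester, friend):
        self._status[(requester, friend)] = "pending"     # steps 510-520

    def accept(self, requester, friend):
        self._status[(requester, friend)] = "accepted"    # steps 530-540

    def reject(self, requester, friend):
        self._status[(requester, friend)] = "rejected"    # step 560

    def revoke(self, requester, friend):
        self._status.pop((requester, friend), None)       # friend withdraws consent

    def may_locate(self, requester, friend) -> bool:
        return self._status.get((requester, friend)) == "accepted"   # steps 550 / 570
```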


On the other hand, when a friend who has received a request to authorize the user to locate him or her has rejected or ignored the request in step 560, the user may not be able to obtain location information relating to that friend. Thus, if the user subsequently attempts to locate that friend, in step 570, neither the device nor the server will process that request. From the requesting user and device perspective, such a friend would be displayed as having a status of “awaiting a response” or “location not available,” or simply would not be listed. Of course, in some embodiments, the user may be able to send another request to the friend subsequently.



FIGS. 6-20 show a series of “screen shots” of preferred embodiments of the present disclosure as they might be viewed on a mobile device, such as the iPhone® or iPad®, both by Apple, Inc. One skilled in the art will appreciate that while the preferred embodiments are shown on these particular Apple products, the location application may be employed on any type of mobile device, smart phone, post-pc device, laptop computer, or desktop computer.



FIG. 6 illustrates an interface window 600 that is presented to a user when he or she initially runs the location program. In this window, the user may be prompted to enter his or her user ID 610 and password 620 associated with an account that the user has presumably already established with the location service. After entering a user ID and password, the user may select the “sign in” button 630 to authenticate and run the program. If the user has not yet created an account, the user may do so by selecting button 640.


As shown in FIG. 7, when the user logs in for the first time, he or she may be presented with a screen 700 prompting him or her to invite friends to share their location. To invite a friend to share his or her location, a user may tap on the “+” button 710 to open a screen to select friends to invite. A more detailed explanation of these actions is in the discussion associated with FIGS. 11-16 below.


On the other hand, FIG. 8 shows what a user may likely immediately see when logging in after having several friends accept the user's invitation to share their location. As shown in FIG. 8, a list of friends 800 is displayed to the user. Next to a displayed friend's information 810 is a locating status indicator 820. In this case, the status is that the device has sent out location requests to all of the friends' devices and is still waiting for responses from each of the devices.


After a brief time has elapsed and the device has received location information relating to the user's friends, the location information may be presented to the user in display interface 900, as shown in FIG. 9. As can be seen in FIG. 9, the friend information 910 may now include the location of the friend 920, the accuracy of the location information 930, and the time at which the location information was obtained 940. The location 920 may be presented in a number of ways. For example, location information 920 includes a label that was selected by the user. Alternatively, the location information may include the name of the town or an address at which the friend is located, as in 950. Additionally, when a location request was not successful, the display 900 may present a message similar to that of 960.



FIG. 10 shows an alternative embodiment of displaying location information of friends. As is shown in FIG. 10, map interface 1000 is presented. In a preferred embodiment, the initial scale of map interface 1000 may be determined by the identified locations of each of the user's friends such that all of the user's friends may be viewed on one screen. Thus, if all of the user's friends are located within a few miles of each other, the scale of map interface 1000 may be zoomed in such that only a few miles (i.e., a city level) are presented. On the other hand, if the user's friends are located across the country or in other countries, the scale of the map may be zoomed out such that map interface 1000 is covering hundreds or even thousands of miles (i.e., a state level).


Referring again to FIG. 10, the user is presented with locations of his or her friends on map 1000. In a preferred embodiment, the locations of the friends are presented as dots 1010 and 1020. However, any other icon or other reasonable method of indicating the location of a person on an interactive map may be used. When the user selects one of the dots, information relating to the friend at that location appears, as is shown in dot 1010. Additionally, the accuracy information may also be graphically presented on the map in the form of a shaded circle surrounding the friend's dot with a radius equivalent to the level of accuracy provided, as is shown in dot 1010.



FIGS. 11 and 12 show alternative embodiments of the present invention. Such embodiments may be ideal for use on a device that has a larger screen, such as an iPad, laptop, or desktop computer. In FIG. 11, interface 1100 displays both a listing of the user's friends in a table format 1100 as well as their geographic location on a map 1120. In interface 1100, when a user selects one of his or her friends 1130 on the map 1120, details relating to the location of the friend may appear at the bottom of the map 1140. Similarly, in FIG. 12, which provides an interface in a different aspect ratio, interface 1200 presents to the user a map 1220 indicating the geographic locations of his or her friends 1225. Overlaying the map is a list of the user's friends in table 1210. Similar to interface 1100, when the user selects one of his or her friends within table 1210, details of that friend may be displayed at the bottom of the display 1240.


When a user wishes to send to a friend an invitation to share their location, “Add Friend” interface 1300, as shown in FIG. 13, may be used. In interface 1300, the user may enter the contact information of the friend/invitee at 1310 and may also include an optional personal message at 1320. As mentioned above, the contact information may be obtained from other services or applications located on the user's device, as is shown in contacts list 1400 in FIG. 14.



FIG. 15 shows a completed add friend request form 1500 with the name of the contact (preferably an embedded link to the contact's e-mail address, phone number, or other relevant contact information) entered at 1510. Also shown is a brief personal message 1520.



FIG. 16 shows one way a friend may be notified that he or she has received an invitation to share his or her location with the requesting user in window 1600. As presented to the friend in window 1600, the friend may either view the invitation immediately by selecting button 1610 or may choose to view the invitation at a later time by selecting button 1620. Note that this notification may preferably be in the form of a system-based message that provides notification regardless of any particular application currently running.


When the friend selects to view the invitation, he or she is presented with a request message 1700, as shown in FIG. 17. In request message 1700, the invitation preferably includes the name of the inviter 1710 and a brief personal message 1720. In addition, the invitation may include an accept button 1730 and a decline button 1740.


Referring now to FIG. 18, a mobile device user may maintain certain items associated with his or her account in interface 1800. In interface 1800, a user may, for example, set a label 1810 for his or her present location in field 1820. A user may also review the list of followers 1830, which includes all of the friends from whom he or she has accepted invitations to be followed. A user may additionally choose to hide from all of his or her followers by toggling switch 1840.


With respect to assigning labels to certain locations, interface 1900 of FIG. 19 may be presented to a user for this purpose. In interface 1900, a user may select one of the prepared labels 1910 or may add a custom label by entering text into field 1930. The current label in use is shown in field 1920. In addition to the prepared labels 1910 in interface 1900, additional location-specific label options may be automatically added, as is shown in interface 2000 in FIG. 20. As is shown in FIG. 20, location label 2010 has been added to the list of prepared location labels. A label such as label 2010 may be added when the user is determined to be located in the vicinity of a Starbucks, for example.


To further explain certain embodiments in this disclosure, the following use scenarios are presented to show how certain users of mobile devices may be able to use one or more embodiments in the disclosure to locate their friends.


One scenario may occur when a mobile device user is located somewhere, say downtown Palo Alto, at noon and wants to know if any of his friends are in the vicinity and are available for a quick lunch. The user may be able to use an embodiment in the present disclosure to see the location of his or her friends, identify one that is close by, and subsequently make contact.


A second scenario may arise when there is a need or desire by users of mobile devices to allow others to know where they are at certain times. One such situation is where a mobile device user may, for example, be training for a marathon and is outside running for miles each day. This user wishes to have her partner aware of her location during this period of time so that she can always be located in case something happens and may therefore benefit from embodiments in this disclosure. Also, when this person is actually participating in the marathon, her friends may want to know at what part of the course she has made it to so that they may be able to be present at certain locations during the race to cheer her on. In such a scenario, the user would benefit from embodiments of the disclosure having a map of the race course superimposed onto a street map of the area such that the users may be able to see the location of the runner and have some indication about the location where she will be heading to next.


A third scenario may arise when users of mobile devices wish to receive an indication that someone has reached a certain location. In such a scenario, one user of a mobile device may, for example, be embarking on a road trip and another person wants to be notified when he or she has arrived. Such a scenario might include a parent who is allowing her teenage son to take the family car on a holiday weekend to drive to visit his cousins who live several hours away. Although the parent has asked that the son call as soon as he arrives, he is often forgetful and does not do so. To overcome this, the parent or son may take advantage of an embodiment of the present disclosure in which they may set an alert to automatically notify the parent when the son has arrived at the destination. In the interim, the parent may additionally use other embodiments to manually locate the son's mobile device to make sure that he has not gotten lost.


A fourth scenario may arise when users of mobile devices wish to receive a notification when someone has entered a certain geographic location. For example, a person commutes to and from the city using public transportation but does not live within walking distance of the train or bus stop. Rather than driving and parking, the person may rely on a spouse or partner to pick her up in the evenings or whenever there is inclement weather. As certain buses and train cars have rules and courtesies prohibiting talking on cell phones, the commuter may have to wait to call her spouse or partner until after she arrives and may subsequently have to wait, for example, in the rain. The users would benefit from embodiments of the disclosure that allow the commuter's mobile device to notify her partner's device whenever she enters a certain geographic region (i.e., is close to arriving at the bus or train stop) without requiring the commuter to place a call. Thus, the commuter and her partner may both arrive at the stop at about the same time.


Similarly, a fifth scenario involves users having certain household appliances that are connected to a network and can perform certain tasks upon receiving a notification when a person enters a certain area. For example, when a person is traveling to her vacation home out in the mountains, certain appliances in the vacation home, such as the furnace and front porch light, may turn on when the person enters a certain geographic area (i.e., gets close to the home). An embodiment of this disclosure would enable a user to have and benefit from such a configuration.


A sixth scenario may arise when someone wishes to receive a notification when a mobile device user has left a certain geographic location. For example, a parent has asked his daughter to stay at home for the weekend to finish a school assignment that is due the following Monday. If the daughter leaves the neighborhood with her mobile device, the parent may be notified. Aspects of the disclosed technology would enable a parent to receive such notifications.
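

Scenarios three through six all reduce to detecting when a tracked device crosses the boundary of a geographic region and then notifying an interested party (or a networked appliance). The Python sketch below is a minimal illustration of such a check under assumed names (Geofence, update, approx_distance_m) and made-up coordinates; it emits an 'entered' or 'exited' event only when consecutive location fixes fall on opposite sides of the region boundary.

```python
import math

def approx_distance_m(lat1, lon1, lat2, lon2):
    """Equirectangular approximation; adequate over geofence-sized distances."""
    m_per_deg = 111_320.0
    dx = (lon2 - lon1) * m_per_deg * math.cos(math.radians((lat1 + lat2) / 2))
    dy = (lat2 - lat1) * m_per_deg
    return math.hypot(dx, dy)

class Geofence:
    """Circular region that reports boundary crossings (hypothetical sketch)."""
    def __init__(self, lat, lon, radius_m):
        self.lat, self.lon, self.radius_m = lat, lon, radius_m
        self.inside = None   # unknown until the first location fix

    def update(self, lat, lon):
        """Feed a new location fix; return 'entered', 'exited', or None."""
        now_inside = approx_distance_m(lat, lon, self.lat, self.lon) <= self.radius_m
        event = None
        if self.inside is not None and now_inside != self.inside:
            event = "entered" if now_inside else "exited"
        self.inside = now_inside
        return event

# Scenario three: alert the parent when the son's device reaches the destination.
fence = Geofence(34.0522, -118.2437, radius_m=500)          # made-up destination
for fix in [(34.1000, -118.2500), (34.0525, -118.2440)]:    # successive location fixes
    if fence.update(*fix) == "entered":
        print("Notify parent: arrived at destination")
# Scenario six (leaving a neighborhood) uses the 'exited' event in the same way.
```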


A seventh scenario may arise when some mobile device users wish to be locatable for only a brief period of time. For example, a person is on a business trip in a city and wants to be able to meet up for dinner with an old friend who lives in that city. Since she is not normally in that city and does not often interact with this old friend, she does not want the old friend to be able to locate her all the time. One embodiment of the disclosure employs a "day pass," which the person may send to the old friend to allow the old friend to locate her for the next 24 hours. After that time, the day pass expires and the old friend is no longer able to locate the person.
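

A day pass of this kind can be thought of as a permission record that names one friend and carries an expiration timestamp, with every locate request checked against both. The Python sketch below illustrates that idea with hypothetical names (DayPass, allows) and an assumed fixed 24-hour duration; it is not the implementation described in this disclosure.

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

class DayPass:
    """Temporary permission for one friend to locate the issuer (hypothetical sketch)."""
    def __init__(self, friend: str, duration: timedelta = timedelta(hours=24)):
        self.friend = friend
        self.expires_at = datetime.now(timezone.utc) + duration

    def allows(self, requester: str, at: Optional[datetime] = None) -> bool:
        """Only the named friend may locate the issuer, and only until expiration."""
        at = at or datetime.now(timezone.utc)
        return requester == self.friend and at < self.expires_at

day_pass = DayPass("Old Friend")
print(day_pass.allows("Old Friend"))                          # True within the 24 hours
print(day_pass.allows("Old Friend",
                      at=day_pass.expires_at + timedelta(minutes=1)))  # False once expired
```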


In an eighth scenario, a user may select a number of persons in her contact list to all share location information with each other for a limited period of time. For example, a user is in town to attend a conference such as Apple's WWDC. The user knows that some of her acquaintances are also attending the conference, and she would like to know their whereabouts during the event. One embodiment of the disclosure enables this user to send an invitation to the persons that she wants to locate at the conference. When the user's acquaintances accept her invitation, she and the acquaintances will be able to locate each other. A user may, however, set certain limits on this ability to locate each other, such as certain windows of time during the day (e.g., only during conference hours) or an expiration time.



FIGS. 21-24 disclose the configuration of certain interfaces that may be used to share location information until an expiration time and may be used, for example, during a scenario such as the one explained in scenario eight. FIG. 21 displays one embodiment of an invitation interface screen in which the user may configure and send an invitation to friends to share their locations. The user may add friends to the invitation by tapping the "+" button 2120, similar to the one described in FIG. 7. The friends that have been added to the invitation may be displayed on the screen 2110 to indicate that they have been added, similar to the "To:" line of a composed e-mail. FIG. 21 shows that the user has added two friends to the invitation, as their names, "Jared Gosler" and "Susan Adams," are displayed.


In the exemplary interface shown in FIG. 21, the user may also, for example, relate the invitation to a particular event 2130 and set an expiration time 2140. However, other configuration options, such as setting an applicable geographical area and other time constraints, may also be offered. In FIG. 21, the user has related the invitation to the "WWDC" conference and set the expiration time to "Fri, June 10 at 10 AM." In some embodiments, relating the invitation to a particular event may enable the users to have access to certain maps and wireless access ports hosted by the particular event, which may, for example, offer more accurate non-GPS location information (e.g., specific conference rooms). The expiration time sets a limit on how long the user and the invited friends may share location information.
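

Conceptually, the invitation configured in FIG. 21 can be represented as a small record holding the invitee list, the optional related event, and the expiration time, with sharing treated as active only before that time. The following Python sketch uses hypothetical field names (ShareInvitation, is_active) and an assumed year for the example date; it is an illustration, not the disclosed implementation.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List, Optional

@dataclass
class ShareInvitation:
    """Invitation as configured in FIG. 21 (hypothetical field names)."""
    invitees: List[str]                    # names added via the "+" button 2120
    event: Optional[str] = None            # optional related event, field 2130
    expires_at: Optional[datetime] = None  # optional expiration time, field 2140

    def is_active(self, now: datetime) -> bool:
        """Location sharing runs only until the configured expiration time."""
        return self.expires_at is None or now < self.expires_at

invite = ShareInvitation(
    invitees=["Jared Gosler", "Susan Adams"],
    event="WWDC",
    expires_at=datetime(2011, 6, 10, 10, 0),  # "Fri, June 10 at 10 AM"; year assumed
)
print(invite.is_active(datetime(2011, 6, 9, 12, 0)))   # True during the conference
print(invite.is_active(datetime(2011, 6, 10, 10, 1)))  # False after the expiration time
```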



FIG. 22 shows an alert that an invited friend may receive upon receiving an invitation to share location information sent by the user. A message box 2210 may be displayed providing notification of the request to the friend. The text 2220 of the request may explain that the friend has been invited to share location information with the user and another person (Susan Adams) until the set expiration time. In this embodiment, the message box 2210 includes buttons enabling the device user to close the message box or view the invitation. In other embodiments, the message box may also include additional or different buttons to accept, ignore, or reject the invitation.



FIG. 23 shows an exemplary embodiment displaying an invitation. This invitation may include the related event 2310 and text 2320 explaining the details of the request to share location information, including the set expiration time. The names of all parties invited to share location information and their response statuses 2330, 2340, and 2350 may also be displayed. As illustrated, a check mark may be placed next to a person's name to indicate that the person has accepted the invitation. Similarly, a question mark may be displayed next to a person's name to indicate that the person has not yet replied to the invitation, so it is still uncertain whether they will accept it. If a person declines the invitation, an X may be displayed next to their name to indicate their decision not to share location information. This may also indicate that the person is not in the geographic area of the conference and/or, in some cases, has not checked in. Upon receipt of an invitation, a device user may decline or accept by selecting one of the available options 2360 and 2370.
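

The per-invitee response states of FIG. 23 can be modeled as a simple mapping from each invited party to a status, from which both the check/question/X markers and, as described below for FIG. 24, the recipient list for the group message button 2440 can be derived. The Python sketch below is illustrative only; the status names, marker characters, and sample data are assumptions.

```python
# Hypothetical status values and the markers described for FIG. 23.
MARKERS = {"accepted": "✓", "pending": "?", "declined": "✗"}

responses = {                      # illustrative per-invitee response data
    "Jared Gosler": "accepted",
    "Susan Adams": "pending",
}

def status_lines(responses):
    """Render one line per invited party with the appropriate marker."""
    return [f"{MARKERS[status]} {name}" for name, status in responses.items()]

def accepted_parties(responses):
    """Recipients of the group message button 2440: only friends who accepted."""
    return [name for name, status in responses.items() if status == "accepted"]

print("\n".join(status_lines(responses)))
print("Group message recipients:", accepted_parties(responses))
```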



FIG. 24 illustrates an embodiment showing what a user would likely see when the user selects to view the temporary friend 2460 after invited friends have accepted the user's invitation to share limited-time location information. As shown in FIG. 24, there is an event 2410, WWDC, associated with the sharing of this location information. As mentioned above, in some embodiments, certain additional features may become available when the locate is associated with a particular hosted event, such as connecting to local geo-coded access ports and receiving notifications from the event organizer. Alternatively or in addition, entering the event name and associating the locate with an event may simply auto-fill information such as the end time of the conference or event. Here, the end time 2420 of the locate permission is shown to expire on June 10 at 10 am. Information relating to the friends that have accepted a temporary locate request is shown on this display 2430a, 2430b. Similar to the embodiment shown in FIG. 9, the friend information may include, among other things, the name of the friend, their last known location, and the time of that last known location. Preferably, and specific to requests associated with a particular event, a user may be able to contact all of the other users on the list by selecting a button to send a group message 2440. This button, when selected, may allow the user to compose one message that will be sent to each friend who accepted the invitation to share location information. The user may also select a button to view a map 2450 which, when selected, may display an overhead map indicating each friend's location. As mentioned above, the map may be a typical location map or may be a map customized and associated with the related event (e.g., a map showing the rooms inside the Moscone Center and Yerba Buena Center).


As described above, one aspect of the present technology is the gathering and use of data available from a user's mobile device. The present disclosure contemplates that, in some instances, this gathered data may include personal information data that uniquely identifies or can be used to contact or locate a specific person. Such personal information data can include hardware information relating to the user device, location-based data, telephone numbers, email addresses, social media IDs such as TWITTER IDs, work and home addresses, friends, or any other identifying information. The user typically enters this data when establishing an account and/or during the use of the application.


The present disclosure recognizes that such personal information data can be used in the present technology to the benefit of users. In addition to being necessary to provide the core feature of the present technology (i.e., locating users), the personal information data can also be used to better understand user behavior and to facilitate and measure the effectiveness of applications. Further, other uses for personal information data that benefit the user are also contemplated by the present disclosure.


The present disclosure further contemplates that the entities responsible for the collection, analysis, disclosure, transfer, storage, or other use of such personal information data should implement and consistently use privacy policies and practices that are generally recognized as meeting or exceeding industry or governmental requirements for keeping personal information data private and secure. For example, personal information from users should be collected for legitimate and reasonable uses of the entity and not shared or sold outside of those legitimate uses. Further, such collection should occur only after the informed consent of the users. Additionally, such entities should take any needed steps to safeguard and secure access to such personal information data and to ensure that others with access to the personal information data adhere to their privacy and security policies and procedures. Further, such entities can subject themselves to evaluation by third parties to certify their adherence to widely accepted privacy policies and practices.


Despite the foregoing, the present disclosure also contemplates embodiments in which users selectively block the use of, or access to, personal information data. That is, the present disclosure contemplates that hardware and/or software elements can be provided to prevent or block access to such personal information data. For example, in the case of location-aware services, the present technology can be configured to allow users to "opt in" or "opt out" of participation in the sending of personal information data. The present disclosure also contemplates that other methods or technologies may exist for blocking access to users' personal information data.


Embodiments within the scope of the present disclosure may also include tangible and/or non-transitory computer-readable storage media for carrying or having computer-executable instructions or data structures stored thereon. Such non-transitory computer-readable storage media can be any available media that can be accessed by a general purpose or special purpose computer, including the functional design of any special purpose processor as discussed above. By way of example, and not limitation, such non-transitory computer-readable media can include RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to carry or store desired program code means in the form of computer-executable instructions, data structures, or processor chip design. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination thereof) to a computer, the computer properly views the connection as a computer-readable medium. Thus, any such connection is properly termed a computer-readable medium. Combinations of the above should also be included within the scope of the computer-readable media.


Computer-executable instructions include, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions. Computer-executable instructions also include program modules that are executed by computers in stand-alone or network environments. Generally, program modules include routines, programs, components, data structures, objects, and the functions inherent in the design of special-purpose processors, etc. that perform particular tasks or implement particular abstract data types. Computer-executable instructions, associated data structures, and program modules represent examples of the program code means for executing steps of the methods disclosed herein. The particular sequence of such executable instructions or associated data structures represents examples of corresponding acts for implementing the functions described in such steps.


Those of skill in the art will appreciate that other embodiments of the disclosure may be practiced in network computing environments with many types of computer system configurations, including personal computers, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, and the like. Embodiments may also be practiced in distributed computing environments where tasks are performed by local and remote processing devices that are linked (either by hardwired links, wireless links, or by a combination thereof) through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.


The various embodiments described above are provided by way of illustration only and should not be construed to limit the scope of the disclosure. Those skilled in the art will readily recognize various modifications and changes that may be made to the principles described herein without following the example embodiments and applications illustrated and described herein, and without departing from the spirit and scope of the disclosure.

Claims
  • 1. A computer-implemented method comprising: receiving, by a first mobile device at a user interface of the first mobile device, one or more user inputs instructing the first mobile device to share a location of the first mobile device with a second mobile device, the one or more user inputs indicating an expiration time; estimating the location of the first mobile device; sharing the location of the first mobile device between the first mobile device and the second mobile device until the expiration time; receiving, at the first mobile device, an indication of a location of the second mobile device determined by the second mobile device; and presenting, on the user interface of the first mobile device, a first portion indicating the location of the second mobile device, a second portion indicating a time at which the location of the second mobile device was determined, and a third portion indicating the expiration time.
  • 2. The method of claim 1, wherein the first portion comprises: a graphical map, and a first icon indicating the location of the second mobile device with respect to the graphical map.
  • 3. The method of claim 2, wherein the first icon comprises an image associated with a user of the second mobile device.
  • 4. The method of claim 3, further comprising presenting, on the user interface of the first mobile device, a fourth portion indicating a name of the user of the second mobile device.
  • 5. The method of claim 2, further comprising receiving, at the first mobile device, an indication of a location of a third mobile device and an indication of a time at which the location of the third mobile device was determined, wherein the first portion further indicates the location of the third mobile device, and wherein the second portion further indicates the time at which the location of the third mobile device was determined.
  • 6. The method of claim 5, wherein the first portion comprises a second icon indicating the location of the third mobile device with respect to the graphical map.
  • 7. The method of claim 1, further comprising varying a zoom level of the graphical map based on the location of the second mobile device and the location of the third mobile device.
  • 8. The method of claim 1, further comprising discontinuing sharing the location of the first mobile device between the first mobile device and the second mobile device after the expiration time.
  • 9. The method of claim 1, wherein the indication of the location of the second mobile device and the indication of the time at which the location of the second mobile device was determined is received from a server system different from the first mobile device and the second mobile device.
  • 10. The method of claim 1, wherein the location of the first mobile device is shared between the first mobile device and the second mobile device through a server system different from the first mobile device and the second mobile device.
  • 11. The method of claim 1, further comprising receiving, at the first mobile device, additional indications of the location of the second mobile device according to a plurality of different times.
  • 12. The method of claim 11, wherein the additional indications of the location of the second mobile device are received periodically.
  • 13. The method of claim 1, wherein receiving the one or more user inputs comprises receiving a first user input selecting a user associated with the second mobile device from a contact list.
  • 14. The method of claim 13, wherein receiving the one or more user inputs further comprises receiving a second user input specifying a textual message for presentation to the user associated with the second mobile device.
  • 15. The method of claim 14, wherein receiving the one or more user inputs further comprises receiving a third user input specifying that the textual message be transmitted to the second mobile device.
  • 16. At least one non-transitory storage device storing instructions that, when executed by one or more processors, cause the one or more processors to perform operations comprising: receiving, by a first mobile device at a user interface of the first mobile device, one or more user inputs instructing the first mobile device to share a location of the first mobile device with a second mobile device, the one or more user inputs indicating an expiration time; estimating the location of the first mobile device; sharing the location of the first mobile device between the first mobile device and the second mobile device until the expiration time; receiving, at the first mobile device, an indication of a location of the second mobile device determined by the second mobile device; and presenting, on the user interface of the first mobile device, a first portion indicating the location of the second mobile device, a second portion indicating a time at which the location of the second mobile device was determined, and a third portion indicating the expiration time.
  • 17. The at least one non-transitory storage device of claim 16, wherein the first portion comprises: a graphical map, and a first icon indicating the location of the second mobile device with respect to the graphical map.
  • 18. The at least one non-transitory storage device of claim 17, wherein the first icon comprises an image associated with a user of the second mobile device.
  • 19. The at least one non-transitory storage device of claim 18, the operations further comprising presenting, on the user interface of the first mobile device, a fourth portion indicating a name of the user of the second mobile device.
  • 20. The at least one non-transitory storage device of claim 17, the operations further comprising receiving, at the first mobile device, an indication of a location of a third mobile device and an indication of a time at which the location of the third mobile device was determined, wherein the first portion further indicates the location of the third mobile device, and wherein the second portion further indicates the time at which the location of the third mobile device was determined.
  • 21. The at least one non-transitory storage device of claim 20, wherein the first portion comprises a second icon indicating the location of the third mobile device with respect to the graphical map.
  • 22. The at least one non-transitory storage device of claim 16, the operations further comprising varying a zoom level of the graphical map based on the location of the second mobile device and the location of the third mobile device.
  • 23. The at least one non-transitory storage device of claim 16, the operations further comprising discontinuing sharing the location of the first mobile device between the first mobile device and the second mobile device after the expiration time.
  • 24. The at least one non-transitory storage device of claim 16, the operations further comprising receiving, at the first mobile device, additional indications of the location of the second mobile device according to a plurality of different times.
  • 25. The at least one non-transitory storage device of claim 24, wherein the additional indications of the location of the second mobile device are received periodically.
  • 26. The at least one non-transitory storage device of claim 16, wherein receiving the one or more user inputs comprises receiving a first user input selecting a user associated with the second mobile device from a contact list.
  • 27. The at least one non-transitory storage device of claim 26, wherein receiving the one or more user inputs further comprises: receiving a second user input specifying a textual message for presentation to the user associated with the second mobile device, and receiving a third user input specifying that the textual message be transmitted to the second mobile device.
  • 28. A system, comprising: one or more processors; and at least one non-transitory storage device storing instructions that, when executed by the one or more processors, cause the one or more processors to perform operations comprising: receiving, by a first mobile device at a user interface of the first mobile device, one or more user inputs instructing the first mobile device to share a location of the first mobile device with a second mobile device, the one or more user inputs indicating an expiration time; estimating the location of the first mobile device; sharing the location of the first mobile device between the first mobile device and the second mobile device until the expiration time; receiving, at the first mobile device, an indication of a location of the second mobile device determined by the second mobile device; and presenting, on the user interface of the first mobile device, a first portion indicating the location of the second mobile device, a second portion indicating a time at which the location of the second mobile device was determined, and a third portion indicating the expiration time.
  • 29. The system of claim 28, wherein the indication of the location of the second mobile device and the indication of the time at which the location of the second mobile device was determined is received from a server system different from the first mobile device and the second mobile device.
  • 30. The system of claim 28, wherein the location of the first mobile device is shared between the first mobile device and the second mobile device through a server system different from the first mobile device and the second mobile device.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of and claims priority to U.S. patent application Ser. No. 15/219,239, filed Jul. 25, 2016, which is a continuation of U.S. patent application Ser. No. 14/636,106, filed Mar. 2, 2015, now issued as U.S. Pat. No. 9,402,153 on Jul. 26, 2016, which is a continuation of U.S. patent application Ser. No. 13/113,856, filed May 23, 2011, now issued as U.S. Pat. No. 8,971,924 on Mar. 3, 2015, the entire contents of each of which are incorporated herein by reference.

US Referenced Citations (249)
Number Name Date Kind
5475653 Yamada et al. Dec 1995 A
5801700 Ferguson Sep 1998 A
6002402 Schacher Dec 1999 A
6040781 Murray Mar 2000 A
6191807 Hamada et al. Feb 2001 B1
6323846 Westerman et al. Nov 2001 B1
6362842 Tahara et al. Mar 2002 B1
6515585 Yamamoto Feb 2003 B2
6570557 Westerman et al. May 2003 B1
6677932 Westerman et al. Jan 2004 B1
6809724 Shiraishi et al. Oct 2004 B1
7015817 Copley et al. Mar 2006 B2
7039420 Koskinen et al. May 2006 B2
7076257 Kall Jul 2006 B2
7224987 Bhela May 2007 B1
7365736 Marvit et al. Apr 2008 B2
7593749 Vallstrom et al. Sep 2009 B2
7614008 Ording et al. Nov 2009 B2
7633076 Huppi et al. Dec 2009 B2
7653883 Hotelling et al. Jan 2010 B2
7657849 Chaudhri et al. Feb 2010 B2
7663607 Hotelling et al. Feb 2010 B2
7789225 Whiteis Sep 2010 B2
7801542 Stewart Sep 2010 B1
7834861 Lee Nov 2010 B2
7844914 Andre et al. Nov 2010 B2
7908219 Abanami et al. Mar 2011 B2
7953393 Chin et al. May 2011 B2
7957762 Herz et al. Jun 2011 B2
8006002 Kalayjian et al. Aug 2011 B2
8121586 Araradian et al. Feb 2012 B2
8150930 Satterfield et al. Apr 2012 B2
8239784 Hotelling et al. Aug 2012 B2
8244468 Scailisi et al. Aug 2012 B2
8255830 Ording et al. Aug 2012 B2
8279180 Hotelling et al. Oct 2012 B2
8285258 Schultz et al. Oct 2012 B2
8369867 Van Os et al. Feb 2013 B2
8374575 Mullen Feb 2013 B2
8381135 Hotelling et al. Feb 2013 B2
8412154 Leemet et al. Apr 2013 B1
8441367 Lee et al. May 2013 B1
8479122 Hotelling et al. Jul 2013 B2
8572493 Qureshi Oct 2013 B2
8786458 Wiltzius et al. Jul 2014 B1
8855665 Buford et al. Oct 2014 B2
8922485 Lloyd Dec 2014 B1
8971924 Pai et al. Mar 2015 B2
8989773 Sandel et al. Mar 2015 B2
9204283 Mullen Dec 2015 B2
9247377 Pai et al. Jan 2016 B2
9294882 Sandel et al. Mar 2016 B2
9369833 Tharshanan et al. Jun 2016 B2
9402153 Pai et al. Jul 2016 B2
9635540 Mullen Apr 2017 B2
9699617 Sandel et al. Jul 2017 B2
20020015024 Westerman et al. Feb 2002 A1
20020037715 Mauney et al. Mar 2002 A1
20020102989 Calvert et al. Aug 2002 A1
20020115478 Fujisawa et al. Aug 2002 A1
20020126135 Ball et al. Sep 2002 A1
20030081506 Karhu May 2003 A1
20030128163 Mizugaki et al. Jul 2003 A1
20040041841 LeMogne et al. Mar 2004 A1
20040070511 Kirn Apr 2004 A1
20040180669 Kall Sep 2004 A1
20040203854 Nowak Oct 2004 A1
20050032532 Kokkonen et al. Feb 2005 A1
20050138552 Venolia Jun 2005 A1
20050148340 Guyot Jul 2005 A1
20050190059 Wehrenberg Sep 2005 A1
20050191159 Benko Sep 2005 A1
20050222756 Davis et al. Oct 2005 A1
20050268237 Crane et al. Dec 2005 A1
20050288036 Brewer et al. Dec 2005 A1
20060017692 Wehrenberg et al. Jan 2006 A1
20060019649 Feinleib et al. Jan 2006 A1
20060026245 Cunningham et al. Feb 2006 A1
20060026536 Hotelling et al. Feb 2006 A1
20060030333 Ward Feb 2006 A1
20060033724 Chaudhri et al. Feb 2006 A1
20060044283 Eri et al. Mar 2006 A1
20060063538 Ishii Mar 2006 A1
20060092177 Blasko May 2006 A1
20060195787 Topiwala et al. Aug 2006 A1
20060197753 Hotelling et al. Sep 2006 A1
20060223518 Haney Oct 2006 A1
20070036300 Brown et al. Feb 2007 A1
20070085157 Fadell et al. Apr 2007 A1
20070117549 Arnos May 2007 A1
20070129888 Rosenberg Jun 2007 A1
20070150834 Muller et al. Jun 2007 A1
20070150836 Deggelmann et al. Jun 2007 A1
20070216659 Amineh Sep 2007 A1
20070236475 Wherry Oct 2007 A1
20080004043 Wilson et al. Jan 2008 A1
20080014989 Sandegard et al. Jan 2008 A1
20080045232 Cone et al. Feb 2008 A1
20080052945 Matas et al. Mar 2008 A1
20080055264 Anzures et al. Mar 2008 A1
20080057926 Forstall et al. Mar 2008 A1
20080070593 Altman et al. Mar 2008 A1
20080079589 Blackadar Apr 2008 A1
20080114539 Lim May 2008 A1
20080139219 Boeiro et al. Jun 2008 A1
20080153517 Lee Jun 2008 A1
20080165136 Christie et al. Jul 2008 A1
20080176583 Brachet et al. Jul 2008 A1
20080186165 Bertagna et al. Aug 2008 A1
20080216022 Lorch et al. Sep 2008 A1
20080287151 Fjelstad et al. Nov 2008 A1
20080320391 Lemay et al. Dec 2008 A1
20090005011 Christie et al. Jan 2009 A1
20090005018 Forstall et al. Jan 2009 A1
20090006566 Veeramachaneni et al. Jan 2009 A1
20090011340 Lee et al. Jan 2009 A1
20090037536 Braam Feb 2009 A1
20090049502 Levien et al. Feb 2009 A1
20090051648 Shamaie et al. Feb 2009 A1
20090051649 Rondel Feb 2009 A1
20090055494 Fukumoto Feb 2009 A1
20090066564 Burroughs et al. Mar 2009 A1
20090085806 Piersol et al. Apr 2009 A1
20090098903 Donaldson et al. Apr 2009 A1
20090113340 Bender Apr 2009 A1
20090164219 Yeung et al. Jun 2009 A1
20090177981 Christie et al. Jul 2009 A1
20090181726 Vargas et al. Jul 2009 A1
20090187842 Collins et al. Jul 2009 A1
20090254840 Churchill et al. Oct 2009 A1
20090298444 Shigeta Dec 2009 A1
20090303066 Lee et al. Dec 2009 A1
20090312032 Bornstein et al. Dec 2009 A1
20090313582 Rupsingh et al. Dec 2009 A1
20090319616 Lewis et al. Dec 2009 A1
20090322560 Tengler et al. Dec 2009 A1
20090325603 Van Os et al. Dec 2009 A1
20100004005 Pereira et al. Jan 2010 A1
20100017126 Holeman Jan 2010 A1
20100058231 Duarte et al. Mar 2010 A1
20100069035 Johnson Mar 2010 A1
20100124906 Hautala May 2010 A1
20100125411 Goel May 2010 A1
20100125785 Moore et al. May 2010 A1
20100144368 Sullivan Jun 2010 A1
20100203901 Dinoff Aug 2010 A1
20100205242 Marchioro, II et al. Aug 2010 A1
20100211425 Govindarajan et al. Aug 2010 A1
20100240398 Hotes et al. Sep 2010 A1
20100248744 Bychkov et al. Sep 2010 A1
20100250727 King et al. Sep 2010 A1
20100274569 Reudink Oct 2010 A1
20100281409 Rainisto et al. Nov 2010 A1
20100287178 Lambert et al. Nov 2010 A1
20100299060 Snavely et al. Nov 2010 A1
20100325194 Williamson et al. Dec 2010 A1
20100330952 Yeoman et al. Dec 2010 A1
20100332518 Song et al. Dec 2010 A1
20110003587 Belz et al. Jan 2011 A1
20110051658 Jin et al. Mar 2011 A1
20110054780 Dhanani et al. Mar 2011 A1
20110054979 Cova et al. Mar 2011 A1
20110059769 Brunolli Mar 2011 A1
20110066743 Hurley Mar 2011 A1
20110080356 Kang et al. Apr 2011 A1
20110096011 Suzuki Apr 2011 A1
20110118975 Chen May 2011 A1
20110137813 Stewart Jun 2011 A1
20110137954 Diaz Jun 2011 A1
20110138006 Stewart Jun 2011 A1
20110148626 Acevedo Jun 2011 A1
20110151418 Delespaul et al. Jun 2011 A1
20110157046 Lee et al. Jun 2011 A1
20110164058 Lemay Jul 2011 A1
20110167383 Schuller et al. Jul 2011 A1
20110183650 McKee Jul 2011 A1
20110225547 Fong et al. Sep 2011 A1
20110239158 Barraclough et al. Sep 2011 A1
20110250909 Mathias Oct 2011 A1
20110254684 Antoci Oct 2011 A1
20110265041 Ganetakos et al. Oct 2011 A1
20110276901 Zambetti et al. Nov 2011 A1
20110279323 Hung et al. Nov 2011 A1
20110306366 Trussel et al. Dec 2011 A1
20110306393 Goldman et al. Dec 2011 A1
20110307124 Morgan et al. Dec 2011 A1
20110316769 Boettcher et al. Dec 2011 A1
20120008526 Borghei Jan 2012 A1
20120022872 Gruber et al. Jan 2012 A1
20120040681 Yan et al. Feb 2012 A1
20120054028 Tengler et al. Mar 2012 A1
20120077463 Robbins et al. Mar 2012 A1
20120088521 Nishida et al. Apr 2012 A1
20120095918 Jurss Apr 2012 A1
20120102437 Worley et al. Apr 2012 A1
20120105358 Momeyer May 2012 A1
20120108215 Kameli et al. May 2012 A1
20120117507 Tseng et al. May 2012 A1
20120131458 Hayes May 2012 A1
20120136997 Yan et al. May 2012 A1
20120144452 Dyor Jun 2012 A1
20120149405 Bhat Jun 2012 A1
20120150970 Peterson et al. Jun 2012 A1
20120158511 Lucero et al. Jun 2012 A1
20120166531 Sylvain Jun 2012 A1
20120172088 Kirch et al. Jul 2012 A1
20120208592 Davis Aug 2012 A1
20120216127 Meyr Aug 2012 A1
20120218177 Pang et al. Aug 2012 A1
20120222083 Vaha-Sipila et al. Aug 2012 A1
20120239949 Kalyanasundaram et al. Sep 2012 A1
20120258726 Bansal et al. Oct 2012 A1
20120265823 Parmar et al. Oct 2012 A1
20120276919 Bi et al. Nov 2012 A1
20120290648 Sharkey Nov 2012 A1
20120302256 Pai et al. Nov 2012 A1
20120302258 Pai et al. Nov 2012 A1
20120304084 Kim et al. Nov 2012 A1
20120306770 Moore et al. Dec 2012 A1
20130002580 Sudou Jan 2013 A1
20130007665 Chaudhri et al. Jan 2013 A1
20130045759 Smith et al. Feb 2013 A1
20130063364 Moore Mar 2013 A1
20130065566 Gisby et al. Mar 2013 A1
20130091298 Ozzie et al. Apr 2013 A1
20130093833 Al-Asaaed et al. Apr 2013 A1
20130120106 Cauwels et al. May 2013 A1
20130143586 Williams et al. Jun 2013 A1
20130159941 Langlois et al. Jun 2013 A1
20130226453 Trussel et al. Aug 2013 A1
20130303190 Khan et al. Nov 2013 A1
20130305331 Kim Nov 2013 A1
20130307809 Sudou Nov 2013 A1
20130310089 Gianoukos et al. Nov 2013 A1
20140062790 Letz et al. Mar 2014 A1
20140099973 Cecchini et al. Apr 2014 A1
20140222933 Stovicek et al. Aug 2014 A1
20140237126 Bridge Aug 2014 A1
20150172393 Oplinger et al. Jun 2015 A1
20150180746 Day, II et al. Jun 2015 A1
20150346912 Yang et al. Dec 2015 A1
20150350130 Yang et al. Dec 2015 A1
20150350140 Garcia et al. Dec 2015 A1
20150350141 Yang et al. Dec 2015 A1
20160036735 Pycock et al. Feb 2016 A1
20160073223 Woolsey et al. Mar 2016 A1
20160234060 Raghu et al. Aug 2016 A1
20170026796 Raghu et al. Jan 2017 A1
20180091951 Eran et al. Mar 2018 A1
Foreign Referenced Citations (44)
Number Date Country
1475924 Feb 2004 CN
1852335 Oct 2006 CN
101390371 Mar 2009 CN
102098656 Jun 2011 CN
102111505 Jun 2011 CN
201928419 Aug 2011 CN
103583031 Feb 2014 CN
103959751 Jul 2014 CN
1 387 590 Feb 2004 EP
2574026 Mar 2013 EP
2610701 Jul 2013 EP
2610701 Apr 2014 EP
H1145117 Feb 1999 JP
2002-366485 Dec 2002 JP
2003-516057 May 2003 JP
2003-207556 Jul 2003 JP
2006-072489 Mar 2006 JP
2006-079427 Mar 2006 JP
2006-113637 Apr 2006 JP
2006-129429 May 2006 JP
2009-081865 Apr 2009 JP
2010-503126 Jan 2010 JP
2010-503332 Jan 2010 JP
2010-288162 Dec 2010 JP
2010-539804 Dec 2010 JP
2011-060065 Mar 2011 JP
2011-107823 Jun 2011 JP
2012-508530 Apr 2012 JP
2012-198369 Oct 2012 JP
2013-048389 Mar 2013 JP
10-2004-0089329 Oct 2004 KR
10-2007-0096222 Oct 2007 KR
10-2008-0074813 Aug 2008 KR
200532429 Oct 2005 TW
WO 2001041468 Jun 2001 WO
WO 2002003093 Jan 2002 WO
WO 2008030972 Mar 2008 WO
WO 2009071112 Jun 2009 WO
WO 2010048995 May 2010 WO
WO 2010054373 May 2010 WO
WO 2011080622 Jul 2011 WO
WO 2012128824 Sep 2012 WO
WO 2012170446 Dec 2012 WO
WO 2013093558 Jun 2013 WO
Non-Patent Literature Citations (168)
Entry
Australian Patent Examination Report No. 1 in Australian Application No. 2012202929, dated Sep. 28, 2013, 3 pages.
Chinese Office Action in Chinese Application No. 201210288784.3, dated Jul. 3, 2014, 16 pages (with English Translation).
European Search Report in European Application No. 12168980.6, dated Sep. 21, 2012, 7 pages.
International Preliminary Report on Patentability in International Application No. PCT/US/2012/038718, dated Nov. 26, 2013, 5 pages.
International Search Report in International Application No. PCT/US2012/038718, dated Aug. 17, 2012, 5 pages.
International Search Report and Written Opinion in International Application No. PCT/US13/41780, dated Dec. 1, 2014, 18 pages.
International Preliminary Report on Patentability in International Application No. PCT/US13/41780, dated Dec. 9, 2014, 8 pages.
Japanese Office Action in Japanese Application No. 2012-113725, dated May 27, 2013, 5 pages.
Korean Preliminary Rejection in Korean Application No. 10-2012-54888, dated Sep. 5, 2014, 9 pages (with English Translation).
Search and Examination Report in GB Application No. GB1209044.5, dated Aug. 24, 2012, 10 pages.
U.S. Final Office Action in U.S. Appl. No. 13/113,856, dated Nov. 7, 2012, 19 pages.
U.S. Final Office Action in U.S. Appl. No. 13/488,430, dated May 8, 2013, 19 pages.
U.S. Non-Final Office Action in U.S. Appl. No. 13/113,856, dated Jul. 18, 2012, 14 pages.
U.S. Non-Final Office Action in U.S. Appl. No. 13/488,430, dated Dec. 5, 2012, 13 pages.
Written Opinion in International Application No. PCT/US/2012/038718, dated Aug. 17, 2012, 4 pages.
Australian Patent Examination Report No. 1 in Australian Application No. 2013203926, dated Oct. 7, 2014, 5 pages.
Australian Patent Examination Report No. 2 in Australian Application No. 2013203926, dated Jan. 13, 2016, 3 pages.
European Extended Search Report in Application No. 16155938.0, dated Jun. 7, 2016, 8 pages.
Chinese Office Action for Application No. 201210288784.3, dated Jan. 5, 2017, 14 pages.
India Office Action for Application No. 2030/CHE/2012, dated Dec. 27, 2016, 9 pages.
Chinese Notification of Reexamination for Application No. 201210288784.3, dated Sep. 27, 2017, 18 pages.
Chinese Notice of Allowance received for Chinese Patent Application No. 201520365358.4, dated Nov. 20, 2015, 2 pages with English Translation.
Chinese Notice of Allowance received for Chinese Patent Application No. 201520365843.1, dated Feb. 15, 2016, 3 pages with English Translation.
Chinese Notice of Allowance received for Chinese Patent Application No. 201520669842.6, dated May 18, 2016, 2 pages with English Translation.
Chinese Office Action received for Chinese Patent Application No. 201520365358.4, dated Aug. 11, 2015, 4 pages with English Translation.
Chinese Office Action received for Chinese Patent Application No. 201520365843.1, dated Aug. 25, 2015, 4 pages with English Translation.
Chinese Office Action received for Chinese Patent Application No. 201520365843.1, dated Nov. 16, 2015, 3 pages with English Translation.
Chinese Office Action received for Chinese Patent Application No. 201520669842.6, dated Dec. 4, 2015, 7 pages with English Translation.
Chinese Office Action received for Chinese Patent Application No. 201620393549.6, dated Aug. 18, 2016, 2 pages with English Translation.
Chinese Office Action received for Chinese Patent Application No. 201620393748.7, dated Aug. 18, 2016, 2 pages with English Translation.
Invitation to Pay Additional Fees and Partial Search Report received for PCT Patent Application No. PCT/US2015/043487, dated Nov. 9, 2015, 4 pages.
Invitation to Pay Additional Fees received for PCT Patent Application No. PCT/US2015/044083, dated Nov. 4, 2015, 11 pages.
Invitation to Pay Additional Fees received for PCT Patent Application No. PCT/US2015/046787, dated Dec. 15, 2015, 8 pages.
Invitation to Pay Additional Fees received for PCT Patent Application No. PCT/US2016/046828, dated Sep. 23, 2016, 2 pages.
Taiwanese Office Action received for Taiwanese Patent Application No. 104107332, dated Oct. 29, 2018, 12 pages with English Translation.
Taiwanese Office Action received for Taiwanese Patent Application No. 104128519, dated Mar. 29, 2017, 16 pages with English Translation.
Taiwanese Office Action received for Taiwanese Patent Application No. 104128704, dated Jul. 31, 2017, 7 pages with English Translation.
Taiwanese Office Action received for Taiwanese Patent Application No. 104128704, dated Nov. 2, 2016, 12 pages with English Translation.
‘absoluteblogger.com’ [online]. “WeChat Review—Communication Application with Screenshots” available on or before Jun. 14, 2013, [retrieved on Apr. 23, 2019], via Internet Archive: Wayback Machine URL<http://www.absoluteblogger.com/2012/10/wechat-review-communication-application.html>. 4 pages.
‘appps.jp’ [online]. “WhatsApp” users over 400 million people! I tried to investigate the most used messaging application in the world Jan. 24, 2014, [retrieved on Apr. 23, 2019], via Internet Archive: Wayback Machine URL<https://web.archive.org/web/20140410142411/http://www.appps.jp/2128786>. 13 pages, with Machine Translation English.
Australian Certificate of Examination in Australian Patent Application No. 2017100760 dated Feb. 9, 2018, 2 pages.
Australian Notice of Acceptance in Australian Patent Application No. 2015267259, dated Jan. 30, 2018, 3 pages.
Australian Notice of Acceptance in Australian Patent Application No. 2015267260, dated Jan. 30, 2018, 3 pages.
Australian Notice of Acceptance in Australian Patent Application No. 2015312369, dated Mar. 21, 2018, 3 pages.
Australian Office Action in Australian Patent Application No. 2015100711, dated Jul. 27, 2015, 7 pages.
Australian Office Action in Australian Patent Application No. 2015100711, dated Nov. 19, 2015, 6 pages.
Australian Office Action in Australian Patent Application No. 2015101188, dated Apr. 14, 2016, 3 pages.
Australian Office Action in Australian Patent Application No. 2015267259, dated Jun. 2, 2017, 2 pages.
Australian Office Action in Australian Patent Application No. 2015267260, dated Jun. 2, 2017, 2 pages.
Australian Office Action in Australian Patent Application No. 2015312369, dated Mar. 29, 2017, 3 Pages.
Australian Office Action in Australian Patent Application No. 2016102028, dated Feb. 13, 2017, 4 pages.
Australian Office Action in Australian Patent Application No. 2016102029, dated Feb. 22, 2017, 4 pages.
Australian Office Action in Australian Patent Application No. 2017100197, dated Apr. 28, 2017, 4 Pages.
Australian Office Action in Australian Patent Application No. 2017100198, dated Apr. 20, 2017, 4 pages.
Australian Office Action in Australian Patent Application No. 2017100760, dated Aug. 10, 2017, 4 pages.
Australian Office Action in Australian Patent Application No. 2017100760, dated Jan. 30, 2018, 3 pages.
Australian Office Action in Australian Patent Application No. 2018204430, dated Aug. 15, 2018, 5 pages.
Chinese Notice of Allowance received for Chinese Patent Application No. 201510290133.1, dated Jan. 9, 2019, 3 pages with English Translation.
Chinese Notice of Allowance received for Chinese Patent Application No. 201510291012.9, dated Jan. 9, 2019, 3 pages with English Translation.
Chinese Office Action received for Chinese Patent Application No. 201510290133.1, dated Feb. 9, 2018, 10 pages with English Translation.
Chinese Office Action received for Chinese Patent Application No. 201510291012. 9, dated Feb. 8, 2018, 9 pages with English Translation.
Chinese Office Action received for Chinese Patent Application No. 201510549056.7, dated Aug. 7, 2018, 7 pages with English Translation.
Chinese Office Action received for Chinese Patent Application No. 201510549056.7, dated Nov. 24, 2017, 22 pages with English Translation.
Danish Decision to Grant received for Danish Patent Application No. PA201770126, dated Mar. 27, 2018, 2 pages.
Danish Intention to Grant received for Denmark Patent Application No. PA201570550, dated Dec. 22, 2016, 2 pages.
Danish Intention to Grant received for Denmark Patent Application No. PA201770126, dated Jan. 19, 2018, 2 pages.
Danish Notice of Allowance received for Danish Patent Application No. PA201570550, dated Mar. 20, 2017, 2 pages.
Danish Office Action received for Danish Patent Application No. PA201570550, dated Oct. 19, 2016, 3 pages.
Danish Office Action received for Danish Patent Application No. PA201570550, dated Dec. 7, 2015, 5 pages.
Danish Office Action received for Danish Patent Application No. PA201570550, dated Jan. 19, 2016, 2 pages.
Danish Office Action received for Danish Patent Application No. PA201770089, dated Apr. 25, 2017, 10 pages.
Danish Office Action received for Danish Patent Application No. PA201770125, dated Jan. 26, 2018, 5 pages.
Danish Office Action received for Danish Patent Application No. PA201770125, dated Jul. 20, 2018, 2 pages.
Danish Office Action received for Danish Patent Application No. PA201770126, dated Oct. 18, 2017, 3 pages.
Danish Search Report received for Danish Patent Application No. PA201770125, dated May 5, 2017, 10 pages.
Danish Search Report received for Danish Patent Application No. PA201770126, dated Apr. 26, 2017, 8 Pages.
‘digitalstreetsa.com’ [online]. “Why WeChat might kill Whatsapp's future . . . ” Jul. 3, 2013, [retrieved on Apr. 23, 2019], retrieved from: URL<http://digitalstreetsa.com/why-wechat-might-kill-whatsapps-future>. 9 pages.
‘download.cnet.com’ [online]. “WeChat APK for Android” Jan. 7, 2013, [retrieved on Apr. 23, 2019], retrieved from: URL<http://download.cnet.com/WeChat/3000-2150_4-75739423.html>. 5 pages.
‘engadget.com’ [online]. “WhatsApp Introduces Major New Audio Features,” Aug. 7, 2013, [retrieved on Apr. 23, 2019], retrieved from: URL<http://www.engadget.com/2013/08/07/whatsapp-introduces-major-new-audio-features>. 12 pages.
European Extended Search Report in European Patent Application No. 17167629.9, dated Jun. 2, 2017, 7 pages.
European Extended Search Report in European Patent Application No. 18170262.2, dated Jul. 25, 2018, 8 pages.
European Office Action in European Patent Application No. 15728307.8, dated Feb. 8, 2018, 7 pages.
European Office Action in European Patent Application No. 15729286.3, dated Feb. 7, 2018, 7 pages.
European Office Action in European Patent Application No. 15759981.2, dated Apr. 19, 2018, 6 pages.
European Office Action in European Patent Application No. 15759981.2, dated Aug. 6, 2018, 10 pages.
European Office Action in European Patent Application No. 15759981.2, dated May 16, 2018, 6 pages.
European Office Action in European Patent Application No. 17167629.9, dated Jan. 25, 2019, 7 pages.
‘heresthethingblog.com’ [online]. “iOS 7 tip: Alerts, Banners, and Badges: What's the Difference?” Jan. 22, 2014, [retrieved on Apr. 23, 2019], via Internet Archive: Wayback Machine URL<https://web.archive.org/web/20140128072440/http://heresthethingblog.com/2014/01/22/ios-7-tip-whats-difference-alert/>. 5 pages.
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2015/032305, dated Dec. 15, 2016, 7 pages.
International Preliminary Report on Patentability received for PCT Application No. PCT/US2015/032309, dated Dec. 15, 2016, 7 pages.
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2015/043487, dated Feb. 16, 2017, 12 pages.
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2015/044083, dated Mar. 16, 2017, 24 pages.
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2015/046787, dated Mar. 16, 2017, 18 pages.
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2016/046828, dated Mar. 1, 2018, 19 pages.
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2015/032305, dated Sep. 10, 2015, 9 pages.
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2015/032309, dated Sep. 2, 2015, 9 pages.
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2015/043487, dated Jan. 29, 2016, 17 pages.
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2015/044083, dated Feb. 4, 2016, 31 pages.
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2015/046787, dated Apr. 1, 2016, 26 pages.
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2016/046828, dated Dec. 15, 2016, 21 pages.
iPhone, “User Guide for iOS 7.1 Software”, Mar. 2014, 162 pages.
Japanese Notice of Allowance received for Japanese Patent Application No. 2017-510297, dated May 7, 2018, 5 pages with English Translation.
Japanese Notice of Allowance received for Japanese Patent Application No. 2017-514992, dated Feb. 15, 2019, 5 pages with English Translation.
Japanese Notice of Allowance received for Japanese Patent application No. 2017514993, dated Jan. 12, 2018, 6 pages with English Translation.
Japanese Notice of Allowance received for Japanese Patent Application No. 2018-072632, dated Dec. 7, 2018, 6 pages with English Translation.
Japanese Office Action received for Japanese Patent Application No. 2017-510297, dated Dec. 4, 2017, 6 pages with English Translation.
Japanese Office Action received for Japanese Patent Application No. 2017-510297, dated Jul. 10, 2017, 9 pages with English Translation.
Japanese Office Action received for Japanese Patent Application No. 2017-514992, dated Apr. 6, 2018, 9 pages with English Translation.
Japanese Office Action received for Japanese Patent Application No. 2018-018497, dated Dec. 10, 2018, 7 pages with English Translation.
Japanese Office Action received for Japanese Patent Application No. 2018-072632, dated Jul. 9, 2018, 5 pages with English Translation.
‘jnd.org’ [online]. “Affordances and Design,” published on or before Feb. 25, 2010 [retrieved on Apr. 23, 2019], via Internet Archive: Wayback Machine URL<https://web.archive.org/web/20150318044240/jnd.org/dn.mss/affordances_and html>. 6 pages.
Korean Notice of Allowance received for Korean Patent Application No. 10-2017-7005628, dated Jun. 18, 2018, 4 pages with English Translation.
Korean Office Action received for Korean Patent Application No. 10-2017-7005628, dated Jan. 30, 2018, 6 pages with English translation.
Korean Office Action received for Korean Patent Application No. 10-2017-7005628, dated May 10, 2017, 11 pages with English Translation.
Korean Office Action received for Korean Patent Application No. 10-2018-7027006, dated Jan. 14, 2019, 4 pages with English Translation.
‘makeuseof.com’ [online]. “MS Outlook Tip: How to Automatically Organize Incoming Emails,” Sep. 27, 2019, [retrieved on Apr. 23, 2019], retrieved from: URL<http://www.makeuseof.com/tag/ms-outlook-productivity-tip-how-to-move-emails-to-individual-folders-automatically>. 5 pages.
‘manualslib.com’ [online]. “Samsung Gear 2 User Manual”, 2014, [retrieved on Apr. 23, 2019], retrieved from: URL<http://www.manualslib.com/download/754923/Samsung-Gear-2.html>. 97 pages.
Netherland Search Report and Opinion received for Netherlands Patent Application No. 2015354, dated Jun. 22, 2017, 23 pages with English Translation.
Netherland Search Report and Opinion received for Netherlands Patent Application No. 2019878, dated Apr. 6, 2018, 23 pages with English Translation.
Samsung, “SM-G900F User Manual”, English (EU). Rev.1.0, Mar. 2014, 249 pages.
Samsung, “SM-R380”, User Manual, 2014, 74 pages.
‘seechina365.com’ [online]. “How to use China's popular social networking service wechat2_ voice message, press together, shake function etc.” Apr. 5, 2014, [retrieved on Apr. 23, 2019], via Internet Archive: Wayback Machine URL<http://seechina365.com/2014/04/05/wechat02>. 29 pages with Machine English Translation.
‘slideshare.net’ [online]. “Samsung Gear 2 User manual”, Apr. 2014, [retrieved on Apr. 23, 2019], retrieved from: URL<http://www.slideshare.net/badaindonesia/samsung-gear-2-user-manual>. 58 pages.
U.S. Corrected U.S. Notice of Allowance received for U.S. Appl. No. 14/841,614, dated Jan. 8, 2019, 3 pages.
U.S. Final Office Action received for U.S. Appl. No. 14/841,614, dated May 10, 2018, 12 pages.
U.S. Final Office Action received for U.S. Appl. No. 14/841,623, dated Sep. 5, 2017, 15 pages.
U.S. Final Office Action received for U.S. Appl. No. 14/928,865, dated Dec. 5, 2018, 14 pages.
U.S. Final Office Action received for U.S. Appl. No. 14/817,572, dated Mar. 23, 2017, 13 pages.
U.S. Final Office Action received for U.S. Appl. No. 14/838,235, dated Jun. 15, 2016, 17 pages.
U.S. Non Final Office Action received for U.S. Appl. No. 14/503,386, dated Jan. 7, 2015, 18 pages.
U.S. Non Final Office Action received for U.S. Appl. No. 14/817,572, dated Sep. 12, 2016, 8 pages.
U.S. Non-Final Office Action received for U.S. Appl. No. 14/841,608, dated Apr. 12, 2017, 8 pages.
U.S. Non-Final Office Action received for U.S. Appl. No. 14/841,614, dated Jul. 27, 2017, 12 pages.
U.S. Non-Final Office Action received for U.S. Appl. No. 14/841,623, dated Feb. 2, 2017, 16 pages.
U.S. Non-Final Office Action received for U.S. Appl. No. 14/928,865, dated Mar. 27, 2018, 14 pages.
U.S. Non-Final Office Action received for U.S. Appl. No. 15/142,661, dated Jan. 25, 2017, 28 Pages.
U.S. Non-Final Office Action received for U.S. Appl. No. 15/425,273, dated Oct. 3, 2018, 9 pages.
U.S. Non-Final Office Action received for U.S. Appl. No. 15/431,435, dated Jun. 8, 2017, 10 pages.
U.S. Non-Final Office Action received for U.S. Appl. No. 15/985,570, dated Aug. 16, 2018, 23 pages.
U.S. Non-Final Office Action received for U.S. Appl. No. 14/503,376, dated Dec. 22, 2014, 19 pages.
U.S. Non-Final Office Action received for U.S. Appl. No. 15/366,763, dated Mar. 8, 2019, 13 pages.
U.S. Non-Final Office Action received for U.S. Appl. No. 14/838,235, dated Jan. 5, 2016, 18 pages.
U.S. Notice of Allowance received for U.S. Appl. No. 14/841,608, dated Nov. 14, 2017, 5 pages.
U.S. Notice of Allowance received for U.S. Appl. No. 14/841,614, dated Oct. 24, 2018, 10 pages.
U.S. Notice of Allowance received for U.S. Appl. No. 14/841,623, dated Feb. 23, 2018, 8 pages.
U.S. Notice of Allowance received for U.S. Appl. No. 15/142,661, dated Feb. 15, 2018, 9 pages.
U.S. Notice of Allowance received for U.S. Appl. No. 15/142,661, dated Oct. 4, 2017, 21 pages.
U.S. Notice of Allowance received for U.S. Appl. No. 15/425,273, dated Mar. 7, 2019, 8 pages.
U.S. Notice of Allowance received for U.S. Appl. No. 15/431,435, dated Jan. 23, 2018, 8 pages.
U.S. Notice of Allowance received for U.S. Appl. No. 15/985,570, dated Mar. 13, 2019, 21 pages.
U.S. Notice of Allowance received for U.S. Appl. No. 14/503,376, dated Jul. 29, 2015, 12 pages.
U.S. Notice of Allowance received for U.S. Appl. No. 14/503,376, dated Sep. 2, 2015, 4 pages.
U.S. Notice of Allowance received for U.S. Appl. No. 14/503,376, dated Sep. 24, 2015, 5 pages.
U.S. Notice of Allowance received for U.S. Appl. No. 14/503,386, dated Jul. 30, 2015, 11 pages.
U.S. Notice of Allowance received for U.S. Appl. No. 14/503,386, dated Sep. 24, 2015, 5 pages.
U.S. Notice of Allowance received for U.S. Appl. No. 14/817,572, dated Nov. 30, 2017, 26 pages.
U.S. Notice of Allowance received for U.S. Appl. No. 14/838,235, dated Dec. 29, 2016, 3 pages.
U.S. Notice of Allowance received for U.S. Appl. No. 14/838,235, dated Oct. 4, 2016, 7 pages.
U.S. Notice of Allowance received for U.S. Appl. No. 15/876,673, dated May 4, 2018, 26 pages.
U.S. Supplemental Notice of Allowance received for U.S. Appl. No. 14/841,608, dated Jan. 25, 2018, 2 pages.
‘wechat.wikia.com’ [online]. “WeChat Wiki”, May 14, 2013, [retrieved on Apr. 23, 2019], via Internet Archive: Wayback Machine URL<http://web.archive.org/web/20130514131044/http://wechat.wikia.com/wiki/WeChat_Wiki>. 6 pages.
‘wikihow.com’ [online]. “How to Move Mail to Different Folders in Gmail,” available on or before Jul. 31, 2014, [retrieved on Apr. 23, 2019], via Internet Archive: Wayback Machine URL<https://web.archive.org/web/20140731230338/http://www.wikihow.com/Move-Mail-to-Different-Folders-in-Gmail>. 4 pages.
‘youtube.com’ [online]. “How to Dismiss Banner Notifications or Toast Notifications on iOS7,” Dec. 17, 2013, [retrieved on Apr. 23, 2019], retrieved from: URL<https://www.youtube.com/watch?v=vSjHnBFIW_M>. 2 pages.
‘youtube.com’ [online]. “How to Send a Picture Message/MMS—Samsung Galaxy Note 3,” Nov. 3, 2013, [retrieved on Apr. 23, 2019], retrieved from: URL<https://www.youtube.com/watch?v=-3d0z8-KeDw>. 2 pages.
‘youtube.com’ [online]. “iOS 7 Notification Center Complete Walkthrough,” Jun. 10, 2013, [retrieved on Apr. 23, 2019], retrieved from: URL<https://www.youtube.com/watch?v=gATXt-o42LA>. 3 pages.
‘youtube.com’ [online]. “iOS Notification Banner Pull Down to Notification Center in iOS 7 Beta 5”, Aug. 6, 2013, [retrieved on Apr. 23, 2019], retrieved from: URL<https://www.youtube.com/watch?v=nP0s6ETPxDg>. 2 pages.
‘youtube.com’ [online]. “Notification & Control Center Problem Issue Solution” Dec. 6, 2013, [retrieved on Apr. 23, 2019], retrieved from: URL<https://www.youtube.com/watch?v=K0zCueYlaTA>. 3 pages.
‘youtube.com’ [online]. “WeChat TVC—Hold to Talk”, May 11, 2013, [retrieved on Apr. 23, 2019], retrieved from: URL<https://www.youtube.com/watch?v=E_UxteOWVSo>. 2 pages.
Related Publications (1)
Number Date Country
20190037353 A1 Jan 2019 US
Continuations (3)
Number Date Country
Parent 15219239 Jul 2016 US
Child 16146774 US
Parent 14636106 Mar 2015 US
Child 15219239 US
Parent 13113856 May 2011 US
Child 14636106 US