Sharing location information among devices

Information

  • Patent Number
    10,375,526
  • Date Filed
    Friday, June 30, 2017
  • Date Issued
    Tuesday, August 6, 2019
Abstract
Methods, systems, apparatus, and computer program products that include, on a first device executing a first instance of a location application, receiving an indication to begin sharing data describing a path traveled by the first device, receiving location data describing the path traveled by the first device, the location data received from a location system of the first device, and the location data including a plurality of locations of the first device, and transmitting the location data in a form usable to enable a user interface of a second instance of a location application executing on a second device to indicate the path traveled by the first device.
Description
TECHNICAL FIELD

The disclosure generally relates to sharing location information among devices.


BACKGROUND

Devices (e.g., mobile devices such as smartphones) can execute location applications that provide information about a location of the device. For example, the device could have a location system which determines a current position of the device and displays the position on a user interface of the device. The location application may also track the location of the device over time and display a path representing motion of the device.


SUMMARY

In one aspect, in general, a method includes on a first device executing a first instance of a location application, receiving an indication to begin sharing data describing a path traveled by the first device, receiving location data describing the path traveled by the first device, the location data received from a location system of the first device, and the location data including a plurality of locations of the first device, and transmitting the location data in a form usable to enable a user interface of a second instance of a location application executing on a second device to indicate the path traveled by the first device. Other aspects may include corresponding systems, apparatus, or computer program products.


Implementations of these aspects may include one or more of the following features. The method includes receiving a request for the second device to receive the data describing the path traveled by the first device, and receiving, at the first device, an authorization to transmit the location data to the second device. The method includes displaying on the first device a visual representation of the path traveled by the first device in the first instance of the location application, and wherein the transmitted location data is usable by the second device to display, in the second instance of a location application, a substantially identical visual representation of the path traveled by the first device. The method includes providing, on the user interface of the first instance of the location application, an indication that the second device has requested to receive the data describing the path traveled by the first device.


In another aspect, in general, a method includes on a first device executing a first instance of a location application, receiving an indication to receive shared data describing a path traveled by a second device, receiving location data describing the path traveled by the second device, the location data including a plurality of locations of the second device, and based on the received location data, indicating the path traveled by the second device in a user interface of the first instance of a location application executing on the first device. Other aspects may include corresponding systems, apparatus, or computer program products.


Implementations of these aspects may include one or more of the following features. The path traveled by the second device is indicated before at least some locations of the second device are received, and wherein the path traveled by the second device is subsequently updated in response to receiving at least some locations of the second device. Indicating the path traveled by the second device includes displaying, on the first device, a visual representation of the path traveled by the second device. Indicating the path traveled by the second device includes displaying, on the first device, a visual representation of directions for a user of the first device to follow the path. Indicating the path traveled by the second device includes providing, by the first device, a spoken word description of directions for a user of the first device to follow the path. The method includes indicating, on the first device, a spoken message generated based on a message provided by a user of the second device.


Details of one or more implementations are set forth in the accompanying drawings and the description below. Other features, aspects, and potential advantages will be apparent from the description and drawings, and from the claims.





DESCRIPTION OF DRAWINGS


FIG. 1 shows a first device sharing location information with a second device.



FIGS. 2A and 2B are views of location information user interfaces of devices.



FIGS. 3A and 3B are views of authorization user interfaces of devices.



FIG. 4 is a flowchart of an exemplary process of sharing location information.



FIG. 5 is a flowchart of another exemplary process of indicating shared location information.



FIG. 6 is a block diagram of an example computing device that can implement the features and processes of FIGS. 1-5.





Like reference symbols in the various drawings indicate like elements.


DETAILED DESCRIPTION

A device in motion can record data about the path it travels and send the path data to another device. A user of the second device can then use the data to see where the first device traveled and/or to travel the same path.


For example, while the first user is driving a car, she could be running a maps application on the first device and share the path she is traveling with the user of the second device while the second user is also driving a car. The second device could then display the path in an instance of the maps application running on the second device, display directions the second user could use to follow the first user, or generate spoken-word directions the second user could use to follow the first user, all in real time.



FIG. 1 shows a first device 100 sharing location information with a second device 102. The first device 100 is contained within an automobile 104 and the second device 102 is contained within another automobile 106. While the first automobile 104 travels on a road, the first device 100 transmits location information to the second device 102. A user of the second device 102, e.g., a driver or passenger of the second automobile 106, can use the location information to guide the second automobile 106 along the same road to follow the first automobile 104.


In some implementations, the first device 100 may display location information 110 in the form of a map on a user interface of the first device 100. The first device 100 determines the location information 110, for example, using a location information facility of the first device 100. In some implementations, the location information facility is a Global Navigation Satellite System (GNSS) facility, for example, a GPS (Global Positioning System) facility. The location information facility may include a transmitter and receiver of location signals, e.g., GNSS signals. In some implementations, the first device 100 determines location information 110 using a location information facility other than a GNSS facility. For example, the first device may determine location information 110 using a wireless network facility, such as a wireless network transceiver. In some implementations, the wireless network transceiver can be used to determine a location of the first device 100 by collecting information about nearby wireless networks (e.g., 802.11 networks) of known location. In some implementations, other location information facilities can be used to determine a location of the first device 100. In some implementations, a location information facility external to the first device 100 is used (e.g., an external device that communicates a location to the first device 100). In some implementations, the location information can be entered manually by a user of the first device 100.
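
A minimal sketch (in Python, not from the patent) of the idea in the preceding paragraph: a device asks its location information facilities for a fix in priority order, falling back from GNSS to a Wi-Fi survey to manual entry. The LocationFix type and the two stub facilities are hypothetical names introduced for illustration.

```python
from dataclasses import dataclass
from typing import Optional
import time

@dataclass
class LocationFix:
    """A single position report, e.g., as produced by a GNSS facility."""
    latitude: float
    longitude: float
    timestamp: float  # seconds since the epoch
    source: str       # e.g., "gnss", "wifi", "manual"

def current_fix(sources: list) -> Optional[LocationFix]:
    """Try each location information facility in priority order and
    return the first fix that succeeds."""
    for source in sources:
        fix = source()
        if fix is not None:
            return fix
    return None

# Hypothetical facilities; a real device would query hardware instead.
def gnss_fix() -> Optional[LocationFix]:
    return LocationFix(42.3601, -71.0589, time.time(), "gnss")

def wifi_fix() -> Optional[LocationFix]:
    return None  # e.g., no known 802.11 networks in range

print(current_fix([gnss_fix, wifi_fix]))
```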


The second device 102 displays location information 112 based on the location information 110 determined by the first device 100. For example, the second device 102 may display the location information 112 in the form of a map that includes a current location of the first device 100.


In use, the first device 100 transmits 114 location information which is received 116 by the second device 102. The second device 102 uses the information transmitted by the first device 100 to display location information 112 that includes a location of the first device 100. The information can be transmitted and received using a communication medium usable by the devices 100, 102. For example, the first device 100 may transmit 114 the location information using a wireless communications facility such as a wireless communications transceiver. The information may be transmitted using a wireless communications transceiver that communicates on a mobile communications network (e.g., 3G, LTE, WiMAX, etc.), a wireless LAN (e.g., 802.11), or another kind of wireless network. When a mobile communications network or a wireless network is used, an intermediary may receive the location information from the first device 100 and then re-transmit it to the second device 102. For example, the intermediary may be a server (e.g., a “cloud” server accessible to both the first device 100 and the second device 102) that receives and re-transmits the location information.
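
One way to picture the intermediary described above is as a relay that accepts location updates from the sharing device and re-transmits them to each subscribed receiver. The sketch below (Python, illustrative only) keeps everything in memory; a real “cloud” server would sit on a mobile communications network or wireless LAN as the text describes.

```python
from collections import defaultdict
from typing import Callable

class LocationRelay:
    """Toy stand-in for the intermediary server: it receives location
    information from a sharing device and re-transmits it to every
    device subscribed to that sharer."""

    def __init__(self) -> None:
        self._subscribers = defaultdict(list)  # sharer id -> callbacks

    def subscribe(self, sharer_id: str, deliver: Callable) -> None:
        self._subscribers[sharer_id].append(deliver)

    def publish(self, sharer_id: str, update: dict) -> None:
        for deliver in self._subscribers[sharer_id]:
            deliver(update)  # re-transmit to each receiver

relay = LocationRelay()
relay.subscribe("device-100", lambda u: print("device 102 received", u))
relay.publish("device-100", {"lat": 42.36, "lon": -71.06})
```

Because any number of receivers can subscribe, the same shape also covers the “broadcast” mode mentioned later in this section.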


In some implementations, the wireless communications facility operates using a point-to-point communications protocol that allows the first device 100 to transmit 114 location information directly to a wireless communications transceiver of the second device 102. For example, the point-to-point communications protocol could be a Bluetooth protocol.


In some implementations, the location information 112 also includes a path followed by the first device 100. For example, the location information 112 displayed by the second device 102 may include a starting location of the first device 100 and may include graphical elements (e.g., lines on a map) representing the path followed by the first device 100. A user of the second device 102 could use this information to not only identify a current location of the first device 100 but also re-create the path followed by the first device 100.


In some implementations, the first device 100 does not display location information 110, and the location information is determined on the first device 100 with a primary purpose of transmission to the second device 102 (e.g., for display on the second device 102).


In this example, the devices 100, 102 are shown as mobile devices (e.g., smartphones). Either of the devices 100, 102 could be any kind of electronic device, e.g., a laptop computer, a tablet computer, a wrist-mounted computer, a personal digital assistant, a built-in navigation system (e.g., built into either or both of the automobiles 104, 106), or any other kind of device capable of transmitting and receiving information. The devices 100, 102 need not be devices of the same type; for example, one device 100 could be a smartphone and the other device 102 could be a laptop computer.


In this example, the devices 100, 102 are shown as each contained within automobiles 104, 106. However, in some examples, either or both of the devices 100, 102 could be carried by another type of vehicle (e.g., bicycle, motorcycle, jet-ski, helicopter, etc.). Either or both of the devices 100, 102 could be carried by a human being on foot, or an animal, or a robot.


The location information represents a path 108 (e.g., a path corresponding to a road or other geographical feature) that the first device 100 is traveling. Data describing the path 108 may be recorded by the first device 100. For example, the first device 100 may record data describing multiple locations (e.g., positions) of the first device 100 over time, and the data describing the path 108 includes the data describing the multiple locations over time.
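
The paragraph above describes the path 108 as multiple locations of the device recorded over time. A minimal sketch of such a recorder, in Python with assumed field names, might look like this:

```python
from dataclasses import dataclass, field

@dataclass
class PathRecorder:
    """Accumulates timestamped locations so a path can be described
    as multiple locations of the device over time."""
    points: list = field(default_factory=list)

    def record(self, timestamp: float, lat: float, lon: float) -> None:
        self.points.append((timestamp, lat, lon))

    def as_payload(self) -> dict:
        # A form usable by a receiving device to draw the path.
        return {"path": [{"t": t, "lat": la, "lon": lo}
                         for (t, la, lo) in self.points]}

recorder = PathRecorder()
recorder.record(0.0, 42.3601, -71.0589)
recorder.record(30.0, 42.3615, -71.0571)
print(recorder.as_payload())
```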


In some examples, the first device 100 transmits location information to multiple other receiving devices (e.g., the second device 102 as well as other devices). In some implementations, the first device 100 may have a “broadcast” mode in which the location of the first device 100 is shared with any number of other devices. The first device 100 may make this information publicly available, e.g., in a database or other resource available to many other users of mobile devices. In this example, the first device 100 may not receive any communications from other devices and may not be notified that other devices are receiving location information. The database or other resource may also store information about the location of the first device 100 over time for later retrieval by other users.



FIGS. 2A and 2B are views of location information user interfaces of devices. FIG. 2A shows a first device 200 and FIG. 2B shows a second device 202. For example, the first device 200 could be an example of the first device 100 shown in FIG. 1, and the second device 202 could be an example of the second device 102 shown in FIG. 1. The first device 200 displays a user interface 204 representing a current location 206 of the first device 200. The current location 206 may be displayed in the context of a map 208 of the region surrounding the first device 200. The map 208 may include streets, landmarks, geographical features, etc. The map 208 can also display a visual representation of a path 210 representing the route that the first device 200 has taken (e.g., representing previous current locations of the first device 200). In some implementations, the user interface 204 has been configured to indicate directions 212 to a specified destination (e.g., a destination specified by a user of the first device 200). The directions 212 could be indicated as textual directions, or spoken-word directions (using an audio output facility of the first device 200), or in another manner.


The second device 202 displays a user interface 220 representing location information received from the first device 200 (e.g., received in the form of data transmitted using a mobile communications network or other communications medium). In some examples, the user interface 220 could display a current location 222 of the first device 200. The user interface 220 may also display a visual representation of a path 224 traveled by the first device 200 (e.g., representing previous locations of the first device 200).


The path 224 may be displayed in the user interface 220 upon the initiation of sharing of location information, such that locations of the first device 200 recorded prior to the initiation of the sharing can be displayed on the user interface 220 of the second device 202. For example, when the first device 200 begins transmitting location information, the first device 200 may have already completed a portion of a journey. The journey could be, e.g., a journey from a start destination to an end destination each specified by a user of the first device 200. For example, a journey is sometimes specified in connection with functionality for providing directions from a start destination to an end destination. A portion of a journey already in progress (e.g., previous locations of the first device 200 recorded by the first device 200) can be transmitted to the second device 202, and a visual representation of the portion of the journey (e.g., a path representing the portion of the journey) can be displayed on the user interface 220 of the second device 202. As the first device 200 records further locations (e.g., continues the journey), the locations can be transmitted to the second device 202, and the path 224 displayed on the user interface 220 of the second device 202 can be subsequently updated.
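
The backfill-then-update behavior described in this paragraph can be sketched as two steps: on initiation of sharing, send the portion of the journey already recorded; afterwards, stream each new location as it is recorded. The message shapes below are assumptions for illustration, not a protocol defined by the patent.

```python
def start_sharing(recorded_points: list, send) -> None:
    """On initiation of sharing, transmit locations recorded before
    sharing began, so the receiver can draw the completed portion."""
    send({"type": "history", "points": recorded_points})

def on_new_location(point: dict, send) -> None:
    """Each further location updates the path on the receiving side."""
    send({"type": "update", "point": point})

sent = []
start_sharing([{"lat": 42.3601, "lon": -71.0589}], sent.append)
on_new_location({"lat": 42.3615, "lon": -71.0571}, sent.append)
print(sent)  # one history message followed by one incremental update
```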


In some examples, the visual representation of the path 224 displayed on the second device 202 may be substantially identical to the visual representation of the path 210 displayed on the first device 200. For example, the path 224 displayed on the second device 202 may be shown with the same shapes and colors used to represent the path 210 displayed on the first device 200. The current location 222 and the path 224 may be displayed on the second device 202 simultaneous with the display of the current location 206 and path 210 displayed on the first device 200. In some examples, the user interface 220 may display a current location 226 of the second device 202.


In some examples, the user interface 220 of the second device can be configured to display exactly the same view as shown on the user interface 204 of the first device 200. This may take the form of a “mirroring” mode in which the second device 202 receives information from the first device 200 usable to replicate the view shown on the user interface 204 of the first device 200. For example, if the second device 202 has entered the “mirroring” mode, the first device 200 may transmit information to the second device 202 representing a current zoom level of the map 208, boundaries of the map 208 as displayed on the first device 200, and other data describing what is displayed on the first device 200. In some examples, the directions 212 indicated in the user interface 204 of the first device 200 could also be indicated in the user interface 220 of the second device 202.
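
In a “mirroring” mode, the transmitted data must describe the view itself, not just the locations. Below is a sketch of what such a message might carry, with illustrative field names; the patent names the zoom level and map boundaries but does not fix a format.

```python
from dataclasses import dataclass, asdict

@dataclass
class ViewportState:
    """View-describing data a first device might transmit so a second
    device can replicate its map display."""
    zoom_level: int
    north: float  # map boundaries as displayed on the first device
    south: float
    east: float
    west: float

def mirror_message(state: ViewportState, path: list) -> dict:
    return {"viewport": asdict(state), "path": path}

msg = mirror_message(ViewportState(14, 42.37, 42.35, -71.05, -71.07),
                     [{"lat": 42.3601, "lon": -71.0589}])
print(msg["viewport"])
```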


In some implementations, the user interface 220 of the second device 202 can be configured to indicate directions 228 to the current location 222 of the first device 200. In some examples, the directions 228 indicate a route to follow which corresponds to the path 224 traveled by the first device 200. In some implementations, the directions 228 are generated by the second device 202 based only on the current location 222 of the first device 200 and may not correspond to the path 224 traveled by the first device 200 (e.g., if the first device 200 followed a meandering path, had to backtrack, or otherwise followed an inefficient route). The directions 228 could be indicated as textual directions, or spoken-word directions (using an audio output facility of the second device 202), or in another manner.


In some implementations, a user of the first device 200 can send a message to a user of the second device 202. For example, the user interface 204 displayed on the first device 200 could provide functionality for the user of the first device 200 to enter a textual message or spoken-word message which is then transmitted to the second device 202. In some examples, users of the first device 200 and the second device 202 may be operating motor vehicles, and so those users may each favor a spoken-word message for safety reasons (e.g., to avoid operating their respective devices while also operating controls of the motor vehicle). For example, the user interface 204 of the first device 200 could include a button 230 which, when invoked (e.g., pressed, tapped, or invoked by a spoken-word command), records a message spoken by the user of the first device 200. In use, the message can include, for example, information helpful to the user of the second device 202 in navigation (e.g., “After the bridge I will be turning right onto Beacon Street”).


In some examples, a recording of the message could be transmitted to the second device 202. In some examples, the spoken-word message could be converted to textual data by a voice recognition mechanism. The textual data could then be transmitted to the second device 202 and displayed in the user interface 220 of the second device as text or converted to audio (e.g., by a text-to-speech engine) or both. In some examples, the user interface 220 of the second device 202 enables a user of the second device 202 to send a message to a user of the first device 200. In some examples, multiple devices receive location information from the first device 200, and the message recorded by the user of the first device 200 can be sent to multiple devices.
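
A message like the one described above can travel as recorded audio, as text produced by voice recognition, or both, and can fan out to every device receiving the location. A small illustrative container follows (Python; the field names are assumptions):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SharedMessage:
    """A message from the sharing user: raw audio, recognized text,
    or both. A receiver can display the text, convert it to audio
    with a text-to-speech engine, or play the recording."""
    text: Optional[str] = None
    audio: Optional[bytes] = None

def deliver(msg: SharedMessage, receivers: list) -> None:
    # The same message can go to every device receiving the location.
    for send in receivers:
        send(msg)

inbox = []
deliver(SharedMessage(text="After the bridge I will be turning right "
                           "onto Beacon Street"), [inbox.append])
print(inbox[0].text)
```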


In some implementations, the user interface 204 on the first device 200 and the user interface 220 on the second device 202 are each displayed in connection with the execution of a location application. A location application is any software that provides facilities for determining or displaying location information. For example, the user interface 204 on the first device 200 could be generated by a location application executing on the first device 200, and the user interface 220 on the second device 202 could be generated by a location application executing on the second device 202. An application executing on a device can be said to be running on the device, and a particular execution of an application is sometimes called an “instance.” In some implementations, the location application running on the first device may be the same application (e.g., the same program code) as the location application running on the second device. In some implementations, the location application running on the first device may be a different application from the location application running on the second device.



FIGS. 3A and 3B are views of authorization user interfaces of devices. FIG. 3A shows a first device 300 and FIG. 3B shows a second device 302. For example, the first device 300 could be an example of the second device 102 shown in FIG. 1, and the second device 302 could be an example of the first device 100 shown in FIG. 1.


The first device 300 displays a user interface 304 that enables a user of the first device 300 to request location information from another device. The user interface 304 displays a list 306 of users available to a user of the first device 300. A user of the first device 300 selects a user in the list 306 of users and presses a “request location” button 308. The first device 300 then sends a request to another device operated by the identified user (e.g., the second device 302) for that other device to share location information with the first device 300.


In some implementations, the list 306 of users could be a list of users associated with a user profile or user account of a user of the first device 300. For example, the ability to share location data may be enabled by a service external to the first device 300, e.g., a “cloud” service available using a network such as the Internet. The “cloud” service may enable users to establish user profiles or user accounts. Each user who has established a user profile or user account may configure the profile or account to include information about a device operated by the user, so that the “cloud” service can identify devices operated by that user when location information is requested of the user. In some examples, a user of the “cloud” service may also establish a list (e.g., the list 306 shown in the user interface 304) of other users of the “cloud” service from whom that user may wish to request location information. When a user of the first device 300 selects a user 310 in the list 306 of users and presses the “request location” button 308, the first device 300 may contact the “cloud” service (e.g., a server made available on the Internet by the “cloud” service). The “cloud” service can determine a device currently associated with the user 310 on the list 306 of users (e.g., by receiving regular updates from the user on the list 306 of users about which device the user is using) and make a request for location information from the second device 302 on behalf of the first device 300.


In some examples, the list 306 of users could be a “contacts list” available on the first device 300. The “contacts list” may be a list of contacts for whom a user of the first device 300 has entered contact information, e.g., name, email address, mobile phone number, etc. When a user of the first device 300 selects a user 310 in the list 306 of users and presses the “request location” button 308, the first device 300 may use contact information associated with the selected user 310 to transmit a request to the selected user. For example, the first device 300 may transmit an email message or SMS (short message service) message to the email address or mobile phone number associated with the selected user 310 making a request to receive location information from a device (e.g., the second device 302) operated by the selected user 310. In some implementations, the first device 300 can use a mobile phone number to identify the second device 302 and determine a manner of communicating with the second device 302 based on the mobile telephone number. In some examples, the first device 300 could access a table of mobile phone numbers (e.g., a table available on the Internet or made available by a “cloud” service that stores the table on a publicly accessible server) and retrieve identifying information for the second device 302 based on the mobile phone number. For example, the table could correlate mobile phone numbers to IP (Internet Protocol) addresses, and the first device 300 could use the retrieved IP address to send a communication to the second device 302 over the Internet (or other network).
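
The table correlating mobile phone numbers to IP addresses, as described above, amounts to a simple lookup. A toy sketch follows; the numbers and addresses are invented documentation-range values, and a real deployment would query the “cloud” service's server rather than a local table.

```python
# Toy stand-in for the table that correlates mobile phone numbers to
# IP addresses; the entries here are invented for illustration.
PHONE_TO_IP = {
    "+15551230001": "203.0.113.10",
}

def address_for(phone_number: str) -> str:
    """Retrieve identifying information for a device from its number."""
    try:
        return PHONE_TO_IP[phone_number]
    except KeyError:
        raise LookupError(f"no device registered for {phone_number}")

print(address_for("+15551230001"))  # the requester can now contact it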


When the first device 300 communicates a request to receive location information from the second device 302, the second device 302 may enable a user of the second device 302 (e.g., the selected user 310 shown in FIG. 3A) to authorize the request. For example, as shown in FIG. 3B, the second device 302 can display a user interface 320 indicating that another user 322 (e.g., a user of the first device 300) has made a request to receive location information from the second device 302. The user interface 320 includes buttons 324, 326 that enable the user of the second device 302 to approve or reject the request. In this way, the user of the second device 302 may keep his or her location information private if he or she chooses.


In some implementations, the user interface 320 displayed on the second device 302 can include location information 328 from the first device 300, for example, information representing a current location of the first device 300. In some examples, when the first device 300 transmits a request to receive location information from the second device 302, the first device 300 can also transmit location information such as the current location of the first device 300. The second device 302 can then display the location information 328 in the user interface 320 when the second device 302 enables a user of the second device 302 to approve the request. In this way, the user of the second device 302 can know the location of the first device 300 before approving the request. For example, the user of the second device 302 may choose to only share location information with users who are physically nearby.
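
The request-and-approve exchange of FIGS. 3A and 3B can be sketched as a request message that optionally carries the requester's current location, plus a user decision standing in for the approve and reject buttons 324, 326. Everything below is illustrative Python, not the patent's protocol.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ShareRequest:
    """A request to receive location information; it may carry the
    requester's own current location (location information 328) so
    the recipient can see where the requester is before approving."""
    requester: str
    requester_location: Optional[dict] = None

def handle_request(req: ShareRequest, approve) -> bool:
    """`approve` stands in for the user's choice via buttons 324, 326."""
    return approve(req)

# Example policy: only approve requesters who disclosed a location.
granted = handle_request(
    ShareRequest("user-322", {"lat": 42.36, "lon": -71.06}),
    approve=lambda r: r.requester_location is not None)
print(granted)  # True
```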


If the user of the second device 302 chooses to approve the request to receive location information, the second device 302 will then transmit location information (e.g., previous and current locations of the second device) to the first device 300. The first device 300 can then display the location information, e.g., in the form of the user interface 220 shown in FIG. 2B.


Once the sharing of location information has been established, a user interface is displayed on the first device 300 showing location information (e.g., current location and a path traveled) of the second device 302. For example, the user interface that is displayed could be the user interface 220 shown in FIG. 2B.


In some implementations, a user of the second device 302 could choose to share location information with a user of the first device 300 by selecting the user of the first device 300 from a list of users displayed on the second device 302 (similar to the list 306 of users shown in FIG. 3A).



FIG. 4 is a flowchart of an exemplary process 400 of sharing location information. The process 400 can be performed, for example, by the first device 100 shown in FIG. 1.


An indication to begin sharing data describing a path traveled by a first device is received (402). The first device executes a first instance of a location application. In some implementations, an indication is provided, on the user interface of the first instance of the location application, that the second device has requested to receive the data describing the path traveled by the first device.


Location data describing the path traveled by the first device is received (404). The location data is received from a location system of the first device, and the location data includes a plurality of locations of the first device. For example, the location system could be a GNSS system.


The location data is transmitted (406) in a form usable to enable a user interface of a second instance of a location application executing on a second device to indicate the path traveled by the first device. For example, the first device could display a visual representation of the path traveled by the first device in the first instance of the location application, and the transmitted location data can be usable by the second device to display, in a second instance of a location application, a substantially identical visual representation of the path traveled by the first device. The location data could be transmitted to a device other than the second device.


In some implementations, a request is received from the second device to receive the data describing the path traveled by the first device, and an authorization to transmit the location data to the second device is received at the first device. For example, the authorization can be received at a user interface of the first device (e.g., from a user of the first device).
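
Putting the numbered steps of process 400 together, a compact sketch (Python; the location system and transmitter are stubbed, and the authorization check reflects the paragraph above):

```python
def process_400(location_system, transmit, authorized: bool) -> None:
    """Receive the indication to share (402), read locations from the
    device's location system (404), and transmit them in a form a
    second device can use to indicate the path (406)."""
    if not authorized:
        return  # no authorization received at the first device
    points = [location_system() for _ in range(3)]  # (404)
    transmit({"path": points})                      # (406)

fake_gnss = iter([(42.3601, -71.0589), (42.3610, -71.0580),
                  (42.3615, -71.0571)])
process_400(lambda: next(fake_gnss), print, authorized=True)
```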



FIG. 5 is a flowchart of another exemplary process 500 of indicating shared location information. The process 500 can be performed, for example, by the second device 102 shown in FIG. 1.


An indication is received (502), on a first device executing a first instance of a location application, to receive shared data describing a path traveled by a second device.


Location data describing the path traveled by the second device is received (504) by the first device, the location data including a plurality of locations of the second device.


Based on the received location data, the path traveled by the second device is indicated (506) in a user interface of the first instance of the location application executing on the first device. In some implementations, the path traveled by the second device is displayed on the first device as a visual representation of the path traveled by the second device. In some implementations, the path traveled by the second device is displayed on the first device as a visual representation of the directions for a user of the first device to follow the path. In some implementations, a spoken word description of directions for a user of the first device to follow the path is provided. In some implementations, a spoken message generated based on a message provided by a user of the second device is indicated on the first device. In some implementations, the path traveled by the second device is indicated before at least some locations of the second device are received, and the path traveled by the second device is subsequently updated in response to receiving at least some locations of the second device.
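
Process 500 is the receiving counterpart; a sketch under the same assumptions, showing the path being indicated and then updated as further locations of the second device arrive:

```python
def process_500(receive_indication, receive_points, indicate) -> None:
    """Accept the indication to receive shared data (502), receive
    the locations (504), and indicate the path, updating it as each
    new location arrives (506)."""
    receive_indication()            # (502)
    path = []
    for point in receive_points():  # (504)
        path.append(point)
        indicate(list(path))        # (506) indicate, then update

process_500(lambda: None,
            lambda: iter([(42.3601, -71.0589), (42.3615, -71.0571)]),
            lambda p: print("path so far:", p))
```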


This disclosure describes various Graphical User Interfaces (GUIs) for implementing various features, processes or workflows. These GUIs can be presented on a variety of electronic devices including but not limited to laptop computers, desktop computers, computer terminals, television systems, tablet computers, e-book readers and smart phones. One or more of these electronic devices can include a touch-sensitive surface. The touch-sensitive surface can process multiple simultaneous points of input, including processing data related to the pressure, degree or position of each point of input. Such processing can facilitate gestures with multiple fingers, including pinching and swiping.


When the disclosure refers to “select” or “selecting” user interface elements in a GUI, these terms are understood to include clicking or “hovering” with a mouse or other input device over a user interface element, or touching, tapping or gesturing with one or more fingers or a stylus on a user interface element. User interface elements can be virtual buttons, menus, selectors, switches, sliders, scrubbers, knobs, thumbnails, links, icons, radio buttons, checkboxes and any other mechanism for receiving input from, or providing feedback to, a user.



FIG. 6 is a block diagram of an example computing device 600 that can implement the features and processes of FIGS. 1-5. The computing device 600 can include a memory interface 602, one or more data processors, image processors and/or central processing units 604, and a peripherals interface 606. The memory interface 602, the one or more processors 604 and/or the peripherals interface 606 can be separate components or can be integrated in one or more integrated circuits. The various components in the computing device 600 can be coupled by one or more communication buses or signal lines.


Sensors, devices, and subsystems can be coupled to the peripherals interface 606 to facilitate multiple functionalities. For example, a motion sensor 610, a light sensor 612, and a proximity sensor 614 can be coupled to the peripherals interface 606 to facilitate orientation, lighting, and proximity functions. Other sensors 616 can also be connected to the peripherals interface 606, such as a global navigation satellite system (GNSS) (e.g., GPS receiver), a temperature sensor, a biometric sensor, or other sensing device, to facilitate related functionalities.


A camera subsystem 620 and an optical sensor 622, e.g., a charge-coupled device (CCD) or a complementary metal-oxide semiconductor (CMOS) optical sensor, can be utilized to facilitate camera functions, such as recording photographs and video clips. The camera subsystem 620 and the optical sensor 622 can be used to collect images of a user to be used during authentication of a user, e.g., by performing facial recognition analysis.


Communication functions can be facilitated through one or more wireless communication subsystems 624, which can include radio frequency receivers and transmitters and/or optical (e.g., infrared) receivers and transmitters. The specific design and implementation of the communication subsystem 624 can depend on the communication network(s) over which the computing device 600 is intended to operate. For example, the computing device 600 can include communication subsystems 624 designed to operate over a GSM network, a GPRS network, an EDGE network, a Wi-Fi or WiMax network, and a Bluetooth™ network. In particular, the wireless communication subsystems 624 can include hosting protocols such that the computing device 600 can be configured as a base station for other wireless devices.


An audio subsystem 626 can be coupled to a speaker 628 and a microphone 630 to facilitate voice-enabled functions, such as speech-to-text, speaker recognition, voice replication, digital recording, and telephony functions. The audio subsystem 626 can be configured to facilitate processing voice commands, voiceprinting, and voice authentication, as described above with reference to FIGS. 1-5. In some implementations, audio recorded by the audio subsystem 626 is transmitted to an external resource for processing. For example, voice commands recorded by the audio subsystem 626 may be transmitted to a network resource such as a network server which performs voice recognition on the voice commands.


The I/O subsystem 640 can include a touch-surface controller 642 and/or other input controller(s) 644. The touch-surface controller 642 can be coupled to a touch surface 646. The touch surface 646 and touch-surface controller 642 can, for example, detect contact and movement or break thereof using any of a plurality of touch sensitivity technologies, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with the touch surface 646.


The other input controller(s) 644 can be coupled to other input/control devices 648, such as one or more buttons, rocker switches, thumb-wheel, infrared port, USB port, and/or a pointer device such as a stylus. The one or more buttons (not shown) can include an up/down button for volume control of the speaker 628 and/or the microphone 630.


In one implementation, a pressing of the button for a first duration can disengage a lock of the touch surface 646; and a pressing of the button for a second duration that is longer than the first duration can turn power to the computing device 600 on or off. Pressing the button for a third duration can activate a voice control, or voice command, module that enables the user to speak commands into the microphone 630 to cause the device to execute the spoken command. The user can customize a functionality of one or more of the buttons. The touch surface 646 can, for example, also be used to implement virtual or soft buttons and/or a keyboard.


In some implementations, the computing device 600 can present recorded audio and/or video files, such as MP3, AAC, and MPEG files. In some implementations, the computing device 600 can include the functionality of an MP3 player, such as an iPod™. The computing device 600 can, therefore, include a 30-pin connector that is compatible with the iPod. Other input/output and control devices can also be used.


The memory interface 602 can be coupled to memory 650. The memory 650 can include high-speed random access memory and/or non-volatile memory, such as one or more magnetic disk storage devices, one or more optical storage devices, and/or flash memory (e.g., NAND, NOR). The memory 650 can store an operating system 652, such as Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks.


The operating system 652 can include instructions for handling basic system services and for performing hardware dependent tasks. In some implementations, the operating system 652 can be a kernel (e.g., UNIX kernel). In some implementations, the operating system 652 can include instructions for performing voice authentication. For example, the operating system 652 can implement the security lockout, voiceprint, and voice authentication features described with reference to FIGS. 1-5.


The memory 650 can also store communication instructions 654 to facilitate communicating with one or more additional devices, one or more computers and/or one or more servers. The memory 650 can include graphical user interface instructions 656 to facilitate graphic user interface processing; sensor processing instructions 658 to facilitate sensor-related processing and functions; phone instructions 660 to facilitate phone-related processes and functions; electronic messaging instructions 662 to facilitate electronic-messaging related processes and functions; web browsing instructions 664 to facilitate web browsing-related processes and functions; media processing instructions 666 to facilitate media processing-related processes and functions; GNSS/Navigation instructions 668 to facilitate GNSS and navigation-related processes and functions; and/or camera instructions 670 to facilitate camera-related processes and functions.


The memory 650 can store other software instructions 672 to facilitate other processes and functions, such as the security and/or authentication processes and functions as described with reference to FIGS. 1-5. For example, the software instructions can include instructions for performing voice authentication on a per application or per feature basis and for allowing a user to configure authentication requirements of each application or feature available on the computing device 600.


The memory 650 can also store other software instructions (not shown), such as web video instructions to facilitate web video-related processes and functions; and/or web shopping instructions to facilitate web shopping-related processes and functions. In some implementations, the media processing instructions 666 are divided into audio processing instructions and video processing instructions to facilitate audio processing-related processes and functions and video processing-related processes and functions, respectively. An activation record and International Mobile Equipment Identity (IMEI) 674 or similar hardware identifier can also be stored in memory 650.


Each of the above identified instructions and applications can correspond to a set of instructions for performing one or more functions described above. These instructions need not be implemented as separate software programs, procedures, or modules. The memory 650 can include additional instructions or fewer instructions. Furthermore, various functions of the computing device 600 can be implemented in hardware and/or in software, including in one or more signal processing and/or application specific integrated circuits.

Claims
  • 1. A method comprising: on a first device executing a first instance of an application: determining first location data representing a location of the first device; displaying a first user interface to a first user of the first device, the first user interface including a representation of the location of the first device; receiving second location data representing a location of a second device, the second location data received before the second device is authorized to receive location data from the first device; displaying a representation of the location of the second device; receiving an indication to access a second user interface; displaying a list of users to the first user of the first device; receiving a selection of a second user from the list of users; and in response to user input provided in the second user interface, transmitting the first location data representing the location of the first device for use by the second device, the first location data usable by the second device to display a location of the first device.
  • 2. The method of claim 1, comprising receiving a selection of a third user from the list of users, the third user associated with a third device; and transmitting the first location data representing the location of the first device for use by the third device, the first location data usable by the third device to display a location of the first device.
  • 3. The method of claim 1, wherein the second device is identified as associated with the second user based on data received from a cloud service.
  • 4. The method of claim 3, wherein transmitting the first location data representing the location of the first device for use by the second device comprises transmitting the first location data to the cloud service.
  • 5. The method of claim 3, wherein the cloud service comprises one or more servers made available on the Internet.
  • 6. The method of claim 1, wherein the list of users is determined based on a user profile or user account of the user of the first device.
  • 7. The method of claim 1, wherein the list of users is received from a cloud service.
  • 8. A computer readable storage device encoded with instructions that, when executed by a first device executing a first instance of an application, cause the first device to carry out operations comprising: determining first location data representing a location of the first device; displaying a first user interface to a first user of the first device, the first user interface including a representation of the location of the first device; receiving second location data representing a location of a second device, the second location data received before the second device is authorized to receive location data from the first device; displaying a representation of the location of the second device; receiving an indication to access a second user interface; displaying a list of users to the first user of the first device; receiving a selection of a second user from the list of users; and in response to user input provided in the second user interface, transmitting the first location data representing the location of the first device for use by the second device, the first location data usable by the second device to display a location of the first device.
  • 9. The computer-readable storage device of claim 8, the operations comprising receiving a selection of a third user from the list of users, the third user associated with a third device; and transmitting the first location data representing the location of the first device for use by the third device, the first location data usable by the third device to display a location of the first device.
  • 10. The computer-readable storage device of claim 8, wherein the second device is identified as associated with the second user based on data received from a cloud service.
  • 11. The computer-readable storage device of claim 10, wherein transmitting the first location data representing the location of the first device for use by the second device comprises transmitting the first location data to the cloud service.
  • 12. The computer-readable storage device of claim 10, wherein the cloud service comprises one or more servers made available on the Internet.
  • 13. The computer-readable storage device of claim 8, wherein the list of users is determined based on a user profile or user account of the user of the first device.
  • 14. The computer-readable storage device of claim 8, wherein the list of users is received from a cloud service.
  • 15. A system comprising: a location system configured to determine a location of a first device; and a processor configured for, while a first instance of an application is executing on the first device, operations including: determining first location data representing a location of the first device; displaying a first user interface to a first user of the first device, the first user interface including a representation of the location of the first device; receiving second location data representing a location of a second device, the second location data received before the second device is authorized to receive location data from the first device; displaying a representation of the location of the second device; receiving an indication to access a second user interface; displaying a list of users to the first user of the first device; receiving a selection of a second user from the list of users; and in response to user input provided in the second user interface, transmitting the first location data representing the location of the first device for use by the second device, the first location data usable by the second device to display a location of the first device.
  • 16. The system of claim 15, the operations comprising receiving a selection of a third user from the list of users, the third user associated with a third device; and transmitting the first location data representing the location of the first device for use by the third device, the first location data usable by the third device to display a location of the first device.
  • 17. The system of claim 15, wherein the second device is identified as associated with the second user based on data received from a cloud service.
  • 18. The system of claim 17, wherein transmitting the first location data representing the location of the first device for use by the second device comprises transmitting the first location data to the cloud service.
  • 19. The system of claim 17, wherein the cloud service comprises one or more servers made available on the Internet.
  • 20. The system of claim 15, wherein the list of users is determined based on a user profile or user account of the user of the first device.
  • 21. The system of claim 15, wherein the list of users is received from a cloud service.
Parent Case Info

This application is a continuation of and claims priority to pending U.S. application Ser. No. 15/042,628, entitled “Sharing Location Information Among Devices,” filed on Feb. 2, 2016, to be issued as U.S. Pat. No. 9,699,617, which is a continuation of and claims priority to U.S. application Ser. No. 14/666,148, entitled “Sharing Location Information Among Devices,” filed Mar. 23, 2015, and issued as U.S. Pat. No. 9,294,882, which is a continuation of and claims priority to U.S. application Ser. No. 13/752,604, entitled “Sharing Location Information Among Devices,” filed on Jan. 29, 2013 and issued as U.S. Pat. No. 8,989,773, the entire contents of which are incorporated herein by reference.

US Referenced Citations (282)
Number Name Date Kind
5475653 Yamada et al. Dec 1995 A
5801700 Ferguson Sep 1998 A
6002402 Schacher Dec 1999 A
6040781 Murray Mar 2000 A
6191807 Hamada et al. Feb 2001 B1
6323846 Westerman et al. Nov 2001 B1
6362842 Tahara et al. Mar 2002 B1
6515585 Yamamoto Feb 2003 B2
6570557 Westerman et al. May 2003 B1
6677932 Westerman et al. Jan 2004 B1
6809724 Shiraishi et al. Oct 2004 B1
7015817 Copley et al. Mar 2006 B2
7039420 Koskinen et al. May 2006 B2
7076257 Kall Jul 2006 B2
7224987 Bhela May 2007 B1
7365736 Marvit et al. Apr 2008 B2
7593749 Vallstrom et al. Sep 2009 B2
7614008 Ording et al. Nov 2009 B2
7633076 Huppi et al. Dec 2009 B2
7653883 Hotelling et al. Jan 2010 B2
7657849 Chaudhri et al. Feb 2010 B2
7663607 Hotelling et al. Feb 2010 B2
7789225 Whiteis Sep 2010 B2
7801542 Stewart Sep 2010 B1
7834861 Lee Nov 2010 B2
7844914 Andre et al. Nov 2010 B2
7908219 Abanami et al. Mar 2011 B2
7953393 Chin et al. May 2011 B2
7957762 Herz et al. Jun 2011 B2
8006002 Kalayjian et al. Aug 2011 B2
8121586 Araradian et al. Feb 2012 B2
8150930 Satterfield et al. Apr 2012 B2
8239784 Hotelling et al. Aug 2012 B2
8244468 Scailisi et al. Aug 2012 B2
8255830 Ording et al. Aug 2012 B2
8279180 Hotelling et al. Oct 2012 B2
8285258 Schultz et al. Oct 2012 B2
8369867 Van Os et al. Feb 2013 B2
8374575 Mullen Feb 2013 B2
8381135 Hotelling et al. Feb 2013 B2
8412154 Leemet et al. Apr 2013 B1
8441367 Lee et al. May 2013 B1
8479122 Hotelling et al. Jul 2013 B2
8572493 Qureshi Oct 2013 B2
8786458 Wiltzius et al. Jul 2014 B1
8855665 Buford et al. Oct 2014 B2
8922485 Lloyd Dec 2014 B1
8971924 Pai et al. Mar 2015 B2
8989773 Sandel Mar 2015 B2
9100944 Newham et al. Aug 2015 B2
9191988 Newham et al. Nov 2015 B2
9204283 Mullen Dec 2015 B2
9247377 Pai et al. Jan 2016 B2
9294882 Sandel Mar 2016 B2
9369833 Tharshanan et al. Jun 2016 B2
9400489 Kim et al. Jul 2016 B2
9402153 Pai et al. Jul 2016 B2
9477208 Lee et al. Oct 2016 B2
9635540 Mullen Apr 2017 B2
9699617 Sandel Jul 2017 B2
20020015024 Westerman et al. Feb 2002 A1
20020037715 Mauney et al. Mar 2002 A1
20020102989 Calvert et al. Aug 2002 A1
20020115478 Fujisawa et al. Aug 2002 A1
20020126135 Ball et al. Sep 2002 A1
20030081506 Karhu May 2003 A1
20030128163 Mizugaki et al. Jul 2003 A1
20040041841 LeMogne et al. Mar 2004 A1
20040070511 Kirn Apr 2004 A1
20040180669 Kall Sep 2004 A1
20040203854 Nowak Oct 2004 A1
20050032532 Kokkonen et al. Feb 2005 A1
20050138552 Venolia Jun 2005 A1
20050148340 Guyot Jul 2005 A1
20050190059 Wehrenberg Sep 2005 A1
20050191159 Benko Sep 2005 A1
20050222756 Davis et al. Oct 2005 A1
20050268237 Crane et al. Dec 2005 A1
20050288036 Brewer et al. Dec 2005 A1
20060017692 Wehrenberg et al. Jan 2006 A1
20060019649 Feinleib et al. Jan 2006 A1
20060026245 Cunningham et al. Feb 2006 A1
20060026536 Hotelling et al. Feb 2006 A1
20060030333 Ward Feb 2006 A1
20060033724 Chaudhri et al. Feb 2006 A1
20060044283 Eri et al. Mar 2006 A1
20060063538 Ishii Mar 2006 A1
20060092177 Blasko May 2006 A1
20060195787 Topiwala et al. Aug 2006 A1
20060197753 Hotelling et al. Sep 2006 A1
20060223518 Haney Oct 2006 A1
20070036300 Brown et al. Feb 2007 A1
20070085157 Fadell et al. Apr 2007 A1
20070117549 Arnos May 2007 A1
20070129888 Rosenberg Jun 2007 A1
20070150834 Muller et al. Jun 2007 A1
20070150836 Deggelmann et al. Jun 2007 A1
20070216659 Amineh Sep 2007 A1
20070236475 Wherry Oct 2007 A1
20080004043 Wilson et al. Jan 2008 A1
20080014989 Sandegard et al. Jan 2008 A1
20080045232 Cone et al. Feb 2008 A1
20080052945 Matas et al. Mar 2008 A1
20080055264 Anzures et al. Mar 2008 A1
20080057926 Forstall et al. Mar 2008 A1
20080070593 Altman Mar 2008 A1
20080079589 Blackadar Apr 2008 A1
20080114539 Lim May 2008 A1
20080139219 Boeiro et al. Jun 2008 A1
20080153517 Lee Jun 2008 A1
20080165136 Christie et al. Jul 2008 A1
20080176583 Brachet et al. Jul 2008 A1
20080186165 Bertagna et al. Aug 2008 A1
20080216022 Lorch et al. Sep 2008 A1
20080287151 Fjelstad et al. Nov 2008 A1
20080320391 Lemay et al. Dec 2008 A1
20090005011 Christie et al. Jan 2009 A1
20090005018 Forstall et al. Jan 2009 A1
20090006566 Veeramachaneni et al. Jan 2009 A1
20090011340 Lee et al. Jan 2009 A1
20090037536 Braam Feb 2009 A1
20090049502 Levien et al. Feb 2009 A1
20090051648 Shamaie et al. Feb 2009 A1
20090051649 Rondel Feb 2009 A1
20090055494 Fukumoto Feb 2009 A1
20090066564 Burroughs et al. Mar 2009 A1
20090085806 Piersol et al. Apr 2009 A1
20090098903 Donaldson et al. Apr 2009 A1
20090113340 Bender Apr 2009 A1
20090164219 Yeung et al. Jun 2009 A1
20090177981 Christie et al. Jul 2009 A1
20090181726 Vargas et al. Jul 2009 A1
20090187842 Collins et al. Jul 2009 A1
20090254840 Churchill et al. Oct 2009 A1
20090298444 Shigeta Dec 2009 A1
20090303066 Lee et al. Dec 2009 A1
20090312032 Bornstein et al. Dec 2009 A1
20090313582 Rupsingh et al. Dec 2009 A1
20090319616 Lewis et al. Dec 2009 A1
20090322560 Tengler et al. Dec 2009 A1
20090325603 Van Os et al. Dec 2009 A1
20100004005 Pereira et al. Jan 2010 A1
20100017126 Holeman Jan 2010 A1
20100058231 Duarte et al. Mar 2010 A1
20100069035 Johnson Mar 2010 A1
20100124906 Hautala May 2010 A1
20100125411 Goel May 2010 A1
20100125785 Moore et al. May 2010 A1
20100144368 Sullivan Jun 2010 A1
20100203901 Dinoff Aug 2010 A1
20100205242 Marchioro, II et al. Aug 2010 A1
20100211425 Govindarajan et al. Aug 2010 A1
20100240398 Hotes et al. Sep 2010 A1
20100248744 Bychkov et al. Sep 2010 A1
20100250727 King et al. Sep 2010 A1
20100274569 Reudink Oct 2010 A1
20100281409 Rainisto et al. Nov 2010 A1
20100287178 Lambert et al. Nov 2010 A1
20100299060 Snavely et al. Nov 2010 A1
20100325194 Williamson et al. Dec 2010 A1
20100330952 Yeoman et al. Dec 2010 A1
20100332518 Song et al. Dec 2010 A1
20110003587 Belz et al. Jan 2011 A1
20110051658 Jin et al. Mar 2011 A1
20110054780 Dhanani et al. Mar 2011 A1
20110054979 Cova et al. Mar 2011 A1
20110059769 Brunolli Mar 2011 A1
20110066743 Hurley Mar 2011 A1
20110080356 Kang et al. Apr 2011 A1
20110096011 Suzuki Apr 2011 A1
20110118975 Chen May 2011 A1
20110137813 Stewart Jun 2011 A1
20110137954 Diaz Jun 2011 A1
20110138006 Stewart Jun 2011 A1
20110148626 Acevedo Jun 2011 A1
20110151418 Delespaul et al. Jun 2011 A1
20110157046 Lee et al. Jun 2011 A1
20110164058 Lemay Jul 2011 A1
20110167383 Schuller et al. Jul 2011 A1
20110183650 McKee Jul 2011 A1
20110225547 Fong et al. Sep 2011 A1
20110239158 Barraclough et al. Sep 2011 A1
20110250909 Mathias Oct 2011 A1
20110254684 Antoci Oct 2011 A1
20110265041 Ganetakos et al. Oct 2011 A1
20110276901 Zambetti et al. Nov 2011 A1
20110279323 Hung et al. Nov 2011 A1
20110306366 Trussel et al. Dec 2011 A1
20110306393 Goldman et al. Dec 2011 A1
20110307124 Morgan et al. Dec 2011 A1
20110316769 Boettcher et al. Dec 2011 A1
20120008526 Borghei Jan 2012 A1
20120022872 Gruber et al. Jan 2012 A1
20120040681 Yan et al. Feb 2012 A1
20120054028 Tengler et al. Mar 2012 A1
20120077463 Robbins et al. Mar 2012 A1
20120088521 Nishida et al. Apr 2012 A1
20120095918 Jurss Apr 2012 A1
20120102437 Worley et al. Apr 2012 A1
20120105358 Momeyer May 2012 A1
20120108215 Kameli et al. May 2012 A1
20120117507 Tseng et al. May 2012 A1
20120131458 Hayes May 2012 A1
20120136997 Yan et al. May 2012 A1
20120144452 Dyor Jun 2012 A1
20120149405 Bhat Jun 2012 A1
20120150970 Peterson et al. Jun 2012 A1
20120158511 Lucero et al. Jun 2012 A1
20120166531 Sylvain Jun 2012 A1
20120172088 Kirch et al. Jul 2012 A1
20120208592 Davis et al. Aug 2012 A1
20120216127 Meyr Aug 2012 A1
20120218177 Pang et al. Aug 2012 A1
20120222083 Vaha-Sipila et al. Aug 2012 A1
20120239949 Kalyanasundaram et al. Sep 2012 A1
20120258726 Bansal et al. Oct 2012 A1
20120265823 Parmar et al. Oct 2012 A1
20120276919 Bi et al. Nov 2012 A1
20120290648 Sharkey Nov 2012 A1
20120302256 Pai et al. Nov 2012 A1
20120302258 Pai et al. Nov 2012 A1
20120304084 Kim et al. Nov 2012 A1
20120306770 Moore et al. Dec 2012 A1
20130002580 Sudou Jan 2013 A1
20130007665 Chaudhri et al. Jan 2013 A1
20130045759 Smith et al. Feb 2013 A1
20130063364 Moore Mar 2013 A1
20130065566 Gisby et al. Mar 2013 A1
20130091298 Ozzie et al. Apr 2013 A1
20130093833 Al-Asaaed et al. Apr 2013 A1
20130120106 Cauwels et al. May 2013 A1
20130143586 Williams et al. Jun 2013 A1
20130159941 Langlois et al. Jun 2013 A1
20130212470 Karunamuni et al. Aug 2013 A1
20130222236 Gardenfors et al. Aug 2013 A1
20130226453 Trussel et al. Aug 2013 A1
20130234924 Janefalkar et al. Sep 2013 A1
20130244633 Jacobs et al. Sep 2013 A1
20130254714 Shin et al. Sep 2013 A1
20130262298 Morley Oct 2013 A1
20130275924 Weinberg et al. Oct 2013 A1
20130303190 Khan et al. Nov 2013 A1
20130305331 Kim Nov 2013 A1
20130307809 Sudou Nov 2013 A1
20130310089 Gianoukos et al. Nov 2013 A1
20130321314 Oh et al. Dec 2013 A1
20130322634 Bennett et al. Dec 2013 A1
20130346882 Shiplacoff et al. Dec 2013 A1
20130347018 Limp et al. Dec 2013 A1
20140055552 Song et al. Feb 2014 A1
20140058873 Sorensen et al. Feb 2014 A1
20140062790 Letz et al. Mar 2014 A1
20140066105 Bridge et al. Mar 2014 A1
20140073256 Newham et al. Mar 2014 A1
20140085487 Park et al. Mar 2014 A1
20140099973 Cecchini et al. Apr 2014 A1
20140136990 Gonnen et al. May 2014 A1
20140181183 Houjou et al. Jun 2014 A1
20140189533 Krack et al. Jul 2014 A1
20140222933 Stovicek et al. Aug 2014 A1
20140237126 Bridge Aug 2014 A1
20140344711 Hallerstrom et al. Nov 2014 A1
20140365944 Moore et al. Dec 2014 A1
20150007049 Langlois et al. Jan 2015 A1
20150040029 Koum et al. Feb 2015 A1
20150089660 Song et al. Mar 2015 A1
20150100537 Grieves et al. Apr 2015 A1
20150102992 Klement et al. Apr 2015 A1
20150172393 Oplinger et al. Jun 2015 A1
20150180746 Day, II et al. Jun 2015 A1
20150185849 Levesque et al. Jul 2015 A1
20150188869 Gilad et al. Jul 2015 A1
20150346912 Yang et al. Dec 2015 A1
20150350130 Yang et al. Dec 2015 A1
20150350140 Garcia et al. Dec 2015 A1
20150350141 Yang et al. Dec 2015 A1
20160036735 Pycock et al. Feb 2016 A1
20160073223 Woolsey et al. Mar 2016 A1
20160234060 Raghu et al. Aug 2016 A1
20160299526 Inagaki et al. Oct 2016 A1
20170026796 Raghu et al. Jan 2017 A1
20190037353 Pai et al. Jan 2019 A1
Foreign Referenced Citations (57)
Number Date Country
1475924 Feb 2004 CN
1852335 Oct 2006 CN
101390371 Mar 2009 CN
102098656 Jun 2011 CN
102111505 Jun 2011 CN
201928419 Aug 2011 CN
102695302 Sep 2012 CN
103207674 Jul 2013 CN
103309606 Sep 2013 CN
103500079 Jan 2014 CN
103583031 Feb 2014 CN
103959751 Jul 2014 CN
104205785 Dec 2014 CN
1387590 Feb 2004 EP
2574026 Mar 2013 EP
2610701 Jul 2013 EP
2610701 Apr 2014 EP
2849042 Mar 2015 EP
H1145117 Feb 1999 JP
2002-366485 Dec 2002 JP
2003-516057 May 2003 JP
2003-207556 Jul 2003 JP
2006-072489 Mar 2006 JP
2006-079427 Mar 2006 JP
2006-113637 Apr 2006 JP
2006-129429 May 2006 JP
2009-081865 Apr 2009 JP
2010-503126 Jan 2010 JP
2010-503332 Jan 2010 JP
2010-288162 Dec 2010 JP
2010-539804 Dec 2010 JP
2011-060065 Mar 2011 JP
2011-107823 Jun 2011 JP
2012-508530 Apr 2012 JP
2012-198369 Oct 2012 JP
2013-048389 Mar 2013 JP
2014-057129 Mar 2014 JP
10-2004-0089329 Oct 2004 KR
10-2007-0096222 Oct 2007 KR
10-2008-0074813 Aug 2008 KR
200532429 Oct 2005 TW
WO 2001041468 Jun 2001 WO
WO 2002003093 Jan 2002 WO
WO 2008030972 Mar 2008 WO
WO 2009071112 Jun 2009 WO
WO 2010048995 May 2010 WO
WO 2010054373 May 2010 WO
WO 2011080622 Jul 2011 WO
WO 2012128824 Sep 2012 WO
WO 2012170446 Dec 2012 WO
WO 2013093558 Jun 2013 WO
WO 2013169842 Nov 2013 WO
WO 2013169865 Nov 2013 WO
WO 2013169875 Nov 2013 WO
WO 2014083001 Jun 2014 WO
WO 2014105276 Jul 2014 WO
WO 2015038684 Mar 2015 WO
Non-Patent Literature Citations (168)
Entry
Australian Patent Examination Report No. 1 in Australian Application No. 2012202929, dated Sep. 28, 2013, 3 pages.
Chinese Office Action in Chinese Application No. 201210288784.3, dated Jul. 3, 2014, 16 pages (with English Translation).
European Search Report in European Application No. 12168980.6, dated Sep. 21, 2012, 7 pages.
International Preliminary Report on Patentability in International Application No. PCT/US2012/038718, dated Nov. 26, 2013, 5 pages.
International Search Report in International Application No. PCT/US2012/038718, dated Aug. 17, 2012, 3 pages.
International Search Report and Written Opinion in International Application No. PCT/US13/41780, dated Dec. 1, 2014, 8 pages.
International Preliminary Report on Patentability in International Application No. PCT/US13/41780, dated Dec. 9, 2014, 7 pages.
Japanese Office Action in Japanese Application No. 2012-113725, dated May 27, 2013, 9 pages (with English Translation).
Korean Preliminary Rejection in Korean Application No. 10-2012-54888, dated Sep. 5, 2014, 9 pages (with English Translation).
Search and Examination Report in GB Application No. GB1209044.5, dated Aug. 24, 2012, 10 pages.
U.S. Final Office Action in U.S. Appl. No. 13/113,856, dated Nov. 7, 2012, 19 pages.
U.S. Final Office Action in U.S. Appl. No. 13/488,430, dated May 8, 2013, 19 pages.
U.S. Non-Final Office Action in U.S. Appl. No. 13/113,856, dated Jul. 18, 2012, 14 pages.
U.S. Non-Final Office Action in U.S. Appl. No. 13/488,430, dated Dec. 5, 2012, 13 pages.
Written Opinion in International Application No. PCT/US2012/038718, dated Aug. 17, 2012, 4 pages.
Australian Patent Examination Report No. 1 in Australian Application No. 2013203926, dated Oct. 7, 2014, 5 pages.
Australian Patent Examination Report No. 2 in Australian Application No. 2013203926, dated Jan. 13, 2016, 3 pages.
European Extended Search Report in Application No. 16155938.0, dated Jun. 7, 2016, 8 pages.
Chinese Office Action for Application No. 201210288784.3, dated Jan. 5, 2017, 13 pages (with English translation).
India Office Action for Application No. 2030/CHE/2012, dated Dec. 27, 2016, 9 pages.
Chinese Notification of Reexamination for Application No. 201210288784.3, dated Sep. 27, 2017, 17 pages (with English translation).
‘absoluteblogger.com’ [online]. “WeChat Review—Communication Application with Screenshots,” available on or before Jun. 14, 2013, [retrieved on Apr. 23, 2019], via Internet Archive: Wayback Machine URL<http://www.absoluteblogger.com/2012/10/wechat-review-communication-application.html>. 4 pages.
‘appps.jp’ [online]. "'WhatsApp' users over 400 million people! I tried to investigate the most used messaging application in the world," Jan. 24, 2014, [retrieved on Apr. 23, 2019], via Internet Archive: Wayback Machine URL<https://web.archive.org/web/20140410142411/http://www.appps.jp/2128786>. 13 pages, with Machine English Translation.
Australian Certificate of Examination in Australian Patent Application No. 2017100760, dated Feb. 9, 2018, 2 pages.
Australian Notice of Acceptance in Australian Patent Application No. 2015267259, dated Jan. 30, 2018, 3 pages.
Australian Notice of Acceptance in Australian Patent Application No. 2015267260, dated Jan. 30, 2018, 3 pages.
Australian Notice of Acceptance in Australian Patent Application No. 2015312369, dated Mar. 21, 2018, 3 pages.
Australian Office Action in Australian Patent Application No. 2015100711, dated Jul. 27, 2015, 7 pages.
Australian Office Action in Australian Patent Application No. 2015100711, dated Nov. 19, 2015, 6 pages.
Australian Office Action in Australian Patent Application No. 2015101188, dated Apr. 14, 2016, 3 pages.
Australian Office Action in Australian Patent Application No. 2015267259, dated Jun. 2, 2017, 2 pages.
Australian Office Action in Australian Patent Application No. 2015267260, dated Jun. 2, 2017, 2 pages.
Australian Office Action in Australian Patent Application No. 2015312369, dated Mar. 29, 2017, 3 pages.
Australian Office Action in Australian Patent Application No. 2016102028, dated Feb. 13, 2017, 4 pages.
Australian Office Action in Australian Patent Application No. 2016102029, dated Feb. 22, 2017, 4 pages.
Australian Office Action in Australian Patent Application No. 2017100197, dated Apr. 28, 2017, 4 pages.
Australian Office Action in Australian Patent Application No. 2017100198, dated Apr. 20, 2017, 4 pages.
Australian Office Action in Australian Patent Application No. 2017100760, dated Aug. 10, 2017, 4 pages.
Australian Office Action in Australian Patent Application No. 2017100760, dated Jan. 30, 2018, 3 pages.
Australian Office Action in Australian Patent Application No. 2018204430, dated Aug. 15, 2018, 5 pages.
Chinese Notice of Allowance received for Chinese Patent Application No. 201510290133.1, dated Jan. 9, 2019, 3 pages with English Translation.
Chinese Notice of Allowance received for Chinese Patent Application No. 201510291012.9, dated Jan. 9, 2019, 3 pages with English Translation.
Chinese Notice of Allowance received for Chinese Patent Application No. 201520365358.4, dated Nov. 20, 2015, 2 pages with English Translation.
Chinese Notice of Allowance received for Chinese Patent Application No. 201520365843.1, dated Feb. 15, 2016, 3 pages with English Translation.
Chinese Notice of Allowance received for Chinese Patent Application No. 201520669842.6, dated May 18, 2016, 2 pages with English Translation.
Chinese Office Action received for Chinese Patent Application No. 201510290133.1, dated Feb. 9, 2018, 10 pages with English Translation.
Chinese Office Action received for Chinese Patent Application No. 201510291012.9, dated Feb. 8, 2018, 9 pages with English Translation.
Chinese Office Action received for Chinese Patent Application No. 201510549056.7, dated Aug. 7, 2018, 7 pages with English Translation.
Chinese Office Action received for Chinese Patent Application No. 201510549056.7, dated Nov. 24, 2017, 22 pages with English Translation.
Chinese Office Action received for Chinese Patent Application No. 201520365358.4, dated Aug. 11, 2015, 4 pages with English Translation.
Chinese Office Action received for Chinese Patent Application No. 201520365843.1, dated Aug. 25, 2015, 4 pages with English Translation.
Chinese Office Action received for Chinese Patent Application No. 201520365843.1, dated Nov. 16, 2015, 3 pages with English Translation.
Chinese Office Action received for Chinese Patent Application No. 201520669842.6, dated Dec. 4, 2015, 7 pages with English Translation.
Chinese Office Action received for Chinese Patent Application No. 201620393549.6, dated Aug. 18, 2016, 2 pages with English Translation.
Chinese Office Action received for Chinese Patent Application No. 201620393748.7, dated Aug. 18, 2016, 2 pages with English Translation.
Danish Decision to Grant received for Danish Patent Application No. PA201770126, dated Mar. 27, 2018, 2 pages.
Danish Intention to Grant received for Danish Patent Application No. PA201570550, dated Dec. 22, 2016, 2 pages.
Danish Intention to Grant received for Danish Patent Application No. PA201770126, dated Jan. 19, 2018, 2 pages.
Danish Notice of Allowance received for Danish Patent Application No. PA201570550, dated Mar. 20, 2017, 2 pages.
Danish Office Action received for Danish Patent Application No. PA201570550, dated Dec. 7, 2015, 5 pages.
Danish Office Action received for Danish Patent Application No. PA201570550, dated Jan. 19, 2016, 2 pages.
Danish Office Action received for Danish Patent Application No. PA201570550, dated Oct. 19, 2016, 3 pages.
Danish Office Action received for Danish Patent Application No. PA201770089, dated Apr. 25, 2017, 10 pages.
Danish Office Action received for Danish Patent Application No. PA201770125, dated Jan. 26, 2018, 5 pages.
Danish Office Action received for Danish Patent Application No. PA201770125, dated Jul. 20, 2018, 2 pages.
Danish Office Action received for Danish Patent Application No. PA201770126, dated Oct. 18, 2017, 3 pages.
Danish Search Report received for Danish Patent Application No. PA201770125, dated May 5, 2017, 10 pages.
Danish Search Report received for Danish Patent Application No. PA201770126, dated Apr. 26, 2017, 8 pages.
‘digitalstreetsa.com’ [online]. "Why WeChat might kill Whatsapp's future . . ." Jul. 3, 2013, [retrieved on Apr. 23, 2019], retrieved from: URL<http://digitalstreetsa.com/why-wechat-might-kill-whatsapps-future>. 9 pages.
‘download.cnet.com’ [online]. "WeChat APK for Android," Jan. 7, 2013, [retrieved on Apr. 23, 2019], retrieved from: URL<http://download.cnet.com/WeChat/3000-2150_4-75739423.html>. 5 pages.
‘engadget.com’ [online]. "WhatsApp Introduces Major New Audio Features," Aug. 7, 2013, [retrieved on Apr. 23, 2019], retrieved from: URL<http://www.engadget.com/2013/08/07/whatsapp-introduces-major-new-audio-features>. 12 pages.
European Extended Search Report in European Patent Application No. 17167629.9, dated Jun. 2, 2017, 7 pages.
European Extended Search Report in European Patent Application No. 18170262.2, dated Jul. 25, 2018, 8 pages.
European Office Action in European Patent Application No. 15728307.8, dated Feb. 8, 2018, 7 pages.
European Office Action in European Patent Application No. 15729286.3, dated Feb. 7, 2018, 7 pages.
European Office Action in European Patent Application No. 15759981.2, dated Apr. 19, 2018, 6 pages.
European Office Action in European Patent Application No. 15759981.2, dated Aug. 6, 2018, 10 pages.
European Office Action in European Patent Application No. 15759981.2, dated May 16, 2018, 6 pages.
European Office Action in European Patent Application No. 17167629.9, dated Jan. 25, 2019, 7 pages.
‘heresthethingblog.com’ [online]. "iOS 7 tip: Alerts, Banners, and Badges: What's the Difference?" Jan. 22, 2014, [retrieved on Apr. 23, 2019], via Internet Archive: Wayback Machine URL<https://web.archive.org/web/20140128072440/http://heresthethingblog.com/2014/01/22/ios-7-tip-whats-difference-alert/>. 5 pages.
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2015/032309, dated Dec. 15, 2016, 7 pages.
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2015/032305, dated Dec. 15, 2016, 7 pages.
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2015/043487, dated Feb. 16, 2017, 12 pages.
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2015/044083, dated Mar. 16, 2017, 24 pages.
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2015/046787, dated Mar. 16, 2017, 18 pages.
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2016/046828, dated Mar. 1, 2018, 19 pages.
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2015/032305, dated Sep. 10, 2015, 9 pages.
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2015/032309, dated Sep. 2, 2015, 9 pages.
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2015/043487, dated Jan. 29, 2016, 17 pages.
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2015/044083, dated Feb. 4, 2016, 31 pages.
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2015/046787, dated Apr. 1, 2016, 26 pages.
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2016/046828, dated Dec. 15, 2016, 21 pages.
Invitation to Pay Additional Fees and Partial Search Report received for PCT Patent Application No. PCT/US2015/043487, dated Nov. 9, 2015, 4 pages.
Invitation to Pay Additional Fees received for PCT Patent Application No. PCT/US2015/044083, dated Nov. 4, 2015, 11 pages.
Invitation to Pay Additional Fees received for PCT Patent Application No. PCT/US2015/046787, dated Dec. 15, 2015, 8 pages.
Invitation to Pay Additional Fees received for PCT Patent Application No. PCT/US2016/046828, dated Sep. 23, 2016, 2 pages.
iPhone, “User Guide for iOS 7.1 Software”, Mar. 2014, 162 pages.
Japanese Notice of Allowance received for Japanese Patent Application No. 2017-510297, dated May 7, 2018, 5 pages with English Translation.
Japanese Notice of Allowance received for Japanese Patent Application No. 2017-514992, dated Feb. 15, 2019, 5 pages with English Translation.
Japanese Notice of Allowance received for Japanese Patent Application No. 2017-514993, dated Jan. 12, 2018, 6 pages with English Translation.
Japanese Notice of Allowance received for Japanese Patent Application No. 2018-072632, dated Dec. 7, 2018, 6 pages with English Translation.
Japanese Office Action received for Japanese Patent Application No. 2017-510297, dated Dec. 4, 2017, 6 pages with English Translation.
Japanese Office Action received for Japanese Patent Application No. 2017-510297, dated Jul. 10, 2017, 9 pages with English Translation.
Japanese Office Action received for Japanese Patent Application No. 2017-514992, dated Apr. 6, 2018, 9 pages with English Translation.
Japanese Office Action received for Japanese Patent Application No. 2018-018497, dated Dec. 10, 2018, 7 pages with English Translation.
Japanese Office Action received for Japanese Patent Application No. 2018-072632, dated Jul. 9, 2018, 5 pages with English Translation.
‘jnd.org’ [online]. "Affordances and Design," published on or before Feb. 25, 2010, [retrieved on Apr. 23, 2019], via Internet Archive: Wayback Machine URL<https://web.archive.org/web/20150318044240/jnd.org/dn.mss/affordances_and.html>. 6 pages.
Korean Notice of Allowance received for Korean Patent Application No. 10-2017-7005628, dated Jun. 18, 2018, 4 pages with English Translation.
Korean Office Action received for Korean Patent Application No. 10-2017-7005628, dated Jan. 30, 2018, 6 pages with English translation.
Korean Office Action received for Korean Patent Application No. 10-2017-7005628, dated May 10, 2017, 11 pages with English Translation.
Korean Office Action received for Korean Patent Application No. 10-2018-7027006, dated Jan. 14, 2019, 4 pages with English Translation.
‘makeuseof.com’ [online]. "MS Outlook Tip: How to Automatically Organize Incoming Emails," Sep. 27, 2009, [retrieved on Apr. 23, 2019], retrieved from: URL<http://www.makeuseof.com/tag/ms-outlook-productivity-tip-how-to-move-emails-to-individual-folders-automatically>. 5 pages.
‘manualslib.com’ [online]. “Samsung Gear 2 User Manual”, 2014, [retrieved on Apr. 23, 2019], retrieved from: URL<http://www.manualslib.com/download/754923/Samsung-Gear-2.html>. 97 pages.
Netherlands Search Report and Opinion received for Netherlands Patent Application No. 2015354, completed on Jun. 22, 2017, 23 pages with English Translation.
Netherlands Search Report and Opinion received for Netherlands Patent Application No. 2019878, dated Apr. 6, 2018, 23 pages with English Translation.
Samsung, “SM-G900F User Manual”, English (EU). Rev.1.0, Mar. 2014, 249 pages.
Samsung, “SM-R380”, User Manual, 2014, 74 pages.
‘seechina365.com’ [online]. "How to use China's popular social networking service wechat (2): voice message, press together, shake function, etc.," Apr. 5, 2014, [retrieved on Apr. 23, 2019], via Internet Archive: Wayback Machine URL<http://seechina365.com/2014/04/05/wechat02>. 29 pages with Machine English Translation.
‘slideshare.net’ [online]. "Samsung Gear 2 User Manual," Apr. 2014, [retrieved on Apr. 23, 2019], retrieved from: URL<http://www.slideshare.net/badaindonesia/samsung-gear-2-user-manual>. 58 pages.
Taiwanese Office Action received for Taiwanese Patent Application No. 104107332, dated Oct. 29, 2018, 12 pages with English Translation.
Taiwanese Office Action received for Taiwanese Patent Application No. 104128519, dated Mar. 29, 2017, 16 pages with English Translation.
Taiwanese Office Action received for Taiwanese Patent Application No. 104128704, dated Jul. 31, 2017, 7 pages with English Translation.
Taiwanese Office Action received for Taiwanese Patent Application No. 104128704, dated Nov. 2, 2016, 12 pages with English Translation.
U.S. Notice of Allowance in U.S. Appl. No. 14/503,376, dated Jul. 29, 2015, 12 pages.
U.S. Notice of Allowance in U.S. Appl. No. 14/503,376, dated Sep. 2, 2015, 4 pages.
U.S. Notice of Allowance in U.S. Appl. No. 14/503,376, dated Sep. 24, 2015, 5 pages.
U.S. Notice of Allowance in U.S. Appl. No. 14/503,386, dated Jul. 30, 2015, 11 pages.
U.S. Notice of Allowance in U.S. Appl. No. 14/503,386, dated Sep. 24, 2015, 5 pages.
U.S. Notice of Allowance in U.S. Appl. No. 14/817,572, dated Nov. 30, 2017, 26 pages.
U.S. Notice of Allowance in U.S. Appl. No. 14/838,235, dated Dec. 29, 2016, 3 pages.
U.S. Notice of Allowance in U.S. Appl. No. 14/838,235, dated Oct. 4, 2016, 7 pages.
U.S. Notice of Allowance in U.S. Appl. No. 14/841,608, dated Jan. 25, 2018, 2 pages.
U.S. Notice of Allowance in U.S. Appl. No. 14/841,608, dated Nov. 14, 2017, 5 pages.
U.S. Notice of Allowance in U.S. Appl. No. 14/841,614, dated Jan. 8, 2019, 3 pages.
U.S. Notice of Allowance in U.S. Appl. No. 14/841,614, dated Oct. 24, 2018, 10 pages.
U.S. Notice of Allowance in U.S. Appl. No. 14/841,623, dated Feb. 23, 2018, 8 pages.
U.S. Notice of Allowance in U.S. Appl. No. 15/142,661, dated Feb. 15, 2018, 9 pages.
U.S. Notice of Allowance in U.S. Appl. No. 15/142,661, dated Oct. 4, 2017, 21 pages.
U.S. Notice of Allowance in U.S. Appl. No. 15/425,273, dated Mar. 7, 2019, 8 pages.
U.S. Notice of Allowance in U.S. Appl. No. 15/431,435, dated Jan. 23, 2018, 8 pages.
U.S. Notice of Allowance in U.S. Appl. No. 15/876,673, dated May 4, 2018, 26 pages.
U.S. Notice of Allowance in U.S. Appl. No. 15/985,570, dated Mar. 13, 2019, 21 pages.
U.S. Office Action in U.S. Appl. No. 14/503,376, dated Dec. 22, 2014, 19 pages.
U.S. Office Action in U.S. Appl. No. 14/503,386, dated Jan. 7, 2015, 18 pages.
U.S. Office Action in U.S. Appl. No. 14/817,572, dated Mar. 23, 2017, 13 pages.
U.S. Office Action in U.S. Appl. No. 14/817,572, dated Sep. 12, 2016, 8 pages.
U.S. Office Action in U.S. Appl. No. 14/838,235, dated Jun. 15, 2016, 17 pages.
U.S. Office Action in U.S. Appl. No. 14/841,608, dated Apr. 12, 2017, 8 pages.
U.S. Office Action in U.S. Appl. No. 14/841,614, dated Jul. 27, 2017, 12 pages.
U.S. Office Action in U.S. Appl. No. 14/841,614, dated May 10, 2018, 12 pages.
U.S. Office Action in U.S. Appl. No. 14/841,623, dated Feb. 2, 2017, 16 pages.
U.S. Office Action in U.S. Appl. No. 14/841,623, dated Sep. 5, 2017, 15 pages.
U.S. Office Action in U.S. Appl. No. 14/928,865, dated Dec. 5, 2018, 14 pages.
U.S. Office Action in U.S. Appl. No. 14/928,865, dated Mar. 27, 2018, 14 pages.
U.S. Office Action in U.S. Appl. No. 15/142,661, dated Jan. 25, 2017, 28 pages.
U.S. Office Action in U.S. Appl. No. 15/366,763, dated Mar. 8, 2019, 13 pages.
U.S. Office Action in U.S. Appl. No. 15/425,273, dated Oct. 3, 2018, 9 pages.
U.S. Office Action in U.S. Appl. No. 15/431,435, dated Jun. 8, 2017, 10 pages.
U.S. Office Action in U.S. Appl. No. 15/985,570, dated Aug. 16, 2018, 23 pages.
U.S. Office Action received for U.S. Appl. No. 14/838,235, dated Jan. 5, 2016, 18 pages.
‘wechat.wikia.com’ [online]. “WeChat Wiki”, May 14, 2013, [retrieved on Apr. 23, 2019], via Internet Archive: Wayback Machine URL<http://web.archive.org/web/20130514131044/http://wechat.wikia.com/wiki/WeChat_Wiki>. 6 pages.
‘wikihow.com’ [online]. “How to Move Mail to Different Folders in Gmail,” available on or before Jul. 31, 2014, [retrieved on Apr. 23, 2019], via Internet Archive: Wayback Machine URL<https://web.archive.org/web/20140731230338/http://www.wikihow.com/Move-Mail-to-Different-Folders-in-Gmail>. 4 pages.
‘youtube.com’ [online]. “How to Dismiss Banner Notifications or Toast Notifications on iOS7,” Dec. 17, 2013, [retrieved on Apr. 23, 2019], retrieved from: URL<https://www.youtube.com/watch?v=vSjHnBFIW_M>. 2 pages.
‘youtube.com’ [online]. "How to Send a Picture Message/MMS—Samsung Galaxy Note 3," Nov. 3, 2013, [retrieved on Apr. 23, 2019], retrieved from: URL<https://www.youtube.com/watch?v=-3d0z8-KeDw>. 2 pages.
‘youtube.com’ [online]. “iOS 7 Notification Center Complete Walkthrough,” Jun. 10, 2013, [retrieved on Apr. 23, 2019], retrieved from: URL<https://www.youtube.com/watch?v=gATXt-o42LA>. 3 pages.
‘youtube.com’ [online]. “iOS Notification Banner Pull Down to Notification Center in iOS 7 Beta 5”, Aug. 6, 2013, [retrieved on Apr. 23, 2019], retrieved from: URL<https://www.youtube.com/watch?v=nP0s6ETPxDg>. 2 pages.
‘youtube.com’ [online]. "Notification & Control Center Problem Issue Solution," Dec. 6, 2013, [retrieved on Apr. 23, 2019], retrieved from: URL<https://www.youtube.com/watch?v=K0zCueY1aTA>. 3 pages.
‘youtube.com’ [online]. "WeChat TVC—Hold to Talk," May 11, 2013, [retrieved on Apr. 23, 2019], retrieved from: URL<https://www.youtube.com/watch?v=E_UxteOWVSo>. 2 pages.
Related Publications (1)
Number Date Country
20180091951 A1 Mar 2018 US
Continuations (3)
Number Date Country
Parent 15042628 Feb 2016 US
Child 15639107 US
Parent 14666148 Mar 2015 US
Child 15042628 US
Parent 13752604 Jan 2013 US
Child 14666148 US