Live location sharing

Information

  • Patent Number
    11,943,191
  • Date Filed
    Monday, August 5, 2019
  • Date Issued
    Tuesday, March 26, 2024
Abstract
Techniques for live location sharing are described. A first mobile device and a second mobile device can communicate with one another using an IM program. The first mobile device can receive a user input to share a location of the first mobile device in the IM program. Sharing the location can include causing the second mobile device to display a location of the first mobile device in an IM program user interface on the second mobile device. Duration of sharing the location can be user-configurable. The second mobile device may or may not share a location of the second device for display in the IM program executing on the first mobile device.
Description
TECHNICAL FIELD

This disclosure relates generally to location-based services.


BACKGROUND

A mobile device may have an instant messaging (IM) program that allows a user of the mobile device to chat with another user over the Internet. The IM program can offer real-time (“live”) transmission of text from the mobile device to a device of the other user, and receive and display real-time text received from the other device. The IM program can have a peer-to-peer or server-client architecture for transmitting the text in real-time.


SUMMARY

Techniques for live location sharing are described. A first mobile device and a second mobile device can communicate with one another using an IM program. The first mobile device can receive a user input to share a location of the first mobile device in the IM program. Sharing the location can include causing the second mobile device to display a location of the first mobile device in an IM program user interface on the second mobile device. Duration of sharing the location can be user-configurable. The second mobile device may or may not share a location of the second device for display in the IM program executing on the first mobile device.


The features described in this specification can be implemented to achieve one or more advantages. Compared to a conventional IM program, the features described in this specification can allow chatting users to share more information. A user may see, in a user interface of the IM program, where the user's chatting partner is located. Likewise, the chatting partner can see where the user is located. Such information can enhance user experience, and can make tasks such as scheduling a gathering at a location easier. A user's privacy is protected according to the user's own preference as to with whom to share a location, and for how long.


The details of one or more implementations of the subject matter are set forth in the accompanying drawings and the description below. Other features, aspects, and advantages of the subject matter will become apparent from the description, the drawings, and the claims.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a diagram illustrating exemplary live location sharing.



FIGS. 2A-2D illustrate exemplary user interfaces for live location sharing.



FIG. 3 is a block diagram illustrating exemplary interaction between mobile devices and their respective servers for live location sharing.



FIG. 4 is a block diagram illustrating components of an exemplary server and an exemplary mobile device for live location sharing.



FIG. 5 is a flowchart of an exemplary process of live location sharing.



FIG. 6 is a flowchart of an exemplary process of live location sharing.



FIG. 7 is a flowchart of an exemplary process of live location sharing.



FIG. 8 is a block diagram illustrating an exemplary device architecture of a mobile device implementing the features and operations described in reference to FIGS. 1-7.



FIG. 9 is a block diagram of an exemplary network operating environment for the mobile devices of FIGS. 1-7.



FIG. 10 is a block diagram of an exemplary system architecture for implementing the features and operations of FIGS. 1-7.





Like reference symbols in the various drawings indicate like elements.


DETAILED DESCRIPTION
Exemplary Live Location Sharing


FIG. 1 is a diagram illustrating exemplary live location sharing. Mobile device 102 can communicate with mobile device 104 over communications network 110 using an IM program. The IM program can be hosted on a server to which mobile device 102 and mobile device 104 connect. Alternatively, each of mobile device 102 and mobile device 104 can host a separate copy of an IM program. A first user of mobile device 102 and a second user of mobile device 104 may chat (112A, 112B) with each other online using the IM program.


During the chat, mobile device 104 can display location sharing interface 114 in response to an input from the second user. Location sharing interface 114 allows the second user to enable location sharing. Location sharing can include allowing mobile device 102 to see a real-time location of mobile device 104 in the IM program. Allowing mobile device 102 to see the location of mobile device 104 can include allowing mobile device 102 to access the location through a server. The location can be stored on mobile device 104, or submitted by mobile device 104 to be stored on the server temporarily for the duration of location sharing.
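
As an illustration of the temporary server-side storage described above, the following sketch shows a minimal in-memory store that discards a shared location once the sharing duration elapses. It is a sketch under stated assumptions: the type names, the keying by device identifier, and the purge-on-read behavior are illustrative, not part of the disclosure.

```swift
import Foundation

// Hypothetical in-memory store: a shared location is kept only for the
// duration of location sharing and is discarded once that duration elapses.
struct SharedLocation {
    let latitude: Double
    let longitude: Double
    let timestamp: Date
}

final class EphemeralLocationStore {
    private var entries: [String: (location: SharedLocation, expiry: Date)] = [:]

    // Store a device's location until its sharing session expires.
    func put(deviceID: String, location: SharedLocation, sharedFor duration: TimeInterval) {
        entries[deviceID] = (location, Date().addingTimeInterval(duration))
    }

    // Return the location only while sharing is still active.
    func get(deviceID: String) -> SharedLocation? {
        guard let entry = entries[deviceID] else { return nil }
        guard entry.expiry > Date() else {
            entries[deviceID] = nil // sharing window elapsed; purge the entry
            return nil
        }
        return entry.location
    }
}
```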


Mobile device 104 receives the input to enable location sharing. In response, mobile device 104 notifies mobile device 102 of the location sharing. Mobile device 102 acquires the location of mobile device 104. Mobile device 102 can display virtual map 116 in the IM program. Mobile device 102 can represent the real-time location of mobile device 104 using marker 118 in virtual map 116. Marker 118 can move in virtual map 116, corresponding to physical movement of mobile device 104.


Exemplary User Interface


FIGS. 2A-2D illustrate exemplary user interfaces for live location sharing. Each user interface can be a user interface of an IM program executing on either mobile device 102 or mobile device 104 of FIG. 1. For convenience, each user interface will be described in reference to mobile device 102.



FIG. 2A illustrates exemplary user interface 202 for initiating live location sharing. The live location sharing can be sharing a location of mobile device 102 with a device that is in communication with mobile device 102 through an IM program. The sharing can be limited to the IM program, where the shared location is visible in an IM program on the other device.


User interface 202 can include settings user interface item 204. Settings user interface item 204 can have a label “details” or any other label indicating that a user can access detailed settings of the IM program. Upon receiving a user input in settings user interface item 204, mobile device 102 can display a list of settings. One of the settings can be location sharing user interface item 206. Location sharing user interface item 206 can include a virtual button that, when touched, can cause mobile device 102 to display location sharing user interface 208.



FIG. 2B illustrates exemplary location sharing user interface 208. Location sharing user interface 208 can include various user interface items for specifying when to share a location of mobile device 102 with another mobile device in an IM program. Location sharing user interface 208 can include virtual button 210 that, when selected, causes mobile device 102 to share the location of mobile device 102 in an IM program for a first time period, e.g., one hour. Location sharing user interface 208 can include virtual button 212 that, when selected, causes mobile device 102 to share the location of mobile device 102 in an IM program for a second time period, e.g., one day. Location sharing user interface 208 can include virtual button 214 that, when selected, causes mobile device 102 to share the location of mobile device 102 in an IM program for a third time period, e.g., indefinitely. Location sharing user interface 208 can include virtual button 216 that, when selected, causes mobile device 102 to share the location of mobile device 102 in an IM program with another device when mobile device 102 is in proximity with the other device and in communication with the other device. The proximity can be user defined, e.g., within a same country, within a same city, or within X miles or meters of one another.
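
The four options of location sharing user interface 208 map naturally onto a small set of cases. The Swift sketch below is a minimal model, assuming hypothetical type and case names; only the one-hour, one-day, indefinite, and proximity-based choices come from the description.

```swift
import Foundation

// Hypothetical model of the duration choices in location sharing
// user interface 208 (virtual buttons 210, 212, 214, and 216).
enum SharingOption {
    case oneHour                          // virtual button 210
    case oneDay                           // virtual button 212
    case indefinitely                     // virtual button 214
    case whileInProximity(meters: Double) // virtual button 216; radius is user defined

    // Expiry for the time-based options; nil means no time limit applies.
    func expiry(from start: Date = Date()) -> Date? {
        switch self {
        case .oneHour:
            return start.addingTimeInterval(60 * 60)
        case .oneDay:
            return start.addingTimeInterval(24 * 60 * 60)
        case .indefinitely, .whileInProximity:
            return nil
        }
    }
}
```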



FIG. 2C illustrates exemplary map user interface 218 of an IM program executing on mobile device 102. Mobile device 102 can display map user interface 218 upon receiving a user confirmation for sharing the location. Map user interface 218 can include marker 220 indicating a current location of mobile device 102, as it will be visible in an IM program on another device that receives the shared location. Accordingly, a user of mobile device 102 can be aware of what a user of the other device sees.



FIG. 2D illustrates exemplary map user interface 222 of an IM program executing on mobile device 102. Mobile device 102 is in communication with another mobile device using the IM program. Mobile device 102 has shared the location of mobile device 102 with the other device. The other device, in return, has shared the location of that device with mobile device 102. Mobile device 102 can display map user interface 222 that includes a virtual map, marker 224 indicating a real-time location of mobile device 102, and marker 226 indicating the real-time location of the other device.


Exemplary System Components


FIG. 3 is a block diagram illustrating exemplary interaction between mobile devices and their respective servers for live location sharing. Mobile device 102 and mobile device 104 can communicate with one another using communication channel 302. Communication channel 302 can be a communication channel for IM programs and can be based on a first telephone number PN1 of mobile device 102 and a second telephone number PN2 of mobile device 104. Mobile device 102 has logged into a user account on first server 304. The user account is associated with an account identifier ID1, e.g., an account name. Mobile device 104 has logged into a user account on second server 306. The user account is associated with an account identifier ID2.


Mobile device 102 received a user input requesting mobile device 102 to share a location of mobile device 102 with mobile device 104 in the IM program. In response, mobile device 102 can submit request 308 to server 304 requesting server 304 to provide location sharing information for passing to mobile device 104 through communication channel 302. In response, server 304 can provide mapping packet 310A to mobile device 102. Mapping packet 310A can include PN1 and ID1, and information on how long the location will be shared.


Mobile device 102 can submit mapping packet 310B, which can be the same as mapping packet 310A, to mobile device 104 through communication channel 302. Mobile device 104 provides the mapping packet 310B to server 306 as request 310C. Server 306 may already store the second telephone number PN2 of mobile device 104 and account identifier ID2.
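
Conceptually, mapping packet 310A/310B is a small serializable value carrying the first telephone number PN1, the account identifier ID1, and the sharing duration. The sketch below shows one hypothetical encoding; the field names and the use of JSON are assumptions, since the disclosure does not specify a wire format.

```swift
import Foundation

// Hypothetical shape of mapping packet 310A/310B.
struct MappingPacket: Codable {
    let phoneNumber: String     // PN1, the first telephone number
    let accountID: String       // ID1, the account identifier
    let sharingSeconds: Double? // nil models indefinite sharing
}

// Mobile device 102 would serialize the packet received from server 304
// and pass it to mobile device 104 over communication channel 302.
let packet = MappingPacket(phoneNumber: "+15550100001",
                           accountID: "user@example.com",
                           sharingSeconds: 3600)
let encoded = try! JSONEncoder().encode(packet)
let decoded = try! JSONDecoder().decode(MappingPacket.self, from: encoded)
print(decoded.accountID) // "user@example.com"
```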


Server 306 can submit the telephone number PN1 and the account identifier ID1 to an identity service (IDS) 312. The IDS 312 can include one or more computers configured to determine, based on PN1 and ID1, whether mobile device 102 is still logged in to server 304. The IDS 312 can send token 314 to server 306. Server 306 can submit token 314 to server 304. Server 304 can retrieve the location of mobile device 102 and provide the location to server 306. Server 306 can, in turn, provide the location to mobile device 104 for displaying in the IM program.



FIG. 4 is a block diagram illustrating components of an exemplary server and an exemplary mobile device for live location sharing. The server can be either server 304 or server 306 (of FIG. 3). The mobile device can be either mobile device 102 or mobile device 104 (of FIG. 3). For convenience, FIG. 4 will be described in reference to server 304 and mobile device 102.


Mobile device 102 can include instant messaging subsystem 402. Instant messaging subsystem 402 is a component of mobile device 102 configured to execute an IM program and to share a location of mobile device 102 in the IM program with another device. Instant messaging subsystem 402 can include location interface module 404 configured to share the location in the IM program. Instant messaging subsystem 402 can include map module 406 configured to display a map in the IM program, including displaying in the map the location of the mobile device 102 and, if a location of another device is shared, the location of the other device. Instant messaging subsystem 402 can include device communication module 408 configured to establish a telephone number based communication channel with another device and communicate with the other device using an IM program over that channel.


Mobile device 102 can include server communication subsystem 410. Server communication subsystem 410 is a component of mobile device 102 configured to send a request to server 304 for a mapping packet upon receiving instructions from location interface module 404 to share location. Server communication subsystem 410 can receive the mapping packet from server 304.


If another device shares a location with mobile device 102, the other device can notify mobile device 102 of the sharing through device communication module 408. Location interface module 404 can then instruct server communication subsystem 410 to request the shared location from server 304. Server communication subsystem 410 can provide the shared location to map module 406 for displaying in a map of the IM program.


Mobile device 102 can include location subsystem 412. Location subsystem 412 is a component of mobile device 102 configured to determine a location of mobile device 102, for example, by using signals from a cellular communication system, one or more wireless access points, or a global satellite navigation system.


Location subsystem 412 can provide the location to server communication subsystem 410 for submitting to the server for sharing.
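
The decomposition of FIG. 4 can be restated as interfaces. The sketch below expresses the modules as Swift protocols; the module boundaries follow the description, while the method signatures are assumptions introduced only to make the decomposition concrete.

```swift
import Foundation

struct Coordinate { let latitude: Double; let longitude: Double }

// Location subsystem 412: determines the device's own location.
protocol LocationSubsystem {
    func currentLocation() -> Coordinate
}

// Server communication subsystem 410: talks to the device's server.
protocol ServerCommunicationSubsystem {
    func requestMappingPacket(completion: @escaping (Data) -> Void)
    func fetchSharedLocation(accountID: String,
                             completion: @escaping (Coordinate?) -> Void)
}

// Map module 406: draws the map inside the IM program.
protocol MapModule {
    func display(ownLocation: Coordinate, sharedLocation: Coordinate?)
}

// Device communication module 408: telephone-number-based channel to peers.
protocol DeviceCommunicationModule {
    func send(payload: Data, toPhoneNumber: String)
}
```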


Exemplary Procedures


FIG. 5 is a flowchart of an exemplary process 500 of live location sharing. A first mobile device, e.g., mobile device 102, can submit (502) a notification to a second mobile device, e.g., mobile device 104, through an instant message program. The notification can indicate that the first mobile device shall provide a first location of the first mobile device for sharing with the second mobile device. At the time of submitting the notification, the first mobile device and the second mobile device can be in communication through the instant message program. The communication can be established based on a phone number of the first mobile device and a phone number of the second mobile device.


The first mobile device can receive (504), through the instant message program and from the second mobile device, a response to the notification. The response can be triggered by the notification. The response can be approved by a user of the second mobile device. The response can indicate that the second mobile device shall provide a second location of the second mobile device for sharing with the first mobile device.


The first mobile device can obtain (506) from a server, the second location. The first mobile device can then provide (508) a marker representing the second location for display on a virtual map in the instant message program on the first mobile device. Likewise, the second mobile device can provide a marker representing the first location of the first mobile device for display on a virtual map in an instant message program on the second mobile device.


The first mobile device can obtain, from the server, one or more updates of the second location. The updates can correspond to a movement of the second mobile device. The first mobile device can provide a representation of the updated second location for display in the instant message program on the first mobile device. The representation of the updated second location can indicate a path of the movement.
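
One way to realize the path of movement described above is to accumulate the successive updates into an ordered track. The sketch below illustrates the idea with hypothetical types; a real implementation would render the stored points as a path on the virtual map.

```swift
// Hypothetical accumulator for the location updates of process 500.
struct Point { let latitude: Double; let longitude: Double }

final class SharedLocationTrack {
    private(set) var path: [Point] = []

    // Each update obtained from the server (step 506 and the follow-up
    // updates) is appended, so the stored sequence traces the movement
    // of the second mobile device.
    func apply(update: Point) {
        path.append(update)
    }

    var latest: Point? { path.last }
}
```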



FIG. 6 is a flowchart of an exemplary process 600 of live location sharing. An instant message program executing on a first mobile device, e.g., mobile device 102, can receive (602) a notification from a second mobile device, e.g., mobile device 104. The notification can indicate that the second mobile device shares a location of the second mobile device with the first mobile device. The notification can include a mapping packet including a phone number of the second mobile device and an account identifier of the second mobile device.


The first mobile device can submit (604), to a server, the mapping packet including the phone number and the account identifier for retrieving the location of the second mobile device.


Upon successful authentication by the server indicating that the second mobile device is logged in and that a location of the second mobile device is available, the first mobile device can receive (606) the location from the server during a time period as specified by the second device for sharing the location. The time period can be an hour, a day, or an indefinite time period as specified by the second mobile device according to a user input in the instant message program.


The first mobile device then provides (608) a marker representing the location for display on a virtual map in the instant message program on the first mobile device. During the time period, the first mobile device can provide the marker representing the location of the second mobile device for display in one or more other programs for displaying locations. The programs can include, for example, a “find my friend” application program.
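
The time-limited receipt in steps 606 and 608 amounts to a gate that surfaces the location only while the sharing window specified by the second device is open. The sketch below is an illustrative assumption of how that check could look; the encoding of the time period is hypothetical.

```swift
import Foundation

// Hypothetical representation of a received share and its time period.
struct IncomingShare {
    let latitude: Double
    let longitude: Double
    let sharedUntil: Date? // nil models an indefinite time period
}

// Returns coordinates for display (step 608) only while sharing is active;
// the same value could also be handed to other location-displaying programs,
// e.g., a "find my friend" application program.
func displayableLocation(of share: IncomingShare,
                         now: Date = Date()) -> (latitude: Double, longitude: Double)? {
    if let deadline = share.sharedUntil, deadline <= now {
        return nil // the specified time period has elapsed
    }
    return (share.latitude, share.longitude)
}
```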



FIG. 7 is a flowchart of an exemplary process 700 of live location sharing. A first server, e.g., server 304 of FIG. 3, can receive (702) a mapping packet from an instant message program of a first mobile device, e.g., mobile device 102. The mapping packet can include a phone number of a second mobile device, e.g., mobile device 104. The mapping packet can include an account identifier of the second mobile device. The mapping packet can indicate that the second mobile device has shared a location of the second mobile device with the first mobile device in the instant message program. The first server can be connected to the first mobile device by a communications network.


A second server can be connected to the second mobile device by the communications network. The first mobile device and the second mobile device can be connected to one another by the same communications network or a different communications network.


The first server can submit (704) the phone number and the account identifier to an identity service for determining whether the second mobile device is logged into the account on a second server. The identity service can provide a token indicating that the second mobile device is logged into the account.


Upon receiving the token from the identity service, the first server can submit (706) a request to the second server for retrieving a current location of the second mobile device. The request can include the account identifier of the second mobile device. The current location of the second mobile device can be received by the second server from the second mobile device in response to an input on the second mobile device indicating that the second mobile device shares location of the second mobile device with the first mobile device.


Upon receiving the current location from the second server, the first server can submit (708) the current location to the first mobile device for display in the instant message program.
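
Process 700 reduces to a token exchange among the first server, the identity service, and the second server. The sketch below models the three steps with injected functions; every name and signature is an assumption made for illustration, not the patented protocol.

```swift
import Foundation

struct ReceivedMappingPacket { let phoneNumber: String; let accountID: String }
struct IdentityToken { let value: String }

// Hypothetical handler running on the first server.
func handleMappingPacket(
    _ packet: ReceivedMappingPacket,
    identityService: (String, String) -> IdentityToken?,                        // step 704
    secondServerLookup: (IdentityToken, String) -> (lat: Double, lon: Double)?, // step 706
    deliverToFirstDevice: ((lat: Double, lon: Double)) -> Void                  // step 708
) {
    // Step 704: ask the identity service whether the second mobile device
    // is logged into its account on the second server.
    guard let token = identityService(packet.phoneNumber, packet.accountID) else {
        return // not logged in; no location to retrieve
    }
    // Step 706: present the token to the second server to retrieve the
    // current location of the second mobile device.
    guard let location = secondServerLookup(token, packet.accountID) else {
        return // no current location available
    }
    // Step 708: forward the current location to the first mobile device
    // for display in the instant message program.
    deliverToFirstDevice(location)
}
```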


Exemplary Mobile Device Architecture


FIG. 8 is a block diagram of an exemplary architecture 800 for the mobile devices of FIGS. 1-7. A mobile device (e.g., mobile device 102) can include memory interface 802, one or more data processors, image processors and/or processors 804, and peripherals interface 806. Memory interface 802, one or more processors 804 and/or peripherals interface 806 can be separate components or can be integrated in one or more integrated circuits. Processors 804 can include application processors, baseband processors, and wireless processors. The various components in mobile device 102, for example, can be coupled by one or more communication buses or signal lines.


Sensors, devices, and subsystems can be coupled to peripherals interface 806 to facilitate multiple functionalities. For example, motion sensor 810, light sensor 812, and proximity sensor 814 can be coupled to peripherals interface 806 to facilitate orientation, lighting, and proximity functions of the mobile device. Location processor 815 (e.g., GPS receiver) can be connected to peripherals interface 806 to provide geopositioning. Electronic magnetometer 816 (e.g., an integrated circuit chip) can also be connected to peripherals interface 806 to provide data that can be used to determine the direction of magnetic North. Thus, electronic magnetometer 816 can be used as an electronic compass. Motion sensor 810 can include one or more accelerometers configured to determine change of speed and direction of movement of the mobile device. Barometer 818 can include one or more devices connected to peripherals interface 806 and configured to measure pressure of atmosphere around the mobile device.


Camera subsystem 820 and an optical sensor 822, e.g., a charge-coupled device (CCD) or a complementary metal-oxide semiconductor (CMOS) optical sensor, can be utilized to facilitate camera functions, such as recording photographs and video clips.


Communication functions can be facilitated through one or more wireless communication subsystems 824, which can include radio frequency receivers and transmitters and/or optical (e.g., infrared) receivers and transmitters. The specific design and implementation of the communication subsystem 824 can depend on the communication network(s) over which a mobile device is intended to operate. For example, a mobile device can include communication subsystems 824 designed to operate over a GSM network, a GPRS network, an EDGE network, a Wi-Fi™ or WiMax™ network, and a Bluetooth™ network. In particular, the wireless communication subsystems 824 can include hosting protocols such that the mobile device can be configured as a base station for other wireless devices.


Audio subsystem 826 can be coupled to a speaker 828 and a microphone 830 to facilitate voice-enabled functions, such as voice recognition, voice replication, digital recording, and telephony functions. Audio subsystem 826 can be configured to receive voice commands from the user.


I/O subsystem 840 can include touch surface controller 842 and/or other input controller(s) 844. Touch surface controller 842 can be coupled to a touch surface 846 or pad. Touch surface 846 and touch surface controller 842 can, for example, detect contact and movement or break thereof using any of a plurality of touch sensitivity technologies, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with touch surface 846. Touch surface 846 can include, for example, a touch screen.


Other input controller(s) 844 can be coupled to other input/control devices 848, such as one or more buttons, rocker switches, thumb-wheel, infrared port, USB port, and/or a pointer device such as a stylus. The one or more buttons (not shown) can include an up/down button for volume control of speaker 828 and/or microphone 830.


In one implementation, a pressing of the button for a first duration may disengage a lock of the touch surface 846; and a pressing of the button for a second duration that is longer than the first duration may turn power to mobile device 102 on or off. The user may be able to customize a functionality of one or more of the buttons. The touch surface 846 can, for example, also be used to implement virtual or soft buttons and/or a keyboard.


In some implementations, mobile device 102 can present recorded audio and/or video files, such as MP3, AAC, and MPEG files. In some implementations, mobile device 102 can include the functionality of an MP3 player. Mobile device 102 may, therefore, include a pin connector that is compatible with the MP3 player. Other input/output and control devices can also be used.


Memory interface 802 can be coupled to memory 850. Memory 850 can include high-speed random access memory and/or non-volatile memory, such as one or more magnetic disk storage devices, one or more optical storage devices, and/or flash memory (e.g., NAND, NOR). Memory 850 can store operating system 852, such as Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks. Operating system 852 may include instructions for handling basic system services and for performing hardware dependent tasks. In some implementations, operating system 852 can include a kernel (e.g., UNIX kernel).


Memory 850 may also store communication instructions 854 to facilitate communicating with one or more additional devices, one or more computers and/or one or more servers. Memory 850 may include graphical user interface instructions 856 to facilitate graphic user interface processing; sensor processing instructions 858 to facilitate sensor-related processing and functions; phone instructions 860 to facilitate phone-related processes and functions; electronic messaging instructions 862 to facilitate electronic-messaging related processes and functions; web browsing instructions 864 to facilitate web browsing-related processes and functions; media processing instructions 866 to facilitate media processing-related processes and functions; GPS/Navigation instructions 868 to facilitate GPS and navigation-related processes and functions; camera instructions 870 to facilitate camera-related processes and functions; magnetometer data 872 and calibration instructions 874 to facilitate magnetometer calibration. The memory 850 may also store other software instructions (not shown), such as security instructions, web video instructions to facilitate web video-related processes and functions, and/or web shopping instructions to facilitate web shopping-related processes and functions. In some implementations, the media processing instructions 866 are divided into audio processing instructions and video processing instructions to facilitate audio processing-related processes and functions and video processing-related processes and functions, respectively. An activation record and International Mobile Equipment Identity (IMEI) or similar hardware identifier can also be stored in memory 850. Memory 850 can store live location sharing instructions 876 that, when executed, can cause processor 804 to perform operations of live location sharing, e.g., procedures as described in reference to FIG. 5 and FIG. 6.


Each of the above identified instructions and applications can correspond to a set of instructions for performing one or more functions described above. These instructions need not be implemented as separate software programs, procedures, or modules. Memory 850 can include additional instructions or fewer instructions. Furthermore, various functions of the mobile device may be implemented in hardware and/or in software, including in one or more signal processing and/or application specific integrated circuits.


Exemplary Operating Environment


FIG. 9 is a block diagram of an exemplary network operating environment 900 for the mobile devices of FIGS. 1-7. Mobile devices 902a and 902b can, for example, communicate over one or more wired and/or wireless networks 910. For example, a wireless network 912, e.g., a cellular network, can communicate with a wide area network (WAN) 914, such as the Internet, by use of a gateway 916. Likewise, an access device 918, such as an 802.11g wireless access point, can provide communication access to the wide area network 914. Mobile devices 902a and 902b can be mobile device 102 and mobile device 104, respectively, configured to communicate with one another using an instant messaging program and to share a respective location in the instant messaging program.


In some implementations, both voice and data communications can be established over wireless network 912 and the access device 918. For example, mobile device 902a can place and receive phone calls (e.g., using voice over Internet Protocol (VoIP) protocols), send and receive e-mail messages (e.g., using Post Office Protocol 3 (POP3)), and retrieve electronic documents and/or streams, such as web pages, photographs, and videos, over wireless network 912, gateway 916, and wide area network 914 (e.g., using Transmission Control Protocol/Internet Protocol (TCP/IP) or User Datagram Protocol (UDP)). Likewise, in some implementations, the mobile device 902b can place and receive phone calls, send and receive e-mail messages, and retrieve electronic documents over the access device 918 and the wide area network 914. In some implementations, mobile device 902a or 902b can be physically connected to the access device 918 using one or more cables and the access device 918 can be a personal computer. In this configuration, mobile device 902a or 902b can be referred to as a “tethered” device.


Mobile devices 902a and 902b can also establish communications by other means. For example, wireless device 902a can communicate with other wireless devices, e.g., other mobile devices, cell phones, etc., over the wireless network 912. Likewise, mobile devices 902a and 902b can establish peer-to-peer communications 920, e.g., a personal area network, by use of one or more communication subsystems, such as the Bluetooth™ communication devices. Other communication protocols and topologies can also be implemented.


The mobile device 902a or 902b can, for example, communicate with one or more services 930 and 940 over the one or more wired and/or wireless networks.


For example, instant messaging services 930 can allow mobile devices 902a and 902b to communicate with one another using an instant messaging program. Location service 940 can provide location and map data to mobile devices 902a and 902b for determining the locations of mobile devices 902a and 902b.


Mobile device 902a or 902b can also access other data and content over the one or more wired and/or wireless networks. For example, content publishers, such as news sites, Really Simple Syndication (RSS) feeds, web sites, blogs, social networking sites, developer networks, etc., can be accessed by mobile device 902a or 902b. Such access can be provided by invocation of a web browsing function or application (e.g., a browser) in response to a user touching, for example, a Web object.


A number of implementations of the invention have been described. Nevertheless, it will be understood that various modifications can be made without departing from the spirit and scope of the invention.


Exemplary System Architecture


FIG. 10 is a block diagram of an exemplary system architecture for implementing the features and operations of FIGS. 1-7. Other architectures are possible, including architectures with more or fewer components. In some implementations, architecture 1000 includes one or more processors 1002 (e.g., dual-core Intel® Xeon® Processors), one or more output devices 1004 (e.g., LCD), one or more network interfaces 1006, one or more input devices 1008 (e.g., mouse, keyboard, touch-sensitive display) and one or more computer-readable media 1012 (e.g., RAM, ROM, SDRAM, hard disk, optical disk, flash memory, etc.). These components can exchange communications and data over one or more communication channels 1010 (e.g., buses), which can utilize various hardware and software for facilitating the transfer of data and control signals between components.


The term “computer-readable medium” refers to a medium that participates in providing instructions to processor 1002 for execution, including without limitation, non-volatile media (e.g., optical or magnetic disks), volatile media (e.g., memory) and transmission media. Transmission media includes, without limitation, coaxial cables, copper wire and fiber optics.


Computer-readable media 1012 can further include operating system 1014 (e.g., a Linux® operating system), network communication module 1016, location sharing manager 1020, location manager 1030, and identity service manager 1040. Operating system 1014 can be multi-user, multiprocessing, multitasking, multithreading, real time, etc. Operating system 1014 performs basic tasks, including but not limited to: recognizing input from and providing output to devices 1006, 1008; keeping track of and managing files and directories on computer-readable media 1012 (e.g., memory or a storage device); controlling peripheral devices; and managing traffic on the one or more communication channels 1010. Network communication module 1016 includes various components for establishing and maintaining network connections (e.g., software for implementing communication protocols, such as TCP/IP, HTTP, etc.).


Location sharing manager 1020 can include computer instructions that, when executed, cause processor 1002 to perform operations of location sharing, e.g., procedure 700 as described in reference to FIG. 7. Location manager 1030 can include computer instructions that, when executed, cause processor 1002 to provide the location of a mobile device and virtual maps to a mobile device. Identity service manager 1040 can include computer instructions that, when executed, cause processor 1002 to perform functions of identity service 312 as described in reference to FIG. 3.


Architecture 1000 can be implemented in a parallel processing or peer-to-peer infrastructure or on a single device with one or more processors. Software can include multiple software components or can be a single body of code.


The described features can be implemented advantageously in one or more computer programs that are executable on a programmable system including at least one programmable processor coupled to receive data and instructions from, and to transmit data and instructions to, a data storage system, at least one input device, and at least one output device. A computer program is a set of instructions that can be used, directly or indirectly, in a computer to perform a certain activity or bring about a certain result. A computer program can be written in any form of programming language (e.g., Objective-C, Java), including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, a browser-based web application, or other unit suitable for use in a computing environment.


Suitable processors for the execution of a program of instructions include, by way of example, both general and special purpose microprocessors, and the sole processor or one of multiple processors or cores, of any kind of computer. Generally, a processor will receive instructions and data from a read-only memory or a random access memory or both. The essential elements of a computer are a processor for executing instructions and one or more memories for storing instructions and data. Generally, a computer will also include, or be operatively coupled to communicate with, one or more mass storage devices for storing data files; such devices include magnetic disks, such as internal hard disks and removable disks; magneto-optical disks; and optical disks. Storage devices suitable for tangibly embodying computer program instructions and data include all forms of non-volatile memory, including by way of example semiconductor memory devices, such as EPROM, EEPROM, and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, ASICs (application-specific integrated circuits).


To provide for interaction with a user, the features can be implemented on a computer having a display device such as a CRT (cathode ray tube) or LCD (liquid crystal display) monitor or a retina display device for displaying information to the user. The computer can have a touch surface input device (e.g., a touch screen) or a keyboard and a pointing device such as a mouse or a trackball by which the user can provide input to the computer. The computer can have a voice input device for receiving voice commands from the user.


The features can be implemented in a computer system that includes a back-end component, such as a data server, or that includes a middleware component, such as an application server or an Internet server, or that includes a front-end component, such as a client computer having a graphical user interface or an Internet browser, or any combination of them. The components of the system can be connected by any form or medium of digital data communication such as a communication network. Examples of communication networks include, e.g., a LAN, a WAN, and the computers and networks forming the Internet.


The computing system can include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. In some embodiments, a server transmits data (e.g., an HTML page) to a client device (e.g., for purposes of displaying data to and receiving user input from a user interacting with the client device). Data generated at the client device (e.g., a result of the user interaction) can be received from the client device at the server.


A system of one or more computers can be configured to perform particular actions by virtue of having software, firmware, hardware, or a combination of them installed on the system that in operation causes or cause the system to perform the actions. One or more computer programs can be configured to perform particular actions by virtue of including instructions that, when executed by data processing apparatus, cause the apparatus to perform the actions.


While this specification contains many specific implementation details, these should not be construed as limitations on the scope of any inventions or of what may be claimed, but rather as descriptions of features specific to particular embodiments of particular inventions. Certain features that are described in this specification in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.


Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.


Thus, particular embodiments of the subject matter have been described. Other embodiments are within the scope of the following claims. In some cases, the actions recited in the claims can be performed in a different order and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In certain implementations, multitasking and parallel processing may be advantageous.

Claims
  • 1. A non-transitory machine-readable medium storing instructions which, when executed by one or more processors, cause the one or more processors to perform operations comprising: on a first mobile device, receiving a request to create a message within a messenger application displayed on a graphical interface of the first mobile device, the message to be sent to a second mobile device; presenting a message interface within the messenger application; presenting, via the message interface, a location sharing user interface item; receiving an input via the message interface to select the location sharing user interface item; generating first location data representing a current location of the first mobile device; presenting a representation of the first location data on the graphical interface of the first mobile device, the representation of the first location data and the message to be sent being presented concurrently on the message interface; presenting a prompt to confirm the sharing of the location of the first mobile device with the second mobile device; receiving an input via a user interface of the first mobile device to confirm the sharing of the location of the first mobile device with the second mobile device; transmitting, via the messenger application, an account identifier for the second mobile device for submission to an identity service; upon receipt of a token from the identity service, via the messenger application, that authenticates the second mobile device associated with the account identifier is logged in to a user account, receiving a second location data for the second mobile device; presenting a representation of a second location data on the graphical interface of the first mobile device; and transmitting a message from the first mobile device to the second mobile device via the messenger application, the message including an indication of the location of the first mobile device.
  • 2. The non-transitory machine-readable medium as in claim 1, the operations further comprising transmitting, by the first mobile device, the message to a server, the server to relay the message to the second mobile device via an identifier associated with the second mobile device.
  • 3. The non-transitory machine-readable medium as in claim 2, wherein the identifier associated with the second mobile device is the account identifier of the user account associated with the second mobile device or a phone number associated with the second mobile device.
  • 4. The non-transitory machine-readable medium as in claim 2, wherein the message interface includes a settings user interface item which, when selected, causes presentation of the location sharing user interface item.
  • 5. The non-transitory machine-readable medium as in claim 1, the operations further comprising receiving, via the messenger application, an indication of a location of the second mobile device.
  • 6. The non-transitory machine-readable medium as in claim 5, additionally comprising, after receiving, via the messenger application, the indication of the location of the second mobile device, presenting, via the messenger application, a map including a representation of the location of the second mobile device.
  • 7. A system on a first mobile device, the system comprising: a memory to store instructions; one or more processors to execute the instructions, wherein the instructions, when executed, cause the one or more processors to: receive a request to create a message within a messenger application displayed on a graphical interface of the first mobile device, the message to be sent to a second mobile device; present a message interface within the messenger application; present, via the message interface, a location sharing user interface item; receive an input via the message interface to select the location sharing user interface item; generate first location data representing a current location of the first mobile device; present a representation of the first location data on the graphical interface of the first mobile device, the representation of the first location data and the message to be sent being presented concurrently on the message interface; present a prompt to confirm the sharing of the location of the first mobile device with the second mobile device; receive an input via a user interface of the first mobile device to confirm the sharing of the location of the first mobile device with the second mobile device; transmit, via the messenger application, an account identifier for the second mobile device for submission to an identity service; upon receipt of a token from the identity service, via the messenger application, that authenticates the second mobile device associated with the account identifier is logged in to an account, receiving a second location for the second mobile device; present a representation of a second location data on the graphical interface of the first mobile device; and transmit a message from the first mobile device to the second mobile device via the messenger application, the message including an indication of the location of the first mobile device.
  • 8. The system as in claim 7, the one or more processors further to transmit the message to a server, the server to relay the message to the second mobile device via an identifier associated with the second mobile device.
  • 9. The system as in claim 8, wherein the identifier associated with the second mobile device includes the account identifier of the user account associated with the second mobile device or a phone number associated with the second mobile device.
  • 10. The system as in claim 8, wherein the message interface includes a settings user interface item which, when selected, causes presentation of the location sharing user interface item.
  • 11. The system as in claim 7, wherein the one or more processors are further to: receive, via the messenger application, an indication of a location of the second mobile device; and after receipt of the indication of the location of the second mobile device, present, via the messenger application, a map including a representation of the location of the second mobile device.
  • 12. A method comprising: on a first mobile device, receiving a request to create a message within a messenger application displayed on a graphical interface of the first mobile device, the message to be sent to a second mobile device; presenting a message interface within the messenger application; presenting, via the message interface, a location sharing user interface item; receiving an input via the message interface to select the location sharing user interface item; generating first location data representing a current location of the first mobile device; presenting a representation of the first location data on the graphical interface of the first mobile device, the representation of the first location data and the message to be sent being presented concurrently on the message interface; presenting a prompt to confirm the sharing of the location of the first mobile device with the second mobile device; receiving an input via a user interface of the first mobile device to confirm the sharing of the location of the first mobile device with the second mobile device; transmitting, via the messenger application, an account identifier for the second mobile device for submission to an identity service; upon receipt of a token, from the identity service, via the messenger application, that authenticates the second mobile device associated with the account identifier is logged in to a user account, receiving a second location data for the second mobile device; presenting a representation of a second location data on the graphical interface of the first mobile device; and transmitting a message from the first mobile device to the second mobile device via the messenger application, the message including an indication of the location of the first mobile device.
  • 13. The method as in claim 12, further comprising transmitting, by the first mobile device, the message to a server, the server to relay the message to the second mobile device via an identifier associated with the second mobile device.
  • 14. The method as in claim 13, wherein the identifier associated with the second mobile device includes the account identifier of the user account associated with the second mobile device or a phone number associated with the second mobile device.
  • 15. The method as in claim 14, wherein the message interface includes a settings user interface item which, when selected, causes presentation of the location sharing user interface item.
  • 16. The method as in claim 12, further comprising receiving, via the messenger application, an indication of a location of the second mobile device.
  • 17. The method as in claim 16, further comprising, after receiving the indication of the location of the second mobile device, presenting, via the messenger application, a map including a representation of the location of the second mobile device.
  • 18. A method comprising: receiving, by a first mobile device, an account identifier and a notification from a second mobile device indicating that the second mobile device confirms a request to share a location of the second mobile device with the first mobile device; submitting, via a messaging application by the first mobile device, the account identifier to an identity service for retrieving the location of the second mobile device; upon receipt, via the messaging application, of a token indicating successful authentication by the identity service that the second mobile device is logged in, receiving the location by the first mobile device during a time period as specified by the second device for sharing the location; and
  • 19. The method of claim 18, wherein the messaging application comprises a first server that receives the token from the identity service and the first server is operable to send the token to a second server to retrieve the location.
  • 20. The method of claim 18, wherein the account identifier is a user account name associated with the second mobile device or a phone number associated with the second mobile device.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 14/503,270, entitled “Live Location Sharing,” filed Sep. 30, 2014, which claims priority to U.S. Provisional Application No. 62/006,110, entitled “Live Location Sharing,” filed May 31, 2014, the entire contents of which are incorporated herein by reference. This application is also related to U.S. patent application Ser. Nos. 14/503,355, 14/503,376, 14/503,386, all of which are entitled “Message User Interfaces for Capture and Transmittal of Media and Location Content,” all filed on May 31, 2014.

US Referenced Citations (301)
Number Name Date Kind
5475653 Yamada et al. Dec 1995 A
5801700 Ferguson Sep 1998 A
6002402 Schacher Dec 1999 A
6040781 Murray Mar 2000 A
6191807 Hamada et al. Feb 2001 B1
6323846 Westerman et al. Nov 2001 B1
6362842 Tahara et al. Mar 2002 B1
6515585 Yamamoto Feb 2003 B2
6570557 Westerman et al. May 2003 B1
6677932 Westerman et al. Jan 2004 B1
6809724 Shiraishi et al. Oct 2004 B1
7015817 Copley et al. Mar 2006 B2
7039420 Koskinen et al. May 2006 B2
7076257 Kall Jul 2006 B2
7224987 Bhela et al. May 2007 B1
7365736 Marvit et al. Apr 2008 B2
7593749 Vallstrom et al. Sep 2009 B2
7614008 Ording et al. Nov 2009 B2
7633076 Huppi et al. Dec 2009 B2
7653883 Hotelling et al. Jan 2010 B2
7657849 Chaudhri et al. Feb 2010 B2
7663607 Hotelling et al. Feb 2010 B2
7789225 Whiteis Sep 2010 B2
7801542 Stewart Sep 2010 B1
7834861 Lee Nov 2010 B2
7844914 Andre et al. Nov 2010 B2
7908219 Abanami et al. Mar 2011 B2
7953393 Chin et al. May 2011 B2
7957762 Herz et al. Jun 2011 B2
8006002 Kalayjian et al. Aug 2011 B2
8121586 Araradian et al. Feb 2012 B2
8150930 Satterfield et al. Apr 2012 B2
8239784 Hotelling et al. Aug 2012 B2
8244468 Scailisi et al. Aug 2012 B2
8255830 Ording et al. Aug 2012 B2
8279180 Hotelling et al. Oct 2012 B2
8285258 Schultz et al. Oct 2012 B2
8369867 Van Os et al. Feb 2013 B2
8374575 Mullen Feb 2013 B2
8381135 Hotelling et al. Feb 2013 B2
8412154 Leemet et al. Apr 2013 B1
8441367 Lee et al. May 2013 B1
8479122 Hotelling et al. Jul 2013 B2
8572493 Qureshi Oct 2013 B2
8786458 Wiltzius et al. Jul 2014 B1
8811951 Faaborg et al. Aug 2014 B1
8855665 Buford et al. Oct 2014 B2
8922485 Lloyd Dec 2014 B1
8971924 Pai et al. Mar 2015 B2
8989773 Sandel et al. Mar 2015 B2
9100944 Newham et al. Aug 2015 B2
9185062 Yang et al. Nov 2015 B1
9191988 Newham et al. Nov 2015 B2
9204283 Mullen Dec 2015 B2
9207835 Yang et al. Dec 2015 B1
9247377 Pai et al. Jan 2016 B2
9294882 Sandel et al. Mar 2016 B2
9369833 Tharshanan Jun 2016 B2
9400489 Kim et al. Jul 2016 B2
9402153 Pai et al. Jul 2016 B2
9477208 Lee et al. Oct 2016 B2
9635540 Mullen Apr 2017 B2
9699617 Sandel et al. Jul 2017 B2
10382378 Garcia et al. Aug 2019 B2
20020015024 Westerman et al. Feb 2002 A1
20020037715 Mauney et al. Mar 2002 A1
20020102989 Calvert et al. Aug 2002 A1
20020115478 Fujisawa et al. Aug 2002 A1
20020126135 Ball et al. Sep 2002 A1
20030081506 Karhu May 2003 A1
20030128163 Mizugaki et al. Jul 2003 A1
20040041841 LeMogne et al. Mar 2004 A1
20040070511 Kim Apr 2004 A1
20040180669 Kall Sep 2004 A1
20040203854 Nowak Oct 2004 A1
20050032532 Kokkonen et al. Feb 2005 A1
20050138552 Venolia Jun 2005 A1
20050148340 Guyot Jul 2005 A1
20050190059 Wehrenberg Sep 2005 A1
20050191159 Benko Sep 2005 A1
20050222756 Davis et al. Oct 2005 A1
20050268237 Crane et al. Dec 2005 A1
20050288036 Brewer et al. Dec 2005 A1
20060017692 Wehrenberg et al. Jan 2006 A1
20060019649 Feinleib et al. Jan 2006 A1
20060026245 Cunningham et al. Feb 2006 A1
20060026536 Hotelling et al. Feb 2006 A1
20060030333 Ward et al. Feb 2006 A1
20060033724 Chaudhri et al. Feb 2006 A1
20060044283 Eri et al. Mar 2006 A1
20060063538 Ishii Mar 2006 A1
20060092177 Blasko May 2006 A1
20060195787 Topiwala et al. Aug 2006 A1
20060197753 Hotelling et al. Sep 2006 A1
20060223518 Haney Oct 2006 A1
20070036300 Brown et al. Feb 2007 A1
20070085157 Fadell et al. Apr 2007 A1
20070117549 Amos May 2007 A1
20070129888 Rosenberg Jun 2007 A1
20070150834 Muller et al. Jun 2007 A1
20070150836 Deggelmann et al. Jun 2007 A1
20070216659 Amineh Sep 2007 A1
20070236475 Wherrv Oct 2007 A1
20080004043 Wilson et al. Jan 2008 A1
20080014989 Sandegard et al. Jan 2008 A1
20080045232 Cone Feb 2008 A1
20080052945 Matas et al. Mar 2008 A1
20080055264 Anzures et al. Mar 2008 A1
20080057926 Forstall et al. Mar 2008 A1
20080070593 Altman Mar 2008 A1
20080079589 Blackadar Apr 2008 A1
20080114539 Lim May 2008 A1
20080139219 Boeiro et al. Jun 2008 A1
20080153517 Lee Jun 2008 A1
20080165136 Christie et al. Jul 2008 A1
20080176583 Brachet et al. Jul 2008 A1
20080186165 Bertagna et al. Aug 2008 A1
20080216022 Lorch et al. Sep 2008 A1
20080287151 Fjelstad et al. Nov 2008 A1
20080320391 Lemay et al. Dec 2008 A1
20090005011 Christie et al. Jan 2009 A1
20090005018 Forstall et al. Jan 2009 A1
20090006566 Veeramachaneni et al. Jan 2009 A1
20090011340 Lee et al. Jan 2009 A1
20090037536 Braam Feb 2009 A1
20090049502 Levien et al. Feb 2009 A1
20090051648 Shamaie et al. Feb 2009 A1
20090051649 Rondel Feb 2009 A1
20090055494 Fukumoto Feb 2009 A1
20090066564 Burroughs et al. Mar 2009 A1
20090085806 Piersol et al. Apr 2009 A1
20090098903 Donaldson et al. Apr 2009 A1
20090113340 Bender Apr 2009 A1
20090164219 Yeung et al. Jun 2009 A1
20090177981 Christie et al. Jul 2009 A1
20090181726 Varqas et al. Jul 2009 A1
20090187842 Collins et al. Jul 2009 A1
20090254840 Churchill et al. Oct 2009 A1
20090298444 Shigeta Dec 2009 A1
20090303066 Lee et al. Dec 2009 A1
20090312032 Bornstein et al. Dec 2009 A1
20090313582 Rupsingh et al. Dec 2009 A1
20090319616 Lewis et al. Dec 2009 A1
20090322560 Tengler et al. Dec 2009 A1
20090325603 Van Os Dec 2009 A1
20100004005 Pereira et al. Jan 2010 A1
20100017126 Holeman Jan 2010 A1
20100058231 Duarte et al. Mar 2010 A1
20100069035 Jonhson Mar 2010 A1
20100124906 Hautala May 2010 A1
20100125411 Goel May 2010 A1
20100125785 Moore et al. May 2010 A1
20100144368 Sullivan et al. Jun 2010 A1
20100203901 Dinoff et al. Aug 2010 A1
20100205242 Marchioro, II et al. Aug 2010 A1
20100211425 Govindarajan Aug 2010 A1
20100240398 Hotes et al. Sep 2010 A1
20100248744 Bychkov et al. Sep 2010 A1
20100250727 King et al. Sep 2010 A1
20100274569 Reudink Oct 2010 A1
20100281409 Rainisto et al. Nov 2010 A1
20100287178 Lambert et al. Nov 2010 A1
20100299060 Snavely et al. Nov 2010 A1
20100325194 Williamson Dec 2010 A1
20100330952 Yeoman et al. Dec 2010 A1
20100332518 Song et al. Dec 2010 A1
20110003587 Belz et al. Jan 2011 A1
20110051658 Jin et al. Mar 2011 A1
20110054780 Dhanani et al. Mar 2011 A1
20110054979 Cova et al. Mar 2011 A1
20110059769 Brunolli Mar 2011 A1
20110066743 Hurley et al. Mar 2011 A1
20110080356 Kang et al. Apr 2011 A1
20110096011 Suzuki Apr 2011 A1
20110118975 Chen May 2011 A1
20110137813 Stewart Jun 2011 A1
20110137954 Diaz Jun 2011 A1
20110138006 Stewart Jun 2011 A1
20110148626 Acevedo Jun 2011 A1
20110151418 Delespaul et al. Jun 2011 A1
20110157046 Lee et al. Jun 2011 A1
20110164058 Lemay Jul 2011 A1
20110167383 Schuller et al. Jul 2011 A1
20110183650 McKee Jul 2011 A1
20110225547 Fong et al. Sep 2011 A1
20110239158 Barraclough et al. Sep 2011 A1
20110250909 Mathias et al. Oct 2011 A1
20110254684 Antoci Oct 2011 A1
20110265041 Ganetakos et al. Oct 2011 A1
20110276901 Zambetti et al. Nov 2011 A1
20110279323 Hung et al. Nov 2011 A1
20110306366 Trussel et al. Dec 2011 A1
20110306393 Goldman et al. Dec 2011 A1
20110307124 Morgan et al. Dec 2011 A1
20110316769 Boettcher et al. Dec 2011 A1
20120008526 Borghei Jan 2012 A1
20120022872 Gruber et al. Jan 2012 A1
20120040681 Yan et al. Feb 2012 A1
20120054028 Tengler et al. Mar 2012 A1
20120077463 Robbins et al. Mar 2012 A1
20120088521 Nishida et al. Apr 2012 A1
20120095918 Jurss Apr 2012 A1
20120102437 Worley et al. Apr 2012 A1
20120105358 Momeyer May 2012 A1
20120108215 Kameli et al. May 2012 A1
20120117507 Tseng et al. May 2012 A1
20120131458 Hayes May 2012 A1
20120136997 Yan et al. May 2012 A1
20120144452 Dyor et al. Jun 2012 A1
20120149405 Bhat Jun 2012 A1
20120150970 Peterson et al. Jun 2012 A1
20120158511 Lucero et al. Jun 2012 A1
20120166531 Sylvain Jun 2012 A1
20120172088 Kirch et al. Jul 2012 A1
20120208592 Davis et al. Aug 2012 A1
20120216127 Meyr Aug 2012 A1
20120218177 Pang et al. Aug 2012 A1
20120222083 Vaha-Sipila et al. Aug 2012 A1
20120239949 Kalyanasundaram et al. Sep 2012 A1
20120258726 Bansal et al. Oct 2012 A1
20120265823 Parmar et al. Oct 2012 A1
20120276919 Bi Nov 2012 A1
20120290648 Sharkey Nov 2012 A1
20120302256 Pai et al. Nov 2012 A1
20120302258 Pai et al. Nov 2012 A1
20120304084 Kim et al. Nov 2012 A1
20120306770 Moore et al. Dec 2012 A1
20130002580 Sudou Jan 2013 A1
20130007665 Chaudhri et al. Jan 2013 A1
20130045759 Smith Feb 2013 A1
20130063364 Moore Mar 2013 A1
20130065566 Gisby et al. Mar 2013 A1
20130091298 Ozzie et al. Apr 2013 A1
20130093833 Al-Asaaed et al. Apr 2013 A1
20130120106 Cauwels et al. May 2013 A1
20130143586 Williams et al. Jun 2013 A1
20130159941 Langlois et al. Jun 2013 A1
20130212470 Karunamuni et al. Aug 2013 A1
20130222236 Gardenfors et al. Aug 2013 A1
20130226453 Trussel Aug 2013 A1
20130234924 Janefalkar et al. Sep 2013 A1
20130244633 Jacobs et al. Sep 2013 A1
20130254714 Shin et al. Sep 2013 A1
20130262298 Morley Oct 2013 A1
20130275924 Weinberg et al. Oct 2013 A1
20130303190 Khan et al. Nov 2013 A1
20130305331 Kim Nov 2013 A1
20130307809 Sudou Nov 2013 A1
20130310089 Gianoukos et al. Nov 2013 A1
20130321314 Oh et al. Dec 2013 A1
20130322634 Bennett et al. Dec 2013 A1
20130346882 Shiplacoff et al. Dec 2013 A1
20130347018 Limp et al. Dec 2013 A1
20140026099 Andersson Reimer Jan 2014 A1
20140055552 Song et al. Feb 2014 A1
20140058873 Sorensen et al. Feb 2014 A1
20140062790 Letz et al. Mar 2014 A1
20140066105 Bridge et al. Mar 2014 A1
20140073256 Newham et al. Mar 2014 A1
20140085487 Park et al. Mar 2014 A1
20140099973 Cecchini et al. Apr 2014 A1
20140136990 Gonnen et al. May 2014 A1
20140181183 Houjou et al. Jun 2014 A1
20140189533 Krack et al. Jul 2014 A1
20140222933 Stovicek et al. Aug 2014 A1
20140237126 Bridge Aug 2014 A1
20140240122 Roberts et al. Aug 2014 A1
20140344711 Hallerstrom et al. Nov 2014 A1
20140365944 Moore et al. Dec 2014 A1
20150007049 Langlois et al. Jan 2015 A1
20150040029 Koum et al. Feb 2015 A1
20150089660 Song et al. Mar 2015 A1
20150100537 Grieves et al. Apr 2015 A1
20150102992 Klement et al. Apr 2015 A1
20150172393 Oplinger et al. Jun 2015 A1
20150180746 Day, II et al. Jun 2015 A1
20150185849 Levesque et al. Jul 2015 A1
20150188869 Gilad et al. Jul 2015 A1
20150248389 Kalm et al. Sep 2015 A1
20150264303 Chastney et al. Sep 2015 A1
20150286387 Gu et al. Oct 2015 A1
20150286391 Jacobs et al. Oct 2015 A1
20150312185 Langholz et al. Oct 2015 A1
20150346912 Yang et al. Dec 2015 A1
20150350130 Yang et al. Dec 2015 A1
20150350140 Garcia et al. Dec 2015 A1
20150350141 Yang et al. Dec 2015 A1
20160036735 Pycock et al. Feb 2016 A1
20160036996 Midholt et al. Feb 2016 A1
20160054841 Yang et al. Feb 2016 A1
20160073223 Woolsey et al. Mar 2016 A1
20160234060 Pai et al. Aug 2016 A1
20160294958 Zhang Oct 2016 A1
20160295384 Shan et al. Oct 2016 A1
20160299526 Inagaki et al. Oct 2016 A1
20160342141 Koumaiha Nov 2016 A1
20170026796 Pai et al. Jan 2017 A1
20170083189 Yang et al. Mar 2017 A1
20170083202 Yang et al. Mar 2017 A1
20170220212 Yang et al. Aug 2017 A1
20180091951 Sandel et al. Mar 2018 A1
Foreign Referenced Citations (60)
Number Date Country
2016102028 Jul 2017 AU
1475924 Feb 2004 CN
1852335 Oct 2006 CN
101390371 Mar 2009 CN
102098656 Jun 2011 CN
102111505 Jun 2011 CN
201928419 Aug 2011 CN
102695302 Sep 2012 CN
103207674 Jul 2013 CN
103309606 Sep 2013 CN
103500079 Jan 2014 CN
103583031 Feb 2014 CN
103959751 Jul 2014 CN
104205785 Dec 2014 CN
205263700 May 2016 CN
1387590 Feb 2004 EP
2574026 Mar 2013 EP
2610701 Jul 2013 EP
2610701 Apr 2014 EP
2849042 Mar 2015 EP
2002-366485 Dec 2002 JP
2003-516057 May 2003 JP
2003-207556 Jul 2003 JP
2006-072489 Mar 2006 JP
2006-079427 Mar 2006 JP
2006-113637 Apr 2006 JP
2006-129429 May 2006 JP
2009-081865 Apr 2009 JP
2010-503126 Jan 2010 JP
2010-503332 Jan 2010 JP
2010-288162 Dec 2010 JP
2010-539804 Dec 2010 JP
2011-060065 Mar 2011 JP
2011-107823 Jun 2011 JP
2012-508530 Apr 2012 JP
2012-198369 Oct 2012 JP
2013-048389 Mar 2013 JP
2014-057129 Mar 2014 JP
10-2004-0089329 Oct 2004 KR
2007-0096222 Oct 2007 KR
2008-0074813 Aug 2008 KR
200532429 Oct 2005 TW
2001041468 Jun 2001 WO
2002003093 Jan 2002 WO
2008030972 Mar 2008 WO
2009071112 Jun 2009 WO
2010048995 May 2010 WO
2010054373 May 2010 WO
2011080622 Jul 2011 WO
2012128824 Sep 2012 WO
2012170446 Dec 2012 WO
2013093558 Jun 2013 WO
2013169842 Nov 2013 WO
2013169865 Nov 2013 WO
2013169875 Nov 2013 WO
2014083001 Jun 2014 WO
2014105276 Jul 2014 WO
2015038684 Mar 2015 WO
2015120358 Aug 2015 WO
2016036472 Mar 2016 WO
Non-Patent Literature Citations (167)
Entry
International Search Report in International Application No. PCT/US2012/038718, dated Aug. 17, 2012, 3 pages.
International Search Report and Written Opinion in International Application No. PCT/US13/41780, dated Dec. 1, 2014, 8 pages.
International Preliminary Report on Patentability in International Application No. PCT/US13/41780, dated Dec. 9, 2014, 7 pages.
Japanese Office Action in Japanese Application No. 2012-113725, dated May 27, 2013, 9 pages (with English Translation).
Korean Preliminary Rejection in Korean Application No. 10-2012-54888, dated Sep. 5, 2014, 9 pages (with English Translation).
Search and Examination Report in GB Application No. GB1209044.5, dated Aug. 24, 2012, 10 pages.
U.S. Final Office Action in U.S. Appl. No. 13/113,856, dated Nov. 7, 2012, 19 pages.
U.S. Final Office Action in U.S. Appl. No. 13/488,430, dated May 8, 2013, 19 pages.
U.S. Non-Final Office Action in U.S. Appl. No. 13/113,856, dated Jul. 18, 2012, 14 pages.
U.S. Non-Final Office Action in U.S. Appl. No. 13/488,430, dated Dec. 5, 2012, 13 pages.
Written Opinion in International Application No. PCT/US2012/038718, dated Aug. 17, 2012, 4 pages.
Australian Patent Examination Report No. 1 in Australian Application No. 2013203926, dated Oct. 7, 2014, 5 pages.
Australian Patent Examination Report No. 2 in Australian Application No. 2013203926, dated Jan. 13, 2016, 3 pages.
European Extended Search Report in Application No. 16155938.0, dated Jun. 7, 2016, 8 pages.
Chinese Office Action for Application No. 201210288784.3, dated Jan. 5, 2017, 13 pages (with English translation).
India Office Action for Application No. 2030/CHE/2012, dated Dec. 27, 2016, 9 pages.
Chinese Notification of Reexamination for Application No. 201210288784.3, dated Sep. 27, 2017, 17 pages (with English translation).
Australian Certificate of Examination in Australian Patent Application No. 2017100760 dated Feb. 9, 2018, 2 pages.
Australian Notice of Acceptance in Australian Patent Application No. 2015267259, dated Jan. 30, 2018, 3 pages.
Australian Notice of Acceptance in Australian Patent Application No. 2015267260, dated Jan. 30, 2018, 3 pages.
Australian Notice of Acceptance in Australian Patent Application No. 2015312369, dated Mar. 21, 2018, 3 pages.
Australian Office Action in Australian Patent Application No. 2015100711, dated Jul. 27, 2015, 7 pages.
Australian Office Action in Australian Patent Application No. 2015100711, dated Nov. 19, 2015, 6 pages.
Australian Office Action in Australian Patent Application No. 2015101188, dated Apr. 14, 2016, 3 pages.
Australian Office Action in Australian Patent Application No. 2015267259, dated Jun. 2, 2017, 2 pages.
Australian Office Action in Australian Patent Application No. 2015267260, dated Jun. 2, 2017, 2 pages.
Australian Office Action in Australian Patent Application No. 2015312369, dated Mar. 29, 2017, 3 Pages.
Australian Office Action in Australian Patent Application No. 2016102028, dated Feb. 13, 2017, 4 pages.
Australian Office Action in Australian Patent Application No. 2016102029, dated Feb. 22, 2017, 4 pages.
Australian Office Action in Australian Patent Application No. 2017100197, dated Apr. 28, 2017, 4 Pages.
Australian Office Action in Australian Patent Application No. 2017100198, dated Apr. 20, 2017, 4 pages.
Australian Office Action in Australian Patent Application No. 2017100760, dated Aug. 10, 2017, 4 pages.
Australian Office Action in Australian Patent Application No. 2017100760, dated Jan. 30, 2018, 3 pages.
Australian Office Action in Australian Patent Application No. 2018204430, dated Aug. 15, 2018, 5 pages.
Chinese Notice of Allowance received for Chinese Patent Application No. 201510290133.1, dated Jan. 9, 2019, 3 pages with English Translation.
Chinese Notice of Allowance received for Chinese Patent Application No. 201510291012.9, dated Jan. 9, 2019, 3 pages with English Translation.
Chinese Notice of Allowance received for Chinese Patent Application No. 201520365358.4, dated Nov. 20, 2015, 2 pages with English Translation.
Chinese Notice of Allowance received for Chinese Patent Application No. 201520365843.1, dated Feb. 15, 2016, 3 pages with English Translation.
Chinese Notice of Allowance received for Chinese Patent Application No. 201520669842.6, dated May 18, 2016, 2 pages with English Translation.
Chinese Office Action received for Chinese Patent Application No. 201510290133.1, dated Feb. 9, 2018, 10 pages with English Translation.
Chinese Office Action received for Chinese Patent Application No. 201510291012.9, dated Feb. 8, 2018, 9 pages with English Translation.
Chinese Office Action received for Chinese Patent Application No. 201510549056.7, dated Aug. 7, 2018, 7 pages with English Translation.
Chinese Office Action received for Chinese Patent Application No. 201510549056.7, dated Nov. 24, 2017, 22 pages with English Translation.
Chinese Office Action received for Chinese Patent Application No. 201520365358.4, dated Aug. 11, 2015, 4 pages with English Translation.
Chinese Office Action received for Chinese Patent Application No. 201520365843.1, dated Aug. 25, 2015, 4 pages with English Translation.
Chinese Office Action received for Chinese Patent Application No. 201520365843.1, dated Nov. 16, 2015, 3 pages with English Translation.
Chinese Office Action received for Chinese Patent Application No. 201520669842.6, dated Dec. 4, 2015, 7 pages with English Translation.
Chinese Office Action received for Chinese Patent Application No. 201620393549.6, dated Aug. 18, 2016, 2 pages with English Translation.
Chinese Office Action received for Chinese Patent Application No. 201620393748.7, dated Aug. 18, 2016, 2 pages with English Translation.
Danish Decision to Grant received for Danish Patent Application No. PA201770126, dated Mar. 27, 2018, 2 pages.
Danish Intention to Grant received for Denmark Patent Application No. PA201570550, dated Dec. 22, 2016, 2 pages.
Danish Intention to Grant received for Denmark Patent Application No. PA201770126, dated Jan. 19, 2018, 2 pages.
Danish Notice of Allowance received for Danish Patent Application No. PA201570550, dated Mar. 20, 2017, 2 pages.
Danish Office Action received for Danish Patent Application No. PA201570550, dated Dec. 7, 2015, 5 pages.
Danish Office Action received for Danish Patent Application No. PA201570550, dated Jan. 19, 2016, 2 pages.
Danish Office Action received for Danish Patent Application No. PA201570550, dated Oct. 19, 2016, 3 pages.
Danish Office Action received for Danish Patent Application No. PA201770089, dated Apr. 25, 2017, 10 pages.
Danish Office Action received for Danish Patent Application No. PA201770125, dated Jan. 26, 2018, 5 pages.
Danish Office Action received for Danish Patent Application No. PA201770125, dated Jul. 20, 2018, 2 pages.
Danish Office Action received for Danish Patent Application No. PA201770126, dated Oct. 18, 2017, 3 pages.
Danish Search Report received for Danish Patent Application No. PA201770125, dated May 5, 2017, 10 pages.
Danish Search Report received for Danish Patent Application No. PA201770126, dated Apr. 26, 2017, 8 Pages.
European Extended Search Report in European Patent Application No. 17167629.9, dated Jun. 2, 2017, 7 pages.
European Extended Search Report in European Patent Application No. 18170262.2, dated Jul. 25, 2018, 8 pages.
European Office Action in European Patent Application No. 15728307.8, dated Feb. 8, 2018, 7 pages.
European Office Action in European Patent Application No. 15729286.3, dated Feb. 7, 2018, 7 pages.
European Office Action in European Patent Application No. 15759981.2, dated Apr. 19, 2018, 6 pages.
European Office Action in European Patent Application No. 15759981.2, dated Aug. 6, 2018, 10 pages.
European Office Action in European Patent Application No. 15759981.2, dated May 16, 2018, 6 pages.
European Office Action in European Patent Application No. 17167629.9, dated Jan. 25, 2019, 7 pages.
International Preliminary Report on Patentability received for PCT Application No. PCT/US2015/032309, dated Dec. 15, 2016, 7 pages.
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2015/032305, dated Dec. 15, 2016, 7 pages.
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2015/043487, dated Feb. 16, 2017, 12 pages.
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2015/044083, dated Mar. 16, 2017, 24 pages.
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2015/046787, dated Mar. 16, 2017, 18 pages.
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2016/046828, dated Mar. 1, 2018, 19 pages.
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2015/032305, dated Sep. 10, 2015, 9 pages.
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2015/032309, dated Sep. 2, 2015, 9 pages.
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2015/043487, dated Jan. 29, 2016, 17 pages.
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2015/044083, dated Feb. 4, 2016, 31 pages.
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2015/046787, dated Apr. 1, 2016, 26 pages.
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2016/046828, dated Dec. 15, 2016, 21 pages.
Invitation to Pay Additional Fees and Partial Search Report received for PCT Patent Application No. PCT/US2015/043487, dated Nov. 9, 2015, 4 pages.
Invitation to Pay Additional Fees received for PCT Patent Application No. PCT/US2015/044083, mailed on Nov. 4, 2015, 11 pages.
Invitation to Pay Additional Fees received for PCT Patent Application No. PCT/US2015/046787, mailed on Dec. 15, 2015, 8 pages.
Invitation to Pay Additional Fees received for PCT Patent Application No. PCT/US2016/046828, mailed on Sep. 23, 2016, 2 pages.
Japanese Notice of Allowance received for Japanese Patent Application No. 2017-510297, dated May 7, 2018, 5 pages with English Translation.
Japanese Notice of Allowance received for Japanese Patent Application No. 2017-514992, dated Feb. 15, 2019, 5 pages with English Translation.
Japanese Notice of Allowance received for Japanese Patent Application No. 2017-514993, dated Jan. 12, 2018, 6 pages with English Translation.
Japanese Notice of Allowance received for Japanese Patent Application No. 2018-072632, dated Dec. 7, 2018, 6 pages with English Translation.
Japanese Office Action received for Japanese Patent Application No. 2017-510297, dated Dec. 4, 2017, 6 pages with English Translation.
Japanese Office Action received for Japanese Patent Application No. 2017-510297, dated Jul. 10, 2017, 9 pages with English Translation.
Japanese Office Action received for Japanese Patent Application No. 2017-514992, dated Apr. 6, 2018, 9 pages with English Translation.
Japanese Office Action received for Japanese Patent Application No. 2018-018497, dated Dec. 10, 2018, 7 pages with English Translation.
Japanese Office Action received for Japanese Patent Application No. 2018-072632, dated Jul. 9, 2018, 5 Pages with English Translation.
Korean Notice of Allowance received for Korean Patent Application No. 10-2017-7005628, dated Jun. 18, 2018, 4 pages with English Translation.
Korean Office Action received for Korean Patent Application No. 10-2017-7005628, dated Jan. 30, 2018, 6 pages with English translation.
Korean Office Action received for Korean Patent Application No. 10-2017-7005628, dated May 10, 2017, 11 pages with English Translation.
Korean Office Action received for Korean Patent Application No. 10-2018-7027006, dated Jan. 14, 2019, 4 pages with English Translation.
Netherland Search Report and Opinion received for Netherlands Patent Application No. 2015354, completed on Jun. 22, 2017, 23 pages with English Translation.
Netherland Search Report and Opinion received for Netherlands Patent Application No. 2019878, dated Apr. 6, 2018, 23 pages with English Translation.
Taiwanese Office Action received for Taiwanese Patent Application No. 104107332, dated Oct. 29, 2018, 12 pages with English Translation.
Taiwanese Office Action received for Taiwanese Patent Application No. 104128519, dated Mar. 29, 2017, 16 pages with English Translation.
Taiwanese Office Action received for Taiwanese Patent Application No. 104128704, dated Jul. 31, 2017, 7 pages with English Translation.
Taiwanese Office Action received for Taiwanese Patent Application No. 104128704, dated Nov. 2, 2016, 12 pages with English Translation.
U.S. Notice of Allowance in U.S. Appl. No. 14/503,376, dated Jul. 29, 2015, 12 pages.
U.S. Notice of Allowance in U.S. Appl. No. 14/503,376, dated Sep. 2, 2015, 4 pages.
U.S. Notice of Allowance in U.S. Appl. No. 14/503,376, dated Sep. 24, 2015, 5 pages.
U.S. Notice of Allowance in U.S. Appl. No. 14/503,386, dated Jul. 30, 2015, 11 pages.
U.S. Notice of Allowance in U.S. Appl. No. 14/503,386, dated Sep. 24, 2015, 5 pages.
U.S. Notice of Allowance in U.S. Appl. No. 14/817,572, dated Nov. 30, 2017, 26 pages.
U.S. Notice of Allowance in U.S. Appl. No. 14/838,235, dated Dec. 29, 2016, 3 pages.
U.S. Notice of Allowance in U.S. Appl. No. 14/838,235, dated Oct. 4, 2016, 7 pages.
U.S. Notice of Allowance in U.S. Appl. No. 14/841,608, dated Jan. 25, 2018, 2 pages.
U.S. Notice of Allowance in U.S. Appl. No. 14/841,608, dated Nov. 14, 2017, 5 pages.
U.S. Notice of Allowance in U.S. Appl. No. 14/841,614, dated Jan. 8, 2019, 3 pages.
U.S. Notice of Allowance in U.S. Appl. No. 14/841,614, dated Oct. 24, 2018, 10 pages.
U.S. Notice of Allowance in U.S. Appl. No. 14/841,623, dated Feb. 23, 2018, 8 pages.
U.S. Notice of Allowance in U.S. Appl. No. 15/142,661, dated Oct. 4, 2017, 21 pages.
U.S. Notice of Allowance in U.S. Appl. No. 15/425,273, dated Mar. 7, 2019, 8 pages.
U.S. Notice of Allowance in U.S. Appl. No. 15/431,435, dated Jan. 23, 2018, 8 pages.
U.S. Notice of Allowance in U.S. Appl. No. 15/876,673, dated May 4, 2018, 26 pages.
U.S. Notice of Allowance in U.S. Appl. No. 15/985,570, dated Mar. 13, 2019, 21 pages.
U.S. Office Action in U.S. Appl. No. 14/503,376, dated Dec. 22, 2014, 19 pages.
U.S. Office Action in U.S. Appl. No. 14/503,386, dated Jan. 7, 2015, 18 pages.
U.S. Office Action in U.S. Appl. No. 14/817,572, dated Mar. 23, 2017, 13 pages.
U.S. Office Action in U.S. Appl. No. 14/817,572, dated Sep. 12, 2016, 8 pages.
U.S. Office Action in U.S. Appl. No. 14/838,235, dated Jun. 15, 2016, 17 pages.
U.S. Office Action in U.S. Appl. No. 14/841,608, dated Apr. 12, 2017, 8 pages.
U.S. Office Action in U.S. Appl. No. 14/841,614, dated Jul. 27, 2017, 12 pages.
U.S. Office Action in U.S. Appl. No. 14/841,614, dated May 10, 2018, 12 pages.
U.S. Office Action in U.S. Appl. No. 14/841,623, dated Feb. 2, 2017, 16 pages.
U.S. Office Action in U.S. Appl. No. 14/841,623, dated Sep. 5, 2017, 15 pages.
U.S. Office Action in U.S. Appl. No. 14/928,865, dated Dec. 5, 2018, 14 pages.
U.S. Office Action in U.S. Appl. No. 14/928,865, dated Mar. 27, 2018, 14 pages.
U.S. Office Action in U.S. Appl. No. 15/142,661, dated Jan. 25, 2017, 28 Pages.
U.S. Office Action in U.S. Appl. No. 15/366,763, dated Mar. 8, 2019, 13 pages.
U.S. Office Action in U.S. Appl. No. 15/425,273, dated Oct. 3, 2018, 9 pages.
U.S. Office Action in U.S. Appl. No. 15/431,435, dated Jun. 8, 2017, 10 pages.
U.S. Office Action in U.S. Appl. No. 15/985,570, dated Aug. 16, 2018, 23 pages.
U.S. Office Action received for U.S. Appl. No. 14/838,235, dated Jan. 5, 2016, 18 pages.
absoluteblogger.com' [online]. “WeChat Review—Communication Application with Screenshots” available on or before Jun. 14, 2013, [retrieved on Apr. 23, 2019], via Internet Archive: Wayback Machine URL<http://www.absoluteblogger.com/2012/10/wechat-review-communication-application.html>. 4 pages.
appps.jp' [online]. "WhatsApp" users over 400 million people! I tried to investigate the most used messaging application in the world," Jan. 24, 2014, [retrieved on Apr. 23, 2019], via Internet Archive: Wayback Machine URL<https://web.archive.org/web/20140410142411/http://www.appps.jp/2128786>. 13 pages, with Machine English Translation.
digitalstreetsa.com' [online]. "Why WeChat might kill Whatsapp's future . . . " Jul. 3, 2013, [retrieved on Apr. 23, 2019], retrieved from: URL<http://digitalstreetsa.com/why-wechat-might-kill-whatsapps-future>. 9 pages.
download.cnet.com' [online]. "WeChat APK for Android" Jan. 7, 2013, [retrieved on Apr. 23, 2019], retrieved from: URL<http://download.cnet.com/WeChat/3000-2150_4-75739423.html>. 5 pages.
engadget.com' [online]. "WhatsApp Introduces Major New Audio Features," Aug. 7, 2013, [retrieved on Apr. 23, 2019], retrieved from: URL<http://www.engadget.com/2013/08/07/whatsapp-introduces-major-new-audio-features>. 12 pages.
heresthethingblog.com' [online]. "iOS 7 tip: Alerts, Banners, and Badges: What's the Difference?" Jan. 22, 2014, [retrieved on Apr. 23, 2019], via Internet Archive: Wayback Machine URL<https://web.archive.org/web/20140128072440/http://heresthethingblog.com/2014/01/22/ios-7-tip-whats-difference-alert/>. 5 pages.
iPhone, "User Guide for iOS 7.1 Software", Mar. 2014, 162 pages.
jnd.org' [online]. "Affordances and Design," published on or before Feb. 25, 2010 [retrieved on Apr. 23, 2019], via Internet Archive: Wayback Machine URL<https://web.archive.org/web/20150318044240/jnd.org/dn.mss/affordancesand.html>. 6 pages.
makeuseof.com' [online]. "MS Outlook Tip: How to Automatically Organize Incoming Emails," Sep. 27, 2019, [retrieved on Apr. 23, 2019], retrieved from: URL<http://www.makeuseof.com/tag/ms-outlook-productivity-tip-how-to-move-emails-to-individual-folders-automatically>. 5 pages.
manualslib.com' [online]. “Samsung Gear 2 User Manual”, 2014, [retrieved on Apr. 23, 2019], retrieved from: URL<http://www.manualslib.com/download/754923/Samsung-Gear-2.html>. 97 pages.
Samsung, “SM-G900F User Manual”, English (EU). Rev. 1.0, Mar. 2014, 249 pages.
Samsung, “SM-R380”, User Manual, 2014, 74 pages.
seechina365.com' [online]. "How to use China's popular social networking service wechat2_ voice message, press together, shake function etc." Apr. 5, 2014, [retrieved on Apr. 23, 2019], via Internet Archive: Wayback Machine URL<http://seechina365.com/2014/04/05/wechat02>. 29 pages with Machine English Translation.
slideshare.net' [online]. "Samsung Gear 2 User manual", Apr. 2014, [retrieved on Apr. 23, 2019], retrieved from: URL<http://www.slideshare.net/badaindonesia/samsung-gear-2-user-manual>. 58 pages.
wechat.wikia.com' [online]. "WeChat Wiki", May 14, 2013, [retrieved on Apr. 23, 2019], via Internet Archive: Wayback Machine URL<http://web.archive.org/web/20130514131044/http://wechat.wikia.com/wiki/WeChat_Wiki>. 6 pages.
wikihow.com' [online]. “How to Move Mail to Different Folders in Gmail,” available on or before Jul. 31, 2014, [retrieved on Apr. 23, 2019], via Internet Archive: Wayback Machine URL<https://web.archive.org/web/20140731230338/http://www.wikihow.com/Move-Mail-to-Different-Folders-in-Gmail>. 4 pages.
youtube.com' [online]. “How to Dismiss Banner Notifications or Toast Notifications on iOS7,” Dec. 17, 2013, [retrieved on Apr. 23, 2019], retrieved from: URL<https://www.youtube.com/watch?v=vSiHnBFIW M>. 2 pages.
youtube.com' [online]. “How To Send A Picture Message/MMS-Samsung Galaxy Note 3,” Nov. 3, 2013, [retrieved on Apr. 23, 2019], retrieved from: URL<https://www.youtube.com/watch?v=-3dOz8-KeDw>. \2 page.
youtube.com' [online]. “iOS 7 Notification Center Complete Walkthrough,” Jun. 10, 2013, [retrieved on Apr. 23, 2019], retrieved from: URL<https://www.youtube.com/watch?v=gATXt-o42LA>. 3 pages.
youtube.com' [online]. “iOS Notification Banner Pull Down to Notification Center in iOS 7 Beta 5”, Aug. 6, 2013, [retrieved on Apr. 23, 2019], retrieved from: URL<https://www.youtube.com/watch?v=nP0s6ETPxDg>. 2 pages.
youtube.com' [online]. “Notification & Control Center Problem Issue Solution” Dec. 6, 2013, [retrieved on Apr. 23, 2019], retrieved from: URL<https://www.voutube.com/watch?v=KOzCue YlaTA>. 3 pages.
youtube.com' [online]. “WeChat TVC—Hold To Talk”, May 11, 2013, [retrieved on Apr. 23, 2019], retrieved from: URL<https://www.youtube.com/watch?v=E UxteOWVSo>. 2 page.
Australian Patent Examination Report No. 1 in Australian Application No. 2012202929, dated Sep. 28, 2013, 3 pages.
Chinese Office Action in Chinese Application No. 201210288784.3, dated Jul. 3, 2014, 16 pages (with English Translation).
European Search Report in European Application No. 12168980.6, dated Sep. 21, 2012, 7 pages.
International Preliminary Report on Patentability in International Application No. PCT/US2012/038718, dated Nov. 26, 2013, 5 pages.
Related Publications (1)
Number Date Country
20200028813 A1 Jan 2020 US
Provisional Applications (1)
Number Date Country
62006110 May 2014 US
Continuations (1)
Number Date Country
Parent 14503270 Sep 2014 US
Child 16532349 US