The present disclosure relates to remotely communicating with a mobile device, such as a mobile telephone or a media player, and more specifically, to causing a mobile device to perform a function through the transmission of one or more remote commands.
Mobile devices have been adapted to a wide variety of applications, including computing, communication, and entertainment. Through recent improvements, mobile devices can now also determine their geographic location, either by using a built-in global positioning system (GPS) antenna or by extrapolating their location from the signals they receive through the network of fixed-location cellular antennas. Thus, a user may be able to use the mobile device to determine his or her location.
A mobile device user may wish to have friends or family members know of his or her location and likewise, he or she may like to know the location of his or her friends or family members. Several known systems perform such services. However, one drawback of such services is that determining locations, particularly when using GPS devices, can consume considerable power.
Balancing battery life and mobile device performance is a chief concern for mobile device makers, and location-aware programs are a significant part of that concern. Specifically, applications that must make frequent requests of a GPS device consume considerable power. Such applications include mapping programs and social location-aware applications such as FOURSQUARE and GOOGLE LATITUDE, which allow a user to share his location with a server so that authorized friends can view the user's location on their own mobile devices. Frequently, such services require an application running on the user's mobile device to periodically activate the GPS device, learn the user's location, and update the server. Such repeated use of the GPS device drastically reduces the battery life of the mobile device.
Additional features and advantages of the disclosure will be set forth in the description which follows, and in part will be obvious from the description, or can be learned by practice of the herein disclosed principles. The features and advantages of the disclosure can be realized and obtained by means of the instruments and combinations particularly pointed out in the appended claims. These and other features of the disclosure will become more fully apparent from the following description and appended claims, or can be learned by the practice of the principles set forth herein.
Disclosed are systems, methods, and non-transitory computer-readable storage media for determining the location of one or more mobile devices connected to a communications network. The present technology provides a system for allowing users to learn the location of other users who have given permission to have their location shared. In a preferred embodiment, a user can launch an application which allows the user to request permission from a friend to receive information describing the friend's location. The application can list the friends who have given their permission to the user to view their location information.
When a user desires to see the location of one or more friends, the application can request location information for each friend, or selected friends, from a system server. The server can receive and interpret the request to determine whether the application requires detailed location information or approximate location information. For example, if the application has requested location information for all friends, the request can be interpreted as one for only approximate information because, among other reasons, displaying all friends on a map on a computer screen requires only approximate locations. However, if the application recently received updated approximate information regarding a particular friend, but is now requesting additional location information on just that specific friend, it is likely that the application requires detailed location information.
The difference between detailed location information and approximate location information is based not only on a threshold of tolerated variance in the location information, but also on the time since updated location information was received by the server and on the power required for the friend's device to learn an accurate location. For example, detailed location information might require an accuracy of +/−3 m, and with present technology, such accuracy is most often obtained using a GPS device. Additionally, detailed location information might only be considered accurate for a duration of 1 minute or less. In contrast, approximate location information may only require a city level of accuracy (e.g., +/−1 km) and be deemed relevant for up to 15 minutes or more.
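The distinction drawn above can be sketched as a simple classification of a cached fix. This is a minimal illustrative sketch, not part of the disclosed system: the class, function, and threshold names are assumptions, with the values taken from the examples in the text (+/−3 m and 1 minute for detailed, +/−1 km and 15 minutes for approximate).

```python
from dataclasses import dataclass

# Illustrative thresholds drawn from the examples above; the exact
# names and values are assumptions, not a claimed implementation.
DETAILED_MAX_ERROR_M = 3.0       # detailed: +/- 3 m
DETAILED_MAX_AGE_S = 60.0        # detailed: relevant for 1 minute or less
APPROX_MAX_ERROR_M = 1000.0      # approximate: city level, +/- 1 km
APPROX_MAX_AGE_S = 15 * 60.0     # approximate: relevant for up to 15 minutes


@dataclass
class LocationFix:
    lat: float
    lon: float
    error_m: float   # reported accuracy radius in meters
    age_s: float     # seconds since the fix was taken


def classify(fix: LocationFix) -> str:
    """Decide whether a fix still qualifies as detailed or approximate
    location information, or is too stale to serve at all."""
    if fix.error_m <= DETAILED_MAX_ERROR_M and fix.age_s <= DETAILED_MAX_AGE_S:
        return "detailed"
    if fix.error_m <= APPROX_MAX_ERROR_M and fix.age_s <= APPROX_MAX_AGE_S:
        return "approximate"
    return "stale"
```

Under this sketch, a fresh GPS fix classifies as "detailed," while a minutes-old cell-tower fix still qualifies as "approximate."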
A request to locate a friend is processed by a central server. Upon receiving a request, the server may forward the request to the friend's device and wait for a response. Alternatively, the server may respond to the request without contacting the friend's device. For example, the server may have cached location information of the friend's device. Because location information is only relevant at certain accuracies and for a certain period of time, the server may compare the cached information with the request and/or any predetermined constraints before sending the cached location information rather than sending a request to the friend's device.
In order to describe the manner in which the above-recited and other advantages and features of the disclosure can be obtained, a more particular description of the principles briefly described above will be rendered by reference to specific embodiments thereof which are illustrated in the appended drawings. Understanding that these drawings depict only exemplary embodiments of the disclosure and are not therefore to be considered to be limiting of its scope, the principles herein are described and explained with additional specificity and detail through the use of the accompanying drawings in which:
Various embodiments of the disclosure are discussed in detail below. While specific implementations are discussed, it should be understood that this is done for illustration purposes only. A person skilled in the relevant art will recognize that other components and configurations may be used without departing from the spirit and scope of the disclosure.
The present disclosure addresses the need in the art for a mechanism for transmitting location information of a user's mobile device and locating friends and family members through their respective mobile devices. A system, method and non-transitory computer-readable media are disclosed which locate a mobile device by sending a command to the device to determine its present location and report it back to the requestor. A brief introductory description of a basic general purpose system or computing device in
With reference to
The system bus 110 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. A basic input/output system (BIOS) stored in ROM 140 or the like may provide the basic routine that helps to transfer information between elements within the computing device 100, such as during start-up. The computing device 100 further includes storage devices 160 such as a hard disk drive, a magnetic disk drive, an optical disk drive, tape drive or the like. The storage device 160 can include software modules 162, 164, 166 for controlling the processor 120. Other hardware or software modules are contemplated. The storage device 160 is connected to the system bus 110 by a drive interface. The drives and the associated computer readable storage media provide nonvolatile storage of computer readable instructions, data structures, program modules and other data for the computing device 100. In one aspect, a hardware module that performs a particular function includes the software component stored in a non-transitory computer-readable medium in connection with the necessary hardware components, such as the processor 120, bus 110, output device 170, and so forth, to carry out the function. The basic components are known to those of skill in the art and appropriate variations are contemplated depending on the type of device, such as whether the device 100 is a small, handheld computing device, a desktop computer, or a computer server.
Although the exemplary embodiment described herein employs a storage device 160, it should be appreciated by those skilled in the art that other types of computer readable media which can store data that are accessible by a computer, such as magnetic cassettes, flash memory cards, digital versatile disks, cartridges, random access memories (RAMs) 150, read only memory (ROM) 140, a cable or wireless signal containing a bit stream and the like, may also be used in the exemplary operating environment. Non-transitory computer-readable storage media expressly exclude media such as energy, carrier signals, electromagnetic waves, and signals per se.
To enable user interaction with the computing device 100, an input device 190 represents any number of input mechanisms, such as a microphone for speech, a touch-sensitive screen for gesture or graphical input, a keyboard, a mouse, motion input, and so forth. An output device 170 can also be one or more of a number of output mechanisms known to those of skill in the art. In some instances, multimodal systems enable a user to provide multiple types of input to communicate with the computing device 100. The communications interface 180 generally governs and manages the user input and system output. There is no restriction on operating on any particular hardware arrangement and therefore the basic features here may easily be substituted for improved hardware or firmware arrangements as they are developed.
For clarity of explanation, the illustrative system embodiment is presented as including individual functional blocks including functional blocks labeled as a “processor” or processor 120. The functions these blocks represent may be provided through the use of either shared or dedicated hardware, including, but not limited to, hardware capable of executing software and hardware, such as a processor 120, that is purpose-built to operate as an equivalent to software executing on a general purpose processor. For example, the functions of one or more processors presented in
The logical operations of the various embodiments are implemented as: (1) a sequence of computer implemented steps, operations, or procedures running on a programmable circuit within a general use computer; (2) a sequence of computer implemented steps, operations, or procedures running on a specific-use programmable circuit; and/or (3) interconnected machine modules or program engines within the programmable circuits. The system 100 shown in
Having disclosed some components of a computing system, the disclosure now turns to
The devices 220, 240, and 245 preferably have one or more location aware applications that may run on them. Of these applications, some may have the functionality to send requests to other user devices to enable a requesting user to locate a friend's device. Upon receiving authorization to locate, a requesting device may then be able to send location requests to requested devices and receive responses containing the location of the requested device. Authorization is preferably managed at the server level, but may also be managed at the device level in addition or as an alternative.
Referring back to
A device such as a user station 220 may also be configured to operate in the computing environment 200. The user station 220 can be any general purpose computing device that can be configured to communicate with a web-enabled application, such as through a web browser. For example, the user station 220 can be a personal computing device such as a desktop or workstation, or a portable computing device, such as a laptop, a smart phone, or a post-PC device. The user station 220 can include some or all of the features, components, and peripherals of computing device 100 of
User station 220 can further include a network connection to the communication network 210. The network connection can be implemented through a wired or wireless interface, and can support bi-directional communication between the user station 220 and one or more other computing devices over the communication network 210. Also, the user station 220 may include an interface application, such as a web browser or custom application, for communicating with a web-enabled application.
An application server 230 can also be configured to operate in the computing environment 200. The application server 230 can be any computing device that can be configured to host one or more applications. For example, the application server 230 can be a server, a workstation, or a personal computer. In some implementations, the application server 230 can be configured as a collection of computing devices, e.g., servers, sited in one or more locations. The application server 230 can include some or all of the features, components, and peripherals of computing device 100 of
The application server 230 can also include a network connection to the communication network 210. The network connection can be implemented through a wired or wireless interface, and can support bi-directional communication between the application server 230 and one or more other computing devices over the communication network 210. Further, the application server 230 can be configured to host one or more applications. For example, the application server 230 can be configured to host a remote management application that facilitates communication with one or more mobile devices connected with the network 210. The mobile devices 240, 245 and the application server 230 can operate within a remote management framework to execute remote management functions. The application server 230 can be configured to host a notification service application configured to support bi-directional communication over the network 210 between multiple communication devices included in the computing system 200. For example, the notification service application can permit a variety of messages to be transmitted and received by multiple computing devices.
In some implementations, the notification service can include a defined namespace, in which a unique command collection topic can be created for each subscribing mobile device. A unique identifier can be used to associate a subscribing mobile device with the corresponding command collection topic, such as an assigned number or address. The unique identifier also can be embedded in a Uniform Resource Identifier (URI) that is associated with a subscribed command collection topic. Further, one or more command nodes can be created below a command collection topic, such that each command node corresponds to a particular remote command type. For example, a command collection topic can include a separate command node for a location command.
Through the use of separate command nodes, multiple commands can be transmitted to one or more mobile devices substantially simultaneously. In some implementations, if multiple commands are received in a command collection topic, server time stamps can be compared to determine an order of execution.
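The command-collection structure above can be sketched as a small in-memory store. This is a hypothetical illustration, not the disclosed implementation: the class name, topic layout, and field names are all assumptions; the point shown is that commands are grouped per device under per-type command nodes and ordered by server time stamp.

```python
from collections import defaultdict


class CommandTopics:
    """Sketch of a command-collection namespace: one topic per device,
    one command node per remote command type (e.g. 'locate')."""

    def __init__(self):
        # device_id -> command node name -> list of (server_ts, payload)
        self._topics = defaultdict(lambda: defaultdict(list))

    def publish(self, device_id, command_node, server_ts, payload):
        """Publish a remote command to a device's command node."""
        self._topics[device_id][command_node].append((server_ts, payload))

    def pending(self, device_id):
        """Return all pending commands for a device, ordered by server
        time stamp so the execution order is well defined."""
        cmds = []
        for node, entries in self._topics[device_id].items():
            for ts, payload in entries:
                cmds.append((ts, node, payload))
        return sorted(cmds)
```

In this sketch, two commands published out of order are still retrieved in time-stamp order, mirroring the comparison of server time stamps described above.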
Through the notification service, a publisher, such as a remote management application, can publish a remote command message to a command collection topic that is associated with a particular mobile device. When a remote command message is published to the command collection topic, a notification message can be transmitted to the one or more subscribing mobile devices. The mobile device can then access the subscribed topic and retrieve one or more published messages. This communication between the publisher and the mobile device can be decoupled. Further, the remote command message can be published to the appropriate command node of the command collection topic. Additionally, a mobile device receiving a remote command message can publish a response to a result topic hosted by a notification service. A publisher such as a remote management application, can subscribe to the result topic and can receive any published response messages.
Further, the computing environment 200 can include one or more mobile devices, such as mobile device 240 and mobile device 245. These mobile devices are preferably smart phones such as an Apple iPhone® or post-PC devices such as an Apple iPad®. Each of the mobile devices included in the computing environment 200 can include a network interface configured to establish a connection to the communication network 210. For example, mobile device 240 can establish a cellular (e.g., GSM, EDGE, 3G, or 4G) network connection that provides data access to the communication network 210. Such a connection may be facilitated by one or more cellular towers 250 located within the range of the mobile devices 240 and 245 and connected to the network 210. Further, mobile device 245 can establish an IEEE 802.11 (i.e., WiFi or WLAN) network connection to the communication network 210. Such a connection may be facilitated by one or more wireless network routers 255 located within the range of the mobile devices 240 and 245 and connected to the network 210. Also, either one of these mobile devices 240, 245 or an additional device may connect to the network 210 through the IEEE 802.16 (i.e., wireless broadband or WiBB) standard. Again, the devices 240, 245 may employ the assistance of a cell tower 250 or wireless router 255 to connect to the communication network 210.
Each of the mobile devices, 240 and 245 also can be configured to communicate with the notification service application hosted by the application server 230 to publish and receive messages. Further, each of the mobile devices 240 and 245 can be configured to execute a remote management application or a remote management function responsive to a remote command received through the notification service application. In some embodiments, the remote management application can be integrated with the operating system of the mobile device.
A mobile device can execute a remote command to perform one or more associated functions. For example, the remote commands can include locate commands, notification commands, and message commands. A message command can be used to present a text-based message on the display of a mobile device. A locate command can be used to cause a mobile device to transmit a message indicating its location at the time the locate command is executed. The locate command may also command the mobile device to use certain resources, such as an embedded GPS system, to determine its location.
Additionally, each of the mobile devices 240 and 245 can include an input interface, through which one or more inputs can be received. For example, the input interface can include one or more of a keyboard, a mouse, a joystick, a trackball, a touch pad, a keypad, a touch screen, a scroll wheel, general and special purpose buttons, a stylus, a video camera, and a microphone. Each of the mobile devices 240 and 245 can also include an output interface through which output can be presented, including one or more displays, one or more speakers, and a haptic interface. Further, a location interface, such as a Global Positioning System (GPS) processor, also can be included in one or more of the mobile devices 240 and 245 to receive and process signals sent from GPS satellites 260 for obtaining location information, e.g., an indication of current location. In some implementations, general or special purpose processors included in one or more of the mobile devices 240 and 245 can be configured to perform location estimation, such as through base station triangulation or through recognizing stationary geographic objects through a video interface.
Having disclosed some basic system components and concepts, the disclosure now turns to exemplary method embodiments 300a and 300b shown in
In a preferred embodiment, the server 230 may maintain data associated with the members of one or more services. The maintained data may include certain identification information relating to each member such as, for example, the member's username and other personal identification information, unique identification information relating to the member's phone, and the identification of other members that have chosen to give permission to share their location information with this member. The information may also include recent location information of each member. This location information may be updated by certain applications/processes on the member's mobile device and/or at the request of a requesting device. For example, an application on a mobile device such as a mapping service or other location-aware application may be requested by the user to determine the location of the device and, whenever such a determination is made, the device may provide this information to the application server. The server may then retain this information in storage 335a for a length of time during which it is deemed to still be representative of that device's location (such as, for example, 15 minutes or less).
In a preferred embodiment, a user/requester may have an application on his or her computer or mobile device that, when executed, initiates one or more locate requests to all of the devices whose members have agreed to share their location with the requester (the requester's “friends”). In such embodiments, the application may initially present to the user/requester the location of all of the friends on a map or in a list. The locate request 310a may be received by a server such as application server 230 in
Upon receiving a location request from a mobile device 301a of a requesting user, the server may initially respond with the location data that it has cached in 335a. As mentioned above, in a preferred embodiment, the application server may maintain and/or cache information relating to members of services including recent location information. Updates in location information preferably overwrite older location information. Thus, the server may first, in step 315a, determine whether it is in possession of recent location information. As mentioned before, the server may have a set “time of life” for the location information it maintains. When it has decided that the location information it has is recent, in step 330a, the server retrieves the last known location from storage 335a. Again, in some instances, such as when a person may be on the go, only very recent location information would be relevant. Thus, in some embodiments, the time of life of the information may be adjusted based on the device's recent location activity. Some examples might include when the owner of the device has designated a location, such as home or work, where he/she typically remains for several hours at a time each day. Thus, if the server determines that it is in possession of location information of the requested mobile device deemed to be recent, it will provide that information to the requesting device in step 360a.
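The cache check in steps 315a, 330a, and 360a can be sketched as follows. This is an illustrative sketch only: the function names, the default time-of-life values, and the use of a "label" field to lengthen the time of life for designated places such as home or work are assumptions that mirror, but do not reproduce, the disclosure.

```python
import time

# Assumed time-of-life values: 15 minutes for a generic cached fix
# (per the example above), longer for a user-designated place where the
# owner typically remains for hours at a time.
DEFAULT_TTL_S = 15 * 60
ANCHORED_TTL_S = 4 * 60 * 60


def is_recent(cached, now=None):
    """Step 315a: cached is a dict with a 'timestamp' and an optional
    'label' (e.g. 'home' or 'work') marking a designated location."""
    now = time.time() if now is None else now
    ttl = ANCHORED_TTL_S if cached.get("label") in ("home", "work") else DEFAULT_TTL_S
    return (now - cached["timestamp"]) <= ttl


def handle_locate(cached, request_device_fn, now=None):
    """Serve the cached location if it is recent (steps 330a/360a);
    otherwise fall through to querying the device (step 320a)."""
    if cached is not None and is_recent(cached, now):
        return cached
    return request_device_fn()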
The server also preferably maintains this location information at a relatively low level of accuracy. The reason for this is similar to why the location is only deemed relevant for a short period of time: the more accurate the location information is, the more likely the person has since moved from that specific location, thereby rendering the location incorrect. Thus, maintaining recent location information at a lower level of accuracy increases the likelihood that the location is still correct, thereby avoiding additional communication with the user device.
Alternatively, the server may determine, in step 315a, that it does not have recent location information relating to the requested device. The server may, in step 320a, send a location request to the one or more requested devices (i.e., those devices associated with the friends). In this step, the server transmits a location request message to each requested device. The message sent by the server may take on any number of forms but has the effect of commanding the requested mobile device to obtain its current location information and transmit it back to the server in the form of a response message. In some alternative embodiments, the server only sends a location request message to the cellular network system, which may continually maintain recent location information relating to the requested device. Such location information may include, for example, the coordinates of the cell sites deemed closest to the requested device.
Some time after sending the request in step 320a, the server receives responses in step 340a. Depending on, for example, the location of the requested devices and the network traffic, the responses may arrive in any order and may take different amounts of time. The response messages from the devices preferably include information relating to the location of the responding device and the time at which the location was determined.
This location information may be determined by the device in any number of ways including but not limited to those that have been discussed above. This information may even be obtained indirectly (i.e., not directly from the requested device), such as from the cellular communications network with which the device is communicating, for example by obtaining location information from the cell tower identified as being closest to the mobile device. Although this option may offer lower accuracy, it oftentimes results in a quicker response and a savings in battery life for the requested device. Accordingly, the level of accuracy of the location information may vary, and the location information may therefore include accuracy information as well.
In some embodiments, the owner of the responding device may have the option to enter unique location identifiers or labels associated with a location. For example, a user may assign labels such as “home,” “work,” or “school” to such locations. The user's mobile device may preferably associate certain geographic coordinates with such a label and transmit location-based messages to the server including the associated label.
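The label association described above can be sketched as a lookup from coordinates to a user-assigned label. This is a hypothetical illustration: the label table, the coordinates, and the crude degree-based matching radius are all assumptions introduced for the example, not part of the disclosure.

```python
import math

# Hypothetical user-assigned labels mapping a name to (lat, lon).
LABELS = {
    "home": (37.33, -122.03),
    "work": (37.78, -122.41),
}

# Crude "close enough" threshold in degrees; an assumption for this
# sketch (roughly 1 km at these latitudes, ignoring longitude scaling).
LABEL_RADIUS_DEG = 0.01


def label_for(lat, lon):
    """Return the user's label for a coordinate, if any, so the device
    can report e.g. 'home' to the server instead of raw coordinates."""
    for name, (plat, plon) in LABELS.items():
        if math.hypot(lat - plat, lon - plon) <= LABEL_RADIUS_DEG:
            return name
    return None
```

A device near the stored "home" coordinates would transmit the label, while an unlabeled position would fall back to coordinates.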
Upon receiving this information, in step 350a, the server preferably updates the stored information 335a, if any, that it maintains relating to the device's last known location so that it may be made available to the next requester.
Having received a response from a requested device, in step 360a, the server may then send location information to the requesting device. This step may be performed for each response received by the server from the various requested devices. Although location information relating to some devices may have already been retrieved from cache 335a in step 330a, the server may additionally request and send updated information to the requesting device. In some embodiments, the server may additionally have a step (not shown) to compare the “known location information” that it had initially sent to the requesting device with the location information that it just received from the requested device to determine whether the recently received location information differs. In other words, some embodiments would only send location information to the requesting device if the location of the requested device has changed. In such embodiments, a reduction in the amount of data that needs to be communicated may be realized.
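The optional comparison step above can be sketched as a change check before forwarding. This is an illustrative sketch: the distance tolerance, the flat-earth approximation, and the function names are assumptions, not values from the disclosure.

```python
import math

def moved(prev, new, tolerance_m=25.0, meters_per_deg=111_000.0):
    """Very rough flat-earth distance check between two (lat, lon)
    fixes; the 25 m tolerance is an assumed value for this sketch."""
    if prev is None:
        return True
    dlat = (new[0] - prev[0]) * meters_per_deg
    dlon = (new[1] - prev[1]) * meters_per_deg
    return math.hypot(dlat, dlon) > tolerance_m


def maybe_forward(prev_sent, fresh_fix, send_fn):
    """Forward the fresh fix to the requester only when it differs
    meaningfully from what was already sent, saving bandwidth."""
    if moved(prev_sent, fresh_fix):
        send_fn(fresh_fix)
        return True
    return False
```

If the friend has not moved beyond the tolerance, no update message is transmitted, realizing the data reduction described above.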
In addition to temporal accuracy, the server may also have logic to determine how to handle a location request having a certain geographic location accuracy.
In a preferred embodiment, in step 310b, the server receives a request to acquire location information relating to a requested device at a certain acceptable level of accuracy (accuracy y). In a preferred embodiment, the server typically only maintains, in storage 335b, location information relating to devices at one level of accuracy (accuracy x). After receiving the request, in step 315b, the server determines whether the accuracy of the location information it has in storage 335b is greater than or equal to the accuracy requested by the requesting device (i.e., accuracy x≥accuracy y). If so, the level of accuracy is deemed acceptable and, in step 330b, the server retrieves the stored location information and, in step 360b, sends the location information to the requesting device.
More typically, however, when the server receives a request for location information of a requested device, the requested accuracy (accuracy y) is greater than the accuracy of the information stored in 335b (accuracy x) (i.e., accuracy y>accuracy x). When this is determined in step 315b, the server sends a request to the requested device in step 320b. This request may be in several different forms. For example, the server may simply transmit the contents of the request to the requested device, containing the requested accuracy information, and leave it to the requested device (through its hardware, operating system, and applications) to determine how to respond to the request. Alternatively, the server may have sufficient information relating to the capabilities of the requested device (such as it having a GPS antenna of a certain accuracy) and the message sent is simply a command to determine its location using its GPS antenna and send this information to the server. The server, in step 340b, then receives the location information from the requested device. Again, this information may be in several different forms and may depend on the device information known by the server. For example, the response may include accuracy information provided by the requested device or may simply include the location and the means by which it was obtained. In the latter form, the server, preferably knowing the model features of the requested device, may then determine the accuracy provided by the requested device. Also, depending on the request sent by the server, the means information may not be provided in the response but may be inferred by the server to be the same as what was requested. Once the location information is received by the server, in step 350b, it updates its stored location information, 335b, and sends location information to the requesting device in step 360b.
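The accuracy comparison in step 315b can be sketched as below. Note that "greater accuracy" means a smaller error radius, so comparing accuracies reduces to comparing error bounds. This is an illustrative sketch with assumed names, not the disclosed implementation.

```python
def serve_or_query(stored_error_m, requested_error_m,
                   fetch_cached_fn, query_device_fn):
    """Step 315b as an error-bound comparison: if the stored fix is at
    least as accurate as requested (accuracy x >= accuracy y, i.e. its
    error radius is no larger), serve it (steps 330b/360b); otherwise
    query the requested device (steps 320b/340b)."""
    if stored_error_m <= requested_error_m:
        return fetch_cached_fn()
    return query_device_fn()
```

A city-level request against city-level cached data is served from storage; a request for GPS-grade accuracy against the same cache forces a query to the device.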
Generally, the location information that is handled is of a low accuracy, such as at a city level or within a few miles of accuracy. As mentioned above, such information may be obtained by the server indirectly by, for example, knowing the geographic location of the cell-phone tower or ISP with which the requested device is communicating. It is generally understood that mobile phones communicating with a cellular communications network periodically seek out cell sites having the strongest signal. In many cases, the strongest signals are measured at those cells that are the shortest distance away. Thus, in an area where there is a cell-phone tower every 4 miles, for example, the location of the mobile device may be extrapolated to be within 2 miles of the closest cell tower. A more accurate method of determining the location of a mobile device may be by determining the time difference of arrival (TDOA). The TDOA technique works by measuring the time of arrival of a mobile station radio signal at three or more separate cell sites. Such a method depends on the availability of certain equipment supplied by the cellular network, which may not be universally available, and is therefore only an alternative embodiment. In either case, the location/accuracy determination may be performed by the communications network rather than by the mobile device. Such low accuracy information may preferably be transmitted by the server to the requesting device initially to give the device user a quick read on where his or her friends are located. The actions associated with obtaining such low accuracy information are herein referred to as a “shallow locate.”
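The nearest-tower approximation above can be sketched as follows: place the device at its strongest-signal tower and bound the error by half the tower spacing. This is an illustrative sketch; the tower data, signal units, and function name are assumptions for the example only.

```python
def shallow_locate(towers, signal_strengths, tower_spacing_miles=4.0):
    """Approximate a device's position from cell-tower data.

    towers: {tower_id: (lat, lon)}
    signal_strengths: {tower_id: signal strength in dBm (higher = stronger)}
    Returns (lat, lon, error_miles), where the error bound is half the
    assumed inter-tower spacing, per the 4-mile example above.
    """
    # The strongest signal is taken as coming from the nearest tower.
    nearest = max(signal_strengths, key=signal_strengths.get)
    lat, lon = towers[nearest]
    return lat, lon, tower_spacing_miles / 2.0
```

With towers spaced every 4 miles, the device is reported at its strongest tower with a 2-mile error bound, matching the extrapolation described above.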
Such low accuracy (i.e., less accurate) location requests are only approximations but are preferably made initially, as they may result in the fastest response and require fewer resources from the requested device. On the other hand, a “deep locate request” may be made by a user of a requesting device to obtain location information of a relatively higher accuracy (i.e., more accurate) from the requested device. For example, a “deep locate request” may command the requested device to use its GPS location resources to obtain location information having a level of accuracy greater than that of some of the other location methods discussed above. While using a device feature such as GPS may be more accurate, obtaining signals from a sufficient number of GPS satellites and calculating the location oftentimes takes longer and requires more energy. Thus, the “deep locate request” option is preferably reserved for specific requests made by the user of the requesting device.
This concept of a “shallow locate request” and a “deep locate request” is further illustrated from the perspective of the requesting device, such as a mobile device, in exemplary method 400 of
As individuals are often on the go, it is of value to the requesting user to have the location information of friends updated from time to time. The updating or refreshing of location information, performed in step 450, may be done automatically at predetermined intervals, such as every 15 seconds or 15 minutes, and/or may be done at the request of the user. These predetermined timing intervals may be applied consistently to every user or may be applied individually to each user based on the individual user's observed movement frequency in combination with heuristics drawn from observed general user-movement data (e.g., determine a shorter time interval for a user observed to be traveling on a highway but a longer time interval for a user who has “checked-in” to a location such as a restaurant). As is shown in method 400, a refresh step 450 will operate to repeat a request for shallow location information of all of the user's friends.
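The adaptive interval heuristic described above may be sketched, purely for illustration, as follows. The speed threshold and the specific intervals are hypothetical; the text names only the 15-second and 15-minute extremes as examples.

```python
def refresh_interval_seconds(speed_mph, checked_in):
    """Choose a refresh interval for step 450 from observed movement:
    shorter for a fast-moving user, longer for one who has checked in."""
    if checked_in:
        return 15 * 60   # checked in at a fixed location (e.g., a restaurant)
    if speed_mph > 45:   # hypothetical threshold for highway travel
        return 15        # user appears to be on a highway; refresh often
    return 60            # default interval between shallow locates
```

A stationary, checked-in friend is polled every 15 minutes, while a friend moving at highway speed is polled every 15 seconds, conserving the requested device's battery when frequent updates add no information.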
In addition to requesting and obtaining shallow location information of all of the user's friends, the user may request and obtain more detailed or “deep” location information of one or more friends, beginning in step 460. To perform a “deep locate request,” in a preferred embodiment the user may select a friend that has been presented to the user after a shallow locate request. In this preferred embodiment, a deep locate request is sent to the server, which will send a command to the requested device to provide more detailed location information. This request may include commanding the device to obtain accurate location information from its GPS system. Upon receipt of the response in step 470, the requesting device may display the deep location of the friend to the user in step 480. The accuracy of the deep location may also be displayed to the requesting user.
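From the requesting device's perspective, steps 460-480 may be sketched, purely for illustration, as follows. The `send_request` and `display` callables are hypothetical placeholders for the device's server transport and user interface.

```python
def deep_locate(friend_id, send_request, display):
    """Steps 460-480: request, receive, and display a friend's deep location."""
    # Step 460: send a deep locate request for the selected friend to the server.
    response = send_request({"friend": friend_id, "type": "deep"})
    # Step 470: receive the detailed (e.g., GPS-derived) location information.
    location, accuracy_m = response["location"], response["accuracy_m"]
    # Step 480: display the deep location, optionally with its accuracy.
    display(friend_id, location, accuracy_m)
    return location, accuracy_m
```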
One way a user may gain authorization to obtain location information of a device associated with a friend is shown by method 500 in
Upon receiving a request from a user, the requested person (i.e., “friend”) is preferably presented with a message explaining the nature of the request, from which he or she may either accept or reject the request. When the friend accepts the request in step 530, an acceptance response is sent from that friend's device in step 540. Upon receiving the acceptance response, the server may update the information it maintains on either or both the requesting user and the accepting friend such that when the user sends a location request, the server will process that request in step 550. In addition, a notice may be sent by the server back to the requesting user to indicate to the user and/or the user's device that the authorization request has been accepted. Accordingly, the user may now obtain location information relating to that friend. In a preferred embodiment, the friend may revoke the authorization given to the user at any time; thus, the friend maintains control over the privacy of his or her location information.
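The server-side bookkeeping described above (record an acceptance in step 540, honor location requests in step 550, and allow revocation at any time) may be sketched, purely for illustration, as follows; the class and method names are hypothetical.

```python
class AuthorizationTable:
    """Tracks which (requester, friend) pairs have an accepted invitation."""

    def __init__(self):
        self._grants = set()  # set of (requester, friend) pairs

    def accept(self, requester, friend):
        # Step 540: the friend's acceptance response has been received.
        self._grants.add((requester, friend))

    def revoke(self, requester, friend):
        # The friend may revoke authorization at any time.
        self._grants.discard((requester, friend))

    def may_locate(self, requester, friend):
        # Checked before the server processes a location request (step 550).
        return (requester, friend) in self._grants
```

A rejected or ignored invitation simply never enters the table, so subsequent location requests for that friend are not processed, consistent with steps 560-570.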
On the other hand, when a friend has received a request to authorize the user to locate him or her but has rejected or ignored the request in step 560, the user may not be able to obtain location information relating to that friend. Thus, if the user subsequently attempts to locate that friend, in step 570, both the device and the server will not process that request. From the perspective of the requesting user and device, such a friend would be displayed as having a status of “awaiting a response” or “location not available,” or simply will not be listed. Of course, in some embodiments, the user may be able to send another request to the friend subsequently.
As shown in
On the other hand,
After a brief time has elapsed and the device has received location information relating to the user's friends, the location information may be presented to the user in display interface 900, as shown in
Referring again to
When a user wishes to send to a friend an invitation to share their location, “Add Friend” interface 1300, as shown in
When the friend selects to view the invitation, he or she is presented with a request message 1700, as shown in
Referring now to
With respect to assigning labels to certain locations, interface 1900 of
To further explain certain embodiments in this disclosure, the following use scenarios are presented to show how certain users of mobile devices may be able to use one or more embodiments in the disclosure to locate his or her friends.
One scenario may occur when a mobile device user is located somewhere, say downtown Palo Alto, at noon and wants to know if any of his friends are in the vicinity and are available for a quick lunch. The user may be able to use an embodiment in the present disclosure to see the location of his or her friends, identify one that is close by, and subsequently make contact.
A second scenario may arise when there is a need or desire by users of mobile devices to allow others to know where they are at certain times. One such situation is where a mobile device user may, for example, be training for a marathon and is outside running for miles each day. This user wishes to have her partner aware of her location during this period of time so that she can always be located in case something happens, and may therefore benefit from embodiments in this disclosure. Also, when this person is actually participating in the marathon, her friends may want to know what part of the course she has reached so that they may be present at certain locations during the race to cheer her on. In such a scenario, the user would benefit from embodiments of the disclosure having a map of the race course superimposed onto a street map of the area such that the users may be able to see the location of the runner and have some indication of where she will be heading next.
A third scenario may arise when users of mobile devices wish to receive an indication that someone has reached a certain location. In such a scenario, one user of a mobile device may, for example, be embarking on a road trip and another person wants to be notified when he or she has arrived. Such a scenario might include a parent who is allowing her teenage son to take the family car on a holiday weekend to drive to visit his cousins that live several hours away. Although the parent has asked that the son call as soon as he arrives, he is often forgetful and does not do so. To overcome this, the parent or son may take advantage of an embodiment of the present disclosure where they may set an alert to automatically notify the parent when the son has arrived at the destination. In the interim, the parent may additionally use other embodiments to manually locate the son's mobile device to make sure that he has not gotten lost.
A fourth scenario may arise when users of mobile devices wish to receive a notification when someone has entered a certain geographic location. For example, a person commutes to and from the city using public transportation but does not live within walking distance of the train or bus stop. Rather than driving and parking, the person may rely on a spouse or partner to pick her up in the evenings or whenever there is inclement weather. As certain buses and train cars have rules and courtesies prohibiting talking on cell phones, the commuter may have to wait to call her spouse or partner until after she arrives, and may subsequently have to wait, for example, in the rain. The users would benefit from embodiments of the disclosure that allow the commuter's mobile device to notify her partner's device whenever she enters a certain geographic region (i.e., is close to arriving at the bus or train stop) without requiring the commuter to place a call. Thus, the commuter and her partner may both arrive at the stop at close to the same time.
Similarly, a fifth scenario includes users having certain household appliances that may be connected to a network and can perform certain tasks upon receiving a notification when a person enters a certain area. For example, when a person is traveling to her vacation home out in the mountains, certain appliances in the vacation home such as, for example, the furnace and front porch light, may turn on when the person enters into a certain geographic area (i.e., gets close to the home). An embodiment of this disclosure would enable a user to have and benefit from such a configuration.
A sixth scenario may arise when someone wishes to receive a notification when a mobile device user has left a certain geographic location. For example, a parent has asked his daughter to stay at home for the weekend to finish a school assignment that is due the following Monday. If the daughter leaves the neighborhood with her mobile device, the parent may be notified. Aspects of the disclosed technology would enable a parent to receive such notifications.
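The enter and leave notifications in the fourth through sixth scenarios reduce to detecting a transition across a geofence boundary between successive location updates. A minimal sketch, with hypothetical names and a simple circular region on planar coordinates:

```python
import math

def crossed_geofence(prev_xy, curr_xy, center_xy, radius):
    """Return 'entered', 'left', or None for the transition between two
    successive locations relative to a circular geofence."""
    was_inside = math.dist(prev_xy, center_xy) <= radius
    is_inside = math.dist(curr_xy, center_xy) <= radius
    if not was_inside and is_inside:
        return "entered"  # e.g., commuter nearing the stop; traveler near home
    if was_inside and not is_inside:
        return "left"     # e.g., the daughter leaving the neighborhood
    return None           # no boundary crossing; no notification
```

Only a transition fires a notification, so a device that remains inside (or outside) the region between updates generates no traffic.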
A seventh scenario may arise when some mobile device users wish to be located for only a brief period of time. For example, a person is on a business trip in a city and wants to be able to meet up for dinner with an old friend who lives in that city. Since she is not normally in that city and does not often interact with this old friend, she does not want the old friend to be able to locate her all the time. One embodiment of the disclosure employs a “day pass” which the person may send to the old friend to allow the old friend to locate her for the next 24 hours. After that time, the day pass expires and the old friend may no longer be able to locate the person.
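The “day pass” above is, in essence, a time-limited authorization grant. A minimal sketch, with hypothetical field names and an injectable clock:

```python
import time

def issue_day_pass(holder, duration_s=24 * 3600, now=time.time):
    """Grant `holder` the ability to locate the issuer until expiry."""
    return {"holder": holder, "expires_at": now() + duration_s}

def pass_is_valid(day_pass, holder, now=time.time):
    """A location request is honored only for the named holder, before expiry."""
    return day_pass["holder"] == holder and now() < day_pass["expires_at"]
```

The same shape accommodates the eighth scenario's conference-wide sharing: the issuer simply grants a pass to each accepted invitee with an expiration set to the end of the event.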
In an eighth scenario, a user may select a number of persons in her contact list to all share location information with each other for a limited period of time. For example, a user is in town to attend a conference such as Apple's WWDC. The user knows that some people that she knows are also attending the conference and she would like to know their whereabouts during the event. One embodiment of the disclosure enables this user to send an invitation to the persons that she wants to locate at the conference. When the user's acquaintances accept her invitation, she and the acquaintances will be able to locate each other. Certain limits on this ability to locate each other may be set by a user, however, such as certain windows of time during the day (such as, only during the conference), or until an expiration time.
In the exemplary interface shown in
As described above, one aspect of the present technology is the gathering and use of data available from a user's mobile device. The present disclosure contemplates that in some instances, this gathered data may include personal information data that uniquely identifies or can be used to contact or locate a specific person. Such personal information data can include hardware information relating to the user device, location-based data, telephone numbers, email addresses, social media IDs such as TWITTER IDs, work and home addresses, friends, or any other identifying information. The user typically enters this data when establishing an account and/or during the use of the application.
The present disclosure recognizes that the use of such personal information data in the present technology can benefit users. In addition to being necessary to provide the core feature of the present technology (i.e., locating users), the personal information data can also be used to better understand user behavior and to facilitate and measure the effectiveness of applications. Further, other uses for personal information data that benefit the user are also contemplated by the present disclosure.
The present disclosure further contemplates that the entities responsible for the collection, analysis, disclosure, transfer, storage, or other use of such personal information data should implement and consistently use privacy policies and practices that are generally recognized as meeting or exceeding industry or governmental requirements for keeping personal information data private and secure. For example, personal information from users should be collected for legitimate and reasonable uses of the entity and not shared or sold outside of those legitimate uses. Further, such collection should occur only after the informed consent of the users. Additionally, such entities should take any needed steps to safeguard and secure access to such personal information data and to ensure that others with access to the personal information data adhere to their privacy and security policies and procedures. Further, such entities can subject themselves to evaluation by third parties to certify their adherence to widely accepted privacy policies and practices.
Despite the foregoing, the present disclosure also contemplates embodiments in which users selectively block the use of, or access to, personal information data. That is, the present disclosure contemplates that hardware and/or software elements can be provided to prevent or block access to such personal information data. For example, in the case of location aware services, the present technology can be configured to allow users to select to “opt in” or “opt out” of participation in the sending of personal information data. The present disclosure also contemplates that other methods or technologies may exist for blocking access to users' personal information data.
Embodiments within the scope of the present disclosure may also include tangible and/or non-transitory computer-readable storage media for carrying or having computer-executable instructions or data structures stored thereon. Such non-transitory computer-readable storage media can be any available media that can be accessed by a general purpose or special purpose computer, including the functional design of any special purpose processor as discussed above. By way of example, and not limitation, such non-transitory computer-readable media can include RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to carry or store desired program code means in the form of computer-executable instructions, data structures, or processor chip design. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or combination thereof) to a computer, the computer properly views the connection as a computer-readable medium. Thus, any such connection is properly termed a computer-readable medium. Combinations of the above should also be included within the scope of the computer-readable media.
Computer-executable instructions include, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions. Computer-executable instructions also include program modules that are executed by computers in stand-alone or network environments. Generally, program modules include routines, programs, components, data structures, objects, and the functions inherent in the design of special-purpose processors, etc. that perform particular tasks or implement particular abstract data types. Computer-executable instructions, associated data structures, and program modules represent examples of the program code means for executing steps of the methods disclosed herein. The particular sequence of such executable instructions or associated data structures represents examples of corresponding acts for implementing the functions described in such steps.
Those of skill in the art will appreciate that other embodiments of the disclosure may be practiced in network computing environments with many types of computer system configurations, including personal computers, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, and the like. Embodiments may also be practiced in distributed computing environments where tasks are performed by local and remote processing devices that are linked (either by hardwired links, wireless links, or by a combination thereof) through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.
The various embodiments described above are provided by way of illustration only and should not be construed to limit the scope of the disclosure. Those skilled in the art will readily recognize various modifications and changes that may be made to the principles described herein without following the example embodiments and applications illustrated and described herein, and without departing from the spirit and scope of the disclosure.
This application is a continuation of and claims priority to U.S. patent application Ser. No. 16/146,774, filed Sep. 28, 2018, which is a continuation of U.S. patent application Ser. No. 15/219,239, filed Jul. 25, 2016, which is a continuation of U.S. patent application Ser. No. 14/636,106, filed Mar. 2, 2015, now issued as U.S. Pat. No. 9,402,153 on Jul. 26, 2016, which is a continuation of U.S. patent application Ser. No. 13/113,856, filed May 23, 2011, now issued as U.S. Pat. No. 8,971,924 on Mar. 3, 2015, the entire contents of each of which are incorporated herein by reference.
Number | Name | Date | Kind |
---|---|---|---|
4856066 | Lemelson | Aug 1989 | A |
5475653 | Yamada et al. | Dec 1995 | A |
5801700 | Ferguson | Sep 1998 | A |
6002402 | Schacher | Dec 1999 | A |
6040781 | Murray | Mar 2000 | A |
6191807 | Hamada et al. | Feb 2001 | B1 |
6323846 | Westerman et al. | Nov 2001 | B1 |
6362842 | Tahara et al. | Mar 2002 | B1 |
6515585 | Yamamoto | Feb 2003 | B2 |
6570557 | Westerman et al. | May 2003 | B1 |
6677932 | Westerman et al. | Jan 2004 | B1 |
6809724 | Shiraishi et al. | Oct 2004 | B1 |
7015817 | Copley et al. | Mar 2006 | B2 |
7039420 | Koskinen et al. | May 2006 | B2 |
7076257 | Kall | Jul 2006 | B2 |
7224987 | Bhela | May 2007 | B1 |
7365736 | Marvit et al. | Apr 2008 | B2 |
7593749 | Vallstrom et al. | Sep 2009 | B2 |
7614008 | Ording et al. | Nov 2009 | B2 |
7633076 | Huppi et al. | Dec 2009 | B2 |
7653883 | Hotelling et al. | Jan 2010 | B2 |
7657849 | Chaudhri et al. | Feb 2010 | B2 |
7663607 | Hotelling et al. | Feb 2010 | B2 |
7738883 | Hull | Jun 2010 | B2 |
7789225 | Whiteis | Sep 2010 | B2 |
7801542 | Stewart | Sep 2010 | B1 |
7834861 | Lee | Nov 2010 | B2 |
7844914 | Andre et al. | Nov 2010 | B2 |
7908219 | Abanami et al. | Mar 2011 | B2 |
7953393 | Chin et al. | May 2011 | B2 |
7957762 | Herz et al. | Jun 2011 | B2 |
8006002 | Kalayjian et al. | Aug 2011 | B2 |
8102253 | Brady, Jr. | Jan 2012 | B1 |
8121586 | Araradian et al. | Feb 2012 | B2 |
8150930 | Satterfield et al. | Apr 2012 | B2 |
8239784 | Hotelling et al. | Aug 2012 | B2 |
8244468 | Scalisi et al. | Aug 2012 | B2 |
8255830 | Ording et al. | Aug 2012 | B2 |
8279180 | Hotelling et al. | Oct 2012 | B2 |
8285258 | Schultz et al. | Oct 2012 | B2 |
8369867 | Van Os et al. | Feb 2013 | B2 |
8374575 | Mullen | Feb 2013 | B2 |
8381135 | Hotelling et al. | Feb 2013 | B2 |
8412154 | Leemet et al. | Apr 2013 | B1 |
8427303 | Brady, Jr. et al. | Apr 2013 | B1 |
8441367 | Lee et al. | May 2013 | B1 |
8479122 | Hotelling et al. | Jul 2013 | B2 |
8509803 | Gracieux | Aug 2013 | B2 |
8572493 | Qureshi | Oct 2013 | B2 |
8786458 | Wiltzius et al. | Jul 2014 | B1 |
8855665 | Buford et al. | Oct 2014 | B2 |
8922485 | Lloyd | Dec 2014 | B1 |
8971924 | Pai et al. | Mar 2015 | B2 |
8989773 | Sandel et al. | Mar 2015 | B2 |
9042919 | Trussel | May 2015 | B2 |
9204283 | Mullen | Dec 2015 | B2 |
9247377 | Pai et al. | Jan 2016 | B2 |
9294882 | Sandel et al. | Mar 2016 | B2 |
9369833 | Tharshanan et al. | Jun 2016 | B2 |
9402153 | Pai et al. | Jul 2016 | B2 |
9635540 | Mullen | Apr 2017 | B2 |
9699617 | Sandel et al. | Jul 2017 | B2 |
10382895 | Pai et al. | Aug 2019 | B2 |
20020015024 | Westerman et al. | Feb 2002 | A1 |
20020037715 | Mauney et al. | Mar 2002 | A1 |
20020102989 | Calvert et al. | Aug 2002 | A1 |
20020115478 | Fujisawa et al. | Aug 2002 | A1 |
20020126135 | Ball et al. | Sep 2002 | A1 |
20030081506 | Karhu | May 2003 | A1 |
20030128163 | Mizugaki et al. | Jul 2003 | A1 |
20040041841 | LeMogne et al. | Mar 2004 | A1 |
20040070511 | Kim | Apr 2004 | A1 |
20040180669 | Kall | Aug 2004 | A1 |
20040203854 | Nowak | Oct 2004 | A1 |
20050032532 | Kokkonen et al. | Feb 2005 | A1 |
20050138552 | Venolia | Jun 2005 | A1 |
20050148340 | Guyot | Jul 2005 | A1 |
20050190059 | Wehrenberg | Sep 2005 | A1 |
20050191159 | Benko | Sep 2005 | A1 |
20050222756 | Davis et al. | Oct 2005 | A1 |
20050268237 | Crane et al. | Dec 2005 | A1 |
20050288036 | Brewer et al. | Dec 2005 | A1 |
20060017692 | Wehrenberg et al. | Jan 2006 | A1 |
20060019649 | Feinleib | Jan 2006 | A1 |
20060026245 | Cunningham et al. | Feb 2006 | A1 |
20060026536 | Hotelling et al. | Feb 2006 | A1 |
20060030333 | Ward | Feb 2006 | A1 |
20060033724 | Chaudhri et al. | Feb 2006 | A1 |
20060044283 | Eri et al. | Mar 2006 | A1 |
20060058948 | Blass et al. | Mar 2006 | A1 |
20060063538 | Ishii | Mar 2006 | A1 |
20060092177 | Blasko | May 2006 | A1 |
20060195787 | Topiwala et al. | Aug 2006 | A1 |
20060197753 | Hotelling et al. | Sep 2006 | A1 |
20060223518 | Haney | Oct 2006 | A1 |
20070036300 | Brown et al. | Feb 2007 | A1 |
20070085157 | Fadell et al. | Apr 2007 | A1 |
20070117549 | Amos | May 2007 | A1 |
20070129888 | Rosenberg | Jun 2007 | A1 |
20070150834 | Muller et al. | Jun 2007 | A1 |
20070150836 | Deggelmann et al. | Jun 2007 | A1 |
20070216659 | Amineh | Sep 2007 | A1 |
20070236475 | Wherry | Oct 2007 | A1 |
20080004043 | Wilson et al. | Jan 2008 | A1 |
20080014989 | Sandegard et al. | Jan 2008 | A1 |
20080045232 | Cone et al. | Feb 2008 | A1 |
20080052945 | Matas et al. | Mar 2008 | A1 |
20080055264 | Anzures et al. | Mar 2008 | A1 |
20080057926 | Forstall et al. | Mar 2008 | A1 |
20080070593 | Altman | Mar 2008 | A1 |
20080079589 | Blackadar | Apr 2008 | A1 |
20080114539 | Lim | May 2008 | A1 |
20080139219 | Boeiro et al. | Jun 2008 | A1 |
20080153517 | Lee | Jun 2008 | A1 |
20080165136 | Christie et al. | Jul 2008 | A1 |
20080176583 | Brachet et al. | Jul 2008 | A1 |
20080186165 | Bertagna et al. | Aug 2008 | A1 |
20080216022 | Lorch et al. | Sep 2008 | A1 |
20080287151 | Fjelstad | Nov 2008 | A1 |
20080320391 | Lemay et al. | Dec 2008 | A1 |
20090005011 | Christie et al. | Jan 2009 | A1 |
20090005018 | Forstall et al. | Jan 2009 | A1 |
20090006566 | Veeramachaneni et al. | Jan 2009 | A1 |
20090011340 | Lee et al. | Jan 2009 | A1 |
20090037536 | Braam | Feb 2009 | A1 |
20090049502 | Levien et al. | Feb 2009 | A1 |
20090051648 | Shamaie et al. | Feb 2009 | A1 |
20090051649 | Rondel | Feb 2009 | A1 |
20090055494 | Fukumoto | Feb 2009 | A1 |
20090066564 | Burroughs et al. | Mar 2009 | A1 |
20090085806 | Piersol et al. | Apr 2009 | A1 |
20090098903 | Donaldson et al. | Apr 2009 | A1 |
20090113340 | Bender | Apr 2009 | A1 |
20090164219 | Yeung et al. | Jun 2009 | A1 |
20090177981 | Christie et al. | Jul 2009 | A1 |
20090181726 | Vargas et al. | Jul 2009 | A1 |
20090187842 | Collins et al. | Jul 2009 | A1 |
20090254840 | Churchill et al. | Oct 2009 | A1 |
20090298444 | Shigeta | Dec 2009 | A1 |
20090303066 | Lee et al. | Dec 2009 | A1 |
20090312032 | Bornstein et al. | Dec 2009 | A1 |
20090313582 | Rupsingh et al. | Dec 2009 | A1 |
20090319616 | Lewis et al. | Dec 2009 | A1 |
20090322560 | Tengler et al. | Dec 2009 | A1 |
20090325603 | Van Os et al. | Dec 2009 | A1 |
20100004005 | Pereira et al. | Jan 2010 | A1 |
20100017126 | Holeman | Jan 2010 | A1 |
20100058231 | Duarte et al. | Mar 2010 | A1 |
20100069035 | Johnson | Mar 2010 | A1 |
20100124906 | Hautala | May 2010 | A1 |
20100125411 | Goel | May 2010 | A1 |
20100125785 | Moore et al. | May 2010 | A1 |
20100144368 | Sullivan | Jun 2010 | A1 |
20100203901 | Dinoff | Aug 2010 | A1 |
20100205242 | Marchioro, II et al. | Aug 2010 | A1 |
20100211425 | Govindarajan et al. | Aug 2010 | A1 |
20100240398 | Hotes et al. | Sep 2010 | A1 |
20100248744 | Bychkov et al. | Sep 2010 | A1 |
20100250131 | Relyea et al. | Sep 2010 | A1 |
20100250727 | King et al. | Sep 2010 | A1 |
20100274569 | Reudink | Oct 2010 | A1 |
20100281409 | Rainisto et al. | Nov 2010 | A1 |
20100287178 | Lambert et al. | Nov 2010 | A1 |
20100295676 | Khachaturov et al. | Nov 2010 | A1 |
20100299060 | Snavely et al. | Nov 2010 | A1 |
20100325194 | Williamson et al. | Dec 2010 | A1 |
20100330952 | Yeoman et al. | Dec 2010 | A1 |
20100332518 | Song et al. | Dec 2010 | A1 |
20110003587 | Belz et al. | Jan 2011 | A1 |
20110051658 | Jin et al. | Mar 2011 | A1 |
20110054780 | Dhanani et al. | Mar 2011 | A1 |
20110054979 | Cova et al. | Mar 2011 | A1 |
20110059769 | Brunolli | Mar 2011 | A1 |
20110066743 | Hurley | Mar 2011 | A1 |
20110080356 | Kang et al. | Apr 2011 | A1 |
20110096011 | Suzuki | Apr 2011 | A1 |
20110118975 | Chen | May 2011 | A1 |
20110137813 | Stewart | Jun 2011 | A1 |
20110137954 | Diaz | Jun 2011 | A1 |
20110138006 | Stewart | Jun 2011 | A1 |
20110148626 | Acevedo | Jun 2011 | A1 |
20110151418 | Delespaul et al. | Jun 2011 | A1 |
20110157046 | Lee et al. | Jun 2011 | A1 |
20110164058 | Lemay | Jul 2011 | A1 |
20110167383 | Schuller et al. | Jul 2011 | A1 |
20110183650 | McKee | Jul 2011 | A1 |
20110225547 | Fong et al. | Sep 2011 | A1 |
20110239158 | Barraclough et al. | Sep 2011 | A1 |
20110250909 | Mathias | Oct 2011 | A1 |
20110254684 | Antoci | Oct 2011 | A1 |
20110265041 | Ganetakos et al. | Oct 2011 | A1 |
20110276901 | Lambetti et al. | Nov 2011 | A1 |
20110279323 | Hung et al. | Nov 2011 | A1 |
20110306366 | Trussel et al. | Dec 2011 | A1 |
20110306393 | Goldman et al. | Dec 2011 | A1 |
20110307124 | Morgan et al. | Dec 2011 | A1 |
20110316769 | Boettcher et al. | Dec 2011 | A1 |
20120008526 | Borghei | Jan 2012 | A1 |
20120022872 | Gruber et al. | Jan 2012 | A1 |
20120040681 | Yan et al. | Feb 2012 | A1 |
20120054028 | Tengler et al. | Mar 2012 | A1 |
20120077463 | Robbins et al. | Mar 2012 | A1 |
20120088521 | Nishida et al. | Apr 2012 | A1 |
20120095918 | Jurss | Apr 2012 | A1 |
20120102437 | Worley et al. | Apr 2012 | A1 |
20120105358 | Momeyer | May 2012 | A1 |
20120108215 | Kameli et al. | May 2012 | A1 |
20120117507 | Tseng et al. | May 2012 | A1 |
20120131458 | Hayes | May 2012 | A1 |
20120136997 | Yan et al. | May 2012 | A1 |
20120144452 | Dyor | Jun 2012 | A1 |
20120149405 | Bhat | Jun 2012 | A1 |
20120150970 | Peterson et al. | Jun 2012 | A1 |
20120158511 | Lucero et al. | Jun 2012 | A1 |
20120166531 | Sylvain | Jun 2012 | A1 |
20120172088 | Kirch et al. | Jul 2012 | A1 |
20120208592 | Davis et al. | Aug 2012 | A1 |
20120216127 | Meyr | Aug 2012 | A1 |
20120218177 | Pang et al. | Aug 2012 | A1 |
20120222083 | Vaha-Sipila et al. | Aug 2012 | A1 |
20120239949 | Kalyanasundaram et al. | Sep 2012 | A1 |
20120258726 | Bansal et al. | Oct 2012 | A1 |
20120265823 | Parmar et al. | Oct 2012 | A1 |
20120276919 | Bi et al. | Nov 2012 | A1 |
20120290648 | Sharkey | Nov 2012 | A1 |
20120302256 | Pai et al. | Nov 2012 | A1 |
20120302258 | Pai et al. | Nov 2012 | A1 |
20120304084 | Kim et al. | Nov 2012 | A1 |
20120306770 | Moore et al. | Dec 2012 | A1 |
20130002580 | Sudou | Jan 2013 | A1 |
20130007665 | Chaudhri et al. | Jan 2013 | A1 |
20130014358 | Williams et al. | Jan 2013 | A1 |
20130045759 | Smith et al. | Feb 2013 | A1 |
20130063364 | Moore | Mar 2013 | A1 |
20130065566 | Gisby et al. | Mar 2013 | A1 |
20130091298 | Ozzie et al. | Apr 2013 | A1 |
20130093833 | Al-Asaaed et al. | Apr 2013 | A1 |
20130120106 | Cauwels et al. | May 2013 | A1 |
20130143586 | Williams et al. | Jun 2013 | A1 |
20130159941 | Langlois et al. | Jun 2013 | A1 |
20130226453 | Trussel et al. | Aug 2013 | A1 |
20130303190 | Khan et al. | Nov 2013 | A1 |
20130305331 | Kim | Nov 2013 | A1 |
20130307809 | Sudou | Nov 2013 | A1 |
20130310089 | Gianoukos et al. | Nov 2013 | A1 |
20140062790 | Letz et al. | Mar 2014 | A1 |
20140099973 | Cecchini et al. | Apr 2014 | A1 |
20140122396 | Swaminathan et al. | May 2014 | A1 |
20140179344 | Bansal et al. | Jun 2014 | A1 |
20140222933 | Stovicek et al. | Aug 2014 | A1 |
20140237126 | Bridge | Aug 2014 | A1 |
20140310366 | Fu et al. | Oct 2014 | A1 |
20150172393 | Oplinger et al. | Jun 2015 | A1 |
20150180746 | Day, II et al. | Jun 2015 | A1 |
20150346912 | Yang et al. | Dec 2015 | A1 |
20150350130 | Yang et al. | Dec 2015 | A1 |
20150350140 | Garcia et al. | Dec 2015 | A1 |
20150350141 | Yang et al. | Dec 2015 | A1 |
20160036735 | Pycock et al. | Feb 2016 | A1 |
20160073223 | Woolsey et al. | Mar 2016 | A1 |
20160234060 | Pai et al. | Aug 2016 | A1 |
20170026796 | Pai et al. | Jan 2017 | A1 |
20180091951 | Sandel et al. | Mar 2018 | A1 |
Number | Date | Country |
---|---|---|
1475924 | Feb 2004 | CN |
1852335 | Oct 2006 | CN |
101390371 | Mar 2009 | CN |
102098656 | Jun 2011 | CN |
102111505 | Jun 2011 | CN |
201928419 | Aug 2011 | CN |
103583031 | Feb 2014 | CN |
103959751 | Jul 2014 | CN |
1387590 | Feb 2004 | EP |
2574026 | Mar 2013 | EP |
2610701 | Jul 2013 | EP |
2610701 | Apr 2014 | EP |
H1145117 | Feb 1999 | JP |
2002366485 | Dec 2002 | JP |
2003516057 | May 2003 | JP |
2003207556 | Jul 2003 | JP |
2006072489 | Mar 2006 | JP |
2006079427 | Mar 2006 | JP |
2006113637 | Apr 2006 | JP |
2006129429 | May 2006 | JP |
2009081865 | Apr 2009 | JP |
2010503126 | Jan 2010 | JP |
2010503332 | Jan 2010 | JP |
2010288162 | Dec 2010 | JP |
2010539804 | Dec 2010 | JP |
2011107823 | Jun 2011 | JP |
2012508530 | Apr 2012 | JP |
2012198369 | Oct 2012 | JP |
2013048389 | Mar 2013 | JP |
1020040089329 | Oct 2004 | KR |
1020070096222 | Oct 2007 | KR |
1020080074813 | Aug 2008 | KR |
200532429 | Oct 2005 | TW |
2008030972 | Mar 2008 | WO |
2009071112 | Jun 2009 | WO |
2010054373 | May 2010 | WO |
2012128824 | Sep 2012 | WO |
2012170446 | Dec 2012 | WO |
2013093558 | Jun 2013 | WO |
Entry |
---|
‘seechina365.com’ [online]. “How to use China's popular social networking service wechat2_voice message, press together, shake function etc.” Apr. 5, 2014, [retrieved on Apr. 23, 2019], via Internet Archive: Wayback Machine URL<http://seechina365.com/2014/04/05/wechat02>. 29 pages with Machine English Translation. |
‘slideshare.net.’ [online]. “Samsung Gear 2 User manual”, Apr. 2014, [retrieved on Apr. 23, 2019], retrieved from: URL<http://www.slideshare.net/badaindonesia/samsung-gear-2-user-manual>. 58 pages. |
U.S. Corrected Notice of Allowance received for U.S. Appl. No. 14/841,614, dated Jan. 8, 2019, 3 pages. |
U.S. Final Office Action received for U.S. Appl. No. 14/841,614, dated May 10, 2018, 12 pages. |
U.S. Final Office Action received for U.S. Appl. No. 14/841,623, dated Sep. 5, 2017, 15 pages. |
U.S. Final Office Action received for U.S. Appl. No. 14/928,865, dated Dec. 5, 2018, 14 pages. |
U.S. Final Office Action received for U.S. Appl. No. 14/817,572, dated Mar. 23, 2017, 13 pages. |
U.S. Final Office Action received for U.S. Appl. No. 14/838,235, dated Jun. 15, 2016, 17 pages. |
U.S. Non Final Office Action received for U.S. Appl. No. 14/503,386, dated Jan. 7, 2015, 18 pages. |
U.S. Non Final Office Action received for U.S. Appl. No. 14/817,572, dated Sep. 12, 2016, 8 pages. |
U.S. Non-Final Office Action received for U.S. Appl. No. 14/841,608, dated Apr. 12, 2017, 8 pages. |
U.S. Non-Final Office Action received for U.S. Appl. No. 14/841,614, dated Jul. 27, 2017, 12 pages. |
U.S. Non-Final Office Action received for U.S. Appl. No. 14/841,623, dated Feb. 2, 2017, 16 pages. |
U.S. Non-Final Office Action received for U.S. Appl. No. 14/928,865, dated Mar. 27, 2018, 14 pages. |
U.S. Non-Final Office Action received for U.S. Appl. No. 15/142,661, dated Jan. 25, 2017, 28 pages. |
U.S. Non-Final Office Action received for U.S. Appl. No. 15/425,273, dated Oct. 3, 2018, 9 pages. |
U.S. Non-Final Office Action received for U.S. Appl. No. 15/431,435, dated Jun. 8, 2017, 10 pages. |
U.S. Non-Final Office Action received for U.S. Appl. No. 15/985,570, dated Aug. 16, 2018, 23 pages. |
U.S. Non-Final Office Action received for U.S. Appl. No. 14/503,376, dated Dec. 22, 2014, 19 pages. |
U.S. Non-Final Office Action received for U.S. Appl. No. 15/366,763, dated Mar. 8, 2019, 13 pages. |
U.S. Non-Final Office Action received for U.S. Appl. No. 14/838,235, dated Jan. 5, 2016, 18 pages. |
U.S. Notice of Allowance received for U.S. Appl. No. 14/841,608, dated Nov. 14, 2017, 5 pages. |
U.S. Notice of Allowance received for U.S. Appl. No. 14/841,614, dated Oct. 24, 2018, 10 pages. |
U.S. Notice of Allowance received for U.S. Appl. No. 14/841,623, dated Feb. 23, 2018, 8 pages. |
U.S. Notice of Allowance received for U.S. Appl. No. 15/142,661, dated Feb. 15, 2018, 9 pages. |
U.S. Notice of Allowance received for U.S. Appl. No. 15/142,661, dated Oct. 4, 2017, 21 pages. |
U.S. Notice of Allowance received for U.S. Appl. No. 15/425,273, dated Mar. 7, 2019, 8 pages. |
U.S. Notice of Allowance received for U.S. Appl. No. 15/431,435, dated Jan. 23, 2018, 8 pages. |
U.S. Notice of Allowance received for U.S. Appl. No. 15/985,570, dated Mar. 13, 2019, 21 pages. |
U.S. Notice of Allowance received for U.S. Appl. No. 14/503,376, dated Jul. 29, 2015, 12 pages. |
U.S. Notice of Allowance received for U.S. Appl. No. 14/503,376, dated Sep. 2, 2015, 4 pages. |
U.S. Notice of Allowance received for U.S. Appl. No. 14/503,376, dated Sep. 24, 2015, 5 pages. |
U.S. Notice of Allowance received for U.S. Appl. No. 14/503,386, dated Jul. 30, 2015, 11 pages. |
U.S. Notice of Allowance received for U.S. Appl. No. 14/503,386, dated Sep. 24, 2015, 5 pages. |
U.S. Notice of Allowance received for U.S. Appl. No. 14/817,572, dated Nov. 30, 2017, 26 pages. |
U.S. Notice of Allowance received for U.S. Appl. No. 14/838,235, dated Dec. 29, 2016, 3 pages. |
U.S. Notice of Allowance received for U.S. Appl. No. 14/838,235, dated Oct. 4, 2016, 7 pages. |
U.S. Notice of Allowance received for U.S. Appl. No. 15/876,673, dated May 4, 2018, 26 pages. |
U.S. Supplemental Notice of Allowance received for U.S. Appl. No. 14/841,608, dated Jan. 25, 2018, 2 pages. |
‘wechat.wikia.com’ [online]. “WeChat Wiki”, May 14, 2013, [retrieved on Apr. 23, 2019], via Internet Archive: Wayback Machine URL<http://web.archive.org/web/20130514131044/http://wechat.wikia.com/wiki/WeChat_Wiki>. 6 pages. |
‘wikihow.com’ [online]. “How to Move Mail to Different Folders in Gmail,” available on or before Jul. 31, 2014, [retrieved on Apr. 23, 2019], via Internet Archive: Wayback Machine URL<https://web.archive.org/web/20140731230338/http://www.wikihow.com/Move-Mail-to-Different-Folders-in-Gmail>. 4 pages. |
‘youtube.com’ [online]. “How to Dismiss Banner Notifications or Toast Notifications on iOS7,” Dec. 17, 2013, [retrieved on Apr. 23, 2019], retrieved from: URL<https://www.youtube.com/watch?v=vSiHnBFIW M>. 2 pages. |
‘youtube.com’ [online]. “How to Send a Picture Message/MMS—Samsung Galaxy Note 3,” Nov. 3, 2013, [retrieved on Apr. 23, 2019], retrieved from: URL<https://www.youtube.com/watch?v=-3dOz8-KeDw>. 2 pages. |
‘youtube.com’ [online]. “iOS 7 Notification Center Complete Walkthrough,” Jun. 10, 2013, [retrieved on Apr. 23, 2019], retrieved from: URL<https://www.youtube.com/watch?v=gATXt-o42LA>. 3 pages. |
‘youtube.com’ [online]. “iOS Notification Banner Pull Down to Notification Center in iOS 7 Beta 5”, Aug. 6, 2013, [retrieved on Apr. 23, 2019], retrieved from: URL<https://www.youtube.com/watch?v=nP0s6ETPxDg>. 2 pages. |
‘youtube.com’ [online]. “Notification & Control Center Problem Issue Solution” Dec. 6, 2013, [retrieved on Apr. 23, 2019], retrieved from: URL<https://www.youtube.com/watch?v=KOzCue YlaTA>. 3 pages. |
‘youtube.com’ [online]. “WeChat TVC—Hold to Talk”, May 11, 2013, [retrieved on Apr. 23, 2019], retrieved from: URL<https://www.youtube.com/watch?v=E UxteOWVSo>. 2 pages. |
Australian Patent Examination Report No. 1 in Australian Application No. 2012202929, dated Sep. 28, 2013, 3 pages. |
Chinese Office Action in Chinese Application No. 201210288784.3, dated Jul. 3, 2014, 16 pages (with English Translation). |
European Search Report in European Application No. 12168980.6, dated Sep. 21, 2012, 7 pages. |
Chinese Notice of Allowance received for Chinese Patent Application No. 201520365358.4, dated Nov. 20, 2015, 2 pages with English Translation. |
Chinese Notice of Allowance received for Chinese Patent Application No. 201520365843.1, dated Feb. 15, 2016, 3 pages with English Translation. |
Chinese Notice of Allowance received for Chinese Patent Application No. 201520669842.6, dated May 18, 2016, 2 pages with English Translation. |
Chinese Office Action received for Chinese Patent Application No. 201520365358.4, dated Aug. 11, 2015, 4 pages with English Translation. |
Chinese Office Action received for Chinese Patent Application No. 201520365843.1, dated Aug. 25, 2015, 4 pages with English Translation. |
Chinese Office Action received for Chinese Patent Application No. 201520365843.1, dated Nov. 16, 2015, 3 pages with English Translation. |
Chinese Office Action received for Chinese Patent Application No. 201520669842.6, dated Dec. 4, 2015, 7 pages with English Translation. |
Chinese Office Action received for Chinese Patent Application No. 201620393549.6, dated Aug. 18, 2016, 2 pages with English Translation. |
Chinese Office Action received for Chinese Patent Application No. 201620393748.7, dated Aug. 18, 2016, 2 pages with English Translation. |
Invitation to Pay Additional Fees and Partial Search Report received for PCT Patent Application No. PCT/US2015/043487, dated Nov. 9, 2015, 4 pages. |
Invitation to Pay Additional Fees received for PCT Patent Application No. PCT/US2015/044083, dated Nov. 4, 2015, 11 pages. |
Invitation to Pay Additional Fees received for PCT Patent Application No. PCT/US2015/046787, dated Dec. 15, 2015, 8 pages. |
Invitation to Pay Additional Fees received for PCT Patent Application No. PCT/US2016/046828, dated Sep. 23, 2016, 2 pages. |
Taiwanese Office Action received for Taiwanese Patent Application No. 104107332, dated Oct. 29, 2018, 12 pages with English Translation. |
Taiwanese Office Action received for Taiwanese Patent Application No. 104128519, dated Mar. 29, 2017, 16 pages with English Translation. |
Taiwanese Office Action received for Taiwanese Patent Application No. 104128704, dated Jul. 31, 2017, 7 pages with English Translation. |
Taiwanese Office Action received for Taiwanese Patent Application No. 104128704, dated Nov. 2, 2016, 12 pages with English Translation. |
‘absoluteblogger.com’ [online]. “WeChat Review—Communication Application with Screenshots” available on or before Jun. 14, 2013, [retrieved on Apr. 23, 2019], via Internet Archive: Wayback Machine URL<http://www.absoluteblogger.com/2012/10/wechat-review-communication-application.html>. 4 pages. |
‘appps.jp’ [online]. “‘WhatsApp’ users over 400 million people! I tried to investigate the most used messaging application in the world,” Jan. 24, 2014, [retrieved on Apr. 23, 2019], via Internet Archive: Wayback Machine URL<https://web.archive.org/web/20140410142411/http://www.appps.jp/2128786>. 13 pages, with Machine English Translation. |
Australian Certificate of Examination in Australian Patent Application No. 2017100760 dated Feb. 9, 2018, 2 pages. |
Australian Notice of Acceptance in Australian Patent Application No. 2015267259, dated Jan. 30, 2018, 3 pages. |
Australian Notice of Acceptance in Australian Patent Application No. 2015267260, dated Jan. 30, 2018, 3 pages. |
Australian Notice of Acceptance in Australian Patent Application No. 2015312369, dated Mar. 21, 2018, 3 pages. |
Australian Office Action in Australian Patent Application No. 2015100711, dated Jul. 27, 2015, 7 pages. |
Australian Office Action in Australian Patent Application No. 2015100711, dated Nov. 19, 2015, 6 pages. |
Australian Office Action in Australian Patent Application No. 2015101188, dated Apr. 14, 2016, 3 pages. |
Australian Office Action in Australian Patent Application No. 2015267259, dated Jun. 2, 2017, 2 pages. |
Australian Office Action in Australian Patent Application No. 2015267260, dated Jun. 2, 2017, 2 pages. |
Australian Office Action in Australian Patent Application No. 2015312369, dated Mar. 29, 2017, 3 pages. |
Australian Office Action in Australian Patent Application No. 2016102028, dated Feb. 13, 2017, 4 pages. |
Australian Office Action in Australian Patent Application No. 2016102029, dated Feb. 22, 2017, 4 pages. |
Australian Office Action in Australian Patent Application No. 2017100197, dated Apr. 28, 2017, 4 pages. |
Australian Office Action in Australian Patent Application No. 2017100198, dated Apr. 20, 2017, 4 pages. |
Australian Office Action in Australian Patent Application No. 2017100760, dated Aug. 10, 2017, 4 pages. |
Australian Office Action in Australian Patent Application No. 2017100760, dated Jan. 30, 2018, 3 pages. |
Australian Office Action in Australian Patent Application No. 2018204430, dated Aug. 15, 2018, 5 pages. |
Chinese Notice of Allowance received for Chinese Patent Application No. 201510290133.1, dated Jan. 9, 2019, 3 pages with English Translation. |
Chinese Notice of Allowance received for Chinese Patent Application No. 201510291012.9, dated Jan. 9, 2019, 3 pages with English Translation. |
Chinese Office Action received for Chinese Patent Application No. 201510290133.1, dated Feb. 9, 2018, 10 pages with English Translation. |
Chinese Office Action received for Chinese Patent Application No. 201510291012.9, dated Feb. 8, 2018, 9 pages with English Translation. |
Chinese Office Action received for Chinese Patent Application No. 201510549056.7, dated Aug. 7, 2018, 7 pages with English Translation. |
Chinese Office Action received for Chinese Patent Application No. 201510549056.7, dated Nov. 24, 2017, 22 pages with English Translation. |
Danish Decision to Grant received for Danish Patent Application No. PA201770126, dated Mar. 27, 2018, 2 pages. |
Danish Intention to Grant received for Danish Patent Application No. PA201570550, dated Dec. 22, 2016, 2 pages. |
Danish Intention to Grant received for Danish Patent Application No. PA201770126, dated Jan. 19, 2018, 2 pages. |
Danish Notice of Allowance received for Danish Patent Application No. PA201570550, dated Mar. 20, 2017, 2 pages. |
Danish Office Action received for Danish Patent Application No. PA201570550, dated Oct. 19, 2016, 3 pages. |
Danish Office Action received for Danish Patent Application No. PA201570550, dated Dec. 7, 2015, 5 pages. |
Danish Office Action received for Danish Patent Application No. PA201570550, dated Jan. 19, 2016, 2 pages. |
International Preliminary Report on Patentability in International Application No. PCT/US2012/038718, dated Nov. 26, 2013, 5 pages. |
International Search Report in International Application No. PCT/US2012/038718, dated Aug. 17, 2012, 5 pages. |
International Search Report and Written Opinion in International Application No. PCT/US13/41780, dated Dec. 1, 2014, 18 pages. |
International Preliminary Report on Patentability in International Application No. PCT/US13/41780, dated Dec. 9, 2014, 8 pages. |
Japanese Office Action in Japanese Application No. 2012-113725, dated May 27, 2013, 5 pages. |
Korean Preliminary Rejection in Korean Application No. 10-2012-54888, dated Sep. 5, 2014, 9 pages (with English Translation). |
Search and Examination Report in GB Application No. GB1209044.5, dated Aug. 24, 2012, 10 pages. |
U.S. Final Office Action in U.S. Appl. No. 13/113,856, dated Nov. 7, 2012, 19 pages. |
U.S. Final Office Action in U.S. Appl. No. 13/488,430, dated May 8, 2013, 19 pages. |
U.S. Non-Final Office Action in U.S. Appl. No. 13/113,856, dated Jul. 18, 2012, 14 pages. |
U.S. Non-Final Office Action in U.S. Appl. No. 13/488,430, dated Dec. 5, 2012, 13 pages. |
Written Opinion in International Application No. PCT/US2012/038718, dated Aug. 17, 2012, 4 pages. |
Australian Patent Examination Report No. 1 in Australian Application No. 2013203926, dated Oct. 7, 2014, 5 pages. |
Australian Patent Examination Report No. 2 in Australian Application No. 2013203926, dated Jan. 13, 2016, 3 pages. |
European Extended Search Report in Application No. 16155938.0, dated Jun. 7, 2016, 8 pages. |
Chinese Office Action for Application No. 201210288784.3, dated Jan. 5, 2017, 14 pages. |
India Office Action for Application No. 2030/CHE/2012, dated Dec. 27, 2016, 9 pages. |
Chinese Notification of Reexamination for Application No. 201210288784.3, dated Sep. 27, 2017, 18 pages. |
Danish Office Action received for Danish Patent Application No. PA201770125, dated Jan. 26, 2018, 5 pages. |
Danish Office Action received for Danish Patent Application No. PA201770125, dated Jul. 20, 2018, 2 pages. |
Danish Office Action received for Danish Patent Application No. PA201770126, dated Oct. 18, 2017, 3 pages. |
Danish Search Report received for Danish Patent Application No. PA201770125, dated May 5, 2017, 10 pages. |
Danish Search Report received for Danish Patent Application No. PA201770126, dated Apr. 26, 2017, 8 pages. |
‘digitalstreetsa.com’ [online]. “Why WeChat might kill Whatsapp's future . . . ” Jul. 3, 2013, [retrieved on Apr. 23, 2019], retrieved from: URL<http://digitalstreetsa.com/whv-wechat-might-kill-whatsanns-future>. 9 pages. |
‘download.cnet.com’ [online]. “WeChat APK for Android” Jan. 7, 2013, [retrieved on Apr. 23, 2019], retrieved from: URL<http://download.cnet.com/WeChat/3000-2150 4-75739423.html>. 5 pages. |
‘engadget.com’ [online]. “WhatsApp Introduces Major New Audio Features,” Aug. 7, 2013, [retrieved on Apr. 23, 2019], retrieved from: URL<http://www.engadget.com/2013/08/07/whatsapp-introduces-major-new-audio-features>. 12 pages. |
European Extended Search Report in European Patent Application No. 17167629.9, dated Jun. 2, 2017, 7 pages. |
European Extended Search Report in European Patent Application No. 18170262.2, dated Jul. 25, 2018, 8 pages. |
European Office Action in European Patent Application No. 15728307.8, dated Feb. 8, 2018, 7 pages. |
European Office Action in European Patent Application No. 15729286.3, dated Feb. 7, 2018, 7 pages. |
European Office Action in European Patent Application No. 15759981.2, dated Apr. 19, 2018, 6 pages. |
European Office Action in European Patent Application No. 15759981.2, dated Aug. 6, 2018, 10 pages. |
European Office Action in European Patent Application No. 15759981.2, dated May 16, 2018, 6 pages. |
European Office Action in European Patent Application No. 17167629.9, dated Jan. 25, 2019, 7 pages. |
‘heresthethingblog.com’ [online]. “iOS 7 tip: Alerts, Banners, and Badges: What's the Difference?” Jan. 22, 2014, [retrieved on Apr. 23, 2019], via Internet Archive: Wayback Machine URL<https://web.archive.org/web/20140128072440/http://heresthethingblog.com/2014/01/22/ios-7-tip-whats-difference-alert/>. 5 pages. |
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2015/032305, dated Dec. 15, 2016, 7 pages. |
International Preliminary Report on Patentability received for PCT Application No. PCT/US2015/032309, dated Dec. 15, 2016, 7 pages. |
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2015/043487, dated Feb. 16, 2017, 12 pages. |
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2015/044083, dated Mar. 16, 2017, 24 pages. |
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2015/046787, dated Mar. 16, 2017, 18 pages. |
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2016/046828, dated Mar. 1, 2018, 19 pages. |
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2015/032305, dated Sep. 10, 2015, 9 pages. |
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2015/032309, dated Sep. 2, 2015, 9 pages. |
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2015/043487, dated Jan. 29, 2016, 17 pages. |
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2015/044083, dated Feb. 4, 2016, 31 pages. |
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2015/046787, dated Apr. 1, 2016, 26 pages. |
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2016/046828, dated Dec. 15, 2016, 21 pages. |
iPhone, “User Guide for iOS 7.1 Software”, Mar. 2014, 162 pages. |
Japanese Notice of Allowance received for Japanese Patent Application No. 2017-510297, dated May 7, 2018, 5 pages with English Translation. |
Japanese Notice of Allowance received for Japanese Patent Application No. 2017-514992, dated Feb. 15, 2019, 5 pages with English Translation. |
Japanese Notice of Allowance received for Japanese Patent Application No. 2017-514993, dated Jan. 12, 2018, 6 pages with English Translation. |
Japanese Notice of Allowance received for Japanese Patent Application No. 2018-072632, dated Dec. 7, 2018, 6 pages with English Translation. |
Japanese Office Action received for Japanese Patent Application No. 2017-510297, dated Dec. 4, 2017, 6 pages with English Translation. |
Japanese Office Action received for Japanese Patent Application No. 2017-510297, dated Jul. 10, 2017, 9 pages with English Translation. |
Japanese Office Action received for Japanese Patent Application No. 2017-514992, dated Apr. 6, 2018, 9 pages with English Translation. |
Japanese Office Action received for Japanese Patent Application No. 2018-018497, dated Dec. 10, 2018, 7 pages with English Translation. |
Japanese Office Action received for Japanese Patent Application No. 2018-072632, dated Jul. 9, 2018, 5 pages with English Translation. |
‘jnd.org’ [online]. “Affordances and Design,” published on or before Feb. 25, 2010 [retrieved on Apr. 23, 2019], via Internet Archive: Wayback Machine URL<https://web.archive.org/web/20150318044240/jnd.org/dn.mss/affordancesand.html>. 6 pages. |
Korean Notice of Allowance received for Korean Patent Application No. 10-2017-7005628, dated Jun. 18, 2018, 4 pages with English Translation. |
Korean Office Action received for Korean Patent Application No. 10-2017-7005628, dated Jan. 30, 2018, 6 pages with English translation. |
Korean Office Action received for Korean Patent Application No. 10-2017-7005628, dated May 10, 2017, 11 pages with English Translation. |
Korean Office Action received for Korean Patent Application No. 10-2018-7027006, dated Jan. 14, 2019, 4 pages with English Translation. |
‘makeuseof.com’ [online]. “MS Outlook Tip: How to Automatically Organize Incoming Emails,” Sep. 27, 2019, [retrieved on Apr. 23, 2019], retrieved from: URL<http://www.makeuseof.com/tag/ms-outlook-productivity-tip-how-to-move-emails-to-individual-folders-automatically>. 5 pages. |
‘manualslib.com’ [online]. “Samsung Gear 2 User Manual”, 2014, [retrieved on Apr. 23, 2019], retrieved from: URL<http://www.manualslib.com/download/754923/Samsung-Gear-2.html>. 97 pages. |
Netherland Search Report and Opinion received for Netherlands Patent Application No. 2015354, completed on Jun. 22, 2017, 23 pages with English Translation. |
Netherland Search Report and Opinion received for Netherlands Patent Application No. 2019878, dated Apr. 6, 2018, 23 pages with English Translation. |
Samsung, “SM-G900F User Manual”, English (EU). Rev.1.0, Mar. 2014, 249 pages. |
Samsung, “SM-R380”, User Manual, 2014, 74 pages. |
Number | Date | Country | |
---|---|---|---|
20200008010 A1 | Jan 2020 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 16146774 | Sep 2018 | US |
Child | 16450993 | US | |
Parent | 15219239 | Jul 2016 | US |
Child | 16146774 | US | |
Parent | 14636106 | Mar 2015 | US |
Child | 15219239 | US | |
Parent | 13113856 | May 2011 | US |
Child | 14636106 | US |