Embodiments described herein relate generally to a system and method of locating wireless devices and accessories. More specifically, embodiments relate to an infrastructure to enable proximity enhancement to the user experience of device locator applications.
Current security features in handheld and portable products allow the location of the product to be identified when requested by the user, such as in instances where the product is lost or stolen. If the wireless device includes positioning technology, the device can be configured to report its last location to a server computer, and that location can be displayed by the service on a map presented to the user. Often, wireless devices are used with wireless accessory devices that cannot determine their own location and cannot communicate with a remote tracking service over a wide area network. These accessory devices can include, for example, wireless earbuds, headphones, headsets, and other wearable devices (e.g., smartwatches, fitness bands, optical head-mounted displays) that communicate directly with the wireless device using peer-to-peer communications. Because such wireless accessory devices cannot determine their location and cannot communicate with the remote tracking service, those devices cannot be tracked by the service when lost or stolen.
Embodiments described herein provide systems and methods to enable proximity enhancements for a location query performed for a target wireless device.
One embodiment provides for a non-transitory machine-readable medium storing instructions to cause one or more processors of a wireless communication device to perform operations comprising determining a range and direction to a target wireless device using one or more wireless ranging operations when within a threshold range of the target wireless device, determining a pose of the wireless communication device via sensor data received from an inertial measurement unit within the wireless communication device, determining a target position estimate of the target wireless device relative to the wireless communication device based on the range, direction, and pose, generating an augmented reality view that includes the target position estimate of the target wireless device, and presenting the augmented reality view via a display of the wireless communication device.
One embodiment provides for an electronic device comprising a wireless controller including a ranging sensor and a communications radio, a display device, memory to store instructions, and one or more processors to execute the instructions. The instructions cause the one or more processors to scan for a beacon advertisement via the communications radio of the wireless controller and, in response to detection of a beacon advertisement of a target wireless device, transmit a command to the target wireless device to initiate a bi-directional ranging operation with the electronic device. The one or more processors can additionally determine a range and direction to the target wireless device via the ranging sensor of the wireless controller during the bi-directional ranging operation, determine a target position estimate for the target wireless device relative to the electronic device based on the range and direction to the target wireless device, generate an augmented reality view that includes the target position estimate for the target wireless device, and present the augmented reality view via the display device.
One embodiment provides for a method comprising, on a wireless communication device including one or more processors, determining a range and direction to a target wireless device using one or more wireless ranging operations when the wireless communication device is within a threshold range of the target wireless device. The method additionally includes determining a pose of the wireless communication device via sensor data received from an inertial measurement unit within the wireless communication device, determining a target position estimate for the target wireless device relative to the wireless communication device based on the range and direction to the target wireless device and the pose of the wireless communication device, and generating an augmented reality view that includes the target position estimate for the target wireless device. The method additionally includes presenting the augmented reality view via a display of the wireless communication device.
The above summary does not include an exhaustive list of all embodiments in this disclosure. All systems and methods can be practiced from all suitable combinations of the various aspects and embodiments summarized above, and also those disclosed in the Detailed Description below.
The present invention is illustrated by way of example and not limitation in the figures of the accompanying drawings in which like references indicate similar elements, and in which:
Embodiments described herein provide techniques to enable secure crowdsourced locator services for lost or misplaced devices that cannot communicate with a wide area network. Various embodiments and aspects will be described with reference to details discussed below, and the accompanying drawings will illustrate the various embodiments. The following description and drawings are illustrative and are not to be construed as limiting. Numerous specific details are described to provide a thorough understanding of various embodiments. However, in certain instances, well-known or conventional details are not described in order to provide a concise discussion of embodiments.
The terminology used in this description is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the description of the invention and the appended claims, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term “and/or” as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
In the discussion that follows, a computing device that includes a touch-sensitive display is described. It should be understood, however, that the computing device may include one or more other physical user-interface devices. The various applications that may be executed on the device may use at least one common physical user-interface device, such as the touch-sensitive surface. One or more functions of the touch-sensitive surface as well as corresponding information displayed on the device may be adjusted and/or varied from one application to the next and/or within a respective application. In this way, a common physical architecture (such as the touch-sensitive surface) of the device may support the variety of applications with user interfaces that are intuitive and transparent.
Some processes are described below in terms of some sequential operations. However, it should be appreciated that some of the operations described may be performed in a different order. Moreover, some operations may be performed in parallel rather than sequentially.
In some implementations, both voice and data communications can be established over the wireless network 112 and/or the access device 118. For example, mobile device 102A can place and receive phone calls (e.g., using VoIP protocols), send and receive e-mail messages (e.g., using POP3 protocol), and retrieve electronic documents and/or streams, such as web pages, photographs, and videos, over the wireless network 112, gateway 116, and wide area network 114 (e.g., using TCP/IP or UDP). In some implementations, mobile device 102A can place and receive phone calls, send and receive e-mail messages, and retrieve electronic documents over the access device 118 and the wide area network 114. In some implementations, mobile device 102A or mobile device 102B can be physically connected to the access device 118 using one or more cables, for example, where the access device 118 is a personal computer. In this configuration, mobile device 102A or mobile device 102B can be referred to as a “tethered” device. In one embodiment, mobile device 102A can communicate with mobile device 102B via a wireless peer-to-peer connection 120. The wireless peer-to-peer connection 120 can be used to synchronize data between the devices.
Mobile device 102A or mobile device 102B can communicate with a service provider 115 that provides or enables one or more services. Exemplary services include a telephony service 130, a messaging service 140, a media service 150, a storage service 160, and a device locator service 170 over the one or more wired and/or wireless networks 110. For example, the telephony service 130 can enable telephonic communication between mobile device 102A and mobile device 102B, or between a mobile device and a wired telephonic device. The telephony service 130 can route voice over IP (VoIP) calls over the wide area network 114 or can access a cellular voice network (e.g., wireless network 112). The messaging service 140 can, for example, provide e-mail and/or other messaging services. The media service 150 can, for example, provide access to media files, such as song files, audio books, movie files, video clips, and other media data. The storage service 160 can provide network storage capabilities to mobile device 102A and mobile device 102B to store documents and media files. The device locator service 170 can enable a user to locate a lost or misplaced device that was, at least at some point, connected to the one or more wired and/or wireless networks 110. For example, mobile device 102A can perform a location query for mobile device 102B. The device locator service 170 can also enable location queries for devices that do not have a network connection via the use of a network of finder devices, as shown below in
In one embodiment, the wireless accessory 201 can periodically transmit a wireless beacon signal. The wireless accessory 201 can transmit the beacon signal using one of a variety of wireless technologies described herein (e.g., Bluetooth, Wi-Fi, etc.) and in one embodiment can also beacon using an ultra-wideband (UWB) radio technology. The beacon signal can be transmitted using a single wireless technology, one of multiple selectable wireless technologies, or multiple simultaneous wireless technologies. The beacon signal can transmit a beacon identifier that includes information to specifically identify the wireless accessory 201. In one embodiment, the beacon identifier is a public encryption key associated with the device.
The beacon signal can also convey information about the wireless accessory 201, such as a beacon type, a device classification, and a battery level. In one embodiment the beacon signal can also convey device status, such as a lost status, alarm status, or a near-owner status. The beacon signal can also include information that specifies battery life, charging status, and/or other status information. The lost status can indicate that the wireless accessory 201 has determined itself to be lost or has been placed into a lost state by the owner of the device. The alarm status can indicate that the wireless accessory 201 was placed in a state in which the device should trigger an alarm if moved from a current location. The near-owner status can indicate that the wireless accessory 201 has detected the nearby presence of the mobile device 102 associated with the owner of the accessory.
The beacon signal can be detected by a finder device 202, which is locally proximate to the wireless accessory 201. The finder device 202 can be a similar device as the mobile device 102 and can receive and transmit data over the wide area network 114, as well as receive and transmit data using wireless technologies similar to those used by the wireless accessory 201 (e.g., Bluetooth, etc.). In particular, the finder device 202 can receive data using the wireless protocol over which the beacon signal is transmitted. The finder device 202 can determine a location using one or more location and/or positioning services including, but not limited to, a satellite positioning service 206 or a terrestrial positioning system using RF signals received from wireless base stations 205 such as Wi-Fi access points or cell tower transmitters of a cellular telephone network. In an embodiment, the finder device 202 periodically stores its location as determined based on the one or more location and/or positioning services. The stored location can be associated with a timestamp for which the location was determined. When the finder device 202 receives a beacon signal from the wireless accessory 201, the finder device 202 can transmit a location for the finder device over the wide area network 114 to a device locator server 203. The timestamp for a determined location for the finder device 202 can be correlated with a timestamp for which a beacon signal was received to associate a geographic location with a received beacon signal.
Where the wireless accessory 201 provides a public key within the beacon signal, the finder device 202 can encrypt the determined location data and transmit the encrypted location data to the device locator server 203 over the wide area network 114. In one embodiment, additional data can either be encrypted and transmitted along with the location data or transmitted unencrypted to the device locator server 203. For example, a received signal strength indicator (RSSI) for the beacon signal can be transmitted along with the location data. The RSSI data can then be used to determine the distance of the wireless accessory 201 from the finder device 202 and assist in triangulation on the owner device. Where the RSSI data is transmitted in an unencrypted state, in one embodiment the server can use RSSI information to reduce noise by discarding very weak signals if other, stronger signals are present. In one embodiment, UWB ranging data can also be provided, where such data is available.
In one embodiment, the finder device 202 can behave differently upon receiving a beacon signal from a wireless accessory 201 depending upon a device status conveyed by the wireless accessory 201. For standard beacon signals, the finder device 202 can place encrypted location data into a queue and transmit the location data to the device locator server 203 during a periodic transmission window. However, if the wireless accessory 201 is indicating an alarm state, the finder device 202 can transmit the location data to the device locator server 203 immediately. Additionally, the finder device 202 may not transmit the location data to the device locator server 203 if the beacon signal of the wireless accessory 201 indicates that the accessory is near the owner of the accessory. Alternatively, the finder device 202 may delay transmission of encrypted location data.
If the owner of the wireless accessory 201 wishes to locate the wireless accessory, the owner can access a device locator user interface 204 on the mobile device 102. The device locator user interface 204 can be associated with a device locator application that is used to locate electronic devices and accessories that are registered with an online account of the user, such as a cloud services account or another type of online account. The device owner, using the device locator UI 204, can query the device locator server 203 for location data that may have been transmitted to the device locator server by a finder device 202 of the wireless accessory 201. In one embodiment, the mobile device 102 can transmit the public encryption key associated with the wireless accessory 201 to the device locator server 203. The device locator server 203 can then return any stored location data that corresponds with the public encryption key. The location data returned to the mobile device 102 can be encrypted data that is encrypted by the finder device 202 using the public encryption key. The mobile device 102 can use an associated private key to decrypt the encrypted location data. The decrypted location data can then be processed by the mobile device 102 to determine a most probable location for the wireless accessory 201. In various embodiments, the most probable location for the wireless accessory 201 can be determined by triangulation from multiple received locations and using other data, such as a beacon signal RSSI associated with each location and timestamp or UWB ranging data included within the location data.
After the wireless accessory 201 has been paired with the mobile device 102, the wireless accessory 201 can periodically broadcast a beacon signal 301 that includes device status information and a beacon identifier. In one embodiment the beacon identifier is a public key derived from a shared secret that is established during the public key exchange (310). Additionally, the wireless accessory 201 can periodically perform a public key derivation (315) to generate a new public key and begin broadcasting the new public key as the beacon identifier. The public key is a K-byte key, with a new K-byte key generated every M minutes. The values of K and M can vary between embodiments. In one embodiment, a K value of 28 bytes is used. In one embodiment, a K value of 27 bytes is used. The value K can be determined at least in part based on the beacon length associated with the wireless protocol used to transmit the beacon signal 301. In one embodiment, the beacon signal can transmit a variant of a beacon advertisement packet associated with a low-energy radio protocol, such as Bluetooth Low Energy.
The value M, in one embodiment, is 15 minutes, such that a new K-byte key is generated every 15 minutes. The public key can be derived deterministically based on a timestamp and an anti-tracking secret generated during the public key exchange 310. The public key derivation (315) process enables the wireless accessory 201 to use different keys over time, preventing the long-term association of a specific key with a specific device. The key can be derived based on an anti-tracking secret known only to the mobile device 102 and the wireless accessory 201, allowing the mobile device 102, and only the mobile device, to determine which public key will be broadcast by the wireless accessory 201 at any given timestamp. The anti-tracking secret can be generated along with an ECDH public key and transferred to the wireless accessory 201. The anti-tracking secret can then be used to enable the wireless accessory 201 to generate a sequence of public keys Pi. In one embodiment, the sequence of public keys is Pi = λi·P, where “·” denotes a group operation between a scalar or exponent value λi and a group element P, such as, for example, an Elliptic Curve point. The scalar or exponent value λi = KDF(AT, i), where KDF is a key derivation function, AT is the anti-tracking secret, and i is a counter or timestamp.
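The specific key derivation function and elliptic-curve group are not fixed by this description. The following minimal sketch, which assumes an HMAC-SHA256-based KDF and uses modular exponentiation in a toy group as a stand-in for the elliptic-curve scalar multiplication, illustrates how the shared anti-tracking secret and a key-period counter can deterministically produce the rotating beacon identifiers:

```python
import hashlib
import hmac

M_MINUTES = 15                 # key rotation period (M)
TOY_PRIME = 2**127 - 1         # modulus of a toy group (illustration only)
TOY_P = 5                      # toy group element standing in for the EC point P

def key_period(timestamp_s, m_minutes=M_MINUTES):
    """Counter i identifying the key period that contains timestamp_s."""
    return int(timestamp_s // (m_minutes * 60))

def derive_scalar(anti_tracking_secret: bytes, i: int) -> int:
    """lambda_i = KDF(AT, i), here an HMAC-SHA256-based KDF."""
    digest = hmac.new(anti_tracking_secret, i.to_bytes(8, "big"),
                      hashlib.sha256).digest()
    return int.from_bytes(digest, "big")

def derive_public_key(anti_tracking_secret: bytes, i: int) -> int:
    """P_i = lambda_i * P; modular exponentiation stands in for the
    elliptic-curve scalar multiplication an actual accessory would use."""
    lam = derive_scalar(anti_tracking_secret, i)
    return pow(TOY_P, lam, TOY_PRIME)

# Both the owner device and the accessory can compute the identifier for
# the current key period from the shared anti-tracking secret.
beacon_identifier = derive_public_key(b"\x00" * 32, key_period(1_700_000_000))
```

Because both the owner device and the accessory hold the anti-tracking secret, both can compute the same Pi for any key period, while an observer sees only an apparently unrelated sequence of identifiers.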
In one embodiment, backtracking resistance can be enabled to protect the anti-tracking secret in the event the wireless accessory 201 is compromised. When backtracking resistance is enabled, the anti-tracking secret is transferred to the wireless accessory 201 but is not retained by the wireless accessory. Instead, the accessory computes a value λi+1=H(λi∥ time), with λ0=AT and H being a cryptographic hash function. The wireless accessory 201 then stores λi for a given time period i. If the wireless accessory 201 is compromised, only λi for current and future values of i is exposed, without exposing the anti-tracking secret AT. In one embodiment, backtracking resistance is performed by periodically writing λi to non-volatile memory of the wireless accessory 201.
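A minimal sketch of this hash chain, assuming SHA-256 as the hash function H and the key-period counter as the time input, is shown below; the accessory would persist only the most recent λi:

```python
import hashlib

def advance_lambda(current_lambda: bytes, period_index: int) -> bytes:
    """Compute lambda_(i+1) = H(lambda_i || time) for one key period."""
    return hashlib.sha256(
        current_lambda + period_index.to_bytes(8, "big")).digest()

anti_tracking_secret = b"\x00" * 32     # lambda_0 = AT (placeholder value)
lam = anti_tracking_secret
for period in range(3):                 # only the latest lambda_i is retained
    lam = advance_lambda(lam, period)
```

Because each step applies a one-way hash, exposure of the stored value reveals only current and future chain values, not earlier values or the anti-tracking secret itself.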
In one embodiment the wireless accessory 201 can transmit the beacon signal 301 every two seconds, although other beacon rates can be used, and the beacon rate can vary under certain circumstances. For example, the wireless accessory 201 can decrease a beacon rate when in a near-owner state. Beacon rate can also vary based on accelerometer triggered events. For example, the wireless accessory 201 can increase the beacon rate when in an alarm state, which can be triggered by the accelerometer on the wireless accessory 201.
The wireless accessory 201 can enter the near-owner state if, after transmitting the beacon signal 301, the wireless accessory 201 receives a reply from the mobile device 102 associated with the user of the accessory, which indicates that the mobile device 102 is within range of the wireless accessory. Additionally, while the wireless accessory is in the near-owner state, the amount of data transmitted by the beacon signal 301 may be reduced. In one embodiment, the rate at which new public keys are generated can also be reduced while the wireless accessory is in the near-owner state.
The wireless accessory 201 can enter an alarm state upon receiving a message from the mobile device 102 that indicates that the wireless accessory 201 should enter the alarm state. When in the alarm state, the wireless accessory can initially enter an armed state in which the wireless accessory 201 can reduce or cease the transmission of locator beacon signals, although other types of wireless signaling can persist. The wireless accessory 201 can remain in the armed state until the state is deactivated by the mobile device 102 or an alarm is triggered. The alarm can be triggered, in one embodiment, upon detection of movement, for example, via an accelerometer within the wireless accessory 201. The alarm can also be triggered, in one embodiment, upon detection that the wireless accessory has moved out of range of the mobile device and is no longer in the near-owner state. When the alarm is triggered, the rate at which the beacon signal 301 is transmitted can be increased to increase the speed with which the wireless accessory 201 can be located.
The beacon signal 301 transmitted by the wireless accessory 201 can be detected by a set of finder devices 303, which are other electronic devices that can receive the beacon signal transmitted by the wireless accessory and can transmit location and other data associated with the beacon signal 301 to the device locator server 203 via the wide area network 114. In one embodiment the set of finder devices 303 includes variants of the mobile device 102 or can include other types of electronic devices. The set of finder devices 303 can include a variant of the finder device 202 of
The set of finder devices 303 can encrypt the location data with the beacon identifier (e.g., public key) received within the beacon signal 301 and send the location data (325) to the device locator server 203. The data sent by the set of finder devices 303 is sent anonymously and no identifying information for the finder devices is stored with the data sent by the finder devices.
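The encryption construction is not specified in this description; one plausible sketch, assuming the beacon identifier is a P-256 public key and using an ephemeral-ECDH plus AES-GCM hybrid scheme built with the Python `cryptography` package (the payload field names are hypothetical), is shown below:

```python
import json
import os
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

def encrypt_location_report(accessory_public_key, latitude, longitude, timestamp):
    """Encrypt a finder location report so only the accessory owner can read it."""
    payload = json.dumps({"lat": latitude, "lon": longitude,
                          "ts": timestamp}).encode()
    # Ephemeral ECDH against the public key broadcast in the beacon.
    ephemeral = ec.generate_private_key(ec.SECP256R1())
    shared = ephemeral.exchange(ec.ECDH(), accessory_public_key)
    aes_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                   info=b"locator-report").derive(shared)
    nonce = os.urandom(12)
    ciphertext = AESGCM(aes_key).encrypt(nonce, payload, None)
    ephemeral_bytes = ephemeral.public_key().public_bytes(
        serialization.Encoding.X962,
        serialization.PublicFormat.CompressedPoint)
    # The finder uploads these fields; nothing in them identifies the finder.
    return ephemeral_bytes, nonce, ciphertext
```

The owner device can later recompute the same AES key from its private key and the uploaded ephemeral public key and then decrypt the report.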
The device locator server 203 can store encrypted location data in a data store 304, which in one embodiment can be a distributed database having multiple nodes. Hashes of the beacon identifier/public key of an accessory can be sent along with encrypted location data. The encrypted location data can be stored to a database node based on a hash of the beacon identifier. The encrypted location data can be indexed by the device locator server 203 using the hash of the beacon identifier, which avoids storing the full beacon identifier on the server. Other information can also be sent and stored with the location data, either in an encrypted or unencrypted state. The other information can include timestamps for when the beacon signal 301 was received, RSSI information for the received beacon, and/or ranging information determined, for example, via UWB ranging.
When the user or owner of the wireless accessory 201 wishes to locate the accessory, the user or owner can access the device locator UI 204 on the mobile device 102. The device locator UI 204 can be associated with a device locator application or feature of the mobile device 102. The device locator UI 204 may also have a web-based interface that can be accessed from the mobile device 102 or another type of electronic device, such as a laptop or desktop device. The mobile device 102, upon loading the device locator UI 204, can send a request (330) for location data to the device locator server 203. The request 330 can include a set of public key hashes, which can be used to locate the encrypted location data for the device. The mobile device 102 can generate the set of public key hashes based on the secret information held by the mobile device 102 and the wireless accessory 201 and the timestamps over which the mobile device 102 wishes to receive location data. In one embodiment the set of public key hashes is based on the sequence of public keys Pi that are generated based on the anti-tracking secret. The sequence of public keys Pi corresponds to a matching sequence of private keys di. The mobile device 102 can generate the sequence of public keys, as well as the corresponding sequence of private keys di, where i is a counter or timestamp. In one embodiment, the mobile device 102 can generate and send the previous 24 hours of public keys within the request 330. If no data is found for those 24 hours of public keys, the mobile device 102 can generate and send keys for an earlier period, back to a pre-determined location data retention limit.
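Building the request might look like the sketch below, which reuses the hypothetical `M_MINUTES`, `key_period`, and `derive_public_key` helpers from the key-derivation sketch above and assumes SHA-256 hashes of the derived beacon identifiers:

```python
import hashlib
import time

# M_MINUTES, key_period, and derive_public_key are the hypothetical helpers
# defined in the earlier key-derivation sketch.

def key_hashes_for_window(anti_tracking_secret: bytes, end_time_s: float,
                          hours: int = 24) -> list:
    """Hashes of every public key the accessory used during the window."""
    periods = int(hours * 60 // M_MINUTES)
    end_period = key_period(end_time_s)
    hashes_out = []
    for i in range(end_period - periods + 1, end_period + 1):
        public_key = derive_public_key(anti_tracking_secret, i)
        key_bytes = public_key.to_bytes(32, "big")
        hashes_out.append(hashlib.sha256(key_bytes).hexdigest())
    return hashes_out

# Request the most recent day first; widen to older windows if nothing is found.
request_hashes = key_hashes_for_window(b"\x00" * 32, time.time(), hours=24)
```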
In one embodiment, the wireless accessory 201 can be placed in a light lost mode. In the light lost mode, a set of future public keys can be generated for the wireless accessory and transmitted to the device locator server 203. The device locator server 203 can then notify the mobile device 102 if any location data is received that corresponds with a key in the set of future public keys. In one embodiment, a finder device that sends a location for a wireless accessory that is in the light lost mode can be directed by the device locator server 203 to relay a message to the wireless accessory 201 that notifies the wireless accessory that it is in the light lost mode. A similar mechanism can be used to relay a message to the wireless accessory 201 that places the accessory in an explicit lost mode. The explicit lost mode can be enabled by the user via the device locator UI 204. In the explicit lost mode, the wireless accessory 201 cannot be paired with another device unless unlocked by the owner.
As shown in
The mobile device can then generate a public/private key pair and one or more additional shared secrets (block 402). The device can then send the public key and one or more additional shared secrets to the wireless accessory (block 403). A variety of key generation techniques can be used. In one embodiment, a variant of ECDH is used to generate a public key pair for encryption. In one embodiment, the one or more additional shared secrets can include an anti-tracking secret that enables the wireless accessory to derive a new public key based on an existing public key.
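The key generation details are not specified here; the following sketch, which assumes P-256 ECDH key material generated with the Python `cryptography` package and a 32-byte random anti-tracking secret, illustrates the kind of material the mobile device could create at pairing time:

```python
import os
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import ec

def generate_pairing_material():
    """Key material the mobile device creates when pairing a wireless accessory."""
    private_key = ec.generate_private_key(ec.SECP256R1())
    public_bytes = private_key.public_key().public_bytes(
        serialization.Encoding.X962,
        serialization.PublicFormat.CompressedPoint)
    anti_tracking_secret = os.urandom(32)
    # The private key remains in the owner's (cloud-synchronized) keystore;
    # the public key and the anti-tracking secret are sent to the accessory.
    return private_key, public_bytes, anti_tracking_secret
```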
After generating the public/private key pair and one or more additional shared secrets, the mobile device can store the public/private key pair to a keystore (block 404). In one embodiment the keystore is a cloud-based keystore that can be synchronized with other devices associated with the same cloud services account, or family of cloud services accounts, to which the mobile device and wireless accessory are associated. The cloud-based keystore allows the wireless accessory to be located by other synchronized devices. The mobile device can then register the wireless accessory with a device management server (block 405). Registering the wireless accessory with the device management server can form an association between the wireless accessory and the cloud services account to which the mobile device is associated. The device management server can be associated with other cloud-based servers that are used to facilitate cloud-based services accessible to the mobile device, such as the device locator server 203 of
As shown in
Processing the location data can include a variety of different operations. In one embodiment the location data includes latitude and longitude information along with a timestamp for which the location was determined. The electronic device can triangulate based on the timestamps and remove noise or outlier locations. In one embodiment the location data specifies the location of the finder device that detected the beacon. The location data can additionally include UWB ranging information and/or RSSI information for the beacon detected by the finder device. The electronic device can analyze the UWB ranging information and/or RSSI information in context with the device locations to develop a more accurate location for the wireless accessory. Data that can be transmitted by a finder device and used for location processing is shown in
As shown in
If data is not returned by the server (block 423, “no”), the electronic device can generate a second set of public keys that were included within a beacon signal broadcast by the wireless accessory during a second period (block 424). The second period can be the 24, 48, or another number of hours before the first period. The electronic device can then request that the device locator server send data that corresponds with the second set of public keys (block 425). If, in response to the request, data is returned by the server (block 426, “yes”), method 420 can proceed to block 429, in which the electronic device decrypts the received data. If data is not returned by the server (block 426, “no”), or the server sends a reply that indicates data is not available, method 420 includes the electronic device widening the search window by requesting successively older time periods until the maximum retention period is reached (block 427).
After transmitting a beacon signal, the wireless accessory can listen for a response from the owner device. If the wireless accessory receives a response from the owner device (block 504, “yes”), the wireless accessory can enter a near-owner state (block 505) and begin to transmit the beacon signal at a second, slower transmission interval (block 507). If the wireless accessory does not receive a response from the owner device (block 504, “no”), the wireless accessory can continue beaconing at the first transmission interval (block 506).
Method 500 additionally includes the wireless accessory, while beaconing, rotating the public key every M minutes, where the value of M can vary across embodiments and/or based on the device state. Based on a timer expiration, counter, or another mechanism, the wireless accessory can determine whether the accessory has entered a new key period (block 508). While the wireless accessory has not entered a new key period (block 508, “no”), the accessory can continue beaconing using the current public key (block 510). When the wireless accessory detects that it has entered a new key period (block 508, “yes”), the accessory can derive a new public key using the current timestamp (block 509). In one embodiment the new public key can be derived using an existing public key, a timestamp, and an anti-tracking secret.
As shown in
Method 600 additionally includes the finder device performing periodic Wi-Fi scans using the wireless processor while the application processor is in a low power mode (block 603). While the Wi-Fi scans can also be performed when the application processor is active, Wi-Fi scans can be performed by the wireless processor and a wireless radio receiver as a low power operation while the finder device is idle, inactive, or otherwise in a low power state. The finder device can then store Wi-Fi service set identifiers (SSIDs) and scan timestamps to a Wi-Fi scan buffer on the finder device (block 604).
In one embodiment, the Wi-Fi scan buffer is a rolling buffer that stores the most recently detected SSIDs while overwriting older detected SSIDs. In one embodiment the beacon scan buffer can be a fixed-size buffer having space for a pre-determined number of entries. The finder device can wake the application processor when the beacon scan buffer becomes full (block 605) and correlate the buffered beacon scans with the most recently detected SSIDs in the Wi-Fi scan buffer. That correlation can enable the finder device to determine a set of device locations that correspond with received beacons based on Wi-Fi scan buffer data (block 606).
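The correlation step might be implemented as in the sketch below, which assumes hypothetical buffer layouts in which each beacon entry and each Wi-Fi scan entry carries a timestamp, and which matches each beacon to the nearest-in-time scan:

```python
def correlate_beacons_with_scans(beacon_buffer, wifi_scan_buffer):
    """Pair each buffered beacon with the closest-in-time Wi-Fi scan.

    beacon_buffer    : list of (beacon_id, beacon_timestamp) tuples
    wifi_scan_buffer : list of (ssid_list, scan_timestamp) tuples (rolling buffer)
    """
    if not wifi_scan_buffer:
        return []
    correlated = []
    for beacon_id, beacon_ts in beacon_buffer:
        ssids, scan_ts = min(wifi_scan_buffer,
                             key=lambda scan: abs(scan[1] - beacon_ts))
        # The SSIDs observed closest to the beacon reception time approximate
        # where the finder device was when it heard the beacon.
        correlated.append({"beacon_id": beacon_id, "beacon_ts": beacon_ts,
                           "ssids": ssids, "scan_ts": scan_ts})
    return correlated
```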
Method 600 continues in
The owner device can retrieve the RSSI or UWB information from the device locator server along with location data, which in one embodiment is provided in the form of latitude and longitude information, along with timestamps for which the locations were determined. The owner device can then use the location data, timestamps, and signal information to triangulate a most probable location for the wireless accessory 201.
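The estimation method is left open by this description; one simple illustrative approach weights each reported finder location by its received signal strength (a stronger beacon suggests the finder was closer) and computes a weighted centroid:

```python
def most_probable_location(reports):
    """Weighted-centroid estimate from decrypted finder location reports.

    reports: list of dicts with 'lat', 'lon', and 'rssi' (dBm) fields.
    """
    if not reports:
        return None
    # Map RSSI (e.g., -40 dBm strong ... -100 dBm weak) to a positive weight.
    weights = [max(1.0, report["rssi"] + 100.0) for report in reports]
    total = sum(weights)
    lat = sum(w * r["lat"] for w, r in zip(weights, reports)) / total
    lon = sum(w * r["lon"] for w, r in zip(weights, reports)) / total
    return lat, lon

estimate = most_probable_location([
    {"lat": 37.3349, "lon": -122.0090, "rssi": -62},
    {"lat": 37.3351, "lon": -122.0094, "rssi": -78},
])
```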
In one embodiment the device locator server 203 includes a locator service front-end 803, an account database 825, a database cluster manager 813, and a set of database cluster nodes 823A-823C. The locator service front-end 803 is a front-end interface to which the mobile device 102 and the set of finder devices 303 can communicate. The account database 825 stores account profile data for accounts of a cloud service provider to which the mobile device 102 and the finder devices 303 are associated. The database cluster manager 813 can configure the database cluster nodes 823A-823C as a distributed location database that can store location, signal, and ranging data in association with beacon identifiers for signal beacons received by the set of finder devices 303.
In one embodiment, the account database 825 can contain a list of devices that are associated with each cloud services account. In response to a request to locate a given device, including a wireless accessory as described herein, the account database 825 can verify that the request is coming from a device that is authorized to request the location of the given device. In one embodiment, when a user launches a device locator UI and communicates with the locator service front-end 803, the locator service front-end can communicate with the account database 825 and provide a current or last known location for each device that is associated with a requesting user, including devices and/or wireless accessories associated with other users that are in a family of accounts associated with the requesting user.
In one embodiment, the database cluster manager 813 can select a database cluster node 823A-823C to which beacon data is to be stored based on the hash of the beacon ID associated with a set of location data. Each database cluster node 823A-823C can be associated with a range of hash values. The database cluster manager can then store location data to the cluster node that corresponds with the range of hash values associated with the hash of a given beacon ID, with the location data indexed based on the hash of the beacon ID. Although three database cluster nodes are illustrated, embodiments are not limited to any specific number of nodes and greater or fewer nodes may be used.
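A minimal sketch of node selection by beacon-identifier hash is shown below; the node names and the even partitioning of the hash space are illustrative only:

```python
import hashlib

DATABASE_NODES = ["node-a", "node-b", "node-c"]   # hypothetical node names

def node_for_beacon(beacon_id: bytes) -> str:
    """Pick the cluster node whose hash range contains this beacon's hash."""
    beacon_hash = hashlib.sha256(beacon_id).digest()
    # Partition the hash space evenly across the nodes; only the hash is
    # stored as the index, so the full beacon identifier never reaches the server.
    bucket = int.from_bytes(beacon_hash[:8], "big") % len(DATABASE_NODES)
    return DATABASE_NODES[bucket]
```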
As shown in
As shown in
In one embodiment the wireless accessory can detect movement via an accelerometer or another type of motion sensor within the wireless accessory. The notification can be initiated by the wireless accessory by setting a flag in the data packet transmitted by the beacon signal of the wireless accessory that indicates the wireless accessory alarm has been triggered. In various embodiments, other trigger or notification modes can be used. In one embodiment, the alarm can optionally be triggered by the mobile device upon detection that the wireless accessory has moved out of range of the mobile device and is no longer in the near-owner state. In one embodiment, the alarm can optionally be triggered when the wireless accessory is out of range of, or otherwise cannot be located by, any of the devices associated with the account or family of user accounts to which the wireless accessory is associated.
As shown in
Another selectable user interface element 907 can place the wireless accessory into an explicit lost mode. When explicitly placed into lost mode, the wireless accessory will be unable to be paired with other devices until the accessory is unlocked by the user or owner that places the device into lost mode. When sending a request to place a wireless accessory into lost mode, the requesting user can be required to enter authenticating information to ensure that the requesting user is authorized to request that lost mode be initiated on the lost accessory. The authenticating information can include a username or password associated with an account of a user, such as a cloud services account to which the user, electronic device, and wireless accessory are associated. The authenticating information can also include biometric information, such as a fingerprint or facial recognition data.
In one embodiment, a message and contact information provided by the requesting user can be displayed on a device of a person who finds the lost wireless accessory to indicate how that person can contact the requesting user. In one embodiment, the message and contact information can be displayed when another user attempts to pair another electronic device with the lost accessory.
Embodiments described herein include one or more application programming interfaces (APIs) in an environment in which calling program code interacts with other program code that is called through one or more programming interfaces. Various function calls, messages, or other types of invocations, which further may include various kinds of parameters, can be transferred via the APIs between the calling program and the code being called. In addition, an API may provide the calling program code the ability to use data types or classes defined in the API and implemented in the called program code.
An API allows a developer of an API-calling component (which may be a third-party developer) to leverage specified features provided by an API-implementing component. There may be one API-calling component or there may be more than one such component. An API can be a source code interface that a computer system or program library provides to support requests for services from an application. An operating system (OS) can have multiple APIs to allow applications running on the OS to call one or more of those APIs, and a service (such as a program library) can have multiple APIs to allow an application that uses the service to call one or more of those APIs. An API can be specified in terms of a programming language that can be interpreted or compiled when an application is built.
In some embodiments, the API-implementing component may provide more than one API, each providing a different view of, or with different aspects of, the functionality implemented by the API-implementing component. For example, one API of an API-implementing component can provide a first set of functions and can be exposed to third party developers, and another API of the API-implementing component can be hidden (not exposed) and provide a subset of the first set of functions and also provide another set of functions, such as testing or debugging functions which are not in the first set of functions. In other embodiments, the API-implementing component may itself call one or more other components via an underlying API and thus be both an API-calling component and an API-implementing component.
An API defines the language and parameters that API-calling components use when accessing and using specified features of the API-implementing component. For example, an API-calling component accesses the specified features of the API-implementing component through one or more API calls or invocations (embodied for example by function or method calls) exposed by the API and passes data and control information using parameters via the API calls or invocations. The API-implementing component may return a value through the API in response to an API call from an API-calling component. While the API defines the syntax and result of an API call (e.g., how to invoke the API call and what the API call does), the API may not reveal how the API call accomplishes the function specified by the API call. Various API calls are transferred via the one or more application programming interfaces between the calling (API-calling component) and an API-implementing component. Transferring the API calls may include issuing, initiating, invoking, calling, receiving, returning, or responding to the function calls or messages; in other words, transferring can describe actions by either of the API-calling component or the API-implementing component. The function calls or other invocations of the API may send or receive one or more parameters through a parameter list or other structure. A parameter can be a constant, key, data structure, object, object class, variable, data type, pointer, array, list or a pointer to a function or method or another way to reference a data or other item to be passed via the API.
Furthermore, data types or classes may be provided by the API and implemented by the API-implementing component. Thus, the API-calling component may declare variables, use pointers to, use or instantiate constant values of such types or classes by using definitions provided in the API.
Generally, an API can be used to access a service or data provided by the API-implementing component or to initiate performance of an operation or computation provided by the API-implementing component. By way of example, the API-implementing component and the API-calling component may each be any one of an operating system, a library, a device driver, an API, an application program, or other module (it should be understood that the API-implementing component and the API-calling component may be the same or different type of module from each other). API-implementing components may in some cases be embodied at least in part in firmware, microcode, or other hardware logic. In some embodiments, an API may allow a client program to use the services provided by a Software Development Kit (SDK) library. In other embodiments, an application or other client program may use an API provided by an Application Framework. In these embodiments, the application or client program may incorporate calls to functions or methods provided by the SDK and provided by the API or use data types or objects defined in the SDK and provided by the API. An Application Framework may in these embodiments provide a main event loop for a program that responds to various events defined by the Framework. The API allows the application to specify the events and the responses to the events using the Application Framework. In some implementations, an API call can report to an application the capabilities or state of a hardware device, including those related to aspects such as input capabilities and state, output capabilities and state, processing capability, power state, storage capacity and state, communications capability, etc., and the API may be implemented in part by firmware, microcode, or other low-level logic that executes in part on the hardware component.
The API-calling component may be a local component (i.e., on the same data processing system as the API-implementing component) or a remote component (i.e., on a different data processing system from the API-implementing component) that communicates with the API-implementing component through the API over a network. It should be understood that an API-implementing component may also act as an API-calling component (i.e., it may make API calls to an API exposed by a different API-implementing component) and an API-calling component may also act as an API-implementing component by implementing an API that is exposed to a different API-calling component.
The API may allow multiple API-calling components written in different programming languages to communicate with the API-implementing component (thus the API may include features for translating calls and returns between the API-implementing component and the API-calling component); however, the API may be implemented in terms of a specific programming language. An API-calling component can, in one embodiment, call APIs from different providers such as a set of APIs from an OS provider and another set of APIs from a plug-in provider and another set of APIs from another provider (e.g., the provider of a software library) or creator of the other set of APIs.
It will be appreciated that the API-implementing component 1110 may include additional functions, methods, classes, data structures, and/or other features that are not specified through the API 1120 and are not available to the API-calling component 1130. It should be understood that the API-calling component 1130 may be on the same system as the API-implementing component 1110 or may be located remotely and access the API-implementing component 1110 using the API 1120 over a network. While
The API-implementing component 1110, the API 1120, and the API-calling component 1130 may be stored in a machine-readable medium, which includes any mechanism for storing information in a form readable by a machine (e.g., a computer or other data processing system). For example, a machine-readable medium includes magnetic disks, optical disks, random-access memory, read-only memory, flash memory devices, etc.
The memory interface 1202 can be coupled to memory 1250, which can include high-speed random-access memory such as static random-access memory (SRAM) or dynamic random-access memory (DRAM) and/or non-volatile memory, such as but not limited to flash memory (e.g., NAND flash, NOR flash, etc.).
Sensors, devices, and subsystems can be coupled to the peripherals interface 1206 to facilitate multiple functionalities. For example, a set of sensors 1210 including a motion sensor 1211, a light sensor 1212, and a proximity sensor 1214 can be coupled to the peripherals interface 1206 to facilitate the mobile device functionality. One or more biometric sensor(s) 1215 may also be present, such as a fingerprint scanner for fingerprint recognition or an image sensor for facial recognition. Other sensors 1216 can also be connected to the peripherals interface 1206, such as a positioning system (e.g., GPS receiver), a temperature sensor, or other sensing device, to facilitate related functionalities.
The device architecture 1200 additionally includes an audio/video system 1220. A camera subsystem 1221 and an optical sensor 1222, e.g., a charged coupled device (CCD) or a complementary metal-oxide semiconductor (CMOS) optical sensor, can be utilized to facilitate camera functions, such as recording photographs and video clips. An audio subsystem 1226 can be coupled to a speaker 1228 and a microphone 1230 to facilitate voice-enabled functions, such as voice recognition, voice replication, digital recording, and telephony functions. In smart media devices described herein, the audio subsystem 1226 can be a high-quality audio system including support for virtual surround sound.
Communication functions can be facilitated through one or more wireless communication subsystems 1224, which can include radio frequency receivers and transmitters and/or optical (e.g., infrared) receivers and transmitters. The specific design and implementation of the wireless communication subsystems 1224 can depend on the communication network(s) over which a mobile device is intended to operate. For example, a mobile device including the illustrated device architecture 1200 can include wireless communication subsystems 1224 designed to operate over a GSM network, a CDMA network, an LTE network, a Wi-Fi network, a Bluetooth network, or any other wireless network. In particular, the wireless communication subsystems 1224 can provide a communications mechanism over which a media playback application can retrieve resources from a remote media server or scheduled events from a remote calendar or event server.
The I/O subsystem 1240 can include a touchscreen controller 1242 and/or other input controller(s) 1245. For computing devices including a display device, the touchscreen controller 1242 can be coupled to a touch sensitive display system 1246 (e.g., touchscreen). The touch sensitive display system 1246 and touchscreen controller 1242 can, for example, detect contact and movement and/or pressure using any of a plurality of touch and pressure sensing technologies, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with a touch sensitive display system 1246. Display output for the touch sensitive display system 1246 can be generated by a display controller 1243. In one embodiment, the display controller 1243 can provide frame data to the touch sensitive display system 1246 at a variable frame rate.
In one embodiment, a sensor controller 1244 is included to monitor, control, and/or process data received from one or more of the motion sensor 1211, light sensor 1212, proximity sensor 1214, or other sensors 1216. The sensor controller 1244 can include logic to interpret sensor data to determine the occurrence of one or more motion events or activities by analysis of the sensor data from the sensors.
In one embodiment, the I/O subsystem 1240 includes other input controller(s) 1245 that can be coupled to other input/control devices 1248, such as one or more buttons, rocker switches, thumb-wheel, infrared port, USB port, and/or a pointer device such as a stylus, or control devices such as an up/down button for volume control of the speaker 1228 and/or the microphone 1230.
In one embodiment, the memory 1250 coupled to the memory interface 1202 can store instructions for an operating system 1252, which can be a portable operating system interface (POSIX) compliant or non-compliant operating system, or an embedded operating system. The operating system 1252 may include instructions for handling basic system services and for performing hardware dependent tasks. In some implementations, the operating system 1252 can be a kernel.
The memory 1250 can also store communication instructions 1254 to facilitate communicating with one or more additional devices, one or more computers and/or one or more servers, for example, to retrieve web resources from remote web servers. The memory 1250 can also include user interface instructions 1256, including graphical user interface instructions to facilitate graphic user interface processing.
Additionally, the memory 1250 can store sensor processing instructions 1258 to facilitate sensor-related processing and functions; telephony instructions 1260 to facilitate telephone-related processes and functions; messaging instructions 1262 to facilitate electronic-messaging related processes and functions; web browser instructions 1264 to facilitate web browsing-related processes and functions; media processing instructions 1266 to facilitate media processing-related processes and functions; location services instructions including GPS and/or navigation instructions 1268 and Wi-Fi based location instructions to facilitate location based functionality; camera instructions 1270 to facilitate camera-related processes and functions; and/or other software instructions 1272 to facilitate other processes and functions, e.g., security processes and functions, and processes and functions related to the systems. The memory 1250 may also store other software instructions such as web video instructions to facilitate web video-related processes and functions; and/or web shopping instructions to facilitate web shopping-related processes and functions. In some implementations, the media processing instructions 1266 are divided into audio processing instructions and video processing instructions to facilitate audio processing-related processes and functions and video processing-related processes and functions, respectively. A mobile equipment identifier, such as an International Mobile Equipment Identity (IMEI) 1274 or a similar hardware identifier can also be stored in memory 1250.
Each of the above identified instructions and applications can correspond to a set of instructions for performing one or more functions described above. These instructions need not be implemented as separate software programs, procedures, or modules. The memory 1250 can include additional instructions or fewer instructions. Furthermore, various functions may be implemented in hardware and/or in software, including in one or more signal processing and/or application specific integrated circuits.
The computing system 1300 includes an interconnect 1335 (e.g., bus, fabric) to enable communication between components of the computing system 1300. One or more processor(s) 1310 can be coupled with the interconnect 1335. The computing system 1300 further may include memory 1320 in the form of random-access memory (RAM) or other dynamic storage device coupled to the interconnect 1335. The memory 1320 may store information and instructions that may be executed by processor(s) 1310. The memory 1320 may also be main memory that is used to store temporary variables or other intermediate information during execution of instructions by the processor(s) 1310.
The computing system 1300 may also include read only memory (ROM) 1330 and/or another data storage device 1340 coupled to the interconnect 1335 that may store information and instructions for the processor(s) 1310. The data storage device 1340 can be or include a variety of storage devices, such as a flash memory device, a magnetic disk, or an optical disc and may be coupled to computing system 1300 via the interconnect 1335 or via a remote peripheral interface.
The computing system 1300 may also be coupled, via the interconnect 1335, to a display device 1350 to display information to a user. The computing system 1300 can also include an alphanumeric input device 1360, including alphanumeric and other keys, which may be coupled to interconnect 1335 to communicate information and command selections to processor(s) 1310. Another type of user input device includes a cursor control 1370 device, such as a touchpad, a mouse, a trackball, or cursor direction keys to communicate direction information and command selections to processor(s) 1310 and to control cursor movement on the display device 1350. The computing system 1300 may also receive user input from a remote device that is communicatively coupled via one or more network interface(s) 1380.
The computing system 1300 further may include one or more network interface(s) 1380 to provide access to a network, such as a local area network. The network interface(s) 1380 may include, for example, a wireless network interface having antenna 1385, which may represent one or more antenna(e). The computing system 1300 can include multiple wireless network interfaces such as a combination of Wi-Fi, Bluetooth®, near field communication (NFC), and/or cellular telephony interfaces. The network interface(s) 1380 may also include, for example, a wired network interface to communicate with remote devices via network cable 1387, which may be, for example, an Ethernet cable, a coaxial cable, a fiber optic cable, a serial cable, or a parallel cable.
In one embodiment, the network interface(s) 1380 may provide access to a local area network, for example, by conforming to IEEE 802.11 wireless standards, and/or the wireless network interface may provide access to a personal area network, for example, by conforming to Bluetooth standards. Other wireless network interfaces and/or protocols can also be supported. In addition to, or instead of, communication via wireless LAN standards, network interface(s) 1380 may provide wireless communications using, for example, Time Division Multiple Access (TDMA) protocols, Global System for Mobile Communications (GSM) protocols, Code Division Multiple Access (CDMA) protocols, Long Term Evolution (LTE) protocols, and/or any other type of wireless communications protocol.
The computing system 1300 can further include one or more power sources 1305 and one or more energy measurement systems 1345. Power sources 1305 can include an AC/DC adapter coupled to an external power source, one or more batteries, one or more charge storage devices, a USB charger, or other power source. Energy measurement systems include at least one voltage or amperage measuring device that can measure energy consumed by the computing system 1300 during a predetermined period of time. Additionally, one or more energy measurement systems can be included that measure, e.g., energy consumed by a display device, cooling subsystem, Wi-Fi subsystem, or other frequently used or high-energy consumption subsystem.
Proximity Enhancements for a Location Query Application
In embodiments described herein, a wireless communication device such as a smartphone, smart watch, or tablet computer device can use one or more measurements, such as but not limited to wireless ranging or radio signal-based distance measurements, to locate a proximate object in an environment. The proximate object can be a physical device, such as an electronic device or an electronic peripheral, which can be located via a virtual representation of the device that can be presented via an augmented reality display. The wireless communication device may implement a user-interface technique in which an identifier associated with the object is received wirelessly and one or more measurements are used to wirelessly determine a range of the object from the wireless communication device and/or a direction to the object. The wireless communication device may present information that indicates the range and/or the direction to the proximate object. For example, the wireless communication device may display a map of a proximate area in the environment with an indicator representative of the object on the map. Alternatively, the wireless communication device may display an image of the proximate area (e.g., as captured by a camera of the wireless communication device) with the indicator representative of the object on the image. This user-interface technique can remove the constraints associated with many existing user interfaces and improve the user experience when using the wireless communication device, and thus may increase customer satisfaction and retention.
The devices can communicate wirelessly via wireless communication signals 1405 by detecting one another by scanning wireless channels, transmitting and receiving beacons or beacon frames on wireless channels, establishing connections (for example, by transmitting connect requests), and/or transmitting and receiving packets or frames (which may include the request and/or additional information, such as data, as payloads). The wireless communication signals 1405 can be carrier signals that conform to wireless communication technologies such as, but not limited to Wi-Fi or Bluetooth. In addition to wireless communication, the mobile device 1402 and the wireless device 1420 perform wireless ranging operations using wireless ranging signals 1406. The wireless ranging signals can be, for example, ultra-wideband signals that can be used to determine a distance and/or angle between the wireless device 1420 and the mobile device 1402 using techniques described herein. In one embodiment, data provided by the wireless ranging signals 1406 can be correlated with other metrics, such as an RSSI of the wireless communication signals 1405. In one embodiment, a communications processor of the mobile device 1402 and the wireless device 1420 can fuse multiple types of ranging to provide a unified distance and/or angle estimate based on multiple types of radio data.
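As one illustration of correlating wireless ranging data with an RSSI metric, the sketch below converts an RSSI sample into a coarse distance using a log-distance path-loss model and blends it with a UWB range via inverse-variance weighting. This is a minimal sketch, not the fusion method of this disclosure; the constants `rssi_at_1m`, `path_loss_exponent`, and the variance values are hypothetical tuning parameters.

```python
import math

def rssi_to_distance(rssi_dbm, rssi_at_1m=-60.0, path_loss_exponent=2.0):
    """Coarse distance (meters) from RSSI via a log-distance path-loss model."""
    return 10.0 ** ((rssi_at_1m - rssi_dbm) / (10.0 * path_loss_exponent))

def fuse_ranges(uwb_range_m, rssi_range_m, uwb_var=0.05, rssi_var=4.0):
    """Inverse-variance weighted blend of a UWB range and an RSSI-derived range."""
    w_uwb, w_rssi = 1.0 / uwb_var, 1.0 / rssi_var
    return (w_uwb * uwb_range_m + w_rssi * rssi_range_m) / (w_uwb + w_rssi)

if __name__ == "__main__":
    rssi_range = rssi_to_distance(-72.0)  # ~4 m under the assumed model
    print(fuse_ranges(uwb_range_m=3.4, rssi_range_m=rssi_range))
```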
The mobile device 1402 can provide a device locator UI 1404 that presents a map and/or an augmented reality view of the location environment 1408. The map and/or augmented reality view can present a virtual representation of the wireless device 1420. The virtual representation can streamline the process of locating the wireless device 1420 by a user of the mobile device 1402. For example, using a direction indicator that points to a detected location of the wireless device 1420 can simplify the process of finding the approximate location of the wireless device 1420. The direction indicator can be paired with a map of the location environment 1408, where the map can include the virtual representation of the wireless device 1420. When the mobile device 1402 enters proximity with the wireless device, an augmented reality view of the location environment 1408 can be presented, which can allow the user to identify the wireless device 1420 when the wireless device is hidden from view.
In one scenario the wireless device 1420 may be concealed by an item 1418. The item 1418 may be, for example, a jacket or another article of clothing, and the wireless device 1420 may be a smartphone or tablet device that is in a pocket of the item 1418. The item 1418 and the wireless device 1420 may each be within a container 1410, such as a backpack, storage chest, item of luggage, or another item that may reduce the efficiency of audio-based location techniques that rely on the wireless device 1420 to play a sound that can be heard by a user.
In another scenario, the wireless device 1420 may be a beacon peripheral or locator tag that is attached to the item 1418. To find the item 1418, the user can use the device locator UI 1404 on the mobile device 1402 to find the wireless device 1420, enabling the user to locate the item 1418 to which the wireless device 1420 is attached. The proximity enhancements described herein can be used to augment the device locator UI 1404, allowing the user to quickly determine the location of the item 1418, via the wireless device 1420, in scenarios where the item may be concealed by a container 1410 or another physical item.
In one embodiment the mobile device 1402 can establish a secure wireless communication connection (e.g., a Bluetooth connection) with the wireless device 1420 (e.g., via wireless communication signals 1405) and command the wireless device 1420 to begin a wireless ranging process, for example, using the wireless ranging signals 1406, which can be UWB signals. The wireless ranging signals 1406 enable the mobile device 1402 to receive measurements of range and angles to the wireless device 1420.
In embodiments described herein, wireless ranging can be performed using any standard or proprietary ranging technique, or any combination of standard and/or proprietary ranging techniques. A wireless ranging operation can be performed to determine a distance between devices (e.g., between an initiator and a responder), a direction between devices, or both. For example, a Time of Flight/Time of Arrival (ToF/ToA) measurement can be determined for one or more messages between the devices, which can be used to establish a measure of distance. The one or more messages can have any format and can be transmitted using any wireless protocol. In some embodiments, ToF/ToA can be determined using a bi-directional exchange of two or more messages. In some embodiments, one or more messages used to perform ranging can be secured, e.g., by encrypting or otherwise protecting at least a portion of the content. Further, in some embodiments, the direction of the source of one or more wireless signals can be determined using a technique such as Angle of Arrival (AoA). For example, AoA estimation can be performed using multiple receive elements (e.g., elements of an antenna array) to measure the time difference of arrival (TDOA) and/or phase difference of arrival (PDOA) of a signal. Additionally, or alternatively, in some embodiments, directionality can be determined by measuring Doppler shifts to establish a frequency difference of arrival (FDOA). Wireless ranging techniques can be applied individually or in combination to perform a single ranging operation. Further, wireless ranging techniques can be applied individually or in combination to perform on-going ranging operations, such as continuous or intermittent ranging, and a history of measurements can be captured and used in performing operations based on range and/or direction.
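The following sketch illustrates the two basic calculations named above: a distance derived from a two-way (bi-directional) time-of-flight exchange, and an angle of arrival derived from the phase difference measured across a two-element antenna array. The timing values, antenna spacing, and wavelength are hypothetical; a real implementation would also calibrate out fixed processing delays.

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(t_round_s, t_reply_s):
    """Distance from a bi-directional ranging exchange.

    t_round_s: initiator-measured time from poll TX to response RX.
    t_reply_s: responder turnaround time reported back to the initiator.
    """
    return SPEED_OF_LIGHT * (t_round_s - t_reply_s) / 2.0

def aoa_from_phase(delta_phi_rad, antenna_spacing_m, wavelength_m):
    """Angle of arrival (radians) from a phase difference of arrival (PDOA)."""
    s = delta_phi_rad * wavelength_m / (2.0 * math.pi * antenna_spacing_m)
    return math.asin(max(-1.0, min(1.0, s)))  # clamp against measurement noise

if __name__ == "__main__":
    print(tof_distance(t_round_s=230e-9, t_reply_s=200e-9))   # ~4.5 m
    print(math.degrees(aoa_from_phase(0.8, 0.02, 0.0375)))    # ~13.8 degrees
```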
The incoming measurements can be processed to reduce measurement noise and stabilize the estimated location of the lost item. In one embodiment, the mobile device 1402 can use a sensor fusion technique that relies upon multiple streams of measurements to arrive at a best estimate of device location. For example, incoming range and/or angle measurements can be fused with position and orientation measurements of the mobile device 1402. The position and orientation measurements can be gathered using visual-inertial odometry (VIO), which determines the device location and orientation using an IMU and computer vision using camera data. The VIO data enables software on the mobile device 1402 to be aware of the movement and orientation of the device. Combining the VIO data with the range and/or angle measurements determined via the wireless ranging signals 1406 enables estimation of the most likely position of the wireless device and allows the user to be pointed towards the device. VIO data may be particularly useful in scenarios where one or more of the range or angle measurements may be inaccurate due to multipath signal propagation effects.
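As a minimal sketch of fusing a wireless range/angle measurement with a VIO pose, the code below projects a range and azimuth measured in the device frame into the VIO world frame using the device position and orientation. The rotation-matrix pose representation and the zero-elevation assumption are simplifications for illustration, not the fusion filter described in this disclosure.

```python
import numpy as np

def target_in_world(range_m, azimuth_rad, device_pos_w, device_rot_w):
    """Project a device-frame range/azimuth measurement into world coordinates.

    device_pos_w: (3,) device position in the VIO world frame.
    device_rot_w: (3, 3) rotation from the device frame to the world frame.
    """
    # Device-frame ray: x forward, y left, z up; elevation assumed zero.
    direction_dev = np.array([np.cos(azimuth_rad), np.sin(azimuth_rad), 0.0])
    return device_pos_w + device_rot_w @ (range_m * direction_dev)

if __name__ == "__main__":
    pose_pos = np.array([1.0, 2.0, 1.4])
    pose_rot = np.eye(3)  # identity orientation for the example
    print(target_in_world(3.5, np.radians(20.0), pose_pos, pose_rot))
```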
The algorithm for determining the position of a lost item can be accessed via the device locator UI 1404. When a user intends to locate an item, the user can select the item from a list of registered devices and select a “find” button that is presented via the device locator UI 1404. In various embodiments, multiple user interfaces may be presented. Some interfaces resemble a two-dimensional compass-like view with a two-dimensional arrow guiding the user to the target item or device to be found. Other interfaces are three-dimensional augmented reality views with virtual objects, including a three-dimensional arrow, that can guide a user to and highlight the lost item.
The indicator 1516 can be based on a last observed location of the device associated with the indicator 1516. The last observed location can be a location observed by the mobile device 1402 or another device of the user account associated with the mobile device 1402. In one embodiment the location of the indicator 1516 can be initially based on a location provided by a device locator server 203 in response to a location query performed by the location query application. As shown in
As with the finder devices 202 in
As shown in
As shown in
If during direction determination the motion of the target wireless device exceeds a threshold, the specific direction to the target wireless device may be difficult to determine based on instantaneous wireless ranging data. This difficulty may be of particular concern if angle of arrival measurements are inaccurate and cannot be relied upon. In such a scenario, the direction to the target wireless device can be determined using a series of range measurements. If, while gathering the range measurements to the target device, the motion of the target device exceeds a threshold, a view of the device locator UI 1404 can change to the view presented in
As shown in
If while gathering the range measurements to the target device, the motion of both the target device and the mobile device 1402 exceeds a threshold, a view of the device locator UI 1404 can change to the view presented in
As shown in
As shown in
In one embodiment, the direction and range view of
As shown in
The host operating system 1702 can include a location query application 1710 and a nearby device daemon 1720. The location query application 1710 can present the device locator UI 1404 described herein. The location query application 1710 can include libraries and resources associated with a sensor fusion framework 1711 and a proximity framework 1712. The proximity framework 1712 presents program interfaces that enable the location query application to communicate with a nearby device daemon 1720 via inter-process communication (IPC 1715) messages. The nearby device daemon 1720 includes a controller framework 1722. The controller framework enables the nearby device daemon 1720 to communicate with the wireless controller 1730 on the device via a real-time operating system (RTOS) firmware interface 1725. The sensor fusion framework 1711 enables sensor data received from the wireless controller to be fused with augmented reality sensor data.
In one embodiment the wireless controller 1730 can execute a separate real-time operating system (RTOS 1732) on one or more processors 1736 of the controller. The one or more processors 1736 of the wireless controller 1730 are separate from the application processor that executes the host operating system 1702. In one embodiment the one or more processors 1736 include secure memory, cryptographic accelerators, and other components that enable the wireless controller 1730 to perform secure ranging operations using a ranging sensor 1734. For example, the one or more processors 1736 can include a secure processor, such as a secure element processor (SEP).
The ranging sensor 1734 can include an ultra-wideband transceiver that can transmit and receive ultra-wideband radio signals. In one embodiment the wireless controller also includes a wireless communications radio 1738 that enables the wireless controller to facilitate communication with other devices using one or more wireless communication or networking protocols, such as Wi-Fi, Bluetooth, or NFC. Metrics associated with signals received by the wireless communications radio 1738 can also be used as one aspect of the ranging process.
The interpolator logic 1804 and estimator logic 1806 can include various algorithmic techniques to overcome anomalous data and sensor interference to enhance the precision of the three-dimensional target position estimate 1807. In one embodiment, the system 1800 can fuse inertial data gathered from the IMU 1812 with images captured via one or more cameras to perform visual inertial odometry (VIO). VIO techniques include the analysis of a sequence of images to estimate camera motion over the sequence of images. Using image-based camera motion estimation and inertial measurements captured over the same period of time, motion of the mobile device through a coordinate system can be estimated, providing software of the system 1800 the ability to estimate both the position of the finder mobile device and the wireless device to be located within a three-dimensional coordinate space.
In one embodiment, the target location estimate can take the form of a nonlinear least-squares problem that can be solved using a nonlinear least-squares (NLLS) batch filter. For example, given r_i as a range measurement at time t_i, and p(t_i) as a pose estimate relative to the center of the VIO coordinate system at the same time, a target location x = [x y z]^T can be determined that minimizes the cost function:
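The equation itself is not reproduced above. A standard range-only nonlinear least-squares form consistent with the definitions given (a sketch, assuming per-measurement weights w_i as discussed in the de-weighting sections below) is:

$$ C(x) \;=\; \sum_{i} w_i \left( r_i - \left\lVert x - p(t_i) \right\rVert \right)^2 $$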
The specific pose estimate used may be based on or relative to the position of the wireless ranging radio antenna on the mobile device. Alternatively, a Kalman Filter or spatial occupancy grid-based approach may be applied.
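A minimal sketch of solving that batch problem with an off-the-shelf nonlinear least-squares routine is shown below. The pose values, ranges, and weights are hypothetical, and the residual form assumes the standard range-only cost stated above rather than any particular implementation of this disclosure.

```python
import numpy as np
from scipy.optimize import least_squares

def solve_target_position(ranges, poses, weights, x0=None):
    """Batch NLLS estimate of a target position from range measurements.

    ranges:  (N,) measured ranges r_i in meters.
    poses:   (N, 3) device antenna positions p(t_i) in the VIO frame.
    weights: (N,) non-negative weights w_i.
    """
    ranges, poses, weights = map(np.asarray, (ranges, poses, weights))
    if x0 is None:
        x0 = poses.mean(axis=0)  # crude initial guess at the trajectory centroid

    def residuals(x):
        predicted = np.linalg.norm(x - poses, axis=1)
        return np.sqrt(weights) * (ranges - predicted)

    return least_squares(residuals, x0).x

if __name__ == "__main__":
    poses = np.array([[0, 0, 0], [1, 0, 0.2], [1, 1, 0], [0, 1, 0.4]], dtype=float)
    target = np.array([2.0, 2.0, 0.5])
    ranges = np.linalg.norm(target - poses, axis=1)  # synthetic, noise-free ranges
    print(solve_target_position(ranges, poses, np.ones(len(ranges))))
```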
In one embodiment, once an initial target position for a wireless device has been determined within the three-dimensional coordinate space, the relative target position of the wireless device can be updated using augmented reality data even in the absence of updated range measurements between the wireless device and the mobile device. In such a scenario, the frequency of wireless ranging operations may be throttled to reduce power consumption by the wireless device and the mobile device.
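A sketch of updating the relative target position from new pose data alone, once a world-frame estimate exists, is shown below. It assumes the target is stationary between ranging updates; the pose values are hypothetical.

```python
import numpy as np

def relative_target(target_pos_w, device_pos_w, device_rot_w):
    """Target position expressed in the current device frame.

    Uses only the latest VIO pose; no new ranging measurement is required
    while the world-frame target estimate remains valid.
    """
    return device_rot_w.T @ (target_pos_w - device_pos_w)

if __name__ == "__main__":
    target_w = np.array([3.0, 1.0, 0.2])
    new_pose_pos = np.array([1.0, 0.5, 0.0])
    new_pose_rot = np.eye(3)
    print(relative_target(target_w, new_pose_pos, new_pose_rot))
```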
Ranging Algorithm Enhancements
Performance and accuracy of the distance, bearing, and range calculations can be enhanced using various algorithmic approaches. In one scenario, as the device traces out a trajectory through space towards a target, the device may encounter multipath signal propagation effects when ranging to the target. Multipath propagation is a phenomenon that results in radio signals reaching a receiving antenna by two or more paths. Multipath propagation can occur if the line of sight to the target is obscured by obstacles in the environment, the antenna pattern severely attenuates the line of sight signal, or the users themselves block the signal. As the mobile device traces a trajectory through space towards the target, the measured range to the target can evolve over time, along with the distance traveled over the trajectory. During this evolution, anomalies and interference may arise.
At times the measured range determined from the sensor data may appear to jump more than the distance traveled through space. A sudden jump in the measured range that differs from the distance traveled over a period of time may be used to determine when the multipath conditions have changed. When the change in range is positive and larger than the distance traveled, the bias on the measurement due to multipath may have increased. Similarly, when the change in range is negative and larger in magnitude than the distance traveled, a multipath component likely existed before and has reduced in effect. When multipath propagation is detected to have decreased, the uncertainty of previous measurements can be increased to account for the potential multipath interference. In one embodiment, previous measurements that have been deemed as multipath measurements may be removed from the measurement history by setting the weights associated with those measurements to zero. When multipath interference is detected to have increased, the measurements received after the detected increase can similarly be de-weighted or those measurements can be rejected.
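A minimal sketch of the jump test described above, comparing the change in measured range against the distance the device actually traveled over the same interval, is shown below. The margin value is a hypothetical tuning parameter.

```python
def classify_multipath_change(range_prev_m, range_new_m, distance_traveled_m,
                              margin_m=0.3):
    """Detect a likely change in multipath conditions between two range samples.

    Returns whether the multipath bias has "increased", "decreased", or is
    "unchanged" relative to the previous sample.
    """
    delta_range = range_new_m - range_prev_m
    if delta_range > distance_traveled_m + margin_m:
        return "increased"   # de-weight or reject measurements after this point
    if -delta_range > distance_traveled_m + margin_m:
        return "decreased"   # inflate uncertainty / zero weights of earlier samples
    return "unchanged"

if __name__ == "__main__":
    print(classify_multipath_change(5.2, 7.1, 0.4))  # "increased"
```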
In one embodiment range measurements can be de-weighted due to the passage of time. As range measurements are gathered over the device's trajectory, the solution for the target's position becomes overdetermined. Range measurements can be de-weighted as the measurements age and/or as VIO errors accumulate. In the case of a moving target, analysis of range measurements as a function of time may be particularly important, as a time series of range measurements can be used to determine an angle measurement to the device without relying on angle measurements associated with any one ranging operation. Thus, the weights associated with older measurements may be decayed based on a time function.
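A sketch of decaying measurement weights with age is shown below, using an exponential time constant as one possible decay function; the time constant is a hypothetical choice, not a value specified by this disclosure.

```python
import math

def age_decayed_weight(base_weight, measurement_time_s, now_s, tau_s=5.0):
    """Exponentially de-weight a range measurement as it ages."""
    age_s = max(0.0, now_s - measurement_time_s)
    return base_weight * math.exp(-age_s / tau_s)

if __name__ == "__main__":
    print(age_decayed_weight(1.0, measurement_time_s=0.0, now_s=10.0))  # ~0.135
```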
In one embodiment, range measurements can be de-weighted due to the measured and/or estimated distance to the target. As measurements indicate farther ranges, the possibility of multipath interference increases, and the measurements become less useful. As a solution is formed, the estimated distance to each position where a measurement was formed can be calculated. Based on that distance, the measurement can be reweighted. Additionally, the measurement itself is a range, and can be used to infer a measurement uncertainty.
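A sketch of re-weighting by distance is shown below, where the larger of the measured range and the solution-derived distance reduces the weight, and the measured range also drives an inferred uncertainty. The reference range, base sigma, and slope are hypothetical scale factors.

```python
def range_based_weight(measured_range_m, est_distance_m, ref_range_m=3.0):
    """De-weight measurements made (or estimated to be) far from the target."""
    effective = max(measured_range_m, est_distance_m)
    return min(1.0, (ref_range_m / max(effective, ref_range_m)) ** 2)

def range_inferred_sigma(measured_range_m, base_sigma_m=0.1, slope=0.05):
    """Measurement uncertainty (std dev, meters) inferred from the range itself."""
    return base_sigma_m + slope * measured_range_m

if __name__ == "__main__":
    print(range_based_weight(8.0, 7.5))   # ~0.14
    print(range_inferred_sigma(8.0))      # 0.5
```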
In one embodiment, angle of arrival measurements can be used to further refine the target position when a range-only solution indicates the target is in the device's field of view. A position solution for the device can be formed using ranging measurements only (e.g., without angle measurements). Given the position solution, probabilities can be calculated that each of the angle measurements was made from the mobile device when the target was in the device's field of view. If the probability meets a threshold, the angle measurement can be used to form a new position solution. Use of the angle measurement can resolve ambiguities that may be present in the range solution, especially in the up direction, due to the device trajectory mostly being in a single plane.
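A sketch of the field-of-view gate described above follows: given a range-only position solution, an angle-of-arrival measurement is accepted only when that solution places the target within the device's assumed field of view at the time of the measurement. The boresight axis and half-angle are hypothetical assumptions.

```python
import numpy as np

def target_in_fov(target_pos_w, device_pos_w, device_rot_w, half_fov_deg=30.0):
    """True if the range-only target solution lies within the device field of view.

    The device boresight is assumed to be the +x axis of the device frame.
    """
    to_target_dev = device_rot_w.T @ (target_pos_w - device_pos_w)
    norm = np.linalg.norm(to_target_dev)
    if norm == 0.0:
        return True
    angle_deg = np.degrees(np.arccos(np.clip(to_target_dev[0] / norm, -1.0, 1.0)))
    return angle_deg <= half_fov_deg

if __name__ == "__main__":
    print(target_in_fov(np.array([2.0, 0.5, 0.0]),
                        np.array([0.0, 0.0, 0.0]), np.eye(3)))  # True (~14 degrees)
```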
After measurements are used to form the position solution, the measurement errors of individual contributing measurements can be calculated. In an overdetermined solution, the measurement errors will be non-zero and give some indication of the quality of the position fix. If large errors are seen, the confidence in the position fix can be decreased. A threshold on the confidence can be used to determine if the position should be shown to the user. If all measurements are seen to disagree by small amounts, the errors can be used to increase the uncertainty of the position.
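A sketch of turning the residuals of an over-determined fit into a display decision is shown below; the RMS residual threshold and the sample values are hypothetical.

```python
import numpy as np

def fit_confidence(ranges, poses, solution, max_rms_residual_m=0.5):
    """Return (rms_residual, show_to_user) for an over-determined position fit."""
    ranges = np.asarray(ranges, dtype=float)
    poses = np.asarray(poses, dtype=float)
    predicted = np.linalg.norm(np.asarray(solution, dtype=float) - poses, axis=1)
    rms = float(np.sqrt(np.mean((ranges - predicted) ** 2)))
    return rms, rms <= max_rms_residual_m

if __name__ == "__main__":
    poses = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [1, 1, 0]], dtype=float)
    ranges = [2.9, 2.3, 2.4, 1.7]
    print(fit_confidence(ranges, poses, solution=[2.0, 2.0, 0.0]))
```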
In one embodiment, method 1900 includes operations to receive, at a wireless communication device, a geographic location of a target wireless device (block 1901). The geographic location for the target wireless device can be a location determined by the wireless communication device, a device that shares a user account of the wireless communication device, or a location received from a device locator server.
The wireless communication device can then present an indicator for the target wireless device on an interface of a location query application that is executed by the wireless communication device (block 1902). The indicator can be a map indicator that shows a location of the target wireless device on a map of the nearby environment. A distance and range measurement to the target wireless device can be presented on the map, along with the map indicator that shows the location of the target wireless device. The user of the wireless communication device can then move towards the location of the target wireless device.
When the wireless communication device is within a threshold range of the target wireless device, the wireless communication device can determine a range and/or direction to the target wireless device using one or more wireless ranging operations (block 1903). The one or more ranging operations can include continuous or periodic ultra-wideband ranging operations. The ultra-wideband ranging operations can be performed in conjunction with other range determination techniques, such as RSSI-based distance determinations. In one embodiment the ultra-wideband ranging operations are secure ranging operations in which the ranging packets exchanged are encrypted. In one embodiment, both range and direction determinations can be made based on the one or more ranging operations. In one embodiment, range-only determinations can be made and the direction is determined based on analysis of multiple range measurements. Where both range and angle measurements can be determined based on instantaneous wireless ranging signal analysis, a history of range measurements can also be used to increase the accuracy of the determined angle measurements.
The wireless communication device can then determine a pose of the wireless communication device via sensor data received from an inertial measurement unit within the wireless communication device (block 1904). The inertial measurement unit data can be processed by the augmented reality system to determine the pose of the wireless communication device. The pose data can be used to further enhance the direction determinations made for the target wireless device based on wireless ranging sensors.
The wireless communication device can then determine a position estimate of the target wireless device based on the range, direction, and pose (block 1905). In one embodiment the specific range, direction, and pose determined for the target wireless device and the wireless communication device are determined based on a fusion of sensor data from the wireless ranging system and the augmented reality system of the wireless communication device. The position estimate may be a position relative to the wireless communication device or an absolute position within a 3D coordinate space. The wireless communication device can then generate and display a device locator UI that includes the position estimate of the wireless device (block 1906). The device locator UI can be a directional view, for example, as shown in
Method 2000 includes operations to establish, at a target wireless device, a wireless ranging session with a wireless communication device (block 2001). In response to establishing the wireless ranging session, the target wireless device can enable a motion sensor on the target wireless device (block 2002). The target wireless device can then determine, via the motion sensor, if motion of the target wireless device exceeds a motion threshold (block 2003). If the motion is determined to be over the motion threshold (block 2004, “yes”), the target wireless device can signal the wireless communication device that the target device motion has exceeded the motion threshold (block 2005). Otherwise, the target wireless device can continue to monitor its motion via the motion sensor. The target wireless device can monitor its motion via the motion sensor by periodically polling sensor data from the motion sensor. The target wireless device may also enable the motion sensor to trigger an interrupt to a sensor processor when
In one embodiment the target wireless device determines motion state via an accelerometer within the motion sensor, for example, by tracking changes in device orientation. The accelerometer can be an inertial frame sensor that detects changes in device acceleration, including acceleration resulting from gravity. Movement of the target wireless device will result in at least slight changes in the orientation of the target wireless device, which can be detected via the accelerometer. Data from other types of sensors, such as gyroscopes or magnetometers may also be used to determine device motion, either as alternatives to the accelerometer or in fusion with the accelerometer.
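A sketch of the accelerometer-based motion check described above follows: motion is flagged when the acceleration magnitude deviates from gravity by more than a threshold. The threshold and the sample readings are illustrative assumptions.

```python
import math

GRAVITY_MS2 = 9.81  # assumed gravity magnitude

def exceeds_motion_threshold(accel_samples, threshold_ms2=0.5):
    """True if any accelerometer sample deviates from 1 g by more than the threshold.

    accel_samples: iterable of (ax, ay, az) readings in m/s^2, gravity included.
    """
    for ax, ay, az in accel_samples:
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        if abs(magnitude - GRAVITY_MS2) > threshold_ms2:
            return True
    return False

if __name__ == "__main__":
    stationary = [(0.05, -0.02, 9.80), (0.01, 0.03, 9.82)]
    shaken = [(1.2, 0.4, 11.5)]
    print(exceeds_motion_threshold(stationary), exceeds_motion_threshold(shaken))
```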
In one embodiment the target wireless device can signal the wireless communication device that the target device motion has exceeded the motion threshold via, for example, a Bluetooth message. The message can be transmitted over an active Bluetooth link between the target wireless device and the wireless communication device, or via another Bluetooth signaling method, such as, for example, status bits that are included within a Bluetooth wireless advertisement. Other wireless signaling methods may also be used.
Method 2100 includes operations to establish, at a wireless communication device, a wireless ranging session with a target wireless device (block 2101). The wireless communication device can then determine a motion state for the target wireless device and the wireless communication device (block 2102). The wireless communication device can determine the motion state for the target wireless device via wireless data received from the target wireless device. The wireless communication device can determine the motion state for the wireless communication device via sensor data processed by a sensor processor (e.g., sensor controller 1244 as in
If the motion of both devices is not over the motion threshold (block 2103, “no”), the wireless communication device can determine a UI view to display based on the motion state of the target wireless device. If the target device motion is over the motion threshold (block 2104, “yes”), the wireless communication device can display a message that the target wireless device is moving and that a new location is being determined (block 2106). The wireless communication device can then continue to monitor the motion states (block 2102).
If the motion of the target wireless device is below the motion threshold (block 2104, “no”), and when the motion of the wireless communication device has been determined to be below the motion threshold (block 2103, “no”), the wireless communication device can determine a position estimate of the target wireless device relative to the wireless communication device and display a location and/or direction for the target wireless device in the device locator UI as described herein (block 2107).
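A sketch of the view-selection logic of method 2100, expressed as a simple decision function, is shown below; the view identifiers are hypothetical names rather than labels defined by this disclosure.

```python
def select_locator_view(device_moving: bool, target_moving: bool) -> str:
    """Choose which device locator UI view to present based on motion states."""
    if device_moving and target_moving:
        return "message_less_movement_required"            # both over threshold
    if target_moving:
        return "message_target_moving_new_location_pending"  # target over threshold
    return "direction_and_range_view"                       # position can be shown

if __name__ == "__main__":
    print(select_locator_view(device_moving=False, target_moving=False))
```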
Reference herein to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in conjunction with the embodiment can be included in at least one embodiment of the invention. The appearances of the phrase “in one embodiment” in various places in the specification do not necessarily all refer to the same embodiment. The processes depicted in the figures that follow can be performed by processing logic that comprises hardware (e.g. circuitry, dedicated logic, etc.), software (as instructions on a non-transitory machine-readable storage medium), or a combination of both hardware and software. Reference will be made in detail to various embodiments, examples of which are illustrated in the accompanying drawings. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. However, it will be apparent to one of ordinary skill in the art that the present invention may be practiced without these specific details. In other instances, well-known methods, procedures, components, circuits, and networks have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.
It will also be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first contact could be termed a second contact, and, similarly, a second contact could be termed a first contact, without departing from the scope of the present invention. The first contact and the second contact are both contacts, but they are not the same contact.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting as to all embodiments. As used in the description of the invention and the appended claims, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term “and/or” as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
As used herein, the term “if” may be construed to mean “when” or “upon” or “in response to determining” or “in response to detecting,” depending on the context. Similarly, the phrase “if it is determined” or “if [a stated condition or event] is detected” may be construed to mean “upon determining” or “in response to determining” or “upon detecting [the stated condition or event]” or “in response to detecting [the stated condition or event],” depending on the context.
Computing devices, user interfaces for such devices, and associated processes for using such devices are described herein. In some embodiments, the computing device is a portable communications device such as a mobile telephone that also contains other functions, such as PDA and/or music player functions. In the description and figures of this application, where a wireless device, wireless accessory, or wireless accessory device is described or illustrated, unless stated otherwise the described or illustrated attributes can generally be applied to any type of wireless device, wireless accessory, or wireless accessory device that is capable of broadcasting a wireless beacon.
One embodiment provides for a non-transitory machine-readable medium storing instructions to cause one or more processors of a wireless communication device to perform operations comprising determining a range and direction to a target wireless device using one or more wireless ranging operations when within a threshold range of the target wireless device; determining a pose of the wireless communication device via sensor data received from an inertial measurement unit within the wireless communication device; determining a target position estimate of the target wireless device relative to the wireless communication device based on the range, direction, and pose; generating an augmented reality view that includes the target position estimate of the wireless device; and presenting the augmented reality view via a display of the wireless communication device. The augmented reality view can include video captured via a camera of the wireless communication device. The target position estimate can be positioned within the video at a location determined for the target wireless device.
In the foregoing description, example embodiments of the disclosure have been described. It will be evident that various modifications can be made thereto without departing from the broader spirit and scope of the disclosure. The specification and drawings are, accordingly, to be regarded in an illustrative sense rather than a restrictive sense. The specifics in the descriptions and examples provided may be used anywhere in one or more embodiments. The various features of the different embodiments or examples may be variously combined with some features included and others excluded to suit a variety of different applications. Examples may include subject matter such as a method, means for performing acts of the method, at least one machine-readable medium including instructions that, when performed by a machine cause the machine to perform acts of the method, or of an apparatus or system according to embodiments and examples described herein. Additionally, various components described herein can be a means for performing the operations or functions described herein.
One embodiment provides for a non-transitory machine-readable medium storing instructions to cause one or more processors of a wireless communication device to perform operations comprising determining a range and direction to a target wireless device using one or more wireless ranging operations when within a threshold range of the target wireless device, determining a pose of the wireless communication device via sensor data received from an inertial measurement unit within the wireless communication device, determining a target position estimate of the target wireless device relative to the wireless communication device based on the range, direction, and pose, generating an augmented reality view that includes the target position estimate of the wireless device, and presenting the augmented reality view via a display of the wireless communication device.
One embodiment provides for an electronic device comprising a wireless controller including a ranging sensor and a communications radio, a display device, memory to store instructions, and one or more processors to execute the instructions. The instructions cause the one or more processors to scan for a beacon advertisement via the communications radio of the wireless controller and in response to detection of a beacon advertisement of a target wireless device, transmitting a command to the target wireless device to initiate a bi-directional ranging operation with the electronic device. The one or more processors can additionally determine a range and direction to the target wireless device via the ranging sensor of the wireless controller during the bi-directional ranging operation, determine a target position estimate for the target wireless device relative to the electronic device based on the range and direction to the target wireless device, generate an augmented reality view that includes the target position estimate for the wireless device, and present the augmented reality view via the display device.
One embodiment provides for a method comprising, on a wireless communication device including one or more processors, determining a range and direction to a target wireless device using one or more wireless ranging operations when the wireless communication device is within a threshold range of the target wireless device. The method additionally includes determining a pose of the wireless communication device via sensor data received from an inertial measurement unit within the wireless communication device, determining a target position estimate for the target wireless device relative to the wireless communication device based on the range and direction to the target wireless device and the pose of the wireless communication device, and generating an augmented reality view that includes the target position estimate for the wireless device. The method additionally includes presenting the augmented reality view via a display of the wireless communication device.
One embodiment provides for a method on a target wireless device to signal a paired wireless communication device (e.g., mobile device) that the motion for the target wireless device has exceeded a threshold. In response to establishing a wireless ranging session with the paired communication device, the target wireless device can enable a motion sensor on the target wireless device to determine if motion of the target wireless device exceeds a motion threshold. In response to a determination that the motion is over the motion threshold, the target wireless device can signal the wireless communication device that the target device motion has exceeded the motion threshold via one of a variety of wireless signaling techniques.
One embodiment provides for a method at a wireless communication device to indicate whether a location for a target wireless device can be determined and displayed. The wireless communication device can determine a motion state for the target wireless device via wireless data received from the target wireless device. The wireless communication device can determine the motion state for the wireless communication device via sensor data processed by a sensor processor of the wireless communication device. If the wireless communication device determines that both devices are over the motion threshold, the wireless communication device can display a message via a user interface that less movement is required to detect the direction to the target wireless device. If the motion of both devices is not over the motion threshold, the wireless communication device can determine a UI view to display based on the motion state of the target wireless device, such that if the target device motion is over the motion threshold the wireless communication device can display a message that the target wireless device is moving and that a new location is being determined. If the motion of the target wireless device and the wireless communication device is below the motion threshold, the wireless communication device can determine a position estimate of the target wireless device and display a device location and/or direction for the target wireless device in the device locator UI.
A system and/or non-transitory machine-readable medium may also be provided that performs the operations of any of the above described methods.
Those skilled in the art will appreciate from the foregoing description that the broad techniques of the embodiments can be implemented in a variety of forms. Therefore, while the embodiments have been described in connection with particular examples thereof, the true scope of the embodiments should not be so limited since other modifications will become apparent to the skilled practitioner upon a study of the drawings, specification, and following claims.
This patent application is a U.S. National Phase Application under 35 U.S.C. § 371 of International Application No. PCT/US2020/028326, filed Apr. 15, 2020, entitled “Finding a Target Device Using Augmented Reality,” which claims priority to U.S. Provisional Application Ser. No. 62/835,494, filed on Apr. 17, 2019, and U.S. Provisional Application Ser. No. 62/855,963, filed on Jun. 1, 2019, each of which is hereby incorporated herein by reference.
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/US2020/028326 | 4/15/2020 | WO |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2020/214708 | 10/22/2020 | WO | A |
Number | Date | Country | |
---|---|---|---|
20220201428 A1 | Jun 2022 | US |
Number | Date | Country | |
---|---|---|---|
62855963 | Jun 2019 | US | |
62835494 | Apr 2019 | US |