The invention relates to a computing device for tracking objects, a method of tracking objects performed by a computing device, a corresponding computer program, a corresponding computer-readable storage medium, and a corresponding data carrier signal.
People may have difficulties in keeping track of objects such as electronic devices (e.g., mobile phones, tablet computers, or the like), watches, keys, wallets, remote controls, tools, or any other types of everyday items in general, which can be picked up, i.e., gripped with a hand of a person using the object (the user), carried by the user, and subsequently released by the user at a potentially different location.
There are different solutions to assist people in finding lost or displaced objects. For instance, battery-powered tracking devices are known which can be attached to objects such as wallets or keys, and which are based on short-range radio signals, e.g., Bluetooth. Further, Apple's “Find My iPhone” app can be used for locating iOS devices by retrieving position information from iOS devices which are connected to the Internet.
It is an object of the invention to provide an improved alternative to the above techniques and prior art.
More specifically, it is an object of the invention to provide improved solutions for tracking objects such as electronic devices (e.g., mobile phones, tablet computers, or the like), watches, keys, wallets, remote controls, tools, or any other types of everyday items in general, which can be picked up, i.e., gripped with a hand of a person using the object (the user), carried by the user, and released by the user at a potentially different location.
These and other objects of the invention are achieved by means of different aspects of the invention, as defined by the independent claims. Embodiments of the invention are characterized by the dependent claims.
According to a first aspect of the invention, a computing device for tracking objects is provided. The computing device comprises a positioning sensor, a wireless network interface, and a processing circuit. The processing circuit causes the computing device to be operative to detect that an object is gripped by a user carrying the computing device, and to identify the object. The computing device is further operative to update information pertaining to the object in a database accessible by multiple computing devices. The information comprises an identifier associated with the object, an identifier associated with the user, and position information identifying the object as being co-located with the user. The computing device is further operative to detect that the object is released by the user, and in response thereto, update the position information with a position of the computing device when the object was released by the user.
According to a second aspect of the invention, a method of tracking objects is provided. The method is performed by a computing device and comprises detecting that an object is gripped by a user carrying the computing device, and identifying the object. The method further comprises updating information pertaining to the object in a database accessible by multiple computing devices. The information comprises an identifier associated with the object, an identifier associated with the user, and position information identifying the object as being co-located with the user. The method further comprises detecting that the object is released by the user, and in response thereto, updating the position information in the database with a position of the computing device when the object was released by the user.
According to a third aspect of the invention, a computer program is provided. The computer program comprises instructions which, when the computer program is executed by a processor comprised in a computing device, cause the computing device to carry out the method according to the second aspect of the invention.
According to a fourth aspect of the invention, a computer-readable storage medium is provided. The computer-readable storage medium has stored thereon the computer program according to the third aspect of the invention.
According to a fifth aspect of the invention, a data carrier signal is provided. The data carrier signal carries the computer program according to the third aspect of the invention.
The invention makes use of an understanding that computing devices, in particular mobile communications devices which are carried by users, such as mobile phones, smartphones, tablet computers, Personal Digital Assistants (PDAs), Head-Mounted Displays (HMDs), or Augmented-Reality (AR) headsets, can be used for keeping track of objects which are picked up by their users at a location where the objects are currently located (by gripping the object with a hand), and subsequently released after the users have finished using them at a potentially different location. Information about the current location of an object, i.e., its position, is maintained in a database which is accessible by multiple computing devices, i.e., a shared database.
This is advantageous in that multiple computing devices which are carried by their users can share information about the current positions of one or more objects, allowing users to locate an object which they are interested in finding and which may be in use by another user, or which has been placed at the position where it was released by the user or by another user.
Even though advantages of the invention have in some cases been described with reference to embodiments of the first aspect of the invention, corresponding reasoning applies to embodiments of other aspects of the invention.
Further objectives of, features of, and advantages with, the invention will become apparent when studying the following detailed disclosure, the drawings and the appended claims. Those skilled in the art realize that different features of the invention can be combined to create embodiments other than those described in the following.
The above, as well as additional objects, features and advantages of the invention, will be better understood through the following illustrative and non-limiting detailed description of embodiments of the invention, with reference to the appended drawings, in which:
All the figures are schematic, not necessarily to scale, and generally only show parts which are necessary in order to elucidate the invention, wherein other parts may be omitted or merely suggested.
The invention will now be described more fully hereinafter with reference to the accompanying drawings, in which certain embodiments of the invention are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided by way of example so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
In the following, embodiments of the computing device 110 for tracking objects are described with reference to
The computing device 110 comprises a positioning sensor 113, a wireless network interface 114, and a processing circuit 115. If the computing device is embodied as an optical AR headset or HMD 110 as is illustrated in
The positioning sensor 113 is operative to determine a current position of the computing device 110, and accordingly that of its user carrying the computing device 110. It may either be based on a Global Navigation Satellite System (GNSS), such as the Global Positioning System (GPS), China's BeiDou Navigation Satellite System (BDS), GLONASS, or Galileo, or may receive position information via the wireless network interface 114, e.g., from a positioning server. The position information may, e.g., be based on radio triangulation, radio fingerprinting, or crowd-sourced identifiers which are associated with known positions of access points of wireless communications networks (e.g., cell-IDs or WLAN SSIDs). The current position of the computing device 110 may, e.g., be made available via an Application Programming Interface (API) provided by an operating system of the computing device 110.
The wireless network interface 114 is a circuit which is operative to access a wireless communications network and thereby enable the computing device 110 to communicate, i.e., exchange data in either direction (uplink or downlink). The computing device may, e.g., exchange data with other computing devices which are similar to the computing device 110, or a database 150 which is accessible by multiple computing devices 110 and which is operative to maintain information pertaining to one or more objects, as is described further below. As yet a further alternative, the wireless network interface 114 may be operative to exchange data with one or more other communications devices of the user, such as a smartwatch 140 which is shown in
Embodiments of the computing device 110 are now described with further reference to
The processing circuit 115 causes the computing device 110 to be operative to detect 311/331 that an object is gripped by a user carrying the computing device 110, such as the spirit level 120 or the drill 130. For instance, the computing device 110 may be operative to detect 311/331 that an object is gripped by a user carrying the computing device 110 by detecting flexion of the fingers of a hand of the user, wherein the flexion of the fingers is characteristic of the hand gripping an object. The computing device 110 may, e.g., be operative to detect flexion of the fingers based on sensor data which is received from a sensor device which is worn close to the hand gripping the object. The sensor device may, e.g., be a smartwatch 140 or any other wearable device which is preferably worn close to the wrist and which comprises haptic sensors, motion sensors, and/or ultrasound sensors, which are operative to detect flexion of the fingers. For instance, McIntosh et al. have demonstrated hand-gesture recognition using ultrasound imaging (J. McIntosh, A. Marzo, M. Fraser, and C. Phillips, “EchoFlex: Hand Gesture Recognition using Ultrasound Imaging”, in Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, pages 1923-1934, ACM New York, 2017). Alternatively, the sensor device may be a haptic glove which is worn by the user. The sensor data is received by the computing device 110 via its wireless network interface 114, and is transmitted by the sensor device via a corresponding network interface comprised in the sensor device, e.g., a Bluetooth interface.
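By way of illustration, a grip-detection rule operating on flexion data may be sketched as follows. This is a minimal sketch, not part of the disclosure: it assumes, hypothetically, that the sensor device reports one flexion angle in degrees per finger, and the function name, the 60-degree threshold, and the four-of-five rule are illustrative stand-ins for a classifier trained on labelled sensor data.

```python
# Minimal, illustrative sketch of grip detection from wrist-worn
# sensor data. The threshold and the four-of-five rule are
# assumptions; a real system would use a trained classifier.

GRIP_FLEXION_THRESHOLD_DEG = 60.0  # illustrative threshold

def is_hand_gripping(flexion_angles_deg):
    """Return True if the flexion pattern is characteristic of a grip.

    A grip is assumed here to flex at least four of the five fingers
    beyond the threshold.
    """
    flexed = sum(1 for a in flexion_angles_deg
                 if a >= GRIP_FLEXION_THRESHOLD_DEG)
    return flexed >= 4

# Example: an open hand versus a gripping hand.
open_hand = [5.0, 10.0, 8.0, 12.0, 7.0]
gripping_hand = [70.0, 85.0, 90.0, 80.0, 65.0]
```

The same rule, with a pattern characteristic of the hand opening, can serve to detect that the object is released.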
The computing device 110 may alternatively be operative to detect 311/331 that an object is gripped by a user carrying the computing device 110 by performing image analysis on one or more images captured by a camera worn by the user. This may, e.g., be a camera 112 which is integrated into an AR headset, or HMD, embodying the computing device 110, as is illustrated in
As a further alternative, the computing device 110 may be operative to detect 311/331 that an object is gripped by a user carrying the computing device 110 by evaluating a strength of a radio signal which is transmitted by the object. This is exemplified in
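The signal-strength alternative may be sketched as follows; the class name, the -50 dBm threshold, and the three-sample smoothing window are illustrative assumptions, not part of the disclosure.

```python
# Illustrative sketch of grip detection by evaluating the strength of
# a radio signal transmitted by the object (e.g., by an RFID sticker
# or Bluetooth tag), as measured by a receiver worn close to the hand.
# Threshold and window size are assumptions.

from collections import deque

RSSI_GRIP_THRESHOLD_DBM = -50.0  # illustrative threshold

class RssiGripDetector:
    """Declares a grip when the smoothed RSSI stays above a threshold,
    i.e., when the object's transmitter is very close to the hand."""

    def __init__(self, window=3):
        self.samples = deque(maxlen=window)

    def feed(self, rssi_dbm):
        """Feed one RSSI sample; return True if a grip is detected."""
        self.samples.append(rssi_dbm)
        avg = sum(self.samples) / len(self.samples)
        return avg >= RSSI_GRIP_THRESHOLD_DBM
```

Smoothing over a short window makes the decision robust against single spurious readings; a drop of the smoothed value back below the threshold can correspondingly indicate release.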
The computing device 110 is further operative to identify 312/332 the object which is gripped 311/331 by the user. The computing device 110 may, e.g., be operative to identify 312/332 the object by performing object recognition on one or more images captured by a camera worn by the user, similar to what is described hereinbefore in relation to detecting that an object is gripped by the user. In particular, the camera may be the camera 112 which is integrated into an AR headset, or HMD, embodying the computing device 110, as is illustrated in
The computing device 110 may alternatively be operative to identify 312/332 the object based on a radio signal which is transmitted by the object, similar to what is described hereinbefore in relation to detecting that an object is gripped by the user. In particular, the object may be provided with an RFID sticker, such as RFID sticker 121, or any other type of radio transmitter which is comprised in the object or can be attached to an object and which is transmitting a distinct code, such as a MAC address or any other unique identifier associated with the radio transmitter or object, or a distinct radio-signal pattern. Preferably, the radio signal is a short-ranged radio signal, such as Bluetooth or NFC.
The computing device 110 is further operative to update 313/333 information pertaining to the object in a database 150 which is accessible by multiple computing devices 110. The database 150, which is also referred to as a shared database, may, e.g., be maintained in an application server, an edge server, or a cloud storage, which is accessible by the computing devices 110 through one or more wireless communications networks to which the computing devices 110 are connected via their wireless network interfaces 114. The computing device 110 is operative to update 313/333 the information pertaining to the object by transmitting information via its wireless network interface 114 using a suitable protocol, e.g., one or more of the Hypertext Transfer Protocol (HTTP), the Transmission Control Protocol/Internet Protocol (TCP/IP) suite, the Constrained Application Protocol (CoAP), the User Datagram Protocol (UDP), or the like. As an alternative, the shared database 150 may be maintained in a local data storage, i.e., memory, of each of the multiple computing devices 110, which multiple local databases are continuously synchronized with each other, or with a master database which is maintained in an application server, an edge server, or a cloud storage. Synchronization is achieved by transmitting and/or receiving information via the wireless network interfaces 114 using a suitable protocol, as is described hereinbefore.
The information which is updated in the database 150 comprises an identifier which is associated with the object, an identifier which is associated with the user, and position information which identifies the object as being co-located with the user. The identifier which is associated with the object is preferably generated based on a unique code or identifier which is obtained when the object is identified 312/332. Alternatively, an identifier which is associated with the object may be generated by the database 150, in particular if the object is not yet listed in the database 150 and a new database entry is created. The identifier which is associated with the user may, e.g., be a name, a user name, an account name, or a login name, of the user. Alternatively, the identifier which is associated with the user may be an identifier which is associated with the user's computing device 110, e.g., a MAC address of the computing device 110, a name associated with the computing device 110 (e.g., “Bob's iPhone”), or the like. The position information which identifies the object as being co-located with the user may, e.g., be an information field, or a flag, indicating that the object is co-located with the user who is identified by the identifier which is associated with the user (e.g., a Boolean flag). Alternatively, the position information which identifies the object as being co-located with the user may be the identifier which is associated with the user (e.g., “Bob”) or with the computing device of the user (e.g., “Bob's iPhone”). Optionally, the computing device 110 may be operative to update 313/333 the position information identifying the object as being co-located with the user by recurrently updating the position information with a current position of the computing device 110 of the user who has gripped the object, which position information is acquired from the positioning sensor 113.
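A database entry carrying the three information fields described above may be sketched as follows; the field names are illustrative assumptions, not part of the disclosure.

```python
# Illustrative structure of a database entry for a tracked object,
# holding the three fields named in the description: an identifier
# associated with the object, an identifier associated with the user,
# and position information. Field names are assumptions.

def make_grip_entry(object_id, user_id):
    """Entry created or updated when a user grips an object; the
    position information is a co-location reference to the user
    rather than coordinates."""
    return {
        "object_id": object_id,  # identifier associated with the object
        "user_id": user_id,      # identifier associated with the user
        "position": {"co_located_with": user_id},
    }
```

While the object remains gripped, the `position` field may either stay as this co-location reference or be recurrently overwritten with coordinates from the positioning sensor 113.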
The position information identifying the object as being co-located with the user may be updated 313/333 periodically, or in response to detecting that the position of the computing device, and thereby that of the user, has changed by more than a threshold distance.
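The threshold-distance rule may be sketched as follows; the haversine formula computes the great-circle distance between two positions, and the 10-metre threshold is an illustrative assumption.

```python
# Illustrative sketch of the threshold-distance rule: the position
# information is only pushed to the shared database when the device
# has moved more than a threshold distance since the last update.
# The 10 m default threshold is an assumption.

import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two positions given in
    decimal degrees (mean Earth radius approximation)."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def should_update(last_pos, current_pos, threshold_m=10.0):
    """True if the device has moved more than threshold_m metres."""
    return haversine_m(*last_pos, *current_pos) > threshold_m
```

Updating only on significant movement reduces traffic over the wireless network interface 114 compared with purely periodic updates.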
The computing device 110 is further operative to detect 314/334 that the object is released by the user. This may be achieved in a similar way as is described hereinbefore in relation to detecting that an object is gripped by the user. For instance, the computing device 110 may be operative to detect 314/334 that the object is released by the user by detecting flexion of the fingers of a hand of the user, wherein the flexion of the fingers is characteristic of the hand releasing an object. The computing device 110 may, e.g., be operative to detect flexion of the fingers based on sensor data which is received from the sensor device which is worn close to the hand gripping the object, such as the smartwatch 140 or other wearable device worn close to the wrist and comprising haptic sensors, motion sensors, and/or ultrasound sensors, or a haptic glove worn by the user. The computing device 110 may alternatively be operative to detect 314/334 that the object is released by the user by performing image analysis on one or more images captured by a camera worn by the user, e.g., the camera 112 which is integrated into an AR headset, or HMD, embodying the computing device 110, as is illustrated in
The processing circuit 115 causes the computing device 110 to be further operative to update 315/335 the position information in the database 150 in response to detecting 314/334 that the object is released by the user. The position information in the database 150 is updated 315/335 with a position of the computing device 110 when the object was released by the user. The position information is obtained from the positioning sensor 113. Similar to what is described hereinbefore, the computing device 110 may be operative to update 315/335 the position information by transmitting information to the database 150 via its wireless network interface 114 using a suitable protocol, e.g., one or more of HTTP, TCP/IP, CoAP, UDP, or the like. As an alternative, if the database 150 is maintained in a local data storage of each of the multiple computing devices 110, the local databases 150 are continuously synchronized with each other, or with a master database which is maintained in an application server, an edge server, or a cloud storage, by transmitting information via the wireless network interfaces 114 using a suitable protocol.
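The release step may be sketched as follows: the co-location information is replaced by the position reported by the positioning sensor at the moment of release. Function and field names are illustrative assumptions, not part of the disclosure.

```python
# Illustrative sketch of the release step: when the object is detected
# as released, the position information in the database entry is
# overwritten with the position of the computing device at that
# moment. Field names are assumptions.

def on_object_released(entry, device_position):
    """Return an updated entry in which the co-location reference is
    replaced by the coordinates of the computing device."""
    updated = dict(entry)  # do not mutate the caller's copy
    updated["position"] = {
        "lat": device_position[0],
        "lon": device_position[1],
    }
    return updated

entry = {"object_id": "drill-130", "user_id": "Bob",
         "position": {"co_located_with": "Bob"}}
released = on_object_released(entry, (59.3293, 18.0686))
```

In a deployment, the updated entry would then be transmitted to the shared database 150, or propagated by synchronizing local copies, as described above.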
Optionally, the computing device 110 may further be operative to receive 321/341 a request from the user to locate an object, to query 322/342 the database 150 to retrieve 323/343 a current position of the object, and to guide 324/344 the user to the current position of the object. The request from the user may, e.g., be received as a spoken instruction (e.g., “Find spirit level.”) which is captured by a microphone comprised in the computing device 110, and subjected to speech recognition, or via a graphical user interface through which the user interacts with the computing device 110. For instance, the computing device 110 may be operative to display a list of objects which currently are listed in the database 150 on a display of the computing device 110, from which list the user may select an object which he/she wishes to locate. The list of objects which currently are listed in the database 150 may be retrieved by querying the database 150 via the wireless network interface 114.
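The locate flow may be sketched as follows; the in-memory dictionary is an illustrative stand-in for the shared database 150, which a deployed device would query over its wireless network interface, and the field names are assumptions.

```python
# Illustrative sketch of the locate flow: resolve a named object to
# its current position, which is either coordinates (released object)
# or a co-location reference (object in use by a user). The
# dictionary stands in for the shared database.

database = {
    "spirit-level-121": {"position": {"lat": 59.3293, "lon": 18.0686}},
    "drill-130": {"position": {"co_located_with": "Alice"}},
}

def locate(object_id):
    """Return the current position information for the object, or
    None if the object is not listed in the database."""
    entry = database.get(object_id)
    if entry is None:
        return None
    return entry["position"]
```

The result determines how the user is guided: coordinates lead to displayed cues or audible sound, while a co-location reference leads to a notification naming the other user.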
With reference to
The computing device 110 may be operative to guide 324/344 the user to the current position of the object by displaying one or more cues guiding the user to the current position of the object. For instance, if the computing device 110 is embodied by an AR headset as is illustrated in
Alternatively, the computing device 110 may be operative to guide 324/344 the user to the current position of the object by emitting an audible sound guiding the user to the current position of the object. In particular, the emitted audible sound may be varied to reflect a distance between the user and the current position of the object while the user is moving around to locate the object. For instance, a volume or a frequency of the audible sound may increase with decreasing distance. If the audible sound comprises repetitive beeps, the duration in-between beeps may be shortened to reflect a decrease in distance, similar to a metal detector.
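The metal-detector-style mapping from distance to beep interval may be sketched as follows; the interval bounds and the 50-metre saturation distance are illustrative assumptions.

```python
# Illustrative sketch of audible guidance whose beep interval
# shortens with decreasing distance, similar to a metal detector.
# All constants are assumptions.

MIN_INTERVAL_S = 0.1   # near-continuous beeping when very close
MAX_INTERVAL_S = 2.0   # slow beeping when far away
MAX_DISTANCE_M = 50.0  # distances beyond this map to MAX_INTERVAL_S

def beep_interval(distance_m):
    """Map a distance to a beep interval: linear between the bounds,
    clamped at both ends."""
    frac = min(max(distance_m / MAX_DISTANCE_M, 0.0), 1.0)
    return MIN_INTERVAL_S + frac * (MAX_INTERVAL_S - MIN_INTERVAL_S)
```

An analogous mapping can drive the volume or the frequency of a continuous tone instead of the interval between beeps.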
As yet a further alternative, the computing device 110 may be operative, if the current position of the object is indicated as being co-located with another user, to guide 324/344 the user to the current position of the object by notifying the user that the object is currently co-located with the other user. This may, e.g., be achieved by providing an audible instruction to the user (e.g., “The spirit level is used by Bob.”), or by displaying corresponding information to the user (e.g., “The spirit level is used by Bob.”). Similar to what is described hereinbefore, the computing device 110 may be operative to guide the user to the object at its current position even if it is in use by another user, e.g., by displaying one or more cues or by emitting audible sound guiding the user requesting to locate the object to the user who has gripped the object.
Although embodiments of the computing device have in some cases been described with reference to the AR headset 110 illustrated in
In the following, embodiments of the processing circuit 115 comprised in the computing device for tracking objects, such as the computing device 110, are described with reference to
In the following, embodiments of the method of tracking objects are described with reference to
The detecting 501 that an object is gripped by a user carrying the computing device may comprise detecting flexion of the fingers of a hand of the user which is characteristic of the hand gripping an object. Optionally, the flexion of the fingers is detected based on sensor data received from a sensor device worn close to the hand gripping the object.
The detecting 501 that an object is gripped by a user carrying the computing device may alternatively comprise performing image analysis on an image captured by a camera worn by the user.
The detecting 501 that an object is gripped by a user carrying the computing device may alternatively comprise evaluating a strength of a radio signal transmitted by the object. Optionally, the strength of a radio signal transmitted by the object is evaluated based on data pertaining to the strength of the radio signal, which data is received from a receiver device worn close to the hand gripping the object and which has received the radio signal.
The identifying 502 the object may comprise performing object recognition on an image captured by a camera worn by the user.
Alternatively, the object may be identified 502 based on a radio signal transmitted by the object.
The updating 503 the position information identifying the object as being co-located with the user may comprise recurrently updating the position information with a current position of the computing device.
The method 500 may further comprise receiving 506 a request from the user to locate the object, querying 507 the database to retrieve a current position of the object, and guiding 508 the user to the current position of the object. The guiding 508 the user to the current position of the object may comprise displaying one or more cues guiding the user to the current position of the object. The guiding 508 the user to the current position of the object may alternatively comprise emitting audible sound guiding the user to the current position of the object. If the current position of the object is indicated as being co-located with another user, the guiding 508 the user to the current position of the object may alternatively comprise notifying the user that the object is currently co-located with the other user.
It will be appreciated that the method 500 may comprise additional, alternative, or modified, steps in accordance with what is described throughout this disclosure. An embodiment of the method 500 may be implemented as the computer program 404 comprising instructions which, when executed by the one or more processor(s) 402 comprised in the computing device 110, cause the computing device 110 to perform in accordance with embodiments of the invention described herein.
The person skilled in the art realizes that the invention by no means is limited to the embodiments described above. On the contrary, many modifications and variations are possible within the scope of the appended claims.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/SE2019/050664 | 7/3/2019 | WO |