The invention relates generally to generating a database for search of objects from the internet. More particularly, the invention relates to the use of sensor measurement, such as Earth's magnetic field or radio frequency measurements for generating such database and for performing such search.
It is common to search for information on the Internet by using, e.g., the Google or Bing search engines. Typically this takes place by typing a search word or words, i.e. a search key, into the search engine and waiting for the search engine to retrieve results related to the typed search key. However, this type of search is limited in that, e.g., it finds only those results that are directly related to the search words. For example, the search may retrieve only objects, such as written documents or websites, that include the typed search key.
According to an aspect of the invention, there are provided apparatuses as specified in claims 1, 17 and 19.
According to an aspect of the invention, there is provided a computer program product embodied on a distribution medium readable by a computer and comprising program instructions which, when loaded into an apparatus, cause the apparatus, such as the database entity, the mobile device or the user device, to execute any of the functionalities as described in the appended claims.
According to an aspect of the invention, there is provided a computer-readable distribution medium carrying the above-mentioned computer program product.
According to an aspect of the invention, there is provided an apparatus, such as the database entity, the mobile device or the user device, comprising means for performing any of the embodiments as described in the appended claims.
Some embodiments of the invention are defined in the dependent claims.
In the following, the invention will be described in greater detail with reference to the embodiments and the accompanying drawings, in which
The following embodiments are exemplary. Although the specification may refer to “an”, “one”, or “some” embodiment(s) in several locations of the text, this does not necessarily mean that each reference is made to the same embodiment(s), or that a particular feature only applies to a single embodiment. Single features of different embodiments may also be combined to provide other embodiments.
As said earlier, current methods for searching the Internet are limited. These may include, for example, typing a search key into the Google search engine and waiting for it to retrieve hits (such as links to documents or images) which comprise the given search key. The retrieved results are related only to the global search key. However, sometimes a person may want the search engine to retrieve any data/hits relevant to a certain local area. This would provide more flexibility, more user-friendliness and more possibilities for a search process.
Therefore, there is provided a database entity 100, comprising at least one processor and at least one memory including a computer program code. According to the proposed solution, the at least one memory and the computer program code may be configured, with the at least one processor, to cause the database entity 100 to perform various functions. As shown in step 200 of
In one embodiment, the sensor fingerprint may represent at least one of the following: acceleration (detectable with an acceleration sensor), angular velocity (detectable by a gyroscope, for example), temperature, ambient illumination, air pressure (indication of altitude), speed, to mention only a few non-limiting examples. Each of these may be given in time series, for example.
The RF fingerprint may be based on WiFi (e.g. wireless local area network, WLAN), Bluetooth (BLT) or cellular RF signals, for example. Thus, the RF fingerprint may be e.g. a WiFi fingerprint. For example, there may be RF (such as WLAN or BLT) base stations mounted indoors and/or outdoors. As a person carrying a mobile device or a user device with an RF receiver walks in the area having mounted RF base stations, the RF receiver of the person's device may detect the signals transmitted by the RF base stations and may form the RF fingerprint of the detected RF signals, for example. The RF fingerprint may represent identifiers (such as basic service set identifiers (BSSID) or media access control (MAC) addresses) of the RF base stations or access points, the strength of the detected signal, the angle-of-arrival of the detected signal, or any other feature of the RF signals or derived from the RF signals, for example. As said, the mobile device or the user device may also detect an identifier transmitted by the RF base stations. The RF fingerprint may thus comprise a feature vector for each given location, e.g. which base stations/access points are detectable at this given location and at what signal strength. On the other hand, a time series of detected total signal strength may also be used as one possible form of RF fingerprint. The RF fingerprint may be location specific so that an RF fingerprint of a given location differs from an RF fingerprint of another location.
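As an illustrative, non-limiting sketch of the feature-vector form of RF fingerprint described above, the following models one Wi-Fi scan as a mapping from detected access-point identifiers to received signal strengths. All identifiers, function names and dBm values here are hypothetical and not part of the specification:

```python
# Illustrative sketch: an RF fingerprint as a feature vector mapping each
# detected access point (keyed by BSSID) to its received signal strength.
# All BSSIDs and dBm values below are hypothetical examples.

def make_rf_fingerprint(scan_results):
    """Build a location-specific RF fingerprint from one scan.

    scan_results: list of (bssid, rssi_dbm) tuples for the detectable APs.
    Returns a dict usable as a sparse feature vector for this location.
    """
    return {bssid: rssi for bssid, rssi in scan_results}

# Two scans taken at different locations yield different fingerprints,
# because a different set of APs (or different strengths) is detectable.
scan_a = [("aa:bb:cc:00:00:01", -45), ("aa:bb:cc:00:00:02", -70)]
scan_b = [("aa:bb:cc:00:00:02", -55), ("aa:bb:cc:00:00:03", -60)]

fp_a = make_rf_fingerprint(scan_a)
fp_b = make_rf_fingerprint(scan_b)
```

The dictionary form directly captures the location-specific property stated in the text: two locations with different detectable base stations produce different fingerprints.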
Before looking further at
An example of a building 300 with 5 rooms, a corridor and a hall is depicted in
The mobile device 102-106 is detailed later, but for now it may be said that the mobile device 102-106 may comprise a magnetometer or any other sensor capable of measuring the EMF 108, such as a Hall sensor or a digital compass. The magnetometer may be an accurate sensor capable of detecting any variations in the EMF 108. In addition to the strength, also known as the magnitude, intensity or density, of the magnetic field (flux), the magnetometer may be capable of determining the three-dimensional direction of a measured EMF vector. To this end, it should be noted that at any location, the Earth's magnetic field 108 can be represented by a three-dimensional vector. Let us assume that a compass needle is tied at one end to a string such that the needle may rotate in any direction. The direction the needle points is the direction of the Earth's magnetic field vector.
As said, the magnetometer carried by a person in the mobile device traversing the path 302 in
The acquisition of the sensor fingerprint of step 202 may take place in various manners. In an embodiment, the database entity 100 may acquire the reference sensor fingerprint from each of the plurality of mobile devices 102-106. In this case, the reference sensor fingerprint may be measured by the mobile device at the location and/or environment in which the at least one object is detected by the mobile device. In an embodiment, the reference sensor fingerprint is acquired as part of a received digital content file representing the detected object from the mobile device. As an example, the reference sensor fingerprint may be stored as part of the digital content, such as the file format, of the detected object (e.g. an image, video, or audio, as will be explained later). This may be beneficial as the mobile device 102-106 then need not separately transmit the fingerprint; instead, it is stored as part of the digital content file of the detected object. This digital content file of the detected object may then be transmitted to the database entity 100 so that by receiving the object or an indication of the object, the database entity 100 simultaneously obtains the reference sensor fingerprint corresponding to this transmitted object. Alternatively, the database entity 100 may be authorized to access the object's stored digital content file in the mobile device.
In an embodiment, the database entity 100 may acquire the reference sensor fingerprint from another mobile device associated with the same user as the mobile device 102 from which the at least one object is acquired. For example, the person may carry a camera and a mobile phone. The camera may detect the object (e.g. capture an image) and the mobile phone may measure the reference sensor fingerprint. The devices may be configured to transmit the reference sensor fingerprint and the object to the database entity 100, or to allow the database entity 100 to access the devices' contents via a network. The database entity 100 may compare at least one predetermined comparison property of the acquired reference sensor fingerprint and of the acquired at least one object. Such a property may be a time stamp of when the object and the reference sensor fingerprint were detected/measured. The time stamp may be included in the file format of the object and of the reference sensor fingerprint. Another example property may be the location where the object and the reference sensor fingerprint were detected/measured. The location may be detected with an RF positioning system (such as Wi-Fi), a satellite positioning system, an EMF based positioning system, or a social media network (e.g. a status update indicating the location of the mobile device), for example. The database entity 100 may acquire the indication of the location from the corresponding mobile device, e.g. as part of the file format of the object and of the reference sensor fingerprint.
Then the database entity 100 may associate the acquired reference sensor fingerprint with the acquired at least one object on the basis of the comparison. That is, if the property, such as the time stamp, is sufficiently similar, the database entity 100 may determine that these correspond to each other. The location information may further aid in avoiding false associations. Whether the comparison property is sufficiently similar may be determined by applying a predetermined comparison threshold such that small deviations in the time stamps and/or location are allowed within one object-to-sensor-fingerprint association. This comparison threshold may be based on empirical derivation, for example. Further, there may be an indication in one of the received data items (i.e. in the object or in the reference sensor fingerprint) according to which the received data item is to be associated with a data item (i.e. with the reference sensor fingerprint or with the object, respectively) received from a mobile device having a certain, indicated identifier.
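A minimal sketch of this time-stamp-based association is given below. The threshold value, the record fields and the function name are illustrative assumptions, not taken from the specification:

```python
# Hypothetical sketch of the association step: an object and a reference
# sensor fingerprint, received from two devices of the same user, are
# linked when their time stamps fall within a predetermined comparison
# threshold. The 30-second threshold is an assumed, illustrative value.

TIME_THRESHOLD_S = 30.0  # allowed time-stamp deviation (assumption)

def associate(objects, fingerprints, threshold=TIME_THRESHOLD_S):
    """Pair each object with the fingerprint whose time stamp is closest,
    provided the deviation stays within the threshold."""
    pairs = []
    for obj in objects:
        best = min(fingerprints,
                   key=lambda fp: abs(fp["timestamp"] - obj["timestamp"]),
                   default=None)
        if best and abs(best["timestamp"] - obj["timestamp"]) <= threshold:
            pairs.append((obj["id"], best["id"]))
    return pairs

objs = [{"id": "image_1", "timestamp": 1000.0},
        {"id": "sms_1", "timestamp": 5000.0}]
fps = [{"id": "fp_A", "timestamp": 1012.0},
       {"id": "fp_B", "timestamp": 4000.0}]

# image_1 is 12 s from fp_A (within threshold); sms_1 is 1000 s from the
# nearest fingerprint, so it is left unassociated.
pairs = associate(objs, fps)
```

In practice, the location comparison described above could be added as a second condition in the same loop to avoid false associations.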
These different devices of the same user may, in an embodiment, be connected together through, e.g. a short range communication connection, such as Bluetooth. This may allow the devices to transfer the object and/or reference sensor fingerprint between each other so that one device may perform the transmission of the object and the reference sensor fingerprint to the database entity 100.
In yet one embodiment, the database entity 100 may detect/identify the location of a given mobile device (e.g. the mobile device 102) among the plurality of mobile devices 102-106. The location may, as said, be detected with any positioning technique available. Thereafter, the database entity 100 may acquire the reference sensor fingerprint corresponding to the at least one object acquired from the mobile device on the basis of a sensor data map of the area in which the mobile device is detected to be located. Such a sensor data map may be, e.g., an EMF map or an RF (fingerprint) map of the area. This embodiment may thus require that such an EMF/RF map is available. The EMF map refers to a map of the area, wherein the map comprises EMF magnitudes and/or directions for each location within the area. An RF map, on the other hand, may represent signal strengths of the RF signals in the area. If such a map is available, the database entity 100 may, e.g., read the reference EMF fingerprint from the EMF map and associate the read EMF fingerprint with the at least one object acquired from this mobile device. The read reference sensor fingerprint may correspond to the most likely traversed path along which the object is detected (i.e. along the identified location). For example, in a corridor, the reference sensor fingerprint may correspond to the EMF/RF values within a predetermined spatial range along the corridor in the vicinity of the identified location.
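The corridor example above can be sketched as a simple read-out from a pre-existing map; the sample values, the index-based location representation and the range of ±2 samples are hypothetical:

```python
# Sketch of reading a reference EMF fingerprint from an existing EMF map:
# given the identified location (here an index along a mapped corridor),
# the values within a predetermined spatial range around it are read out.
# Map values (microtesla) and the range are illustrative assumptions.

def read_reference_fingerprint(emf_map, location_idx, spatial_range=2):
    """Return EMF magnitudes within +/- spatial_range samples of the
    identified location along the mapped corridor."""
    lo = max(0, location_idx - spatial_range)
    hi = min(len(emf_map), location_idx + spatial_range + 1)
    return emf_map[lo:hi]

# EMF magnitudes sampled along a corridor (hypothetical values).
corridor_map = [48.0, 47.5, 52.1, 55.3, 50.2, 46.8, 45.9]

# The device was positioned near sample index 3; read the vicinity.
fp = read_reference_fingerprint(corridor_map, location_idx=3)
```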
Let us now consider the acquisition of the at least one object in step 200 of
Let us consider, as an example, that an object is an image captured by the mobile device 102. It may be, for example, that the mobile device 102 transmits the captured images automatically to a cloud service in the Internet for storage. Simultaneously, the mobile device 102 may also automatically transmit the corresponding sensor fingerprint to the cloud. It may be that the database entity 100 is comprised in the cloud or has access to the information stored in the cloud, so that the database entity 100 may acquire an indication of the objects and of the sensor fingerprints from the cloud.
The indication of the object may comprise the content of the object (such as the image) or an indication where the content may be acquired.
An object may be anything that is related to the context, such as to the location and/or environment. Although the specification is written by defining that the object may be related to a location, the term “location” may be substituted with “context” or “environment”, such as an indoor and an outdoor environment. In an embodiment, the context may refer to the situation to which the object is related, such as the context in which an image was captured. For example, the context may refer to the motion of a vehicle, such as a car, or the walking motion of a person. Appropriate reference/target sensor fingerprints, acquired by applying e.g. speed sensors or acceleration sensors, may be recorded and used as an indicator of the current context to which the detected object is related. In an embodiment, however, the context denotes a location and/or an environment to which the at least one object is related.
For example, in an embodiment, the object may be an image captured in the location corresponding to the sensor fingerprint. In an embodiment, the object may be an audio captured in the location. In an embodiment, the object may be a video captured in the location. In an embodiment, the mobile device 102-106 may capture the image, audio or video, and avail the object or an indication of the object and the corresponding sensor fingerprint to the database entity 100. This may take place by transmitting the object to the database entity 100 directly or allowing the database entity 100 to access the object data in the mobile device 102-106.
In an embodiment, the object may be an advertisement related to the location. The advertisement may be present in the location or the advertisement may be received at the location by the mobile device 102-106, such as a location specific mobile advertisement, for example. As the location specific mobile coupon or advertisement is received or detected (through a captured image, for example) by the mobile device 102-106, the mobile device 102-106 may provide an indication of the advertisement (i.e. of the detected object) to the database entity 100.
In an embodiment, the object may be any digital content detectable by the mobile device 102-106 in the location/environment.
In an embodiment, the object may be the identity of a person present in the location. The identity may be determined from images, audio, video, the content of an electronic message (such as an SMS, a social media network message, or an email), a social media network profile, or the ID of the mobile device 102-106, for example. Thus, the person whose identity is determined may be the person carrying the mobile device, or another person present in the location, such as a person of whom an image is captured at the location, or a person in a social network service.
In an embodiment, the object may be an operation performed in the mobile device at the location. The operation may be a status update in a social media network (Facebook, FourSquare, etc), transmission of a text message (SMS), a multimedia message, or an email, for example. In an embodiment, the object may be the content of an electronic message (text message, social media network message, multimedia message, email) sent or received in the location.
In an embodiment, the user of the mobile device 102-106 may himself/herself determine what is to be considered as an object. For example, the user of the mobile device 102-106 may determine that images and videos are comprised in the objects, whereas, for example, SMS messages are not. In another embodiment, the mobile device 102-106 may be pre-coded with instructions which determine those objects which are to be considered as objects. These objects may then be made available for the database entity 100, such as transmitted to the database entity 100 or to another entity to which the database entity 100 has access to or which transmits the indication of the objects to the database entity 100.
As said, in
Let us further consider that the mobile device 104 may travel a route 114 during which the mobile device 104 may detect the object 124. As shown in the table of
The mobile device 106 may travel a route 116 during which it may detect two objects 126 and 128. As shown in the table of
The database entity 100 may, as said earlier, receive in step 202 the indication of the reference sensor fingerprint corresponding to the location in which the object is detected. The reference sensor fingerprint may be given as a vector comprising numerical values. In case of EMF fingerprint, the numerical values may represent the measured amplitude (Y1; Y2; . . . ; YN) and/or direction (Y1, X1; Y2, X2; . . . ; YN, XN) of the EMF as a function of distance or time. In case of RF fingerprint, the numerical values may represent the measured amplitude (Y1; Y2; . . . ; YN), for example. As a result, a graphical presentation of the measured reference sensor fingerprint may be provided, as shown for objects 120, 122 and 128, for illustrative purposes in
In one embodiment, the length of each sensor fingerprint may be determined on a case-by-case basis by the database entity 100 or by the mobile device 102-106. This may be beneficial in order to make sure that each sensor fingerprint comprises distinguishing characteristics. These distinguishing characteristics may refer to statistical characteristics of the sensor fingerprint vector. For example, it may be that the variation of the amplitude samples and/or direction samples of the sensor fingerprint is required to be above a predetermined threshold, which may be empirically or mathematically derived. These distinguishing characteristics/features may aid in distinguishing the plurality of sensor fingerprints from each other.
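The variance-based check described above can be sketched as follows; the threshold value and function names are illustrative assumptions:

```python
# Sketch of the case-by-case length check: a fingerprint is considered to
# carry distinguishing characteristics once the variance of its amplitude
# samples exceeds a predetermined threshold (here an assumed value of 1.0).

def variance(samples):
    mean = sum(samples) / len(samples)
    return sum((s - mean) ** 2 for s in samples) / len(samples)

def has_distinguishing_features(fingerprint, threshold=1.0):
    """True when the fingerprint varies enough to be distinguishable."""
    return variance(fingerprint) > threshold

flat_fp = [50.0, 50.1, 49.9, 50.0]          # a nearly featureless stretch
varied_fp = [45.0, 52.0, 58.0, 47.0, 61.0]  # clear local variation
```

Under this sketch, the fingerprint would keep being extended (or the mobile device would keep measuring) until the check passes, which matches the case-by-case length determination in the text.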
In an embodiment, as shown in
In an embodiment, it may also be that the duration of a sensor fingerprint is limited. Limiting the length may be beneficial so as to reduce the amount of memory storage needed in the database entity. The limitation may be automatic on the basis of a maximum duration or distance set for any sensor fingerprint. In another embodiment, the limitation may be determined case-by-case so that if a shorter sensor fingerprint already comprises distinguishing features, then there may not be any need to store a sensor fingerprint of the maximum length. In such a case, there may be parts of the continuous sensor fingerprint which do not correspond to any object, such as the part 404 in
Thereafter, in step 204, the database entity 100 may associate each object with the corresponding reference sensor fingerprint and in step 206 generate a database of associations between the reference sensor fingerprints and the objects. This is shown in the table of
In an embodiment, the objects may be categorized or grouped as outdoor objects and indoor objects on the basis of the reference sensor fingerprints. It may be that a sensor fingerprint from an outdoor area differs from a sensor fingerprint from an indoor area (e.g. the statistical variance may be smaller). For example, the objects may be categorized/grouped/clustered according to the similarities of the reference sensor fingerprints or of features derived thereof.
In an embodiment, more detailed information about where the objects are actually detected, such as in trains, subways, elevators, etc., may be acquired by the database entity. Thereafter, the database entity 100 may notice that a given group comprises objects which are actually measured in one specific type of environment, such as in subways. This detection may be used by the database entity 100 to obtain knowledge about which environments, other than the previously mentioned indoor or outdoor environments, provide environment-specific sensor fingerprints.
In an embodiment, as shown in
Let us, as an example, assume that objects 122, 124 and 126 are objects which are detected outside. That is, the mobile devices, when detecting these objects 122-126, are located outside. On the contrary, objects 120 and 128 are located indoors. In such a case, the grouping/categorizing may result in grouping the outdoor objects 122-126 in one group and grouping the indoor objects 120 and 128 in another group. This may provide a possibility to search for objects that are related to outdoors and/or to search for objects that are related to indoors. Although it has been explained that, e.g., the outdoor environment may provide sensor fingerprints, such as EMF fingerprints, having similar properties so that objects from outdoor environments may be grouped together and distinguished from other environments, such as indoor environments, there may be other environments, such as transportation types (subways, elevators, escalators), which provide similar possibilities. Further environments or sub-environments providing environment-specific sensor fingerprints for categorizing the corresponding objects may include, e.g., the sea (sensor fingerprints measured in or above a sea in a boat, for example) and mountain environments.
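A minimal sketch of such grouping is given below. Following the text's example, a low statistical variance of the EMF fingerprint is taken to suggest an outdoor environment and a high variance an indoor one; the split threshold and all sample values are illustrative assumptions:

```python
# Sketch of environment grouping: objects are clustered by a feature of
# their reference fingerprints. A low variance is assumed to indicate a
# smooth outdoor field, a high variance an indoor field distorted by
# building structures. The split threshold of 10.0 is an assumption.

def variance(samples):
    mean = sum(samples) / len(samples)
    return sum((s - mean) ** 2 for s in samples) / len(samples)

def group_objects(entries, split=10.0):
    """entries: list of (object_id, fingerprint). Returns the two groups."""
    outdoor, indoor = [], []
    for obj_id, fp in entries:
        (outdoor if variance(fp) < split else indoor).append(obj_id)
    return {"outdoor": outdoor, "indoor": indoor}

entries = [
    ("object_122", [50.0, 50.4, 49.8, 50.1]),  # smooth outdoor field
    ("object_120", [42.0, 55.0, 61.0, 39.0]),  # distorted indoor field
]
groups = group_objects(entries)
```

The same structure extends to further environment types (subway, elevator, sea, mountain) by replacing the single threshold with a classifier over the derived features.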
In an embodiment, the exact location corresponding to the reference sensor fingerprint is not known, and a sensor data map, such as an EMF or an RF map, does not exist. In this embodiment, where the sensor data map is not known, the database entity 100 may not know where the mobile devices 102-106, and consequently the detected objects, are located. As such specific location information is not known, it may be beneficial that the reference sensor fingerprints are collected so that the detected objects may be categorized or grouped or clustered or indexed according to the reference sensor fingerprint, or feature(s) derived from it, which represent the locations/environments/environmental conditions of the detected objects.
However, in another embodiment, the sensor data map is known and the exact location of the mobile devices 102-106 may be determined on the basis of the reference sensor fingerprint and the sensor data map. The sensor data may refer to, e.g., EMF data or RF data. In this embodiment, the detected objects may be, with an increased reliability, associated with specific locations.
In an embodiment, the mobile devices 102-106 transmit and, thus, the database entity 100 acquires reference metadata from at least one of the mobile devices 102-106. The reference metadata may be determined by the mobile device 102-106 or the metadata may be determined by the database entity 100 on the basis of information related to the mobile devices 102-106. However, acquiring the metadata is not mandatory.
In an embodiment, the reference metadata comprises the measured sensor fingerprint. The sensor fingerprint may be stored in the digital content of the digital file representing the object (such as the captured image).
In an embodiment, the reference metadata comprises time and/or date when the reference sensor fingerprint was measured. This may be determined by the mobile device 102-106 or by the database entity 100. As shown, the table of
In an embodiment, the reference metadata comprises duration or distance corresponding to the reference sensor fingerprint. This may be determined by the mobile device 102-106, for example, and indicated to the database entity 100. Alternatively, the database entity 100 may determine this information on the basis of timing data or motion data obtained from the corresponding mobile devices 102-106. For example, the duration or distance corresponding to the reference sensor fingerprint may be determined on the basis of the motion data comprising inertial sensor data measured by the mobile device 102-106 during the measurement of the reference sensor fingerprint. As shown, the table of
In an embodiment, the reference metadata comprises indication of the location in which the reference sensor fingerprint was measured. This may be determined on the basis of any location discovery technique, such as a location discovery technique applying radio frequency (RF) signals (e.g. the strength of received signals), magnetic fields, satellite positioning system, etc). As shown, the table of
In an embodiment, the reference metadata comprises a reference to a social media network of a person associated with the mobile device. The mobile device 102-106 may allow the database entity 100 to access the list of Facebook friends of the person, for example. As shown, the table of
In an embodiment, the reference metadata comprises a type of each of the at least one object detected. The type may indicate whether the object is an object having a textual content, an image, a video, an electronic message, etc.
In an embodiment, the metadata comprises the type and/or model of the mobile device 102-106 used for measuring the reference sensor fingerprint. This may be beneficial as the database entity 100 may be aware of a bias associated with a specific type/model. If this is the case, the database entity 100 may correct the received reference sensor fingerprint from that mobile device so that all the reference sensor fingerprints are comparable with each other (i.e. the reference sensor fingerprints are made commensurable).
In an embodiment, the metadata comprises the user identification of the person associated with the mobile device 102-106 which transmitted the detected object. Such an indication may be obtained by the database entity 100 from any identifier (ID) transmitted by the mobile device. For example, the message carrying the indication of the detected object may also carry such a globally unique ID. The unique ID may be related to the subscriber identity module (SIM) of the mobile device, for example. As shown, the table of
Thereafter, the database entity 100 may associate the acquired reference metadata with the at least one object indicated by the corresponding at least one mobile device 102-106. Again, such association is shown, for example, in the table of
Let us now look at how the database entity 100 may serve as a search engine. As shown in FIGS. 5 and 6A/6B, the database entity 100 may, in step 500, receive, from a user device 600, an indication of a target sensor fingerprint 602, wherein the target sensor fingerprint 602 is used as one search key for the search. The target sensor fingerprint 602 may be, e.g., a target EMF fingerprint or a target RF fingerprint. The indication of the target EMF fingerprint may be given as a vector of values representing the magnitude and/or direction of the target EMF or a feature derived from the target EMF fingerprint. The indication of the target RF fingerprint may be given as a vector of values representing the magnitude of the detected RF signals or a feature derived from the target RF fingerprint. In an embodiment, the user device 600 transmits the target sensor fingerprint 602 to the database entity 100. In an embodiment, the target sensor fingerprint 602 may be user defined. In an embodiment, the user device 600 may have measured the target sensor fingerprint 602. In an embodiment, the target sensor fingerprint 602 may be otherwise determined (e.g. by mathematical input, by drawing, etc.).
In one embodiment, the database entity 100 may receive, from the user device 600, an indication of a target object, wherein the target object is associated with the target sensor fingerprint 602 and the target object indicates the target sensor fingerprint 602 to the database entity 100. The target sensor fingerprint 602 may be embedded in the target object implicitly or explicitly. The target sensor fingerprint 602 may be embedded in the digital content of the file representing the target object, for example, as shown in
Thereafter, in step 502, the database entity 100 may determine which one or more reference sensor fingerprints 604-608 match, according to a predetermined similarity threshold, with the target sensor fingerprint 602. Such similarity threshold may be empirically or mathematically derived and may represent, for example, similarity in at least one statistical property between the fingerprint 602 and the fingerprints 604-608. An example statistical feature/property/characteristic may be the variance, peak-to-peak amplitude, mean value, mean deviation, frequency spectrum, N-dimensional feature (e.g. in time and/or in frequency domain) vector derived from the target fingerprint, etc. The comparison between the fingerprints 602-608 may be performed with respect to the magnitude and/or direction of the sensor represented by the fingerprints 602-608. It should be noted that the fingerprints 602-608 may be represented with numerical vectors. For the sake of illustration, graphical presentations are used in the Figures.
The comparison may comprise a graphical comparison of the graphical target and reference sensor fingerprint curves, a comparison between numerical values of the target and reference sensor fingerprints, a comparison between statistical features derived from the target and reference sensor fingerprints, etc.
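The comparison via statistical features can be sketched as follows. The choice of features (mean and peak-to-peak amplitude), the distance measure and the threshold are illustrative assumptions; the specification leaves these open:

```python
# Sketch of step 502: each reference fingerprint is compared with the
# target fingerprint via a distance between derived statistical features
# (here: mean value and peak-to-peak amplitude). A match is declared when
# the distance stays under a predetermined similarity threshold.

def features(fp):
    """Derive a small statistical feature vector from a fingerprint."""
    return (sum(fp) / len(fp), max(fp) - min(fp))

def distance(fp_a, fp_b):
    fa, fb = features(fp_a), features(fp_b)
    return sum(abs(a - b) for a, b in zip(fa, fb))

def matching_references(target, references, threshold=3.0):
    """Return the ids of reference fingerprints matching the target."""
    return [ref_id for ref_id, fp in references
            if distance(target, fp) <= threshold]

target_602 = [48.0, 53.0, 57.0, 49.0]
references = [
    ("fp_604", [48.2, 52.8, 57.1, 49.3]),  # very close match
    ("fp_606", [47.0, 54.0, 56.0, 50.0]),  # match within the threshold
    ("fp_608", [20.0, 21.0, 20.5, 20.2]),  # clearly different environment
]
matches = matching_references(target_602, references)
```

This mirrors the figure's outcome: fingerprints 604 and 606 match the target while 608 does not, so only the objects associated with 604 and 606 would enter the result subset.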
Let us consider in
In step 504, the database entity 100 may select a subset 610 from the acquired objects, wherein the selection of the subset 610 is based on which one or more reference sensor fingerprints match, according to the predetermined similarity threshold, with the target sensor fingerprint. As said, in an embodiment, the match need not be a perfect match. In an embodiment, whether the fingerprints match or not may be based on determining a distance between features derived from the fingerprints. The subset 610 may comprise one or more of the acquired objects. In an embodiment, as a result, the subset 610 may comprise those objects which are associated with the one or more reference sensor fingerprints 604, 606 that match with the target sensor fingerprint 602. As shown, the subset 610 may comprise two objects (#1A, #1B), such as audio and video files, associated with the reference sensor fingerprint 604, and one object (#2), such as a transmitted/received SMS, associated with the reference sensor fingerprint 606. The object(s) associated with the reference sensor fingerprint 608 may not be comprised in the list. These objects in the subset 610 may correspond to those detected objects which are relevant to the, possibly unknown, location/environment specified by the target sensor fingerprint 602. For example, the objects in the subset 610 may be images captured at that location, or contents of electronic messages received or transmitted at that location. In other words, the subset 610 may comprise objects of a plurality of different types.
In an embodiment, upon selecting the subset 610 of the objects, the database entity 100 may select at least one of the groups/clusters, in case the grouping 110 which is illustrated in
In step 506, the database entity 100 may then provide the user device 600 with an indication of the subset 610 of objects. The indication may be given in a form of a list of objects, or in any other manner readable by the user device 600. The user device 600 may then display the indication of the objects on a display of the user device 600. In this way, the database entity 100 returns, as a response to the search key from the user device 600, a list of relevant objects or a list of references to the objects associated to the location/environment specified by the search key.
In an embodiment, the database entity 100 may arrange the subset 610 according to a predetermined arrangement criterion, wherein the predetermined arrangement criterion comprises at least one of: relevancy on the basis of the match/distance between the target sensor fingerprint 602 and the reference sensor fingerprint 604-608 or between features derived thereof, date of the reference sensor fingerprint 604-608, reliability of the reference sensor fingerprint 604-608. For example, the objects in the subset 610 may be ordered so that the one which is associated to that reference sensor fingerprint 604, which provides the closest match with the target sensor fingerprint 602, may be the first in the subset 610. The one which is associated to that reference sensor fingerprint 606, which provides the furthest match with the target sensor fingerprint 602 but is still within the similarity threshold, may be the last in the subset 610. In another example, the object which is associated with that reference sensor fingerprint, which is most recently measured, may be the first in the subset 610.
In yet one embodiment, the object which is associated with that reference sensor fingerprint, which is most reliable, is the first in the subset 610. The reliability may be determined according to various criteria, including the age of the measured reference sensor fingerprint, the history information of the mobile device 102-106 which measured the reference sensor fingerprint (for example, if inaccurate sensor vectors have previously been received from this mobile device 102-106, then the reliability is not the best), the type and/or model of the mobile device 102-106 which measured the reference sensor fingerprint (e.g. some type/model may be known to cause inaccurate sensor data measurements), and/or the stability/motion of the mobile device 102-106 during the sensor data measurement (this may be detectable from motion data acquired from the corresponding mobile device 102-106).
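The three arrangement criteria above can be sketched as simple sort keys. The record fields, dates and scores below are hypothetical placeholders, not values from the specification.

```python
from datetime import date

# Each candidate object carries the match distance to the target
# fingerprint, the measurement date of its reference fingerprint,
# and a reliability score for the measuring device.
candidates = [
    {"obj": "#2",  "distance": 3.1, "measured": date(2013, 9, 1),  "reliability": 0.6},
    {"obj": "#1A", "distance": 0.9, "measured": date(2013, 10, 5), "reliability": 0.9},
    {"obj": "#1B", "distance": 0.9, "measured": date(2013, 8, 20), "reliability": 0.8},
]

# Relevancy: closest match with the target fingerprint first.
by_relevancy = sorted(candidates, key=lambda c: c["distance"])

# Date: most recently measured reference fingerprint first.
by_date = sorted(candidates, key=lambda c: c["measured"], reverse=True)

# Reliability: most reliable reference fingerprint first.
by_reliability = sorted(candidates, key=lambda c: c["reliability"], reverse=True)
```

In a combined scheme the criteria could also be weighted into a single ranking score; the sketch keeps them separate for clarity.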
In one embodiment, the database entity 100 may acquire an indication of target metadata, wherein the target metadata is further used as one search key for the search. Then the database entity 100 may select the subset 610 from the acquired objects, wherein the selection of the subset 610 is further based on comparison between the indicated target metadata and the reference metadata (see
In an embodiment, the target metadata may comprise a time frame with which the reference sensor fingerprint 604-608 is required to match. This may limit the selection so that only those objects which are associated with reference sensor fingerprints having a time stamp within the indicated time frame (such as within the last month) are listed in the subset 610. For example, all the objects associated with reference sensor fingerprints measured outside the given time frame are not comprised in the subset 610.
In an embodiment, the target metadata may comprise a reference to a social media network. In this embodiment, the subset 610 may be limited so that only those objects which are related to the indicated reference are comprised in the subset 610. Such reference may be, e.g. a list of friends in the social media network of the person carrying the user device 600. Then, only those images, messages, videos, etc., which are related to the indicated reference (such as comprise the name or image of at least one of the person's friends) are comprised in the subset 610.
In an embodiment, the target metadata may comprise duration and/or distance corresponding to the target sensor fingerprint 602. This may aid in making the target sensor fingerprint 602 and the reference sensor fingerprints 604-608 commensurable with each other. The distance may be obtained on the basis of motion data from the mobile device, for example.
In an embodiment, the target metadata may comprise the type of the objects to be retrieved. In this case, only those objects which belong to the indicated type are comprised in the subset 610.
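The metadata-based narrowing of the subset described in the embodiments above (time frame, object type) can be sketched as chained filters. Field names and example values are assumptions for illustration only.

```python
from datetime import date, timedelta

subset = [
    {"obj": "#1A", "type": "audio", "measured": date(2013, 10, 5)},
    {"obj": "#1B", "type": "video", "measured": date(2013, 8, 1)},
    {"obj": "#2",  "type": "sms",   "measured": date(2013, 10, 20)},
]

def filter_by_metadata(objects, today, time_frame=None, object_type=None):
    """Keep only objects whose reference fingerprint time stamp falls
    within the given time frame and whose type matches the request."""
    result = objects
    if time_frame is not None:                 # e.g. "within the last month"
        cutoff = today - time_frame
        result = [o for o in result if o["measured"] >= cutoff]
    if object_type is not None:
        result = [o for o in result if o["type"] == object_type]
    return result

# Only objects measured within the last 30 days remain:
recent = filter_by_metadata(subset, date(2013, 11, 1), time_frame=timedelta(days=30))
```

A social-media reference or other metadata keys could be added as further filter predicates in the same manner.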
In an embodiment, the database entity 100 may detect the geographical location in which the mobile device, e.g. the mobile device 104, is at the moment when the at least one object is acquired. This may be determined on the basis of a positioning system, such as satellite based system, RF signal based system, EMF based system, etc. Then the database entity 100 may associate each object with the corresponding geographical location. This is shown in
In an embodiment, data indicating the location of the mobile device (i.e. location data) is stored as metadata in the digital content file of the corresponding object so that the database entity 100 obtains this location data when it receives/accesses the file of the object.
Thereafter, the database entity 100 may acquire an indication of a target geographical area from the user terminal 600, wherein the target geographical area is further used as one search key for the search. The database entity 100 may then select the subset from the acquired objects, wherein the selection of the subset 610 is further based on which objects are associated with a geographical location within the indicated target geographical area. As a result, the subset 610 may comprise those objects which are associated with the one or more reference sensor fingerprints that match with the target sensor fingerprint and which are associated with a geographical location within the indicated target geographical area. This may be beneficial as there may be situations where reference sensor fingerprints are somewhat similar even though they have been measured in different locations. Then, obtaining rough knowledge of the location may be helpful in providing the user terminal 600 with the subset 610 of objects from only one location corresponding to the indicated target sensor fingerprint 602.
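The combined criterion above can be sketched as a second filter over the fingerprint-matched objects. The target area is modelled here as a simple latitude/longitude bounding box; the structure and the coordinates are illustrative assumptions, as the specification allows the area to be indicated e.g. per building or per floor.

```python
def in_area(location, area):
    """True if a (lat, lon) pair lies inside the bounding-box area."""
    lat, lon = location
    return (area["lat_min"] <= lat <= area["lat_max"]
            and area["lon_min"] <= lon <= area["lon_max"])

def select_with_area(matched_objects, target_area):
    """From objects already matched by fingerprint, keep those whose
    associated geographical location falls within the target area."""
    return [o for o in matched_objects if in_area(o["location"], target_area)]

# Two objects with similar fingerprints but different locations:
matched = [
    {"obj": "#1A", "location": (60.17, 24.94)},
    {"obj": "#2",  "location": (61.50, 23.76)},
]
area = {"lat_min": 60.0, "lat_max": 60.5, "lon_min": 24.5, "lon_max": 25.5}
local = select_with_area(matched, area)
```

Only the object whose stored location falls inside the indicated area survives, which resolves the ambiguity between similar fingerprints measured in different places.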
In an embodiment, the location or the area is indicated with an accuracy of one or more buildings or with an accuracy of one or more floors within a building. In an embodiment, the indication comprises satellite positioning system coordinates. In an embodiment, Wi-Fi is used for deriving the indication of the location or the area.
As shown in
The three-dimensional orientation of the mobile device 102 may be defined by at least one of the following: a rotation with respect to a first horizontal axis (such as X-axis or Y-axis), a rotation with respect to a second horizontal axis (such as Y-axis or X-axis, respectively), and a rotation with respect to a vertical axis Z. Let us consider this in more detail by referring to
In an embodiment, the database entity 100 may acquire motion data of the mobile device 102, wherein the motion data is measured by the at least one inertial measurement unit (IMU) comprised in the mobile device 102 during the measurement of the reference sensor fingerprint. In an embodiment, the motion data is stored as metadata in the digital content file of the corresponding object so that the database entity 100 obtains this motion data when it receives/accesses the file of the object. The motion data may be used to represent the sensor fingerprints (either the reference or the target fingerprint) as a function of distance, instead of or in addition to the fingerprint being a function of time. This may further help in providing correct hits in the search.
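Representing a fingerprint as a function of distance rather than time can be sketched as a resampling step: the motion data gives a travelled distance for each sample instant, and the samples are re-indexed onto a uniform distance grid by linear interpolation. All values below are illustrative; the specification does not prescribe any particular interpolation method.

```python
def resample_by_distance(samples, distances, step):
    """samples[i] was measured after distances[i] metres of travel.
    Return samples interpolated at 0, step, 2*step, ... metres."""
    out = []
    d = 0.0
    i = 0
    while d <= distances[-1]:
        # advance to the segment containing distance d
        while distances[i + 1] < d:
            i += 1
        span = distances[i + 1] - distances[i]
        t = (d - distances[i]) / span if span else 0.0
        out.append(samples[i] + t * (samples[i + 1] - samples[i]))
        d += step
    return out

# Magnetic-field magnitudes sampled at uneven walking speed:
mags = [50.0, 52.0, 51.0, 49.0]
dist = [0.0, 1.0, 3.0, 6.0]   # metres travelled at each sample instant
uniform = resample_by_distance(mags, dist, step=1.0)
```

After resampling, two fingerprints recorded at different walking speeds along the same path become directly comparable sample by sample.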
Further, the motion data may indicate the three-dimensional orientation of the mobile device 102 at the at least one time instant when the reference sensor fingerprint is measured by the mobile device 102. The orientation, as shown in
Thereafter, the database entity 100 may apply the inertial measurement results for determining, on the basis of the acquired motion data, at least one angle estimate of a difference between the three-dimensional orientation of the mobile device 102 and a three-dimensional coordinate system of the person carrying the mobile device 102. For example, in order to determine the amount of rotation about the Y-axis (
Finally, the database entity 100 may adjust the reference sensor fingerprint on the basis of the determined at least one angle estimate. This may be advantageous in order to commensurate the sensor fingerprints received from different mobile devices 102-106.
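The adjustment step can be sketched as applying a rotation by the estimated angle to each three-axis sensor sample. Only the rotation about the vertical Z axis is shown; the angle and the sample values are illustrative, and a full implementation would chain rotations about all three axes.

```python
import math

def rotate_about_z(vec, angle_rad):
    """Rotate an (x, y, z) field vector about the vertical axis so that
    it is expressed in the person's coordinate system instead of the
    device's own coordinate system."""
    x, y, z = vec
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return (c * x - s * y, s * x + c * y, z)

# Device held 90 degrees off the walking direction: the horizontal
# field component swaps from the X axis to the Y axis.
raw = (20.0, 0.0, -45.0)
adjusted = rotate_about_z(raw, math.pi / 2)
```

Applying the same correction to fingerprints from all devices makes measurements taken with differently held devices commensurable.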
Also the target sensor fingerprint 602 may be adjusted in a similar manner if it is detected, for example on the basis of motion data acquired from the user device 600, that the three-dimensional orientation of the user device 600 is not aligned with the axes of the XYZ coordinate system. In some cases it may be that the user defines the target sensor fingerprint 602 from a user interface on the user device 600. In this case, the user interface application of the user device 600 may make sure that the given target sensor fingerprint 602 represents the direction of the sensor in the desired coordinate system. However, in some other cases it may be that the user device 600 captures an image and transmits the captured image to the database entity 100 along with the target sensor fingerprint 602 associated with the captured image. Then it may be that the user device 600 has not been correctly oriented when it has measured the target sensor fingerprint 602 and/or may have moved during the measurement and, consequently, such target sensor fingerprint 602 may need to be corrected, as explained above.
Looking from the mobile device 102-106 point of view with respect to
Looking from the point of view of the user device 600 with respect to
In one example embodiment, the user device 600 may capture an image 1300 by a camera application installed in the user device 600, as shown in
In another example embodiment, the user device 600 may run a search application, such as Google, or apply the web browser to access the Google search page. The user of the user device 600 may enter, e.g., an image or a reference (e.g. URL) to an image in the search field. The digital file of the image may comprise the target sensor fingerprint 602 as metadata, or the sensor fingerprint 602 may be separately indicated to the database entity 100. Based on this target sensor fingerprint 602, the database entity 100 may then search and retrieve from the database all objects that are associated with a similar enough (based on the similarity threshold) reference sensor fingerprint. These searched objects may be listed according to the predetermined arrangement criterion, and the database entity 100 may then provide the user terminal 600 with the arranged search results. In an embodiment, the user of the user device 600 may limit the search by indicating the type of the objects to be retrieved, such as only audio, image, video, identifiers of persons, etc.
Embodiments, as shown in
Each of the apparatuses comprises at least one processor 1002, 1102, 1202 and at least one memory 1004, 1104, 1204 including a computer program code, which are configured to cause the respective apparatuses (such as the database entity 100, the mobile devices 102-106, and the user device 600, respectively) to carry out functionalities according to any of the embodiments. The at least one processor may each be implemented with a separate digital signal processor provided with suitable software embedded on a computer readable medium, or with a separate logic circuit, such as an application specific integrated circuit (ASIC).
The apparatuses 1000, 1100, 1200 may further comprise radio interface components 1006, 1106, 1206 providing the respective apparatus with radio communication capabilities with the radio access network. The radio interfaces may be used for communication between the apparatuses, e.g. to communicate data related to the sensor fingerprints, detected objects, metadata, search results, location estimates, etc.
User interfaces 1008, 1108, 1208 may be used in operating the respective apparatuses. The user interfaces may each comprise buttons, a keyboard, means for receiving voice commands, such as a microphone, touch buttons, slide buttons, etc.
The at least one processor 1002 may comprise a database generation circuitry 1010 for generating the database for the objects and the associated reference sensor fingerprints and, possibly, for the metadata. A search control circuitry 1012 may be for performing the search of the objects on the basis of the search keys. A calibration & correction circuitry 1014 may be responsible for correcting the received sensor fingerprints on the basis of the motion data, or on the basis of known bias, for example.
The at least one processor 1102 may comprise a reference sensor fingerprint generation circuitry 1110 for generating the reference sensor fingerprint with the help of the magnetometer 1120 or a signal reception unit 1126, a motion data measurement circuitry 1112 for measuring the motion data with the help of the IMU 1122 and/or the odometer 1124, an object detection circuitry 1114 for detecting objects and for generating reference metadata, and a calibration & correction circuitry 1116 for performing a calibration process of the magnetometer 1120 and/or the signal reception unit 1126, and/or correcting the acquired information from the magnetometer 1120 and/or from the signal reception unit 1126, for example. A camera 1128 and microphones may be used for capturing images and/or video (e.g. objects), for example. The signal reception unit 1126 may be for detecting RF signals, such as Wi-Fi, Bluetooth, or cellular RF signals, or for detecting GPS signals, for example.
The at least one processor 1202 may comprise a target sensor fingerprint generation circuitry 1210 for generating the target sensor fingerprint with the help of the magnetometer 1220 or a signal reception unit 1226, a motion data measurement circuitry 1212 for measuring the motion data with the help of the IMU 1222 and/or the odometer 1224, a metadata generation circuitry 1214 for generating target metadata, and a calibration & correction circuitry 1216 for performing a calibration process of the magnetometer 1220 and/or the signal reception unit 1226, and/or correcting the acquired information from the magnetometer 1220 and/or from the signal reception unit 1226, for example. A camera 1228 and microphones may be used for capturing images and/or video (e.g. target objects), for example. The signal reception unit 1226 may be for detecting RF signals, such as Wi-Fi, Bluetooth, Bluetooth low energy (BLE), or cellular RF signals, or for detecting GPS signals, for example.
The magnetometers 1120 and 1220 may each comprise at least one measuring axis. However, in an embodiment, the magnetometer may comprise three-dimensional measuring capabilities. Yet in one embodiment, the magnetometer may be a group magnetometer, or a magnetometer array which provides magnetic field observations simultaneously from multiple locations spaced apart.
As used in this application, the term ‘circuitry’ refers to all of the following: (a) hardware-only circuit implementations, such as implementations in only analog and/or digital circuitry, (b) combinations of circuits and software (and/or firmware), such as (as applicable): (i) a combination of processor(s) or (ii) portions of processor(s)/software including digital signal processor(s), software, and memory(ies) that work together to cause an apparatus to perform various functions, and (c) circuits, such as a microprocessor(s) or a portion of a microprocessor(s), that require software or firmware for operation, even if the software or firmware is not physically present. This definition of ‘circuitry’ applies to all uses of this term in this application. As a further example, as used in this application, the term ‘circuitry’ would also cover an implementation of merely a processor (or multiple processors) or a portion of a processor and its (or their) accompanying software and/or firmware. The term ‘circuitry’ would also cover, for example and if applicable to the particular element, a baseband integrated circuit or applications processor integrated circuit for a mobile phone or a similar integrated circuit in an entity, a cellular network device, or another network device.
The techniques and methods described herein may be implemented by various means. For example, these techniques may be implemented in hardware (one or more devices), firmware (one or more devices), software (one or more modules), or combinations thereof. For a hardware implementation, the apparatus(es) of embodiments may be implemented within one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, other electronic units designed to perform the functions described herein, or a combination thereof. For firmware or software, the implementation can be carried out through modules of at least one chip set (e.g. procedures, functions, and so on) that perform the functions described herein. The software codes may be stored in a memory unit and executed by processors. The memory unit may be implemented within the processor or externally to the processor. In the latter case, it can be communicatively coupled to the processor via various means, as is known in the art. Additionally, the components of the systems described herein may be rearranged and/or complemented by additional components in order to facilitate the achievements of the various aspects, etc., described with regard thereto, and they are not limited to the precise configurations set forth in the given figures, as will be appreciated by one skilled in the art.
Embodiments as described may also be carried out in the form of a computer process defined by a computer program. The computer program may be in source code form, object code form, or in some intermediate form, and it may be stored in some sort of carrier, which may be any entity or device capable of carrying the program. For example, the computer program may be stored on a computer program distribution medium readable by a computer or a processor. The computer program medium may be, for example but not limited to, a record medium, computer memory, read-only memory, electrical carrier signal, telecommunications signal, or software distribution package. Coding of software for carrying out the embodiments as shown and described is well within the scope of a person of ordinary skill in the art.
Even though the invention has been described above with reference to an example according to the accompanying drawings, it is clear that the invention is not restricted thereto but can be modified in several ways within the scope of the appended claims. Therefore, all words and expressions should be interpreted broadly and they are intended to illustrate, not to restrict, the embodiment. It will be obvious to a person skilled in the art that, as technology advances, the inventive concept can be implemented in various ways. Further, it is clear to a person skilled in the art that the described embodiments may, but are not required to, be combined with other embodiments in various ways.
This is a Continuation-in-Part of application Ser. No. 14/054,264 filed Oct. 15, 2013. The disclosure of the prior application is hereby incorporated by reference herein in its entirety.
Relation | Number | Date | Country
---|---|---|---
Parent | 14054264 | Oct 2013 | US
Child | 14093250 | | US