This pertains to remote sensing of physical items and has application, by way of example, in retailing.
An online store can capture and record clicks by consumers, and that data provides insights into customer behavior and interests. For example, if many visitors click on a specific product, an owner of the online store can infer that the product is attracting interest and can analyze what kind of demographics are behind that interest. Such tracking is not possible in physical (a/k/a “brick-and-mortar” or “offline”) stores. This limits access to highly valuable data and, by way of example, the ability of brands and/or retailers to provide a unified commerce experience across the online and offline channels.
A more complete understanding of the discussion that follows may be attained by reference to the drawing.
Disposed on user 12 is a device 16—here, a personal digital assistant in the form factor of a “smart” watch of the type known and operating in accord with the art as adapted in accord with the teachings hereof. In other embodiments, the device 16 may be another manner of personal digital assistant, e.g., a cell phone, an electronic bracelet, a contactless loyalty card, or other wirelessly-detectable device that is disposed in, on, or about the user 12 and by which he/she may be distinguished (directly or indirectly) from other individuals (also bearing such devices) who may be present in space 10. Such user devices may be of the type known in the art and operating (actively, passively or otherwise) in accord therewith as adapted in accord with the teachings hereof.
Disposed on retail items 14 are wireless IoT sensors of the type known in the art as adapted in accord with the teachings hereof. Such sensors 18 are shown identically disposed on respective items 14 in the drawing; to avoid clutter, only three are labelled. Each sensor 18 comprises a processor 20, antenna 22 and one or more sensor elements 24, shown in breakout by way of example in the upper left of the drawing. The elements 20-24 on a given retail item 14 may be separate but interconnected with one another, or they may be integrated onto a single circuit board (flexible or otherwise) or on a single chip (e.g., as in the case of a SoC, or system on a chip, implementation), or otherwise. IoT sensors 18 can be disposed on or in the respective items 14; this is referred to herein as “on” for simplicity and without loss of generality.
Sensor elements 24 of IoT sensor 18 are of the type known in the art for any of optical, acceleration, tilt, force, load, torque, strain, pressure, position, presence, motion, velocity, displacement, temperature, acoustic, sound, vibration or other sensing, as selected and adapted in accord with the teachings hereof. The elements 24 are employed, here, to sense changes in position and/or orientation of the item 14 on which they are disposed. Though two elements 24 are shown in the breakout, a greater or lesser number of them may be provided in other embodiments.
The processor 20, which can include or be coupled with random access memory and input/output subsections, may be of the type known in the art for use in an IoT sensor as adapted in accord with the teachings hereof, e.g., through loading and execution of software 26, discussed below.
Antenna 22 can be of the conventional type known in the art for use with IoT sensors. It can be configured for use, along with processor 20 or other logic, to wirelessly transmit information signals utilizing WiFi, Bluetooth, ZigBee, 2G/3G/4G cellular, or other protocols known in the art as adapted in accord with the teachings hereof. As indicated by wavefronts in the drawing, antenna 22 (and the IoT sensor 18 of which it forms a part) wirelessly communicates with user device(s) 16 as well as with digital data processor 30. Communications with the latter are additionally supported by network 28, which comprises one or more conventional networks, e.g., local area networks (LANs), wide area networks (WANs), metropolitan area networks (MANs), and/or Internet(s), suitable for supporting communications between IoT sensor 18 and digital data processor 30.
Digital data processor 30 comprises a conventional desktop computer, workstation, minicomputer, laptop computer, tablet computer, PDA, mobile phone or other digital data device of the type that is commercially available in the marketplace as adapted in accord with the teachings hereof. It may be configured as, coupled to, and/or provide a database system (including, for example, a multi-tenant database system) that includes disk or other storage 32, or other system or environment, and it may be arranged to communicate with IoT sensors 18 in peer-to-peer fashion, per a client-server model, or otherwise, consistent with the teachings hereof.
The software 26 comprises computer programs (i.e., sets of computer instructions) stored on transitory and non-transitory machine-readable media of the type known in the art as adapted in accord with the teachings hereof, which computer programs cause the sensor 18 and, more particularly, the respective processor 20 thereof, to perform the respective operations and functions attributed thereto herein. Such machine-readable media can include, by way of non-limiting example, hard drives, solid state drives, and so forth, coupled to the respective sensor 18 and processor 20 directly or indirectly (e.g., via network 28) in the conventional manner known in the art as adapted in accord with the teachings hereof.
In step 34, the software 26 detects a change in position and/or orientation of the item 14 in which that software 26 is executing. This can be by monitoring one or more of the respective sensor elements 24 (e.g., a tilt and/or motion sensor element) disposed in that item 14 and/or by responding to an interrupt or other event signaled by such element(s) 24, or otherwise as within the ken of those skilled in the art in view of the teachings hereof.
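By way of non-limiting example, the monitoring of step 34 may proceed substantially as reflected in the following sketch, expressed in Python for purposes of illustration only. The read_tilt and read_acceleration callables, like the thresholds and polling interval shown, are hypothetical stand-ins for whatever interfaces and parameters the sensor elements 24 of a given embodiment expose; they do not refer to any particular library.

    # Illustrative sketch of step 34: detect a change in position and/or
    # orientation of item 14 by polling sensor elements 24 (e.g., tilt and
    # acceleration elements). All names and thresholds are hypothetical.
    import time

    TILT_THRESHOLD_DEG = 5.0   # minimum tilt change treated as significant
    ACCEL_THRESHOLD_G = 0.05   # minimum acceleration change treated as significant
    POLL_INTERVAL_S = 0.1      # polling period, in seconds

    def detect_position_change(read_tilt, read_acceleration):
        """Block until a change in position/orientation is detected; return its timestamp."""
        baseline_tilt = read_tilt()
        baseline_accel = read_acceleration()
        while True:
            time.sleep(POLL_INTERVAL_S)
            if (abs(read_tilt() - baseline_tilt) > TILT_THRESHOLD_DEG or
                    abs(read_acceleration() - baseline_accel) > ACCEL_THRESHOLD_G):
                return time.time()

An interrupt-driven embodiment may instead register a callback with the element(s) 24 and dispense with polling altogether.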
In step 36, the software 26 identifies a user device 16 (and thereby, indirectly, its user 12) associated with the change of position/orientation of the item 14 in which that software is executing. This can be by querying signals received by one or more of the respective sensor elements 24 (e.g., a wireless transmitter sensor) disposed in that item, by monitoring incoming transmissions received on antenna 22 of that item 14, or otherwise as within the ken of those skilled in the art in view of the teachings hereof, to determine whether there is a device 16 in sufficiently close range to be likely associated with a user 12 responsible for the change in position/orientation. See step 38. That range can be as little as one foot, by way of example, to ensure that only a watch 16 on the arm of a responsible user 12 is detected, or as much as 10 to 15 feet or more, by way of further example, to detect a cell phone owned by that user 12 disposed in a nearby shopping cart, all in accord with the demands of the embodiment and environment in which it is utilized.
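By way of non-limiting example, the determination of steps 36-38 may be realized substantially as in the following Python sketch, in which scan_nearby_devices is a hypothetical placeholder returning (device identifier, received signal strength) pairs observed on antenna 22, and the free-space path-loss parameters are illustrative only.

    # Illustrative sketch of steps 36-38: identify a user device 16 within a
    # configurable range of the item. All names and parameters are hypothetical.

    def rssi_to_distance_m(rssi_dbm, tx_power_dbm=-59.0, path_loss_exponent=2.0):
        """Rough free-space estimate of distance (meters) from received signal strength."""
        return 10 ** ((tx_power_dbm - rssi_dbm) / (10.0 * path_loss_exponent))

    def find_device_in_range(scan_nearby_devices, max_range_m=0.3):
        """Return the identifier of a device estimated to lie within max_range_m (roughly one foot here), else None."""
        for device_id, rssi_dbm in scan_nearby_devices():
            if rssi_to_distance_m(rssi_dbm) <= max_range_m:
                return device_id
        return None

The max_range_m parameter may be enlarged (e.g., to several meters) in embodiments that must also detect a cell phone in a nearby shopping cart.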
The identification determined in step 36 can be any manner of identification associated with the device 16 from which the identity of its user 12 can be discerned (directly or indirectly) by specific individual, by population segmentation, classification or otherwise. In some embodiments, the user devices 16 execute general- or special-purpose software causing them to generate signals from which their user's respective identity can be more readily determined, e.g., upon transmission to digital data processor 30.
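By way of non-limiting example, a user device 16 so configured might advertise an opaque, periodically rotating token from which the digital data processor 30 can later resolve the user's identity or segment, substantially as in the following Python sketch. The account_id, shared_secret and advertise callable are hypothetical, and the particular derivation shown is merely one of many that may be employed.

    # Illustrative sketch: a user device 16 periodically advertises an opaque
    # identification token derived from an account identifier. All names are hypothetical.
    import hashlib
    import time

    def make_identification_token(account_id, shared_secret):
        """Opaque token that rotates every five minutes, resolvable by digital data processor 30."""
        epoch = int(time.time()) // 300
        digest = hashlib.sha256(f"{shared_secret}:{account_id}:{epoch}".encode())
        return digest.hexdigest()[:16]

    def advertise_identity(advertise, account_id, shared_secret):
        """Hand the current token to the device's wireless advertising interface."""
        advertise(make_identification_token(account_id, shared_secret))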
In the event the software 26 detects multiple user devices 16 within range, it attempts to resolve which of those devices is most likely associated with a user 12 responsible for the change in the position/orientation of the item 14 in which that software is executing. This can be by comparing the strength of signals received from the multiple devices 16 at one or more of the respective sensor elements 24, at the antenna 22 of that item 14, or otherwise as within the ken of those skilled in the art in view of the teachings hereof. If the software 26 is able to resolve among multiple user devices, it discerns an identification of the type discussed above associated with the device 16 most likely associated with the change in position/orientation.
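By way of non-limiting example, that resolution may compare received signal strengths substantially as in the following Python sketch, where observations is a hypothetical list of (device identifier, received signal strength in dBm) pairs gathered from the elements 24 and/or antenna 22 of the item.

    # Illustrative sketch: among several user devices 16 in range, select the one
    # with the strongest received signal, provided it dominates by a clear margin.

    def resolve_most_likely_device(observations, min_margin_db=3.0):
        """Return the strongest device identifier, or None if no device clearly dominates."""
        if not observations:
            return None
        ranked = sorted(observations, key=lambda obs: obs[1], reverse=True)
        if len(ranked) == 1 or ranked[0][1] - ranked[1][1] >= min_margin_db:
            return ranked[0][0]
        return None  # ambiguous: no single device stands out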
If the software 26 is unable to identify a specific user device 16 associated with the change of position/orientation, it takes no further action. Otherwise, processing progresses to optional step 44, where the software 26 monitors for further changes in position/orientation and/or a cessation in same. The software 26 performs such further monitoring in the manner discussed above in connection with step 34 or otherwise as within the ken of those skilled in the art in view of the teachings hereof.
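By way of non-limiting example, the further monitoring of step 44 may be realized substantially as in the following Python sketch, which reuses the hypothetical read_tilt and read_acceleration interfaces of the step 34 sketch and treats a configurable quiet period with no significant change as a cessation.

    # Illustrative sketch of step 44: record further changes in position/orientation
    # and return once no significant change has been observed for QUIET_PERIOD_S seconds.
    import time

    QUIET_PERIOD_S = 2.0  # quiet period treated as a cessation of motion

    def monitor_until_cessation(read_tilt, read_acceleration,
                                tilt_threshold_deg=5.0, accel_threshold_g=0.05,
                                poll_interval_s=0.1):
        """Return (changes, stop_timestamp): observed changes and the time motion ceased."""
        changes = []
        last_tilt, last_accel = read_tilt(), read_acceleration()
        last_change_at = time.time()
        while time.time() - last_change_at < QUIET_PERIOD_S:
            time.sleep(poll_interval_s)
            tilt, accel = read_tilt(), read_acceleration()
            if (abs(tilt - last_tilt) > tilt_threshold_deg or
                    abs(accel - last_accel) > accel_threshold_g):
                changes.append((time.time(), tilt, accel))
                last_change_at = time.time()
            last_tilt, last_accel = tilt, accel
        return changes, last_change_at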
In steps 46-48, the software 26 responds to a cessation of motion by determining the length of time that the item 14 was held for inspection or otherwise by user 12. This can be by comparing the time the change in position/orientation was first detected (in step 34) with the time it stopped (in step 44) or otherwise as within the ken of those skilled in the art in view of the teachings hereof.
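By way of non-limiting example, that comparison may be as simple as differencing the two timestamps, as in the following Python sketch; the timestamps are those produced by the step 34 and step 44 sketches above and are, again, illustrative only.

    # Illustrative sketch of steps 46-48: length of time the item 14 was held,
    # computed from the timestamps at which motion began (step 34) and ceased (step 44).

    def interaction_duration_s(start_timestamp, stop_timestamp):
        """Duration, in seconds, for which item 14 was held or otherwise handled."""
        return max(0.0, stop_timestamp - start_timestamp)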
Alternatively, or in addition, in steps 50-52, the software 26 responds to further changes in the position/orientation of the item 14 in which that software 26 is executing by determining what sort of interaction the user 12 is engaged in with the item 14. That can include, for example, determining whether the user is trying out (e.g., trying on) the item 14, looking at a price tag or other label on the item 14, moving the item 14 within the retail space 10, or otherwise. The software makes the determination by comparing the changes detected in steps 34 and 44 with patterns characteristic of those types of interactions.
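By way of non-limiting example, that comparison may be realized substantially as in the following Python sketch, in which the feature names, reference patterns and nearest-pattern matching rule are hypothetical examples of the patterns characteristic of those types of interactions referred to above.

    # Illustrative sketch of steps 50-52: classify the interaction by comparing
    # observed motion features against stored reference patterns. All values are hypothetical.

    REFERENCE_PATTERNS = {
        "trying_on":     {"tilt_range_deg": 60.0, "displacement_m": 0.5},
        "reading_label": {"tilt_range_deg": 90.0, "displacement_m": 0.1},
        "carrying":      {"tilt_range_deg": 20.0, "displacement_m": 5.0},
    }

    def classify_interaction(observed):
        """Return the interaction type whose reference pattern is nearest to the observed features."""
        def distance(pattern):
            return sum((observed[key] - pattern[key]) ** 2 for key in pattern)
        return min(REFERENCE_PATTERNS, key=lambda name: distance(REFERENCE_PATTERNS[name]))

In other embodiments, a trained classifier may be substituted for the nearest-pattern rule shown.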
In step 54, the software 26 transmits to network 28, e.g., for routing to digital data processor 30, one or more of the identity of the item 14 in which the software 26 is executing, the identity of the user device 16 determined in step 36, and the duration and/or type of interaction determined in steps 48, 52.
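By way of non-limiting example, the transmission of step 54 may assemble and hand off a report substantially as in the following Python sketch, where transmit is a hypothetical placeholder for the sensor's wireless stack (e.g., the WiFi, Bluetooth or other interface discussed above) and the JSON encoding shown is merely one convenient wire format.

    # Illustrative sketch of step 54: report the item, the associated user device,
    # and the duration and/or type of interaction for routing to digital data processor 30.
    import json
    import time

    def report_interaction(transmit, item_id, device_id, duration_s=None, interaction=None):
        payload = {
            "item_id": item_id,          # identity of item 14
            "device_id": device_id,      # identity of user device 16 (step 36)
            "duration_s": duration_s,    # length of time held (step 48), if determined
            "interaction": interaction,  # type of interaction (step 52), if determined
            "reported_at": time.time(),
        }
        transmit(json.dumps(payload))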
Steps 56 and 58 of
The embodiments above are merely illustrative. Other embodiments are contemplated, as well. For example, although in the embodiment of