The accompanying drawings illustrate implementations of the concepts conveyed in the present patent. Features of the illustrated implementations can be more readily understood by reference to the following description taken in conjunction with the accompanying drawings. Like reference numbers in the various drawings are used wherever feasible to indicate like elements. In some cases, parentheticals are utilized after a reference number to distinguish like elements. Use of the reference number without the associated parenthetical is generic to the element. Further, the left-most numeral of each reference number conveys the figure and associated discussion where the reference number is first introduced.
This description relates to friction-free inventory control concepts. Existing inventory controls tend to be ineffective (e.g., inaccurate) and/or burdensome to users involved with them. The following description offers friction-free inventory control that can be implemented nearly seamlessly for users. These inventory control concepts can be implemented in almost any use case scenario that involves tracking locations of items, objects, and/or users and/or their inter-relationships in a physical environment. For purposes of explanation, the description first turns to a retail shopping scenario, followed by a construction/manufacturing scenario, and finally a health care scenario.
Traditionally, in retail shopping scenarios, inventory control has been accomplished manually by forcing the user to go through a check stand where a clerk either manually enters or electronically scans the user's items. The user then pays the clerk for the items before leaving. Waiting in a check-out line is frustrating for shoppers and is often cited as the least enjoyable part of shopping. Attempts have been made to reduce these check-out lines by utilizing self-check kiosks. However, the process still has similar pitfalls, and users often end up waiting in line for a kiosk and wasting time in the check-out process. Users often have trouble with the self-check process, which tends to cause delays and results, once again, in longer check-out times. More sophisticated attempts to provide a seamless user experience face the daunting technical challenge of unobtrusively and accurately identifying users that are in the inventory control environment and determining what inventory items individual users have in their possession. In light of these and other challenges, the present concepts can utilize data from multiple sensors over time to identify users and items and associations therebetween.
The inventory control environment 100 can also include various sensors (indicated generally at 110). In this example, the sensors 110 include RFID sensors (e.g., antennas) 112, cameras 114 (visible light and/or infrared, 2D and/or 3D), NFC sensors 116, and/or weight sensors (e.g., scales) 118, among others. The RFID sensors 112 and NFC sensors 116 can sense tagged items 108. The cameras 114 and weight sensors 118 can sense tagged items 108 and/or untagged items 102.
In some implementations, the sensors 110 can be organized into sets 120 to achieve a given function. For instance, a first set 120(1) can operate to sense items 102, while a second set 120(2) can operate to sense users 122.
Other implementations may identify the individual users 122 with additional or alternative techniques. For instance, individual users may have their smart phones with them. Communications can be established with the smart phone to identify the user, and the user's location can be tracked by tracking the location of the smart phone. In one example, the user may have an app on their smart phone for an entity associated with the inventory control environment 100. The app may include an agreement that defines conditions of use that have been approved by the user. The conditions of use may allow the entity to use the smart phone to identify and track the user when the smart phone is detected in the inventory control environment 100. The app may also define payment aspects (discussed more below). In another example, the user may wear smart wearable devices, such as bands, glasses, belts, and/or rings, to achieve the same capabilities as the smart phone.
Viewed from one perspective, an advantage of the present implementations is the ability to utilize whatever sensor data is available from one or more types of sensors and to analyze this collection of sensed data to obtain information about users, items, and/or relationships between users and items. This process can be termed ‘sensor fusion’ and will be explained in further detail in the discussion below. Sensor fusion can reduce the limitations and uncertainties that come with any type of sensor by combining observations from multiple sensors over space and time to improve the accuracy of determinations about the items and/or users. This improved accuracy can be achieved without inconveniencing the users.
In the inventory control environment 100, the users 122 can interact with various items 102 in a traditional manner. For instance, as illustrated in
In this example, looking at
An action can be taken based upon the user's possession of the items 102 from the first location to the second location. For instance, the user 122(1) can be deemed to want to purchase the items 102 in their possession at the second location. The items 102 in the user's possession can be verified at the second location. For instance, in this example, a listing 132 of the tagged items can be provided to the user, such as on displays 134. The user can verify the listing 132. The user can then be charged for the possessed (and verified) items 102, such as on a credit card account on record for the user or by the user paying cash, EBT, check, or other traditional forms of payment for the items. The payment aspect may be defined according to conditions agreed to by the entity associated with the inventory control environment (e.g., operating entity) and the user, such as by an app on the user's smartphone. In some implementations, the user can continue on her way without the hassle of checkout lines and the shopping experience can be seamless from beginning to end.
The illustrated scenario involves users 122A(1) and 122A(2) and uses sensor fusion and co-location to determine which user is in possession of example item 102N.
This implementation is not directed to specific types of sensors 110N and instead can utilize whatever sensor data is available. The available sensor data can be fused together to obtain information about users 122A and items 102N over time. For instance, fused data relating to the users can provide many useful parameters, such as skeletal parameters, facial parameters, heat, footsteps, gait length, pulse, respiration rate, etc. These parameters can be used for distinguishing and/or identifying various users. These parameters can also be used for locating individual users and/or detecting user gestures, motion, and/or activity, such as walking and/or picking something up.
Similarly, sensor fusion can provide sensed data relating to the appearance of the items, such as shape, design, color, pattern, size, weight, and/or material. These parameters can be used to identify an item, but identification can be even more accurate when the parameters are combined with tag information (such as RFID tags), unique codes (such as QR codes), and/or other physically distinctive aspects. The location of individual items can be tracked with vibration/acceleration data, ultra-sound reflection data, and/or displacement in camera field of view, among others.
In the illustrated example, weight sensors, cameras, and RFID sensors can all provide information about whether the item is still on the shelf or not. Once the item is picked up, both the cameras and the RFID sensors can provide data that can be used for determining its location. If the user is holding the item, the cameras may provide more accurate location information than the RFID sensors and as such be weighted higher in determinative value. In contrast, if the user puts the item in a shopping cart and puts other items on top of it, the value of the camera data may decrease and be weighted lower than RFID data. The available sensor data can be collectively evaluated or fused to determine the locations of the users and the items at various times. Consecutive locations can be utilized to track paths 202A(1), 202A(2), and 202A(3).
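For purposes of illustration only, the following sketch shows one way such context-dependent weighting of location estimates might be expressed, assuming hypothetical per-sensor position estimates and weights (the names and numbers are illustrative and not part of the described implementations):

```python
# Illustrative sketch: fuse per-sensor 2D location estimates with
# context-dependent weights (e.g., camera weighted down when the item
# is occluded inside a cart). Names and values are hypothetical.

def fuse_locations(estimates):
    """Weighted average of (x, y) estimates.

    estimates: list of dicts with 'xy' (tuple) and 'weight' (float >= 0).
    """
    total = sum(e["weight"] for e in estimates)
    if total == 0:
        raise ValueError("no usable sensor estimates")
    x = sum(e["xy"][0] * e["weight"] for e in estimates) / total
    y = sum(e["xy"][1] * e["weight"] for e in estimates) / total
    return (x, y)

# Item held in the user's hand: camera trusted more than RFID ranging.
held = [
    {"xy": (4.1, 7.9), "weight": 0.8},   # camera
    {"xy": (4.6, 7.2), "weight": 0.2},   # RFID
]
# Item buried in a cart: camera weight reduced, RFID weight raised.
buried = [
    {"xy": (4.3, 7.7), "weight": 0.1},   # camera (occluded)
    {"xy": (4.6, 7.3), "weight": 0.9},   # RFID
]
print(fuse_locations(held))    # dominated by the camera estimate
print(fuse_locations(buried))  # dominated by the RFID estimate
```

In this sketch, the weights play the role described above: camera-derived positions dominate when the item is visible in the user's hand, while RFID-derived positions dominate when the item is occluded in the cart.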
From one perspective, the illustrated implementation can provide useful information about objects and users through one or more sensor fusion paradigms, such as multi-sensor fusion, temporal-spatial fusion, and/or source separation. For instance, a single event, such as identifying an item, can be observed/sensed by multiple sensors (in some cases with multiple modalities per sensor). The observations can be fused together to provide a more accurate identification than can be achieved by a single sensor. For instance, an item can be identified by its size, shape, and/or color (e.g., using multiple cameras from multiple view angles). The item can also be sensed for weight (from a scale beneath it) and/or for composition by a metal detector. In one such example, a metal can of soup can be distinguished from an aluminum can of soda despite similar weights, shapes, and labels.
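As a hedged illustration of this multi-sensor fusion paradigm, the following sketch combines assumed per-sensor likelihoods with Bayes' rule to distinguish the soup can from the soda can; all probabilities are placeholders rather than measured values:

```python
# Illustrative multi-sensor fusion for item identification: independent
# sensor likelihoods are combined with Bayes' rule to distinguish a steel
# can of soup from an aluminum can of soda. All numbers are made-up.

def fuse_identity(prior, likelihoods):
    """prior: {label: P(label)}; likelihoods: list of {label: P(obs | label)}."""
    posterior = dict(prior)
    for obs in likelihoods:
        posterior = {label: posterior[label] * obs[label] for label in posterior}
        norm = sum(posterior.values())
        posterior = {label: p / norm for label, p in posterior.items()}
    return posterior

prior = {"soup": 0.5, "soda": 0.5}
observations = [
    {"soup": 0.55, "soda": 0.45},  # cameras: similar size/shape/label
    {"soup": 0.60, "soda": 0.40},  # scale: similar weights
    {"soup": 0.95, "soda": 0.10},  # metal detector: ferrous response
]
print(fuse_identity(prior, observations))  # soup dominates after fusion
```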
In temporal-spatial fusion, observations of an individual item can be made over time and space. Physical laws (such as laws of motion) and correlations can be used to constrain the possible states of the item and reduce uncertainties. For example, Newton's laws can be applied to the sensor data to model the trajectory of the item. Given an estimation of the current position and an observation of any applied force, temporal-spatial fusion implementations can estimate the next possible position of the item and its uncertainty.
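A minimal sketch of such a temporal-spatial prediction step, assuming a simple constant-velocity motion model and hypothetical positions, velocities, and noise values, might look as follows:

```python
# Illustrative temporal-spatial fusion step: predict the next position of a
# tracked item from its current estimate using a constant-velocity model
# (a discrete application of Newton's laws of motion), with uncertainty
# growing between observations. Values are hypothetical.

def predict(position, velocity, variance, dt, process_noise=0.05):
    """Return predicted (position, variance) after dt seconds."""
    new_position = tuple(p + v * dt for p, v in zip(position, velocity))
    new_variance = variance + process_noise * dt  # uncertainty grows over time
    return new_position, new_variance

pos, var = (4.0, 7.5), 0.10   # meters, meters^2 (assumed)
vel = (0.8, 0.0)              # user walking along an aisle (assumed)
for step in range(3):
    pos, var = predict(pos, vel, var, dt=1.0)
    print(f"t+{step + 1}s: position={pos}, variance={var:.2f}")
```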
In source separation fusion, an observation of items/users may contain signals from multiple events mixed together. Features can be used to estimate which part of the signal comes from which source. For instance, multiple users may be talking at the same time. When sensed with a microphone array, source separation fusion implementations can separate individual users based on the direction of the sound source. The present implementations can employ various fusion algorithms, such as statistics, Bayesian inference, Dempster-Shafer evidential theory, neural networks and machine learning, fuzzy logic, Kalman filters, and/or particle filters. An example particle filter implementation is described in more detail below.
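As one hedged illustration of the source separation paradigm, the following sketch estimates the direction of a talker from the time difference of arrival between two microphones, using the standard far-field relation θ = arcsin(c·Δt/d); the microphone spacing and delays are assumed for illustration:

```python
# Illustrative source-separation step: estimate the direction of a talker
# from the time difference of arrival (TDOA) between two microphones.
# For a far-field source, delta_t = d * sin(theta) / c, so
# theta = arcsin(c * delta_t / d). Spacing and delays are hypothetical.

import math

SPEED_OF_SOUND = 343.0  # m/s at room temperature

def direction_from_tdoa(delta_t, mic_spacing):
    """Angle (degrees) of the source relative to the array broadside."""
    ratio = SPEED_OF_SOUND * delta_t / mic_spacing
    ratio = max(-1.0, min(1.0, ratio))  # clamp against measurement noise
    return math.degrees(math.asin(ratio))

# Two talkers sensed by a pair of microphones 0.2 m apart (assumed).
print(direction_from_tdoa(+2.9e-4, 0.2))  # roughly +30 degrees
print(direction_from_tdoa(-1.0e-4, 0.2))  # roughly -10 degrees
```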
Time Five shows user 122A(2) once again co-located with the item 102N while user 122A(1) is relatively far from the item 102N and is moving away. When viewed collectively, analysis of the sensed data can indicate that both users were near the item at Time Two, but then at Time Three user 122A(1) moved away from the item while the item moved with user 122A(2). At Time Four, the users were both once again close to the item, but again user 122A(1) moved away from the item while the item moved or tracked with user 122A(2) to checkout at Time Five. Thus, analysis of the sensor data over the time range can indicate that it is much more likely that user 122A(2) is in possession of the item than user 122A(1) and, further, that user 122A(2) plans to purchase the item.
This accurate determination can be achieved without requiring the locations of the items and users to be determined with precision. Instead, when determined at multiple times, approximate locations can provide very reliable (e.g., high confidence level) results about inter-relationships of individual items and individual users. For instance, the approximate locations of the users and items could each be a circle having a diameter of 1-5 meters. Multiple approximate locations can be evaluated over time to provide highly accurate inter-relationships.
The location information about the items and the users can be useful in other ways. For instance, rather than the scenario described above where user 122A(2) picks up item 102N and leaves the inventory control environment with the item, consider another scenario where the user puts the item back on another shelf at Time Three. This information can be used in multiple ways. First, the item is less likely to be purchased by another user when it is out of place. Also, it creates the appearance that inventory of that item is lower than it actually is. Further, if the item has special constraints, such as regulatory constraints, the location information can help ensure that those constraints are satisfied. For instance, assume that the item is a refrigerated food item, such as a carton of milk that the user took out of the refrigerated environment at Time Two and put back on a non-refrigerated shelf at Time Three. The location information provides not only where the item is, but how long it has been there (e.g., when it was removed from the refrigerated environment). This information can allow appropriate measures to be taken in regard to the item. For instance, the item can be returned to the refrigerated environment within a specified time or disposed of after that time to avoid product degradation.
In another example, the item location information can be used to curtail nefarious behavior. For instance, if the item location information indicates that the item left the inventory control environment at a specific time, but no one paid for the item, this information can be used to identify system shortcomings (e.g., someone had it in their cart but the system failed to charge them for it). Alternatively, an individual user, such as a shopper or an employee, may have taken active measures to leave without paying for the item. Various actions can be taken in such a case. For instance, if over time multiple items leave the inventory control environment without being paid for, analysis of users leaving at the same time can indicate a pattern of a particular user leaving with items without permission (e.g., without paying for them). The present techniques can also provide a confidence level for each user leaving with the item. For instance, users one, two, and three all left the inventory control environment at the same time as the item. Based upon their locations through the inventory control environment and co-location with the item, the likelihood that user one has the item is 40%, user two 30%, and user three 20% (with a 10% chance that none of them has the item). Looking at previous instances, user one has previously been associated with items ‘leaving’ the inventory control environment, and so confidence levels can be adjusted to 60% for user one, 20% for user two, and 10% for user three, for example.
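A hedged sketch of how such history-based adjustment might be performed is shown below; the baseline percentages follow the example above, while the history multiplier is an assumption chosen for illustration:

```python
# Illustrative sketch: adjust per-user confidence levels for an unpaid item
# using each user's history of prior incidents, then renormalize. The
# baseline percentages follow the example above; the multiplier is assumed.

def reweight(baseline, history_factor):
    adjusted = {who: p * history_factor.get(who, 1.0) for who, p in baseline.items()}
    total = sum(adjusted.values())
    return {who: round(100 * p / total, 1) for who, p in adjusted.items()}

baseline = {"user1": 0.40, "user2": 0.30, "user3": 0.20, "none": 0.10}
history = {"user1": 3.0}   # user1 previously associated with items 'leaving'
print(reweight(baseline, history))
# user1's share rises while the others shrink proportionally
```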
As mentioned above, the present inventory control concepts can be employed in many use case scenarios. In a manufacturing or construction scenario, the sensor fusion and co-location aspects can be used to track the user and items and/or other things. For instance, sensor fusion can be used to identify IoT devices and/or robots/AI devices. For example, sensor fusion can be used to sense parameters relating to appearance, size, weight, RF signature, power signature, etc. of these ‘devices.’ This information can be used to identify individual devices. Location of these devices can be determined (actual and/or relative to items and/or users) utilizing RF reading range, triangulation, RF phase change, Doppler shift, and/or inertial measurement units, among others. For example, Doppler shift can be used to determine whether the item is moving toward or away from an individual sensor. Alternatively or additionally, Doppler shift can be used to track local motion of the item/object, such as caused by arm swinging, and compare it with motion of arms in the scene using computer vision. Utilizing any combination of the above sensor data, the present concepts can be utilized to identify any kind of object or being, determine its location, and/or determine inter-relationships with other objects and/or beings.
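For illustration, the following sketch converts a measured Doppler shift into a signed radial velocity, assuming a monostatic backscatter link (such as an RFID reader) where the round-trip shift is approximately 2·f·v/c; the carrier frequency and shifts are hypothetical:

```python
# Illustrative use of Doppler shift to decide whether a tagged item is moving
# toward or away from a sensor. Assuming a monostatic backscatter link,
# delta_f = 2 * f * v / c, so the sign of delta_f gives the direction.
# Carrier frequency and measured shifts are hypothetical.

SPEED_OF_LIGHT = 3.0e8  # m/s

def radial_velocity(delta_f_hz, carrier_hz):
    """Radial speed in m/s; positive means moving toward the sensor."""
    return delta_f_hz * SPEED_OF_LIGHT / (2.0 * carrier_hz)

carrier = 915e6  # UHF RFID band (assumed)
for shift in (+8.0, -4.0):  # Hz
    v = radial_velocity(shift, carrier)
    direction = "toward" if v > 0 else "away from"
    print(f"shift {shift:+.1f} Hz -> {v:.2f} m/s, {direction} the sensor")
```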
Multiple beneficial examples of utilizing this knowledge are provided above, but other examples are contemplated. For instance, in the manufacturing/construction scenario, a user may leave the inventory control environment with an item, such as a tool. If the user does not have permission, appropriate steps can be taken. More importantly, even if the user has permission, steps can be taken to increase efficiency. For instance, the user may take the tool to another jobsite (e.g., another inventory control environment), but the tool may be needed the next day at this jobsite. The fact that the tool is no longer at the inventory control environment can allow appropriate action to be taken, such as obtaining a replacement tool so that the process can be performed as planned the next day.
In another example, the inventory control concepts can be employed in a health care setting. For example, assume that the inventory control environment includes inventory areas, such as in a pharmacy, and a patient care area, and that both of these areas are covered by sensors throughout the inventory control environment. Assume that a user (e.g., health care provider) such as a doctor prescribes a prescription medicine for the patient in room ‘814’ and enters this information into an inventory control/tracking system. The prescription medicine can be maintained in the inventory control environment. Another health care provider, such as a nurse, can retrieve the prescription medicine. (This could occur directly, or another health care provider, such as a pharmacist, may retrieve the prescription medicine and transfer it to the nurse.) In either scenario, information from the sensors can identify that a user is now in possession of the prescription medicine, which health care provider possesses the prescription medicine, and/or the location of the prescription medicine within the health care facility.
Now assume that the nurse accidentally transposes the room number and enters patient room ‘841’ with the item (e.g., prescription medicine) rather than patient room ‘814.’ In such a case, within the inventory control environment, a location of an individual inventory control item has been identified and the location has been correlated to an individual (identified) user (this user is in possession of the item). As a result, actions can be automatically taken to prevent the prescription medicine from being administered to the wrong patient or otherwise mishandled. For instance, an alarm could be set off and/or a notice, such as a page or a text, could be sent to the nurse and/or the nurse's supervisor. Thus, without any user involvement or hassle, the inventory control environment can determine the location of items and who is in possession of individual items.
Looking first at Scenario One and Scenario Two, the particle filter sensor fusion technique 400 can fuse data from sensors 110N to determine an initial probability for each scenario. For instance, the sensors can provide item weight, item location, item image, user biometrics, user gestures, etc. The sensor data can also include stored data from previous user interactions, such as user purchase history and/or other information about the user. For instance, stored data could indicate that user 122B(1) has purchased item 102B(1) in the past, but never item 102B(2), and conversely, user 122B(2) has purchased item 102B(2) in the past, but never item 102B(1). The particle filter sensor fusion technique 400 can utilize this data to determine the initial probability for each scenario at 402. In this example, for purposes of explanation, assume that the initial probability for Scenario One is 70% and the initial probability for Scenario Two is 30%.
The particle filter sensor fusion technique 400 can next address the possibility of a handoff from one user to another in the inventory control environment at 404. Specifically, the particle filter sensor fusion technique can determine the probability that user 122B(1) handed whatever item he/she has (indicated as 102B(?)) to user 122B(3) when they pass each other. Item 102B(?) is shown with a cross-hatching pattern that is the sum of the patterns of items 102B(1) and 102B(2) to indicate that the identity of the item is not known with certainty. The particle filter sensor fusion technique can determine an initial probability of the handoff at 406. In this example, for purposes of explanation, assume that the initial probability of a handoff is 50% (50% probability that user 122B(1) transferred item 102B(?) to user 122B(3) and 50% probability that he/she retains the item).
The particle filter sensor fusion technique 400 continues to analyze sensor data over time at 406. This analysis of sensor data over time can increase and refine the initial determinations. For instance, in the illustrated example, various sensors 110N can continue to track user 122B(1) to increase the reliability of the initial determination whether user 122B(1) has item 102B(1). In this example, this additional sensor data may allow the confidence that user 122B(1) has item 102B(1) to approach 100%. For instance, a threshold can be defined, such as 95%, for example. Thus, if the additional data sensed over time provides a confidence level that satisfies the threshold, then the analysis can be treated as determinative, indicating at 408 that user 122B(1) is in possession of item 102B(1). If the confidence level does not satisfy the threshold, additional resources can be employed at 410 to increase the confidence level. In this example, the additional resources can include a human assistant who reviews the sensed data and makes the determination about what (if any) item user 122B(1) possesses. (In another example, the additional resource can be additional processing resources.) Thus, the additional resources can increase the confidence level above the threshold. With or without employing additional resources, a determination can be made, with a confidence that satisfies the threshold, that user 122B(1) is in possession of item 102B(1) at 412.
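A minimal sketch of this threshold-and-escalate logic, with placeholder confidence values and a hypothetical review step standing in for the additional resources, might look as follows:

```python
# Illustrative control flow for the thresholding step described above:
# treat the fused confidence as determinative when it meets the threshold,
# otherwise escalate to additional resources (e.g., human review).
# The threshold and confidence values are placeholders.

CONFIDENCE_THRESHOLD = 0.95

def resolve_possession(confidence, escalate):
    """Return (possesses_item, final_confidence) for the tracked user/item pair."""
    if confidence < CONFIDENCE_THRESHOLD:
        # Additional resources (a human assistant or extra processing)
        # refine the determination and return an updated confidence.
        confidence = escalate()
    return confidence >= CONFIDENCE_THRESHOLD, confidence

def human_review():
    # Placeholder for a review step that yields a near-certain confidence.
    return 0.99

print(resolve_possession(0.97, human_review))  # threshold met without escalation
print(resolve_possession(0.80, human_review))  # escalated, then determinative
```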
In that case, user 122B(1) did not hand off this item to user 122B(3) at 404. Thus, these percentages can be recalculated to reflect the probability of the handoff as 0%. Further, looking back to 402, because user 122B(1) has item 102B(1), the likelihood of Scenario One can be recalculated to 100% and the likelihood of Scenario Two can be recalculated to 0%. Further, given that Scenario One occurred at 402 and no handoff occurred at 406, a final determination can be made at 414 that user 122B(1) is in possession of item 102B(1), user 122B(2) is in possession of item 102B(2), and user 122B(3) is not in possession of either item 102B(1) or 102B(2). This information can be used at 416 to refine models applied to future scenarios in the inventory control environment to increase the accuracy of determinations that individual users are in possession of individual items.
The method can give each particle a value based on an initial distribution at 504. The initial distribution could start equal across all particles. For instance, particle weight (w) can be expressed as: w(x, y) = 1 / (total number of live particles). For example, assume that there are three people in a region of the inventory control environment. In the beginning, the distribution may be 33% per person, given that the method can equally distribute the probability percentage.
Then, the initial estimates can be updated using sensor data at 506. That is, the initial particle values can be updated based upon sensor data from various sensor sources. For example, continuing with the above example, assume that the region that includes the three people is covered by cameras. Using an input video stream from the cameras, or a combination of sensors (e.g., RFID tags), and the formula above, the method can adjust the probability for each individual to reflect the updated belief (e.g., confidence level) of who is the person of interest. For instance, the input data might shift the probabilities of the users from 33%, 33%, 33% to 20%, 60%, 20%, which means the method is identifying the second person with a 60% confidence level.
The above example reflects utilizing information from the sensors to update the probability values. Given that sensor data can be sampled over time (e.g., a time series recording of all three individuals in this example), and that the second user has now been identified with a 60% confidence level, the method can backtrack through the history of the video stream to identify the unknown users at time zero, when their probabilities were equally weighted. In effect, the method can change the probabilities at time zero from 33%, 33%, 33% to the new probability model of 20%, 60%, 20%. This brings a level of accuracy to the system by applying later probability values to historical events.
The updated weights can supplant the assigned weights in the next iteration at 508. Thus, the user's location can be tracked (e.g., as a path) as the user progresses through the inventory control environment.
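The following sketch ties the steps at 504, 506, and 508 together: weights start equally distributed, each round of sensor data multiplies them by a likelihood and renormalizes, and the updated weights supplant the previous ones in the next iteration. The per-frame likelihoods are hypothetical stand-ins for fused camera/RFID observations:

```python
# Minimal particle-filter-style sketch of the steps above. Likelihood
# values are hypothetical stand-ins for fused camera/RFID observations.

def initial_weights(candidates):
    # w = 1 / total number of live particles (equal initial distribution)
    return {c: 1.0 / len(candidates) for c in candidates}

def update(weights, likelihood):
    updated = {c: w * likelihood[c] for c, w in weights.items()}
    norm = sum(updated.values())
    return {c: w / norm for c, w in updated.items()}

candidates = ["person1", "person2", "person3"]
weights = initial_weights(candidates)          # 33% / 33% / 33%
history = [dict(weights)]

# Hypothetical per-frame likelihoods from the camera/RFID fusion.
frames = [
    {"person1": 0.2, "person2": 0.6, "person3": 0.2},
    {"person1": 0.1, "person2": 0.8, "person3": 0.1},
]
for likelihood in frames:
    weights = update(weights, likelihood)      # updated weights supplant old ones
    history.append(dict(weights))
    print(weights)

# Once a user is identified with high confidence, earlier (equally weighted)
# frames can be relabeled retroactively using the later posterior.
history[0] = dict(weights)
print("time zero, revised:", history[0])
```

The retroactive relabeling at the end of the sketch corresponds to the backtracking described above, where later probability values are applied to historical events.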
Another example can relate to RFID sensors. Multiple RFID sensors can be positioned in the inventory control environment.
The method can receive sensed data from multiple sensors in an inventory control environment at block 602. The multiple sensors can all be of the same sensor type or the sensors can include sensors from different sensor types. For instance, in the examples described above, sensor types include RFID sensors, NFC sensors, cameras, scales, accelerometers, and gyroscopes, among others. Receiving sensed data can also entail receiving stored data, such as previously sensed data, and/or data about the users, such as stored biometric data, shopping history, user profile and billing information, etc., and/or information about the inventory control environment, such as maps of the inventory control environment, sensor layout, inventory history, etc.
At block 604, the method can fuse the data received over time to identify items and users in the inventory control environment. Various techniques can be employed to fuse the data from the various sensors. In some cases, each type of sensor data can be weighted equally. In other cases, some sensor data can be weighted higher than other sensor data. For example, if the item is a pineapple, visual identification via camera data (e.g., images) may be highly accurate and determinative. In contrast, for a stack of similarly colored garments on a shelf, visual identification may provide low accuracy. Thus, in the former scenario involving the pineapple, camera data may be weighted higher than other types of sensor data. In contrast, in the latter scenario relating to garments, camera data may be weighted lower. The fusing can continue over a duration of time. Confidence in the identification of users and items can increase over time with repeated sensing. Further, confidence in co-location of items and users, and hence any interpreted association, can increase over time.
The method can determine locations of the items and the users in the inventory control environment from the fused data at 606. Various examples are described above.
The method can associate individual items and individual users based upon instances of co-location in the inventory control environment at 608. For instance, the locations can be overlaid to detect simultaneous co-location of individual items and individual users. The prognostic value of co-location increases as the individual user and the individual item are co-located along an extended path that culminates at an exit from the inventory control environment. In such a case, the association can be a presumption that the individual user is in possession of the individual item and intends to purchase the individual item. Thus, the individual user can be charged for the individual item when the associating continues until the individual user leaves the inventory control environment.
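As a hedged illustration of the associating step, the following sketch overlays an item path on candidate user paths and associates the item with the user whose path is most co-extensive with it; the paths and co-location radius are assumptions chosen for illustration:

```python
# Illustrative co-location scoring for the association step: the item's
# tracked path is overlaid on each user's path, and the user whose path is
# most co-extensive with the item's path (within an approximate-location
# radius) is associated with the item. Paths and radius are hypothetical.

import math

def colocation_score(item_path, user_path, radius=2.0):
    """Fraction of time steps at which user and item are within radius meters."""
    hits = sum(
        1 for (ix, iy), (ux, uy) in zip(item_path, user_path)
        if math.hypot(ix - ux, iy - uy) <= radius
    )
    return hits / len(item_path)

item = [(2, 1), (4, 3), (6, 5), (9, 8)]            # tracked item locations
users = {
    "user1": [(2, 2), (4, 4), (6, 6), (9, 9)],     # tracks the item to the exit
    "user2": [(2, 1), (1, 5), (0, 9), (0, 12)],    # near the item only at first
}
scores = {u: colocation_score(item, path) for u, path in users.items()}
print(scores)
print("associated with:", max(scores, key=scores.get))
```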
The described methods can be performed by the systems and/or elements described above and/or below, and/or by other inventory control devices and/or systems.
The order in which the methods are described is not intended to be construed as a limitation, and any number of the described acts can be combined in any order to implement the method, or an alternate method. Furthermore, the method can be implemented in any suitable hardware, software, firmware, or combination thereof, such that a device can implement the method. In one case, the method is stored on one or more computer-readable storage medium/media as a set of instructions (e.g., computer-readable instructions or computer-executable instructions) such that execution by a processor of a computing device causes the computing device to perform the method.
In either configuration 710, the device can include storage/memory 724, a processor 726, and/or a sensor fusion component 728. The sensor fusion component 728 can include a sensor fusion algorithm that can identify users and/or items by analyzing data from sensors 110. The sensor fusion component 728 can include a co-location algorithm that can identify locations over time (e.g., paths) of users and/or items by analyzing data from sensors 110. From the locations, the co-location algorithm can identify instances of co-location (e.g., same place same time) between items and users.
The sensor fusion component 728 can be configured to identify users and items and to detect when an item is moved from an inventory area. For instance, the sensor fusion component 728 can be configured to analyze data from the sensors 110 to identify items and users in the inventory control environment and to detect co-location of an individual user and an individual item at a first location in the inventory control environment at a first time and at a second location at a second time. For example, the sensor fusion component can be configured to process data from the set of ID sensors to track locations of an ID tagged inventory item from the first shared space to the second shared space. The sensor fusion component can be further configured to process images from the set of cameras to identify users in the inventory control environment. The sensor fusion component can be further configured to correlate the tracked locations of the ID tagged inventory item to simultaneous locations of an individual identified user.
In some configurations, each of devices 704 can have an instance of the sensor fusion component 728. However, the functionalities that can be performed by sensor fusion component 728 may be the same or they may be different from one another. For instance, in some cases, each device's sensor fusion component 728 can be robust and provide all of the functionality described above and below (e.g., a device-centric implementation). In other cases, some devices can employ a less robust instance of the sensor fusion component 728 that relies on some functionality to be performed remotely. For instance, device 704(2) may have more processing resources than device 704(1). In such a configuration, training data from ID sensors 112 may be sent to device 704(2). This device can use the training data to train the sensor fusion algorithm and/or the co-location algorithm. The algorithms can be communicated to device 704(1) for use by sensor fusion component 728(1). Then sensor fusion component 728(1) can operate the algorithms in real-time on data from sensors 110 to identify when an individual shopper is in possession of an individual item. Similarly, identification of users within the inventory control environment can be accomplished with data from cameras 114 through biometric analysis and/or comparison to stored data about the users. This aspect can be accomplished by sensor fusion component 728 on either or both of devices 704(1) and 704(2). Finally, correlation of individual items to identified users can be accomplished by sensor fusion component 728 on either or both devices 704.
The term “device,” “computer,” or “computing device” as used herein can mean any type of device that has some amount of processing capability and/or storage capability. Processing capability can be provided by one or more processors that can execute data in the form of computer-readable instructions to provide a functionality. Data, such as computer-readable instructions and/or user-related data, can be stored on storage, such as storage that can be internal or external to the device. The storage can include any one or more of volatile or non-volatile memory, hard drives, flash storage devices, and/or optical storage devices (e.g., CDs, DVDs etc.), remote storage (e.g., cloud-based storage), among others. As used herein, the term “computer-readable media” can include signals. In contrast, the term “computer-readable storage media” excludes signals. Computer-readable storage media includes “computer-readable storage devices.” Examples of computer-readable storage devices include volatile storage media, such as RAM, and non-volatile storage media, such as hard drives, optical discs, and flash memory, among others.
Examples of devices 704 can include traditional computing devices, such as personal computers, desktop computers, servers, notebook computers, cell phones, smart phones, personal digital assistants, pad type computers, mobile computers, appliances, smart devices, IoT devices, etc. and/or any of a myriad of ever-evolving or yet to be developed types of computing devices.
As mentioned above, configuration 710(2) can be thought of as a system on a chip (SOC) type design. In such a case, functionality provided by the device can be integrated on a single SOC or multiple coupled SOCs. One or more processors 726 can be configured to coordinate with shared resources 718, such as memory/storage 724, etc., and/or one or more dedicated resources 720, such as hardware blocks configured to perform certain specific functionality. Thus, the term “processor” as used herein can also refer to central processing units (CPUs), graphical processing units (GPUs), controllers, microcontrollers, processor cores, or other types of processing devices.
Generally, any of the functions described herein can be implemented using software, firmware, hardware (e.g., fixed-logic circuitry), or a combination of these implementations. The term “component” as used herein generally represents software, firmware, hardware, whole devices or networks, or a combination thereof. In the case of a software implementation, for instance, these may represent program code that performs specified tasks when executed on a processor (e.g., CPU or CPUs). The program code can be stored in one or more computer-readable memory devices, such as computer-readable storage media. The features and techniques of the component are platform-independent, meaning that they may be implemented on a variety of commercial computing platforms having a variety of processing configurations.
Various examples are described above. Additional examples are described below. One example includes a system comprising a set of ID sensors positioned relative to an inventory control environment, a first subset of the ID sensors sensing a first shared space in the inventory control environment and a second different subset of ID sensors sensing a second shared space in the inventory control environment and a set of cameras positioned relative to the inventory control environment, a first subset of the cameras imaging the first shared space in the inventory control environment and a second different subset of the cameras imaging the second shared space in the inventory control environment. The system also comprises a processor configured to process information from the set of ID sensors to track locations of an ID tagged inventory item from the first shared space to the second shared space, the processor further configured to process images from the set of cameras to identify users in the inventory control environment, the processor further configured to correlate the tracked locations of the ID tagged inventory item to simultaneous locations of an individual identified user.
Another example can include any of the above and/or below examples where the ID tagged inventory item comprises an RFID tagged inventory item and the ID sensors of the set of ID sensors comprise RFID antennas.
Another example can include any of the above and/or below examples where the cameras of the set of cameras comprise visible light cameras or IR cameras and/or wherein the cameras comprise 3D cameras.
Another example can include any of the above and/or below examples where the processor is configured to process the images from the set of cameras to identify the users in the inventory control environment using biometrics.
Another example can include any of the above and/or below examples where the processor is configured to process the images from the set of cameras to identify the users in the inventory control environment using facial recognition.
Another example can include any of the above and/or below examples where the processor is configured to track locations of the ID tagged inventory item from the first shared space to the second shared space using Doppler shift to determine whether the ID tagged inventory item is moving toward or away from an individual ID sensor.
Another example can include any of the above and/or below examples where individual ID sensors of the first subset of the ID sensors have sensing regions that partially overlap to define the first shared space.
Another example can include any of the above and/or below examples where the processor is configured to simultaneously process information from multiple ID sensors of the set of ID sensors to reduce an influence of physical objects in the inventory control environment blocking signals from individual ID sensors.
Another example can include any of the above and/or below examples where the physical objects include users, shopping carts, and/or shelving.
Another example can include any of the above and/or below examples where the tracked locations of the ID tagged inventory item define a path of the ID tagged inventory item in the inventory control environment and the simultaneous locations define a path of the individual identified user in the inventory control environment.
Another example can include any of the above and/or below examples where the path of the ID tagged inventory item is more co-extensive with the path of the individual identified user than with paths of other of the users in the inventory control environment.
Another example includes a system comprising multiple sensors positioned in an inventory control environment and a sensor fusion component configured to analyze data from the sensors to identify items and users in the inventory control environment and to detect co-location of an individual user and an individual item at a first location in the inventory control environment at a first time and at a second location in the inventory control environment at a second time.
Another example can include any of the above and/or below examples where the multiple sensors comprise multiple types of sensors.
Another example can include any of the above and/or below examples where the sensor fusion component is configured to fuse the data from the multiple types of sensors over time until a confidence level of the identified items exceeds a threshold.
Another example can include any of the above and/or below examples where the first location and the second location lie on a path of the individual user and a path of the individual item.
Another example includes a method comprising receiving sensed data from multiple sensors in an inventory control environment, fusing the data received over time to identify items and users in the inventory control environment, determining locations of the items and the users in the inventory control environment from the fused data, and associating individual items and individual users based upon instances of co-location in the inventory control environment.
Another example can include any of the above and/or below examples where the receiving sensed data comprises receiving sensed data from multiple different types of sensors.
Another example can include any of the above and/or below examples where the receiving sensed data further comprises receiving stored data from the inventory control environment.
Another example can include any of the above and/or below examples where the associating comprises charging the individual user (or otherwise receiving payment) for the individual item when the associating continues until the individual user leaves the inventory control environment.
Another example can include any of the above and/or below examples where the fusing continues over time until a confidence level of the identified users and items exceeds a threshold.
Although the subject matter relating to inventory control has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.