The present invention relates generally to systems and methods for marketing analysis, and more particularly to systems and methods for monitoring and analyzing still images captured at a low frequency to detect and analyze shopping events.
In-store shopping event monitoring and analysis can provide direct evidence of shopper behavior at the point of product selection. Prior monitoring systems recorded video of shoppers, and used human operators to interpret the video. Such systems suffer from the drawback that they generate large amounts of data due to the high frame rate at which conventional video is captured, and require significant time spent by human operators to analyze the data. As a result, these systems have been costly to implement.
The still image shopping event analysis systems and methods provided herein may implement low frequency still image sampling and perform a computer analysis of the still images captured. According to one aspect, the analysis may include discriminating differences between frames of the still images based on changes of pixels between the frames. According to another aspect, the analysis may include detecting and/or analyzing one or more shopping events based on the discriminated differences between frames of the still images.
The systems and methods provided herein may further count and/or analyze the shopping events based on patterns of changes between frames, including for example, numbers of customers visiting and amounts of time customers spent visiting a shopping area, whether the visit was a transitory visit or involved more detailed shopping, whether a purchase occurred, and/or the identity and number of item(s) purchased. The change of the image field measured by pixel changes may also be used to measure inventory and/or determine presence or absence of specific displays or items.
The system 10 may include one or more cameras 14, such as power-over-Ethernet (POE) cameras, installed to monitor the various desired shopping areas 16 in the store 12. In this example, the monitored shopping areas 16 include various aisles of the store. However, it should be appreciated that the monitored shopping areas may include a pharmacy, fresh meat counter, service deli counter, checkout area, exits and entrances, doors, or other suitable area where shopping event analysis is desired.
In some embodiments, to be deployed in locations where laws or policies regulate the placement of cameras that capture images of customers in stores, the cameras 14 may be configured to capture and store pixel array data in a form suitable for statistical analysis, but which does not retain image characteristics that would run afoul of such regulations. It will be appreciated that the embodiments of the systems and methods described herein analyze images on a pixel-by-pixel basis to determine changes in the pixels between frames. Thus, these images may alternatively be referred to as pixel arrays, and these pixel arrays may only retain a portion of the information needed to reconstruct the original image. For example, the pixel array may contain a pixel parameter such as luminosity data for each pixel, but not other pixel information needed to reconstruct the original image.
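By way of a non-limiting illustration, the following sketch shows one way a captured RGB frame might be reduced to such a luminance-only pixel array; the numpy-based helper and the Rec. 601 luma weights are assumptions for illustration and are not part of the original disclosure.

```python
import numpy as np

def to_luma_array(rgb_frame: np.ndarray) -> np.ndarray:
    """Reduce an H x W x 3 RGB frame to an H x W luminance-only pixel array.

    Only the luma parameter is retained per pixel; the chrominance needed to
    reconstruct the original image is discarded.
    """
    weights = np.array([0.299, 0.587, 0.114])  # assumed Rec. 601 luma weights
    return (rgb_frame.astype(np.float64) @ weights).astype(np.uint8)
```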
In some configurations, cameras may be positioned to monitor only a particular section of an aisle or other location to adjust the scope of still image analysis. For example, cameras may be positioned to capture the shelves on each side of an aisle in the store. The cameras 14 may be configured to snap still images, while maintained in a fixed position, at a frequency lower than conventional video frame rates (approximately 30 frames per second), such as once per second, once per minute, once per hour, once per day, or less frequently.
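A minimal sketch of such a fixed-position, low-frequency capture schedule is shown below; `capture_frame` and `store_pixel_array` are hypothetical placeholders for the camera driver and the staging layer, and the one-image-per-minute interval is only an example rate.

```python
import time
from datetime import datetime, timezone

CAPTURE_INTERVAL_S = 60  # example: one still image per minute

def run_capture_loop(capture_frame, store_pixel_array):
    """Capture still images at a low, fixed frequency and stage them.

    `capture_frame` and `store_pixel_array` are hypothetical callables
    supplied by the camera driver and the staging ("stomp box") layer.
    """
    while True:
        frame = capture_frame()                 # grab one still image
        timestamp = datetime.now(timezone.utc)  # tag for later analysis
        store_pixel_array(frame, timestamp)     # stage locally before upload
        time.sleep(CAPTURE_INTERVAL_S)          # wait for the next sample
```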
The image data of the still images captured may be transferred for temporary storage to a computing device 18 located, for example, in or near the store 12, which may be referred to as “staging” the image data in a “stomp box”. After staging, or as an alternative to staging, the image data may be transmitted to an analysis computer 20, in some examples off site and centralized in a data center, through, for example, a wireless virtual private network (VPN) via the Internet. Once the image data is collected, analysis computer 20 may be configured to perform computerized analysis to discriminate shopping event information from the collected image data.
Still image data may be collected and tagged in a database, for example with the date and the shopping area monitored, to assist analysis. Shopping event analysis will be discussed in further detail below with reference to
It should be appreciated that the system 10 may include one camera in a single location or may include a plurality of cameras at numerous locations within a store. Further, a plurality of cameras may be dispersed between different store locations and may send image data to one or more data centers for analysis and storage.
The camera 14 may be positioned at various viewing angles for monitoring the desired shopping areas 16. For example, the camera 14 may be positioned to have a line of sight that is parallel to an elongate axis of the aisle, as depicted in
Various camera operating parameters may be adjusted, for data transfer, storage, processing and other purposes. For example, by reducing the sampling rate of the still images, storage capacity requirements may be reduced and/or storage resources may be made available for other uses, which, in turn, may reduce overall system costs and space requirements. As another example, by reducing image quality (i.e., pixel density), processing complexity may be reduced, which may lead to a reduction in cost.
In an exemplary embodiment of the still image monitoring and analysis system 10, image analysis may be performed by the analysis computer 20 on an aggregate collection of still images to discriminate differences between consecutive images based on changes in pixels in order to identify shopping events, as described in detail hereinafter. By comparing each still image on a pixel basis to the subsequent image, the analysis computer 20 is configured to determine whether the images are identical or whether there are differences between them, based on whether each pixel or group of pixels in one image differs from the corresponding pixel or group of pixels in the subsequent image.
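A minimal sketch of this consecutive-frame, pixel-basis comparison is given below, assuming luminance-only numpy arrays and an illustrative per-pixel noise tolerance that is not specified in the original description.

```python
import numpy as np

def changed_pixel_fraction(prev: np.ndarray, curr: np.ndarray,
                           per_pixel_threshold: int = 10) -> float:
    """Fraction of pixels whose value changed between two consecutive frames.

    `per_pixel_threshold` is an assumed tolerance that absorbs sensor noise.
    """
    diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
    return float(np.mean(diff > per_pixel_threshold))

def frames_identical(prev: np.ndarray, curr: np.ndarray,
                     per_pixel_threshold: int = 10) -> bool:
    """True when no pixel differs from the previous frame by more than the tolerance."""
    return changed_pixel_fraction(prev, curr, per_pixel_threshold) == 0.0
```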
The analysis computer 20 may be configured to categorize the still images into different categories based on the detected changes in pixels between frames. For example, still images may be categorized, in order, into a series of groups of identical images, referred to as reference images, broken up by other groups of images that differ from the reference images by variable amounts. These comparisons may be performed on a pixel basis, by comparing corresponding pixels or groups of pixels in each image to the reference image to detect changes in an image parameter, such as intensity, color, hue, saturation, value, luminance, and/or chrominance.
Furthermore, each still image may be broken down into a pixel matrix with different regions which may be used to identify particular events, as shown in
A threshold may be set to determine whether there is pixel change between frames, and the threshold may differ based on the region of the image. Thus, for example, a smaller pixel change threshold may be applied to regions that are further from the camera and in which persons and products appear smaller. Once the still images are classified into the two categories, constant-events (during which changes in pixels between frames remain below the threshold) and visiting-events (during which changes in pixels between frames exceed the threshold), various analyses may be performed by the analysis computer 20 to extract shopping event information. For example, by identifying the number of visiting-events that occur in any particular time period, an estimated minimum number of visitors in the field of the camera's view may be established. Further, this value can be used to determine other shopping event parameters, including the visit percentage or the number of total shoppers that visit the monitored shopping area.
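The sketch below illustrates, under stated assumptions, how region-specific thresholds and the constant-event / visiting-event split might be implemented; the region layout, the threshold values, and the helper names are illustrative choices rather than features of the original disclosure.

```python
import numpy as np

# Hypothetical regions of the pixel matrix, each with its own changed-pixel
# fraction threshold; the far region gets a smaller threshold because persons
# and products appear smaller there.
REGIONS = {
    "near": (slice(240, 480), slice(0, 640), 0.02),
    "far":  (slice(0, 240),   slice(0, 640), 0.005),
}

def is_visiting_frame(prev, curr, per_pixel_threshold=10):
    """A frame belongs to a visiting-event if any region's changed-pixel
    fraction exceeds that region's threshold; otherwise it is constant."""
    diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16)) > per_pixel_threshold
    return any(diff[rows, cols].mean() > region_threshold
               for rows, cols, region_threshold in REGIONS.values())

def count_visiting_events(frames):
    """Count runs of consecutive visiting frames in a time-ordered list of
    pixel arrays; each run is one visiting-event, giving an estimated minimum
    number of visits in the monitoring period."""
    events, in_event = 0, False
    for prev, curr in zip(frames, frames[1:]):
        visiting = is_visiting_frame(prev, curr)
        if visiting and not in_event:
            events += 1
        in_event = visiting
    return events
```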
Furthermore, the duration of a shopping event, such as a constant-event or a visiting-event, may be determined by the analysis computer 20. The duration may be used to further classify the shopping event. For example, the total number of seconds in each of the visiting-events, namely, the number of seconds from the first still image to the last still image, obtained for example either by count or by difference in time stamps, may be used to further classify the visiting-events. Based on duration, the visiting-events may be categorized into transitory-events and stopping-to-shop-events. The duration of transitory-events is in general shorter, since during transitory-events customers are merely passing by and do not spend a significant amount of time in the monitored shopping area. The duration of stopping-to-shop-events is in general longer than that of transitory-events, since one or more customers may be engaged in more detailed shopping and spend more time in the monitored shopping area. A duration threshold may be set to differentiate transitory-events from stopping-to-shop-events.
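A sketch of the duration-based split into transitory-events and stopping-to-shop-events is shown below; the ten-second threshold is an assumed value, not one taken from the original description.

```python
from datetime import datetime

# Assumed cut-off: visits shorter than this are treated as merely passing by.
TRANSITORY_THRESHOLD_S = 10.0

def classify_visiting_event(first_timestamp: datetime,
                            last_timestamp: datetime) -> str:
    """Classify a visiting-event by its duration, computed from the time
    stamps of its first and last still images."""
    duration_s = (last_timestamp - first_timestamp).total_seconds()
    if duration_s < TRANSITORY_THRESHOLD_S:
        return "transitory-event"        # shopper merely passing through
    return "stopping-to-shop-event"      # shopper lingered to shop
```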
Furthermore, various other time measures, such as dwell time, which is an amount of time a shopper spent in a shopping area, and buy time, which is an amount of time a shopper spent examining a product before making a purchasing decision, may also be determined. In addition, these and other measures may be analyzed by time of day, day of week, etc., as desired. It should also be appreciated that a duration of each type of shopping event may be cataloged to help future analysis of the shopping events. The analysis computer 20 may be configured to perform various analyses, such as statistical analysis, of the captured still images. For example, the analysis computer 20 may be configured to identify and count the number of stopping-to-shop-events. The number and/or percentage of customers who stopped to shop in the monitored shopping areas may be determined based on the number of stopping-to-shop-events occurring in a particular time period. For example, a minimum number and/or percentage of customers who stopped to shop may be estimated, since each stopping-to-shop-event involves at least one customer who stopped to shop.
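One way the counts and time-of-day breakdown described above might be computed is sketched below, assuming each shopping event has already been reduced to a (kind, start, end) tuple; this event representation is an assumption made for illustration.

```python
from collections import Counter

def stop_to_shop_stats(events):
    """Summarize stopping-to-shop-events from (kind, start, end) tuples,
    where `kind` is 'transitory-event' or 'stopping-to-shop-event' and
    start/end are datetimes.

    Returns a per-hour-of-day count of stopping-to-shop-events and the share
    of all visiting-events that involved stopping to shop.
    """
    per_hour = Counter()
    stops = total = 0
    for kind, start, end in events:
        total += 1
        if kind == "stopping-to-shop-event":
            stops += 1
            per_hour[start.hour] += 1   # break the count down by time of day
    stop_share = stops / total if total else 0.0
    return per_hour, stop_share
```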
The analysis computer 20 may be configured to estimate the number of customers visiting during a particular visiting-event based on a share and/or number of pixels of the still images that have changed from a constant-event, such as a preceding constant-event. Alternatively, the number of visiting customers may be estimated based on pixel changes throughout the visiting-event, i.e., based on pixel changes between frames of the still images of the visiting-event. The visiting-event to shopping-event conversion (S/N) rate may be calculated to determine the percentage of visitors that stop to shop.
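An illustrative way to turn the changed-pixel share into a rough visitor estimate and a visit-to-shop conversion rate is sketched below; the per-shopper pixel share is an assumed calibration constant that would be determined empirically for a given camera position.

```python
def estimate_visitors(changed_share: float,
                      share_per_shopper: float = 0.08) -> int:
    """Rough count of shoppers in a visiting-event from the share of pixels
    changed relative to the preceding constant-event.

    `share_per_shopper` is an assumed, empirically calibrated constant and is
    not part of the original disclosure.
    """
    return max(1, round(changed_share / share_per_shopper))

def conversion_rate(num_stopping_to_shop: int, num_visiting: int) -> float:
    """Visiting-event to shopping-event conversion (S/N) rate."""
    return num_stopping_to_shop / num_visiting if num_visiting else 0.0
```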
Now turning to the constant-events, since visiting-events and constant-events are interspersed throughout the entire monitoring period, the total duration of the constant-events will equal the duration of the monitoring period minus the duration of the visiting-events.
By discriminating between the various constant-events, it may be identified when there is a change in pixels from one constant-event to another, such as to the next constant-event. This change may further be classified as, or attributed to, a purchasing-event, during which one or more customers made a purchase of an item located in the monitored shopping area. In other words, when monitoring a particular area, such as a shelf on an aisle, an identified pixel change (i.e., a purchasing-event) may represent a change in the appearance of the shelf, which may be interpreted as a purchase.
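A sketch of flagging a purchasing-event by comparing the reference images of two successive constant-events over a shelf region is given below; the region coordinates and the change threshold are assumptions chosen for illustration.

```python
import numpy as np

# Assumed shelf region within the pixel matrix and an assumed change threshold.
SHELF_REGION = (slice(100, 300), slice(200, 500))
PURCHASE_THRESHOLD = 0.01  # fraction of shelf pixels that must change

def is_purchasing_event(prev_constant_ref: np.ndarray,
                        next_constant_ref: np.ndarray,
                        per_pixel_threshold: int = 10) -> bool:
    """Flag a purchasing-event when the shelf's appearance differs between
    the reference images of one constant-event and the next."""
    rows, cols = SHELF_REGION
    diff = np.abs(next_constant_ref[rows, cols].astype(np.int16)
                  - prev_constant_ref[rows, cols].astype(np.int16))
    return float(np.mean(diff > per_pixel_threshold)) > PURCHASE_THRESHOLD
```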
Referring to
The analysis computer 20 may further be configured to estimate how many units of a particular display item have been removed, and thus purchased, from the number or share of pixels attributed to the particular display item that have been altered. Further, from the location of those pixels that are changed, the purchased product may be identified. Still further, the above purchase-related analysis may be used to identify product stock-outs (i.e., an event in which a product becomes out of stock on a shelf) and/or pending product stock-outs (i.e., an event in which the number of products on a shelf drops below a threshold number, such as 1) of the particular display item, and/or to provide an alert, such as a timely alert, regarding the product stock-outs and/or pending product stock-outs of the particular display item to, for example, store management, and/or to instruct store employees to engage in automated restocking of the particular items.
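The stock-out logic might be sketched as follows, assuming a calibrated pixel share per unit of the display item; `send_alert` is a hypothetical placeholder for whatever notification channel (e.g., to store management) a deployment would use.

```python
def send_alert(kind: str) -> None:
    """Hypothetical placeholder: e.g., notify store management or a restocking system."""
    print(f"ALERT: {kind}")

def check_stock(changed_share_of_item_region: float,
                units_on_shelf_at_start: int,
                share_per_unit: float,
                pending_threshold: int = 1) -> int:
    """Estimate units removed (and thus purchased) from the share of the
    item's pixels that changed, then flag stock-outs.

    `share_per_unit` is an assumed calibration constant relating one removed
    unit to the share of the item's pixels that change.
    """
    units_removed = round(changed_share_of_item_region / share_per_unit)
    remaining = max(0, units_on_shelf_at_start - units_removed)
    if remaining == 0:
        send_alert("product stock-out")            # item is out of stock
    elif remaining <= pending_threshold:
        send_alert("pending product stock-out")    # item is about to run out
    return remaining
```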
Analysis computer 20 may determine the entrance direction of a visiting customer during a visiting-event based on characteristics of a line graph of the visiting-event, as will be illustrated in detail with reference to
Further, the robustness of the analysis can be verified for quality control purposes, and the understanding of the relation of array statistics to shopper behavior can be further enhanced, through the visual support of the still images, in other words, by actually viewing the still images. For example, in some cases, a visiting-event is too short to represent a transit of a shopper through the camera's field of view. In one particular example, a person may wave their hand in front of the camera and trigger a short visiting-event. For such events that may not provide a clear indication of shopper behavior, the actual still image may be assessed to see what really happened in order to properly classify the event and refine the above pixel-change-based shopping event analysis. In yet another example, visual inspection of the actual still images may help to verify the accuracy of estimating the number of customers during a visiting-event based on the number, share, or percentage of pixel changes from a constant-event.
In some embodiments, the still image view function may be automated, so that, for example, single images or groups of images may be pulled to be viewed automatically. As a particular example, one image from the center of each purchasing-event may be pulled to identify, by sight, the demographics of the purchaser. The tagging information discussed earlier may help to facilitate or enable single images and/or groups of images of a shopping event to be viewed to assist in the analysis of the shopping event. It should be appreciated that such a process may also be performed manually by an image technician.
In some embodiments, the still image monitoring and analysis system may include an image viewing tool that allows rapid scanning or detailed perusal of images selected by any suitable criteria the viewer may choose. The viewing tool may include a video play function for any selected event that, for example, shows a selection of images of the desired event consecutively at one second per frame.
Referring now to
The first event indicates one shopper passing through the aisle, for a total of about 19 seconds, entering from the front of the aisle and exiting from the back. The front-to-back entry direction of the customer is apparent from the shape of the event, more specifically, a gradual increase in the number of pixels changed as the shopper enters the monitored shopping area and walks towards the camera, and then a precipitous drop as the customer passes the camera and exits the image field. The event identification is supported by the still images identified specifically in row C of
The second event, which lasts for about 41 seconds, begins with a single shopper, who also enters from the front of the aisle, shown in row B of
It should be appreciated that determinations may be made with a reasonable degree of reliability from the characteristics of line graphs of various shopping events, which may be cataloged by comparing different line graph curve characteristics to visual inspection. It should also be appreciated that having less than a full aisle in the view of the camera (for example, an eight foot section of the aisle) may increase the reliability of the statistics.
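As a sketch of how the line graph characteristics described above might be used programmatically, the heuristic below infers entrance direction from the per-frame changed-pixel fractions of a visiting-event: a gradual rise with a late peak followed by a sharp drop is read as front-to-back travel, and the mirrored shape as back-to-front. The peak-position cut-offs are assumptions chosen for illustration.

```python
def entrance_direction(changed_fractions):
    """Infer entrance direction from the time-ordered changed-pixel fractions
    of a visiting-event.

    A slow build-up peaking late in the event (shopper walking toward the
    camera and then abruptly leaving the field of view) is read as
    front-to-back; the mirrored shape is read as back-to-front.
    """
    if not changed_fractions:
        return "unknown"
    peak_index = changed_fractions.index(max(changed_fractions))
    peak_position = peak_index / max(1, len(changed_fractions) - 1)
    if peak_position > 0.6:
        return "front-to-back"
    if peak_position < 0.4:
        return "back-to-front"
    return "indeterminate"
```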
It should also be appreciated that a catalog of reference shopping events may be identified, for example through visual inspection by an image technician of samples of reference still images during known shopping events. The technician may identify the type of line graph curve characteristics and/or other type of statistical data extrapolated from the reference still images, and may also identify corresponding reference pixel characteristics that are indicative of those events. This catalog may be used to facilitate future computer analysis of the captured still images. For example, these reference pixel characteristics identified through empirical data may serve as a sample data set against which other pixel data is compared, in order to statistically identify shopping events in the other pixel data, even where the other pixel data is not viewed by an image technician. This may save time and costs over prior methods that relied on human inspection of images.
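The catalog-matching idea might be reduced to a simple nearest-reference comparison over a few summary statistics of an event, as sketched below; the catalog entries, the feature set, and the distance measure are all assumptions made for illustration.

```python
import math

# Hypothetical catalog built from technician-verified reference events, each
# summarized as (duration in seconds, peak changed fraction, mean changed
# fraction). In practice the features would be normalized before comparison.
REFERENCE_CATALOG = {
    "transitory-event":       (6.0,  0.05, 0.03),
    "stopping-to-shop-event": (45.0, 0.12, 0.08),
    "purchasing-event":       (60.0, 0.15, 0.09),
}

def classify_against_catalog(duration_s, peak_fraction, mean_fraction):
    """Label a new event with the reference event whose summary statistics
    are closest under a Euclidean distance over the assumed feature set."""
    features = (duration_s, peak_fraction, mean_fraction)

    def distance(reference):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(features, reference)))

    return min(REFERENCE_CATALOG, key=lambda label: distance(REFERENCE_CATALOG[label]))
```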
It should be appreciated that the still image monitoring system and method do not require recognition of image features within a still image, but instead are based on recognizing the magnitude and timing of changes between still images from changes in pixels. However, image recognition may be used to further refine identification and categorization of shopping events and shopper behavior.
It should also be noted that a wide variety of information can be obtained in this manner, without requiring actual image recognition, which reduces the amount of data flow and the cost associated with analyzing large amounts of data, and allows shopping behavior analysis to be carried out in a simple and cost-effective manner.
Further, such a system may be modular, in other words, the system may be adaptable so that any number of cameras may be installed in a particular location and any number of locations may be connected on a network. It should be appreciated that the method and system may be applied to various other applications in order to categorize specific events and behavior.
As described above, performing the computer analysis of the still images may further include categorizing still images into different categories based on changes in pixels between frames. These different categories may include constant-events during which no customer is visiting the monitored shopping area when changes in pixels between frames are below a threshold, and visiting-events during which one or more customers are visiting the monitored shopping area when changes in pixels between frames are above a threshold.
Further, performing the computer analysis of the still images may further include estimating a number and/or percentage of customers who visited the monitored shopping area based on the number of visiting-events identified in a particular time period.
In addition, performing the computer analysis may further include determining an entrance direction of a visiting customer during a visiting-event based on characteristics of a line graph of the visiting-event, wherein the line graph of the visiting-event plots differences between frames.
Performing the computer analysis of the still images may further include categorizing the visiting-events, based on their duration, into transitory-events and stopping-to-shop-events, wherein during each of the transitory-events, one or more customers are merely passing the monitored shopping areas, and during each of the stopping-to-shop-events, one or more customers visiting the monitored shopping areas also stopped to shop.
Performing the computer analysis of the still images may further include estimating a number and/or a percentage of customers who stopped to shop in the monitored shopping areas based on number of stopping-to-shop-events occurring in a particular time period.
Performing the computer analysis of the still images may further include estimating a number of visiting customers during a visiting-event based on a share and/or number of pixels of the still images that have changed from the still images of a constant-event, and/or the pixel changes throughout the visiting-event.
Performing the computer analysis of the still images may further include interpreting changes in pixels attributed to a change in appearance of a product region of the monitored shopping area between two constant-events as an occurrence of a purchasing-event, during which a purchase of an item located in the product region has been made by a visiting customer.
Performing the computer analysis of the still images may further include identifying an item and/or product being purchased based on the positions of the pixels that changed.
Performing the computer analysis of the still images may further include calculating total number of the items being purchased based on the number of purchasing-events during which customers have purchased the item and/or product.
Performing the computer analysis may further include identifying product stock-outs and/or pending product stock-outs of the item, and/or providing alerts regarding product stock-outs and/or pending product stock-outs of the item, based on the calculated total number of the items being purchased.
Performing the computer analysis of the still images may further include identifying a number of purchasing-events that occurred in a particular time period, and estimating a number and/or percentage of customers visiting the monitored shopping area who also made a purchase of an item located in the monitored shopping area based on the number of purchasing-events.
It will be appreciated that the above described systems and methods may be used to statistically analyze pixel data from monitored shopping areas to identify shopping events occurring over a period of time, without requiring that video of the monitored shopping area be viewed by a human technician, and also without requiring processor-intensive and expensive image processing techniques to be employed to recognize patterns in video of the monitored shopping area.
It should be understood that the embodiments herein are illustrative and not restrictive, since the scope of the invention is defined by the appended claims rather than by the description preceding them, and all changes that fall within metes and bounds of the claims, or equivalence of such metes and bounds thereof are therefore intended to be embraced by the claims.
This application claims priority to U.S. Provisional Patent Application No. 60/889,519, filed on Feb. 12, 2007, which is hereby incorporated herein by reference.