This disclosure relates to a line monitoring system and method that may be used to monitor objects in a line.
Lines may form in various places for various reasons. People may form lines, for example, at point of sale locations or other customer service locations at retail stores. People may also form lines at other establishments such as an outdoor entertainment area waiting to pay for entrance to the area or waiting for a particular attraction of the area. Other objects such as vehicles may also form lines, for example, at toll booths, gas stations, and other establishments. Waiting in line is generally considered to be undesirable, and establishments may want to manage lines, for example, to improve the customer's experience.
Obtaining information, such as the number of people or objects in line, the average wait time in a line, or the volume of people or objects moving through a line, may be useful in managing the flow of people or other objects through lines. Observation of a line is one way to ascertain the number of people or other objects in line at a given moment. One drawback of such observation is that it requires the expenditure of personnel time and resources to gather line count data. Observation of a line also may not be adequate to provide other line information such as average wait time and/or the volume of people or objects moving through a line.
Features and advantages of embodiments of the claimed subject matter will become apparent as the following Detailed Description proceeds, and upon reference to the Drawings, where like numerals depict like parts, and in which:
Although the following Detailed Description will proceed with reference being made to illustrative embodiments, many alternatives, modifications, and variations thereof will be apparent to those skilled in the art. Accordingly, it is intended that the claimed subject matter be viewed broadly.
Referring to
One embodiment of the line monitoring system 100 may include an object identifying and locating system 120 to identify and locate objects 102a-102e in the surveillance area 104 and an object analysis system 130 to analyze the behavior of the objects and determine if the objects form a line. The object identifying and locating system 120 may generate object data including, but not limited to, object identifying data (e.g., an ID number) and object locating data (e.g., coordinates). The object analysis system 130 may receive the object data and analyze the position and movement of the objects to determine if objects exhibit behavior indicating that the objects should be designated as being in a line, as will be described in greater detail below. As shown, objects 102a, 102b may be designated as in a line, while objects 102c-102e may not yet be designated as in a line.
The object analysis system 130 may also determine one or more line statistics such as a count of the objects in a line, the wait time for objects in a line, the average time to service customers (e.g., in multiple lines), and/or the volume of objects passing through a line during a given time period. The line monitoring system 100 may display the line statistics on a display 140 and may further analyze the line statistics, for example, by comparing line statistics to thresholds (e.g., line count threshold, an average wait time threshold, etc.). The line monitoring system 100 may also provide line statistics to another computer system 142 for further analysis. The line monitoring system 100 and/or the computer system 142 may also communicate with a notification device 144, such as a handheld wireless device, to provide notifications based on line statistics. If a line count exceeds a line count threshold or falls below a line count threshold, for example, a notification may be provided to indicate that another line should be started or a line should be closed. The line monitoring system 100 may also include a user input device 146 to allow a user to provide input, for example, to select a surveillance area, to select desired line statistics, to set desired notification thresholds, and to configure line behavior pattern parameters, as described below.
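By way of illustration, the threshold comparison described above may be sketched as follows. This is a non-limiting sketch; the function name, message strings, and default threshold values are illustrative assumptions, not part of the disclosure.

```python
def check_line_count(line_count, open_threshold=8, close_threshold=2):
    """Compare a line count against tunable thresholds and return a
    notification message for a notification device, or None if the
    count is within the acceptable range."""
    if line_count > open_threshold:
        return "open another line"        # count exceeds the upper threshold
    if line_count < close_threshold:
        return "consider closing a line"  # count fell below the lower threshold
    return None
```

In such a sketch, the thresholds would be among the values settable through the user input device 146.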
The line monitoring system 100 may therefore facilitate a variety of line management applications. In a retail store, for example, if an excessive number of people are in a line at a point of sale location, the line monitoring system 100 may trigger an alarm (e.g., on notification device 144) to alert appropriate store personnel of the situation regardless of their location in the store. In response, the store personnel may open additional point of sale locations to ease the congestion.
Another application may be to determine the traffic flow through a particular area to see if service providers of the retail store are relatively consistent. This could be utilized to identify the relatively slower service providers, who may then be trained in more efficient service techniques. Yet additional applications may calculate the average wait time through the whole line, the average volume of traffic through a particular area, the average volume of traffic through a particular area during a particular time period, and the average time to service an individual customer. Store personnel can utilize the results of these additional applications to improve line management and customer service.
One embodiment of the object identifying and locating system 120 may include one or more cameras 122 to capture one or more images of the surveillance area and an object extraction system 124 to extract objects from the captured images and determine object locations within the surveillance area. The camera(s) 122 may generate one or more image signals representing the captured image of the surveillance area 104. The camera(s) 122 may include cameras known to those skilled in the art such as digital still image or video cameras.
The camera(s) 122 may be situated to focus on the surveillance area 104. Although not shown in the block diagram of
As a line becomes longer, the field of view of the camera(s) 122 may be increased to expand the surveillance area 104 and to capture as many objects in the line as desired. To increase the field of view, for example, the vertical height of the camera(s) 122 may be raised above the surveillance area 104, a wider angle camera lens may be used, and/or a plurality of cameras may be used to provide adjacent views of the surveillance area 104. The use of a plurality of cameras 122 may enable each camera to be mounted lower or closer to the surveillance area 104 to facilitate tracking and differentiation of objects 102a-102e by the object extraction system 124. When a plurality of cameras are utilized, the cameras may be coordinated to track objects moving from the range of one camera to another camera using techniques known to those skilled in the art.
In one embodiment, the object extraction system 124 and the object analysis system 130 may be implemented as one or more computer programs or applications, for example, running on a computer system. The object extraction system 124 and the object analysis system 130 may be separate applications or may be components of a single integrated line monitoring application. The object extraction system 124 and the object analysis system 130 may also be applications running on separate computer systems that are coupled together, for example, by a network connection, a serial connection, or some other connection. The computer programs or applications may be stored on any variety of machine-readable media (e.g., a hard disk, a CD-ROM, a system memory, etc.) and may be executed by a processor to cause the processor to perform the functions described herein as being performed by the object extraction system 124 and the object analysis system 130. Those skilled in the art will recognize that the object extraction system 124 and the object analysis system 130 may be implemented using any combination of hardware, software, and firmware to provide such functionality.
The camera(s) 122 may be coupled to the object extraction system 124 via a path 126, for example, using a wireless connection or a wired connection to the computer system incorporating the object extraction system 124. The camera(s) 122 may provide image signals (e.g., a video feed of the surveillance area 104) to the object extraction system 124 via the path 126. The object extraction system 124 may analyze pixels in the image represented by the image signal and may group the moving pixels together to form image objects corresponding to actual objects 102a-102e in the surveillance area 104. The object extraction system 124 may further identify each object in the image of the surveillance area 104 and provide coordinates specifying the location of each object.
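One way the grouping of moving pixels into image objects might be sketched is a connected-component pass over a binary motion mask, returning a centroid coordinate for each group. The disclosure does not specify a particular algorithm; the simple flood fill below is an illustrative assumption.

```python
def extract_objects(mask):
    """Group adjacent 'moving' pixels (truthy entries in a 2-D mask) into
    image objects and return each object's centroid as (row, col)."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    objects = []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                # Flood-fill one connected group of moving pixels.
                stack, pixels = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                cy = sum(p[0] for p in pixels) / len(pixels)
                cx = sum(p[1] for p in pixels) / len(pixels)
                objects.append((cy, cx))
    return objects
```

A practical object extraction system would also apply the lighting, edge-detection, and grouping criteria discussed below before emitting objects.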
Referring to
In one embodiment where the objects being monitored are people in the surveillance area, the object extraction system 124 may be configured to identify objects that are people. To accurately identify people, the object extraction system 124 may filter out lighting, shadows, reflections, and other anomalies, which may be erroneously identified as people. The object extraction system 124 may utilize tuning parameters to increase the accuracy of object extraction, as is known to those skilled in the art. The tuning parameters may include a lighting threshold, edge detection threshold, and/or grouping criteria. The object extraction system 124 may thus provide the object analysis system 130 with correctly identified people objects to avoid false images or “phantoms” that may confuse the object analysis system 130. Although the object extraction system 124 may provide the majority of the filtering to identify people as objects, the object analysis system 130 may also provide object filtering to distinguish people from other objects, for example, based on the movement or behavior of the objects.
As shown in
The object extraction system 124 may provide persistency of objects such that objects are consistently identified as the objects move through the image 200 of the surveillance area 104. To accomplish this, the object extraction system 124 may provide an identifier (e.g., an ID number) for each object in the image 200 to associate the image object at that coordinate in the image 200 with a specific corresponding object in the surveillance area. The object extraction system 124 may maintain that identifier as the image object moves.
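The persistency of identifiers might be sketched as a nearest-neighbor association between the previous frame's known objects and the current frame's detections, with unmatched detections receiving new ID numbers. The matching rule and the `max_jump` cutoff are illustrative assumptions; the disclosure does not prescribe a tracking algorithm.

```python
import math

def associate(prev, detections, next_id, max_jump=50.0):
    """Carry object IDs forward between frames.
    prev: dict mapping id -> (x, y) from the previous frame.
    detections: list of (x, y) locations in the current frame.
    Returns (updated id -> (x, y) dict, next unused id)."""
    updated, used = {}, set()
    for (x, y) in detections:
        best, best_d = None, max_jump
        for oid, (px, py) in prev.items():
            d = math.hypot(x - px, y - py)
            if oid not in used and d < best_d:
                best, best_d = oid, d
        if best is None:        # no nearby known object: assign a new identity
            best = next_id
            next_id += 1
        used.add(best)
        updated[best] = (x, y)
    return updated, next_id
```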
As shown in
In addition to providing the object identifying data and object location data of image objects 206a-206e extracted from the surveillance area image 200, the object extraction system 124 may also provide additional parameters or object data to the object analysis system 130. Such object data may include object size, object velocity, and a timestamp for the current location of each object. Such additional parameters may be helpful in some instances, but are not necessary.
Although the exemplary embodiment uses an object extraction system 124 to obtain object identifying and location data, those skilled in the art will recognize that the object identifying and locating system 120 may also include other systems capable of generating object identifying data (e.g., an ID number) and object location data (e.g., coordinates). Examples of such systems include radio frequency identification (RFID) tracking systems and other tracking systems known to those skilled in the art.
Referring to
A number of behavior patterns indicative of objects in a line may be abstracted to various parameters and enumerated as values. The object analysis system 130 may assign default values for each line behavior pattern parameter representative of a behavior pattern. The user input device 146 may also be used by an operator of the object analysis system 130 to adjust the default values of the parameters in order to “tune” the object analysis system 130 for a variety of conditions.
Referring to
Objects generally form a line in a designated area extending from a starting point (e.g., a point of sale location). As shown in
When an object enters the reference area 400, the object may be designated as only “potentially in line” because the object may be only transitionally moving through the reference area 400. Therefore, the object analysis system 130 may designate the object 404a as “potentially in line” until the object analysis system 130 makes a determination that the object is actually in line, for example, using other parameters described below. As shown in
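The reference area test may be sketched as a containment check, with an object inside the area receiving the "potentially in line" designation. A rectangular area and the tuple layout are illustrative assumptions.

```python
def designate_on_entry(pos, reference_area):
    """Designate a newly observed object 'potentially in line' if it lies
    inside the reference area, given as (x_min, y_min, x_max, y_max);
    stillness/jitter tests would later promote it to 'in line'."""
    x, y = pos
    x0, y0, x1, y1 = reference_area
    inside = x0 <= x <= x1 and y0 <= y <= y1
    return "potentially in line" if inside else None
```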
Other parameters may define movement of an object to determine if an object designated as “potentially in line” should be designated as “in line.” Examples of such parameters include a “stillness” parameter and/or a “jitter” parameter. Objects (e.g., people) that enter a line typically stop moving for at least a short period of time. The “stillness” parameter may be defined using one or more values representing a still time period. If the object location data for the object 404a that has entered the reference area 400 indicate that the location of the object has not changed for the still time period, for example, the object analysis system 130 may designate that object as being “in line” as opposed to being “potentially in line.” The still time period may be adjustable or tunable by an operator of the object analysis system 130 to take into account different circumstances.
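The stillness test may be sketched over a timestamped location history: the object qualifies once its location has been unchanged for at least the still time period. The history representation is an illustrative assumption.

```python
def is_still(history, still_time):
    """history: list of (timestamp, (x, y)) samples, newest last.
    Returns True if the location has not changed for at least
    still_time time units."""
    if not history:
        return False
    t_last, p_last = history[-1]
    for t, p in reversed(history):
        if p != p_last:
            return False  # the object moved within the still window
        if t_last - t >= still_time:
            return True   # unchanged for the full still time period
    return False
```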
Objects in line may move around within a limited space, and thus may not be perfectly still. The “jitter” parameter may be defined using one or more values representing a limited “jitter” space in which an object may move while in line. As shown in
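The jitter test may be sketched as checking that every recent sample stays within a radius of an anchor point (for example, the object's last still position). A circular jitter space is an illustrative assumption; the parameter could equally define another bounded region.

```python
import math

def within_jitter_space(recent_positions, anchor, jitter_radius):
    """True if every recent (x, y) sample stays within jitter_radius of
    the anchor, i.e., the object is merely jittering in place rather
    than leaving the line."""
    ax, ay = anchor
    return all(math.hypot(x - ax, y - ay) <= jitter_radius
               for (x, y) in recent_positions)
```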
When no objects have yet been designated as “in line”, the reference area parameter, the stillness parameter and the jitter parameter may be used to determine when a first new object should be designated as “in line.” When at least one object is designated as being “in line,” additional objects may then be designated as being “in line” or “potentially in line.” Other parameters may define a position of an additional object relative to other objects in line to determine if the additional object should be designated as being “in line” or “potentially in line.” These parameters may include a proximity parameter, a behindness parameter, and a cut distance parameter, as described below.
In general, an additional object will join a line at the end. The proximity parameter may be defined using one or more values representing a proximity distance from the last object designated as being in line. If object location data indicates that the additional object is within the proximity distance of the last object, then the object analysis system 130 may designate the object as being “in line” or “potentially in line.” As shown in
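The proximity test reduces to a Euclidean distance comparison against the last object currently in line; a straight-line distance metric is an illustrative assumption.

```python
import math

def within_proximity(candidate, last_in_line, proximity_distance):
    """True if the candidate object is within the proximity distance of
    the last object currently designated as in line."""
    dx = candidate[0] - last_in_line[0]
    dy = candidate[1] - last_in_line[1]
    return math.hypot(dx, dy) <= proximity_distance
```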
An additional object that enters the line in front of the last object currently in line (e.g., within the proximity distance) may be doing something that causes the object to temporarily move to that position but may not actually be attempting to enter the line. The behindness parameter may be defined using one or more values representing a relative location behind the last object currently in line. If the object location data for an additional object indicates that the additional object is actually “behind” the last object currently in line, the object analysis system 130 may designate the additional object as being “in line” or “potentially in line.” As shown in
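One illustrative way to evaluate "behindness" is a projection test: the candidate is behind the last object if it lies beyond that object along the direction in which the line is growing (from the second-to-last object toward the last). This dot-product formulation is an assumption; the disclosure only requires some measure of relative location behind the last object.

```python
def is_behind(candidate, last, second_last):
    """True if the candidate lies beyond the last object, along the
    direction in which the line is growing (second_last -> last)."""
    dx, dy = last[0] - second_last[0], last[1] - second_last[1]
    vx, vy = candidate[0] - last[0], candidate[1] - last[1]
    # A positive projection onto the line's growth direction means the
    # candidate is on the far side of the last object, i.e., behind it.
    return dx * vx + dy * vy > 0
```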
An object may enter a line in front of the last object currently in line if the object attempts to “cut” into the line. The cut distance parameter may be defined using one or more values representing the distance to a line that connects the coordinates of two objects that are currently in line. If object location data indicates that an additional object has moved within the cut distance parameter, the additional object may be designated as “in line” or “potentially in line.” As shown in
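The cut distance test may be sketched as the shortest distance from the candidate's coordinates to the segment connecting two objects currently in line; the clamped-projection formula below is a standard point-to-segment distance, used here as an illustrative sketch.

```python
import math

def distance_to_line_segment(p, a, b):
    """Shortest distance from point p to the segment joining the
    coordinates of two objects a and b that are currently in line."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)  # degenerate segment
    # Clamp the projection of p onto the segment to [0, 1].
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def within_cut_distance(p, a, b, cut_distance):
    """True if the candidate has moved within the cut distance of the line."""
    return distance_to_line_segment(p, a, b) <= cut_distance
```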
Even if an additional object is near a line (e.g., within a proximity or cut distance), the additional object may not be in line, for example, if the object is merely passing by the line. Thus, the proximity parameter, the behindness parameter, and the cut distance parameter may be used to indicate that an additional object is “potentially in line,” and the stillness and/or jitter parameters discussed above may be analyzed to determine if the additional objects designated as “potentially in line” should be designated as “in line.”
Once an object has joined a line, the object may leave the line at any time. The object analysis system 130 may utilize a deviation distance parameter to determine if an object that has already been designated as “in line” should be removed from the line. The deviation distance parameter may be defined using one or more values representing the distance required for the object to move away from the line before the object is removed from the line. If the object location data indicates that the object moves a distance greater than the deviation distance from the line, the object analysis system 130 may then remove the object that was previously designated as being “in line.”
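The basic deviation test may be sketched as a distance check against the object's last still position; the special cases for the first and last objects in line, described below, would substitute the appropriate reference geometry.

```python
import math

def has_deviated(current_pos, last_still_pos, deviation_distance):
    """An object already designated 'in line' should be removed once it
    moves farther than the deviation distance from its last still
    position."""
    dx = current_pos[0] - last_still_pos[0]
    dy = current_pos[1] - last_still_pos[1]
    return math.hypot(dx, dy) > deviation_distance
```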
As shown in
For the first object 404a currently in line, the deviation distance may be defined as a distance 442 from a line 440 between the last “still” position of the first object 404a (shown in phantom) and the next object 404b in line. The last “still” position of the first object 404a may be the location when the first object last met either the stillness parameter or the jitter parameter. For example, the first object 404a (previously first in line) may have a current position that has deviated from the line 440 by at least the deviation distance 442 and thus may be designated as removed from the line.
For the last object 404f currently in line, the deviation distance may be defined as a distance 450 from the last “still” position of the last object 404f (shown in phantom). The last “still” position of the last object 404f may be the location when the object 404f last met either the stillness parameter or the jitter parameter. Similar to other parameters, the deviation parameter may be tunable by an operator of the object analysis system 130. The deviation parameter may be separately tunable for the first object currently in line, the last object currently in line, and the objects currently in line between the first and last objects.
Referring to
If there is not a new object, then the object analysis system may update 514 positions of all objects based on the received object location data. The object analysis system may then determine 516 if any object designated as “in line” is outside its deviation distance. If an object is outside the deviation distance, the object analysis system may remove 520 the object from the line.
If there is a new object, the object analysis system may determine 508 how many objects are currently in line. If no objects are currently in line, such that the new object may be the first object in line, the object analysis system handles 510 the analysis of the object data for a first new object, as will be described in greater detail below. If there is at least one object currently in line, such that the new object may be an additional object in line, the object analysis system handles 512 the analysis of the object data for an additional object, as will be described in greater detail below. When the handling of the object data analysis for the first new object or the additional object is completed, the object analysis system may update 514 positions of all objects and may determine 516 if any objects have deviated from the deviation distance.
If the additional object is determined to be either within the cut distance or within the proximity distance and behind the last object currently in line, the object analysis system may determine 708 if the additional object is still. If the additional object is determined to be still, the object analysis system may add 712 the additional object to the line. If the object is not determined to be still, the object analysis system may determine 710 if the additional object is jittering about a jitter space. If the object is jittering, the object analysis system may add 712 the additional object to the line. If the additional object does not meet any of these parameters, the additional object may not be added to the line.
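The decision sequence just described reduces to a small predicate. In the sketch below, the boolean inputs stand for the outcomes of the individual parameter tests (cut distance, proximity, behindness, stillness, jitter); this is an illustrative condensation, not the claimed method itself.

```python
def should_add_additional(in_cut, in_proximity, behind_last, still, jittering):
    """Additional-object decision flow: the object must be within the cut
    distance, or both within the proximity distance of AND behind the
    last object in line; it is then added only if it is still or merely
    jittering in place."""
    if not (in_cut or (in_proximity and behind_last)):
        return False  # position tests failed: not a candidate
    return still or jittering
```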
Various implementations of the object analysis system and method may utilize one or more of the defined line behavior pattern parameters depending on the actual implementation circumstances. Other line behavior pattern parameters may also be implemented in the object analysis system. The line behavior pattern parameters may also be analyzed in a different sequence than described herein.
The line statistics may be calculated as the object analysis system adds objects to and removes objects from the line. A line count may be determined, for example, by calculating the number of objects designated as “in line” at any time. The average wait time may be determined, for example, by calculating the average period of time that each object is designated as “in line.” The volume moving through the line may be determined, for example, by calculating the number of objects designated as “in line” during a time period. The line statistics may then be displayed and/or used to provide notifications or alarms, as described above.
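The statistics above may be sketched as bookkeeping over add and remove events; the class shape, time units (seconds), and method names are illustrative assumptions.

```python
class LineStats:
    """Track line count, average wait time, and volume from the events
    generated as objects are added to and removed from a line."""

    def __init__(self):
        self.entered = {}  # object id -> time designated 'in line'
        self.waits = []    # completed wait durations, in seconds

    def add(self, oid, t):
        self.entered[oid] = t

    def remove(self, oid, t):
        # Record how long the object was designated as 'in line'.
        self.waits.append(t - self.entered.pop(oid))

    def count(self):
        return len(self.entered)

    def average_wait(self):
        return sum(self.waits) / len(self.waits) if self.waits else 0.0

    def volume(self):
        # Objects that passed through the line in the observed period.
        return len(self.waits)
```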
Consistent with embodiments of the present invention, a line monitoring method and system may be used to monitor objects in a line. The line monitoring method may include receiving object data associated with objects in a surveillance area. The object data may include at least object identifying data and object location data. The method may also include analyzing the object data with reference to at least one line behavior pattern parameter representing at least one behavior pattern indicative of objects in line to determine if at least one of the objects should be designated as in a line in the surveillance area. The method may further include determining at least one line statistic associated with objects designated as in the line.
The line monitoring system may include an object identifying and locating system configured to identify and locate objects in a surveillance area and to generate object data comprising at least object identifying data and object location data. The line monitoring system may also include an object analysis system configured to receive the object data, to analyze the object data to determine if at least one of the objects should be designated as in a line in the surveillance area, and to determine at least one line statistic associated with the line.
The terms and expressions which have been employed herein are used as terms of description and not of limitation, and there is no intention, in the use of such terms and expressions, of excluding any equivalents of the features shown and described (or portions thereof), and it is recognized that various modifications are possible within the scope of the claims. Other modifications, variations, and alternatives are also possible. Accordingly, the claims are intended to cover all such equivalents.
This application claims the benefit of the filing date of U.S. Provisional Application Ser. No. 60/624,430, filed Nov. 2, 2004, the teachings of which are incorporated herein by reference.
Filing Document: PCT/US05/39487
Filing Date: Nov. 1, 2005
Country: WO
Kind: 00
371(c) Date: Feb. 25, 2008
Provisional Application Number: 60/624,430
Date: Nov. 2004
Country: US