The present teachings relate to the field of vehicle parking and more particularly to methods and structures for determining parking spot availability.
Detecting available on-street parking spaces has been an important problem for parking management companies, city planners, and for others concerned with vehicle densities and parking availability. One method for determining parking availability is through the use of “puck-style” magnetic ground sensors that output a binary signal when detecting a vehicle in a parking stall. While this method can provide accurate parking information for street settings with demarcated parking stalls, it is difficult to estimate available parking space when the street has no stall demarcation (multi-space parking). In addition, this method offers limited functionality beyond single parking spot counts, is prone to damage by parking vehicles and regular traffic maintenance equipment, and incurs traffic disruption upon installation, repair, and replacement.
Video-based solutions have also been proposed to determine the availability of parking spaces by detecting parking vehicles and then estimating available parking space. Typically, each camera in a network of multiple surveillance cameras can cover four to five parking spots when the camera is deployed at an opposite side of a street relative to the parking spots. These systems can provide accurate parking space estimation for both single space and multi-space parking.
While these systems can provide accurate parking information for each parking spot, there are significant equipment, installation, and maintenance costs associated with these solutions.
The following presents a simplified summary in order to provide a basic understanding of some aspects of one or more embodiments of the present teachings. This summary is not an extensive overview, nor is it intended to identify key or critical elements of the present teachings nor to delineate the scope of the disclosure. Rather, its primary purpose is merely to present one or more concepts in simplified form as a prelude to the detailed description presented later.
In an embodiment of the present teachings, a method for estimating parking occupancy within an area of interest can include measuring initialization data for a beginning of a measurement cycle (time t=0), where the initialization data comprises a count of vehicles parked within the area of interest, and a count of vehicles transiting through the area of interest. The method can further include beginning the measurement cycle at time t=0, detecting entry of one or more vehicles into the area of interest using a first image capture device subsequent to t=0, detecting exit of one or more vehicles out of the area of interest using a second image capture device different from the first image capture device subsequent to t=0, and estimating the parking occupancy for the area of interest using the initialization data, the number of vehicles entering the area of interest, and the number of vehicles exiting the area of interest.
In another embodiment of the present teachings, a system can include an interface to a first image capture device configured to detect entry of one or more vehicles into an area of interest over a time interval subsequent to an initial time t=0, an interface to a second image capture device configured to detect exit of one or more vehicles out of the area of interest over the interval, and a processor, communicating with the interface to the first image capture device and the interface to the second image capture device. The processor can be configured to receive initialization data including a count of vehicles located within the area of interest at the initial time t=0 and a count of vehicles transiting through the area of interest at the initial time t=0, and estimate a parking occupancy for the area of interest using the initialization data, the number of vehicles entering the area of interest over the interval, and the number of vehicles exiting the area of interest over the interval.
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the present teachings and together with the description, serve to explain the principles of the disclosure. In the figures:
It should be noted that some details of the FIGS. have been simplified and are drawn to facilitate understanding of the present teachings rather than to maintain strict structural accuracy, detail, and scale.
Reference will now be made in detail to exemplary embodiments of the present teachings, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts.
While puck-style parking detectors and video techniques have a high accuracy in determining parking density and availability, these techniques can require the use of expensive hardware. In many applications, for example dynamic pricing where the cost of parking changes with the overall percentage of occupied spaces, an exact location of each parking spot is not required and a rough estimate of the number of available parking spots may be sufficient.
An embodiment of the present teachings may require fewer cameras and less computational power for available parking space estimation than other parking sensor methods. An embodiment may provide information regarding a percentage of available parking within a given area, such as a street block, that is more accurate than, for example, simply subtracting vehicle exits from vehicle entrances at a given instant in time.
Embodiments of the present teachings can include methods and structures for estimating parking occupancy (i.e., parking density). In an embodiment, the occupancy can include a measurement or estimation of a number of parked cars. For example, the number of parked cars can be compared to the number of known parking spaces to determine an occupancy percentage or a percentage of available parking spaces within an area of interest. Some of the present embodiments may have a reduced accuracy compared to some other systems, but an acceptable estimation accuracy may be delivered at a lower cost. The present embodiments can include systems and methods for estimating available parking space on a street block using two or more image capture devices such as high speed still cameras and video cameras. While the description of the present teachings discussed below references video cameras for simplicity, it will be understood that other image capture devices or other vehicle counting devices may be used. A system in accordance with the present teachings can further include one or more of the following: a networked camera system that has at least one camera at each street intersection, including any entrance or exit to the street block such as that from a parking garage, parking lot, alleyway, or driveway; an initial estimate or method for obtaining a count of the number of vehicles parked and a count of the number of vehicles transiting through the area of interest at the beginning of an estimation period (measurement cycle); a vehicle detection method for each camera that will detect vehicles entering and exiting the area of interest; a method to count or generate an identifier for the detected vehicles; and a method to estimate the difference between the number of vehicles that are in transit on the street and the number parking on the street, where the parking/transit estimate is performed using the initial estimates and counts or identifiers of the detected vehicles.
In an embodiment for estimating parking occupancy in the street segment 15 between blocks 12D and 12E, an initial estimate is obtained of the number of parked vehicles, as well as the number of transiting vehicles (vehicles driving within the block but not parking) on the street segment 15 at the beginning of a measurement cycle. These two data points can be measured or estimated using any number of techniques, such as direct observation by a human operator or using statistical data collected prior to beginning the measurement cycle. This data (i.e., initialization data) is entered into the processor 20 to initialize the network, for example using a hand-held input device by the human operator or by direct entry into the processor 20. This initialization data, as well as other data described below, can be used by the central data processing unit 20 to compute an estimated parking occupancy within the street segment 15 of street 14A using one or more algorithms.
After the initialization, the video cameras monitor and count vehicles entering and exiting the street segment 15 from both ends of the street segment 15. Any method can be used to detect vehicles entering/exiting the street, for example using vehicle detection from still images or video image data from video cameras 16.
Detecting vehicles from still images uses features commonly associated with images of vehicles such as common texture, structure, color, etc. Detection of objects using such data from still images is discussed, for example, in S. Agarwal, A. Awan, and D. Roth, “Learning to detect objects in images via a sparse, part-based representation,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 26, no. 11, pp. 1475-1490, 2004, and in Luo-Wei Tsai, Jun-Wei Hsieh, and Kao-Chin Fan, “Vehicle Detection Using Normalized Color and Edge Map”, IEEE Trans. on Image Processing, vol. 16, issue. 3, March 2007, pp. 850-864, the disclosures of which are incorporated herein by reference in their entirety.
Detecting vehicles from video data makes use of information available in video. In addition to spatial information, these methods typically exploit temporal differences/similarities between the frames for vehicle detection. Detection of objects using such data from video is discussed, for example, in U.S. patent application Ser. No. 13/441,269 filed Apr. 6, 2012; Ser. No. 13/461,266 filed May 1, 2012 to Bulan et al.; and Ser. No. 13/461,221 filed May 1, 2012 to Bulan et al., each of which is commonly assigned with the current application and incorporated herein by reference in their entirety. Other detection techniques are discussed in A. Makarov, J. Vesin, and M. Kunt, “Intrusion Detection Using Extraction of Moving Edges,” 12th IAPR Int. Conf. on Pattern Recognition, Vol. 1, IAPR, pages 804-807, IEEE Press, 1994, and N. Paragios and G. Tziritas, “Detection and Location of Moving Objects Using Deterministic Relaxation Algorithms,” ICPR, No. 13, pp. 201-286, Vienna, Austria, August 1996, the disclosures of which are incorporated herein by reference in their entirety.
In an embodiment, vehicles can be detected by constructing a background estimate and subtracting the constructed background from a given frame. The method first initializes the background as the first frame and gradually updates the background as new frames become available. The background is updated with a new frame as:
Bt+1=p*Ft+1+(1−p)*Bt
where Bt is the background at time t, Ft+1 is the frame at time t+1, p is an image updating factor with 0≤p≤1, and * is the operator for element-wise multiplication. If p is set to 1 for all pixels, then the estimated background at any given time is equal to the current frame. Conversely, if p is set to 0, the background remains the same as the background at time t. In other words, p controls the updating rate of the background.
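By way of a non-limiting illustration, the element-wise update can be sketched in Python/NumPy as follows; the function name and the use of NumPy arrays are assumptions made only for purposes of illustration:

    import numpy as np

    def update_background(background, frame, p):
        # Bt+1 = p*Ft+1 + (1 - p)*Bt, computed element-wise.
        # background, frame: float arrays of the same shape (e.g., H x W)
        # p: a scalar in [0, 1] or an array of per-pixel learning factors
        return p * frame + (1.0 - p) * background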
The type of application determines the specific choices for different values of p. In particular, the proposed selection method for the learning factor p is well suited for detecting vehicles. The following criteria can be defined to set the learning parameter for each pixel at each time instant:
The learning parameter p for a pixel indexed or addressed by (i,j) can be set to 0 if foreground is detected at that pixel location. Under these conditions, background pixels are not updated for these locations. For all other pixels, the learning parameter is set to ps, which is a small positive number to update the background gradually.
Once the background is estimated at a given time, the vehicles that are entering/exiting the street can be detected by subtracting the current frame from the estimated background and applying thresholding and morphological operations on the difference image.
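A minimal sketch of such a detection step is given below, assuming the OpenCV library (version 4.x) and grayscale frames; the threshold, kernel size, minimum blob area, and learning rate ps are illustrative values only and are not specified by the present teachings:

    import cv2
    import numpy as np

    PS = 0.05          # assumed small positive learning rate ps
    DIFF_THRESH = 30   # assumed grayscale difference threshold
    MIN_AREA = 500     # assumed minimum blob area in pixels

    def detect_and_update(frame_gray, background):
        # Subtract the current frame from the estimated background
        diff = cv2.absdiff(frame_gray, background.astype(np.uint8))
        _, fg_mask = cv2.threshold(diff, DIFF_THRESH, 255, cv2.THRESH_BINARY)

        # Morphological operations remove noise and fill holes in the blobs
        kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (5, 5))
        fg_mask = cv2.morphologyEx(fg_mask, cv2.MORPH_OPEN, kernel)
        fg_mask = cv2.morphologyEx(fg_mask, cv2.MORPH_CLOSE, kernel)

        # Per-pixel learning factor: p = 0 where foreground was detected,
        # p = ps elsewhere, so the background only adapts in static regions
        p = np.where(fg_mask > 0, 0.0, PS)
        background = p * frame_gray + (1.0 - p) * background

        # Candidate vehicle blobs are the remaining connected components
        contours, _ = cv2.findContours(fg_mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        blobs = [cv2.boundingRect(c) for c in contours
                 if cv2.contourArea(c) > MIN_AREA]
        return blobs, background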
Optionally, when a vehicle is detected using the background subtraction algorithm detailed above, the detected vehicle (i.e., “blob”) can be verified through a validation module. The validation module can extract image features, for example a histogram of gradients (HOG), scale invariant feature transform (SIFT), edges, aspect ratio, etc., and verify the detected blob using the extracted features and a classifier, for example a linear/nonlinear support vector machine (SVM), nearest neighbor classifier, etc., which is trained offline. This ensures that the detected object is a vehicle and not another object, for example a moving object such as a pedestrian, bicyclist, etc.
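One possible sketch of such a validation module, assuming HOG features and an SVM trained offline with OpenCV, is given below; the window size, class label convention, and the existence of a previously trained cv2.ml.SVM model are assumptions for illustration only:

    import cv2
    import numpy as np

    # HOG descriptor: 64x64 window, 16x16 blocks, 8x8 stride and cells, 9 bins
    hog = cv2.HOGDescriptor((64, 64), (16, 16), (8, 8), (8, 8), 9)

    def validate_blob(patch_gray, svm):
        # patch_gray: grayscale crop of the detected blob
        # svm: a cv2.ml.SVM trained offline on vehicle/non-vehicle patches
        patch = cv2.resize(patch_gray, (64, 64))
        features = hog.compute(patch).reshape(1, -1).astype(np.float32)
        _, label = svm.predict(features)
        return int(label[0][0]) == 1   # label 1 assumed to denote "vehicle"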
In an embodiment of a method for estimating parking occupancy, for example in street segment 15, a count of each vehicle entering and exiting street segment 15 and an estimate of the number of vehicles transiting through street segment 15 can be generated and stored in the processor 20. In one estimation method, a count-based method determines current traffic flow and generates a timeline of vehicle counts as vehicles enter and exit street segment 15 from either direction. As a vehicle enters street segment 15, a video camera 16 detects the vehicle and sends information relative to the specific individual vehicle to the processor 20, which the processor 20 associates with that vehicle. For example, the processor may process the information into feature vectors to determine that a vehicle is a “red sedan,” “blue pickup,” “white van,” etc. Feature vectors can include, for example, vehicle color, vehicle type (i.e., sedan, coupe, convertible, etc.), vehicle size (i.e., car, van, SUV, delivery truck, etc.), or a combination of two or more feature vectors. Vehicles entering the street segment 15 can be tracked with an “entry” variable (e.g., “C+(t)”) and, upon detection, the count of the entry variable is increased by one. As a previously detected vehicle exits the street segment, a video camera 16 detects the exiting vehicle and sends the exit information relative to the specific vehicle to the processor 20, and an “exit” variable (e.g., “C−(t)”) is increased by one. The processor 20 can then estimate the number of vehicles within the street segment 15 by subtracting C−(t) from C+(t).
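A simple sketch of this count-based bookkeeping is shown below; the class and method names are illustrative only, with c_plus and c_minus mirroring the entry and exit variables C+(t) and C−(t):

    class BlockCounter:
        def __init__(self):
            self.c_plus = 0    # count of vehicles detected entering the segment
            self.c_minus = 0   # count of vehicles detected exiting the segment

        def vehicle_entered(self, features=None):
            # features: e.g., ("red", "sedan"), available for later matching
            self.c_plus += 1

        def vehicle_exited(self, features=None):
            self.c_minus += 1

        def vehicles_in_segment(self):
            # Estimated number of vehicles currently within the segment
            return self.c_plus - self.c_minus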
In addition, to improve accuracy so that transiting vehicles are not counted as parked vehicles, a typical vehicle transit time through the street segment 15 can be determined for vehicles proceeding directly from their entry point to their exit point (i.e., vehicles using the street segment 15 but not parking). The typical transit time can be denoted as “Δt,” and Δt may include an added tolerance. For example, for a 200 foot long section of street, an average Δt may be about 6.8 seconds assuming an average speed of about 20 mph for through traffic. As a vehicle enters street segment 15, its count is added to variable C+(t), and it is given time Δt to exit street segment 15. If the specific vehicle is not detected as exiting by its entry time plus Δt, it is counted as a parked vehicle and the estimated parking occupancy for street segment 15 is increased accordingly. If an exiting vehicle is detected that has a transit time that exceeds Δt, the vehicle is assumed to be a previously parked vehicle that has exited street segment 15, the exit count C−(t) is increased by one, and the estimated parking occupancy is decreased accordingly. This is in contrast to some conventional parking estimation techniques and structures that simply subtract the number of exits from the number of entrances to determine a parking occupancy, and may assume all vehicles that enter are parked. This conventional technique may work for parking garages and lots, but would be inaccurate for street parking or parking areas having through traffic. Thus embodiments of the present teachings may provide a more accurate parking occupancy estimate.
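The Δt rule can be sketched as follows, assuming each detected vehicle is associated with an identifier (as discussed above) and that entry/exit timestamps are available in seconds; the Δt value and tolerance are illustrative only:

    DELTA_T = 6.8 + 2.0   # assumed typical transit time plus tolerance, seconds

    class ParkingEstimator:
        def __init__(self, initial_parked):
            self.parked = initial_parked   # initialization count of parked vehicles
            self.pending = {}              # vehicle identifier -> entry timestamp

        def on_entry(self, vehicle_id, t):
            self.pending[vehicle_id] = t

        def on_exit(self, vehicle_id, t):
            entry_t = self.pending.pop(vehicle_id, None)
            if entry_t is None or t - entry_t > DELTA_T:
                # Not recent through traffic: a previously parked vehicle has left
                self.parked = max(0, self.parked - 1)

        def expire(self, t):
            # Entrants not seen exiting within Δt are assumed to have parked
            for vid, entry_t in list(self.pending.items()):
                if t - entry_t > DELTA_T:
                    del self.pending[vid]
                    self.parked += 1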
The transit time Δt can be determined using various techniques. For example, a human operator can periodically measure the transit time for a specific vehicle as it transits through the street segment 15 and the measured value can be entered into the processor 20 using direct input or input from a hand-held device and subsequently used as Δt. In another technique, the processor 20 can use historical data, for example a measured Δt from previous same days of the week. In another technique, the processor 20 in cooperation with the video cameras 16 keeps a live running value for Δt based on vehicles entering street segment 15 and exiting within Δt±a tolerance value, and then adjusting Δt based on the measured transit time. Further, Δt may change during the day based on the time of day, the day of the week, and/or the time of year, and may therefore be adjusted for diurnal, weekly, monthly, or yearly traffic patterns. For example, Δt may be shorter during early mornings and late evenings, and longer during rush hour traffic. Diurnal adjustments to Δt and historical norms for Δt can be programmed into the processor 20 for automated updating.
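As one possible illustration of keeping a live running value, Δt can be exponentially smoothed with each newly measured through-transit time; the smoothing factor alpha is an assumed parameter and is not specified by the present teachings:

    def update_transit_time(delta_t, measured_transit, alpha=0.1):
        # Blend a newly measured transit time into the running estimate of Δt
        return (1.0 - alpha) * delta_t + alpha * measured_transit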
Traffic flow can also be described using the feature vectors described above, but using a different estimating method, in this case a vehicle sequence estimation technique. In this embodiment, in a given sequence time interval ΔST (an arbitrary user-selected sequence time interval over which the vehicle entry and exit sequence is monitored), a sequence of detected vehicles entering the street might be, for example, “red sedan,” “blue pickup,” “blue sedan.” Within the sequence time interval ΔST, the sequence of exiting vehicles detected is “red sedan,” “blue sedan.” In this embodiment, the blue pickup can be counted as a parked vehicle if it is not detected exiting within its entry time plus Δt.
Similarly, if a sequence of vehicles entering the street segment 15 is determined to be “red sedan,” “blue sedan” and within sequence time interval ΔST a sequence of detected exiting vehicles is “red sedan,” “black van,” “blue sedan,” the black van can be counted as a parked vehicle which has exited a parking spot and the street segment 15. The count of the formerly parked black van can be added to the number of vehicles absent from the entry sequence and present in the exit sequence during the sequence time interval ΔST and the estimated parking occupancy can be decreased accordingly.
The estimated parking occupancy can then be updated as:
CP(t)=CP(t−ΔST)+ΔC+−ΔC−
where CP(t) is an estimated parking occupancy at a time t during the sequence time interval ΔST where time t is subsequent to the initial time t=0, CP(t−ΔST) is an estimated parking occupancy at a time t−ΔST, ΔC+ is a number of vehicles present in the entry sequence and absent from the exit sequence when the entering and exiting sequences are compared, and ΔC− is a number of vehicles absent from the entry sequence and present in the exit sequence when the entering and the exiting sequences are compared.
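A sketch of this sequence-based update, assuming the detected vehicles are described by simple feature tuples and treating the entry and exit sequences as multisets, is shown below; the starting estimate of five parked vehicles is an assumed value for illustration:

    from collections import Counter

    def update_occupancy(cp_prev, entry_sequence, exit_sequence):
        # CP(t) = CP(t - ΔST) + ΔC+ - ΔC-
        entered = Counter(entry_sequence)
        exited = Counter(exit_sequence)
        delta_c_plus = sum((entered - exited).values())   # entered, did not exit
        delta_c_minus = sum((exited - entered).values())  # exited, did not enter
        return cp_prev + delta_c_plus - delta_c_minus

    # Example from the text: the blue pickup entered but did not exit, so it
    # is counted as parked and the occupancy estimate increases by one.
    cp = update_occupancy(5,
                          [("red", "sedan"), ("blue", "pickup"), ("blue", "sedan")],
                          [("red", "sedan"), ("blue", "sedan")])
    # cp == 6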
In another method according to the present teachings, the parking occupancy or transit occupancy can be estimated by a count-based method using mathematical techniques. In this embodiment, feature vectors for specific vehicles are not necessary and a mathematical algorithm is used to estimate parking occupancy at a given time “t” that is after measurement begins at t=0. For example, an estimated parking occupancy can be calculated using the formula:
CP(t)=CP0+C+(t−Δt)−[C−(t)−CT0]
where CP(t) is the estimated parking occupancy at a given time t during the measurement cycle where time t is subsequent to time t=0, CP0 is an initialization data count of vehicles parked within the area of interest at the beginning of a measurement cycle (t=0) as discussed above, C+(t−Δt) is a count of the number of vehicles entering the street segment 15 from time t=0 up to time t−Δt where Δt is the estimated transit time for a vehicle from the start of the interest area to the end of the interest area, and t−Δt is greater than t=0, C−(t) is the count of the number of vehicles exiting the street segment 15 from time t=0 up to the time t, and CT0 is an initial count of transiting vehicles within the street segment 15 at the beginning of a measurement cycle (t=0). CT0 can be determined for the beginning of the measurement cycle (t=0) by a manual count of the transiting vehicles by a human operator or using another technique. Thus both CP0 and CT0 are initialization data entered into the processor 20, for example directly or through a hand-held device by a human operator, for the beginning of the measurement cycle (t=0).
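A sketch of this calculation, assuming the entry and exit detections are available as lists of timestamps measured in seconds from t=0, is:

    def estimate_occupancy(cp0, ct0, entry_times, exit_times, t, delta_t):
        # CP(t) = CP0 + C+(t - Δt) - [C-(t) - CT0]
        c_plus = sum(1 for te in entry_times if te <= t - delta_t)  # C+(t - Δt)
        c_minus = sum(1 for tx in exit_times if tx <= t)            # C-(t)
        return cp0 + c_plus - (c_minus - ct0)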
In another embodiment as depicted in
Once a parking occupancy estimate has been generated, any number of operations can be performed. For example, the parking occupancy estimate can be made available for public or private viewing, for example by using the processor 20 to broadcast the parking occupancy data or by uploading the parking occupancy data from the processor 20 to a network such as the Internet. A user can access the parking occupancy estimate through a processor or computing device such as a laptop, smartphone, tablet, handheld device, GPS navigation system, etc., and use the data to determine the likelihood of finding available parking.
It is also contemplated that a parking management entity can set dynamic parking rates based on parking occupancy. For example, a parking rate charged by the parking management entity during periods of low parking space availability (high parking occupancy) may be higher than during periods of high parking space availability (low parking occupancy). The dynamic parking rates can be configured for public or private viewing such that a user can determine current parking rates almost instantaneously, for example by using the processor 20 to broadcast the dynamic parking rates or by uploading the dynamic parking rates from the processor 20 to a network such as the Internet. A user can access the dynamic parking rate through a processor or computing device such as a laptop, smartphone, tablet, handheld device, GPS navigation system, etc., and select parking based on the current dynamic parking rate.
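As a purely hypothetical illustration of such a dynamic rate schedule (the thresholds and multipliers below are illustrative assumptions, not part of the present teachings):

    def dynamic_rate(occupancy_fraction, base_rate=1.00):
        # Higher parking occupancy (fewer available spaces) -> higher rate
        if occupancy_fraction >= 0.85:
            return base_rate * 2.0
        if occupancy_fraction >= 0.60:
            return base_rate * 1.5
        return base_rate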
Additionally, it is contemplated that parking payment devices 34 (
Notwithstanding that the numerical ranges and parameters setting forth the broad scope of the present teachings are approximations, the numerical values set forth in the specific examples are reported as precisely as possible. Any numerical value, however, inherently contains certain errors necessarily resulting from the standard deviation found in their respective testing measurements. Moreover, all ranges disclosed herein are to be understood to encompass any and all sub-ranges subsumed therein. For example, a range of “less than 10” can include any and all sub-ranges between (and including) the minimum value of zero and the maximum value of 10, that is, any and all sub-ranges having a minimum value of equal to or greater than zero and a maximum value of equal to or less than 10, e.g., 1 to 5. In certain cases, the numerical values as stated for the parameter can take on negative values. In this case, the example value of range stated as less than 10 can assume negative values, e.g. −1, −2, −3, −10, −20, −30, etc.
While the present teachings have been illustrated with respect to one or more implementations, alterations and/or modifications can be made to the illustrated examples without departing from the spirit and scope of the appended claims. For example, it will be appreciated that while the process is described as a series of acts or events, the present teachings are not limited by the ordering of such acts or events. Some acts may occur in different orders and/or concurrently with other acts or events apart from those described herein. Also, not all method stages may be required to implement a methodology in accordance with one or more aspects or embodiments of the present teachings. It will be appreciated that structural components and/or processing stages can be added or existing structural components and/or processing stages can be removed or modified. Further, one or more of the acts depicted herein may be carried out in one or more separate acts and/or phases. Furthermore, to the extent that the terms “including,” “includes,” “having,” “has,” “with,” or variants thereof are used in either the detailed description or the claims, such terms are intended to be inclusive in a manner similar to the term “comprising.” The term “at least one of” is used to mean that one or more of the listed items can be selected. The term “about” indicates that the value listed may be somewhat altered, as long as the alteration does not result in nonconformance of the process or structure to the illustrated embodiment. Finally, “exemplary” indicates the description is used as an example, rather than implying that it is an ideal. Other embodiments of the present teachings will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure herein. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the present teachings being indicated by the following claims.
Terms of relative position as used in this application are defined based on a plane parallel to the conventional plane or working surface of a workpiece, regardless of the orientation of the workpiece. The term “horizontal” or “lateral” as used in this application is defined as a plane parallel to the conventional plane or working surface of a workpiece, regardless of the orientation of the workpiece. The term “vertical” refers to a direction perpendicular to the horizontal. Terms such as “on,” “side” (as in “sidewall”), “higher,” “lower,” “over,” “top,” and “under” are defined with respect to the conventional plane or working surface being on the top surface of the workpiece, regardless of the orientation of the workpiece.