The invention relates to the management of parking facilities and, more particularly, to the detection of available parking spaces in a parking facility using a lighting system.
Growth of several metropolitan areas, urban revitalization and the mobility of a vehicle-based society have increased the need for an efficient transport infrastructure. The demand for parking is increasing and represents a major concern. Advanced Parking Management Systems (APMS) are used to assist motorists in finding parking spaces quickly. An APMS can provide real-time information on parking space occupancy for several facilities. This information is used to generate parking availability messages that are displayed on message information boards through several different means.
System accuracy is a critical factor, and the method used for vehicle counting is an important aspect of an APMS. Two types of systems are generally used for counting. The first is based on entry/exit counters; usually, inductive loop counters, RF tags or video detection are used. The second is based on space occupancy detectors and uses ultrasonic, inductive loop, infrared or microwave sensors. In that case, a sensing unit is installed over each individual parking space and detects whether the space is available or not. Entry/exit counters are not as accurate as individual space sensors but are easier to implement. The installation of space sensors, or the availability of mounting space for detectors in pre-existing parking facilities, can be a problem. The ability to communicate data for each individual space is also an important issue to consider. The complexity of the sensor installation can have an important impact on the cost of an APMS and is another issue for these systems.
Some systems used for detecting available parking spaces use an emitter and a receiver. A signal is transmitted from the emitter to the receiver to detect an available space. When the receiver is hidden behind a vehicle, the signal will not be able to reach the receiver and the system will conclude that a vehicle is present. Other systems use optical sensors which read barcodes written on the pavement. When the optical sensor is not able to read the barcode, the system concludes that a vehicle is parked in the parking space.
In a vision-based system, each camera is able to survey more than one parking space. Some are able to identify features such as the license plate number, color, make and model of a vehicle. If such precision of detection is required, the performance of the camera is critical, notably in terms of resolution. Also, vision-based systems need a high bandwidth to communicate the data. Some smart cameras have an embedded processing system but are expensive. Other vision-based systems use stereoscopic principles to determine the occupancy of a parking space.
For all of these prior art solutions, costly installation is necessary. Typically, sensors and cameras must be mounted on walls, on a pillar or on a rail, or must be embedded in the parking space. Prior art parking management systems are stand-alone systems that do not integrate with the existing lighting infrastructure, using their own power lines and their own cable interfaces. Some sensors are powered by batteries and require maintenance over time.
It is therefore an aim of the present invention to address at least one of the above mentioned difficulties.
It is an object of the present invention to provide a lighting system capable of detecting the presence of vehicles for advanced parking management.
It is a further object of the present invention to provide a method for detecting the presence of vehicles for advanced parking management.
By integrating detection of the availability of a parking space within a LED lighting fixture, the benefits for the parking operator are important: low installation cost, low maintenance cost, lower energy cost and improved parking efficiency.
A system and a method for detecting availability of a parking space in a parking facility are provided.
According to a first aspect of the present invention, there is provided a method for determining an availability of a parking space comprising: providing a lighting system having at least one visible-light source for illumination of at least part of the parking space; providing an available space time-of-flight trace; providing an availability threshold value; illuminating the at least part of the parking space using the at least one visible-light source; emitting a status visible-light signal from the visible-light source in a predetermined direction toward a predetermined target in the parking space; capturing a status reflection trace at the visible-light source; determining a time-of-flight difference value by comparing the status reflection trace to the available space time-of-flight trace; and comparing the time-of-flight difference value with the availability threshold value and determining a status of the parking space to be one of available and not available.
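For illustration only, and not as the claimed implementation, the comparison of a status reflection trace with a stored available space time-of-flight trace may be sketched as follows; the function and parameter names (for example sample_period_ns and availability_threshold_ns) are assumptions introduced here and do not appear in the specification.

```python
import numpy as np

def time_of_flight_difference(status_trace, available_trace, sample_period_ns):
    """Estimate the time shift (in ns) between the status reflection trace
    and the stored available-space trace using cross-correlation."""
    status = np.asarray(status_trace, dtype=float)
    reference = np.asarray(available_trace, dtype=float)
    correlation = np.correlate(status, reference, mode="full")
    lag_samples = np.argmax(correlation) - (len(reference) - 1)
    return lag_samples * sample_period_ns

def parking_space_status(status_trace, available_trace,
                         sample_period_ns, availability_threshold_ns):
    """Return 'available' or 'not available' by comparing the time-of-flight
    difference value with the availability threshold value."""
    shift_ns = time_of_flight_difference(status_trace, available_trace,
                                         sample_period_ns)
    # A vehicle between the light source and the floor shortens the optical
    # path, so the echo in the status trace shifts relative to the stored
    # floor echo; the magnitude of that shift is compared to the threshold.
    return "not available" if abs(shift_ns) > availability_threshold_ns else "available"
```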
According to a second aspect of the present invention, there is provided a method for determining an availability of a parking space comprising: providing a lighting system having at least one visible-light source for illumination of at least part of the parking space; providing a camera; providing an available space region value; providing an availability threshold value; illuminating the at least part of the parking space using the at least one visible-light source; emitting visible light from the visible-light source in a predetermined direction toward a predetermined target in the parking space; capturing a reflection of the emitted visible light at the camera and determining a status region value; determining a region difference value by comparing the status region value to the available space region value; and comparing the region difference value with the availability threshold value and determining a status of the parking space to be one of available and not available.
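Similarly, and again purely as an illustrative sketch, the region comparison of the second aspect may be expressed as follows, assuming the status region value is obtained by summing the camera intensity over a calibrated region of interest; the mask-based formulation and all names are assumptions.

```python
import numpy as np

def status_region_value(frame, region_mask):
    """Sum the reflected-light intensity over the calibrated region of
    interest (e.g. the pixels illuminated when the parking space is empty)."""
    return float(np.sum(frame[region_mask]))

def camera_parking_space_status(frame, region_mask, available_space_region_value,
                                availability_threshold):
    """Compare the status region value with the stored available-space region
    value and report the occupancy status."""
    region_difference = abs(status_region_value(frame, region_mask)
                            - available_space_region_value)
    return "not available" if region_difference > availability_threshold else "available"
```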
According to a third aspect of the invention, there is provided a system for detecting availability of a parking space in a parking facility, comprising: a powered lighting module having at least one visible-light source driven by an illumination driver; an optical detector for capturing a status reflection trace at said lighting module; a lighting module processor for controlling said illumination driver and said optical detector and receiving said status reflection trace; a powered central unit having a memory and a central unit processor; and a network for communicating information between said lighting module processor and said central unit.
Having thus generally described the nature of the invention, reference will now be made to the accompanying drawings, showing by way of illustration a preferred embodiment thereof and in which:
It will be noted that throughout the appended drawings, like features are identified by like reference numerals.
Typically, illumination for parking facilities is provided by various lighting modules using metal halide, high pressure sodium or fluorescent lights. Recently, due to an increase in performance and a reduction in cost, LEDs have become another source of illumination. LEDs have several advantages, notably in terms of lifetime, efficiency, color rendering index (CRI) and robustness.
An example of a parking facility equipped with the system of an embodiment of the invention is shown in
Each device 100 provides an occupancy status for at least one individual parking space 102. This information is transmitted by the communications network 103 to a central computer (not shown). The device has at least one field of view for the detection and, in the calibration process, will take into consideration permanent obstacles, such as structural pillars 104, which could obstruct part of the field of view.
The visible-light source 512 is connected to a source controller 514, so as to be driven in order to produce visible light. In addition to emitting light, the system 510 performs detection of objects and particles (vehicles, passengers, pedestrians, airborne particles, gases and liquids) when these objects are part of the environment/scene illuminated by the light source 512. Accordingly, the source controller 514 drives the visible-light source 512 in a predetermined mode, such that the emitted light takes the form of a light signal, for instance by way of amplitude-modulated or pulsed light emission.
These light signals are such that they can be used to provide the illumination level required by the application, through data/signal processor 518 and source controller 514, while producing a detectable signal. Accordingly, it is possible to obtain a light level equivalent to a continuous light source by modulating the light signal fast enough (e.g., frequency more than 100 Hz) to be generally imperceptible to the human eye and having an average light power equivalent to a continuous light source.
In an embodiment, the source controller 514 is designed to provide an illumination drive signal, such as a constant DC signal or a pulse-width modulated (PWM) signal, that is normally used in lighting systems to produce the required illumination and control the intensity. The illumination drive signal is produced by the illumination driver sub-module 514A of the controller 514.
The modulated/pulsed drive signal is produced by a modulation driver sub-module 514B of the controller 514. The amplitude of a short pulse (typically <50 ns) can be several times the nominal value while the duty cycle remains low (typically <0.1%). The modulation frequency can be up to several MHz.
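A minimal numeric sketch of the two drive signals follows; the specific values (a 1 kHz illumination PWM, 30 ns detection pulses repeated at 20 kHz, a pulse amplitude five times the nominal level) are assumed for illustration and simply fall within the typical bounds quoted above.

```python
import numpy as np

# Illustrative drive-signal parameters (assumed values within the typical
# bounds quoted above: pulse width < 50 ns, duty cycle < 0.1 %).
SAMPLE_PERIOD_NS = 5            # waveform time step
PWM_PERIOD_NS = 1_000_000       # 1 kHz illumination PWM
PWM_DUTY = 0.6                  # 60 % illumination duty cycle
PULSE_WIDTH_NS = 30             # short detection pulse
PULSE_PERIOD_NS = 50_000        # 20 kHz repetition -> 0.06 % duty cycle
NOMINAL_AMPLITUDE = 1.0
PULSE_AMPLITUDE = 5.0           # several times the nominal drive level

def combined_drive_waveform(duration_ns):
    """Superpose the illumination PWM signal (sub-module 514A) and the short
    detection pulses (sub-module 514B) into a single LED drive waveform."""
    t = np.arange(0, duration_ns, SAMPLE_PERIOD_NS)
    illumination = np.where((t % PWM_PERIOD_NS) < PWM_DUTY * PWM_PERIOD_NS,
                            NOMINAL_AMPLITUDE, 0.0)
    detection = np.where((t % PULSE_PERIOD_NS) < PULSE_WIDTH_NS,
                         PULSE_AMPLITUDE, 0.0)
    return t, np.maximum(illumination, detection)
```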
The modulation driver 514B can also be used to send data for optical communication. Both driving signals can be produced independently or in combination. Sequencing of the drive signals is controlled by the data/signal processor 518. The light source 512 can be monitored by the optical detector and acquisition module 516 and the resulting parameters sent to the data/signal processor 518 for optimization of data processing.
An alternative for sourcing the light signal for detection involves an auxiliary light source (ALS) 522, which can be a visible or non-visible source (e.g., UV or IR light, LEDs or laser) using the modulation driver 514B. The auxiliary light source 522 provides additional capabilities for detecting objects and particles. IR light can be used to increase the performance and the range of the detection area. IR lights and other types of light can be used to detect several types of particles by selecting specific wavelengths. The auxiliary light source 522 can also be useful during the installation of the system by using it as a pointer and distance meter reference. It can also be used to determine the condition of the lens.
The visible-light source 512 is preferably made up of LEDs. More specifically, LEDs are well suited to be used in the lighting system 510 since LED intensity can be efficiently modulated/pulsed at suitable speeds. Using this feature, current lighting systems already installed and featuring LEDs for standard lighting applications can be used as the light source 512 for detection applications, such as presence detection for energy savings, distance measurements, smoke detection and spectroscopic measurements for gas emission.
The system 510 has at least one lens 530 through which light is emitted in an appropriate way for the parking applications. At least one input lens section 530a of at least one lens 530 is used for receiving the light signal, for instance reflected or diffused (i.e., backscattered) by the objects 534. This input lens section 530a can be at a single location or distributed (multiple zone elements) over the lens 530 and have at least one field of view. Several types of lens 530 can be used, such as Fresnel lenses and fisheye lenses for instance. A sub-section of the lens 530 can be used for infrared wavelength. A sub-section of the lens 530 can be used for optical data reception.
A detector and acquisition module 516 is associated with the visible-light source 512 and/or auxiliary light source 522 and the lens 530. The detector and acquisition module 516 is an optical detector (or detectors) provided so as to collect light emitted by the light source 512/ALS 522 and back-scattered (reflected) by the vehicle/pedestrian 534. The detector and acquisition module 516 can also monitor the visible-light source 512 or the auxiliary light source 522. The light signal can also come from an object 534 that is the direct source of this light (such as a remote control) in order to send information to the data/signal processor through the optical detector module 516. The optical detector and acquisition module 516 is, for example, composed of photodiodes, avalanche photodiodes (APD), photomultipliers (PMT), complementary metal oxide semiconductor (CMOS) or charge coupled device (CCD) array sensors, or 3D camera sensors (time-of-flight depth sensors), with appropriate acquisition circuits to digitize the data.
Filters are typically provided with the detector module 516 to control background ambient light emitted from sources other than the lighting system 510. Filters can also be used for spectroscopic measurements and to enhance performance of the optical detector module 516.
The data/signal processor 518 is connected to the detector and acquisition module 516 and receives the digitized data. The data/signal processor 518 is also connected to the source controller 514, so as to receive driving data therefrom. The data/signal processor 518 has a processing unit (e.g., CPU) so as to interpret the data from the detector and acquisition module 516 in comparison with the driving data of the source controller 514, which provides information about the predetermined mode of emission of the light signals emitted by the visible-light source 512.
Accordingly, information about the vehicle and pedestrian (e.g., presence, distance, etc.) is calculated by the data/signal processor 518 as a function of the relation (e.g., phase difference, relative intensity, spectral content, time of flight, etc.) between the driving data and the detected light data. A database 520 may be provided in association with the data/signal processor 518 so as to provide historical data (serial number, date of installation, etc.), calibration data (data stored during the calibration process, thresholds for instance), and tabulated data to accelerate the calculation of the object parameters (signal reference for instance).
In view of the calculations it performs, the data/signal processor 518 controls the source controller 514 and thus the light output of the visible-light source 512. For instance, the visible-light source 512 may be required to increase or reduce its intensity, or to change the parameters of its output. For example, changes in its output power can adapt the lighting level to daytime versus nighttime conditions. The system can also slightly increase the level of luminance when activities are detected. In the case of the configuration of
The system 510 has sensors 532 connected to the data/signal processor 518. Sensors 532 can be passive infrared sensors, temperature sensors, day/night sensors, etc. Sensors 532 are useful during the installation of the system and during its operation. For instance, data from a passive infrared sensor can be used to detect the presence of a pedestrian and adapt the level of light output, or to transmit the information of the presence of a pedestrian to the central computer. Information from sensors 532 and the data/signal processor 518, and light from the light source 512 and the auxiliary light source 522, can be useful during installation, in particular for adjusting the field of view of the optical receiver.

The system 510 has a power supply and interface 528. The supply input receives a nominal voltage to operate the system, typically 120 Vac or 240 Vac. This input voltage is transformed to supply the light source and electronic circuits. The interface section is connected to the data/signal processor and communicates information to an external system 540 (via wireless, power line, Ethernet, etc.). This communicated information is related to the occupancy of each parking space. Data from the sensors or visual data from 2D sensors can also be transmitted, notably for security purposes.

Information from the lighting network is sent to a central computer that controls the advanced parking management system. The central computer 540 can configure and control each lighting module in the network. For instance, a parking space can be reserved for a specific function, and the central computer can set the lighting module to indicate that this space is not available even if no vehicle is detected by the sensor of the lighting module. Lighting modules can be configured as a mesh network, and information can be shared between the modules in the network.
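The specification does not define a particular message format for the interface; purely as an assumption for illustration, a lighting module could report occupancy to the central computer with a small structured message such as the following.

```python
import json
import time

def occupancy_message(module_id, space_id, occupied, reserved=False):
    """Build an illustrative status message a lighting module could send to
    the central computer over the interface (wireless, power line, Ethernet...)."""
    return json.dumps({
        "module_id": module_id,          # identifies the lighting module
        "space_id": space_id,            # identifies the parking space
        "status": "not_available" if (occupied or reserved) else "available",
        "reserved": reserved,            # the central computer may force this flag
        "timestamp": time.time(),
    })

# Example: a module (e.g. device 100) reports that parking space 102 is free.
print(occupancy_message(module_id=100, space_id=102, occupied=False))
```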
Detection Based on the Time of Flight Principle
The time-of-flight principle is especially well suited for a parking management system. The range for the detection is relatively short (typically less than 10 m) and the reflected signal is relatively strong. The requirements for accuracy and refresh rate are relatively low, and the power of the source does not need to be high. In fact, only one bit of resolution is needed to determine the occupancy of a parking space using relative distance measurement, namely whether a vehicle is present or absent. However, the electronic circuits still need to detect small signals with a few nanoseconds of resolution. Moreover, when a LED light source with pulse width modulation (PWM) is used as the source for detection and ranging, the rise time of the source signal can be several tens or even hundreds of nanoseconds.
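The relative distance measurement relies on the usual round-trip time-of-flight relation, shown here for reference and consistent with the numerical example given below (a 1.5 m change in distance corresponds to a 10 ns shift):

```latex
\Delta t = \frac{2\,\Delta d}{c}, \qquad
\Delta d = 1.5\ \mathrm{m} \;\Rightarrow\;
\Delta t = \frac{2 \times 1.5\ \mathrm{m}}{3 \times 10^{8}\ \mathrm{m/s}} = 10\ \mathrm{ns}.
```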
In order to detect occupancy of a parking space at low cost, the following method can be used. A LED light source is used for illumination, with a PWM circuit to control the intensity of the illumination. The same PWM LED light source is also used as a source for detection and ranging measurements. An optical detector and acquisition sub-system samples the reflected or backscattered light signal returned from the parking space. The distance is estimated and compared to the distance to the floor measured during calibration. The status of the occupancy of the parking space is determined and sent to an external system.
During the calibration process, a threshold is set to discriminate between the presence and the absence of a vehicle in the parking space. For example, the distance between the floor of a parking space and the lighting module can be 6 meters. The time-of-flight trace of the reflection of the optical signal between the lighting module and the floor is stored in the memory of the lighting module. When a vehicle moves into the parking space, the optical reflection path of the light is shorter and the time-of-flight trace shifts a few nanoseconds earlier (e.g., 10 ns for a distance change of 1.5 meters). A threshold of 5 ns can then be set to clearly define the occupancy status of the parking space.
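As a sketch only, the calibration example above (a 6 m mounting height and a 5 ns threshold) can be turned into a simple decision rule; the helper names and the fixed electronics-delay parameter are assumptions.

```python
SPEED_OF_LIGHT_M_PER_NS = 0.3   # light travels roughly 0.3 m per nanosecond

def estimated_distance_m(measured_delay_ns, electronics_delay_ns):
    """Convert the measured round-trip delay into a distance, removing the
    fixed delay contributed by the electronic circuits (set at calibration)."""
    return (measured_delay_ns - electronics_delay_ns) * SPEED_OF_LIGHT_M_PER_NS / 2.0

def occupancy_from_delay(measured_delay_ns, floor_delay_ns, threshold_ns=5.0):
    """Decision rule from the calibration example above: the floor echo was
    stored at calibration (e.g. a 6 m mounting height); an echo arriving more
    than `threshold_ns` earlier indicates a parked vehicle."""
    return "not available" if (floor_delay_ns - measured_delay_ns) > threshold_ns else "available"
```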
At least one detector is used per parking space. Several detectors can be multiplexed and can share the same acquisition circuit. One detector can be used with different fields of view when sequential sources illuminate different sections of the scene. Time-of-flight depth sensors or 3D imaging sensors can also be used as detectors for the system. Mechanical adjustment may be required to set the right position of the field of view (FOV) of the sensor, particularly when only one FOV per parking space is installed. When a plurality of FOVs is available per parking space, the selection of at least one FOV per parking space is made during installation, typically using a computer communicating with the lighting module.
The detection circuit detects the backscattered signal and the acquisition and processing system determines the delay between the signals sent and received. This delay is determined in part by the distance of the background or of the object present in the field of view (e.g., the parking floor or a vehicle). It is also determined by the delays introduced by the electronic circuits. These circuits may introduce a significant drift in the delays; this drift is relatively slow, generally due to thermal variation.
Starting from the measured delay, the data/signal processor calculates the relative variation between the new delay measurement and the previous one. A significant reduction in the delay indicates that an object closer to the detector has entered the field of view (e.g., a new vehicle entering the parking space), while a significant increase in the delay indicates that an object has been removed from the field of view (e.g., a vehicle leaving the parking space).
During configuration, a Noise Level parameter (NL, typically in ns) is initialized in order to determine what is considered a significant variation with respect to the system noise. A Threshold parameter (Th, typically in ns) is also initialized during configuration in order to determine what is considered a significant variation representing a change in the parking space occupancy (arrival or departure of a vehicle). The reading is taken over a sufficiently long period in order to discriminate events related to the parking of a vehicle from events related to other activities (a pedestrian crossing, a vehicle partially covering the field of view). A Setup Time parameter (ST, typically in ms) is initialized during configuration to discriminate between these two types of events: the arrival or departure of a vehicle in the parking space versus other activities. If the status of the parking space changes, or if no event is observed, the measured delay is used as the reference for the next measurement. The background value measured during initialization of the system and the current parking status are stored.
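The following sketch illustrates one possible way to apply the NL, Th and ST parameters described above; the class name, the drift-tracking coefficient and the use of seconds for ST are assumptions made for illustration.

```python
import time

class SpaceOccupancyFilter:
    """Illustrative sketch of the decision logic described above: a change in
    the measured delay is only accepted as an arrival/departure event if it
    exceeds the Threshold (Th), persists for the Setup Time (ST), and is not
    attributable to system noise (NL) or slow thermal drift."""

    def __init__(self, noise_level_ns, threshold_ns, setup_time_s, initial_delay_ns):
        self.nl = noise_level_ns        # NL: variation considered noise
        self.th = threshold_ns          # Th: variation meaning an occupancy change
        self.st = setup_time_s          # ST: time the change must persist
        self.reference_delay_ns = initial_delay_ns  # background stored at initialization
        self.occupied = False
        self._pending_since = None

    def update(self, measured_delay_ns, now=None):
        now = time.monotonic() if now is None else now
        shift = self.reference_delay_ns - measured_delay_ns  # > 0: echo arrives earlier

        if abs(shift) <= self.nl:
            # Within the noise level: slowly track drift (e.g. thermal) and keep the status.
            self.reference_delay_ns += 0.01 * (measured_delay_ns - self.reference_delay_ns)
            self._pending_since = None
        elif abs(shift) >= self.th:
            # Large change: wait for ST before declaring an arrival or departure,
            # to ignore pedestrians or vehicles briefly crossing the field of view.
            if self._pending_since is None:
                self._pending_since = now
            elif now - self._pending_since >= self.st:
                self.occupied = shift > 0
                self.reference_delay_ns = measured_delay_ns
                self._pending_since = None
        else:
            self._pending_since = None
        return self.occupied
```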
With specific reference to
All the information related to the events is transferred by the interface to the central computer. This information allows the central computer to update its database concerning the occupancy of parking spaces; to modify the parking signal indicators integrated within the lighting modules to reflect the occupancy of the parking spaces; to indicate the parking space occupancy to vehicles in the aisle by means of these indicators and to warn of movement, notably for safety applications (e.g., a vehicle backing out of a parking space); and to control the intensity level of the lights within the parking facility to facilitate its management. During installation, a central computer or a local computer allows the configuration of each lighting module that is part of the lighting network (upgrades, parameter settings, etc.), as well as calibration and the determination of the starting configurations. The parameters may be modified subsequently.
Detection Based on Triangulation
This method uses sequential illumination sources and a camera. The lighting device 800 sequentially illuminates the scene by regions at a frequency imperceptible to the human eye. Sub-sections 880 of the lighting device 800 illuminate at least one region 801 covering a certain number of parking spaces. At least one camera 884, located at a certain distance from the sub-section 880 of the lighting device 800, integrates the reflection signal 883 originating from this specific lighting signal 801. During the calibration process, without any vehicle, the lighting signal 801 reaches the floor 809 and the reflected signal 883 is captured by the camera 884. The shape of the illumination of the region is stored in the database. When vehicles are parked in the parking spaces, the lighting signal 801 is reflected from the top of the vehicle 803 in the field of view toward the camera 884 at a different angle; the camera 884 can then detect a modification in the shape of the illumination of the region and estimate the probability of occupancy of the parking space. The distance between the sub-sections 880 and the camera 884 permits the use of the triangulation principle. For instance, if sub-sections 880 illuminate an available parking space with a pattern, a line for example, this pattern is captured by the camera. When a vehicle enters the parking space, the pattern seen by the camera changes (for example, a line will show transitions) and this modification of the pattern is used as the indication of the presence of a vehicle.
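As an illustrative sketch of this pattern-based detection, assuming the calibrated pattern is stored as a pixel mask and the camera frame is available as an intensity array (the threshold values here are arbitrary assumptions):

```python
import numpy as np

def pattern_match_ratio(frame, pattern_mask, intensity_threshold):
    """Fraction of the calibrated pattern pixels (pattern seen on the empty
    floor, stored at calibration) that are still illuminated in the frame."""
    pattern_pixels = frame[pattern_mask]
    return float(np.mean(pattern_pixels > intensity_threshold))

def space_occupied_by_triangulation(frame, pattern_mask,
                                    intensity_threshold=40, match_threshold=0.8):
    """When a vehicle enters the space, the projected line is reflected from
    the vehicle instead of the floor, so it appears displaced in the camera
    image and the match ratio over the calibrated mask drops."""
    return pattern_match_ratio(frame, pattern_mask, intensity_threshold) < match_threshold
```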
Auxiliary light sources can be used to generate specific lighting patterns in the parking spaces for detection purposes. Clusters of pixels can be defined during calibration as targets to exploit during operation. This selection can be made using calibration software installed in the computer of the external system.
In summary, with reference to
The available space time-of-flight trace can optionally include a trace of a reflection of an obstacle located between the visible-light source and the predetermined target and the determining a time-of-flight difference value includes considering the trace of the reflection of the obstacle 922.
The method preferably comprises indicating a status of the availability of the parking space using a status indicator for the parking space 920.
Optionally, an invisible light source can be provided and used 902.
With reference to
The available space region value can optionally take into consideration a permanent obstacle in the field of view 956.
The method preferably comprises indicating a status of the availability of the parking space using a status indicator for the parking space 972.
This invention provides a cost-effective solution by integrating energy-efficient illumination with added-value parking guidance based on real-time data of parking space vacancy. The strength of the invention resides in the added value of the sensing functions brought to the lighting system and in the reduction of installation cost, since only one installation is required for the two functions (illumination and detection), which cover the same area and share the same electrical infrastructure.
While illustrated in the block diagrams as groups of discrete components communicating with each other via distinct data signal connections, it will be understood by those skilled in the art that the illustrated embodiments may be provided by a combination of hardware and software components, with some components being implemented by a given function or operation of a hardware or software system, and many of the data paths illustrated being implemented by data communication within a computer application or operating system. The structure illustrated is thus provided for efficiency of teaching the described embodiment.
This application is a national phase entry of PCT Application Number PCT/CA2008/002248 entitled “PARKING MANAGEMENT SYSTEM AND METHOD USING LIGHTING” filed on Dec. 19, 2008, which in turn claims priority of US provisional patent applications nos. 61/015,738 and 61/015,867, both filed on Dec. 21, 2007 by Applicant, the specifications of which are hereby incorporated by reference.