The present invention relates to a system and method for traffic detection and more particularly to an optical system that detects the presence of vehicles on a roadway regardless of environmental particles present in the field of view using an active three-dimensional sensor based on the time-of-flight ranging principle.
Information from sensors is the starting point for optimizing traffic management and law enforcement. Sensors allow gathering statistical data about different parameters related to traffic monitoring and detecting traffic infractions such as speed limit violations. Examples of parameters of interest are the presence of a vehicle in a detection zone, the number of vehicles on the roadway, namely the volume on the roadway, the lane position, the vehicle class, the number of axles, the direction of the vehicle, the occupancy and the speed.
In the case of speed enforcement, and especially of average speed enforcement, the exact position of the front and the rear of a vehicle is useful data. Average speed measurement systems measure the average speed of a vehicle over a predetermined distance and use detectors to determine the times at which the vehicle passes the entry and exit points of a road section. The entry and exit points are usually hundreds of meters or even kilometers apart. The systems then synchronize automatic plate number recognition and vehicle identification at those points and use the known distance between them to calculate the average speed of the vehicle. If the average speed exceeds the speed limit, a fine can be issued by law enforcement authorities.
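By way of illustration only, the short sketch below shows the average-speed computation described above; the function name, the 2 km section length and the 110 km/h limit are illustrative assumptions and not part of the invention.

```python
# Minimal sketch of average-speed enforcement over a known section length.
# All names and numeric values are illustrative assumptions.

def average_speed_kmh(section_length_m: float, t_entry_s: float, t_exit_s: float) -> float:
    """Average speed over the section, in km/h, from entry/exit timestamps (seconds)."""
    elapsed_s = t_exit_s - t_entry_s
    if elapsed_s <= 0:
        raise ValueError("exit timestamp must be later than entry timestamp")
    return (section_length_m / elapsed_s) * 3.6

if __name__ == "__main__":
    # Vehicle matched by plate recognition at both ends of a 2 km section.
    speed = average_speed_kmh(2000.0, t_entry_s=0.0, t_exit_s=62.0)
    print(f"average speed: {speed:.1f} km/h")   # ~116.1 km/h
    if speed > 110.0:  # assumed speed limit for this example
        print("average speed exceeds limit: infraction may be recorded")
```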
Speed enforcement can require classifying vehicles to determine the right speed limit for a vehicle type. Some countries set different minimum and/or maximum speed limits for heavy trucks and buses. Commercial vehicles can also be subject to other constraints, such as truck lane restrictions specifying in which lane a certain type of vehicle is allowed to travel, which likewise requires classification functionality from the detection system.
Advanced Transportation Management Systems (ATMS) rely on accurate traffic data from different kinds of detectors, divided into two categories: intrusive and non-intrusive. Inductive loop detectors are an intrusive technology that is still commonly used for detecting vehicles even though it has disadvantages such as lengthy disruption to the traffic flow during installation and maintenance, relatively high cost, high failure rate and inflexibility. Other detectors, such as cameras with video processing, radar-based sensors, laser-based sensors, passive infrared sensors and ultrasound sensors, have been introduced for traffic monitoring but also have their limitations, and the market is still searching for alternatives.
Video processing sensors have well-known drawbacks, such as a high false alarm rate during night operation and difficulty performing in bad weather conditions that affect visibility, such as an episode of fog. Environmental particles are known to be difficult to manage.
Radar technology is known to perform well in bad weather conditions but has some limitations in terms of lateral resolution. Accurate occupancy measurement can be limited when occupancy is high. In some cases, for measuring the speed of a vehicle, radar traffic detectors located on the side of the road use an average length for the vehicles which causes errors in the vehicle speed estimate.
Thus, there is a need for a method and system for robust and accurate detection for multipurpose traffic management applications.
According to a broad aspect of the present invention, there is provided a method for detecting a vehicle located in a detection zone of a roadway having at least one lane, the detection zone on the roadway at least partly covering a width of the lane, the method comprising: providing a multi-channel scannerless full-waveform lidar system operating in pulsed Time-Of-Flight operation, an optical window of the full-waveform lidar system being oriented towards a surface of the roadway in order for the full-waveform lidar system to cover the detection zone; providing at least one initialization parameter for the full-waveform lidar system; using the full-waveform lidar system, emitting pulses at an emission frequency; receiving reflections of the pulses from the detection zone; acquiring and digitalizing a series of individual complete traces at each channel of the multi-channel system; identifying at least one detection in at least one of the individual complete traces; obtaining a height and an intensity for the detection using the individual complete trace; determining a nature of the detection to be one of an environmental particle detection, a candidate object detection and a roadway surface detection using at least one of the individual complete traces, the height, the intensity and the at least one initialization parameter; if the nature of the detection is the candidate object detection, detecting a presence of a vehicle in the detection zone.
In one embodiment, the method further comprises tracking an evolution of the detection in a time-spaced individual complete trace, the time-spaced individual complete trace being acquired after the individual complete trace, wherein the determining the nature includes comparing at least one of the height and the intensity in the time-spaced individual complete trace and the individual complete trace.
In one embodiment, the method further comprises obtaining a distance for the detection using the individual complete trace and the initialization parameter, wherein the determining the nature includes using at least one of the individual complete traces, the height, the intensity, the distance and the at least one initialization parameter.
In one embodiment, determining the nature includes determining a probability that the nature of the detection is the environmental particle if the tracking the evolution determines that the height decreases by more than a height threshold and the distance increases by more than a distance threshold; if the probability is higher than a probability threshold, determining the nature to be the environmental particle.
In one embodiment, determining the nature to be the environmental particle includes determining a presence of at least one of fog, water, rain, liquid, dust, dirt, vapor, snow, smoke, gas, smog, pollution, black ice and hail.
In one embodiment, the method further comprises identifying a presence of a retroreflector on the vehicle using the individual complete traces and the initialization parameters, by comparing an intensity of the detections with an intensity threshold and identifying detections having an intensity higher than the intensity threshold to be caused by a retroreflector on the vehicle.
In one embodiment, the method further comprises tracking an evolution of the detection in a time-spaced individual complete trace, the time-spaced individual complete trace being acquired at a time delay after the individual complete trace, wherein the identifying the presence of the retroreflector is carried out for the individual complete trace and the time-spaced individual complete trace, determining a distance of the retroreflector using the individual complete trace and the time-spaced individual complete trace and estimating a speed of the vehicle based on the initialization parameter, the distance and the time delay.
In one embodiment, the multi-channel scannerless full-waveform lidar system includes a light emitting diode (LED) light source adapted to emit the pulses.
In one embodiment, digitalizing the series of individual complete traces at each channel of the multi-channel system includes digitalizing the series at a high frame rate, the high frame rate being greater than 100 Hz.
In one embodiment, the method further comprises providing an image sensing module adapted and positioned to acquire an image covering at least the detection zone; synchronizing acquisition of the image with the acquiring and digitalizing of the full-waveform lidar system; acquiring the image with the image sensing module.
In one embodiment, the method further comprises recognizing a pattern in the image using the initialization parameter.
In one embodiment, the pattern is a circle, the pattern in the image corresponding to a wheel of the vehicle.
In one embodiment, the method further comprises determining a position of the pattern in the image, taking a second image after an elapsed time delay, recognizing the pattern in the second image and determining a second position of the pattern, determining a displacement of the pattern between the image and the second image.
In one embodiment, the method further comprises obtaining a distance for the pattern using the individual complete traces and the initialization parameter, and estimating a speed of the vehicle using the displacement, the distance for the pattern in the image and the pattern in the second image, the elapsed time delay and the initialization parameter.
In one embodiment, a longitudinal dimension of the detection zone is perpendicular to the roadway.
In one embodiment, the method further comprises identifying a section of the vehicle detected to be present in the detection zone using the individual complete trace, the section being one of a front, a side, a top and a rear of the vehicle, the identifying the section including comparing a height of the detection with a height threshold and comparing an intensity of the detection with an intensity threshold.
In one embodiment, the method further comprises determining a position of the section of the vehicle in the detection zone using at least one of the individual complete traces and the at least one initialization parameter.
In one embodiment, the method further comprises determining a current lane of the roadway in which the vehicle is present using the initialization parameter and the individual complete trace.
In one embodiment, obtaining the height and the intensity for the detection using the individual complete trace further comprises converting the detections into Cartesian coordinates.
In one embodiment, the method further comprises generating a profile of one of a side and a top of the vehicle using a plurality of the individual complete traces.
In one embodiment, the method further comprises determining a length of the vehicle using a plurality of the individual complete traces and the speed of the vehicle, the time delay and the initialization parameter.
In one embodiment, the method further comprises providing a second one of the multi-channel scannerless full-waveform lidar system, an optical window of the second full-waveform lidar system being oriented towards a surface of the roadway in order for the second system to cover a second detection zone, the second detection zone at least partly overlapping the detection zone, operation of the full-waveform lidar system and the second full-waveform lidar system being synchronized.
In one embodiment, the method further comprises providing a second one of the multi-channel scannerless full-waveform lidar system, an optical window of the second full-waveform lidar system being oriented towards a surface of the roadway in order for the second system to cover a second detection zone, operation of the full-waveform lidar system and the second full-waveform lidar system being synchronized, wherein the second system is provided at a lateral offset on the roadway with respect to the full-waveform lidar system; determining a speed of the vehicle using a delay between detection of the vehicle by the full-waveform lidar system and the second full-waveform lidar system and the initialization parameter.
In one embodiment, the method further comprises associating a type to the vehicle to classify the vehicle using the height.
In one embodiment, the method further comprises associating a type to the vehicle to classify the vehicle using at least one of the height and the length.
In one embodiment, the method further comprises associating a type to the vehicle to classify the vehicle using at least one of the height, the length and the pattern.
In one embodiment, the method further comprises associating a type to the vehicle to classify the vehicle using at least one of the height, the length, the pattern and the profile.
In one embodiment, the method further comprises generating a detection signal upon the detecting the presence.
In one embodiment, the detection signal controls at least one of a hardware trigger and a software trigger.
In one embodiment, the detection signal includes information about the detection.
In one embodiment, the method further comprises generating a recall signal to invalidate at least one of the hardware trigger and the software trigger.
In one embodiment, the method further comprises storing information about the detection.
In one embodiment, the method further comprises generating and storing statistical information.
In one embodiment, the method further comprises determining a direction of displacement of the vehicle using the displacement and identifying a wrong-way vehicle using the direction of displacement and the initialization parameter.
According to another broad aspect of the present invention, there is provided a method for detecting a vehicle comprising: providing a multi-channel scannerless full-waveform lidar system operating in pulsed Time-Of-Flight operation oriented towards a surface of the roadway to cover the detection zone; providing at least one initialization parameter; emitting pulses at an emission frequency; receiving reflections of the pulses from the detection zone; acquiring and digitalizing a series of individual complete traces at each channel of the system; identifying at least one detection in at least one of the traces; obtaining a height and an intensity for the detection; determining a nature of the detection to be one of an environmental particle detection, a candidate object detection and a roadway surface detection; if the nature of the detection is the candidate object detection, detecting a presence of a vehicle in the detection zone.
According to another broad aspect of the present invention, there is provided a method for detecting a vehicle located in a detection zone of a roadway. The method comprises providing a multiple-field-of-view scannerless LED full-waveform lidar system operating in pulsed Time-Of-Flight operation at a detection height and at a lateral distance from a side of the roadway; the method including emitting at a high repetition rate, receiving, acquiring and digitalizing a series of individual complete traces at each channel, in parallel; and detecting and identifying at least one of: for a vehicle, a presence, a position of the front, rear or middle, a profile of a side, a height, a number of axles, a length, a direction of movement, a displacement speed and a distance; and/or a number of detections of vehicles over time, a percentage of time during which a vehicle is detected, a position of a surface of the roadway or a visibility.
According to still another broad aspect of the present invention, there is provided a method for detecting a vehicle which includes providing a multi-channel scannerless full-waveform lidar system operating in pulsed Time-Of-Flight operation at a lateral distance from a side of the roadway, providing an initialization parameter, using the full-waveform lidar system, emitting pulses; receiving reflections from the detection zone; and acquiring and digitalizing a series of individual complete traces at each channel of the multi-channel system; identifying at least one detection in an individual complete trace; obtaining a height and an intensity for the detection using the individual complete trace; determining a nature of the detection to be one of an environmental particle detection, a candidate object detection and a roadway surface detection; if the nature of the detection is the candidate object detection, detecting a presence of a vehicle in the detection zone.
Throughout this specification, the term “vehicle” is intended to include any movable means of transportation for cargo, humans and animals, not necessarily restricted to ground transportation, including wheeled and unwheeled vehicles, such as, for example, a truck, a bus, a boat, a subway car, a train wagon, an aerial tramway car, a ski lift, a plane, a car, a motorcycle, a tricycle, a bicycle, a Segway™, a carriage, a wheelbarrow, a stroller, etc.
Throughout this specification, the term “environmental particle” is intended to include any particle detectable in the air or on the ground and which can be caused by an environmental, chemical or natural phenomenon or by human intervention. It includes fog, water, rain, liquid, dust, dirt, vapor, snow, smoke, gas, smog, pollution, black ice, hail, etc.
Throughout this specification, the term “object” is intended to include a moving object and a stationary object. For example, it can be a vehicle, an environmental particle, a person, a passenger, an animal, a gas, a liquid, a particle such as dust, a pavement, a wall, a post, a sidewalk, a ground surface, a tree, etc.
The accompanying drawings, which are included to provide a better understanding of the main aspects of the system and method and are incorporated in and constitute a part of this specification, illustrate different embodiments and together with the description serve to explain the principles of the system and method. The accompanying drawings are not intended to be drawn to scale. In the drawings:
Reference will now be made in detail to examples. The system and method may, however, be embodied in many different forms and should not be construed as limited to the example embodiments set forth in the following description.
An example mounting configuration of the traffic detection system 10 can be appreciated with reference to the accompanying drawings.
The mounting height 20 of the traffic detection system 10 is for example between 1 m and 8 m, with a lateral distance 22 from the nearest traffic lane 14 for example between 1 m and 6 m. The system can also be installed over the roadway, for example under the transversal beam of a gantry (not shown). The 3D detection zone would still have a longitudinal dimension which is perpendicular to the traffic direction under the gantry.
Another example embodiment of the system is shown in the accompanying drawings.
The system allows optically monitoring a region of a roadway by using a plurality of independent detection zones. The system then enables traffic detection for each individual lane while providing substantial flexibility in configuring the system, as illustrated by way of example in the drawings.
The traffic detection system 10 is referred to as being “active” because it radiates light having predetermined characteristics over the overall detection zone. The active nature of the system enables its operation around the clock and in numerous daytime/nighttime lighting conditions, while making it relatively immune to disturbances coming from parasitic light of various origins. The portion of the roadway that is lighted by the traffic detection system is outlined in the drawings.
As will be explained in further detail below, an image sensing device can be integrated in the traffic detection system to forward images to a remote operator and assist in performing a fine adjustment of the location of the overall detection zone of the system, as illustrated by way of example in the drawings.
In addition to the detection of vehicles present within a two-dimensional detection zone, the active nature of the traffic detection system provides an optical ranging capability that enables measurement of the instantaneous distances of the detected vehicles from the system. This optical ranging capability is implemented via the emission of light in the form of very brief pulses along with the recordal of the time it takes the pulses to travel from the system to the vehicle and then to return to the system. Those skilled in the art will readily recognize that the optical ranging is performed via the so-called Time-Of-Flight (TOF) principle, of widespread use in optical rangefinder devices. However, most optical rangefinders rely on analog peak detection of the light pulse signal reflected from a remote object followed by its comparison with a predetermined amplitude threshold level. In contrast, the traffic detection system numerically processes the signal waveform acquired for a certain period of time after the emission of a light pulse. The traffic detection system can then be categorized as a full-waveform LIDAR (LIght Detection And Ranging) instrument.
Because light travels at a rapid but nevertheless finite speed, the emission of a single pulse of light by the traffic detection system will result in the subsequent reception of a brief optical signal echo starting at the time t = 2L_MIN/c and having a duration Δt = 2(L_MAX − L_MIN)/c. In these expressions, c is the speed of light, namely 3×10⁸ m/s. For an example installation, the distance between the sensor and the objects to be detected is in the range of 2 m to 20 m. An optical signal echo from an object would start to be recorded after a time delay t ≈ 13 ns following the emission of the light pulse, and it would end at a time t + Δt ≈ 133 ns. Any vehicle present in a lane monitored by the traffic detection system would reflect the incoming light in a manner that differs substantially from the reflection of the light on a road pavement. The difference between the measurement of the distance of the road pavement and the measurement of the distance of any vehicle detected by the sensor during its presence in the detection zone is enough to produce a distinctive signal echo and a distinctive distance measurement on which the reliable detection of the vehicle by the system is based.
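The timing figures above follow directly from the time-of-flight relation. The short sketch below reproduces them under the same assumption of c = 3×10⁸ m/s; it is an illustration only, not part of the system firmware.

```python
# Time-of-flight echo timing for the example installation (2 m to 20 m range).
# Assumes c = 3e8 m/s as in the text; results are in nanoseconds.

C = 3.0e8  # speed of light, m/s

def echo_start_ns(l_min_m: float) -> float:
    """Delay before the echo begins: t = 2*L_MIN/c."""
    return 2.0 * l_min_m / C * 1e9

def echo_duration_ns(l_min_m: float, l_max_m: float) -> float:
    """Echo duration: dt = 2*(L_MAX - L_MIN)/c."""
    return 2.0 * (l_max_m - l_min_m) / C * 1e9

if __name__ == "__main__":
    t = echo_start_ns(2.0)              # ~13.3 ns
    dt = echo_duration_ns(2.0, 20.0)    # ~120 ns
    print(f"echo starts at ~{t:.1f} ns and ends at ~{t + dt:.1f} ns")
```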
The functionalities of the various components integrated in an example traffic detection system 10 can be better understood by referring to the functional block diagram shown in the drawings.
The Control and Processing Unit 48 has numerous functions in the operation of the traffic detection system, one of these being the calibration of the system. This calibration process can be done by connecting a remote computer to the Control and Processing Unit, the two communicating through the data interface and power supply module 50. During normal operation of the traffic detection system, data interface 50 also allows the Control and Processing Unit 48 to send data about the vehicles detected at the monitored intersection to an external controller for traffic management. The detection data outputted from the Control and Processing Unit 48 results from the numerical real-time processing of the voltage waveforms forwarded by the ORM and also includes data from the ISM. Several types of interface can be used to communicate with the external controller: Ethernet, RS-485, wireless link, etc. The data information can also be stored in memory and recovered later. The data interface 50 can also send electrical trigger signals to synchronize events, like the detection of the front or the rear of a vehicle, with other devices like an external camera or other traffic management controllers.
The data interface module 50 can also be useful to transmit images to an external system or network to allow a remote operator to monitor the traffic at the intersection. Video compression, for example H.264, can be done by a processor to limit the bandwidth required for the video transmission.
The system implements a processing of the signal waveforms generated by the plurality of optical detection channels. The primary objective of the waveform processing is to detect, within a prescribed minimum detection probability, the presence of vehicles in a lane that is mapped to a number of adjacent detection channels. Because of the usual optical reflection characteristics of the bodies of vehicles and of various constraints that limit the performances of the modules implemented in a traffic detection system, the optical return signals captured by the ORM are often plagued with an intense noise contribution that washes out faint signal echoes indicative of the presence of a vehicle. As a consequence, some of the first steps of the waveform processing are intended to enhance the Signal-to-Noise Ratio (SNR) of the useful signal echoes. Such filtering steps may start by numerically correlating the raw waveforms with a replica of a strong, clean signal echo that was previously captured or artificially generated. The waveforms processed this way get a smoother shape since a significant part of the high-frequency noise initially present in the raw waveforms has been eliminated.
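A minimal numerical sketch of this correlation (matched-filter) step is given below, assuming the numpy library; the Gaussian pulse replica, noise level and echo position are illustrative and do not reflect the actual waveform processing implemented in the system.

```python
# Sketch of the waveform-correlation (matched-filter) step described above.
# numpy is assumed available; all signal parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "clean replica" of a signal echo (Gaussian pulse), and a noisy raw waveform
# containing a weak echo of the same shape centered at sample index 180.
n_samples = 512
replica = np.exp(-0.5 * (np.arange(-15, 16) / 4.0) ** 2)

raw = 0.5 * rng.standard_normal(n_samples)        # high-frequency noise
raw[180 - 15:180 + 16] += 1.2 * replica           # weak echo partly buried in the noise

# Correlating the raw waveform with the replica smooths the trace and enhances the echo.
filtered = np.correlate(raw, replica, mode="same")

peak_index = int(np.argmax(filtered))
print(f"echo detected near sample {peak_index}")  # expected near 180
```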
In a second step of the processing, the SNR of the useful signal echoes present in the waveforms can be further enhanced by averaging a number of successively-acquired waveforms. The better SNRs obtained by standard signal averaging or accumulation are possible provided that the noise contributions present in the successive waveforms are independent of each other and fully uncorrelated. When this condition is satisfied, which is often the case after proper elimination of the fixed pattern noise contribution, it can be shown that the SNR of the waveforms increases by a factor of N^(1/2), where N is the number of averaged waveforms. Averaging 100 successive waveforms can then result in an order-of-magnitude SNR enhancement.
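The following sketch illustrates the N^(1/2) behaviour with synthetic data, assuming numpy; the waveform length, echo shape and noise level are arbitrary illustrative choices.

```python
# Illustration of the sqrt(N) SNR gain from averaging N successively acquired waveforms.
# Assumes independent, uncorrelated noise between acquisitions, as stated above.
import numpy as np

rng = np.random.default_rng(1)
n_samples, n_waveforms = 512, 100

signal = np.zeros(n_samples)
signal[200:231] = 1.0                              # a fixed (stationary) echo

# 100 acquisitions of the same echo, each with independent noise (sigma = 1).
acquisitions = signal + rng.standard_normal((n_waveforms, n_samples))
averaged = acquisitions.mean(axis=0)

noise_single = acquisitions[0, :150].std()         # noise-only region of one waveform
noise_avg = averaged[:150].std()                    # same region after averaging
print(f"noise reduced by ~{noise_single / noise_avg:.1f}x "
      f"(expected ~{np.sqrt(n_waveforms):.0f}x)")
```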
Another condition that can limit the number of waveforms to be averaged is the need for a stationary process which generates the useful signal echoes. In other words, the properties, such as the peak amplitude, shape and time/distance location, of the useful features present in the waveforms should ideally remain unchanged during the time period required to capture a complete set of waveforms that will be averaged. When attempting to detect vehicles that move rapidly, the signal echoes can drift more or less appreciably from waveform to waveform. Although this situation occurs frequently during operation of the traffic detection system, its detrimental impacts can be alleviated by designing the traffic detection system so that it radiates light pulses at a high repetition rate, for example in the tens or hundreds of kHz range. Such high repetition rates enable the capture of a very large number of waveforms during a time interval short enough to keep the optical echoes associated with a moving vehicle stationary. Detection information on each channel can then be updated, for example, from a few tens to a few hundreds of times per second. The high frame rate could be greater than 100 Hz, for example. For example, with a traffic detection system using a frame rate of 200 Hz, a car at 250 km/h would have moved forward by 35 cm between each frame.
In one example embodiment of the system, the waveform averaging is implemented in the form of mobile averaging, wherein the current average waveform is continuously updated by summing it with a newly-acquired waveform while rejecting from the average the waveform that was first acquired. Using a mobile average does not impact on the rate at which the output detection data is generated by the Control and Processing Unit. Moreover, a timely detection of a vehicle that appears suddenly in a lane can be enabled by resetting the mobile average when a newly-acquired waveform presents at least one feature that differs appreciably from the current average waveform.
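One possible way to sketch this mobile averaging with reset, assuming numpy, is shown below; the window size and the reset criterion are illustrative assumptions and not the criterion actually used by the Control and Processing Unit.

```python
# Sketch of the mobile (moving) waveform average with reset on sudden change.
# Window size and reset threshold are illustrative assumptions.
from collections import deque
import numpy as np

class MobileWaveformAverage:
    def __init__(self, window: int = 100, reset_threshold: float = 5.0):
        self.window = window
        self.reset_threshold = reset_threshold   # in units of current residual noise std
        self.buffer = deque(maxlen=window)       # oldest waveform dropped automatically

    def update(self, waveform: np.ndarray) -> np.ndarray:
        """Add a newly acquired waveform and return the updated mobile average."""
        if self.buffer:
            current = np.mean(self.buffer, axis=0)
            deviation = np.max(np.abs(waveform - current))
            noise = max(np.std(waveform - current), 1e-9)
            if deviation > self.reset_threshold * noise:
                # A feature differs appreciably from the current average:
                # reset so that a suddenly appearing vehicle is detected without lag.
                self.buffer.clear()
        self.buffer.append(waveform)
        return np.mean(self.buffer, axis=0)
```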
A method that allows a rapid and simple alignment step for the traffic detection system after it has been set in place is provided.
The intensity of the echo back signal is dependent on the condition of the road. A dry road has a higher intensity than a wet road. A road covered with black ice will have the lowest intensity due to the specular effect of the ice. Snow typically increases the intensity. The condition of the pavement can be monitored during installation and also during normal operation.
As will be readily understood, when the system is installed on the side of the roadway, the detection of the front of the vehicle is actually a detection of the side of the front of the vehicle and the detection of the rear of the vehicle is actually a detection of the side of the rear of the vehicle. The “middle” or “side” of the vehicle has a varying length depending on the type of vehicle circulating on the roadway. This region or section of the side of the vehicle is located between the front (the side of the front) and the rear (the side of the rear) of the vehicle and it includes the mathematical or geometrical center or middle of the side of the vehicle. However, because the side of the vehicle can have an extended length, it is possible that different detections of the side or middle of the vehicle will not include the actual mathematical or geometrical center or middle of the vehicle. Similarly, when the system is installed under a lateral beam of a gantry provided above the roadway, the front and rear sections of the vehicle are the top of the front and the top of the rear of the vehicle. Again, the “middle” or “top” of the vehicle has a varying length depending on the type of vehicle.
Most sensors such as video cameras, lidars or short-wave infrared imagers are not able to distinguish between a detection of the vehicle and a detection of the water behind the vehicle. Water splashing is caused by the accumulation of rain, an environmental particle, on the roadway. The accumulated rain is lifted by the tires of the moving vehicle and creates a splash behind the vehicle. An example water splash is shown in the drawings.
Then, the system waits for the detection of the front of a vehicle 310 by the 3D sensor. After detecting the front of a vehicle, the system takes a snapshot 320 with the image sensor. At pattern recognition 330, the system analyzes the image to find a predetermined pattern in the image and determines its position (x0, y0) in the image, and the distance if the pattern is in the FOV of the 3D sensor. The circular pattern of a wheel and a bright spot at night are good examples of patterns to be recognized. After pattern recognition, this pattern is tracked by taking at least one other snapshot 340 at a certain frame rate (fr) and determining each new position (xn, yn) of the pattern. At each iteration, the method analyzes whether the pattern is in the overlay of the 3D sensor and, if it is the case, sets the distance of the pattern based on the information from the individual channel in the 3D sensor fitting the position of the pattern. After at least two iterations, with at least one iteration in which the pattern has been recognized in the overlay to determine its distance, the position data from each iteration, together with the corresponding longitudinal distance measurements, are analyzed for speed measurement. Lateral displacement based on each position of the pattern detected can be determined and this information can be filtered, using a Kalman filter for example. The measurements of several positions, each memorized with a time stamp, are used to estimate the speed 350 of the vehicle.
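A simplified sketch of the speed estimate from the tracked positions is shown below; it substitutes a plain least-squares fit of longitudinal distance versus time for the filtering (e.g. Kalman filtering) mentioned above, and all numbers are illustrative.

```python
# Sketch of the speed estimate from tracked pattern positions, each memorized with a
# timestamp and a longitudinal distance obtained from the 3D sensor channels.
import numpy as np

def estimate_speed_mps(timestamps_s, distances_m) -> float:
    """Slope of a least-squares line fitted to (time, longitudinal distance) samples."""
    t = np.asarray(timestamps_s, dtype=float)
    d = np.asarray(distances_m, dtype=float)
    slope, _ = np.polyfit(t, d, 1)   # metres per second along the roadway
    return abs(slope)

if __name__ == "__main__":
    # Pattern (e.g. a wheel) tracked over 5 frames at fr = 50 Hz while the vehicle advances.
    times = [0.00, 0.02, 0.04, 0.06, 0.08]
    dists = [12.0, 11.5, 11.0, 10.5, 10.0]   # longitudinal distance, decreasing as it approaches
    v = estimate_speed_mps(times, dists)
    print(f"estimated speed: {v:.1f} m/s ({v * 3.6:.0f} km/h)")   # ~25 m/s, ~90 km/h
```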
The pattern recognition process which uses wheels as a pattern to be recognized in the image is as follows. The first snapshot is taken when the front of the vehicle enters the 3D detection zone shown in overlay. Any vehicle having a wheel on the ground relatively close to its front is detected by the 3D sensor. The Region of Interest (ROI) of the wheel can be defined considering the direction, the distance of the vehicle, the position of the ground and the channel(s) detecting the front of the vehicle. Wheel locations are limited to a region close to the road and relatively close to the front of the vehicle. Several techniques can be used to detect circular shapes. Sobel edge detection and the Hough transform, and their variations, are well-known pattern recognition techniques used to identify shapes such as straight lines and circles and can be used to recognize wheels. Once the circular shape of the wheel is detected, the center point can be determined. The sequence information from the tracking of the pattern confirms the direction of movement of the vehicle and can be used in a wrong-way driver detection and warning system.
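A minimal sketch of such a circle search, assuming the OpenCV library (cv2), is given below; the region-of-interest bounds and Hough parameters are illustrative assumptions that would in practice be derived from the direction, distance and ground position discussed above.

```python
# Sketch of wheel detection with a Hough circle transform inside a region of interest.
# Assumes OpenCV (cv2) and an 8-bit grayscale image; all parameters are illustrative.
import cv2
import numpy as np

def find_wheel_centers(image_gray: np.ndarray, roi) -> list:
    """Return (x, y, radius) of circular shapes found in the ROI, in full-image coordinates."""
    x0, y0, x1, y1 = roi
    patch = cv2.GaussianBlur(image_gray[y0:y1, x0:x1], (5, 5), 1.5)
    circles = cv2.HoughCircles(patch, cv2.HOUGH_GRADIENT, dp=1.2, minDist=40,
                               param1=100, param2=30, minRadius=15, maxRadius=80)
    if circles is None:
        return []
    return [(int(x) + x0, int(y) + y0, int(r)) for x, y, r in circles[0]]
```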
Near infrared imaging, using an IR illumination source (not shown), can be used. It allows using the same pattern during daytime and nighttime and can help reduce sensitivity to lighting conditions.
At night, a lighting module on the vehicle can be used as a pattern to be recognized and tracked. When the front of a vehicle is detected, at least one lighting module in that area can be clearly distinguished based on the intensity of the illumination and a group of pixels, or blob, based on an intensity level higher than a threshold can be found. This blob can be tracked in the same way as the wheel and speed measurement can be done.
In another example embodiment, the speed measurement is based on the detection of a retroreflector. A retroreflector has a surface which reflects incoming light towards the source of illumination with practically no scattering effect, as long as the angle of incidence is not too high, for example less than 45 degrees. When the traffic detection system has a retroreflector in its FOV, a very strong echo back signal is perceived by the Optical Receiver Module (ORM) and the amplitude of the signal is much higher than that from a Lambertian reflectance type surface, which reflects the incoming signal diffusely. In most countries, for any type of motor vehicle, the regulations require manufacturers to install retroreflectors on the sides of the vehicle, at least one on each front side and one on each rear side. When this retroreflector is in the FOV, a strong signal is acquired by the traffic detection system during the time the retroreflector is in the FOV of the ORM. Knowing the width of the FOV of the ORM in degrees (A), knowing the distance (D) of the retroreflector from the detector and knowing the time (T) that the retroreflector has spent in the FOV and generated a strong signal, the speed (S) of the vehicle can be estimated with the following equation:
S = 2·D·tan(A/2)/T
The system can also approximate the length (L) of the vehicle by storing a timestamp for the front side retroreflector (Tf) and storing another timestamp for the rear side retroreflector (Tr), using the following equation:
L = S·(Tr − Tf)
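The two equations above translate directly into code. The sketch below is only an illustration; the field-of-view width, distance, dwell time and timestamps are example values.

```python
# Direct translation of the retroreflector equations above:
#   S = 2*D*tan(A/2)/T   and   L = S*(Tr - Tf)
import math

def speed_from_retroreflector(fov_deg: float, distance_m: float, dwell_s: float) -> float:
    """Speed (m/s) from the time a retroreflector spends inside a FOV of width fov_deg."""
    return 2.0 * distance_m * math.tan(math.radians(fov_deg) / 2.0) / dwell_s

def vehicle_length(speed_mps: float, t_front_s: float, t_rear_s: float) -> float:
    """Length (m) from the front- and rear-retroreflector timestamps."""
    return speed_mps * (t_rear_s - t_front_s)

if __name__ == "__main__":
    s = speed_from_retroreflector(fov_deg=3.0, distance_m=10.0, dwell_s=0.02)
    l = vehicle_length(s, t_front_s=0.00, t_rear_s=0.18)
    print(f"speed ~{s:.1f} m/s ({s * 3.6:.0f} km/h), length ~{l:.1f} m")
```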
Usually, there are intermediate side retroreflectors for long vehicles, such as vehicles which are longer than 9.144 m (30 feet) for example. Because the system is adapted to detect the front, the middle and the end of the vehicle, it is possible to make an association between the front of the vehicle and the front retroreflector and the end of the vehicle with the rear retroreflector for length measurement, even in the context of a vehicle with an intermediate side retroreflector.
In another example embodiment, speed measurements can be made using two traffic detection systems. A configuration using two sensors per lane, one on each side of the lane, installed under a transversal beam of a gantry for example, is useful to detect and profile both sides of any vehicle. In that configuration, the detectors are synchronized to collect information on the shape of a vehicle. When the position of each sensor is known, the width and height can be determined. If two traffic detection systems are installed on opposite sides of the roadway with a lateral offset along the roadway, it is possible to detect the front of a vehicle with the first sensor and, within a short delay that is a function of the speed and the offset, the second sensor would also detect the front of the vehicle. Knowing the offset and measuring the delay between the two detections of the front of the vehicle, a speed estimate can be made. With an estimate of the speed, the length can also be estimated. The same method could be carried out with the back of the vehicle. The lateral offset between the two systems could be 1 m, for example.
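A sketch of this offset-based speed estimate is given below; the 1 m offset and 36 ms delay are illustrative example values.

```python
# Sketch of the two-sensor speed estimate: two synchronized systems with a known
# offset along the roadway each detect the vehicle front; the speed follows from
# the offset and the delay between the two detections.

def speed_from_offset(offset_m: float, t_first_s: float, t_second_s: float) -> float:
    """Speed (m/s) from the known sensor offset and the detection delay."""
    delay = t_second_s - t_first_s
    if delay <= 0:
        raise ValueError("second detection must occur after the first")
    return offset_m / delay

if __name__ == "__main__":
    v = speed_from_offset(1.0, t_first_s=10.000, t_second_s=10.036)   # 1 m offset, 36 ms delay
    print(f"speed ~{v:.1f} m/s ({v * 3.6:.0f} km/h)")                 # ~27.8 m/s, ~100 km/h
```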
Fusion of information can also be useful to improve classification, notably by counting the number of axles, and to distinguish several types of vehicles. In the United States, the Federal Highway Administration (FHWA) has defined a classification based on 13 categories of vehicles, from motorcycles to passenger cars, buses, two-axle-six-tire single-unit trucks, and up to seven-or-more-axle multi-trailer truck classes. Several alternative classification schemes are possible, and often the FHWA 13 classes are aggregated into 3 or 4 classes. The number of axles and the distance between each axle are key elements in an algorithm for robust classification. Information from the 3D sensor based on multi-channel TOF and from the image sensor with image processing analysis allows the traffic detection system 10 to be a very efficient device for the classification function. For example, to show the strength of this traffic detection sensor, based on the knowledge of the position of the ground and the distance of the side of the vehicle, the system can determine whether detected wheels are touching the ground or not. This information can be useful for classification purposes.
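As a purely illustrative sketch, the following toy function aggregates vehicles into a few broad classes from an axle count and a length; the class names and thresholds are assumptions made for illustration and are not the FHWA class definitions.

```python
# Toy aggregation of vehicles into a few broad classes from axle count and length.
# Class names and thresholds are illustrative assumptions only.

def classify(axle_count: int, length_m: float) -> str:
    if axle_count <= 2 and length_m < 6.0:
        return "light vehicle (car, motorcycle)"
    if axle_count <= 3 and length_m < 12.5:
        return "single-unit truck or bus"
    if axle_count <= 6:
        return "combination truck"
    return "multi-trailer truck (7 or more axles)"

if __name__ == "__main__":
    print(classify(2, 4.5))    # light vehicle
    print(classify(5, 16.0))   # combination truck
```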
For example, the sensor can be used to scan the road as shown in the drawings.
The system can also classify vehicles based on their profile when the traffic detection system is installed under a transversal beam of a gantry above the road, as shown in the drawings.
For some applications, the system has to detect and send information rapidly. The best way to synchronize the sensor with an external system when a detection event occurs is by using a hardware trigger, for example to have an external camera take a snapshot. The hardware trigger could include relay units, solid-state relay units, differential lines, etc. Additional information related to this hardware trigger can be sent over the interface, for example details of the event such as the position of the detected object. In some cases, information sent by the sensor can be used to recall or cancel a hardware trigger. This can happen when the detection system needs to react very rapidly but, afterwards, the analysis module determines that the detection was a false alarm.
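A schematic sketch of this trigger-and-recall behaviour is shown below; the callable interfaces and message fields are illustrative assumptions, not the actual interface of the system.

```python
# Sketch of the trigger/recall behaviour described above: a hardware trigger is raised
# immediately on detection so an external camera can take a snapshot, and a recall
# message can later invalidate it if the analysis concludes it was a false alarm.

class TriggerManager:
    def __init__(self, raise_trigger, send_message):
        self._raise_trigger = raise_trigger    # e.g. drives a relay or solid-state output
        self._send_message = send_message      # e.g. sends event details over Ethernet/RS-485
        self._next_id = 0

    def on_detection(self, position_m):
        """React immediately: raise the hardware trigger and send the event details."""
        event_id = self._next_id
        self._next_id += 1
        self._raise_trigger()
        self._send_message({"event": event_id, "position_m": position_m})
        return event_id

    def recall(self, event_id):
        """Later analysis concluded a false alarm: invalidate the earlier trigger."""
        self._send_message({"event": event_id, "recall": True})

if __name__ == "__main__":
    mgr = TriggerManager(raise_trigger=lambda: print("TRIGGER"), send_message=print)
    evt = mgr.on_detection(position_m=12.3)
    mgr.recall(evt)   # the analysis module later flags the detection as a false alarm
```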
This application is a national phase entry of PCT Application No. PCT/IB2012/053045, entitled “SYSTEM AND METHOD FOR TRAFFIC SIDE DETECTION AND CHARACTERIZATION” filed on Jun. 15, 2012; which in turn claims priority under 35 USC §119(e) of U.S. provisional patent application 61/498,083 filed Jun. 17, 2011, the specifications of which are hereby incorporated by reference herein.