A portion of the disclosure of this patent document contains material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure as it appears in the United States Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever. The following notice applies to the pseudo code described herein, inclusive of the drawing figures where applicable: Copyright © 2011, Laser Technology, Inc.
The present invention relates, in general, to the field of traffic monitoring and enforcement systems. More particularly, the present invention relates to an intelligent laser tracking system and method for mobile and fixed position traffic monitoring and enforcement applications.
Police have been using radar and laser speed measurement devices to determine vehicle speed in traffic enforcement operations for many years. Radar based devices generally function such that a microwave signal is emitted toward a moving vehicle and a reflection from the target is returned to the device, which then uses the Doppler shift in the return signal to determine the vehicle's speed. Radar based devices have an advantage over laser based speed guns in that they emit a very broad cone of energy and do not, therefore, require precise aiming at the target vehicle. As such, they are well suited for fixed and mobile applications while requiring little, if any, manual operator aiming of the device.
On the other hand, laser based speed guns emit a series of short pulses comprising a very narrow beam of monochromatic laser energy and then measure the flight time of the pulses from the device to the target vehicle and back. These laser pulses travel at the speed of light, which is on the order of 984,000,000 ft/sec, or approximately 30 cm/nsec. Laser based devices very accurately determine the time from when a particular pulse was emitted until the reflection of that pulse is returned from the target vehicle, then multiply one half of that round-trip time by the speed of light to determine the distance to the vehicle. By emitting a series of pulses and determining the change in distance between samples, the speed of the vehicle can be determined very quickly and with great accuracy.
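The time-of-flight computation described above can be sketched as follows; the function names, sample distances and sample interval are illustrative, not taken from the specification:

```python
# Sketch of laser time-of-flight ranging and speed estimation.
# Names and numeric examples here are illustrative assumptions.

C_FT_PER_SEC = 984_000_000.0  # speed of light, ~984,000,000 ft/sec

def distance_ft(round_trip_sec: float) -> float:
    """Multiply one half of the round-trip flight time by the speed of light."""
    return C_FT_PER_SEC * round_trip_sec / 2.0

def speed_ft_per_sec(d_start_ft: float, d_end_ft: float, dt_sec: float) -> float:
    """Speed from the change in distance between two pulse samples;
    positive for an approaching target."""
    return (d_start_ft - d_end_ft) / dt_sec

# A target 500 ft away returns a pulse in roughly a microsecond;
# at 200 samples/sec, successive distance samples are 5 msec apart.
round_trip = 2 * 500.0 / C_FT_PER_SEC
```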
Because of the narrow beam width of laser based speed guns, they have heretofore been predominantly relegated to hand held units which must be manually aimed at a specific target vehicle. That being the case, they have not been able to be employed in autonomous applications wherein an operator is not manually aiming the device. Further, in mobile applications wherein the officer may be driving a vehicle himself, he is then unable to divert his attention from that function in order to track and aim a laser based speed measurement device at a suspected speeder let alone track multiple targets.
In fixed and semi-fixed uses of laser based speed detection devices, such as overpass mounted applications, it is important that the laser pulses be directed to a single point on an approaching target vehicle inasmuch as the frontal surface angles can vary between, for example, that of the grille (θ1) and the windshield (θ2). Where the distance to the target vehicle as measured by the laser based device is a distance M at an angle φ and the true distance to the target is D, D is then equal to M*(COS φ+SIN φ/TAN(θ1 or θ2)).
Thus, the true distance D can vary, and hence the calculated speed of the target vehicle. Normally, the angle φ is less than 10° and COS φ is then almost 1. This can reduce the calculated speed of the target vehicle, in effect giving a 1% to 2% detected speed advantage to the target vehicle as indicated below with respect to the “cosine effect”. However, the cosine effect can be minimized if an accurate tracking trajectory is maintained. On the other hand, it should be noted that the value of SIN φ/TAN(θ1 or θ2) can be greater than a normally acceptable error margin (e.g., 0.025 (2.5%)) and an even larger error can be encountered if the laser pulses are not consistently aimed at a single point on the target vehicle. As used herein, the SIN φ/TAN(θ1 or θ2) portion of the equation is referred to as a geometric error.
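The distance relationship and geometric error term described above can be expressed directly; the function names and sample angles below are illustrative assumptions:

```python
import math

def true_distance(measured_m: float, phi_deg: float, theta_deg: float) -> float:
    """D = M * (cos(phi) + sin(phi)/tan(theta)), where theta is the
    inclination of the reflecting surface (grille or windshield)."""
    phi, theta = math.radians(phi_deg), math.radians(theta_deg)
    return measured_m * (math.cos(phi) + math.sin(phi) / math.tan(theta))

def geometric_error(phi_deg: float, theta_deg: float) -> float:
    """The SIN(phi)/TAN(theta) term as a fraction of the measured distance."""
    return math.sin(math.radians(phi_deg)) / math.tan(math.radians(theta_deg))

# For example, even with phi only 2 degrees and a surface inclined at
# 45 degrees, the geometric error exceeds the 2.5% margin noted above:
# geometric_error(2.0, 45.0) is approximately 0.035
```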
Both radar and laser based speed measurement devices can be used to measure the relative speed of approaching and receding vehicles from both fixed and mobile platforms. If the target vehicle is traveling directly (i.e. on a collision course) toward the device, the relative speed detected is the actual speed of the target. However, as is most frequently the case, if the vehicle is not traveling directly toward (or away from) the device but at an angle (α), the relative speed of the target with respect to that determined by the device will be slightly lower than its actual speed. This phenomenon is known as the previously mentioned cosine effect because the measured speed is directly related to the cosine of the angle between the speed detection device and the vehicle direction of travel. The greater the angle, the greater the speed error and the lower the measured speed. On the other hand, the closer the angle (α) is to 0°, the closer the measured speed is to actual target vehicle speed.
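The cosine effect described above reduces to a single multiplication; the speeds and angle below are illustrative, not taken from the specification:

```python
import math

def measured_speed(actual_speed: float, alpha_deg: float) -> float:
    """The device measures only the component of the target's velocity
    along its line of sight: measured = actual * cos(alpha)."""
    return actual_speed * math.cos(math.radians(alpha_deg))

# A vehicle at 70 mph crossing at a 10-degree angle reads slightly low:
# measured_speed(70.0, 10.0) is about 68.9 mph, a ~1.5% advantage
# to the target, while at alpha = 0 the measured speed is exact.
```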
The present invention advantageously provides an intelligent laser tracking system and method for mobile and fixed position traffic monitoring and enforcement applications. The system disclosed herein can autonomously track multiple target vehicles with a highly accurate laser based speed measurement system or, under manual control via a touch screen, select a particular target vehicle of interest.
The system of the present invention provides extremely accurate tracking of target vehicles using a novel and extremely fast pan/tilt mechanism which is stabilized through the use of an onboard gyro and inclinometer. The pan/tilt mechanism utilizes respective pan and tilt brushless DC (BLDC) motors which provide high torque and efficiency. The relatively heavy motors are mounted to the pan/tilt mechanism base plate to minimize inertia and lower the mass of the moving pan and tilt plates to which the laser rangefinder of the high performance laser speed measurement subsystem and the visual sensor subsystem are affixed.
In a mobile implementation of the present invention, the police vehicle in which the system is mounted has its own speed uploaded to the system via the vehicle's onboard diagnostic (OBD II) controller area network (CAN) port. Increased accuracy of this information is assured through updating of the police vehicle's speed through appropriate application of a global positioning system (GPS) subsystem to correct speed data for tire wear and pressure. Conveniently, the system of the present invention can be mounted within a standard police vehicle light bar enclosure or in other locations to provide both a forward and rearward view of traffic.
The intelligent laser tracking system of the present invention also assures that the laser is consistently aimed at a single specific point on the target vehicle to obviate geometric errors. Moreover, the system and method of the present invention can accurately compensate for the cosine effect when the target vehicle is moving at an angle with respect to the system.
In addition to mobile embodiments of the present invention for use in a police vehicle, the system of the present invention can also be mounted on a tripod or other fixture in a fixed or stationary location adjacent one or more lanes of vehicle traffic while still providing accurate targeting of multiple target vehicle speeds, distances and angles.
The image sensors of the present invention provide both wide and narrow views of target vehicles simultaneously as well as motion clips for evidentiary purposes and substantiation of vehicle speed. In a representative embodiment disclosed herein, the narrow view and wide view images can be obtained using dual sensors, lenses and an associated multiplexer. A dual multiplexed camera system is capable of achieving a fast transition between the narrow and wide views. Optionally, if a single lens system is implemented, lens control of the system camera can be provided for zoom, iris and focus functions. Remote monitoring of the system is possible through an input/output (I/O) interface such as Ethernet, WiFi, serial interfaces such as RS232/485, universal serial bus (USB) and the like. The image sensors employed in the system can be remote or fully integrated, and remote monitoring functionality is also provided.
In addition to the aforementioned uses of the system of the present invention for target vehicle speed monitoring, the system can also be used to augment roadside police officer safety in such applications as construction zone and area scanning for collision avoidance and the like. Moreover, the system of the present invention can also be employed as a low cost three dimensional (3D) scanner for pile volume calculation, jetway positioning for aircraft, accident reconstruction and other applications.
Particularly disclosed herein is a tracking system comprising a processor, a visual sensor subsystem coupled to the processor and a laser speed measurement subsystem also coupled to the processor. A pan/tilt subsystem is coupled to the processor and movably supports the visual sensor and laser speed measurement subsystems.
Also particularly disclosed herein is a system for monitoring the speed of one or more target vehicles comprising a processor, a laser speed measurement subsystem coupled to the processor and a visual sensor subsystem coupled to the processor. A pan/tilt subsystem is also coupled to the processor and is operative to autonomously track one or more of the target vehicles based on input from the visual sensor subsystem. The system determines the speed of the one or more target vehicles based on input from the laser speed measurement subsystem.
The aforementioned and other features and objects of the present invention and the manner of attaining them will become more apparent and the invention itself will be best understood by reference to the following description of a preferred embodiment taken in conjunction with the accompanying drawings, wherein:
With reference now to
A visual sensor subsystem 104 is bidirectionally coupled to the MPU 102 by one or more image buses as illustrated to which an intelligent pan/tilt subsystem 106 is also bidirectionally coupled. The visual sensor subsystem 104 may be made physically detachable from the rest of the unit if desired. A high performance laser speed measurement subsystem 108 is also bidirectionally coupled to the MPU 102 to provide distance and speed measurement data between the system 100 and a target vehicle 128.
An on-board diagnostic II (OBD II)/controller area network (CAN) interface 110 to a vehicle diagnostic port (e.g. in a police vehicle 130) is also coupled to the MPU 102, as is a touch screen 112 for operator viewing and input. The touch screen 112 may also be made detachable from the rest of the unit if desired. A global positioning system (GPS) subsystem 116 also provides input to the MPU 102 while an input/output (I/O) interface 118, such as an Ethernet port, WiFi, serial port (e.g. RS232/485), universal serial bus (USB) or other interface, couples external devices to the system 100 through the MPU 102.
Back-up storage for the system 100 may be provided by means of a storage device 120 such as an SD card or similar non-volatile storage devices whether removable or otherwise. The system 100 is powered through a power submodule 122 which may comprise the operating vehicle electrical system in a mobile embodiment of the present invention, an external power supply (e.g. an automobile battery or generator) 124 and/or a battery back-up system to prevent data loss such as a 7.2 volt lithium ion (Li-Ion) battery 126.
The visual sensor subsystem 104 comprises, in a representative embodiment of the present invention, a 5.0 megapixel image sensor functioning as a wide view camera 140 and another 5.0 megapixel image sensor functioning as a narrow view camera 142. These two sensors are coupled to the input of a low-voltage differential signaling (LVDS) interface and multiplexer 144 functioning as a data serializer which, in turn, is coupled over a two-wire connection to an LVDS interface deserializer 148 for the wide view and narrow view sensors 140, 142 functioning as remote camera devices. In order to toggle between the narrow and wide views, the remote camera block (140 and 142) has an associated multiplexer to select one camera input at a time. An onboard camera 146 is also coupled to the MPU 102 which, in a representative embodiment, may comprise a 5.0 megapixel complementary metal oxide semiconductor (CMOS) image sensor.
The intelligent pan/tilt subsystem 106 comprises, in pertinent part, a bidirectional bus 150 to which a pair of position sensors 152 and 154 are coupled in addition to a gyro 160 and inclinometer 162. It should be noted that, as used herein, the function of the inclinometer 162 can also be performed by, for example, an accelerometer. The position sensors 152 and 154 are respectively associated with the pan motor 156 and tilt motor 158 of the intelligent pan/tilt subsystem 106. The operation and functional elements of the intelligent pan/tilt subsystem 106 will be more fully described hereinafter.
With reference additionally now to
At this point, the distance between the system 100 (for example, as mounted in a police vehicle 130) and a target vehicle 128 is determined at step 206 by the high performance laser speed measurement subsystem 108. In a preferred embodiment, the laser speed measurement subsystem 108 may comprise a TruSense™ S200 laser sensor available from Laser Technology, Inc., assignee of the present invention, which provides up to 200 distance measurements per second. The distance information provided by the laser speed measurement subsystem 108 may be utilized to augment the visual sensor subsystem 104 and to resolve any ambiguities that might arise due to an inability to distinguish, for example, a dark colored license plate from shading due to poor lighting conditions.
At step 208, the motion of the target vehicle 128 with respect to the system 100 is determined in Cartesian coordinates (x,y) on an image plane. This may be effectuated in the following manner:
1. An image (240×180 pixels) of the target vehicle 128 is grabbed by the CMOS image sensor of either the onboard camera 146 or the remote cameras 140 or 142;
2. Features of the image are extracted. This may be effectuated through the use of optical flow, in which the direction of movement of each pixel from one image to the next is determined, or through the use of edges (such as a Sobel operator), as described, for example, in the Wikipedia entries on optical flow and edge detection.
3. The extracted features are segmented to produce an object. This may be effectuated by the grouping of pixels which have a similar direction or fuzzy logic and/or a neural network may be employed for segmenting the pixels.
4. The center of mass of each object is tracked and estimated. This can be accomplished through the use of a Kalman filter, as described, for example, in the Wikipedia entry on Kalman filters; and
5. The estimated position (x,y) can be used for the target motion (x,y).
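Steps 4 and 5 above can be sketched with a constant-velocity Kalman filter on the image plane; the frame interval, noise covariances and class interface below are illustrative assumptions, not taken from the specification:

```python
import numpy as np

# Sketch: tracking an object's center of mass (x, y) on the image plane
# with a constant-velocity Kalman filter. Matrix values are assumptions.

class CenterOfMassTracker:
    def __init__(self, x0: float, y0: float, dt: float = 1.0 / 30.0):
        # State vector: [x, y, vx, vy]
        self.s = np.array([x0, y0, 0.0, 0.0])
        self.P = np.eye(4) * 100.0          # large initial uncertainty
        self.F = np.array([[1, 0, dt, 0],   # constant-velocity motion model
                           [0, 1, 0, dt],
                           [0, 0, 1,  0],
                           [0, 0, 0,  1]], dtype=float)
        self.H = np.array([[1, 0, 0, 0],    # we observe position only
                           [0, 1, 0, 0]], dtype=float)
        self.Q = np.eye(4) * 0.01           # process noise (assumed)
        self.R = np.eye(2) * 1.0            # measurement noise (assumed)

    def update(self, zx: float, zy: float):
        # Predict forward one frame
        self.s = self.F @ self.s
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Correct with the measured center of mass (zx, zy)
        y = np.array([zx, zy]) - self.H @ self.s
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.s = self.s + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.s[0], self.s[1]         # estimated (x, y)
```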
At step 210, the shock and vibration experienced by the system 100 due to the possible motion of the police vehicle 130 is determined such that it can be filtered out. In this regard, the outputs of the gyro 160 and inclinometer 162 are sampled on the order of every millisecond or less. In a representative embodiment of the present invention, 2047 samples/second are taken of the inclinometer 162 and 1000 samples/second of the gyro 160. As these devices tend to generate a great deal of noise, this must be filtered out. However, since relatively strong filters would lead to a slower signal response time the representative embodiment of the system 100 of the present invention implements a dual-stage adaptive low pass filter wherein:
For all measured data x[i], i=0 to n.
y1[i]=y1[i−1]+k1*(x[i]−y1[i−1])
y2[i]=y2[i−1]+k2*(x[i]−y2[i−1]),
where k1 and k2 are coefficients of the low pass filters.
y[i]=y1[i] if the difference between y1[i] and y2[i] is greater than a threshold; otherwise, y[i]=y2[i].
y[i] thus provides the very stable output of the strong low pass filter y2[i] as well as the much faster response time of the weaker low pass filter y1[i].
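A minimal sketch of the dual-stage adaptive low pass filter just described, assuming the conventional exponential form y[i] = y[i−1] + k*(x[i] − y[i−1]); the coefficients and threshold are illustrative assumptions:

```python
# Dual-stage adaptive low pass filter sketch. k1 drives the weak (fast)
# filter, k2 the strong (slow) filter; the threshold decides which output
# to pass. All constants here are assumptions, not from the specification.

def dual_stage_filter(samples, k1=0.5, k2=0.05, threshold=1.0):
    y1 = y2 = samples[0]
    out = []
    for x in samples[1:]:
        y1 = y1 + k1 * (x - y1)  # weak filter: fast response, noisier
        y2 = y2 + k2 * (x - y2)  # strong filter: very stable, slower
        # A large disagreement means the signal is genuinely moving,
        # so follow the fast filter; otherwise emit the stable one.
        out.append(y1 if abs(y1 - y2) > threshold else y2)
    return out
```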
At step 212, the information calculated in steps 206, 208 and 210 is used to calculate new motor positions for the pan motor 156 and tilt motor 158 of the pan/tilt subsystem 106 in conjunction with the positions of these brushless DC (BLDC) motors as obtained from an associated optical encoder or Hall sensors at step 214. Thereafter, at step 216, the pan motor 156 and tilt motor 158 are appropriately controlled.
At step 218, the speed of the target vehicle 128 is determined by the laser speed measurement subsystem 108 while at step 220 the speed of the system 100, as mounted in a police vehicle 130, is determined from its controller area network (CAN) interface to the vehicle's OBD II port. Inputs into this determination can be obtained from the GPS subsystem 116 at step 222 to provide correction for the police vehicle's tire pressure, wheel diameter and the like which might otherwise affect this calculation. It should be noted that GPS is usually very accurate if a vehicle is traveling at a constant speed and is otherwise less reliable. In the representative embodiment of the system 100 disclosed herein, the system 100 monitors the vehicle's speed primarily through the OBD II port and, when this indicates a stable speed, tire condition is calibrated more accurately in conjunction with the GPS subsystem 116 data.
At step 224, a stationary target based calibration for the police vehicle 130 tire pressure and wheel diameters may be performed by aiming the system 100 at a stationary target such as a road sign or land feature. As the speed of such an object is zero, the system 100 can then calibrate tire condition. Utilizing the information and data computed previously, the system 100 then determines whether the target vehicle 128 speed is greater than the posted speed limit at decision step 226. If the speed of the target vehicle 128 is excessive, all previously measured data is saved in conjunction with evidentiary data such as still images and a motion video clip as recorded by the visual sensor subsystem 104 at step 228. In operation, the system 100 has determined the relative speed between the police vehicle 130 and the target vehicle 128 as well as the absolute speed of the system 100 itself as calibrated in conjunction with the GPS subsystem 116 (step 222) and/or stationary target evaluation (step 224). In a representative embodiment of the present invention, the system 100 may store two still images of the target vehicle 128, a wide view (e.g. on the order of 10 to 30 degrees to include contextual background information) and a narrow view (e.g. on the order of 5 to 20 degrees to include more detail of the target vehicle 128). A particular implementation of the present invention utilizes 100 mm and 30 mm focal length lenses in this regard. The motion clip can be saved from either the wide view or narrow view images and then stored to the storage device 120 which may be an SD card or the like or otherwise stored through the I/O interface 118 to a network through Ethernet or to an associated USB device. The captured still image may also be processed at step 228 by a number plate recognition system and its license number also stored with the other data.
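The stationary-target calibration of step 224 can be sketched as a simple correction factor; the function names and numeric example are hypothetical, not taken from the specification:

```python
# Sketch of stationary-target speed calibration. When the laser ranges a
# target of known zero speed (e.g. a road sign), the measured closing
# speed should equal the platform speed; the ratio corrects the OBD II
# reading for tire wear and pressure effects on effective wheel diameter.
# Interfaces and values here are illustrative assumptions.

def calibrate_speed_factor(obd_speed: float, laser_closing_speed: float) -> float:
    """Ratio of laser-measured closing speed to the OBD II speed reading."""
    if obd_speed <= 0.0:
        raise ValueError("calibration requires a moving platform")
    return laser_closing_speed / obd_speed

def corrected_speed(obd_speed: float, factor: float) -> float:
    """Apply a previously computed calibration factor to an OBD II reading."""
    return obd_speed * factor

# E.g., if the OBD II port reports 60 mph but the laser measures 58.8 mph
# of closing speed toward a road sign, the factor is 0.98.
```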
At step 230, current information regarding the target vehicle 128 being tracked and information derived from the visual sensor subsystem 104 is displayed on the touch screen 112 whereupon the operator of the system 100 in the police vehicle 130 can direct certain system 100 functions. At decision step 232, if the operator determines to provide input to the process 200, such input can be provided at step 234. If the process 200 is to stop at decision step 236, then it reaches an end. Otherwise, the process 200 returns to the operations of steps 206, 208 and 210 as previously described. Alternatively, if the system 100 is to remain in automatic mode, then a new position is calculated for target vehicle 128 tracking at step 238 whereupon decision step 236 is again reached.
With reference additionally now to
With reference additionally now to
The tilt plate 300 is pivotally mounted to the panning plate 302 to provide elevational motion for the visual sensor subsystem 104 and laser speed measurement subsystem 108. The panning plate 302 provides rotational motion for the same system 100 subsystems. A worm 304 driven by the tilt motor 158 in turn drives a worm gear 306 to drive a tilt shaft/pinion rotatably held by upper and lower tilt bearings 310, 312. The tilt shaft/pinion then drives a tilt gear 314 to pivotally provide up and down elevational motion to the tilt plate 300.
With reference additionally now to
The design of the intelligent pan/tilt subsystem 106 of the present invention minimizes the inertia of the system 100 by placing the heavier mass of the pan and tilt motors 156, 158 on a fixed base plate and not on any of the moving parts. The design of this aspect of the present invention provides a particularly efficacious and low-cost solution.
With reference additionally now to
With reference additionally now to
With reference additionally now to
With reference additionally now to
With reference additionally now to
With reference additionally now to
While there have been described above the principles of the present invention in conjunction with specific circuitry and structure, it is to be clearly understood that the foregoing description is made only by way of example and not as a limitation to the scope of the invention. Particularly, it is recognized that the teachings of the foregoing disclosure will suggest other modifications to those persons skilled in the relevant art. Such modifications may involve other features which are already known per se and which may be used instead of or in addition to features already described herein. Although claims have been formulated in this application to particular combinations of features, it should be understood that the scope of the disclosure herein also includes any novel feature or any novel combination of features disclosed either explicitly or implicitly or any generalization or modification thereof which would be apparent to persons skilled in the relevant art, whether or not such relates to the same invention as presently claimed in any claim and whether or not it mitigates any or all of the same technical problems as confronted by the present invention. The applicants hereby reserve the right to formulate new claims to such features and/or combinations of such features during the prosecution of the present application or of any further application derived therefrom.
As used herein, the terms “comprises”, “comprising”, or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a recitation of certain elements does not necessarily include only those elements but may include other elements not expressly recited or inherent to such process, method, article or apparatus. None of the description in the present application should be read as implying that any particular element, step, or function is an essential element which must be included in the claim scope and THE SCOPE OF THE PATENTED SUBJECT MATTER IS DEFINED ONLY BY THE CLAIMS AS ALLOWED. Moreover, none of the appended claims are intended to invoke paragraph six of 35 U.S.C. Sect. 112 unless the exact phrase “means for” is employed and is followed by a participle.
Number | Name | Date | Kind |
---|---|---|---|
4815757 | Hamilton | Mar 1989 | A |
5528246 | Henderson et al. | Jun 1996 | A |
5680123 | Lee | Oct 1997 | A |
5902351 | Streit et al. | May 1999 | A |
5948038 | Daly et al. | Sep 1999 | A |
6639998 | Lee et al. | Oct 2003 | B1 |
7149325 | Pavlidis et al. | Dec 2006 | B2 |
7262790 | Bakewell | Aug 2007 | B2 |
7527439 | Dumm | May 2009 | B1 |
20030214585 | Bakewell | Nov 2003 | A1 |
20070050139 | Sidman | Mar 2007 | A1 |
20090079960 | Chung | Mar 2009 | A1 |
20110025862 | Lindsay | Feb 2011 | A1 |
Number | Date | Country |
---|---|---|
1020000038131 | Jul 2000 | KR |
1020050048961 | May 2005 | KR |
1020070121098 | Mar 2008 | KR |
Entry |
---|
PCT Notification of Transmittal of the International Search Report and The Written Opinion of the International Searching Authority, or the Declaration, PCT/US2012/054234, mailing date Feb. 27, 2013, pp. 11. |
Number | Date | Country | |
---|---|---|---|
20130066542 A1 | Mar 2013 | US |