Next-generation automotive systems such as Lane Departure Warning (LDW), Collision Avoidance (CA), Blind Spot Detection (BSD), and Adaptive Cruise Control (ACC) will require target information from multiple sensors, including a new class of sensors such as radar, image, or laser sensors similar to those found on advanced tactical fighter aircraft. For example, one sensor may be located on the front bumper of the vehicle and obtain range and azimuth information about vehicles and stationary objects in front of the vehicle. Another sensor may be located on the dash of the vehicle and obtain image information about the same vehicles and objects. A third sensor may be located on the side of the vehicle and obtain either range and azimuth data or image data used to determine velocity and track information for vehicles that pass the host vehicle. These new systems must take all of the information from the multiple sensors on the vehicle and compute an accurate picture of the moving objects around the vehicle, known as the kinematic state of the targets, or Situation Awareness (SA). To do this, the Situation Awareness Platform (SAP) must accurately align the sensors to each other so that information about a target from one sensor can be combined with information about the same target from a different sensor. This is called Sensor Fusion (SF), and it is necessary for the SAP to compute an optimal kinematic state of the targets around the vehicle in order to assess threats. The sensors must also be aligned to the body of the vehicle so that the SAP can determine the position and velocity of each target with respect to the vehicle; this is called Navigation Fusion (NF).
One method of aligning the sensors to each other and to the vehicle is to use mechanical and optical instruments, such as auto-collimators and laser boresight tools, during production of the vehicle. This technique is not only costly, but would also have to be repeated whenever a sensor is repaired or replaced after production in order to assure that the safety-critical systems report accurately. In addition, as the vehicle goes through normal wear and tear, the sensors will gradually become misaligned, and this may not be noticed by the operator. The data from the sensors would then fail to correlate with each other and with the vehicle reference frame until the sensors were realigned. Again, this would be costly to the vehicle operator, and until the realignment was performed the SAP might not provide accurate data. Therefore, a method is required to align the sensors to each other and to the vehicle without the use of sophisticated optical tools. This disclosure addresses the problem by describing methods that align the sensors to each other and to the vehicle without external alignment equipment.
In a discussion of prior art, U.S. Pat. No. 5,245,909, Automatic Sensor Alignment, relates to systems for maintaining alignment-sensitive aircraft-borne avionics and weapons sensors in precise alignment, and to methods for precisely aligning sensitive avionics and weapons-system instrumentation that is subject to vibration-induced misalignment. In contrast, this disclosure relates to methods and systems that support advanced automotive systems not described in the prior art. A second key difference is the reliance on sensor data from the vehicle itself as part of the alignment method. Another difference is the use of image sensors with elements of the vehicle in the imager's field of view, employing optical methods to determine changes in alignment with respect to the vehicle and the vehicle reference frame, and then applying a compensation based on the measured misalignment angle. Finally, the system described herein does not rely on boresighting or on aligning any one sensor to establish the vehicle reference frame.
U.S. Pat. No. 6,202,027, Automatic Curve Sensor Calibration, describes an improved system for accurately determining the travel path of a host vehicle and the azimuth angle of a target vehicle through an automatic calibration that detects and compensates for misalignment and curve-sensor drift. The difference is that patent's reliance on observed objects, track-file generation, and subsequent changes to the track files over time, whereas this disclosure teaches methods of alignment based on force vectors, rotational rates, or optically measured changes with respect to the vehicle reference frame. Essentially, all observed objects are compensated for the misalignment error on the observing vehicle.
U.S. Pat. No. 5,031,330, Electronic Boresight, teaches that pairs of level-sensing devices can be used in a method that aligns plane surfaces to one another by tilting platforms by the amount of measured misalignment to adjust the sensor azimuth. In contrast, this disclosure teaches sensors that are rigidly mounted to the vehicle, with misalignment corrected by compensation values observed with respect to the vehicle reference frame.
A vehicle sensor system configured to gather sensory data 360 degrees around the vehicle comprises sensors for gathering data such as: range (e.g., ultrasonic); range and azimuth (e.g., laser and/or radar); and images (e.g., optical and/or thermal). The vehicle has sensors that align to and establish a vehicle reference frame by measuring body yaw, pitch, and roll rates as well as acceleration along the three axes of the vehicle. Imaging sensors that have a clear view of body mold lines, such as the hood or rear deck, align themselves to the vehicle reference frame optically; sensors that cannot align using optical methods are aligned using accelerometers and rate sensors, reading the inertial acceleration or angular rotation to align themselves to each other and to the vehicle. An Integrated Computing Platform (ICP) hosts the SAP software, which maintains complete system alignment by determining differences in alignment and applying or updating compensation values with respect to the vehicle body coordinates, resulting in a dynamically boresighted system.
One method is to attach three-axis accelerometers to each sensor and to the vehicle and use gravity and the acceleration of the vehicle, both of which are sensed by the accelerometers, to align the sensor axes to each other and to the vehicle. Information available on the vehicle's Car Area Network (CAN) bus is also used in the calculation of the misalignment angles.
When the vehicle accelerates or brakes in a straight line, each accelerometer group senses the same vehicle acceleration, so any persistent difference between the accelerations reported by two accelerometer groups is attributable to their relative misalignment. These differences are input to a Kalman filter that estimates the misalignment angles.
The same approach can be used when the vehicle is turning and each accelerometer group experiences a centripetal acceleration. In this case, however, the difference in accelerations must be compensated for the centripetal acceleration resulting from the lever-arm vector between the two sensors and the angular rotation of the vehicle. The angular rotation of the vehicle is sensed by a gyro triad or micro-inertial device located at the vehicle body reference frame. The compensated acceleration is:
$$A_{comp} = A_{sensorA} - \omega \times (\omega \times R_L)$$
The input to the Kalman filter is now:
$$A_{comp} - A_{sensorB}$$
where $A_{sensorA}$ and $A_{sensorB}$ are the accelerations sensed at sensor A and sensor B, $\omega$ is the angular rotation rate of the vehicle sensed by the gyro triad, and $R_L$ is the lever-arm vector between the two sensors.
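The compensation and the resulting Kalman filter input can be sketched in a few lines of Python; this is a minimal illustration, and the variable names and numeric example values are assumptions rather than part of the disclosure:

```python
import numpy as np

def compensate_lever_arm(acc_sensor_a, omega, lever_arm):
    """Subtract the centripetal term omega x (omega x R_L) that sensor A
    senses because it is offset from sensor B by the lever arm."""
    centripetal = np.cross(omega, np.cross(omega, lever_arm))
    return acc_sensor_a - centripetal

# Example: 0.5 rad/s yaw rate, sensors 2 m apart along the body x axis.
acc_a = np.array([0.0, 1.2, 9.81])   # m/s^2 sensed at sensor A (example values)
acc_b = np.array([0.0, 1.2, 9.81])   # m/s^2 sensed at sensor B (example values)
omega = np.array([0.0, 0.0, 0.5])    # rad/s from the gyro triad
lever = np.array([2.0, 0.0, 0.0])    # m, lever-arm vector R_L

a_comp = compensate_lever_arm(acc_a, omega, lever)
kalman_input = a_comp - acc_b        # A_comp - A_sensorB, fed to the Kalman filter
```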
Also, if the vehicle is stationary, the accelerometer groups sense only gravity, and this can be used to compute some of the misalignment angles. Information from the vehicle CAN bus, such as wheel rotation speeds of zero, tells the Kalman filter that the vehicle is not moving and that the only sensed acceleration is gravity.
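For the stationary case, a minimal sketch of deriving tilt from sensed gravity might look like the following; the axis convention (x forward, y right, z down, with +1 g reported on the z axis at rest) is an assumption for illustration:

```python
import numpy as np

def tilt_from_gravity(acc):
    """Pitch and roll (rad) of an accelerometer triad relative to level,
    from the gravity vector it senses at rest (assumed x fwd, y right, z down)."""
    ax, ay, az = acc
    pitch = np.arctan2(-ax, np.hypot(ay, az))
    roll = np.arctan2(ay, az)
    return pitch, roll

def relative_pitch_roll(acc_a, acc_b):
    """Pitch/roll misalignment between two sensors as the difference of the
    tilts each derives from gravity. Yaw is unobservable from gravity alone,
    which is why the moving-vehicle cases above are still needed."""
    pa, ra = tilt_from_gravity(acc_a)
    pb, rb = tilt_from_gravity(acc_b)
    return pa - pb, ra - rb
```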
The second method uses accelerometers to align the sensors to each other, while one of the sensors is aligned to the vehicle body using optical information from the sensor itself. For example, acceleration data can be used to align sensor A to sensor B, while sensor B is aligned to the vehicle body directly by using sensor B's own imagery to compute the misalignment angles between sensor B and the vehicle body. Since sensor A is aligned to sensor B and sensor B is aligned to the vehicle body, the misalignment between sensor A and the vehicle body can be computed. Sensor B can be a visual sensor, such as a video camera; by observing the outline of the hood and body of the vehicle with this camera, the misalignment angles between sensor B and the vehicle body frame can be computed.
With a 640×480 imager, the sensed pitch angle of the hood line, derived from its vertical pixel position $P_p$ and the camera's vertical field of view $FOV_v$, is:

$$\Phi_s = \frac{P_p}{480} \cdot FOV_v$$

The pitch misalignment angle is the difference between this sensed angle and the known angle of the hood in the vehicle body frame:

$$\Phi_{misalign} = \Phi_s - \Phi_{vehicle}$$
Similarly, the yaw misalignment angle, derived from the pixel positions $P_{yl}$ and $P_{yr}$ of the hood outline at the left and right sides of the image and the horizontal field of view $FOV_h$, is:

$$\Psi_{misalign} = \frac{P_{yl} - P_{yr}}{2 \cdot 640} \cdot FOV_h$$
The roll misalignment angle, in degrees, derived from the pixel rise $P_r$ of the hood line across half the 640-pixel image width, is:

$$\Theta_{misalign} = \frac{2 P_r}{640} \cdot \frac{180}{\pi}$$
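Taken together, the three pixel-to-angle relations can be sketched as follows; the field-of-view values are illustrative assumptions for a 640×480 imager:

```python
import math

IMG_W, IMG_H = 640, 480      # imager resolution used in the equations above
FOV_H, FOV_V = 48.0, 36.0    # fields of view in degrees (illustrative assumptions)

def pitch_misalign(p_p, phi_vehicle):
    """Phi_misalign = (P_p/480)*FOV_v - Phi_vehicle: sensed hood-line pitch
    minus the known hood angle in the vehicle body frame (degrees)."""
    phi_s = (p_p / IMG_H) * FOV_V
    return phi_s - phi_vehicle

def yaw_misalign(p_yl, p_yr):
    """Psi_misalign = ((P_yl - P_yr)/(2*640))*FOV_h, from the difference
    between the left and right hood-outline pixel positions (degrees)."""
    return ((p_yl - p_yr) / (2 * IMG_W)) * FOV_H

def roll_misalign(p_r):
    """Theta_misalign = (2*P_r/640)*(180/pi): the hood-line rise P_r over
    the image half-width, small-angle approximation, converted to degrees."""
    return (2 * p_r / IMG_W) * (180.0 / math.pi)
```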
When the vehicle is moving, the micro-inertial devices sense the angular rotation and/or acceleration of the vehicle, and the misalignment is estimated and compensated as described above.
A third method is to use optical information from sensor A and sensor B to compute the misalignment between the two sensors, and optical information from sensor B to compute the misalignment between sensor B and the vehicle body. For example, sensor A can be a ranging laser sensor that sends out multiple beams of light to detect a target. When the light is reflected from the target, sensor B can also detect the reflected light with its video camera, and from this information the misalignment between sensor A and sensor B can be computed.
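A minimal sketch of this cross-sensor optical alignment, assuming a distant target (so parallax from the A-to-B lever arm is negligible) and the simple linear pixel-to-angle model used above, might look like:

```python
IMG_W, IMG_H = 640, 480     # camera resolution (as in the equations above)
FOV_H, FOV_V = 48.0, 36.0   # camera fields of view in degrees (assumptions)

def spot_direction(px, py):
    """Azimuth/elevation (degrees) of the reflected laser spot in the camera
    frame, from its pixel offset relative to the image center."""
    az = ((px - IMG_W / 2) / IMG_W) * FOV_H
    el = ((py - IMG_H / 2) / IMG_H) * FOV_V
    return az, el

def a_to_b_misalignment(beam_az, beam_el, px, py):
    """Difference between where sensor A reports its beam pointing and where
    sensor B images the reflected spot; for a distant target this difference
    approximates the A-to-B boresight misalignment."""
    cam_az, cam_el = spot_direction(px, py)
    return beam_az - cam_az, beam_el - cam_el
```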
A fourth method is to collocate all of the sensors in one box that is mounted on the vehicle, for example on the roof, so that all sensors are always aligned with respect to each other and the only alignment required is between the sensor box and the vehicle body. This alignment can be performed using a set of accelerometers in the sensor box and on the vehicle body frame, or optically using a video camera in the sensor box.
The systems described above can use dedicated processor systems, microcontrollers, programmable logic devices, or microprocessors that perform some or all of the operations. Some of the operations described above may be implemented in software, and other operations may be implemented in hardware.
For the sake of convenience, the operations are described as various interconnected functional blocks or distinct software modules. This is not necessary, however, and there may be cases where these functional blocks or modules are equivalently aggregated into a single logic device, program or operation with unclear boundaries. In any event, the functional blocks and software modules or described features can be implemented by themselves, or in combination with other operations in either hardware or software.
Having described and illustrated the principles of the invention in a preferred embodiment thereof, it should be apparent that the invention may be modified in arrangement and detail without departing from such principles. Claim is made to all modifications and variations coming within the spirit and scope of the following claims.
This application is a continuation of application Ser. No. 10/985,577, filed Nov. 9, 2004, now U.S. Pat. No. 7,337,650, which is incorporated herein by reference. This application also incorporates by reference U.S. Pat. No. 6,629,033, issued Sep. 30, 2003, titled OPEN COMMUNICATION SYSTEM FOR REAL-TIME MULTIPROCESSOR APPLICATIONS; U.S. Pat. No. 6,771,208, issued Aug. 3, 2004, titled MULTI SENSOR SYSTEM; and U.S. patent application Ser. No. 09/841,915, filed Apr. 24, 2001, entitled METHOD AND APPARATUS FOR DYNAMIC CONFIGURATION OF MULTIPROCESSOR SYSTEM.
Number | Name | Date | Kind |
---|---|---|---|
2995318 | Cocharo | Aug 1961 | A |
4303978 | Shaw et al. | Dec 1981 | A |
4528563 | Takeuchi | Jul 1985 | A |
4591976 | Webber et al. | May 1986 | A |
4829434 | Karmel et al. | May 1989 | A |
4907159 | Mauge et al. | Mar 1990 | A |
5008678 | Herman | Apr 1991 | A |
5031330 | Stuart | Jul 1991 | A |
5045937 | Myrick | Sep 1991 | A |
5111401 | Everett, Jr. et al. | May 1992 | A |
5115245 | Wen et al. | May 1992 | A |
5245909 | Corrigan et al. | Sep 1993 | A |
5303297 | Hillis | Apr 1994 | A |
5339086 | DeLuca et al. | Aug 1994 | A |
5341301 | Shirai et al. | Aug 1994 | A |
5438361 | Coleman | Aug 1995 | A |
5471214 | Faibish et al. | Nov 1995 | A |
5506963 | Ducateau et al. | Apr 1996 | A |
5532706 | Reinhardt et al. | Jul 1996 | A |
5552773 | Kuhnert | Sep 1996 | A |
5581462 | Rogers | Dec 1996 | A |
5585798 | Yoshioka et al. | Dec 1996 | A |
5617085 | Tsutsumi et al. | Apr 1997 | A |
5646612 | Byon | Jul 1997 | A |
5749060 | Graft et al. | May 1998 | A |
5761320 | Farinelli et al. | Jun 1998 | A |
5786998 | Neeson et al. | Jul 1998 | A |
5872508 | Taoka | Feb 1999 | A |
5907293 | Tognazzini | May 1999 | A |
5915214 | Reece et al. | Jun 1999 | A |
5943427 | Massie et al. | Aug 1999 | A |
5964822 | Alland et al. | Oct 1999 | A |
5966658 | Kennedy et al. | Oct 1999 | A |
5969598 | Kimura | Oct 1999 | A |
5977906 | Ameen et al. | Nov 1999 | A |
5983092 | Whinnett et al. | Nov 1999 | A |
5983161 | Lemelson et al. | Nov 1999 | A |
6009330 | Kennedy et al. | Dec 1999 | A |
6028537 | Suman et al. | Feb 2000 | A |
6028548 | Farmer | Feb 2000 | A |
6061709 | Bronte | May 2000 | A |
6097285 | Curtin | Aug 2000 | A |
6128608 | Barnhill | Oct 2000 | A |
6148261 | Obradovich et al. | Nov 2000 | A |
6161071 | Shuman et al. | Dec 2000 | A |
6163711 | Juntunen et al. | Dec 2000 | A |
6166627 | Reeley | Dec 2000 | A |
6167253 | Farris et al. | Dec 2000 | A |
6175782 | Obradovich et al. | Jan 2001 | B1 |
6181994 | Colson et al. | Jan 2001 | B1 |
6182006 | Meek | Jan 2001 | B1 |
6202027 | Alland et al. | Mar 2001 | B1 |
6203366 | Muller et al. | Mar 2001 | B1 |
6204804 | Andersson | Mar 2001 | B1 |
6226389 | Lemelson et al. | May 2001 | B1 |
6240365 | Bunn | May 2001 | B1 |
6243450 | Jansen et al. | Jun 2001 | B1 |
6252544 | Hoffberg et al. | Jun 2001 | B1 |
6275231 | Obradovich et al. | Aug 2001 | B1 |
6292109 | Murano et al. | Sep 2001 | B1 |
6292747 | Amro et al. | Sep 2001 | B1 |
6294987 | Matsuda et al. | Sep 2001 | B1 |
6297732 | Hsu et al. | Oct 2001 | B2 |
6298302 | Walgers et al. | Oct 2001 | B2 |
6326903 | Gross et al. | Dec 2001 | B1 |
6327536 | Tsuji et al. | Dec 2001 | B1 |
6362748 | Huang | Mar 2002 | B1 |
6374286 | Gee et al. | Apr 2002 | B1 |
6389340 | Rayner | May 2002 | B1 |
6405132 | Breed et al. | Jun 2002 | B1 |
6408174 | Steijer | Jun 2002 | B1 |
6417782 | Darnall | Jul 2002 | B1 |
6429789 | Kiridena et al. | Aug 2002 | B1 |
6429812 | Hoffberg | Aug 2002 | B1 |
6445308 | Koike | Sep 2002 | B1 |
6452484 | Drori | Sep 2002 | B1 |
6484080 | Breed | Nov 2002 | B2 |
6496689 | Keller et al. | Dec 2002 | B1 |
6505100 | Stuempfle et al. | Jan 2003 | B1 |
6515595 | Obradovich et al. | Feb 2003 | B1 |
6522875 | Dowling et al. | Feb 2003 | B1 |
6622083 | Knockeart et al. | Sep 2003 | B1 |
6778924 | Hanse | Aug 2004 | B2 |
6782315 | Lu et al. | Aug 2004 | B2 |
7006950 | Greiffenhagen et al. | Feb 2006 | B1 |
7024363 | Comerford et al. | Apr 2006 | B1 |
7079993 | Stephenson et al. | Jul 2006 | B2 |
7120129 | Ayyagari et al. | Oct 2006 | B2 |
7187947 | White et al. | Mar 2007 | B1 |
7257426 | Witkowski et al. | Aug 2007 | B1 |
7343160 | Morton | Mar 2008 | B2 |
20010008992 | Saito et al. | Jul 2001 | A1 |
20010009855 | I'Anson | Jul 2001 | A1 |
20010018639 | Bunn | Aug 2001 | A1 |
20010022927 | Mattio et al. | Sep 2001 | A1 |
20010041556 | Laursen et al. | Nov 2001 | A1 |
20010048749 | Ohmura et al. | Dec 2001 | A1 |
20010051853 | Evans et al. | Dec 2001 | A1 |
20020012329 | Atkinson et al. | Jan 2002 | A1 |
20020087886 | Ellis | Jul 2002 | A1 |
20020119766 | Bianconi et al. | Aug 2002 | A1 |
20020142759 | Newell et al. | Oct 2002 | A1 |
20020144010 | Younis et al. | Oct 2002 | A1 |
20020177429 | Watler et al. | Nov 2002 | A1 |
20020198925 | Smith et al. | Dec 2002 | A1 |
20030004633 | Russell et al. | Jan 2003 | A1 |
20030009270 | Breed | Jan 2003 | A1 |
20030011509 | Honda | Jan 2003 | A1 |
20030060188 | Gidron et al. | Mar 2003 | A1 |
20030065432 | Shuman et al. | Apr 2003 | A1 |
20030110113 | Martin | Jun 2003 | A1 |
20030201365 | Nelson | Oct 2003 | A1 |
20030201929 | Lutter et al. | Oct 2003 | A1 |
20040149036 | Foxlin et al. | Aug 2004 | A1 |
20040164228 | Fogg et al. | Aug 2004 | A1 |
20050080543 | Lu et al. | Apr 2005 | A1 |
20070115897 | Chen et al. | May 2007 | A1 |
Number | Date | Country |
---|---|---|
3125151 | Jan 1983 | DE |
3125161 | Jan 1983 | DE |
0441576 | Aug 1991 | EP |
1355128 | Oct 2003 | EP |
2000207691 | Jul 2000 | JP |
9624229 | Aug 1996 | WO |
9908436 | Feb 1999 | WO |
9957662 | Nov 1999 | WO |
9965183 | Dec 1999 | WO |
0040038 | Jun 2000 | WO |
0130061 | Apr 2001 | WO |
0158110 | Aug 2001 | WO |
Relation | Number | Date | Country |
---|---|---|---|
Parent | 10985577 | Nov 2004 | US |
Child | 12024058 | | US |