Next-generation automotive systems such as Lane Departure Warning (LDW), Collision Avoidance (CA), Blind Spot Detection (BSD) or Adaptive Cruise Control (ACC) systems will require target information from multiple sensors, including a new class of sensor called sensor apertures, such as radar, image or laser apertures similar to those found on advanced tactical fighter aircraft. For example, one sensor aperture may be located on the front bumper of the vehicle and obtain range and azimuth information about vehicles and stationary objects in front of the vehicle. Another sensor aperture may be located on the dash of the vehicle and obtain image information about vehicles and stationary objects in front of the vehicle. Yet another sensor aperture may be located on the side of the vehicle and obtain either range and azimuth data or image data in order to determine velocity and track information on vehicles that pass the vehicle. These new systems must take all of the information from the multiple sensor apertures on the vehicle and compute an accurate picture of the moving objects around the vehicle; this is known as the kinematic state of the targets, or Situation Awareness (SA). To do this, the Situation Awareness Platform (SAP) must accurately align the sensor apertures to each other so that information about a target from one sensor aperture can be used with information about the same target from a different sensor aperture. This is called Sensor Fusion (SF), and it is necessary for the SAP to obtain an optimal kinematic state of the targets around the vehicle in order to assess threat. The sensor apertures must also be aligned to the body of the vehicle so that the SAP can determine the position and velocity of the target with respect to the vehicle; this is called Navigation Fusion (NF).
One method of aligning the sensor apertures to each other and to the vehicle is to use mechanical and optical instruments, such as auto-collimators and laser boresight tools, during the production of the vehicle. This technique is not only costly, but would have to be repeated if a sensor aperture were repaired or replaced after production; an alignment procedure would have to be performed again in order to assure that the safety-critical systems were reporting accurately. Also, as the vehicle goes through normal wear and tear, the sensor apertures would gradually become misaligned, and this might not be noticed by the operator. The data from the sensor apertures would then fail to correlate with each other and with the vehicle reference frame until the sensor apertures were aligned again. Again, this would be costly to the vehicle operator and, until performed, the SAP might not provide accurate data. Therefore, a method is required to align the sensor apertures to each other and to the vehicle without the use of sophisticated optical tools. This patent addresses this problem by describing methods that align the sensor apertures to each other and to the vehicle without requiring external alignment equipment.
In a discussion of prior art, U.S. Pat. No. 5,245,909, Automatic Sensor Alignment, relates to systems for maintaining alignment-sensitive aircraft-borne avionics and weapons sensors in precise alignment, and further relates to methods for precisely aligning sensitive avionics for weapons system instrumentation that is subject to vibrations causing misalignment. In contrast, this disclosure relates to methods and systems that support advanced automotive systems not described in the prior art. A second key difference is the reliance on sensor data from the vehicle as part of the alignment method. Another difference is the use of image apertures with elements of the vehicle in the field of view of the imager, employing optical methods to determine changes in alignment with respect to the vehicle and the vehicle reference frame and then applying a compensation based on the measured misalignment angle. Finally, the system described herein does not rely on boresighting and aligning any sensor to achieve a vehicle reference frame.
U.S. Pat. No. 6,202,027, Automatic Curve Sensor Calibration, describes an improved system for accurately determining the travel path of a host vehicle and the azimuth angle of a target vehicle through an automatic calibration that detects and compensates for misalignment and curve sensor drift. The difference is its reliance on observed objects, track file generation, and subsequent changes to the track files over time, whereas this patent teaches methods of alignment based on force vectors, rotational rates, or optically measured changes with respect to the vehicle reference frame. Essentially, all observed objects are compensated for the misalignment error of the observing vehicle.
U.S. Pat. No. 5,031,330, Electronic Boresight, teaches that pairs of level-sensing devices can be used in a method that aligns plane surfaces to one another by tilting platforms by the measured amount of misalignment to adjust the sensor azimuth. In contrast, this patent teaches that the sensor apertures are rigidly mounted to the vehicle and that misalignment is corrected with compensation values observed with respect to the vehicle reference frame.
Different sensors can be used in vehicles to identify objects and possible collision conditions. For example, there may be an optical sensor, such as a camera, mounted to the roof of the vehicle. An Infrared (IR) sensor may be mounted in the front grille of the vehicle, and a third, inertial sensor may be located in the central portion of the vehicle. Data from these different sensors is correlated together to identify and track objects that may come within a certain vicinity of the vehicle.
The measurements from the different sensors must be translated to a common reference point before the different data can be accurately correlated. This translation is difficult because the sensors are positioned in different locations on the vehicle. For example, the sensor located inside the front bumper of the vehicle may move in one direction during a collision while the sensor located on the top of the vehicle roof may move in a different direction.
One of the sensors may also experience vibrations at a different time than the other sensor. For example, the front bumper sensor may experience a vertical or horizontal movement when the vehicle runs over an obstacle before any movements or vibrations are experienced by the roof sensor. These different movements of the sensors relative to each other make it very difficult to accurately determine the precise position and orientation of the sensors when the sensor readings are taken, which in turn makes it difficult to translate the data into common reference coordinates.
The present invention addresses this and other problems associated with the prior art.
A vehicle sensor system configured to gather sensory data 360 degrees around the vehicle comprises sensor apertures for gathering data such as range (e.g., ultrasonic); range and azimuth (e.g., laser and/or radar); and images (e.g., optical and/or thermal). The vehicle has sensors that align and establish a vehicle reference frame by measuring body yaw, pitch and roll rates as well as acceleration along the three axes of the vehicle. The imaging apertures that have a clear view of body mold lines, such as the hood or rear deck, align themselves to the vehicle reference frame; those apertures that cannot align using optical methods are aligned to the vehicle using accelerometers and rate sensors, reading the inertial acceleration or angular rotation to align themselves to each other. An Integrated Computing Platform (ICP) hosts the SAP software, which maintains complete system alignment by determining differences in alignment and applying or updating a compensation value with respect to the vehicle body coordinates, resulting in a dynamically boresighted system.
A multi-sensor system includes multiple sensors that are integrated onto the same substrate forming a unitary multi-sensor platform that provides a known consistent physical relationship between the multiple sensors. A processor can also be integrated onto the substrate so that data from the multiple sensors can be processed locally by the multi-sensor system.
The foregoing and other objects, features and advantages of the invention will become more readily apparent from the following detailed description of a preferred embodiment of the invention which proceeds with reference to the accompanying drawings.
One method is to attach three-axis accelerometers to each sensor and to the vehicle and use gravity and the acceleration of the vehicle, which will be sensed by the accelerometers, to align the sensor axes to each other and to the vehicle. Information from the vehicle that is available on the Controller Area Network (CAN) bus will also be used to perform the calculation of the misalignment angles.
The same approach can be used when the vehicle is turning and each accelerometer group experiences a centripetal acceleration. However, in this case the difference in accelerations must be compensated by the centripetal acceleration resulting from the lever arm vector between the two sensor apertures and the angular rotation of the vehicle. The angular rotation of the vehicle is sensed by a gyro triad or micro-inertial device located at the vehicle body reference frame:

Acomp = AsensorA − ω × (ω × R1)

The input to the Kalman filter is now Acomp − AsensorB, where:

AsensorA is the acceleration measured by the sensor A accelerometers;
AsensorB is the acceleration measured by the sensor B accelerometers;
ω is the angular rotation of the vehicle measured by the reference gyros;
× is the cross product of two vectors;
R1 is the lever arm vector between sensor A and sensor B; and
Acomp is the sensor A acceleration compensated for lever arm rotation.
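As a minimal sketch of this lever-arm compensation, assuming NumPy and illustrative variable names (the function and its arguments are not part of the original disclosure):

```python
import numpy as np

def lever_arm_compensate(a_sensor_a, omega, r1):
    # Subtract the centripetal term w x (w x R1) so sensor A's
    # acceleration can be differenced directly with sensor B's.
    centripetal = np.cross(omega, np.cross(omega, r1))
    return np.asarray(a_sensor_a) - centripetal

# Kalman filter input for the misalignment estimate:
# residual = lever_arm_compensate(a_a, omega_body, r1) - a_b
```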
Also, if the vehicle is stationary, the accelerometer groups will sense only gravity, and this can be used to help compute some of the misalignment angles. Information from the vehicle CAN bus, such as wheel rotation speeds of zero, tells the Kalman filter that the vehicle is not moving and that the only sensed acceleration is gravity.
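A gravity measurement constrains only two of the three misalignment angles, since a rotation about the gravity vector itself leaves the measurement unchanged; this is why only some of the angles can be computed while stationary. The following sketch, assuming NumPy and hypothetical names, estimates the axis and magnitude of the relative tilt between two accelerometer triads from their gravity readings:

```python
import numpy as np

def gravity_misalignment(g_a, g_b):
    # Unit gravity vectors as sensed by each accelerometer triad.
    ua = np.asarray(g_a) / np.linalg.norm(g_a)
    ub = np.asarray(g_b) / np.linalg.norm(g_b)
    axis = np.cross(ua, ub)                  # rotation axis between triads
    s, c = np.linalg.norm(axis), np.dot(ua, ub)
    angle = np.arctan2(s, c)                 # tilt misalignment, radians
    return (axis / s if s > 1e-9 else np.zeros(3)), angle
```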
The second method is to use accelerometers to align the sensor apertures to each other, while one of the sensor apertures is aligned to the vehicle body using optical information from that sensor aperture itself. For example, acceleration data can be used to align sensor aperture A to sensor aperture B, but sensor aperture B is aligned to the vehicle body directly by using sensor aperture B to compute the misalignment angles between sensor aperture B and the vehicle body. Since sensor aperture A is aligned to sensor aperture B and sensor aperture B is aligned to the vehicle body, the misalignment between sensor aperture A and the vehicle body can be computed. Sensor aperture B can be a visual sensor aperture, such as a video camera, and by observing the outline of the hood and body of the vehicle with this camera, the misalignment angles between sensor aperture B and the vehicle body frame can be computed.
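The chaining step amounts to composing the two estimated rotations. A minimal sketch with direction-cosine matrices, assuming NumPy (the matrix names are illustrative, not from the original disclosure):

```python
import numpy as np

def align_a_to_body(R_a_to_b, R_b_to_body):
    # A vector in sensor A coordinates is rotated first into sensor B
    # coordinates, then into vehicle body coordinates.
    return np.asarray(R_b_to_body) @ np.asarray(R_a_to_b)
```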
When the vehicle is moving, the micro-inertials sense the angular rotation and/or acceleration of the vehicle.
A third method is to use optical information from sensor aperture A and sensor aperture B to compute the misalignment between the two sensor apertures, and to use optical information from sensor aperture B to compute the misalignment between sensor aperture B and the vehicle body. For example, sensor aperture A can be a ranging laser sensor aperture that sends out multiple beams of light to detect a target. When the light is reflected from the target, sensor aperture B can also detect the reflected light with its video camera, and using this information it can compute the misalignment between sensor aperture A and sensor aperture B.
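One way to turn such corresponding observations into a misalignment estimate is to solve for the least-squares rotation between the two sets of spot directions (Wahba's problem). This is a hedged sketch, not the method prescribed by the disclosure; it assumes NumPy and N matched unit direction vectors per aperture:

```python
import numpy as np

def boresight_rotation(dirs_a, dirs_b):
    # dirs_a, dirs_b: N x 3 arrays of matched unit vectors -- the laser
    # spot directions seen by aperture A and by aperture B's camera.
    B = np.asarray(dirs_b).T @ np.asarray(dirs_a)   # 3x3 attitude profile
    U, _, Vt = np.linalg.svd(B)
    d = np.sign(np.linalg.det(U @ Vt))              # keep a proper rotation
    return U @ np.diag([1.0, 1.0, d]) @ Vt          # maps A-frame to B-frame
```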
A fourth method is to collocate all of the sensor apertures in one box that is mounted on the vehicle, for example on the roof, so that all sensor apertures are always aligned with respect to each other and the only alignment required is between this sensor aperture box and the vehicle body. This alignment can be performed by using a set of accelerometers in the sensor aperture box and on the vehicle body frame, or optically by using a video camera in the sensor aperture box.
The systems described above can use dedicated processor systems, micro controllers, programmable logic devices, or microprocessors that perform some or all of the operations. Some of the operations described above may be implemented in software and other operations may be implemented in hardware.
For the sake of convenience, the operations are described as various interconnected functional blocks or distinct software modules. This is not necessary, however, and there may be cases where these functional blocks or modules are equivalently aggregated into a single logic device, program or operation with unclear boundaries. In any event, the functional blocks and software modules or described features can be implemented by themselves, or in combination with other operations in either hardware or software.
Having described and illustrated the principles of the invention in a preferred embodiment thereof, it should be apparent that the invention may be modified in arrangement and detail without departing from such principles. Claim is made to all modifications and variation coming within the spirit and scope of the following claims.
Locating the sensors 20816 and 20818 on the same substrate 20814 reduces the cost of sensor manufacturing and installation. For example, the two sensors 20816 and 20818 can be assembled onto the substrate 20814 in a factory prior to being installed on a vehicle. If the two sensors 20816 and 20818 were not mounted on the same substrate 20814, each sensor would have to be separately mounted on the vehicle and then calibrated to a known alignment with respect to the other. Even if the two sensors were installed correctly, changes in the shape of the vehicle due to wear, temperature, etc. over time could change the initial alignment between the two sensors.
Premounting or prefabricating the sensors 20816 and 20818 on the substrate 20814 prior to installation on a vehicle prevents these alignment errors. Only the substrate 20814 of the multi-sensor system 20812 has to be mounted to the vehicle, not the individual sensors 20816 and 20818. This allows the relative position 20820 and alignment between the two sensors 20816 and 20818 to remain the same regardless of how the substrate 20814 is mounted on the vehicle. Wiring is also simplified, since only one wiring harness has to be run through the vehicle to the multi-sensor system 20812.
In one example, the sensor 20816 senses an area 20824 and the sensor 20818 senses an area 20822 that are coincident. One of the sensors may have a wider field of view than the other. There can also be more than two sensors on the substrate 20814, and any active or passive sensor that provides object detection or vehicle force measurements can be mounted onto the substrate 20814. Some examples of sensors include ultrasonic, Infra-Red (IR), video, radar, and lidar sensors.
Depending on the substrate 20814 and the types of sensors, different mounting techniques can be used. The sensors may be separate components that are glued or bolted onto the substrate 20814. If the multi-sensor system 20812 is an integrated circuit, then the sensors 20816 and 20818 may be integrally fabricated onto a silicon or alternative temperature-resilient substrate 20814 using known deposition processes.
In one example, sensor 20816 is a radar or lidar sensor and sensor 20818 is a camera. Combining a video camera sensor with a radar and/or lidar sensor on the substrate 20814 provides several advantages. The camera sensor 20818 provides good angle resolution and object identification. The radar or lidar sensor 20816, on the other hand, is very effective at identifying range information.
Combining the camera video sensor 20818 with the radar or lidar sensor 20816 on the same substrate 20814 allows more effective correlation of camera angle and identification data with radar or lidar range information. For example, the radar sensor 20816 may only be able to measure the angle of an object to within one-half of a degree. Because of this limited angle accuracy, it may not be possible to determine from the radar reading alone whether an oncoming vehicle is coming from the same lane of traffic or from an opposite lane of traffic.
The video sensor 20818 may be able to accurately determine the angle of an object to within one-tenth or one-one hundredth of a degree. By correlating the radar information with the camera information, the location of an on-coming vehicle can be determined more accurately.
Due to vibration differences and possible inaccuracies in sensor alignment, it may not be possible, within fractional degrees of accuracy, to correlate information from separately mounted sensors. In other words, if the camera angle varies within plus or minus one degree with respect to the radar angle, then the camera data may not be able to refine the radar measurements.
By mounting the camera sensor 20818 and the radar sensor 20816 to the same substrate 20814, the relative position and alignment between the two sensors remain essentially the same regardless of physical effects on the vehicle. Thus, the camera data can be correlated with radar data to within fractions of a degree of accuracy.
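As an illustration of the refinement this enables, a target can be localized by pairing the radar's precise range with the camera's precise azimuth; the pairing is only trustworthy when the two sensors stay aligned within fractions of a degree, as on a shared substrate. A minimal sketch assuming NumPy and illustrative names:

```python
import numpy as np

def fuse_range_and_angle(radar_range_m, camera_azimuth_deg):
    # Combine the radar's accurate range with the camera's accurate
    # azimuth to place the target in the sensor frame.
    az = np.radians(camera_azimuth_deg)
    return np.array([radar_range_m * np.cos(az),   # x: ahead
                     radar_range_m * np.sin(az)])  # y: left/right
```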
In another example, a first sensor may detect one object out in front of the vehicle, while a second sensor located somewhere else on the vehicle detects two different objects in front of the vehicle. Because of vibrations in different parts of the vehicle, a central processor may not be able to determine which of the two objects detected by the second sensor is associated with the object detected by the first sensor. With the multi-sensor system 20812, measurement errors caused by this vehicle vibration are cancelled, since the two sensors 20816 and 20818 effectively experience the same amount of vibration at the same time.
In previous multi-sensor applications, each sensor was required to send all data back to the same central processing system. This takes additional time and circuitry to send all of the data over a bus. By mounting the processor 20826 in the multi-sensor system 20812, data from both sensor 20816 and sensor 20818 can be processed locally, requiring fewer reports to be sent over connection 20828.
The correlation may include first determining if the reports actually identify an object in block 20840. The processor 20826 can verify or refine object detection information from one of the sensors with the message reports received from the other sensor. If both sensors do not verify detection of the same object within some degree of certainty, then the processor system 20826 may discard the message reports or continue to analyze additional reports in block 20840.
When an object is detected in block 20840, the processor 20826 only has to send one report in block 20842 representing the information obtained from both sensor 20816 and sensor 20818. This reduces the total amount of data that has to be sent either to a central controller or another multi-sensor system in block 20842.
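The single outgoing report might look like the following sketch; the field names are assumptions for illustration, not from the original disclosure:

```python
from dataclasses import dataclass

@dataclass
class FusedReport:
    # One report standing in for separate raw reports from each sensor.
    object_id: int
    range_m: float        # refined by the radar or lidar sensor
    azimuth_deg: float    # refined by the camera sensor
    confidence: float     # joint detection confidence from both sensors
```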
Distributed Sensor Fusion
Whenever an object is detected, identified and tracked, a track file is created for that object in memory 20858.
For example, a bicycle 20865 may be initially detected by multi-sensor system 20812A at location 20864A in zone 1. The multi-sensor system 20812A creates a track file containing position, speed, acceleration, range, angle, heading, etc. for the bike 20865.
As the vehicle 20860 moves, or the bike 20865 moves, or both, the bike 20865 may move into a new position 20864B in an overlapping region 20866 between zone 1 and zone 2. The multi-sensor system 20812A upon detecting the bike 20865 in the overlapping region 20866 sends the latest track file for the bike 20865 to multi-sensor system 20812B over bus 20862. This allows the multi-sensor system 20812B to start actively tracking bike 20865 using the track information received from multi-sensor system 20812A.
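The handoff can be sketched as follows, with hypothetical field and method names (accept_track and the TrackFile fields are assumptions for illustration):

```python
from dataclasses import dataclass

@dataclass
class TrackFile:
    object_id: int
    position_m: tuple     # (x, y) position of the tracked object
    speed_mps: float
    heading_deg: float
    confidence: float

def hand_off(track: TrackFile, neighbor_system):
    # Forward the latest track file so the system covering the
    # adjacent zone can continue the track without re-acquiring it.
    neighbor_system.accept_track(track)   # e.g. over bus 20862
```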
The multi-sensor system 20812A only has to send a few of the latest track files for the common area 20866 over the bus 20862 to multi-sensor system 20812B in order for system 20812B to maintain a track on the bike 20865. The track files can be exchanged between any of the multi-sensor systems 20812A-20812D. When two multi-sensor systems have overlapping tracks for the same object, the track file with the greatest confidence of accuracy is used for vehicle warning, security, and control operations. There are known algorithms that calculate track files and a degree of confidence in the track file calculations; therefore, these algorithms will not be discussed in further detail.
There may be vibrational effects on the different multi-sensor systems 20812A-20812D. This, however, does not affect the track calculations generated by the individual multi-sensor systems 20812A-20812D. The only compensation for any vibration may be needed when the track files are translated into body coordinates at the time a possible control decision is made by the central controller 20868.
The connection 20862 can be a CAN bus, a wireless 802.11 link, or any other type of wired or wireless link. The system described above can use dedicated processor systems, micro controllers, programmable logic devices, or microprocessors that perform some or all of the operations. Some of the operations described above may be implemented in software and other operations may be implemented in hardware.
For the sake of convenience, the operations are described as various interconnected functional blocks or distinct software modules. This is not necessary, however, and there may be cases where these functional blocks or modules are equivalently aggregated into a single logic device, program or operation with unclear boundaries. In any event, the functional blocks and software modules or features of the flexible interface can be implemented by themselves, or in combination with other operations in either hardware or software.
Having described and illustrated the principles of the invention in a preferred embodiment thereof, it should be apparent that the invention may be modified in arrangement and detail without departing from such principles. I claim all modifications and variation coming within the spirit and scope of the following claims.
This application is a continuation of U.S. patent application Ser. No. 12/698,960, filed Feb. 2, 2010, which is a continuation of U.S. patent application Ser. No. 12/024,058, filed Jan. 31, 2008, which is a continuation of U.S. Pat. No. 7,337,650, issued Mar. 4, 2008, titled SYSTEM AND METHOD FOR ALIGNING SENSORS ON A VEHICLE, the disclosures of which are incorporated herein by reference in their entirety, and further incorporates by reference: U.S. Pat. No. 6,629,033, issued Sep. 30, 2003, titled OPEN COMMUNICATION SYSTEM FOR REAL-TIME MULTIPROCESSOR APPLICATIONS; U.S. Pat. No. 6,771,208, issued Aug. 3, 2004, titled MULTI SENSOR SYSTEM; and U.S. Pat. No. 7,146,260, issued Dec. 5, 2006, titled METHOD AND APPARATUS FOR DYNAMIC CONFIGURATION OF MULTIPROCESSOR SYSTEM. Applicants believe the above-incorporated material constitutes "essential material" within the meaning of 37 CFR 1.57(c)(1)-(3); applicants have amended the specification to expressly recite the essential material that is incorporated by reference as allowed by the applicable rules.
Number | Name | Date | Kind |
---|---|---|---|
2995318 | Cocharo | Aug 1961 | A |
4303978 | Shaw | Dec 1981 | A |
4528563 | Takeuchi | Jul 1985 | A |
4591976 | Webber | May 1986 | A |
4829434 | Karmel | May 1989 | A |
4907159 | Mauge | Mar 1990 | A |
5008678 | Herman | Apr 1991 | A |
5031330 | Stuart | Jul 1991 | A |
5045937 | Myrick | Sep 1991 | A |
5111401 | Everett, Jr. | May 1992 | A |
5115245 | Wen | May 1992 | A |
5245909 | Corrigan | Sep 1993 | A |
5287199 | Zoccolillo | Feb 1994 | A |
5303297 | Hillis | Apr 1994 | A |
5339086 | DeLuca | Aug 1994 | A |
5341301 | Shirai | Aug 1994 | A |
5438361 | Coleman | Aug 1995 | A |
5440726 | Fuchs et al. | Aug 1995 | A |
5471214 | Faibish | Nov 1995 | A |
5500794 | Fujita et al. | Mar 1996 | A |
5506963 | Ducateau | Apr 1996 | A |
5532706 | Reinhardt | Jul 1996 | A |
5537539 | Narihiro | Jul 1996 | A |
5552773 | Kuhnert | Sep 1996 | A |
5555503 | Kyrtsos et al. | Sep 1996 | A |
5572201 | Graham | Nov 1996 | A |
5579219 | Mori et al. | Nov 1996 | A |
5581462 | Rogers | Dec 1996 | A |
5585798 | Yoshioka | Dec 1996 | A |
5617085 | Tsutsumi | Apr 1997 | A |
5646612 | Byon | Jul 1997 | A |
5749060 | Graf | May 1998 | A |
5751211 | Shirai | May 1998 | A |
5761320 | Farinelli | Jun 1998 | A |
5786998 | Neeson | Jul 1998 | A |
5793366 | Mano et al. | Aug 1998 | A |
5794164 | Beckert et al. | Aug 1998 | A |
5872508 | Taoka | Feb 1999 | A |
5898392 | Bambini et al. | Apr 1999 | A |
5907293 | Tognazzini | May 1999 | A |
5909559 | So | Jun 1999 | A |
5915214 | Reece | Jun 1999 | A |
5943427 | Massie | Aug 1999 | A |
5948040 | DeLorme et al. | Sep 1999 | A |
5951620 | Ahrens et al. | Sep 1999 | A |
5956025 | Goulden et al. | Sep 1999 | A |
5956250 | Gudat et al. | Sep 1999 | A |
5959536 | Chambers et al. | Sep 1999 | A |
5963092 | VanZalinge | Oct 1999 | A |
5964822 | Alland | Oct 1999 | A |
5966658 | Kennedy, III | Oct 1999 | A |
5969598 | Kimura | Oct 1999 | A |
5977906 | Ameen | Nov 1999 | A |
5983092 | Whinnett | Nov 1999 | A |
5983161 | Lemelson | Nov 1999 | A |
6009330 | Kennedy, III | Dec 1999 | A |
6009403 | Sato | Dec 1999 | A |
6028537 | Suman | Feb 2000 | A |
6028548 | Farmer | Feb 2000 | A |
6032089 | Buckley | Feb 2000 | A |
6032202 | Lea et al. | Feb 2000 | A |
6038625 | Ogino et al. | Mar 2000 | A |
6054950 | Fontana | Apr 2000 | A |
6060989 | Gehlot | May 2000 | A |
6061709 | Bronte | May 2000 | A |
6075467 | Ninagawa | Jun 2000 | A |
6097285 | Curtin | Aug 2000 | A |
6105119 | Kerr et al. | Aug 2000 | A |
6128608 | Barnhill | Oct 2000 | A |
6144336 | Preston | Nov 2000 | A |
6148261 | Obradovich | Nov 2000 | A |
6150961 | Alewine | Nov 2000 | A |
6154123 | Kleinberg | Nov 2000 | A |
6161071 | Shuman | Dec 2000 | A |
6163711 | Juntunen | Dec 2000 | A |
6166627 | Reeley | Dec 2000 | A |
6167253 | Farris | Dec 2000 | A |
6169894 | McCormick | Jan 2001 | B1 |
6175728 | Mitama | Jan 2001 | B1 |
6175782 | Obradovich | Jan 2001 | B1 |
6179489 | So et al. | Jan 2001 | B1 |
6181922 | Iwai | Jan 2001 | B1 |
6181994 | Colson | Jan 2001 | B1 |
6182006 | Meek | Jan 2001 | B1 |
6185491 | Gray | Feb 2001 | B1 |
6198996 | Berstis | Mar 2001 | B1 |
6199136 | Shteyn | Mar 2001 | B1 |
6202027 | Alland | Mar 2001 | B1 |
6203366 | Muller | Mar 2001 | B1 |
6204804 | Andersson | Mar 2001 | B1 |
6226389 | Lemelson, III | May 2001 | B1 |
6233468 | Chen | May 2001 | B1 |
6236652 | Preston | May 2001 | B1 |
6240365 | Bunn | May 2001 | B1 |
6243450 | Jansen | Jun 2001 | B1 |
6243772 | Ghori et al. | Jun 2001 | B1 |
6252544 | Hoffberg | Jun 2001 | B1 |
6275231 | Obradovich | Aug 2001 | B1 |
6282714 | Ghori et al. | Aug 2001 | B1 |
D448366 | Youngers | Sep 2001 | S |
6292109 | Murano | Sep 2001 | B1 |
6292747 | Amro | Sep 2001 | B1 |
6294987 | Matsuda | Sep 2001 | B1 |
6295541 | Bodnar et al. | Sep 2001 | B1 |
6297732 | Hsu | Oct 2001 | B2 |
6298302 | Walgers | Oct 2001 | B2 |
6298370 | Tang et al. | Oct 2001 | B1 |
6314326 | Fuchu | Nov 2001 | B1 |
6326903 | Gross | Dec 2001 | B1 |
6327536 | Tsuji | Dec 2001 | B1 |
6362748 | Huang | Mar 2002 | B1 |
6374286 | Gee | Apr 2002 | B1 |
6377860 | Gray | Apr 2002 | B1 |
6382897 | Mattio | May 2002 | B2 |
6389340 | Rayner | May 2002 | B1 |
6401029 | Kubota | Jun 2002 | B1 |
6405132 | Breed | Jun 2002 | B1 |
6408174 | Steijer | Jun 2002 | B1 |
6417782 | Darnall | Jul 2002 | B1 |
6421429 | Merritt | Jul 2002 | B1 |
6429789 | Kiridena | Aug 2002 | B1 |
6429812 | Hoffberg | Aug 2002 | B1 |
6430164 | Jones | Aug 2002 | B1 |
6434447 | Shteyn | Aug 2002 | B1 |
6442485 | Evans | Aug 2002 | B2 |
6445308 | Koike | Sep 2002 | B1 |
6445983 | Dickson et al. | Sep 2002 | B1 |
6452484 | Drori | Sep 2002 | B1 |
6484080 | Breed | Nov 2002 | B2 |
6487717 | Brunemann et al. | Nov 2002 | B1 |
6493338 | Preston | Dec 2002 | B1 |
6496107 | Himmelstein | Dec 2002 | B1 |
6496117 | Gutta | Dec 2002 | B2 |
6496689 | Keller | Dec 2002 | B1 |
6498939 | Thomas | Dec 2002 | B1 |
6505100 | Stuempfle | Jan 2003 | B1 |
6515595 | Obradovich | Feb 2003 | B1 |
6522875 | Dowling | Feb 2003 | B1 |
6523696 | Saito et al. | Feb 2003 | B1 |
6542812 | Obradovich et al. | Apr 2003 | B1 |
6559773 | Berry | May 2003 | B1 |
6567069 | Bontrager et al. | May 2003 | B1 |
6571136 | Staiger | May 2003 | B1 |
6580973 | Leivian et al. | Jun 2003 | B2 |
6584403 | Bunn | Jun 2003 | B2 |
D479228 | Sakaguchi et al. | Sep 2003 | S |
6614349 | Proctor et al. | Sep 2003 | B1 |
6615137 | Lutter | Sep 2003 | B2 |
6616071 | Kitamura | Sep 2003 | B2 |
6622083 | Knockeart | Sep 2003 | B1 |
6629033 | Preston | Sep 2003 | B2 |
6641087 | Nelson | Nov 2003 | B1 |
6647270 | Himmelstein | Nov 2003 | B1 |
6647328 | Walker | Nov 2003 | B2 |
6670912 | Honda | Dec 2003 | B2 |
6675081 | Shuman | Jan 2004 | B2 |
6681121 | Preston | Jan 2004 | B1 |
6690681 | Preston | Feb 2004 | B1 |
6707421 | Drury et al. | Mar 2004 | B1 |
6708100 | Russell | Mar 2004 | B2 |
6714139 | Saito | Mar 2004 | B2 |
6718187 | Takagi et al. | Apr 2004 | B1 |
6725031 | Watler | Apr 2004 | B2 |
6734799 | Munch | May 2004 | B2 |
6738697 | Breed | May 2004 | B2 |
6754183 | Razavi et al. | Jun 2004 | B1 |
6756998 | Bilger | Jun 2004 | B1 |
6771208 | Lutter et al. | Aug 2004 | B2 |
6771629 | Preston | Aug 2004 | B1 |
6778073 | Lutter | Aug 2004 | B2 |
6778924 | Hanse | Aug 2004 | B2 |
6782315 | Lu | Aug 2004 | B2 |
6785551 | Richard | Aug 2004 | B1 |
6792351 | Lutter | Sep 2004 | B2 |
6806977 | Freeny et al. | Oct 2004 | B1 |
6816458 | Kroon | Nov 2004 | B1 |
6895238 | Newell | May 2005 | B2 |
6895240 | Laursen | May 2005 | B2 |
6901057 | Rune | May 2005 | B2 |
6906619 | Williams | Jun 2005 | B2 |
6920129 | Preston | Jul 2005 | B2 |
6925368 | Funkhouser et al. | Aug 2005 | B2 |
6937732 | Ohmura | Aug 2005 | B2 |
6952155 | Himmelstein | Oct 2005 | B2 |
6972669 | Saito | Dec 2005 | B2 |
6973030 | Pecen | Dec 2005 | B2 |
6980092 | Turnbull | Dec 2005 | B2 |
6993511 | Himmelstein | Jan 2006 | B2 |
7000469 | Foxlin | Feb 2006 | B2 |
7006950 | Greiffenhagen | Feb 2006 | B1 |
7024363 | Comerford | Apr 2006 | B1 |
7039858 | Humpleman et al. | May 2006 | B2 |
7079993 | Stephenson | Jul 2006 | B2 |
7089206 | Martin | Aug 2006 | B2 |
7092723 | Himmelstein | Aug 2006 | B2 |
7103834 | Humpleman et al. | Sep 2006 | B1 |
7120129 | Ayyagari | Oct 2006 | B2 |
7123926 | Himmelstein | Oct 2006 | B2 |
7146260 | Preston | Dec 2006 | B2 |
7151768 | Preston | Dec 2006 | B2 |
7158956 | Himmelstein | Jan 2007 | B1 |
7164662 | Preston | Jan 2007 | B2 |
7171189 | Bianconi | Jan 2007 | B2 |
7178049 | Lutter | Feb 2007 | B2 |
7187947 | White | Mar 2007 | B1 |
7206305 | Preston | Apr 2007 | B2 |
7207042 | Smith | Apr 2007 | B2 |
7215965 | Fournier et al. | May 2007 | B2 |
7221669 | Preston | May 2007 | B2 |
7239949 | Lu | Jul 2007 | B2 |
7249266 | Margalit | Jul 2007 | B2 |
7257426 | Witkowski | Aug 2007 | B1 |
7263332 | Nelson | Aug 2007 | B1 |
7269188 | Smith | Sep 2007 | B2 |
7272637 | Himmelstein | Sep 2007 | B1 |
7274988 | Mukaiyama | Sep 2007 | B2 |
7277693 | Chen | Oct 2007 | B2 |
7283567 | Preston | Oct 2007 | B2 |
7283904 | Benjamin | Oct 2007 | B2 |
7286522 | Preston | Oct 2007 | B2 |
7317696 | Preston | Jan 2008 | B2 |
7337650 | Preston et al. | Mar 2008 | B1 |
7343160 | Morton | Mar 2008 | B2 |
7375728 | Donath | May 2008 | B2 |
7379707 | DiFonzo | May 2008 | B2 |
7411982 | Smith | Aug 2008 | B2 |
7418476 | Salesky | Aug 2008 | B2 |
7450955 | Himmelstein | Nov 2008 | B2 |
7480501 | Petite | Jan 2009 | B2 |
7506020 | Ellis | Mar 2009 | B2 |
7508810 | Moinzadeh | Mar 2009 | B2 |
7509134 | Fournier et al. | Mar 2009 | B2 |
7536277 | Pattipatti et al. | May 2009 | B2 |
7579942 | Kalik | Aug 2009 | B2 |
7587102 | Maris | Sep 2009 | B2 |
7587370 | Himmelstein | Sep 2009 | B2 |
7594000 | Himmelstein | Sep 2009 | B2 |
7596391 | Himmelstein | Sep 2009 | B2 |
7599715 | Himmelstein | Oct 2009 | B2 |
7614055 | Buskens et al. | Nov 2009 | B2 |
7664315 | Woodfill | Feb 2010 | B2 |
7681448 | Preston et al. | Mar 2010 | B1 |
7689321 | Karlsson | Mar 2010 | B2 |
7733853 | Moinzadeh et al. | Jun 2010 | B2 |
7747281 | Preston | Jun 2010 | B2 |
7848763 | Fournier et al. | Dec 2010 | B2 |
7891004 | Gelvin et al. | Feb 2011 | B1 |
7924934 | Birmingham | Apr 2011 | B2 |
7928898 | Franken | Apr 2011 | B2 |
7966111 | Moinzadeh et al. | Jun 2011 | B2 |
7970500 | Parra Carque | Jun 2011 | B2 |
7979095 | Birmingham | Jul 2011 | B2 |
7983310 | Hirano et al. | Jul 2011 | B2 |
8001860 | Preston et al. | Aug 2011 | B1 |
8014942 | Moinzadeh et al. | Sep 2011 | B2 |
8036201 | Moinzadeh et al. | Oct 2011 | B2 |
8036600 | Garrett et al. | Oct 2011 | B2 |
8068792 | Preston | Nov 2011 | B2 |
8108092 | Phillips et al. | Jan 2012 | B2 |
8204927 | Duong et al. | Jun 2012 | B1 |
8244408 | Lee et al. | Aug 2012 | B2 |
8260515 | Huang et al. | Sep 2012 | B2 |
8346186 | Preston et al. | Jan 2013 | B1 |
20010009855 | L'Anson | Jul 2001 | A1 |
20020012329 | Atkinson | Jan 2002 | A1 |
20020022927 | Lemelson et al. | Feb 2002 | A1 |
20020070852 | Trauner | Jun 2002 | A1 |
20020083143 | Cheng | Jun 2002 | A1 |
20020095501 | Chiloyan et al. | Jul 2002 | A1 |
20020105423 | Rast | Aug 2002 | A1 |
20020144010 | Younis | Oct 2002 | A1 |
20020144079 | Willis et al. | Oct 2002 | A1 |
20030060188 | Gidron | Mar 2003 | A1 |
20030078754 | Hamza | Apr 2003 | A1 |
20030158614 | Friel | Aug 2003 | A1 |
20030204382 | Julier et al. | Oct 2003 | A1 |
20030212996 | Wolzien | Nov 2003 | A1 |
20040162064 | Himmelstein | Aug 2004 | A1 |
20040164228 | Fogg | Aug 2004 | A1 |
20050009506 | Smolentzov | Jan 2005 | A1 |
20050070221 | Upton | Mar 2005 | A1 |
20050130656 | Chen | Jun 2005 | A1 |
20050153654 | Anderson | Jul 2005 | A1 |
20050251328 | Merwe et al. | Nov 2005 | A1 |
20050260984 | Karabinis | Nov 2005 | A1 |
20050275505 | Himmelstein | Dec 2005 | A1 |
20050278712 | Buskens et al. | Dec 2005 | A1 |
20060206576 | Obradovich et al. | Sep 2006 | A1 |
20060293829 | Cornwell et al. | Dec 2006 | A1 |
20070115868 | Chen | May 2007 | A1 |
20070115897 | Chen | May 2007 | A1 |
20070260372 | Langer et al. | Nov 2007 | A1 |
20070260373 | Langer et al. | Nov 2007 | A1 |
20080092140 | Doninger et al. | Apr 2008 | A1 |
20090090592 | Mordukhovich et al. | Apr 2009 | A1 |
20090240481 | Durrant-Whyte et al. | Sep 2009 | A1 |
20090268947 | Schaufler | Oct 2009 | A1 |
20090284378 | Ferren et al. | Nov 2009 | A1 |
20110212700 | Petite | Sep 2011 | A1 |
Number | Date | Country |
---|---|---|
3125161 | Jan 1983 | DE |
4237987 | May 1994 | DE |
19922608 | Nov 2000 | DE |
19931161 | Jan 2001 | DE |
0 441 576 | Aug 1991 | EP |
0841648 | May 1998 | EP |
1 355 128 | Oct 2003 | EP |
10-076115 | Oct 1999 | JP |
2000207691 | Jul 2000 | JP |
WO9624229 | Aug 1996 | WO |
WO9908436 | Feb 1999 | WO |
WO9957662 | Nov 1999 | WO |
WO9965183 | Dec 1999 | WO |
WO 0029948 | May 2000 | WO |
WO0040038 | Jul 2000 | WO |
WO0130061 | Apr 2001 | WO |
WO0158110 | Aug 2001 | WO |
Entry |
---|
Stolowitz Ford Cowger LLP Listing of Related Cases Sep. 17, 2012. |
Longbin, Xiaoquain, Yizu Kang, Bar-Shalom: Unbiased converted measurements for tracking; IEEE Transactions on Aerospace and Electronic Systems vol. 34(4), Jul. 1998, pp. 1023-1027. |
Miller, Drummond: Comparison of methodologies for mitigating coordinate transformation bias in target tracking; Proceedings SPIE Conference on Signal and Data Processing of Small Targets 2000, vol. 4048, Jul. 2002, pp. 414-426. |
Duan, Han, Rong Li: Comments on “Unbiased (debiased) converted measurements for tracking” IEEE Transactions on Aerospace and Electronic Systems, vol. 40(4), Oct. 2004, pp. 1374-1377. |
Stolowitz Ford Cowger LLP, Listing of Related Cases, Mar. 15, 2011. |
MyGig User Guide; Mar. 11, 2008. |
Stolowitz Ford Cowger LLP Listing of Related Cases Oct. 12, 2011. |
A. Das, R. Fierro, V. Kumar, J. Ostrowski, J. Spletzer, and C. Taylor, “A Framework for Vision Based Formation Control”, IEEE Transactions on Robotics and Automation, vol. 18, Nov. 5, 2001, pp. 1-13. |
Ada 95 Transition Support—Lessons Learned, Sections 3, 4, and 5, CACI, Inc.—Federal, Nov. 15, 1996, 14 pages. |
AMIC. Architecture specification release 1, 2001; 35 pages. |
Bluetooth Doc; Advance Audio Distribution Profile Specification; Adopted version 1.0; dated May 22, 2003; 75 pages. |
Bluetooth Doc; Audio/Video Remote Control Profile; Version 1.0 Adopted; dated May 22, 2003; 52 pages. |
Bluetooth Hands-free Profile 1.5 Nov. 25, 2005. |
Bluetooth Specification version 1.1; Feb. 22, 2001; 452 pages. |
Boeing News Release, “Boeing Demonstrates JSF Avionics Multi-Sensor Fusion”, Seattle, WA, May 9, 2000, pp. 1-2. |
Boeing Statement, “Chairman and CEO Phil Condit on the JSF Decision”, Washington, D.C., Oct. 26, 2001, pp. 1-2. |
Counterair: The Cutting Edge, Ch. 2 "The Evolutionary Trajectory: The Fighter Pilot-Here to Stay?" AF2025 v3c8-2, Dec. 1996, pp. 1-7. |
Counterair: The Cutting Edge, Ch. 4 "The Virtual Trajectory: Air Superiority without an 'Air' Force?" AF2025 v3c8-4, Dec. 1996, pp. 1-12. |
Embedded Bluetooth Migrates to Lisbon and Seattle; 11 pages; Jan. 23, 2008. |
Green Hills Software, Inc., “The AdaMULTI 2000 Integrated Development Environment,” Copyright 2002, printed Jul. 9, 2002; 7 pages. |
H. Chung, L. Ojeda, and J. Borenstein, “Sensor Fusion for Mobile Robot Dead-reckoning with a Precision-calibrated Fiber Optic Gyroscope”, 2001 IEEE International Conference on Robotics and Automation, Seoul, Korea, May 21-26, 2001, pp. 1-6. |
Hitachi Automated Highway System (AHS), Automotive Products, Hitachi, Ltd., Copyright 1994-2002, 8 pages. |
IEEE Standard for Information Technology—POSIX Based Supercomputing Application Environment Profile; Jun. 14, 1995, 72 pages. |
ISIS Project: Sensor Fusion, Linkoping University Division of Automatic Control and Communication Systems in cooperation with SAAB (Dynamics and Aircraft), 2001, 18 pages. |
J. Takezaki, N. Ueki, T. Minowa, H. Kondoh, “Support System for Safe Driving—A Step Toward ITS Autonomous Driving—”, Hitachi Review, vol. 49, Nov. 3, 2000, pp. 1-8. |
Joint Strike Fighter Terrain Database, ets-news.com “Simulator Solutions” 2002, 3 pages. |
Luttge, Karsten; “E-Charging API: Outsource Charging to a Payment Service Provider”; IEEE; 2001 (pp. 216-222). |
M. Chantler, G. Russel, and R. Dunbar, “Probabilistic Sensor Fusion for Reliable Workspace Sensing”, Fourth IARP workship on Underwater Robotics, Genoa, Nov. 1992, pp. 1-14. |
MSRC Redacted Proposal, 3.0 Architecture Development, Aug. 29, 2002; pp. 1-43. |
Powerpoint Presentation by Robert Allen—Boeing Phantom Works entitled “Real-Time Embedded Avionics System Security and COTS Operating Systems”, Open Group Real-Time Forum, Jul. 18, 2001, 16 pages. |
Product description of Raytheon Electronic Systems (ES), Copyright 2002, pp. 1-2. |
Product description of Raytheon RT Secure, “Development Environment”, Copyright 2001, pp. 1-2. |
Product description of Raytheon RT Secure, “Embedded Hard Real-Time Secure Operating System”, Copyright 2000, pp. 1-2. |
Product description of Raytheon RT Secure, Copyright 2001, pp. 1-2. |
S.G. Goodridge, “Multimedia Sensor Fusion for Intelligent Camera Control and Human-Computer Interaction”, Dissertation submitted to the Graduate Faculty of North Carolina State University in partial fulfillment of the requirements for the degree of Doctor of Philosophy in Electrical Engineering, Raleigh, NC, 1997, pp. 1-5. |
Specification of the Bluetooth System v1.0.B; Dec. 1, 1999. |
Specification of the Bluetooth System v1.1; Feb. 22, 2001. |
TNO FEL Annual Review 1998: Quality works, Observation Systems Division; “The Whole is More Than the Sum of its Parts”; 16 pages. |
Vehicle Dynamics Lab, University of California, Berkeley, funded by BMW, current members: D. Caveney and B. Feldman, “Adaptive Cruise Control”, at least as early as 2002, printed Jul. 2, 2002; 17 pages. |
Stirling A: “Mobile Multimedia platforms” Vehicular Technology Conference Fall 2000. IEEE VTS Fall VTC2000. 52nd Vehicular Technology Conference (Cat. No. 00CH37152). |
Nusser R. et al.: “Bluetooth-based wireless connectivity in an automotive environment” Vehicular Technology Conference Fall 2000. IEEE VTS Fall VTC2000 52nd Vehicular Technology Conference (Cat. No. 00CH37152). |
Martins, E.F.V. et al., "Design of an OS9 operating system extension for a message-passing multiprocessor," Microprocessors and Microsystems, IPC Business Press Ltd., London, GB, vol. 21, No. 9, Apr. 1, 1998, pp. 533-543. |
Gutierrez Garcia JJ et al. “Minimizing the effects of jitter in distributed hard real-time systems” Journal of Systems Architecture, Elsevier Science Publishers BV., Amsterdam, NL, vol. 41, No. 6/7. Dec. 15, 1996, pp. 431-447. |
International Search Report for PCT/US02/020402; Mailing date Apr. 3, 2003. |
International Search Report for PCT/US02/020403; Mailing date Jan. 27, 2003. |
International Search Report for PCT/US02/016364; Mailing date Feb. 14, 2003. |
International Search Report for PCT/US02/016371; Mailing date Aug. 18, 2003. |
Stolowitz Ford Cowger LLP Listing of Related Cases Feb. 4, 2011. |
Number | Date | Country | |
---|---|---|---|
Parent | 12698960 | Feb 2010 | US |
Child | 13010675 | US | |
Parent | 12024058 | Jan 2008 | US |
Child | 12698960 | US | |
Parent | 10985577 | Nov 2004 | US |
Child | 12024058 | US |