The present invention relates generally to a vehicle sensing system for a vehicle and, more particularly, to a vehicle sensing system that utilizes one or more radar sensors at a vehicle.
Use of radar sensors in vehicle imaging systems is common and known. Examples of such known systems are described in U.S. Pat. Nos. 6,587,186; 6,710,770 and/or 8,013,780, which are hereby incorporated herein by reference in their entireties.
Implementations herein provide a method for calibrating a driving assistance system or sensing system or control system for a vehicle that utilizes one or more radar sensors to sense regions exterior of the vehicle, with each radar sensor transmitting and receiving signals, and with the received signals processed to detect the presence of objects at or near the vehicle in the field of sensing of the sensor. The method includes disposing a spherical radar reflector at a calibration location exterior of a vehicle equipped with the vehicular sensing system. The vehicular sensing system of the equipped vehicle includes at least two radar sensors disposed at the vehicle so as to have respective fields of sensing exterior of the vehicle. Each radar sensor of the at least two radar sensors includes a plurality of transmitting antennas that, during operation of the respective radar sensor, transmit radio frequency (RF) signals. Each radar sensor of the at least two radar sensors includes a plurality of receiving antennas that receive RF signals. The received RF signals are transmitted RF signals that are reflected from an object present in the field of sensing of the respective radar sensor. The vehicular sensing system at the equipped vehicle includes an electronic control unit (ECU) that includes a data processor that processes radar data derived from RF signals received by the plurality of receiving antennas of the at least two radar sensors and provided to the ECU. With the spherical radar reflector disposed at the calibration location exterior of the vehicle, the calibration location of the spherical radar reflector relative to the vehicle includes a location where the fields of sensing of a first radar sensor of the at least two radar sensors and a second radar sensor of the at least two radar sensors overlap.
The method includes transmitting first calibration RF signals by at least one of the plurality of transmitting antennas of the first radar sensor and receiving, by the plurality of receiving antennas of the first radar sensor, reflected first calibration RF signals. The reflected first calibration RF signals at least include the first calibration RF signals transmitted by the at least one of the plurality of transmitting antennas of the first radar sensor and reflected off the spherical radar reflector. The method includes determining a first distance between the first radar sensor and the spherical radar reflector based on the received reflected first calibration RF signals and transmitting second calibration RF signals by at least one of the plurality of transmitting antennas of the second radar sensor. The method also includes receiving, by the plurality of receiving antennas of the second radar sensor, reflected second calibration RF signals. The reflected second calibration RF signals at least include the second calibration RF signals transmitted by the at least one of the plurality of transmitting antennas of the second radar sensor and reflected off the spherical radar reflector. The method includes determining a second distance between the second radar sensor and the spherical radar reflector based on the received reflected second calibration RF signals and transmitting third calibration RF signals by at least one of the plurality of transmitting antennas of the first radar sensor. The method includes receiving, by the plurality of receiving antennas of the second radar sensor, reflected third calibration RF signals. The reflected third calibration RF signals at least include the third calibration RF signals transmitted by the at least one of the plurality of transmitting antennas of the first radar sensor and reflected off the spherical radar reflector.
The method includes determining, based on the received reflected third calibration RF signals, a radar signal path distance from the first radar sensor to the second radar sensor via the spherical radar reflector and determining a location and an orientation of the first radar sensor relative to the vehicle and a location and an orientation of the second radar sensor relative to the vehicle based on the determined first distance, the determined second distance, and the determined radar signal path distance.
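The geometry underlying these three measurements can be sketched in a simplified 2D form, with the first sensor at the origin and the second sensor on the x-axis. This is an illustrative sketch, not the claimed method: all values are hypothetical, and the point-reflector approximation (bistatic path equals the sum of the two monostatic ranges) is an assumption.

```python
import math

def reflector_position(d1: float, d2: float, baseline: float):
    """Intersect the two monostatic range circles to locate the
    reflector in 2D (first sensor at origin, second at (baseline, 0))."""
    x = (d1**2 - d2**2 + baseline**2) / (2 * baseline)
    y_sq = d1**2 - x**2
    if y_sq < 0:
        raise ValueError("ranges inconsistent with baseline")
    return x, math.sqrt(y_sq)  # take the solution above the baseline

def bistatic_consistent(d12: float, d1: float, d2: float, tol: float = 0.05):
    """For a point-like spherical reflector, the bistatic path distance
    (first sensor -> reflector -> second sensor) should equal d1 + d2."""
    return abs(d12 - (d1 + d2)) <= tol

# Hypothetical values (meters): reflector truly at (1.2, 1.6), baseline 3.0
d1 = math.hypot(1.2, 1.6)        # monostatic range, first sensor
d2 = math.hypot(3.0 - 1.2, 1.6)  # monostatic range, second sensor
x, y = reflector_position(d1, d2, 3.0)
print(round(x, 3), round(y, 3))              # 1.2 1.6
print(bistatic_consistent(d1 + d2, d1, d2))  # True
```

In this simplified form, the two monostatic ranges fix the reflector on the intersection of two circles, while the bistatic path distance constrains an ellipse with the two sensors at its foci; together the three measurements over-determine the relative geometry, which is what allows the sensor locations and orientations to be solved.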
These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.
A vehicle sensing system and/or driver assist system and/or driving assist system and/or object detection system and/or alert system operates to capture sensing data exterior of the vehicle and may process the captured data to detect objects at or near the vehicle and in the predicted path of the vehicle, such as to assist a driver of the vehicle or a control for an autonomous vehicle in maneuvering the vehicle in a forward or rearward direction. The system includes a processor that is operable to receive sensing data from one or more sensors and provide an output, such as an alert or control of a vehicle system.
Referring now to the drawings and the illustrative embodiments depicted therein, a vehicle 10 (
Referring now to
The radar reflectors 20 are placed at locations P1-Pn around the vehicle 10. The reflectors 20 may be placed in areas where the fields of sensing or fields of view (FOVs) of equipped radar sensors 14 overlap. For example, the reflector at position P3 is placed in an area of overlap between the FOVs of sensor 14a and sensor 14b (i.e., both sensors receive signals reflected from the reflector). The reflectors 20 may be positioned at any point around the vehicle in three dimensions (i.e., the x, y, and z dimensions) as appropriate for the positioning of the radar sensors 14. For example, if one or more radar sensors 14 are angled upward (i.e., away from the ground), the reflectors 20 may be placed higher above the ground relative to reflectors associated with radar sensors angled downward (i.e., towards the ground).
The positioning of the reflectors 20 allows for a sequence of operations to calibrate the sensing system 12. The system 12 may first be placed in a calibration mode (e.g., via a user input). The calibration mode causes one or more transmitters of one or more of the sensors 14 equipped at the vehicle to transmit a fixed and predetermined transmission code known to all radar sensors 14. In the calibration mode, each receiver of each sensor 14 equipped at the vehicle is placed into a listening mode. The transmitters transmit the fixed transmission code, and the code reflects off of one or more reflectors 20. The reflected code is then received by two or more receivers of sensors 14 across multiple receive channels, which initiates localization in the x, y, and z dimensions (in each receiving sensor 14 that receives the transmission) for calibration. In some examples, only a single transmitter transmits the fixed transmission code at a time, and then the cycle of transmitting the fixed code and receivers receiving the reflected fixed code is repeated for each transmitter equipped at the vehicle. Each cycle may be repeated any number of times for a given position of each spherical reflector 20.
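The round-robin sequencing described above (all receivers listening, one transmitter sending the fixed code at a time, cycle repeated per transmitter) can be sketched as follows. All names (`Sensor`, `FIXED_CODE`, the antenna counts) are illustrative assumptions, not from the source:

```python
FIXED_CODE = 0b1011001110  # predetermined code known to all radar sensors

class Sensor:
    def __init__(self, name: str, n_tx: int = 2, n_rx: int = 4):
        self.name = name
        self.n_tx = n_tx            # transmitting antennas
        self.n_rx = n_rx            # receiving antennas
        self.listening = False
        self.detections = []        # (tx_sensor_name, tx_antenna) pairs heard

    def enter_listening_mode(self):
        self.listening = True

def run_calibration_cycle(sensors, repeats: int = 1):
    """One transmitter at a time sends FIXED_CODE while every receiver
    listens; the cycle repeats for each transmitter on each sensor."""
    for s in sensors:
        s.enter_listening_mode()
    for _ in range(repeats):
        for tx_sensor in sensors:
            for tx_antenna in range(tx_sensor.n_tx):
                # The reflected code is received by every listening sensor
                # whose field of sensing covers a shared reflector (assumed
                # here to be all sensors, for simplicity).
                for rx_sensor in sensors:
                    if rx_sensor.listening:
                        rx_sensor.detections.append((tx_sensor.name, tx_antenna))

sensors = [Sensor("front-left"), Sensor("front-right")]
run_calibration_cycle(sensors)
print(len(sensors[0].detections))  # 4: 2 antennas x 2 sensors, one cycle
```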
The sensing system 12 may determine the exact location in all three dimensions (i.e., x, y, and z) along with any roll, pitch, and yaw of all sensors 14 based on the received fixed transmission code using long baseline (LBL) techniques. The system 12 may measure the distance from each sensor 14 to multiple spherical reflectors 20 (e.g., by measuring time of flight) and triangulate the position, roll, pitch, and/or yaw of each sensor based on the measured distances. Each spherical reflector 20 may be located at a known position from each sensor 14 and/or each other reflector 20. The locations of the reflectors 20 may be moved during calibration. For example, a transmitter may transmit the fixed transmission code with the reflectors 20 in a first configuration and then may again transmit the fixed transmission code with the reflectors 20 in a second configuration. Different transmitters may transmit the fixed transmission codes with the reflectors 20 in different configurations. Offsets (e.g., an amount the sensor is off from a nominal position or a determined misalignment) for each radar sensor 14 individually may be determined or calculated (e.g., based on the determined position vs. the known position) and then stored. For example, the offsets may be stored in nonvolatile memory accessible by the control. The sensing system 12 may then use the stored offsets to maintain calibration of the sensors 14. That is, during normal operation of the vehicle, processing of data transmitted and/or captured by the radar sensors may be adjusted based on respective calibration data.
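The long-baseline idea of the paragraph above can be sketched in a simplified 2D form: convert round-trip time of flight to range, solve for the sensor position from its ranges to three reflectors at known positions, and store the offset from the nominal mounting position. This is an illustrative sketch under stated assumptions (2D, exact ranges, hypothetical coordinates), not the claimed procedure:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def range_from_tof(round_trip_seconds: float) -> float:
    """Monostatic range: the signal travels out to the reflector and back."""
    return C * round_trip_seconds / 2.0

def locate_sensor_2d(reflectors, ranges):
    """Solve for the sensor (x, y) from three known reflector positions and
    measured ranges by subtracting the circle equations, which linearizes
    the system into two equations in two unknowns."""
    (x1, y1), (x2, y2), (x3, y3) = reflectors
    r1, r2, r3 = ranges
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return x, y

# Hypothetical setup: reflectors at known points, sensor truly at (0.5, 0.2)
reflectors = [(3.0, 0.0), (0.0, 3.0), (-3.0, -1.0)]
truth = (0.5, 0.2)
ranges = [math.dist(truth, p) for p in reflectors]
x, y = locate_sensor_2d(reflectors, ranges)
nominal = (0.5, 0.0)                       # nominal mounting position
offset = (x - nominal[0], y - nominal[1])  # stored for later compensation
print(round(x, 3), round(y, 3))  # 0.5 0.2
```

With noisy measurements or more than three reflectors, the same linearized system would be solved in a least-squares sense; solving full 3D position plus roll, pitch, and yaw follows the same pattern with more unknowns and more reflector observations.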
The positioning of the spherical reflectors 20 in areas of overlap of the fields of sensing of two or more sensors 14 allows the sensing system 12 to be calibrated and aligned. Positioning of the spherical reflectors in four or more areas of radar sensor overlap (around the vehicle, such as forward, rearward and sideward at each side of the vehicle) allows the entire radar sensor suite (i.e., the plurality of radar sensors of the radar sensing system) to be aligned. The system 12 may further include inputs from an inertial measurement unit (IMU) or other sensor to actively compensate measured radar responses to dynamic changes in the pitch, roll, and yaw of the vehicle while in operation. Additionally or alternatively, during operation, and via processing of captured radar data, the system may detect and/or identify roadway infrastructure (e.g., signs) to verify alignment of any cameras or other sensors also disposed at the vehicle. For example, the system may perform edge detection on captured image data and captured radar data and compare orientations of the edges. The cameras or other sensors could also be aligned similarly during calibration, during service, or during repair (e.g., after a collision), as described above. For example, after a collision or other accident involving the equipped vehicle, one or more of the sensors may be misaligned. Recalibration of the system may determine new offsets and replace the previously stored offsets with the new offsets in order to compensate for or accommodate the determined misalignment of one or more of the sensors.
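The edge-orientation cross-check mentioned above (comparing edges detected in captured image data against edges detected in captured radar data, e.g., the border of a road sign) can be sketched as a simple angular comparison. The tolerance and all values here are hypothetical assumptions for illustration:

```python
def angular_difference_deg(theta_a: float, theta_b: float) -> float:
    """Smallest absolute difference between two edge orientations,
    treating directions 180 degrees apart as the same edge."""
    d = abs(theta_a - theta_b) % 180.0
    return min(d, 180.0 - d)

def alignment_ok(camera_edges_deg, radar_edges_deg, tol_deg: float = 2.0):
    """Pair corresponding edge orientations from the two sensors and
    verify each pair agrees within the tolerance."""
    return all(
        angular_difference_deg(c, r) <= tol_deg
        for c, r in zip(camera_edges_deg, radar_edges_deg)
    )

# Hypothetical edge orientations (degrees) for three matched features
print(alignment_ok([0.0, 45.0, 90.0], [0.5, 44.2, 91.0]))  # True
print(alignment_ok([0.0, 45.0, 90.0], [6.0, 44.2, 91.0]))  # False
```

A persistent angular disagreement on matched features would indicate that one of the sensors is misaligned and that recalibration (and determination of new offsets) is warranted.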
Referring now to
The system 12 determines offsets between each sensor's own 3D coordinate system and the vehicle coordinate system (i.e., the pitch, roll, and height of the first sensor 14 (“S1” in this example) and the second sensor 14 (“S2” in this example)). The system 12 also determines locations of each sensor 14 and the spherical reflector 20 in the field of view of the sensors 14. That is, the system determines S1h(xS1, yS1, zS1), S2h(xS2, yS2, zS2), and P1h(xP1, yP1, zP1).
The sensing system 12 measures, in one or more monostatic operations (see
Referring now to
Thus, the sensing system 12 may be calibrated by placing spherical reflectors 20 in positions that overlap the field of view of two or more radar sensors 14. Transmitters from the sensors 14 transmit a fixed transmission code that is known to all receivers. The receivers of each sensor 14 are placed in a listen mode and two or more receivers at a time receive the fixed transmission code reflected off of one or more spherical reflectors. The received fixed transmission codes are used (via long baseline techniques) to triangulate the position, roll, pitch, and yaw of each sensor 14. In response to determination of a misalignment of one or more of the sensors, the system calibrates the sensor or the system to correct or accommodate a determined misalignment (such as via adjusting processing of the data captured by the determined misaligned sensor to accommodate the offset or misalignment of the sensor's position, roll, pitch and/or yaw at the vehicle).
Thus, to calibrate the sensing system, the vehicle may be positioned at a calibration area or facility, where the spherical radar reflectors are disposed so as to at least partially surround the vehicle and so that each radar reflector is located at locations where fields of sensing of at least two radar sensors overlap. The calibration area may be any area where the vehicle can be parked (e.g., at a testing facility or a manufacturing facility, such as at an end of line testing station or facility at the vehicle manufacturing plant or the like), and where the spherical radar reflectors can be placed or located at spaced apart locations at least partially around the vehicle. When the vehicle is located relative to the spherical radar reflectors in this manner, the system enters the calibration mode and at least one transmitter transmits the radio signal that is reflected off one or more of the spherical radar reflectors and received by the receiving antennas of at least two of the radar sensors. Because the system knows the location of the spherical radar reflector relative to the vehicle and relative to the radar sensors, and because at least two radar sensors receive the transmitted signal reflected off each spherical radar reflector, the system calibrates the sensors by determining misalignment of the radar sensors by processing output data from the radar sensors that are sensing the spherical radar reflector(s). The system may perform an initial calibration for the vehicle (e.g., after manufacturing) or may perform a subsequent recalibration (e.g., after a collision).
The sensors comprise radar sensors or imaging radar sensors, and optionally aspects described herein may be implemented with lidar sensors or the like. The sensing system may utilize aspects of the systems described in U.S. Pat. Nos. 10,866,306; 9,954,955; 9,869,762; 9,753,121; 9,689,967; 9,599,702; 9,575,160; 9,146,898; 9,036,026; 8,027,029; 8,013,780; 7,408,627; 7,405,812; 7,379,163; 7,379,100; 7,375,803; 7,352,454; 7,340,077; 7,321,111; 7,310,431; 7,283,213; 7,212,663; 7,203,356; 7,176,438; 7,157,685; 7,053,357; 6,919,549; 6,906,793; 6,876,775; 6,710,770; 6,690,354; 6,678,039; 6,674,895 and/or 6,587,186, and/or U.S. Publication Nos. US-2019-0339382; US-2018-0231635; US-2018-0045812; US-2018-0015875; US-2017-0356994; US-2017-0315231; US-2017-0276788; US-2017-0254873; US-2017-0222311 and/or US-2010-0245066, which are hereby incorporated herein by reference in their entireties.
Changes and modifications in the specifically described embodiments can be carried out without departing from the principles of the invention, which is intended to be limited only by the scope of the appended claims, as interpreted according to the principles of patent law including the doctrine of equivalents.
The present application claims the filing benefits of U.S. provisional application Ser. No. 63/199,798, filed Jan. 26, 2021, which is hereby incorporated herein by reference in its entirety. The present application also is a continuation-in-part of U.S. patent application Ser. No. 16/801,605, filed Feb. 26, 2020, which claims the filing benefits of U.S. provisional application Ser. No. 62/810,453, filed Feb. 26, 2019, which are both hereby incorporated herein by reference in their entireties.
Number | Name | Date | Kind |
---|---|---|---|
5949331 | Schofield et al. | Sep 1999 | A |
6587186 | Bamji et al. | Jul 2003 | B2 |
6674895 | Rafii et al. | Jan 2004 | B2 |
6678039 | Charbon | Jan 2004 | B2 |
6690268 | Schofield et al. | Feb 2004 | B2 |
6690354 | Sze | Feb 2004 | B2 |
6693517 | McCarthy et al. | Feb 2004 | B2 |
6710770 | Tomasi et al. | Mar 2004 | B2 |
6876775 | Torunoglu | Apr 2005 | B2 |
6906793 | Bamji et al. | Jun 2005 | B2 |
6919549 | Bamji et al. | Jul 2005 | B2 |
7053357 | Schwarte | May 2006 | B2 |
7157685 | Bamji et al. | Jan 2007 | B2 |
7176438 | Bamji et al. | Feb 2007 | B2 |
7203356 | Gokturk et al. | Apr 2007 | B2 |
7212663 | Tomasi | May 2007 | B2 |
7283213 | O'Connor et al. | Oct 2007 | B2 |
7310431 | Gokturk et al. | Dec 2007 | B2 |
7321111 | Bamji et al. | Jan 2008 | B2 |
7340077 | Gokturk et al. | Mar 2008 | B2 |
7352454 | Bamji et al. | Apr 2008 | B2 |
7375803 | Bamji | May 2008 | B1 |
7379100 | Gokturk et al. | May 2008 | B2 |
7379163 | Rafii et al. | May 2008 | B2 |
7405812 | Bamji | Jul 2008 | B1 |
7408627 | Bamji et al. | Aug 2008 | B2 |
7580795 | McCarthy et al. | Aug 2009 | B2 |
8013780 | Lynam | Sep 2011 | B2 |
8027029 | Lu et al. | Sep 2011 | B2 |
8698894 | Briggance | Apr 2014 | B2 |
8855849 | Ferguson et al. | Oct 2014 | B1 |
9036026 | Dellantoni et al. | May 2015 | B2 |
9146898 | Ihlenburg et al. | Sep 2015 | B2 |
9279882 | Hukkeri et al. | Mar 2016 | B2 |
9575160 | Davis et al. | Feb 2017 | B1 |
9599702 | Bordes et al. | Mar 2017 | B1 |
9689967 | Stark et al. | Jun 2017 | B1 |
9753121 | Davis et al. | Sep 2017 | B1 |
9869762 | Alland et al. | Jan 2018 | B1 |
9954955 | Davis et al. | Apr 2018 | B2 |
10852418 | Wodrich et al. | Dec 2020 | B2 |
10866306 | Maher et al. | Dec 2020 | B2 |
11275175 | Wodrich et al. | Mar 2022 | B2 |
11333739 | Wodrich et al. | May 2022 | B2 |
20050267683 | Fujiwara et al. | Dec 2005 | A1 |
20080169963 | White et al. | Jul 2008 | A1 |
20100001897 | Lyman | Jan 2010 | A1 |
20100106356 | Trepagnier et al. | Apr 2010 | A1 |
20100245066 | Sarioglu et al. | Sep 2010 | A1 |
20120062743 | Lynam et al. | Mar 2012 | A1 |
20120218412 | Dellantoni et al. | Aug 2012 | A1 |
20130063257 | Schwindt et al. | Mar 2013 | A1 |
20130215271 | Lu | Aug 2013 | A1 |
20130222592 | Gieseke | Aug 2013 | A1 |
20130241766 | Kishigami et al. | Sep 2013 | A1 |
20140062762 | Kurono et al. | Mar 2014 | A1 |
20140218529 | Mahmoud et al. | Aug 2014 | A1 |
20140375476 | Johnson et al. | Dec 2014 | A1 |
20150124096 | Koravadi | May 2015 | A1 |
20150158499 | Koravadi | Jun 2015 | A1 |
20150251599 | Koravadi | Sep 2015 | A1 |
20150352953 | Koravadi | Dec 2015 | A1 |
20160036917 | Koravadi et al. | Feb 2016 | A1 |
20160116573 | Appia et al. | Apr 2016 | A1 |
20160161602 | Prokhorov | Jun 2016 | A1 |
20160210853 | Koravadi | Jul 2016 | A1 |
20160291146 | Wang et al. | Oct 2016 | A1 |
20170129489 | Pawlicki et al. | May 2017 | A1 |
20170222311 | Hess et al. | Aug 2017 | A1 |
20170254873 | Koravadi | Sep 2017 | A1 |
20170276788 | Wodrich | Sep 2017 | A1 |
20170315231 | Wodrich | Nov 2017 | A1 |
20170328997 | Silverstein et al. | Nov 2017 | A1 |
20170356994 | Wodrich et al. | Dec 2017 | A1 |
20180015875 | May et al. | Jan 2018 | A1 |
20180045812 | Hess | Feb 2018 | A1 |
20180065623 | Wodrich et al. | Mar 2018 | A1 |
20180067194 | Wodrich et al. | Mar 2018 | A1 |
20180105176 | Pawlicki et al. | Apr 2018 | A1 |
20180231635 | Woehlte | Aug 2018 | A1 |
20180231657 | Woehlte | Aug 2018 | A1 |
20180299533 | Pliefke et al. | Oct 2018 | A1 |
20180372841 | Hieida et al. | Dec 2018 | A1 |
20190061760 | Pawlicki et al. | Feb 2019 | A1 |
20190072666 | Duque Biarge et al. | Mar 2019 | A1 |
20190072667 | Duque Biarge et al. | Mar 2019 | A1 |
20190072668 | Duque Biarge et al. | Mar 2019 | A1 |
20190072669 | Duque Biarge et al. | Mar 2019 | A1 |
20190217775 | May et al. | Jul 2019 | A1 |
20190339382 | Hess et al. | Nov 2019 | A1 |
20200233073 | Olbrich | Jul 2020 | A1 |
20200271755 | Wodrich et al. | Aug 2020 | A1 |
Number | Date | Country | |
---|---|---|---|
20220146626 A1 | May 2022 | US |
Number | Date | Country | |
---|---|---|---|
63199798 | Jan 2021 | US | |
62810453 | Feb 2019 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 16801605 | Feb 2020 | US |
Child | 17648835 | US |