The present invention relates generally to a vehicle sensing system for a vehicle and, more particularly, to a vehicular sensing system that utilizes one or more radar sensors at a vehicle.
Use of radar sensors in vehicle imaging systems is common and known. Examples of such known systems are described in U.S. Pat. Nos. 6,587,186; 6,710,770 and/or 8,013,780, which are hereby incorporated herein by reference in their entireties.
The present invention provides a method for calibrating a driving assistance system or sensing system or control system for a vehicle that utilizes one or more radar sensors to sense regions exterior of the vehicle, with the radar sensors transmitting and receiving signals and with the received signals processed to detect the presence of objects at or near the vehicle in the field of sensing of the sensor. The method includes establishing at least one spherical radar reflector at a location exterior of a vehicle equipped with the vehicular sensing system and configuring the sensing system to enter a calibration mode. Once in the calibration mode, the method includes transmitting, by at least one transmitter of at least two radar sensors of the sensing system, calibration radio waves. The method also includes receiving, by receivers of the at least two radar sensors, reflected calibration radio waves, where the reflected calibration radio waves include calibration radio waves reflected off the spherical radar reflector. The method also includes, responsive to receiving the reflected calibration radio waves, calibrating, by a controller of the sensing system, the sensing system.
These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.
A vehicle sensing system and/or driver assist system and/or driving assist system and/or object detection system and/or alert system operates to capture sensing data exterior of the vehicle and may process the captured data to detect objects at or near the vehicle and in the predicted path of the vehicle, such as to assist a driver of the vehicle or a control for an autonomous vehicle in maneuvering the vehicle in a forward or rearward direction. The system includes a processor that is operable to receive sensing data from one or more sensors and provide an output, such as an alert or control of a vehicle system.
Referring now to the drawings and the illustrative embodiments depicted therein, a vehicle 10 (
Referring now to
The radar reflectors 20 are placed at locations P1-Pn around the vehicle 10. The reflectors 20 may be placed in areas where the fields of sensing or fields of view (FOVs) of equipped radar sensors 14 overlap. For example, the reflector at position P4 is placed in an area of overlap between the FOVs of sensor 14a and sensor 14b. The reflectors 20 may be positioned at any point around the vehicle in three dimensions (i.e., the x, y, and z dimensions) as appropriate for the positioning of the radar sensors 14. For example, if one or more radar sensors 14 are angled upward (i.e., away from the ground), the reflectors 20 may be placed higher above the ground relative to reflectors associated with radar sensors angled downward (i.e., toward the ground).
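By way of a simplified illustration of this placement constraint (the sensor mounting positions, boresight directions, field-of-sensing angles, and ranges below are hypothetical assumptions, not values from the described system), the following sketch checks whether a candidate reflector position lies within the overlapping fields of sensing of two radar sensors, each modeled as a cone defined by a mount position, a boresight direction, a half-angle, and a maximum range:

```python
import numpy as np

def in_fov(sensor_pos, boresight, half_angle_deg, max_range, point):
    """Return True if 'point' lies inside a sensor's conical field of sensing.

    sensor_pos, boresight, and point are 3-element arrays in vehicle
    coordinates; boresight need not be normalized.
    """
    v = np.asarray(point, float) - np.asarray(sensor_pos, float)
    dist = np.linalg.norm(v)
    if dist == 0 or dist > max_range:
        return False
    b = np.asarray(boresight, float)
    cos_angle = np.dot(v, b) / (dist * np.linalg.norm(b))
    return np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0))) <= half_angle_deg

# Hypothetical front-left and front-right corner radars (positions in meters).
sensor_a = dict(pos=[3.7,  0.8, 0.5], boresight=[1.0,  0.7, 0.0])
sensor_b = dict(pos=[3.7, -0.8, 0.5], boresight=[1.0, -0.7, 0.0])
reflector_p4 = [8.0, 0.0, 0.5]  # candidate reflector placement ahead of the vehicle

overlap = (in_fov(sensor_a["pos"], sensor_a["boresight"], 75, 50, reflector_p4) and
           in_fov(sensor_b["pos"], sensor_b["boresight"], 75, 50, reflector_p4))
print("Reflector visible to both sensors:", overlap)
```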
The positioning of the reflectors 20 allows for a sequence of operations to calibrate the sensing system 12. The system 12 may first be placed in a calibration mode. The calibration mode causes one or more transmitters of one or more of the sensors 14 equipped at the vehicle to transmit a fixed and predetermined transmission code known to all radar sensors 14. Each receiver of each sensor 14 equipped at the vehicle is placed into a listening mode. The transmitters transmit the fixed transmission code, and the code reflects off of one or more reflectors 20. The reflected code is then received by two or more receivers of sensors 14 across multiple receive channels, which initiates localization in the x, y, and z dimensions in each receiving sensor 14 that receives the transmission. In some examples, only a single transmitter transmits the fixed transmission code at a time, and the cycle of transmitting the fixed code and receiving the reflected fixed code is then repeated for each transmitter equipped at the vehicle. Each cycle may be repeated any number of times for a given position of each spherical reflector 20.
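A minimal sketch of the described transmit-and-listen sequencing follows, assuming a synthetic baseband representation in which the fixed transmission code is an arbitrary pseudo-random sequence and each receive channel estimates the code's arrival delay by matched filtering (cross-correlation); the channel names, code length, and noise model are illustrative assumptions rather than parameters of the described system:

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed, predetermined transmission code known to every radar sensor
# (here an arbitrary +/-1 pseudo-random sequence, purely for illustration).
CODE = rng.choice([-1.0, 1.0], size=127)

def simulate_reflection(code, delay_samples, snr=5.0):
    """Simulate one receive channel: the code reflected off a reflector,
    arriving after 'delay_samples', embedded in noise."""
    rx = np.zeros(1024)
    rx[delay_samples:delay_samples + code.size] += code
    rx += rng.normal(scale=1.0 / snr, size=rx.size)
    return rx

def estimate_delay(rx, code):
    """Matched filter: cross-correlate the received channel with the known
    code and take the lag of the correlation peak as the arrival delay."""
    corr = np.correlate(rx, code, mode="valid")
    return int(np.argmax(np.abs(corr)))

# One calibration cycle: each transmitter fires in turn while every
# receive channel of every listening sensor looks for the reflected code.
transmitters = ["tx_front_left", "tx_front_right"]
receive_channels = {"rx_front_left_0": 200, "rx_front_right_0": 212}  # true delays (samples)

for tx in transmitters:
    for rx_name, true_delay in receive_channels.items():
        rx = simulate_reflection(CODE, true_delay)
        print(tx, "->", rx_name, "estimated delay:", estimate_delay(rx, CODE))
```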
The sensing system 12 may determine the exact location (in all three dimensions), roll, pitch, and yaw of all sensors 14 based on the received fixed transmission code using long baseline (LBL) techniques. The system 12 may measure the distance from each sensor 14 to multiple spherical reflectors 20 (e.g., by measuring time of flight) and triangulate the position, roll, pitch, and yaw of each sensor based on the measured distances. Each spherical reflector 20 may be located at a known position relative to each sensor 14. The locations of the reflectors 20 may be moved during calibration. For example, a transmitter may transmit the fixed transmission code with the reflectors 20 in a first configuration and then may again transmit the fixed transmission code with the reflectors 20 in a second configuration. Different transmitters may transmit the fixed transmission code with the reflectors 20 in different configurations. Offsets (e.g., an amount the sensor is off from a nominal position or a determined misalignment) for each radar sensor 14 individually may be determined or calculated (e.g., based on the determined position versus the known position) and then stored. For example, the offsets may be stored in nonvolatile memory accessible by the control. The sensing system 12 may then use the stored offsets to maintain calibration of the sensors 14. That is, during normal operation of the vehicle, processing of data transmitted and/or captured by the radar sensors may be adjusted based on the respective calibration data.
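As a simplified numerical sketch of the long-baseline idea, the following estimates a sensor's position by least-squares multilateration from measured ranges to reflectors at known positions (position only, for brevity; recovering roll, pitch, and yaw would additionally use the baselines between a sensor's own receive antennas). The reflector layout, nominal mounting position, and Gauss-Newton solver are illustrative assumptions, not the described implementation:

```python
import numpy as np

def locate_sensor(reflector_positions, measured_ranges, guess, iters=20):
    """Estimate a sensor's 3-D position from ranges to reflectors at known
    positions, by Gauss-Newton least squares on the range residuals."""
    p = np.asarray(guess, float)
    refl = np.asarray(reflector_positions, float)
    r = np.asarray(measured_ranges, float)
    for _ in range(iters):
        diff = p - refl                      # (n, 3) vectors from reflectors to sensor
        dist = np.linalg.norm(diff, axis=1)  # predicted ranges
        residual = dist - r
        jac = diff / dist[:, None]           # d(range)/d(position)
        step, *_ = np.linalg.lstsq(jac, residual, rcond=None)
        p -= step
    return p

# Hypothetical reflector layout (meters, vehicle coordinates) and a sensor
# whose true mounting position we pretend not to know.
reflectors = [[5.0, 2.0, 0.5], [5.0, -2.0, 0.5], [8.0, 0.0, 1.0], [2.0, 3.0, 0.3]]
true_sensor = np.array([3.7, 0.8, 0.5])
ranges = np.linalg.norm(np.asarray(reflectors) - true_sensor, axis=1)  # e.g. from time of flight

estimate = locate_sensor(reflectors, ranges, guess=[0.0, 0.0, 0.0])
offset = estimate - np.array([3.6, 0.8, 0.5])  # deviation from a hypothetical nominal position
print("estimated position:", np.round(estimate, 3), "offset from nominal:", np.round(offset, 3))
```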
The positioning of the spherical reflectors 20 in areas of overlap of the fields of sensing of two or more sensors 14 allows the sensing system 12 to be calibrated and aligned. Positioning of the spherical reflectors in four or more areas of radar sensor overlap (around the vehicle, such as forward, rearward and sideward at each side of the vehicle) allows the entire radar sensor suite (the plurality of radar sensors of the radar sensing system) to be aligned. The system 12 may further use inputs from an inertial measurement unit (IMU) to actively compensate measured radar responses for dynamic changes in the pitch, roll, and yaw of the vehicle while in operation. Also, during operation, and via processing of captured radar data, the system may detect and/or identify roadway infrastructure (e.g., signs) to verify alignment of any cameras or other sensors also disposed at the vehicle. For example, the system may perform edge detection on captured image data and captured radar data and compare the orientations of the detected edges. The cameras or other sensors could also be aligned similarly during calibration, during service, or during repair (e.g., after a collision), as described above. For example, after a collision or other accident involving the equipped vehicle, one or more of the sensors may be misaligned. Recalibration of the system may determine new offsets and replace the previously stored offsets with the new offsets in order to compensate for or accommodate the determined misalignment of one or more of the sensors.
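A minimal sketch of the kind of IMU-based compensation described, assuming the IMU reports roll, pitch, and yaw and that radar detections are rotated from the vehicle body frame into a level reference frame; the Z-Y-X rotation convention, function names, and example values are assumptions:

```python
import numpy as np

def rotation_matrix(roll, pitch, yaw):
    """Body-to-level rotation built from roll, pitch, yaw (radians), Z-Y-X convention."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return rz @ ry @ rx

def compensate_detections(detections_xyz, imu_roll, imu_pitch, imu_yaw):
    """Rotate radar detections (rows of x, y, z in the vehicle body frame)
    into a level frame using the attitude reported by the IMU."""
    R = rotation_matrix(imu_roll, imu_pitch, imu_yaw)
    return detections_xyz @ R.T

# Example: the vehicle momentarily pitches 2 degrees; uncompensated, a detection
# that is level with the sensor appears displaced in the body frame.
detections = np.array([[20.0, 0.0, 0.0], [15.0, 3.0, 0.2]])
compensated = compensate_detections(detections, imu_roll=0.0,
                                    imu_pitch=np.radians(2.0), imu_yaw=0.0)
print(np.round(compensated, 3))
```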
Thus, the sensing system 12 may be calibrated by placing spherical reflectors 20 in positions that overlap the fields of view of two or more radar sensors 14. Transmitters of the sensors 14 transmit a fixed transmission code that is known to all receivers. The receivers of each sensor 14 are placed in a listening mode, and two or more receivers at a time receive the fixed transmission code reflected off of one or more spherical reflectors. The received fixed transmission codes are used (via long baseline techniques) to triangulate the position, roll, pitch, and yaw of each sensor 14. In response to determination of a misalignment of one or more of the sensors, the system calibrates the sensor or the system to correct or accommodate the determined misalignment (such as by adjusting processing of the data captured by the misaligned sensor to accommodate the offset or misalignment of the sensor's position, roll, pitch and/or yaw at the vehicle).
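The following sketch illustrates, under assumed data structures, how stored per-sensor offsets (a determined mounting position and small roll/pitch/yaw misalignments read back from nonvolatile memory) might be applied to map sensor-frame detections into the vehicle frame during normal operation; the offset format and the small-angle correction are illustrative assumptions, not the described implementation:

```python
import numpy as np

def offset_rotation(droll, dpitch, dyaw):
    """Small-angle rotation correcting a sensor's determined roll/pitch/yaw
    misalignment (radians); valid only for small offsets."""
    return np.array([[1.0,    -dyaw,  dpitch],
                     [dyaw,    1.0,  -droll],
                     [-dpitch, droll, 1.0]])

def apply_calibration(detections_sensor, stored_offset):
    """Map detections from a (misaligned) sensor frame into the vehicle frame
    using offsets determined and stored during calibration."""
    R = offset_rotation(stored_offset["droll"],
                        stored_offset["dpitch"],
                        stored_offset["dyaw"])
    t = np.asarray(stored_offset["position"])  # determined sensor position in the vehicle frame
    return detections_sensor @ R.T + t

# Hypothetical stored offsets for one corner radar (e.g., as read back from
# nonvolatile memory after calibration).
stored = dict(position=[3.7, 0.8, 0.5],
              droll=0.0, dpitch=np.radians(0.5), dyaw=np.radians(-1.2))

detections = np.array([[12.0, 1.0, 0.0], [30.0, -4.0, 0.6]])  # sensor-frame x, y, z (meters)
print(np.round(apply_calibration(detections, stored), 3))
```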
Thus, to calibrate the sensing system, the vehicle may be positioned at a calibration area or facility, where the spherical radar reflectors are disposed so as to at least partially surround the vehicle and so that each radar reflector is located where the fields of sensing of at least two radar sensors overlap. The calibration area may be any area where the vehicle can be parked (e.g., at a testing facility or a manufacturing facility, such as at an end-of-line testing station at the vehicle manufacturing plant or the like), and where the spherical radar reflectors can be placed or located at spaced apart locations at least partially around the vehicle. When the vehicle is located relative to the spherical radar reflectors in this manner, the system enters the calibration mode and at least one transmitter transmits the radio signal that is reflected off one or more of the spherical radar reflectors and received by the receiving antennae of at least two of the radar sensors. Because the system knows the location of each spherical radar reflector relative to the vehicle and relative to the radar sensors, and because at least two radar sensors receive the transmitted signal reflected off each spherical radar reflector, the system calibrates the sensors by determining misalignment of the radar sensors via processing of output data from the radar sensors that are sensing the spherical radar reflector(s). The system may perform an initial calibration for the vehicle (e.g., after manufacturing) or may perform a subsequent recalibration (e.g., after a collision).
The system may utilize sensors, such as radar or lidar sensors or the like. The sensing system may utilize aspects of the systems described in U.S. Pat. Nos. 9,753,121; 9,689,967; 9,599,702; 9,575,160; 9,146,898; 9,036,026; 8,027,029; 8,013,780; 6,825,455; 7,053,357; 7,408,627; 7,405,812; 7,379,163; 7,379,100; 7,375,803; 7,352,454; 7,340,077; 7,321,111; 7,310,431; 7,283,213; 7,212,663; 7,203,356; 7,176,438; 7,157,685; 6,919,549; 6,906,793; 6,876,775; 6,710,770; 6,690,354; 6,678,039; 6,674,895 and/or 6,587,186, and/or International Publication Nos. WO 2018/007995 and/or WO 2011/090484, and/or U.S. Publication Nos. US-2018-0231635; US-2018-0045812; US-2018-0015875; US-2017-0356994; US-2017-0315231; US-2017-0276788; US-2017-0254873; US-2017-0222311 and/or US-2010-0245066, which are hereby incorporated herein by reference in their entireties.
Changes and modifications in the specifically described embodiments can be carried out without departing from the principles of the invention, which is intended to be limited only by the scope of the appended claims, as interpreted according to the principles of patent law including the doctrine of equivalents.
The present application claims the filing benefits of U.S. provisional application Ser. No. 62/810,453, filed Feb. 26, 2019, which is hereby incorporated herein by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
5949331 | Schofield et al. | Sep 1999 | A |
6587186 | Bamji et al. | Jul 2003 | B2 |
6674895 | Rafii et al. | Jan 2004 | B2 |
6690268 | Schofield et al. | Feb 2004 | B2 |
6693517 | McCarthy et al. | Feb 2004 | B2 |
6710770 | Tomasi et al. | Mar 2004 | B2 |
6906793 | Bamji et al. | Jun 2005 | B2 |
6919549 | Bamji et al. | Jul 2005 | B2 |
7157685 | Bamji et al. | Jan 2007 | B2 |
7176438 | Bamji et al. | Feb 2007 | B2 |
7212663 | Tomasi | May 2007 | B2 |
7283213 | O'Connor et al. | Oct 2007 | B2 |
7321111 | Bamji et al. | Jan 2008 | B2 |
7340077 | Gokturk et al. | Mar 2008 | B2 |
7352454 | Bamji et al. | Apr 2008 | B2 |
7375803 | Bamji | May 2008 | B1 |
7379100 | Gokturk et al. | May 2008 | B2 |
7379163 | Rafii et al. | May 2008 | B2 |
7405812 | Bamji | Jul 2008 | B1 |
7408627 | Bamji et al. | Aug 2008 | B2 |
7580795 | McCarthy et al. | Aug 2009 | B2 |
8013780 | Lynam | Sep 2011 | B2 |
8027029 | Lu et al. | Sep 2011 | B2 |
8698894 | Briggance | Apr 2014 | B2 |
8855849 | Ferguson et al. | Oct 2014 | B1 |
9036026 | Dellantoni et al. | May 2015 | B2 |
9146898 | Ihlenburg et al. | Sep 2015 | B2 |
9279882 | Hukkeri | Mar 2016 | B2 |
9575160 | Davis et al. | Feb 2017 | B1 |
9599702 | Bordes et al. | Mar 2017 | B1 |
9689967 | Stark et al. | Jun 2017 | B1 |
9753121 | Davis et al. | Sep 2017 | B1 |
10852418 | Wodrich et al. | Dec 2020 | B2 |
20050267683 | Fujiwara et al. | Dec 2005 | A1 |
20080169963 | White et al. | Jul 2008 | A1 |
20100001897 | Lyman | Jan 2010 | A1 |
20100106356 | Trepagnier | Apr 2010 | A1 |
20100245066 | Sarioglu et al. | Sep 2010 | A1 |
20120062743 | Lynam et al. | Mar 2012 | A1 |
20120218412 | Dellantoni et al. | Aug 2012 | A1 |
20130063257 | Schwindt et al. | Mar 2013 | A1 |
20130215271 | Lu | Aug 2013 | A1 |
20130222592 | Gieseke | Aug 2013 | A1 |
20130241766 | Kishigami et al. | Sep 2013 | A1 |
20140062762 | Kurono et al. | Mar 2014 | A1 |
20140218529 | Mahmoud et al. | Aug 2014 | A1 |
20140375476 | Johnson et al. | Dec 2014 | A1 |
20150124096 | Koravadi | May 2015 | A1 |
20150158499 | Koravadi | Jun 2015 | A1 |
20150251599 | Koravadi | Sep 2015 | A1 |
20150352953 | Koravadi | Dec 2015 | A1 |
20160036917 | Koravadi et al. | Feb 2016 | A1 |
20160116573 | Appia et al. | Apr 2016 | A1 |
20160161602 | Prokhorov | Jun 2016 | A1 |
20160210853 | Koravadi | Jul 2016 | A1 |
20160291146 | Wang et al. | Oct 2016 | A1 |
20170129489 | Pawlicki et al. | May 2017 | A1 |
20170222311 | Hess et al. | Aug 2017 | A1 |
20170254873 | Koravadi | Sep 2017 | A1 |
20170276788 | Wodrich | Sep 2017 | A1 |
20170315231 | Wodrich | Nov 2017 | A1 |
20170328997 | Silverstein et al. | Nov 2017 | A1 |
20170356994 | Wodrich et al. | Dec 2017 | A1 |
20180015875 | May et al. | Jan 2018 | A1 |
20180045812 | Hess | Feb 2018 | A1 |
20180065623 | Wodrich et al. | Mar 2018 | A1 |
20180067194 | Wodrich et al. | Mar 2018 | A1 |
20180105176 | Pawlicki et al. | Apr 2018 | A1 |
20180231635 | Woehlte | Aug 2018 | A1 |
20180231657 | Woehlte | Aug 2018 | A1 |
20180299533 | Pliefke et al. | Oct 2018 | A1 |
20180372841 | Hieida | Dec 2018 | A1 |
20190061760 | Pawlicki et al. | Feb 2019 | A1 |
20190072666 | Duque Biarge et al. | Mar 2019 | A1 |
20190072667 | Duque Biarge et al. | Mar 2019 | A1 |
20190072668 | Duque Biarge et al. | Mar 2019 | A1 |
20190072669 | Duque Biarge et al. | Mar 2019 | A1 |
20190217775 | May et al. | Jul 2019 | A1 |
20190339382 | Hess et al. | Nov 2019 | A1 |
Number | Date | Country | |
---|---|---|---|
20200271755 A1 | Aug 2020 | US |
Number | Date | Country | |
---|---|---|---|
62810453 | Feb 2019 | US |