The present invention relates generally to a vehicle sensing system for a vehicle and, more particularly, to a vehicle sensing system that utilizes one or more sensors at a vehicle to provide a field of sensing at or around the vehicle.
Use of imaging sensors or ultrasonic sensors or radar sensors in vehicle sensing systems is common and known. Examples of such known systems are described in U.S. Pat. Nos. 8,013,780 and 5,949,331 and/or U.S. publication No. US-2010-0245066 and/or International Publication No. WO 2011/090484, which are hereby incorporated herein by reference in their entireties.
The present invention provides a driver assistance system or sensing system for a vehicle that utilizes a sensor module or system disposed at the vehicle and comprising at least one radar sensor disposed at the vehicle and having a field of sensing exterior of the vehicle. The at least one radar sensor comprises multiple transmitting (Tx) antennas (transmitters) and receiving (Rx) antennas (receivers) to provide high definition, fine resolution in azimuth and/or elevation to determine high definition radar reflection responses for objects and surfaces detected by the system. The system includes a control, where outputs (such as radar data acquisitions of multiple scans) of the at least one radar sensor are communicated to the control, and where the control, responsive to the outputs of the at least one radar sensor, determines different types of surfaces at or near the equipped vehicle or on which the equipped vehicle is traveling. The system also detects the presence of one or more objects exterior of the vehicle and within the field of sensing of at least one of the at least one radar sensor.
The control of the sensing system receives radar data sensed by at least one radar sensor (such as radar data of multiple consecutive scans) and receives a vehicle motion estimation. The control, responsive to received vehicle motion estimation and received sensed radar data (which is time stamped so that it can be correlated with the vehicle motion), determines the type of surface on and along which the vehicle is traveling.
The present invention provides a means to segment and distinguish different kinds of surfaces seen by an automotive radar. Different surfaces present different scattering properties. Data acquired from consecutive scans can be used to coherently analyze the statistical properties of different range-angle cells corresponding to stationary objects. The cells sharing similar statistical properties can be clustered together. In this way, the range-angle imaging corresponding to stationary objects can be segmented. This technique is useful to distinguish the road and its path from the surroundings (such as a dirt shoulder along the side of the road) using automotive radar.
These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.
A vehicle sensing system, such as a driver assist system, object detection system, parking assist system and/or alert system, operates to capture sensing data exterior of the vehicle and may process the captured data to detect objects or other vehicles at or near the equipped vehicle and in the predicted path of the equipped vehicle, such as to assist a driver of the equipped vehicle in maneuvering the vehicle in a forward or rearward direction or to assist the driver in parking the vehicle in a parking space. The system includes a processor that is operable to receive sensing data from one or more sensors and to provide an output to a control that, responsive to the output, generates an alert or controls an accessory or system of the vehicle, or highlights or overlays an alert on a display screen (that may be displaying video images captured by a single rearward viewing camera or multiple cameras providing forward, side or 360 degree surround views of the area surrounding the vehicle during a reversing or low speed maneuver of the vehicle).
Referring now to the drawings and the illustrative embodiments depicted therein, a vehicle 10 includes a driver assistance system or sensing system 12 that includes at least one radar sensor unit, such as a forward facing radar sensor unit 14 (and the system may optionally include multiple exterior facing sensors, such as multiple exterior facing radar sensors or cameras or other sensors, such as a rearward facing sensor at the rear of the vehicle, and a sideward/rearward facing sensor at respective sides of the vehicle), which sense regions exterior of the vehicle. The sensing system 12 includes a control or electronic control unit (ECU) or processor that is operable to process data captured by the sensor or sensors and may detect objects or the like. The data transfer or signal communication from the sensor to the ECU may comprise any suitable data or communication link, such as a vehicle network bus or the like of the equipped vehicle.
Some automotive radars use MIMO (Multiple Input Multiple Output) techniques to create an effective virtual antenna aperture that is significantly larger than the real antenna aperture, and thus deliver much better angular resolution than conventional radars, such as conventional scanning radars.
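For illustration only, the following minimal Python sketch shows the virtual-aperture idea described above: with Ntx transmitters and Nrx receivers, the virtual array has Ntx×Nrx elements located at the pairwise sums of the physical element positions. The element spacings, wavelength and array layout are assumed example values, not taken from this disclosure.

```python
import numpy as np

# Assumed example values (not from this disclosure): ~77 GHz automotive radar,
# half-wavelength receive spacing, sparse transmit spacing.
wavelength = 0.0039
d = wavelength / 2
ntx, nrx = 3, 4
tx_pos = np.arange(ntx) * nrx * d   # physical Tx element positions
rx_pos = np.arange(nrx) * d         # physical Rx element positions

# MIMO virtual array: every Tx/Rx combination contributes one virtual element
# at the sum of the Tx and Rx positions.
virtual_pos = (tx_pos[:, None] + rx_pos[None, :]).ravel()

print(len(virtual_pos))             # 12 virtual elements from 3 + 4 physical ones
print(virtual_pos.ptp() / d)        # virtual aperture (in units of d) exceeds either physical array
```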
Algorithms for automotive radar that estimate road surfaces and free space are typically based on target detection lists. The algorithms create clusters by grouping targets that are close to each other, taking their accuracies into account. This approach has several limitations, such as the sparsity of the detections, which makes surfaces difficult to identify, and the lack of any proper scattering-mechanism analysis in the algorithms. The system of the present invention instead works with images related to non-moving objects, which are naturally better suited to identifying surfaces. In addition, the system of the present invention works with amplitudes and phases, and therefore provides a more refined and robust statistical analysis than the point-based processing of typical algorithms.
The system of the present invention segments different kinds of surfaces seen by an automotive radar by using the scattering properties of differing surfaces. The system receives as input an ego motion estimation (that estimates the motion of a sensor disposed at the subject or equipped vehicle), a complex (phase and amplitude) acquisition or image for at least two consecutive scans of the radar system, a time stamp for each acquisition, and a position of the sensor with respect to the vehicle. Each of the images is dedicated to non-moving objects, and can be either two dimensional (2D) or three dimensional (3D). The dimensions in the case of a 2D acquisition are Range and Angle, while the dimensions in the case of a volume (3D) are Range, Azimuth Angle and Elevation Angle. Optionally, the image may be provided in Cartesian coordinates.
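For illustration only, the inputs listed above may be gathered per scan in a container such as the following Python sketch. The class and field names (RadarScan, image, timestamp, ego_motion, sensor_xy) and the 2D range-angle layout are illustrative assumptions, not terminology from this disclosure.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class RadarScan:
    """One radar scan of the stationary-object image plus its metadata (assumed layout)."""
    image: np.ndarray        # complex array, shape (n_range, n_angle): phase and amplitude per cell
    timestamp: float         # acquisition time in seconds, used to correlate with vehicle motion
    ego_motion: np.ndarray   # assumed encoding, e.g. [vx, vy, yaw_rate] of the vehicle at this scan
    sensor_xy: np.ndarray    # sensor mounting position with respect to the vehicle, metres
```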
During operation, the images from different scans are coregistered. To coregister the images, one image is taken as a reference and the rest of the images are transformed such that every pixel/voxel of the image refers to the same physical space as the pixels/voxels of the reference image. This coregistration may vary along the image. The displacement of the vehicle (determined based on ego-motion and time stamp) is taken into account. A fine coregistration may be performed based on point-like targets or distributed targets or a mix of both, depending on the nature of the scene.
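For illustration only, a minimal coregistration sketch follows, under simplifying assumptions: a pure 2D translation (dx, dy) of the sensor between the reference and secondary scans (derived from ego motion and the two time stamps), uniformly spaced range and angle axes, and no rotation or fine refinement. The function name and parameters are hypothetical.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def coregister(secondary, ranges, angles, dx, dy):
    """Resample a secondary complex range-angle image onto the reference grid.

    secondary      : complex array, shape (n_range, n_angle)
    ranges, angles : uniformly spaced 1D axes of the reference grid
    dx, dy         : sensor displacement from reference to secondary scan (reference frame, metres)
    """
    r_grid, a_grid = np.meshgrid(ranges, angles, indexing="ij")
    # Ground point of each reference cell in the reference sensor frame.
    x = r_grid * np.cos(a_grid)
    y = r_grid * np.sin(a_grid)
    # Same point expressed in the secondary sensor frame (translation only).
    xs, ys = x - dx, y - dy
    r_sec = np.hypot(xs, ys)
    a_sec = np.arctan2(ys, xs)
    # Fractional pixel indices into the secondary image.
    ri = (r_sec - ranges[0]) / (ranges[1] - ranges[0])
    ai = (a_sec - angles[0]) / (angles[1] - angles[0])
    coords = np.stack([ri, ai])
    # Bilinear resampling of real and imaginary parts separately.
    real = map_coordinates(secondary.real, coords, order=1, cval=0.0)
    imag = map_coordinates(secondary.imag, coords, order=1, cval=0.0)
    return real + 1j * imag
```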
The set of coregistered images can be paired to generate interferograms. In other words, an interferogram is formed by multiplying one coregistered image by the complex conjugate of another image. The scans can be paired in different ways, such as, for example, pairing consecutive scans in order to minimize the decorrelation and the amount of data.
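For illustration only, interferogram formation for consecutive-scan pairing may be sketched as follows; the function name is hypothetical and the images are assumed to already be coregistered on a common grid.

```python
import numpy as np

def interferograms(coregistered_images):
    """Pair consecutive coregistered complex images: each interferogram is
    one image multiplied by the complex conjugate of the next one."""
    return [a * np.conj(b)
            for a, b in zip(coregistered_images[:-1], coregistered_images[1:])]
```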
The next step is to “flatten” the interferometric phase by taking into account a terrain model. When a terrain model is available, it is possible to generate its synthetic phase and subtract it from the previously obtained interferometric phase. In cases where a terrain model is not available, it is possible to generate the corresponding interferograms for a flat surface, taking into account the sensor position for each scan.
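For illustration only, the flat-surface case may be sketched as follows: the expected interferometric phase of a stationary flat surface is predicted from the two sensor positions and removed from each interferogram. The 4π/λ factor assumes a monostatic two-way path with signal phase proportional to exp(−j·4πr/λ); sign conventions vary, and the function and parameter names are hypothetical.

```python
import numpy as np

def flatten_flat_surface(ifg, ranges, angles, sensor_xy_1, sensor_xy_2, wavelength):
    """Remove the synthetic flat-surface phase from an interferogram ifg = s1 * conj(s2).

    sensor_xy_1, sensor_xy_2 : sensor positions at the two scans, in a common frame (metres)
    """
    r_grid, a_grid = np.meshgrid(ranges, angles, indexing="ij")
    # Ground point of each cell, assuming a flat surface, placed from the first sensor position.
    x = sensor_xy_1[0] + r_grid * np.cos(a_grid)
    y = sensor_xy_1[1] + r_grid * np.sin(a_grid)
    r1 = r_grid
    r2 = np.hypot(x - sensor_xy_2[0], y - sensor_xy_2[1])
    # Predicted interferometric phase of a flat stationary surface (assumed convention).
    synthetic_phase = (4.0 * np.pi / wavelength) * (r2 - r1)
    return ifg * np.exp(-1j * synthetic_phase)
```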
The system of the present invention can carry out a statistical analysis that takes into account amplitudes and phases for the different range-angle interferometric cells. The cells showing similar statistical properties will be clustered together. Different kinds of interferometric analysis can be performed, such as Region Growing, Amplitude-Phase driven filters, Non-Local filters, and/or the like. In this way, the system determines a surface segmentation based on a surface scattering mechanism. The system can thus determine and distinguish a road surface from a non-road surface, such as a dirt shoulder by the road, so that the system can assist in determining or maintaining a path of travel of the vehicle along the road.
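For illustration only, one of many possible statistical analyses is sketched below: local interferometric coherence and mean amplitude are estimated over a small window, and cells with similar statistics are grouped into connected regions. The window size, thresholds, function name and the choice of a simple threshold-plus-connected-components clustering are illustrative assumptions, not limitations of the invention.

```python
import numpy as np
from scipy.ndimage import uniform_filter, label

def segment(img1, img2, flattened_ifg, window=5, coh_thresh=0.6, amp_thresh=None):
    """Cluster range-angle cells that share similar amplitude/coherence statistics."""
    # Local (multilooked) powers and complex cross product over a small window.
    p1 = uniform_filter(np.abs(img1) ** 2, window)
    p2 = uniform_filter(np.abs(img2) ** 2, window)
    cross_re = uniform_filter(flattened_ifg.real, window)
    cross_im = uniform_filter(flattened_ifg.imag, window)
    # Interferometric coherence and mean amplitude per cell.
    coherence = np.hypot(cross_re, cross_im) / np.sqrt(p1 * p2 + 1e-12)
    amplitude = np.sqrt(0.5 * (p1 + p2))
    if amp_thresh is None:
        amp_thresh = np.median(amplitude)   # crude split between bright and dark cells
    # Example clustering: coherent, low-amplitude cells (as a smooth road surface
    # might appear) grouped into connected regions.
    mask = (coherence > coh_thresh) & (amplitude < amp_thresh)
    labels, n_regions = label(mask)
    return labels, coherence
```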
Thus, the sensing system of the present invention segments different kinds of surfaces sensed or seen by an automotive radar. The system is based on the fact that different surfaces present different scattering properties. Data acquired from consecutive scans can be used by the sensing system to coherently analyze the statistical properties of different range-angle cells corresponding to stationary objects. The cells sharing similar statistical properties may be clustered together. In this way, the range-angle imaging corresponding to stationary objects may be segmented. This technique is useful for distinguishing the road and its path from the surroundings using automotive radar.
The system may provide an output for a driving assist system of the vehicle, such as one or more of (i) automated parking, (ii) blind spot detection, (iii) cross traffic alert, (iv) lane change assist, (v) lane merge assist, (vi) automatic emergency braking, (vii) pedestrian detection, (viii) turn assist, (ix) terrain management, (x) collision mitigation and (xi) intersection collision mitigation. Optionally, the output may be provided to an autonomous vehicle control system.
For autonomous vehicles suitable for deployment with the system of the present invention, an occupant of the vehicle may, under particular circumstances, be desired or required to take over operation/control of the vehicle and drive the vehicle so as to avoid a potential hazard for as long as the autonomous system relinquishes such control or driving. Such an occupant of the vehicle thus becomes the driver of the autonomous vehicle. As used herein, the term “driver” refers to such an occupant, even when that occupant is not actually driving the vehicle, but is situated in the vehicle so as to be able to take over control and function as the driver of the vehicle when the vehicle control system hands over control to the occupant or driver or when the vehicle control system is not operating in an autonomous or semi-autonomous mode.
Typically an autonomous vehicle would be equipped with a suite of sensors, including multiple machine vision cameras deployed at the front, sides and rear of the vehicle, multiple radar sensors deployed at the front, sides and rear of the vehicle, and/or multiple lidar sensors deployed at the front, sides and rear of the vehicle. Typically, such an autonomous vehicle will also have wireless two way communication with other vehicles or infrastructure, such as via a car2car (V2V) or car2x communication system. The forward viewing camera and/or the sensor of the lane determining system may comprise one of the cameras and/or one of the sensors of the autonomous vehicle control system.
The sensing system may include a machine vision system (comprising at least one exterior viewing camera disposed at the vehicle and an image processor for processing image data captured by the at least one camera), where information is shared between the stereo radar and the machine vision system.
The system may include two or more individual radars, having individual or multiple Tx (transmitters) and Rx (receivers) on an antenna array, and may utilize aspects of the systems described in U.S. Pat. Nos. 9,753,121; 9,689,967; 9,599,702; 9,575,160; 9,146,898; 9,036,026; 8,027,029; 8,013,780; 6,825,455; 7,053,357; 7,408,627; 7,405,812; 7,379,163; 7,379,100; 7,375,803; 7,352,454; 7,340,077; 7,321,111; 7,310,431; 7,283,213; 7,212,663; 7,203,356; 7,176,438; 7,157,685; 6,919,549; 6,906,793; 6,876,775; 6,710,770; 6,690,354; 6,678,039; 6,674,895 and/or 6,587,186, and/or International Publication Nos. WO 2018/007995 and/or WO 2011/090484, and/or U.S. Publication Nos. US-2018-0231635; US-2018-0045812; US-2018-0015875; US-2017-0356994; US-2017-0315231; US-2017-0276788; US-2017-0254873; US-2017-0222311 and/or US-2010-0245066, which are hereby incorporated herein by reference in their entireties.
Changes and modifications in the specifically described embodiments can be carried out without departing from the principles of the invention, which is intended to be limited only by the scope of the appended claims, as interpreted according to the principles of patent law including the doctrine of equivalents.
The present application claims the filing benefits of U.S. provisional application Ser. No. 62/555,223, filed Sep. 7, 2017, which is hereby incorporated herein by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
4943796 | Lee | Jul 1990 | A |
5550677 | Schofield et al. | Aug 1996 | A |
5585798 | Yoshioka et al. | Dec 1996 | A |
5670935 | Schofield et al. | Sep 1997 | A |
5715093 | Schierbeek et al. | Feb 1998 | A |
5796094 | Schofield et al. | Aug 1998 | A |
5877897 | Schofield et al. | Mar 1999 | A |
5949331 | Schofield et al. | Sep 1999 | A |
6057754 | Kinoshita et al. | May 2000 | A |
6067110 | Nonaka et al. | May 2000 | A |
6085151 | Farmer et al. | Jul 2000 | A |
6097023 | Schofield et al. | Aug 2000 | A |
6118401 | Tognazzini | Sep 2000 | A |
6118410 | Nagy | Sep 2000 | A |
6201642 | Bos | Mar 2001 | B1 |
6216540 | Nelson et al. | Apr 2001 | B1 |
6313454 | Bos et al. | Nov 2001 | B1 |
6353392 | Schofield et al. | Mar 2002 | B1 |
6396397 | Bos et al. | May 2002 | B1 |
6492935 | Higuchi | Dec 2002 | B1 |
6498620 | Schofield et al. | Dec 2002 | B2 |
6580385 | Winner et al. | Jun 2003 | B1 |
6587186 | Bamji et al. | Jul 2003 | B2 |
6674895 | Rafii et al. | Jan 2004 | B2 |
6678039 | Charbon | Jan 2004 | B2 |
6690268 | Schofield et al. | Feb 2004 | B2 |
6690354 | Sze | Feb 2004 | B2 |
6710770 | Tomasi et al. | Mar 2004 | B2 |
6717610 | Bos et al. | Apr 2004 | B1 |
6757109 | Bos | Jun 2004 | B2 |
6771208 | Lutter et al. | Aug 2004 | B2 |
6795014 | Cheong | Sep 2004 | B2 |
6825455 | Schwarte | Nov 2004 | B1 |
6831591 | Horibe | Dec 2004 | B2 |
6876775 | Torunoglu | Apr 2005 | B2 |
6903677 | Takashima et al. | Jun 2005 | B2 |
6906793 | Bamji et al. | Jun 2005 | B2 |
6919549 | Bamji et al. | Jul 2005 | B2 |
6941211 | Kuroda et al. | Sep 2005 | B1 |
6946978 | Schofield | Sep 2005 | B2 |
7004606 | Schofield | Feb 2006 | B2 |
7005974 | McMahon et al. | Feb 2006 | B2 |
7012560 | Braeuchle et al. | Mar 2006 | B2 |
7038577 | Pawlicki et al. | May 2006 | B2 |
7042389 | Shirai | May 2006 | B2 |
7053357 | Schwarte | May 2006 | B2 |
7123168 | Schofield | Oct 2006 | B2 |
7157685 | Bamji et al. | Jan 2007 | B2 |
7176438 | Bamji et al. | Feb 2007 | B2 |
7176830 | Horibe | Feb 2007 | B2 |
7203356 | Gokturk et al. | Apr 2007 | B2 |
7212663 | Tomasi | May 2007 | B2 |
7283213 | O'Connor et al. | Oct 2007 | B2 |
7310431 | Gokturk et al. | Dec 2007 | B2 |
7321111 | Bamji et al. | Jan 2008 | B2 |
7340077 | Gokturk et al. | Mar 2008 | B2 |
7352454 | Bamji et al. | Apr 2008 | B2 |
7375803 | Bamji | May 2008 | B1 |
7379100 | Gokturk et al. | May 2008 | B2 |
7379163 | Rafii et al. | May 2008 | B2 |
7405812 | Bamji | Jul 2008 | B1 |
7408627 | Bamji et al. | Aug 2008 | B2 |
7432848 | Munakata | Oct 2008 | B2 |
7526103 | Schofield et al. | Apr 2009 | B2 |
7613568 | Kawasaki | Nov 2009 | B2 |
7706978 | Schiffmann et al. | Apr 2010 | B2 |
7765065 | Stiller | Jul 2010 | B2 |
8013780 | Lynam | Sep 2011 | B2 |
8027029 | Lu et al. | Sep 2011 | B2 |
8698894 | Briggance | Apr 2014 | B2 |
9036026 | Dellantoni et al. | May 2015 | B2 |
9146898 | Ihlenburg et al. | Sep 2015 | B2 |
9575160 | Davis et al. | Feb 2017 | B1 |
9599702 | Bordes et al. | Mar 2017 | B1 |
9689967 | Stark et al. | Jun 2017 | B1 |
9753121 | Davis et al. | Sep 2017 | B1 |
20030138132 | Stam et al. | Jul 2003 | A1 |
20030201929 | Lutter et al. | Oct 2003 | A1 |
20050104089 | Engelmann et al. | May 2005 | A1 |
20060091654 | De Mersseman et al. | May 2006 | A1 |
20100001897 | Lyman | Jan 2010 | A1 |
20100245066 | Sarioglu et al. | Sep 2010 | A1 |
20110037640 | Schmidlin | Feb 2011 | A1 |
20130215271 | Lu | Aug 2013 | A1 |
20170003134 | Kim | Jan 2017 | A1 |
20170222311 | Hess et al. | Aug 2017 | A1 |
20170254873 | Koravadi | Sep 2017 | A1 |
20170276788 | Wodrich | Sep 2017 | A1 |
20170315231 | Wodrich | Nov 2017 | A1 |
20170356994 | Wodrich et al. | Dec 2017 | A1 |
20180015875 | May et al. | Jan 2018 | A1 |
20180045812 | Hess | Feb 2018 | A1 |
20180059236 | Wodrich et al. | Mar 2018 | A1 |
20180065623 | Wodrich et al. | Mar 2018 | A1 |
20180067194 | Wodrich et al. | Mar 2018 | A1 |
20180210074 | Hoare | Jul 2018 | A1 |
20180217231 | Stanley | Aug 2018 | A1 |
20180231635 | Woehlte | Aug 2018 | A1 |
20180321142 | Seifert | Nov 2018 | A1 |
20190072666 | Duque Biarge et al. | Mar 2019 | A1 |
20190072667 | Duque Biarge et al. | Mar 2019 | A1 |
20190072668 | Duque Biarge et al. | Mar 2019 | A1 |
20190193735 | Cherniakov | Jun 2019 | A1 |
20200017083 | Casselgren | Jan 2020 | A1 |
Number | Date | Country |
---|---|---|
1506893 | Feb 2005 | EP |
2011090484 | Jul 2011 | WO |
2018007995 | Jan 2018 | WO |
Entry |
---|
Rapp et al. “Probabilistic ego-motion estimation using multiple automotive radar sensors.” Robotics and Autonomous Systems 89, 136-146, 2017. |
Das et al., “Scan registration with multi-scale k-means normal distributions transform.” Intelligent Robots and Systems (IROS), 2012 IEEE/RSJ International Conference on. IEEE, 2012. |
Lundquist et al., “Estimation of the free space in front of a moving vehicle.” 2009. |
Schreier et al., “Robust free space detection in occupancy grid maps by methods of image analysis and dynamic B-spline contour tracking.” Intelligent Transportation Systems (ITSC), 2012 15th International IEEE Conference on. IEEE, 2012. |
Number | Date | Country | |
---|---|---|
20190072669 A1 | Mar 2019 | US |
Number | Date | Country | |
---|---|---|
62555223 | Sep 2017 | US |