The present invention relates generally to a vehicle sensing system and, more particularly, to a vehicle sensing system that utilizes one or more sensors at a vehicle to provide a field of sensing around the vehicle.
It is known to provide sensors, such as ultrasonic sensors, at a rear bumper of a vehicle for sensing objects at the ground behind the vehicle.
A vehicular sensing system utilizes one or more sensors (e.g., ultrasonic sensors) to capture sensor data exterior of a vehicle. The system includes a first set of sensors disposed at a first rear portion of a vehicle equipped with the vehicular sensing system. The first set of sensors includes a plurality of first sensors and each first sensor of the first set of sensors has a respective first field of sensing that extends exterior and at least rearward of the vehicle. The system includes a second set of sensors disposed at a second rear portion of the vehicle that is above the first rear portion of the vehicle. The second set of sensors includes a plurality of second sensors and each second sensor of the second set of sensors has a respective second field of sensing that extends exterior and at least rearward of the vehicle. The respective first field of sensing of at least one first sensor of the plurality of first sensors at least partially overlaps the respective second field of sensing of at least one second sensor of the plurality of second sensors. The system includes an electronic control unit (ECU) with electronic circuitry and associated software. The electronic circuitry of the ECU includes a data processor for (i) processing sensor data captured by first sensors of the first set of sensors and (ii) processing sensor data captured by second sensors of the second set of sensors to detect presence of objects exterior and at least rearward of the vehicle. The vehicular sensing system, responsive at least in part to processing at the ECU of sensor data captured by the first sensors of the first set of sensors and by the second sensors of the second set of sensors, detects objects that are located within the at least partially overlapping fields of sensing of the at least one first sensor and the at least one second sensor. The vehicular sensing system, responsive to detecting the objects that are located within the at least partially overlapping fields of sensing, determines three-dimensional locations of the detected objects relative to the vehicle.
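By way of a rough illustration only (not part of the claimed subject matter), the arrangement described above can be modeled in software as two rows of sensors, a first (lower) set and a second (upper) set, each sensor with an assumed conical field of sensing, with overlap between a lower field and an upper field checked by sampling candidate points rearward of the vehicle. All positions, half-angles, and ranges in the Python sketch below are assumed values.

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class Sensor:
    """Ultrasonic sensor with an assumed conical field of sensing."""
    position: np.ndarray    # (x, y, z) in meters, vehicle frame (x rearward, z up) -- assumed frame
    boresight: np.ndarray   # unit vector along the center of the field of sensing
    half_angle_deg: float   # assumed half-angle of the sensing cone
    max_range_m: float      # assumed maximum detection range

    def senses_point(self, point: np.ndarray) -> bool:
        """True if the point lies inside this sensor's field of sensing."""
        v = point - self.position
        dist = np.linalg.norm(v)
        if dist == 0 or dist > self.max_range_m:
            return False
        cos_angle = float(np.dot(v / dist, self.boresight))
        return cos_angle >= np.cos(np.radians(self.half_angle_deg))

def fields_overlap(a: Sensor, b: Sensor, grid: np.ndarray) -> bool:
    """Approximate overlap test: does any sampled point fall in both fields of sensing?"""
    return any(a.senses_point(p) and b.senses_point(p) for p in grid)

# Assumed geometry: first (lower) set along the bumper, second (upper) set along the roofline.
rearward = np.array([1.0, 0.0, 0.0])
first_set = [Sensor(np.array([0.0, y, 0.5]), rearward, 45.0, 5.0) for y in (-0.6, -0.2, 0.2, 0.6)]
second_set = [Sensor(np.array([-0.1, y, 1.5]), rearward, 45.0, 5.0) for y in (-0.6, -0.2, 0.2, 0.6)]

# Coarse grid of candidate points rearward of the vehicle.
grid = np.array([[x, y, z] for x in np.arange(0.5, 5.0, 0.5)
                 for y in np.arange(-1.5, 1.6, 0.5)
                 for z in np.arange(0.0, 2.1, 0.5)])

print(fields_overlap(first_set[0], second_set[0], grid))  # True for these assumed values
```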
These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.
A vehicle sensing system operates to capture sensing data exterior of the vehicle and may process the captured data to detect objects at or near the vehicle (e.g., to the rear of the vehicle), such as to assist a driver of the vehicle in maneuvering the vehicle or to assist the driver in parking the vehicle in a parking space. The system includes a processor that is operable to receive sensing data from multiple sensors and to provide an output to a control that, responsive to the output, generates an alert or controls an accessory or system of the vehicle, or highlights or overlays an alert on a display screen (which may display video images captured by a single rearward-viewing camera or by multiple cameras providing forward, side, or 360 degree surround views of the area surrounding the vehicle during a reversing or low speed maneuver of the vehicle).
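As a minimal sketch of such an output path (assumed Python, with hypothetical names such as `detections_in_rear_zone` and `rear_object_output`, and an assumed rectangular alert zone, none of which are specified by the system described herein), detected object locations might be filtered against a zone behind the vehicle, with the result driving an alert flag and overlay points for a display screen.

```python
from typing import List, Tuple

# A detection is an assumed (x, y, z) object location in meters, vehicle frame (x rearward).
Detection = Tuple[float, float, float]

def detections_in_rear_zone(detections: List[Detection],
                            max_rearward_m: float = 2.0,
                            half_width_m: float = 1.2) -> List[Detection]:
    """Keep detections inside an assumed rectangular zone directly behind the vehicle."""
    return [d for d in detections
            if 0.0 < d[0] <= max_rearward_m and abs(d[1]) <= half_width_m]

def rear_object_output(detections: List[Detection]) -> dict:
    """Hypothetical output record for a control: alert flag plus overlay locations."""
    near = detections_in_rear_zone(detections)
    return {"alert": bool(near), "overlay_points": near}

# Example: one object 1.5 m behind and slightly left of the vehicle, 0.4 m above ground.
print(rear_object_output([(1.5, -0.3, 0.4), (4.0, 2.0, 0.0)]))
# -> {'alert': True, 'overlay_points': [(1.5, -0.3, 0.4)]}
```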
Referring now to the drawings and the illustrative embodiments depicted therein, a vehicle 10 includes a sensing system 12 that includes at least one exterior sensing sensor, such as an ultrasonic sensor 14a (and the system may optionally include multiple exterior sensing sensors, such as a forward sensing sensor 14b at the front of the vehicle), which senses objects exterior of the vehicle.
Each sensor of the sensing system 12 may have a field of sensing that intersects or at least partially overlaps with the field of sensing of one or more other sensors such that reflections of sensing energy transmitted by a single sensor are received by multiple other sensors. Due to the placement of the sensors (e.g., sensors placed along both horizontal and vertical dimensions instead of just a horizontal dimension), the system processes these multiple reflections to localize objects behind the vehicle in three dimensions (3D) (i.e., localize an object relative to an X, Y, and Z axis).
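One way such multiple reflections could be combined into a 3D location (a sketch only, not a prescribed implementation of the system) is bistatic multilateration: each transmit/receive pair constrains the total echo path length from the transmitting sensor to the object and back to a receiving sensor, and a least-squares fit over several pairs recovers the object's X, Y, and Z coordinates. The Python sketch below assumes a nominal speed of sound and example sensor positions, and uses SciPy's general least-squares solver.

```python
import numpy as np
from scipy.optimize import least_squares

SPEED_OF_SOUND = 343.0  # m/s, assumed nominal value

def locate_object(tx_pos, rx_positions, echo_times, initial_guess=(2.0, 0.0, 1.0)):
    """Estimate a 3D object position from bistatic echo times.

    tx_pos       : (3,) position of the transmitting sensor
    rx_positions : (N, 3) positions of the receiving sensors
    echo_times   : (N,) measured times from transmission to reception, seconds
    """
    tx = np.asarray(tx_pos, dtype=float)
    rx = np.asarray(rx_positions, dtype=float)
    path_lengths = SPEED_OF_SOUND * np.asarray(echo_times, dtype=float)

    def residuals(p):
        # Predicted bistatic path: transmitter -> object -> each receiver.
        predicted = np.linalg.norm(p - tx) + np.linalg.norm(rx - p, axis=1)
        return predicted - path_lengths

    result = least_squares(residuals, np.asarray(initial_guess, dtype=float))
    return result.x

# Assumed geometry: transmitter on the bumper row, receivers spread over bumper and roofline rows.
tx = np.array([0.0, 0.0, 0.5])
rx = np.array([[0.0, -0.6, 0.5], [0.0, 0.6, 0.5],      # bumper (lower) sensors
               [-0.1, -0.6, 1.5], [-0.1, 0.6, 1.5]])   # roofline (upper) sensors
true_object = np.array([1.8, 0.3, 0.6])

# Synthesize echo times from the assumed geometry, then recover the position.
times = (np.linalg.norm(true_object - tx) + np.linalg.norm(rx - true_object, axis=1)) / SPEED_OF_SOUND
print(locate_object(tx, rx, times))  # approximately [1.8, 0.3, 0.6]
```

Because the receivers in this sketch span both the lower and upper rows, the measurements constrain the object's height as well as its lateral and rearward position.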
Thus, the sensing system described herein includes a plurality of sensors (e.g., ultrasonic sensors) disposed at a rear of a vehicle. A first set of sensors includes at least one sensor disposed at or near a bumper of the vehicle (e.g., four sensors disposed linearly along the bumper). The first set of sensors may be approximately linearly arranged along a horizontal line parallel to the ground. That is, each sensor of the first set of sensors may be at approximately the same height above the ground. The system includes a second set of sensors (including at least one sensor) disposed at or near a rear roofline of the vehicle (e.g., four sensors disposed linearly along the roofline). The second set of sensors may be approximately linearly arranged along another horizontal line parallel to the ground. That is, each sensor of the second set of sensors may be at approximately the same height above the ground. Optionally, additional sensors may be disposed along the rear sides of the vehicle (e.g., above the sensors disposed at the bumper and below the sensors disposed at the roofline). Each sensor has a field of sensing that overlaps with the field of sensing of at least one other sensor. One or more sensors may have a field of sensing that at least partially overlaps the fields of sensing of two other sensors. Based on the overlapping fields of sensing (e.g., using triangulation), the system detects the presence of objects rearward of the vehicle in three dimensions (i.e., relative to an X axis, a Y axis, and a Z axis of the vehicle). Each sensor, depending on its respective location at the vehicle, may have a respective field of sensing with a vertical component that is larger than a horizontal component or vice versa.
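As a worked example of the triangulation mentioned above (a sketch under assumed mounting heights, with the lower and upper sensors idealized as lying on the same vertical line), two sensors at different heights that each measure a range to the same object constrain it to two circles in the vertical plane; intersecting those circles yields both the rearward distance and the height (Z component) of the object.

```python
import math

def rear_object_height(range_lower_m: float, range_upper_m: float,
                       z_lower_m: float = 0.5, z_upper_m: float = 1.5):
    """Intersect two range circles in the vertical (rearward-distance, height) plane.

    Assumes both sensors sit on the same vertical line at the rear of the vehicle
    (an idealization) and that the two assumed mounting heights z_lower_m and
    z_upper_m differ. Returns (rearward_distance, object_height), or None if the
    measured ranges are inconsistent.
    """
    # Eliminate the rearward distance between the two circle equations to solve for height.
    z = (range_lower_m**2 - range_upper_m**2 + z_upper_m**2 - z_lower_m**2) / (2.0 * (z_upper_m - z_lower_m))
    x_squared = range_lower_m**2 - (z - z_lower_m)**2
    if x_squared < 0.0:
        return None
    return math.sqrt(x_squared), z  # positive root: object is rearward of the vehicle

# Example: an object 2.0 m behind the vehicle at 0.8 m height.
r_lower = math.hypot(2.0, 0.8 - 0.5)   # range measured by the bumper-row sensor
r_upper = math.hypot(2.0, 0.8 - 1.5)   # range measured by the roofline-row sensor
print(rear_object_height(r_lower, r_upper))  # approximately (2.0, 0.8)
```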
The sensing system may utilize aspects of the systems described in U.S. Pat. Nos. 10,866,306; 9,954,955; 9,869,762; 9,753,121; 9,689,967; 9,599,702; 9,575,160; 9,146,898; 9,036,026; 8,027,029; 8,013,780; 7,408,627; 7,405,812; 7,379,163; 7,379,100; 7,375,803; 7,352,454; 7,340,077; 7,321,111; 7,310,431; 7,283,213; 7,212,663; 7,203,356; 7,176,438; 7,157,685; 7,053,357; 6,919,549; 6,906,793; 6,876,775; 6,710,770; 6,690,354; 6,678,039; 6,674,895 and/or 6,587,186, and/or U.S. Publication Nos. US-2019-0339382; US-2018-0231635; US-2018-0045812; US-2018-0015875; US-2017-0356994; US-2017-0315231; US-2017-0276788; US-2017-0254873; US-2017-0222311 and/or US-2010-0245066, which are hereby incorporated herein by reference in their entireties.
Changes and modifications in the specifically described embodiments can be carried out without departing from the principles of the invention, which is intended to be limited only by the scope of the appended claims, as interpreted according to the principles of patent law including the doctrine of equivalents.
The present application claims the filing benefits of U.S. provisional application Ser. No. 63/261,111, filed Sep. 13, 2021, which is hereby incorporated herein by reference in its entirety.
U.S. Patent Documents

Number | Name | Date | Kind |
---|---|---|---|
5767793 | Agravante et al. | Jun 1998 | A |
5949331 | Schofield et al. | Sep 1999 | A |
6587186 | Bamji et al. | Jul 2003 | B2 |
6674895 | Rafii et al. | Jan 2004 | B2 |
6678039 | Charbon | Jan 2004 | B2 |
6690268 | Schofield et al. | Feb 2004 | B2 |
6690354 | Sze | Feb 2004 | B2 |
6693517 | McCarthy et al. | Feb 2004 | B2 |
6710770 | Tomasi et al. | Mar 2004 | B2 |
6825455 | Schwarte | Nov 2004 | B1 |
6876775 | Torunoglu | Apr 2005 | B2 |
6906793 | Bamji et al. | Jun 2005 | B2 |
6919549 | Bamji et al. | Jul 2005 | B2 |
7053357 | Schwarte | May 2006 | B2 |
7157685 | Bamji et al. | Jan 2007 | B2 |
7176438 | Bamji et al. | Feb 2007 | B2 |
7203356 | Gokturk et al. | Apr 2007 | B2 |
7212663 | Tomasi | May 2007 | B2 |
7283213 | O'Connor et al. | Oct 2007 | B2 |
7310431 | Gokturk et al. | Dec 2007 | B2 |
7321111 | Bamji et al. | Jan 2008 | B2 |
7340077 | Gokturk et al. | Mar 2008 | B2 |
7352454 | Bamji et al. | Apr 2008 | B2 |
7375803 | Bamji | May 2008 | B1 |
7379100 | Gokturk et al. | May 2008 | B2 |
7379163 | Rafii et al. | May 2008 | B2 |
7405812 | Bamji | Jul 2008 | B1 |
7408627 | Bamji et al. | Aug 2008 | B2 |
7580795 | McCarthy et al. | Aug 2009 | B2 |
8013780 | Lynam | Sep 2011 | B2 |
8027029 | Lu et al. | Sep 2011 | B2 |
8665079 | Pawlicki et al. | Mar 2014 | B2 |
8698894 | Briggance | Apr 2014 | B2 |
9036026 | Dellantoni et al. | May 2015 | B2 |
9146898 | Ihlenburg et al. | Sep 2015 | B2 |
9193321 | Dingman | Nov 2015 | B2 |
9524597 | Ricci | Dec 2016 | B2 |
9575160 | Davis et al. | Feb 2017 | B1 |
9586138 | Wei et al. | Mar 2017 | B2 |
9599702 | Bordes et al. | Mar 2017 | B1 |
9689967 | Stark et al. | Jun 2017 | B1 |
9753121 | Davis et al. | Sep 2017 | B1 |
9869762 | Alland et al. | Jan 2018 | B1 |
9954955 | Davis et al. | Apr 2018 | B2 |
9977593 | Ricci | May 2018 | B2 |
10004458 | Toth et al. | Jun 2018 | B2 |
10768298 | Wodrich et al. | Sep 2020 | B2 |
10866306 | Maher et al. | Dec 2020 | B2 |
11275175 | Wodrich et al. | Mar 2022 | B2 |
11454719 | Hess et al. | Sep 2022 | B2 |
11520027 | Suchy | Dec 2022 | B2 |
20030034883 | Sato et al. | Feb 2003 | A1 |
20040239509 | Kisacanin | Dec 2004 | A1 |
20060139181 | Danz et al. | Jun 2006 | A1 |
20060206243 | Pawlicki et al. | Sep 2006 | A1 |
20080211708 | Haberland et al. | Sep 2008 | A1 |
20090147083 | Pawlicki et al. | Jun 2009 | A1 |
20090242310 | Touge | Oct 2009 | A1 |
20100001897 | Lyman | Jan 2010 | A1 |
20100002081 | Pawlicki et al. | Jan 2010 | A1 |
20100245066 | Sarioglu et al. | Sep 2010 | A1 |
20110103650 | Cheng et al. | May 2011 | A1 |
20120062743 | Lynam et al. | Mar 2012 | A1 |
20120218412 | Dellantoni et al. | Aug 2012 | A1 |
20130063600 | Pawlicki et al. | Mar 2013 | A1 |
20130093613 | Itoh et al. | Apr 2013 | A1 |
20130215271 | Lu | Aug 2013 | A1 |
20130222592 | Gieseke | Aug 2013 | A1 |
20140218529 | Mahmoud et al. | Aug 2014 | A1 |
20140219506 | Foltin | Aug 2014 | A1 |
20140375476 | Johnson et al. | Dec 2014 | A1 |
20150124096 | Koravadi | May 2015 | A1 |
20150138011 | Hiramaki et al. | May 2015 | A1 |
20150158499 | Koravadi | Jun 2015 | A1 |
20150185319 | Matsuura et al. | Jul 2015 | A1 |
20150251599 | Koravadi | Sep 2015 | A1 |
20150352953 | Koravadi | Dec 2015 | A1 |
20160036917 | Koravadi et al. | Feb 2016 | A1 |
20160098612 | Viviani | Apr 2016 | A1 |
20160200240 | Quinlan et al. | Jul 2016 | A1 |
20160210853 | Koravadi | Jul 2016 | A1 |
20170129489 | Pawlicki et al. | May 2017 | A1 |
20170205506 | Voorheis et al. | Jul 2017 | A1 |
20170212231 | Iwai et al. | Jul 2017 | A1 |
20170222311 | Hess et al. | Aug 2017 | A1 |
20170254873 | Koravadi | Sep 2017 | A1 |
20170276788 | Wodrich | Sep 2017 | A1 |
20170285161 | Izzat et al. | Oct 2017 | A1 |
20170315231 | Wodrich | Nov 2017 | A1 |
20170356994 | Wodrich et al. | Dec 2017 | A1 |
20180015875 | May et al. | Jan 2018 | A1 |
20180045812 | Hess | Feb 2018 | A1 |
20180059236 | Wodrich et al. | Mar 2018 | A1 |
20180065623 | Wodrich et al. | Mar 2018 | A1 |
20180067194 | Wodrich et al. | Mar 2018 | A1 |
20180074191 | Bilik et al. | Mar 2018 | A1 |
20180105176 | Pawlicki et al. | Apr 2018 | A1 |
20180231635 | Woehlte | Aug 2018 | A1 |
20180231657 | Woehlte | Aug 2018 | A1 |
20180299533 | Pliefke et al. | Oct 2018 | A1 |
20190061760 | Pawlicki et al. | Feb 2019 | A1 |
20190072666 | Duque Biarge et al. | Mar 2019 | A1 |
20190072667 | Duque Biarge et al. | Mar 2019 | A1 |
20190072668 | Duque Biarge et al. | Mar 2019 | A1 |
20190072669 | Duque Biarge et al. | Mar 2019 | A1 |
20190120951 | Fischer | Apr 2019 | A1 |
20190154823 | Insana | May 2019 | A1 |
20190217775 | May et al. | Jul 2019 | A1 |
20190339382 | Hess et al. | Nov 2019 | A1 |
20200200898 | Hustava | Jun 2020 | A1 |
20210405156 | Barber | Dec 2021 | A1 |
20220227366 | Shah | Jul 2022 | A1 |
Foreign Patent Documents

Number | Date | Country |
---|---|---|
2011090484 | Jul 2011 | WO |
Prior Publication Data

Number | Date | Country |
---|---|---|
20230080530 A1 | Mar 2023 | US |
Related U.S. Application Data

Number | Date | Country |
---|---|---|
63261111 | Sep 2021 | US |