The present invention relates generally to a vehicle vision system for a vehicle and, more particularly, to a vehicle vision system that utilizes one or more cameras at a vehicle.
Use of imaging sensors in vehicle imaging systems is common and known. Examples of such known systems are described in U.S. Pat. Nos. 5,949,331; 5,670,935 and/or 5,550,677, which are hereby incorporated herein by reference in their entireties.
Trailer assist systems are known that may determine an angle of a trailer hitched at a vehicle. Examples of such known systems are described in U.S. Pat. Nos. 9,085,261 and/or 6,690,268, which are hereby incorporated herein by reference in their entireties.
The present invention provides a driver assistance system or vision system or imaging system for a vehicle that utilizes a camera disposed at a rear portion of a vehicle and having a field of view exterior of the vehicle, the field of view encompassing at least a portion of a trailer hitched to the vehicle. The system also includes a control comprising an image processor operable to process image data captured by the camera, with the image data captured by the camera representative of the trailer hitched to the vehicle. The control is operable to determine whether the trailer has been previously hitched to the vehicle. Responsive to the control determining that the trailer has not been previously hitched to the vehicle, the control operates in a trailer initial calibration mode. Responsive to the control recognizing the trailer and determining that the trailer has been previously hitched to the vehicle, the control operates in a recognized trailer calibration mode. While operating in the trailer initial calibration mode or in the recognized trailer calibration mode, the control obtains calibration data unique to the hitched trailer. The control, responsive to obtaining the calibration data, scans for the trailer using the calibration data to locate a current position of the trailer and extracts the trailer angle based on the current position of the trailer.
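For illustration, a minimal control-flow sketch of this mode selection follows. All names (the trailer signature, the profile store, the function itself) are hypothetical; the disclosure does not specify an implementation.

```python
def select_calibration_mode(trailer_signature, stored_profiles):
    """Branch on whether the hitched trailer matches previously stored data."""
    if trailer_signature in stored_profiles:
        # Previously hitched (recognized) trailer: refine its saved
        # calibration while the trailer is towed.
        return "recognized_trailer_calibration", stored_profiles[trailer_signature]
    # First-time trailer: run the initial calibration to build a
    # calibration profile unique to this trailer.
    return "trailer_initial_calibration", None

# Example: an empty profile store forces the trailer initial calibration mode.
mode, profile = select_calibration_mode("trailer_A", {})
print(mode)  # trailer_initial_calibration
```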
These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.
A vehicle and trailer maneuvering system or maneuver assist system and/or driving assist system operates to capture images exterior of the vehicle and of a trailer being towed by the vehicle and may process the captured image data to determine a path of travel for the vehicle and trailer and to detect objects at or near the vehicle and in the predicted path of the vehicle, such as to assist a driver of the vehicle in maneuvering the vehicle and trailer in a rearward direction. The vision system includes an image processor or image processing system that is operable to receive image data from one or more cameras and that may provide an output to a display device for displaying images representative of the captured image data. Optionally, the vision system may provide a display, such as a rearview display or a top down or bird's eye or surround view display or the like.
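Where the system predicts the path of the vehicle and trailer during a reversing maneuver, the articulation (hitch) angle dynamics can be modeled. The sketch below uses a standard textbook kinematic model, assuming the hitch sits at the center of the vehicle's rear axle; the model and all parameter values are illustrative assumptions, not taken from this disclosure.

```python
import math

def step_trailer_angle(theta, v, delta, dt, wheelbase=3.0, hitch_to_axle=4.0):
    """One Euler integration step of a standard kinematic model for the
    vehicle-trailer articulation angle theta (radians).
      v     : vehicle speed in m/s (negative when reversing)
      delta : front steering angle in radians
    """
    theta_dot = -(v / hitch_to_axle) * math.sin(theta) \
                - (v / wheelbase) * math.tan(delta)
    return theta + theta_dot * dt

# Reversing at 1 m/s with a small fixed steering input: the articulation
# angle diverges, which illustrates why backing up a trailer benefits from
# closed-loop feedback on the measured trailer angle.
theta = 0.0
for _ in range(100):  # 10 seconds at dt = 0.1 s
    theta = step_trailer_angle(theta, v=-1.0, delta=0.05, dt=0.1)
print(round(math.degrees(theta), 1))
```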
Referring now to the drawings and the illustrative embodiments depicted therein, a vehicle 10 includes a trailer maneuver assist system 12 that is operable to assist in backing up or reversing the vehicle with a hitched trailer that is hitched at the rear of the vehicle via a hitch 14, and the system may maneuver the vehicle 10 and trailer 16 toward a desired or selected location. The trailer maneuver assist system 12 includes at least one exterior viewing vehicle-based imaging sensor or camera, such as a rearward viewing imaging sensor or camera 18 (and the system may optionally include multiple exterior viewing imaging sensors or cameras, such as a sideward/rearward viewing camera at respective sides of the vehicle), which captures image data representative of the scene exterior and rearward of the vehicle 10, with the field of view of the camera encompassing the hitch 14 and/or trailer 16, and with the camera 18 having a lens for focusing images at or onto an imaging array or imaging plane or imager of the camera.
Implementations herein include a novel system and/or method for detecting the trailer angle using one or more monocular cameras. The system uses the view from a rear camera of a vehicle (e.g., a truck) as the input for determining the trailer angle. To detect the trailer angle, the system performs trailer angle extraction, trailer calibration, and hitch point extraction, calculates the beam length, and determines the trailer dimensions.
The state machine 240 can, in some examples, enter two different calibration modes. A forced calibration mode (i.e., the trailer initial calibration mode) is a trailer calibration mode that the user selects and/or enables after connecting a new trailer to the vehicle. Optionally, the system detects that the trailer has not previously been hitched to the vehicle (i.e., by determining that the trailer does not match any currently stored trailer information). In this forced calibration mode, the user is expected to drive the vehicle in a straight path for a certain period of time and/or a set distance (e.g., several meters) at a constant wheel angle. The system removes all background noise and stores the calibration data in memory. Whenever the same trailer is used (as either detected by the system or selected by the user), the system may use this data to identify the trailer and locate its position. An implicit calibration mode (i.e., the recognized trailer calibration mode) is a mode that occurs without the knowledge of the driver/user. When the user hitches a previously calibrated trailer to the vehicle, the system enters the implicit calibration mode. This mode enables the system to internally recalibrate the trailer while the trailer is in motion (i.e., while towed by the vehicle). Thus, the system uses the forced calibration mode for trailers hitched to the vehicle for the first time and the implicit calibration mode to calibrate trailers that have already been calibrated via the forced calibration mode.
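The background-noise removal during this straight-driving pass can be sketched as follows, assuming one plausible technique that the disclosure does not prescribe: while the vehicle drives straight, the trailer stays nearly fixed in the rear-camera image while the ground and surroundings stream past, so temporally stable pixels mark the candidate trailer region.

```python
import numpy as np

def initial_calibration(frames, stable_fraction=0.2):
    """Sketch of the straight-driving calibration pass. 'frames' is a
    sequence of grayscale rear-camera images (2-D numpy arrays) captured
    while the vehicle drives straight. Low temporal variance marks the
    trailer; high variance marks the moving background ("noise")."""
    stack = np.stack([f.astype(np.float32) for f in frames])  # T x H x W
    variance = stack.var(axis=0)
    mean = stack.mean(axis=0)
    # Mask of temporally stable pixels: the candidate trailer region.
    mask = variance < np.quantile(variance, stable_fraction)
    # The masked appearance is stored to memory as calibration data so the
    # same trailer can later be recognized and located.
    return {"mask": mask, "template": np.where(mask, mean, 0.0)}
```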
During a scanning mode, a trailer scanning module 700 loads an existing feature matrix and scans for and attempts to locate the current position of the trailer using a recent feature-extracted image.
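A minimal sketch of such a scanning step follows, assuming normalized cross-correlation template matching via OpenCV; this is one of several possible matching techniques, and the stored feature matrix is modeled here simply as an image template.

```python
import cv2

def scan_for_trailer(feature_image, template):
    """Slide the stored calibration template over the current
    feature-extracted frame (both uint8 or float32 images) and take the
    best correlation response as the trailer's current position."""
    result = cv2.matchTemplate(feature_image, template, cv2.TM_CCOEFF_NORMED)
    # For TM_CCOEFF_NORMED the best match is the maximum response.
    _, score, _, top_left = cv2.minMaxLoc(result)
    h, w = template.shape[:2]
    center = (top_left[0] + w // 2, top_left[1] + h // 2)
    return center, score  # position estimate plus match confidence
```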
Thus, the present invention provides a vision-based system for detecting the trailer angle of a hitched trailer using a monocular camera. Image data captured by the camera may be processed to determine the presence or absence of a trailer, perform trailer calibration, extract the hitch point, and extract the trailer angle. Additionally, the image data may be used to determine trailer dimensions (such as trailer length, width, or beam length).
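As a worked illustration of the final step: once the hitch point and the trailer's current tracked position are known, the trailer angle follows from simple geometry. The coordinate frame and sign convention below are assumptions for illustration only.

```python
import math

def trailer_angle_deg(hitch_pt, trailer_pt):
    """Angle of the line from the hitch point to the trailer's tracked
    reference point, measured from the vehicle's longitudinal axis.
    Points are (x, y) in a top-down vehicle frame with +y rearward."""
    dx = trailer_pt[0] - hitch_pt[0]
    dy = trailer_pt[1] - hitch_pt[1]
    return math.degrees(math.atan2(dx, dy))

# A reference point 2.0 m behind and 0.5 m to one side of the hitch gives an
# articulation angle of about 14 degrees; the hitch-to-reference distance
# (here about 2.06 m) corresponds to the beam length mentioned above.
print(round(trailer_angle_deg((0.0, 0.0), (0.5, 2.0)), 1))  # 14.0
print(round(math.hypot(0.5, 2.0), 2))                       # 2.06
```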
The system may utilize aspects of the trailering or trailer angle detection systems or trailer hitch assist systems described in U.S. Pat. Nos. 9,085,261 and/or 6,690,268, and/or U.S. Publication Nos. US-2020-0017143; US-2019-0347825; US-2019-0297233; US-2019-0064831; US-2019-0016264; US-2018-0276839; US-2018-0276838; US-2018-0253608; US-2018-0215382; US-2018-0211528; US-2017-0254873; US-2017-0217372; US-2017-0050672; US-2015-0217693; US-2014-0160276; US-2014-0085472 and/or US-2015-0002670, and/or U.S. patent application Ser. No. 16/850,300, filed on Apr. 16, 2020, which are all hereby incorporated herein by reference in their entireties.
The system includes an image processor operable to process image data captured by the camera or cameras, such as for detecting objects or other vehicles or pedestrians or the like in the field of view of one or more of the cameras. For example, the image processor may comprise an image processing chip selected from the EYEQ family of image processing chips available from Mobileye Vision Technologies Ltd. of Jerusalem, Israel, and may include object detection software (such as the types described in U.S. Pat. Nos. 7,855,755; 7,720,580 and/or 7,038,577, which are hereby incorporated herein by reference in their entireties), and may analyze image data to detect vehicles and/or other objects. Responsive to such image processing, and when an object or other vehicle is detected, the system may generate an alert to the driver of the vehicle and/or may generate an overlay at the displayed image to highlight or enhance display of the detected object or vehicle, in order to enhance the driver's awareness of the detected object or vehicle or hazardous condition during a driving maneuver of the equipped vehicle.
The vehicle may include any type of sensor or sensors, such as imaging sensors or radar sensors or lidar sensors or ultrasonic sensors or the like. The imaging sensor or camera may capture image data for image processing and may comprise any suitable camera or sensing device, such as, for example, a two dimensional array of a plurality of photosensor elements arranged in at least 640 columns and 480 rows (at least a 640×480 imaging array, such as a megapixel imaging array or the like), with a respective lens focusing images onto respective portions of the array. The photosensor array may comprise a plurality of photosensor elements arranged in a photosensor array having rows and columns. Preferably, the imaging array has at least 300,000 photosensor elements or pixels, more preferably at least 500,000 photosensor elements or pixels and more preferably at least 1 million photosensor elements or pixels. The imaging array may capture color image data, such as via spectral filtering at the array, such as via an RGB (red, green and blue) filter or via a red/red complement filter or such as via an RCC (red, clear, clear) filter or the like. The logic and control circuit of the imaging sensor may function in any known manner, and the image processing and algorithmic processing may comprise any suitable means for processing the images and/or image data.
For example, the vision system and/or processing and/or camera and/or circuitry may utilize aspects described in U.S. Pat. Nos. 9,233,641; 9,146,898; 9,174,574; 9,090,234; 9,077,098; 8,818,042; 8,886,401; 9,077,962; 9,068,390; 9,140,789; 9,092,986; 9,205,776; 8,917,169; 8,694,224; 7,005,974; 5,760,962; 5,877,897; 5,796,094; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978; 7,859,565; 5,550,677; 5,670,935; 6,636,258; 7,145,519; 7,161,616; 7,230,640; 7,248,283; 7,295,229; 7,301,466; 7,592,928; 7,881,496; 7,720,580; 7,038,577; 6,882,287; 5,929,786 and/or 5,786,772, and/or U.S. Publication Nos. US-2014-0340510; US-2014-0313339; US-2014-0347486; US-2014-0320658; US-2014-0336876; US-2014-0307095; US-2014-0327774; US-2014-0327772; US-2014-0320636; US-2014-0293057; US-2014-0309884; US-2014-0226012; US-2014-0293042; US-2014-0218535; US-2014-0247354; US-2014-0247355; US-2014-0247352; US-2014-0232869; US-2014-0211009; US-2014-0160276; US-2014-0168437; US-2014-0168415; US-2014-0160291; US-2014-0152825; US-2014-0139676; US-2014-0138140; US-2014-0104426; US-2014-0098229; US-2014-0085472; US-2014-0067206; US-2014-0049646; US-2014-0052340; US-2014-0025240; US-2014-0028852; US-2014-005907; US-2013-0314503; US-2013-0298866; US-2013-0222593; US-2013-0300869; US-2013-0278769; US-2013-0258077; US-2013-0242099; US-2013-0215271; US-2013-0141578 and/or US-2013-0002873, which are all hereby incorporated herein by reference in their entireties. The system may communicate with other communication systems via any suitable means, such as by utilizing aspects of the systems described in International Publication Nos. WO 2010/144900; WO 2013/043661 and/or WO 2013/081985, and/or U.S. Pat. No. 9,126,525, which are hereby incorporated herein by reference in their entireties.
Optionally, the vision system may include a display for displaying images captured by one or more of the imaging sensors for viewing by the driver of the vehicle while the driver is normally operating the vehicle. Optionally, for example, the vision system may include a video display device, such as by utilizing aspects of the video display systems described in U.S. Pat. Nos. 5,530,240; 6,329,925; 7,855,755; 7,626,749; 7,581,859; 7,446,650; 7,338,177; 7,274,501; 7,255,451; 7,195,381; 7,184,190; 5,668,663; 5,724,187; 6,690,268; 7,370,983; 7,329,013; 7,308,341; 7,289,037; 7,249,860; 7,004,593; 4,546,551; 5,699,044; 4,953,305; 5,576,687; 5,632,092; 5,708,410; 5,737,226; 5,802,727; 5,878,370; 6,087,953; 6,173,501; 6,222,460; 6,513,252 and/or 6,642,851, and/or U.S. Publication Nos. US-2014-0022390; US-2012-0162427; US-2006-0050018 and/or US-2006-0061008, which are all hereby incorporated herein by reference in their entireties.
Changes and modifications in the specifically described embodiments can be carried out without departing from the principles of the invention, which is intended to be limited only by the scope of the appended claims, as interpreted according to the principles of patent law including the doctrine of equivalents.
The present application claims the filing benefits of U.S. provisional application Ser. No. 62/844,834, filed May 8, 2019, which is hereby incorporated herein by reference in its entirety.
U.S. Patent Documents

Number | Name | Date | Kind |
---|---|---|---|
5550677 | Schofield et al. | Aug 1996 | A |
5670935 | Schofield et al. | Sep 1997 | A |
5949331 | Schofield et al. | Sep 1999 | A |
6498620 | Schofield et al. | Dec 2002 | B2 |
6690268 | Schofield et al. | Feb 2004 | B2 |
7038577 | Pawlicki et al. | May 2006 | B2 |
7720580 | Higgins-Luthman | May 2010 | B2 |
7855755 | Weller et al. | Dec 2010 | B2 |
9085261 | Lu et al. | Jul 2015 | B2 |
9264672 | Lynam | Feb 2016 | B2 |
9446713 | Lu et al. | Sep 2016 | B2 |
9558409 | Pliefke et al. | Jan 2017 | B2 |
10071687 | Ihlenburg et al. | Sep 2018 | B2 |
10086870 | Gieseke et al. | Oct 2018 | B2 |
10099614 | Diessner | Oct 2018 | B2 |
10160382 | Pliefke et al. | Dec 2018 | B2 |
10532698 | Potnis et al. | Jan 2020 | B2 |
10552976 | Diessner et al. | Feb 2020 | B2 |
10586119 | Pliefke et al. | Mar 2020 | B2 |
10638025 | Gali et al. | Apr 2020 | B2 |
10706291 | Diessner et al. | Jul 2020 | B2 |
10733757 | Gupta et al. | Aug 2020 | B2 |
10755110 | Bajpai | Aug 2020 | B2 |
11417116 | Joseph et al. | Aug 2022 | B2 |
20140063197 | Yamamoto et al. | Mar 2014 | A1 |
20140085472 | Lu et al. | Mar 2014 | A1 |
20140160276 | Pliefke et al. | Jun 2014 | A1 |
20140267688 | Aich et al. | Sep 2014 | A1 |
20150002670 | Bajpai | Jan 2015 | A1 |
20150217693 | Pliefke et al. | Aug 2015 | A1 |
20160049020 | Kuehnle et al. | Feb 2016 | A1 |
20170050672 | Gieseke et al. | Feb 2017 | A1 |
20170174128 | Hu et al. | Jun 2017 | A1 |
20170217372 | Lu et al. | Aug 2017 | A1 |
20170254873 | Koravadi | Sep 2017 | A1 |
20170280091 | Greenwood et al. | Sep 2017 | A1 |
20170341583 | Zhang | Nov 2017 | A1 |
20180056868 | Naserian | Mar 2018 | A1 |
20180211528 | Seifert | Jul 2018 | A1 |
20180215382 | Gupta et al. | Aug 2018 | A1 |
20180253608 | Diessner et al. | Sep 2018 | A1 |
20180276838 | Gupta et al. | Sep 2018 | A1 |
20180276839 | Diessner et al. | Sep 2018 | A1 |
20190016264 | Potnis et al. | Jan 2019 | A1 |
20190039649 | Gieseke et al. | Feb 2019 | A1 |
20190042864 | Pliefke et al. | Feb 2019 | A1 |
20190061815 | Sanvicente Herrera | Feb 2019 | A1 |
20190064831 | Gali et al. | Feb 2019 | A1 |
20190118860 | Gali et al. | Apr 2019 | A1 |
20190143895 | Pliefke et al. | May 2019 | A1 |
20190241126 | Murad et al. | Aug 2019 | A1 |
20190297233 | Gali et al. | Sep 2019 | A1 |
20190329821 | Ziebart et al. | Oct 2019 | A1 |
20190347498 | Herman et al. | Nov 2019 | A1 |
20190347825 | Gupta et al. | Nov 2019 | A1 |
20200017143 | Gali | Jan 2020 | A1 |
20200334475 | Joseph et al. | Oct 2020 | A1 |
20200356788 | Joseph et al. | Nov 2020 | A1 |
20200361397 | Joseph et al. | Nov 2020 | A1 |
20200406967 | Yunus et al. | Dec 2020 | A1 |
20210078634 | Jalalmaab et al. | Mar 2021 | A1 |
20210094473 | Gali et al. | Apr 2021 | A1 |
20210170820 | Zhang | Jun 2021 | A1 |
20210170947 | Yunus et al. | Jun 2021 | A1 |
Prior Publication Data

Number | Date | Country
---|---|---
20200356788 A1 | Nov 2020 | US
Provisional Applications

Number | Date | Country
---|---|---
62844834 | May 2019 | US