The present invention relates generally to a vehicle vision system for a vehicle and, more particularly, to a vehicle vision system that utilizes one or more cameras at a vehicle.
Use of imaging sensors in vehicle imaging systems is common and known. Examples of such known systems are described in U.S. Pat. Nos. 5,949,331; 5,670,935 and/or 5,550,677, which are hereby incorporated herein by reference in their entireties.
The present invention provides a vehicular trailer assist or driving assistance system or vision system for a vehicle that utilizes one or more cameras (preferably one or more CMOS cameras) to capture image data representative of images exterior of the vehicle, and includes a camera disposed at a rear portion of a vehicle and having a field of view exterior of the vehicle, the field of view encompassing at least a portion of a trailer hitched to the vehicle. The system also includes a control comprising an image processor operable to process image data captured by the camera, the image data captured by the camera being representative of the trailer hitched to the vehicle. The control, responsive to image processing of image data captured by the camera, is operable to determine a trailer angle of the trailer relative to the vehicle. Responsive to determining the trailer angle, the control is operable to determine a trailer direction, the determined trailer direction being based at least in part on the determined trailer angle. The control is also operable to determine a virtual destination location that is a predetermined distance from the trailer and in the determined trailer direction. Responsive to determining the trailer direction and the virtual destination location, and during a reversing maneuver of the vehicle and trailer, the control controls steering of the vehicle to steer the vehicle and direct the trailer in the determined trailer direction.
These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.
A vehicle and trailer maneuvering system or maneuver assist system and/or driving assist system operates to capture images exterior of the vehicle and of a trailer being towed by the vehicle, and may process the captured image data to determine a path of travel for the vehicle and trailer and to detect objects at or near the vehicle and in the predicted path of the vehicle, such as to assist a driver of the vehicle in maneuvering the vehicle and trailer in a rearward direction. The trailer assist system includes an image processor or image processing system that is operable to receive image data from one or more cameras and may provide an output to a display device for displaying images representative of the captured image data. Optionally, the trailer assist system may provide a display, such as a rearview display or a top down or bird's eye or surround view display or the like.
Referring now to the drawings and the illustrative embodiments depicted therein, a vehicle 10 includes a trailer maneuver assist system 12 that is operable to assist in backing up or reversing the vehicle with a trailer 16 hitched to the vehicle at a hitch 14, and may maneuver the vehicle 10 and trailer 16 toward a desired or selected destination location. The trailer maneuver assist system 12 includes at least one exterior viewing vehicle-based imaging sensor or camera, such as a rearward viewing imaging sensor or camera 18 (and the system may optionally include multiple exterior viewing imaging sensors or cameras, such as a sideward/rearward viewing camera at respective sides of the vehicle), which captures image data representative of the scene exterior of the vehicle 10, which includes the hitch 14 and/or trailer 16, with the camera 18 having a lens for focusing images at or onto an imaging array or imaging plane or imager of the camera.
Trailer angle detection determines the trailer angle 24 through the use of the rear view camera 18, with the system measuring or determining or estimating the angle 24 between the towing vehicle 10 and the trailer 16 via processing of image data captured by the rear camera 18 of the towing vehicle 10. After determining the set direction 22 (i.e., the direction in which the trailer 16 is to be directed, based at least in part on the determined trailer angle 24), the system 12 controls steering of the towing vehicle (e.g., turns the steering wheel) to direct the trailer 16 when the vehicle 10 is moving or maneuvering in a reverse direction. A driver of the vehicle 10 may be responsible for controlling acceleration/braking (i.e., speed) and gear selection of the towing vehicle 10.
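By way of illustration, such a camera-based trailer angle estimate may be derived from the image-plane position of a feature tracked on the trailer drawbar. The following is a minimal sketch assuming a calibrated pinhole rear camera mounted roughly above the hitch point; the function name, parameters, and sign convention are illustrative and not taken from the source.

```python
import math

def estimate_trailer_angle(feature_px_x, image_width_px, hfov_deg, camera_yaw_deg=0.0):
    """Estimate the trailer angle (degrees) from the horizontal pixel position
    of a feature tracked on the trailer drawbar (hypothetical helper; assumes
    a pinhole rear camera mounted roughly above the hitch point)."""
    # Focal length in pixels, derived from the horizontal field of view.
    focal_px = (image_width_px / 2.0) / math.tan(math.radians(hfov_deg / 2.0))
    # Horizontal offset of the tracked feature from the optical axis.
    dx = feature_px_x - image_width_px / 2.0
    # Bearing of the feature relative to the camera axis; with the camera at
    # the hitch, this approximates the vehicle-to-trailer angle.
    bearing_deg = math.degrees(math.atan2(dx, focal_px))
    # Remove any static yaw of the camera relative to the vehicle centerline.
    return bearing_deg - camera_yaw_deg
```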
The trailer assist system 12 determines the virtual destination location (VDL) 40 at a distance far enough from the trailer 16 so that offset error is minimized. For example, the offset error for every 100 meters of distance travelled by the trailer may be less than 5 cm. The offset error is defined as the lateral error between the estimated trailer location at each waypoint of a planned path 42 to the virtual destination location 40 and the actual trailer location during the maneuver towards the virtual destination location 40. The system 12 may use the vehicle 10 and trailer 16 dimensions to calculate the virtual destination location 40. The system 12 plans the path 42 to direct the trailer 16 from its current location to the virtual destination location 40, and estimates the vehicle control parameters required at each waypoint in the planned path 42.
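For instance, with the trailer pose expressed in a planar world or odometry frame, the virtual destination location reduces to a point projected a fixed distance from the trailer along the determined trailer direction. A minimal sketch under that assumption (names are illustrative):

```python
import math

def virtual_destination(trailer_x_m, trailer_y_m, trailer_heading_rad, distance_m):
    """Project the virtual destination location (VDL) a fixed distance from
    the trailer along the determined trailer direction (planar world frame)."""
    vdl_x = trailer_x_m + distance_m * math.cos(trailer_heading_rad)
    vdl_y = trailer_y_m + distance_m * math.sin(trailer_heading_rad)
    return vdl_x, vdl_y
```

A more distant VDL makes the bearing from the trailer to the destination less sensitive to small errors in the estimated trailer pose, which is consistent with placing the VDL far enough away to keep the offset error small.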
After determining the planned path 42, the system 12 controls the vehicle to direct the trailer 16 towards the virtual destination location 40. The system may move the virtual destination location 40 by the distance travelled by the trailer, incrementing or decrementing the distance from the trailer 16 to the virtual destination location 40 so that the distance between the trailer and the VDL remains constant. Put another way, if the trailer 16 travelled, for example, ten meters towards the virtual destination location 40, then the virtual destination location 40 may be moved ten meters further away; if the trailer 16 travelled ten meters away from the virtual destination location 40, then the virtual destination location 40 may be moved ten meters closer. In this way, the distance between the current trailer 16 position and the virtual destination location 40 may remain unchanged and constant. Note that ten meters is used just as an example, and the vehicle and trailer may travel any distance.
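One way to realize this constant-gap behavior, sketched below under the same planar-frame assumption, is to slide the VDL along the determined trailer direction by the signed distance the trailer just travelled:

```python
def advance_vdl(vdl_x_m, vdl_y_m, dir_unit_x, dir_unit_y, delta_travelled_m):
    """Slide the VDL along the determined trailer direction by the signed
    distance the trailer travelled (positive toward the VDL, negative away),
    so that the trailer-to-VDL gap stays constant by construction."""
    return (vdl_x_m + delta_travelled_m * dir_unit_x,
            vdl_y_m + delta_travelled_m * dir_unit_y)
```

For example, after the trailer closes ten meters toward the VDL, advance_vdl pushes the VDL ten meters further out, restoring the original gap.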
The range between the minimum and maximum angles allowed by the VDL changer may be divided into N discrete steps, and the VDL changer may map each step to a trailer angle. For example, the minimum step of the VDL changer (e.g., step 0) may map to an angle greater than a minimum trailer threshold angle (which may be a negative value), and the maximum step (e.g., step N−1) may map to an angle less than a maximum trailer threshold angle (which may be a positive value). A center step (e.g., step (N−1)/2) may correspond to a trailer angle of zero and may be set as the default value for the VDL changer.
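Under these assumptions, the step-to-angle mapping can be a simple linear interpolation between the VDL changer's angle limits, with the center step mapping to a zero trailer angle when the limits are symmetric. A sketch (parameter names illustrative):

```python
def vdl_step_to_angle(step, n_steps, angle_min_deg, angle_max_deg):
    """Map a discrete VDL-changer step (0..n_steps-1) linearly onto the
    allowed trailer-angle range.  With angle_min_deg == -angle_max_deg, the
    center step (n_steps - 1) / 2 maps to zero (trailer straight).  Assumes
    n_steps >= 2 and limits strictly inside the trailer threshold angles."""
    step = max(0, min(n_steps - 1, step))  # clamp to the valid step range
    span = angle_max_deg - angle_min_deg
    return angle_min_deg + step * span / (n_steps - 1)
```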
Once the new or updated VDL is set (at an equivalent distance from the trailer as the original VDL), the path to the new VDL is planned. The trailer assist system 12 then defines waypoints along the planned path with necessary steering controls for proper trailer alignment. After planning the path, the system 12 may control steering of the towing vehicle 10 to direct the trailer along the planned path. As the trailer 16 moves, the system 12 may update the VDL changer parameters (e.g., the minimum and maximum values, the jackknife and collision zones, etc.). The user may deactivate the system 12 or otherwise halt the towing vehicle 10 at any point during the maneuver (e.g., by pressing the brake pedal).
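A waypoint-following loop consistent with this description might apply the steering command stored at each waypoint plus a small feedback correction for the measured lateral deviation of the trailer from the planned path. The sketch below assumes hypothetical pose and steering interfaces and an illustrative gain; it is not the source's control law.

```python
import math
from dataclasses import dataclass

@dataclass
class Waypoint:
    x: float         # planned trailer position (m, world frame)
    y: float
    heading: float   # planned trailer heading at this waypoint (rad)
    steering: float  # steering command estimated during path planning (rad)

def follow_planned_path(waypoints, get_trailer_pose, set_steering, k_lat=0.5):
    """Step through the planned waypoints, applying the precomputed steering
    plus a proportional correction for the trailer's measured lateral
    deviation from the path (k_lat is an illustrative tuning gain)."""
    for wp in waypoints:
        x, y, _heading = get_trailer_pose()
        # Signed lateral deviation of the trailer from the planned path,
        # measured perpendicular to the planned heading at this waypoint.
        lateral_m = (-math.sin(wp.heading) * (x - wp.x)
                     + math.cos(wp.heading) * (y - wp.y))
        set_steering(wp.steering - k_lat * lateral_m)
```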
Thus, the present invention provides a trailer assist system where the trailer assist is a closed loop system. By calculating the path 42 towards a virtual destination location 40, the trailer assist system 12 obtains prior knowledge of the path 42 along which the trailer 16 will move, and the same reference data may be used to identify and correct any deviations/errors in the maneuvers. As previously discussed, the distance between the virtual destination location 40 and the trailer 16 never changes and therefore the system 12 never accumulates errors in trailer angle detection. This helps ensure the trailer 16 never jackknifes. Further, the system 12 ensures the trailer never collides with the towing vehicle 10. The system 12 may compensate for any errors in vehicle/trailer kinematic models so that the trailer reaches the desired location accurately.
The trailer assist system may utilize aspects of the trailering or trailer angle detection systems or trailer hitch assist systems described in U.S. Pat. Nos. 9,085,261 and/or 6,690,268, and/or U.S. Publication Nos. US-2019-0064831; US-2019-0016264; US-2018-0276839; US-2018-0276838; US-2018-0253608; US-2018-0215382; US-2018-0211528; US-2017-0254873; US-2017-0217372; US-2017-0050672; US-2015-0217693; US-2014-0160276; US-2014-0085472 and/or US-2015-0002670, which are hereby incorporated herein by reference in their entireties.
The system may utilize aspects of the trailering assist systems or trailer angle detection systems or trailer hitch assist systems described in U.S. Pat. Nos. 9,446,713; 9,085,261 and/or 6,690,268, and/or U.S. Publication Nos. US-2019-0118860; US-2019-0064831; US-2019-0042864; US-2019-0039649; US-2019-0143895; US-2019-0016264; US-2018-0276839; US-2018-0276838; US-2018-0253608; US-2018-0215382; US-2017-0254873; US-2017-0050672; US-2015-0217693; US-2014-0160276; US-2014-0085472 and/or US-2015-0002670, and/or U.S. patent applications, Ser. No. 16/441,220, filed on Jun. 14, 2019, now U.S. Pat. No. 10,638,025, and/or Ser. No. 16/408,613, filed on May 10, 2019, now U.S. Pat. No. 10,733,757, which are all hereby incorporated herein by reference in their entireties.
The camera or sensor may comprise any suitable camera or sensor. Optionally, the camera may comprise a “smart camera” that includes the imaging sensor array and associated circuitry and image processing circuitry and electrical connectors and the like as part of a camera module, such as by utilizing aspects of the vision systems described in International Publication Nos. WO 2013/081984 and/or WO 2013/081985, which are hereby incorporated herein by reference in their entireties.
The system includes an image processor operable to process image data captured by the camera or cameras, such as for detecting objects or other vehicles or pedestrians or the like in the field of view of one or more of the cameras. For example, the image processor may comprise an image processing chip selected from the EYEQ family of image processing chips available from Mobileye Vision Technologies Ltd. of Jerusalem, Israel, and may include object detection software (such as the types described in U.S. Pat. Nos. 7,855,755; 7,720,580 and/or 7,038,577, which are hereby incorporated herein by reference in their entireties), and may analyze image data to detect vehicles and/or other objects. Responsive to such image processing, and when an object or other vehicle is detected, the system may generate an alert to the driver of the vehicle and/or may generate an overlay at the displayed image to highlight or enhance display of the detected object or vehicle, in order to enhance the driver's awareness of the detected object or vehicle or hazardous condition during a driving maneuver of the equipped vehicle.
The vehicle may include any type of sensor or sensors, such as imaging sensors or radar sensors or lidar sensors or ladar sensors or ultrasonic sensors or the like. The imaging sensor or camera may capture image data for image processing and may comprise any suitable camera or sensing device, such as, for example, a two dimensional array of a plurality of photosensor elements arranged in at least 640 columns and 480 rows (at least a 640×480 imaging array, such as a megapixel imaging array or the like), with a respective lens focusing images onto respective portions of the array. The photosensor array may comprise a plurality of photosensor elements arranged in a photosensor array having rows and columns. Preferably, the imaging array has at least 300,000 photosensor elements or pixels, more preferably at least 500,000 photosensor elements or pixels and more preferably at least 1 million photosensor elements or pixels. The imaging array may capture color image data, such as via spectral filtering at the array, such as via an RGB (red, green and blue) filter or via a red/red complement filter or such as via an RCC (red, clear, clear) filter or the like. The logic and control circuit of the imaging sensor may function in any known manner, and the image processing and algorithmic processing may comprise any suitable means for processing the images and/or image data.
For example, the system and/or processing and/or camera and/or circuitry may utilize aspects described in U.S. Pat. Nos. 9,233,641; 9,146,898; 9,174,574; 9,090,234; 9,077,098; 8,818,042; 8,886,401; 9,077,962; 9,068,390; 9,140,789; 9,092,986; 9,205,776; 8,917,169; 8,694,224; 7,005,974; 5,760,962; 5,877,897; 5,796,094; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978; 7,859,565; 5,550,677; 5,670,935; 6,636,258; 7,145,519; 7,161,616; 7,230,640; 7,248,283; 7,295,229; 7,301,466; 7,592,928; 7,881,496; 7,720,580; 7,038,577; 6,882,287; 5,929,786 and/or 5,786,772, and/or U.S. Publication Nos. US-2014-0340510; US-2014-0313339; US-2014-0347486; US-2014-0320658; US-2014-0336876; US-2014-0307095; US-2014-0327774; US-2014-0327772; US-2014-0320636; US-2014-0293057; US-2014-0309884; US-2014-0226012; US-2014-0293042; US-2014-0218535; US-2014-0247354; US-2014-0247355; US-2014-0247352; US-2014-0232869; US-2014-0211009; US-2014-0160276; US-2014-0168437; US-2014-0168415; US-2014-0160291; US-2014-0152825; US-2014-0139676; US-2014-0138140; US-2014-0104426; US-2014-0098229; US-2014-0085472; US-2014-0067206; US-2014-0049646; US-2014-0052340; US-2014-0025240; US-2014-0028852; US-2014-005907; US-2013-0314503; US-2013-0298866; US-2013-0222593; US-2013-0300869; US-2013-0278769; US-2013-0258077; US-2013-0242099; US-2013-0215271; US-2013-0141578 and/or US-2013-0002873, which are all hereby incorporated herein by reference in their entireties. The system may communicate with other communication systems via any suitable means, such as by utilizing aspects of the systems described in International Publication Nos. WO 2010/144900; WO 2013/043661 and/or WO 2013/081985, and/or U.S. Pat. No. 9,126,525, which are hereby incorporated herein by reference in their entireties.
Optionally, the system may include a display for displaying images captured by one or more of the imaging sensors for viewing by the driver of the vehicle while the driver is normally operating the vehicle. Optionally, for example, the system may include a video display device, such as by utilizing aspects of the video display systems described in U.S. Pat. Nos. 5,530,240; 6,329,925; 7,855,755; 7,626,749; 7,581,859; 7,446,650; 7,338,177; 7,274,501; 7,255,451; 7,195,381; 7,184,190; 5,668,663; 5,724,187; 6,690,268; 7,370,983; 7,329,013; 7,308,341; 7,289,037; 7,249,860; 7,004,593; 4,546,551; 5,699,044; 4,953,305; 5,576,687; 5,632,092; 5,708,410; 5,737,226; 5,802,727; 5,878,370; 6,087,953; 6,173,501; 6,222,460; 6,513,252 and/or 6,642,851, and/or U.S. Publication Nos. US-2014-0022390; US-2012-0162427; US-2006-0050018 and/or US-2006-0061008, which are all hereby incorporated herein by reference in their entireties.
Changes and modifications in the specifically described embodiments can be carried out without departing from the principles of the invention, which is intended to be limited only by the scope of the appended claims, as interpreted according to the principles of patent law including the doctrine of equivalents.
The present application is a continuation of U.S. patent application Ser. No. 16/512,458, filed Jul. 16, 2019, now U.S. Pat. No. 11,273,868, which claims the filing benefits of U.S. provisional applications, Ser. No. 62/717,108, filed Aug. 10, 2018, and Ser. No. 62/698,415, filed Jul. 16, 2018, which are hereby incorporated herein by reference in their entireties.
Number | Name | Date | Kind |
---|---|---|---|
5550677 | Schofield et al. | Aug 1996 | A |
5670935 | Schofield et al. | Sep 1997 | A |
5949331 | Schofield et al. | Sep 1999 | A |
6690268 | Schofield et al. | Feb 2004 | B2 |
7038577 | Pawlicki et al. | May 2006 | B2 |
7720580 | Higgins-Luthman | May 2010 | B2 |
7855755 | Weller et al. | Dec 2010 | B2 |
8874317 | Marczok et al. | Oct 2014 | B2 |
9085261 | Lu et al. | Jul 2015 | B2 |
9446713 | Lu et al. | Sep 2016 | B2 |
9937953 | Lavoie et al. | Apr 2018 | B2 |
11067993 | Gali et al. | Jul 2021 | B2 |
11273868 | Gali | Mar 2022 | B2 |
20090319100 | Kale et al. | Dec 2009 | A1 |
20140085472 | Lu et al. | Mar 2014 | A1 |
20140160276 | Pliefke et al. | Jun 2014 | A1 |
20140309888 | Smit | Oct 2014 | A1 |
20150002670 | Bajpai | Jan 2015 | A1 |
20150158527 | Hafner et al. | Jun 2015 | A1 |
20150197281 | Miller et al. | Jul 2015 | A1 |
20150217693 | Pliefke et al. | Aug 2015 | A1 |
20150344028 | Gieseke et al. | Dec 2015 | A1 |
20160280267 | Lavoie | Sep 2016 | A1 |
20170015312 | Latotzki | Jan 2017 | A1 |
20170017847 | Nakaya | Jan 2017 | A1 |
20170017848 | Gupta et al. | Jan 2017 | A1 |
20170050672 | Gieseke et al. | Feb 2017 | A1 |
20170217372 | Lu et al. | Aug 2017 | A1 |
20170253237 | Diessner | Sep 2017 | A1 |
20170254873 | Koravadi | Sep 2017 | A1 |
20170317748 | Krapf | Nov 2017 | A1 |
20170329346 | Latotzki | Nov 2017 | A1 |
20180141658 | Baur | May 2018 | A1 |
20180211528 | Seifert | Jul 2018 | A1 |
20180215382 | Gupta et al. | Aug 2018 | A1 |
20180253608 | Diessner et al. | Sep 2018 | A1 |
20180276838 | Gupta et al. | Sep 2018 | A1 |
20180276839 | Diessner et al. | Sep 2018 | A1 |
20190016264 | Potnis et al. | Jan 2019 | A1 |
20190039649 | Gieseke et al. | Feb 2019 | A1 |
20190042864 | Pliefke et al. | Feb 2019 | A1 |
20190064831 | Gali et al. | Feb 2019 | A1 |
20190066503 | Li et al. | Feb 2019 | A1 |
20190118860 | Gali et al. | Apr 2019 | A1 |
20190143895 | Pliefke et al. | May 2019 | A1 |
20190297233 | Gali et al. | Sep 2019 | A1 |
20190347825 | Gupta et al. | Nov 2019 | A1 |
20200017143 | Gali | Jan 2020 | A1 |
20220204081 | Gali | Jun 2022 | A1 |
Number | Date | Country
---|---|---
20220204081 A1 | Jun 2022 | US
Number | Date | Country
---|---|---
62717108 | Aug 2018 | US
62698415 | Jul 2018 | US
Relation | Number | Date | Country
---|---|---|---
Parent | 16512458 | Jul 2019 | US
Child | 17654605 | | US