The present invention relates generally to a vehicle vision system for a vehicle and, more particularly, to a vehicle vision system that utilizes one or more cameras at a vehicle.
Use of imaging sensors in vehicle imaging systems is common and known. Examples of such known systems are described in U.S. Pat. Nos. 5,949,331; 5,670,935 and/or 5,550,677, which are hereby incorporated herein by reference in their entireties. Trailer assist systems are known that may determine an angle of a trailer hitched at a vehicle. Examples of such known systems are described in U.S. Pat. Nos. 9,085,261 and/or 6,690,268, which are hereby incorporated herein by reference in their entireties.
The present invention provides a driver assistance system or vision system or imaging system for a vehicle that utilizes a camera disposed at a rear portion of a vehicle and having a field of view exterior of the vehicle, the field of view encompassing at least a portion of a trailer hitched to the vehicle. The system also includes a steering wheel angle sensor operable to determine a steering wheel angle of the vehicle and a wheel revolutions per minute (RPM) sensor operable to determine an RPM of a wheel of the vehicle. The system also includes a control comprising an image processor operable to process image data captured by the camera and sensor data captured by the sensors, with the image data captured by the camera representative of the trailer hitched to the vehicle. The control, responsive to processing of image data captured by the camera, determines a trailer angle of the trailer relative to the vehicle and may determine or be provided with a hitch location relative to a rear axle of the vehicle. The control, responsive to processing of sensor data from the steering wheel angle sensor and the RPM sensor, determines a steering wheel angle of the vehicle and a wheel RPM of the vehicle. The control estimates a trailer beam length using the determined trailer angle, the determined steering wheel angle, and the determined wheel RPM. The determined or provided hitch location relative to the rear axle of the vehicle may also be used in the trailer beam length estimation.
These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.
A vehicle and trailer maneuvering system or maneuver assist system and/or driving assist system operates to capture images exterior of the vehicle and of a trailer being towed by the vehicle and may process the captured image data to determine a path of travel for the vehicle and trailer and to detect objects at or near the vehicle and in the predicted path of the vehicle, such as to assist a driver of the vehicle in maneuvering the vehicle and trailer in a rearward direction. The vision system includes an image processor or image processing system that is operable to receive image data from one or more cameras and that may provide an output to a display device for displaying images representative of the captured image data. Optionally, the vision system may provide a display, such as a rearview display or a top down or bird's eye or surround view display or the like.
Referring now to the drawings and the illustrative embodiments depicted therein, a vehicle 10 includes a trailer assist system 12 (such as part of a rear backup assist system) that is operable to assist in backing up or reversing the vehicle with a hitched trailer that is hitched at the rear of the vehicle via a trailer hitch 14, and the system may maneuver the vehicle 10 and trailer 16 toward a desired or selected location. The trailer assist system 12 includes at least one exterior viewing vehicle-based imaging sensor or camera, such as a rearward viewing imaging sensor or camera 18, which may comprise a rear backup camera of the vehicle (and the system may optionally include multiple exterior viewing imaging sensors or cameras, such as a sideward/rearward viewing camera at respective sides of the vehicle), which captures image data representative of the scene exterior and at least rearward of the vehicle 10, with the field of view of the camera encompassing the trailer hitch 14 and/or trailer 16, and with the camera 18 having a lens for focusing images at or onto an imaging array or imaging plane or imager of the camera.
Trailer beam length is an important if not essential parameter for many automated trailering applications (e.g., reversing, jackknife prevention, parking, etc.) because automated trailering requires accurate position and heading predictions for the vehicle-trailer system. The accuracy of these kinematic predictions depends on the measurement accuracy of parameters such as beam length. Trailer beam length for single-axle trailers is defined as the length from the hitch point to the rear axle. For multi-axle trailers, the effective kinematic trailer beam length is defined as the length from the hitch point to an intermediate point between the multiple axles. Ideally, beam length is measured or estimated or determined using an automatic or online method, as otherwise the user or driver must manually measure it every time a different trailer is hitched. Manually measuring the beam length can be difficult, especially for long and/or multi-axle trailers. Additionally, the accuracy of such measurements is highly dependent upon the user and the tool used.
Implementations of the present invention estimate trailer beam length online using vehicle and trailer kinematics with input from vehicle sensors (e.g., a visual hitch point detection system and a trailer angle detection system). The implementations provided estimate the trailer beam length for any trailer currently hitched to the vehicle with minimal user setup requirements and eliminate the need for the driver to manually measure the beam length.
In some examples, a gyroscope 19c measures a raw vehicle yaw rate 34 ($\dot{\theta}_{G,raw}$) and an accelerometer 19d measures a raw lateral acceleration 36 ($a_{lat,raw}$), which are used to calculate a vehicle yaw rate 40 at a vehicle yaw rate calculator 38. The vehicle yaw rate 40 may be calculated in a number of different ways. For example, the system may calculate the vehicle yaw rate 40 ($\dot{\theta}_B$) using the steering wheel angle sensor 19a and the bicycle kinematic model of Equation (1):

$$\dot{\theta}_B = \frac{V\,\tan(\delta)}{L_{wheelbase}} \tag{1}$$
In Equation (1), $V$ represents the vehicle speed, $L_{wheelbase}$ represents the vehicle's wheelbase, and $\delta$ represents the vehicle front wheel angle (e.g., from a lookup table mapping $\delta$ to steering wheel angle (SWA)). The vehicle yaw rate 40 ($\dot{\theta}_G$) may also be calculated using the gyroscope 19c with Equation (2):
$$\dot{\theta}_G = \dot{\theta}_{G,raw} - \dot{\theta}_{G,bias} \tag{2}$$
In Equation (2), $\dot{\theta}_{G,raw}$ represents the raw gyroscope measurement, while $\dot{\theta}_{G,bias}$ represents the gyroscope zero bias. The vehicle yaw rate 40 ($\dot{\theta}_{ws}$) may also be calculated using differential wheel speeds from the wheel RPM sensors 19b using Equation (3):

$$\dot{\theta}_{ws} = \frac{V_{RR} - V_{RL}}{L_{tw}} - \dot{\theta}_{ws,bias} \tag{3}$$
In Equation (3), $V_{RL}$ and $V_{RR}$ represent the rear left and rear right wheel speeds, respectively. The track width is represented by $L_{tw}$, and $\dot{\theta}_{ws,bias}$ represents the zero bias in the yaw rate calculated from wheel speeds. The vehicle yaw rate 40 ($\dot{\theta}_a$) may also be calculated using the accelerometer 19d using Equation (4):

$$\dot{\theta}_a = \frac{a_{lat,raw} - a_{lat,bias}}{V} \tag{4}$$
In Equation (4), $V$ represents the vehicle speed, $a_{lat,raw}$ represents the raw lateral acceleration measurement, and $a_{lat,bias}$ represents the lateral acceleration zero bias. In some implementations, the vehicle yaw rate 40 may be estimated from any subset of these sensors using Equation (5):
$$\dot{\theta} = w_B\dot{\theta}_B + w_G\dot{\theta}_G + w_{ws}\dot{\theta}_{ws} + w_a\dot{\theta}_a \tag{5}$$
In Equation (5), $w_B$, $w_G$, $w_{ws}$, and $w_a$ are individual weights for the respective sensors 19a-d based on, for example, sensor noise characteristics, resolution, and accuracy of estimation. That is, each sensor 19 may be individually weighted, and the vehicle yaw rate may be estimated based on any subset of the sensors (with the weights shifting accordingly). For example, a more accurate and/or more reliable sensor 19 may be weighted more heavily than a less accurate and/or less reliable sensor 19. A sensor 19 that fails or otherwise fails to meet a threshold of quality and/or reliability may be removed from the calculation entirely (i.e., given a weight of zero).
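As an illustrative, non-limiting sketch of the weighted fusion of Equation (5), the following Python snippet combines whichever yaw rate estimates are available, dropping failed sensors (weight of zero) and renormalizing the remaining weights. The function and variable names are hypothetical, not from the source:

```python
def fused_yaw_rate(estimates, weights):
    """Weighted fusion of per-sensor yaw rate estimates (Equation (5)).

    estimates: dict mapping sensor name -> yaw rate estimate (rad/s),
               with None for a failed or unavailable sensor.
    weights:   dict mapping sensor name -> nominal weight (based on, e.g.,
               noise characteristics, resolution, and estimation accuracy).
    """
    # Drop failed sensors (effective weight of zero) and renormalize so
    # the remaining weights sum to one.
    valid = {k: w for k, w in weights.items() if estimates.get(k) is not None}
    total = sum(valid.values())
    if total == 0.0:
        raise ValueError("no valid yaw rate source available")
    return sum((w / total) * estimates[k] for k, w in valid.items())

# Example: the gyroscope is trusted most; the accelerometer has failed.
theta_dot = fused_yaw_rate(
    estimates={"bicycle": 0.101, "gyro": 0.098, "wheel_speed": 0.105, "accel": None},
    weights={"bicycle": 0.2, "gyro": 0.4, "wheel_speed": 0.3, "accel": 0.1},
)
```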
Responsive to calibration driving maneuvers conducted by a driver of the vehicle (and possibly prompted by the system), the outputs 30a, 30b, 32 of the trailer angle and hitch detection system 20 (i.e., $\varphi$, $\dot{\varphi}$, and $L_{hitch}$) and the calculated vehicle yaw rate 40 may be used to determine when the relative vehicle-trailer angular velocity reaches an approximate steady state at steady state determiner 44. That is, these outputs are used to determine when the relative trailer angular velocity is below a threshold amount (e.g., at or near zero). In some examples, a front wheel angle 46 ($\delta$) derived at least in part from a steering wheel angle (SWA) output 42 from the steering wheel angle sensor 19a may be determined at a wheel angle converter 48, which in turn may be used to assist in the steady state determination at the determiner 44.
In some implementations, the output of the wheel RPM sensor 19b is used to calculate the speed ($V$) of the vehicle at a speed calculator 50. For example, the speed may be calculated using the following Equation (6):
$$V = \frac{1}{2}\left(RPM_{rl} + RPM_{rr}\right)\cdot\frac{L_{whlcirc}}{60} \tag{6}$$
In Equation (6), $V$ is the vehicle speed (e.g., in meters per second), $RPM_{rl}$ and $RPM_{rr}$ are the rear left and rear right wheel revolutions per minute, respectively, and $L_{whlcirc}$ is the effective wheel circumference (e.g., in meters); the factor of 60 converts revolutions per minute to revolutions per second. In some examples, the driver may be required to drive the vehicle during calibration maneuvers at a speed that exceeds a minimum measurement threshold for the wheel RPM sensors for the particular vehicle (i.e., each wheel revolves at a rate greater than a threshold).
Using the velocity of the vehicle $V$, the system determines, at threshold determiner 52, whether the vehicle is moving (i.e., whether $V$ is not equal to or near zero), whether the absolute value of the SWA 42 is greater than a minimum threshold steering angle and less than a maximum threshold steering angle, and whether the absolute value of the rate of change of the SWA 42 is less than a threshold steer rate. If all three conditions are true, the system enables the steady state determiner 44; the steady state determiner 44 always outputs false when not enabled by the threshold determiner 52. When enabled, the steady state determiner 44 outputs true or false based on the inputs 30a, 30b, 32, 40 as described above.
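A minimal sketch of the speed calculation of Equation (6) and the enabling conditions evaluated at the threshold determiner 52 might look as follows. The threshold values and function names are illustrative placeholders, not from the source:

```python
def vehicle_speed(rpm_rl, rpm_rr, wheel_circumference_m):
    """Equation (6): average rear wheel RPM times circumference,
    divided by 60 to convert to meters per second."""
    return 0.5 * (rpm_rl + rpm_rr) * wheel_circumference_m / 60.0

def steady_state_enabled(v, swa, swa_rate,
                         swa_min=0.05, swa_max=0.6, swa_rate_max=0.02):
    """Threshold determiner 52: enable the steady state determiner 44 only
    when the vehicle is moving, the steering wheel angle (rad) is within
    the [min, max] band in magnitude, and the steering rate is small.
    All thresholds here are placeholder values."""
    moving = abs(v) > 1e-3            # V not equal to (or near) zero
    in_band = swa_min < abs(swa) < swa_max
    settled = abs(swa_rate) < swa_rate_max
    return moving and in_band and settled
```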
The relative vehicle-trailer kinematics may be modeled using Equation (7):

$$\dot{\varphi} = -\dot{\theta}\left(1 + \frac{L_{hitch}}{L_{beam}}\cos\varphi\right) - \frac{V}{L_{beam}}\sin\varphi \tag{7}$$
In Equation (7), $\varphi$ is the relative vehicle-trailer angle, $L_{beam}$ is the trailer beam length, and $L_{hitch}$ is the distance from the vehicle rear axle to the hitch point. The vehicle yaw rate is represented by $\dot{\theta}$, and $V$ still represents the vehicle speed. Based on Equation (7), the unsteady state trailer beam length (i.e., when $\dot{\varphi}$ is not equal to or near zero) may be estimated using:

$$L_{beam} = -\frac{\dot{\theta}\,L_{hitch}\cos\varphi + V\sin\varphi}{\dot{\varphi} + \dot{\theta}} \tag{8}$$
The steady state trailer beam length (i.e., when $\dot{\varphi}$ is equal to or near zero) may be estimated using:

$$L_{beam} = -L_{hitch}\cos\varphi - \frac{V}{\dot{\theta}}\sin\varphi \tag{9}$$
The vehicle-trailer kinematics of Equations (7)-(9) may be dependent upon the assumption that the vehicle and trailer are moving in the same direction as their wheel direction with proper wheel/ground contact (i.e., neither the vehicle nor the trailer is sliding or slipping).
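For illustration, a direct evaluation of Equations (8) and (9) as reconstructed above (hypothetical function names; inputs in SI units and radians) could look like this:

```python
import math

def beam_length_unsteady(phi, phi_dot, theta_dot, v, l_hitch):
    """Equation (8): beam length estimate when phi_dot is not near zero.
    Assumes phi_dot + theta_dot is nonzero."""
    return -(theta_dot * l_hitch * math.cos(phi) + v * math.sin(phi)) / (phi_dot + theta_dot)

def beam_length_steady(phi, theta_dot, v, l_hitch):
    """Equation (9): beam length estimate at steady state (phi_dot ~ 0).
    Assumes theta_dot is nonzero, which the threshold determiner's
    minimum steering angle condition is intended to guarantee."""
    return -l_hitch * math.cos(phi) - (v / theta_dot) * math.sin(phi)
```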
In some implementations, the calibration driving maneuvers may include sufficient curved paths. For example, the user may not receive the "Calibration Complete" message until sufficient curve in the vehicle's path is detected. The vehicle may instruct the driver to turn the steering wheel a threshold amount while driving forward to achieve sufficient curve. Each trailer beam length sample calculated at steady state may be accumulated into a running average, with the sample count incremented per Equation (10) and the average updated per Equation (11):

$$w_N = w_{N-1} + 1 \tag{10}$$

and

$$\bar{L}_{beam,N} = \bar{L}_{beam,N-1} + \frac{L_{beam,N} - \bar{L}_{beam,N-1}}{w_N} \tag{11}$$

The average at the current sample (N) and the previous sample (N−1) is represented by $\bar{L}_{beam,N}$ and $\bar{L}_{beam,N-1}$, respectively, with $L_{beam,N}$ representing the beam length calculated at the current sample and $w_N$ the number of samples accumulated so far. The converged average is referred to herein as $L_{beam,avg}$.
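A minimal sketch of this running average (Equations (10) and (11)), with hypothetical names:

```python
class BeamLengthAverager:
    """Incremental running average of accepted beam length samples
    (Equations (10) and (11))."""

    def __init__(self):
        self.w = 0        # sample count w_N, Equation (10)
        self.avg = 0.0    # running average, Equation (11)

    def update(self, l_beam_sample):
        self.w += 1                                   # Equation (10)
        self.avg += (l_beam_sample - self.avg) / self.w  # Equation (11)
        return self.avg

# Usage: feed each steady state beam length sample as it is accepted.
averager = BeamLengthAverager()
for sample in (5.1, 4.9, 5.0, 5.05):
    l_beam_avg = averager.update(sample)
```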
The determined $L_{beam,avg}$ may then be passed through one or more convergence checks 62, and if the checks pass, $L_{beam,avg}$ is stored in non-volatile memory of the vehicle, where it may be recalled whenever the system detects that the respective trailer is hitched.
In some examples, after a minimum number of samples (N), convergence checks are performed over a subset of subsequent beam length calculations. For example, Equation (12) may determine convergence in probability:

$$\Pr\left(\left|L_{beam,n} - \bar{L}_{beam,N}\right| \le \epsilon\right) \ge Threshold_{conv} \tag{12}$$

In Equation (12), $\bar{L}_{beam,N}$ is the current running average, $\epsilon$ is a convergence tolerance, and the probability is evaluated empirically over the subset of subsequent samples (i.e., the fraction of those samples lying within the tolerance must meet or exceed $Threshold_{conv}$).
In some examples, additional criteria to increase the accuracy of the estimate include a monotonicity check:

$$Threshold_{low} \le \Pr\left(L_{beam,n} > \bar{L}_{beam,N}\right) \le Threshold_{high} \tag{13}$$

In Equation (13), $L_{beam,n}$ is the calculated trailer beam length for sample n. That is, recent samples should fall above and below the running average with bounded probability rather than drifting to one side. If the convergence checks pass, the system may prompt the user with a message. For example, the system may indicate that calibration is complete (e.g., via a display screen in the vehicle). The system may store the converged estimate in non-volatile memory, correlated with the currently hitched trailer.
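A sketch of how the convergence and monotonicity checks of Equations (12) and (13) might be evaluated over a trailing window of samples (threshold values are illustrative placeholders):

```python
def convergence_check(samples, avg, eps=0.05, threshold_conv=0.9):
    """Equation (12): the fraction of recent samples within eps of the
    running average must meet or exceed threshold_conv (empirical
    convergence in probability). Assumes samples is non-empty."""
    within = sum(1 for s in samples if abs(s - avg) <= eps)
    return within / len(samples) >= threshold_conv

def monotonicity_check(samples, avg, threshold_low=0.3, threshold_high=0.7):
    """Equation (13): the empirical probability that a sample exceeds the
    running average must stay bounded away from 0 and 1 (no one-sided
    drift in the estimates). Assumes samples is non-empty."""
    above = sum(1 for s in samples if s > avg)
    return threshold_low <= above / len(samples) <= threshold_high
```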
The thresholds described herein (e.g., rate of change thresholds, steering wheel angle thresholds, etc.) may be chosen empirically for numerical stability of the solution (i.e., the estimation of the trailer beam length), chosen empirically to accommodate measurement and timing errors inherent in incoming CAN data, and/or derived from first principles to ensure operation within the valid regions of the kinematic equations.
Optionally, the trailer assist system bounds the estimated trailer beam length between a minimum beam length value and a maximum beam length value to provide a failsafe against aberrant behavior. The driving maneuver may be agnostic (e.g., free form driving with no prescribed calibration maneuvers the driver has to follow) as long as the maneuver has sufficient curves, while a maneuver without sufficient curves (e.g., straight or nearly straight driving) may be rejected in order to be more robust against an errant estimated trailer beam length due to random variables and disturbances. For example, sensor noise, error in state estimation, asynchronous CAN data, environmental factors such as variability in ground conditions, and user calibration drive maneuver variability may all contribute to an errant estimated trailer beam length.
To derive the maximum steering wheel angle (i.e., $Threshold_{steermax}$), the vehicle yaw rate $\dot{\theta}$ may be determined using the bicycle kinematics of Equation (14):

$$\dot{\theta} = \frac{V\,\tan(\delta)}{L_{wheelbase}} \tag{14}$$
Combining Equation (9) with Equation (14) yields Equations (15)-(17):

$$L_{beam} = -L_{hitch}\cos\varphi - \frac{L_{wheelbase}}{\tan(\delta)}\sin\varphi \tag{15}$$

$$\frac{dL_{beam}}{d\varphi} = L_{hitch}\sin\varphi - \frac{L_{wheelbase}}{\tan(\delta)}\cos\varphi \tag{16}$$

$$\tan\varphi_{ext} = \frac{L_{wheelbase}}{L_{hitch}\tan(\delta)} \tag{17}$$
With a constant front wheel angle ($\delta$), the trailer beam length will have extrema when the relative vehicle-trailer angle ($\varphi_{ext}$) is equal to Equation (18):

$$\varphi_{ext} = \pm\arctan\left(\frac{L_{wheelbase}}{L_{hitch}\tan(\delta)}\right) \tag{18}$$
For a particular vehicle (i.e., a fixed $L_{wheelbase}$) and given a maximum hitch length ($L_{hitch,max}$), the sign of $\varphi_{ext}$ is chosen to give a positive maximum trailer beam length ($L_{Beam,max}$). Substituting $\varphi_{ext}$ into Equation (15) yields Equation (19):

$$L_{Beam,max} = \sqrt{L_{hitch,max}^2 + \left(\frac{L_{wheelbase}}{\tan(\delta)}\right)^2} \tag{19}$$
With a given $L_{Beam,max}$ (i.e., the largest trailer beam length the system is designed to estimate), Equation (19) may be solved for the front wheel angle, $\delta_{max} = \arctan\left(L_{wheelbase}/\sqrt{L_{Beam,max}^2 - L_{hitch,max}^2}\right)$, which in turn maps (e.g., via the SWA lookup table) to the maximum steering wheel angle threshold $Threshold_{steermax}$.
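As an illustrative sketch of this derivation, using the reconstructed Equations (15), (18), and (19) (function names and example values are hypothetical, and $\delta$ is assumed in $(0, \pi/2)$):

```python
import math

def beam_length_extremum(l_wheelbase, l_hitch_max, delta):
    """Equation (19): maximum representable beam length for a constant
    front wheel angle delta (the amplitude of Equation (15))."""
    return math.hypot(l_hitch_max, l_wheelbase / math.tan(delta))

def max_front_wheel_angle(l_beam_max, l_wheelbase, l_hitch_max):
    """Solve Equation (19) for delta: the largest front wheel angle at
    which beam lengths up to l_beam_max remain within the valid region."""
    return math.atan(l_wheelbase / math.sqrt(l_beam_max**2 - l_hitch_max**2))

# Example: a 3 m wheelbase, hitch up to 1.5 m behind the rear axle, and
# trailers up to 8 m beam length (illustrative values only).
delta_max = max_front_wheel_angle(8.0, 3.0, 1.5)  # radians; map to SWA via lookup
```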
The trailer beam length may be correlated or associated with the specific trailer hitched to the vehicle and recalled whenever the system detects the same trailer is hitched to the vehicle. The correlation (e.g., a trailer identification) may be stored in non-volatile memory with the estimated trailer beam length. The system may prompt or accept a variety of calibration driving maneuvers to accurately estimate the trailer beam length. For example, the vehicle may be driven in a circle with a fixed or variable steering angle. The vehicle may also be driven in arcs with various radii, slaloms, or with multiple right and left turns.
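One way such a correlation might be kept is sketched below; the storage layout (a JSON file standing in for non-volatile memory) and the names are assumptions for illustration, not from the source:

```python
import json

def store_trailer_profile(path, trailer_id, beam_length_m):
    """Persist the estimated beam length keyed by a trailer identification,
    emulating the non-volatile memory correlation."""
    try:
        with open(path) as f:
            profiles = json.load(f)
    except FileNotFoundError:
        profiles = {}
    profiles[trailer_id] = beam_length_m
    with open(path, "w") as f:
        json.dump(profiles, f)

def recall_trailer_profile(path, trailer_id):
    """Recall a previously calibrated beam length when the same trailer is
    detected as hitched; returns None if not yet calibrated."""
    try:
        with open(path) as f:
            return json.load(f).get(trailer_id)
    except FileNotFoundError:
        return None
```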
Thus, the system of the present invention determines or estimates or calculates the trailer beam length with high accuracy to enable better performance of other automated trailer features (e.g., backing up, parking, etc.). The driver may switch between trailers and make use of such automated trailer features with minimal setup requirements after initially calibrating each trailer. Furthermore, the accuracy of the trailer beam length estimation is independent of user skill and measurement tool quality, which reduces feature performance variability between users and trailers. The system also eliminates the difficulty of measuring trailer beam length for long and/or multi-axle trailers. The calculation methodology includes convergence criteria and is maneuver agnostic. That is, the driver is not required to perform a prescribed calibration maneuver and instead the system may calibrate from free-form driving.
The system may utilize aspects of the trailering or trailer angle detection systems or trailer hitch assist systems described in U.S. Pat. Nos. 9,085,261 and/or 6,690,268, and/or U.S. Publication Nos. US-2019-0297233; US-2019-0064831; US-2019-0016264; US-2018-0276839; US-2018-0276838; US-2018-0253608; US-2018-0215382; US-2018-0211528; US-2017-0254873; US-2017-0217372; US-2017-0050672; US-2015-0217693; US-2014-0160276; US-2014-0085472 and/or US-2015-0002670, and/or U.S. patent application Ser. No. 15/929,535, filed May 8, 2020 (Attorney Docket MAG04 P3842), which are hereby incorporated herein by reference in their entireties.
The system includes an image processor operable to process image data captured by the camera or cameras, such as for detecting objects or other vehicles or pedestrians or the like in the field of view of one or more of the cameras. For example, the image processor may comprise an image processing chip selected from the EYEQ family of image processing chips available from Mobileye Vision Technologies Ltd. of Jerusalem, Israel, and may include object detection software (such as the types described in U.S. Pat. Nos. 7,855,755; 7,720,580 and/or 7,038,577, which are hereby incorporated herein by reference in their entireties), and may analyze image data to detect vehicles and/or other objects. Responsive to such image processing, and when an object or other vehicle is detected, the system may generate an alert to the driver of the vehicle and/or may generate an overlay at the displayed image to highlight or enhance display of the detected object or vehicle, in order to enhance the driver's awareness of the detected object or vehicle or hazardous condition during a driving maneuver of the equipped vehicle.
The vehicle may include any type of sensor or sensors, such as imaging sensors or radar sensors or lidar sensors or ultrasonic sensors or the like. The imaging sensor or camera may capture image data for image processing and may comprise any suitable camera or sensing device, such as, for example, a two dimensional array of a plurality of photosensor elements arranged in at least 640 columns and 480 rows (at least a 640×480 imaging array, such as a megapixel imaging array or the like), with a respective lens focusing images onto respective portions of the array. The photosensor array may comprise a plurality of photosensor elements arranged in a photosensor array having rows and columns. Preferably, the imaging array has at least 300,000 photosensor elements or pixels, more preferably at least 500,000 photosensor elements or pixels and more preferably at least 1 million photosensor elements or pixels. The imaging array may capture color image data, such as via spectral filtering at the array, such as via an RGB (red, green and blue) filter or via a red/red complement filter or such as via an RCC (red, clear, clear) filter or the like. The logic and control circuit of the imaging sensor may function in any known manner, and the image processing and algorithmic processing may comprise any suitable means for processing the images and/or image data.
For example, the vision system and/or processing and/or camera and/or circuitry may utilize aspects described in U.S. Pat. Nos. 9,233,641; 9,146,898; 9,174,574; 9,090,234; 9,077,098; 8,818,042; 8,886,401; 9,077,962; 9,068,390; 9,140,789; 9,092,986; 9,205,776; 8,917,169; 8,694,224; 7,005,974; 5,760,962; 5,877,897; 5,796,094; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978; 7,859,565; 5,550,677; 5,670,935; 6,636,258; 7,145,519; 7,161,616; 7,230,640; 7,248,283; 7,295,229; 7,301,466; 7,592,928; 7,881,496; 7,720,580; 7,038,577; 6,882,287; 5,929,786 and/or 5,786,772, and/or U.S. Publication Nos. US-2014-0340510; US-2014-0313339; US-2014-0347486; US-2014-0320658; US-2014-0336876; US-2014-0307095; US-2014-0327774; US-2014-0327772; US-2014-0320636; US-2014-0293057; US-2014-0309884; US-2014-0226012; US-2014-0293042; US-2014-0218535; US-2014-0247354; US-2014-0247355; US-2014-0247352; US-2014-0232869; US-2014-0211009; US-2014-0160276; US-2014-0168437; US-2014-0168415; US-2014-0160291; US-2014-0152825; US-2014-0139676; US-2014-0138140; US-2014-0104426; US-2014-0098229; US-2014-0085472; US-2014-0067206; US-2014-0049646; US-2014-0052340; US-2014-0025240; US-2014-0028852; US-2014-005907; US-2013-0314503; US-2013-0298866; US-2013-0222593; US-2013-0300869; US-2013-0278769; US-2013-0258077; US-2013-0242099; US-2013-0215271; US-2013-0141578 and/or US-2013-0002873, which are all hereby incorporated herein by reference in their entireties. The system may communicate with other communication systems via any suitable means, such as by utilizing aspects of the systems described in International Publication Nos. WO 2010/144900; WO 2013/043661 and/or WO 2013/081985, and/or U.S. Pat. No. 9,126,525, which are hereby incorporated herein by reference in their entireties.
Optionally, the vision system may include a display for displaying images captured by one or more of the imaging sensors for viewing by the driver of the vehicle while the driver is normally operating the vehicle. Optionally, for example, the vision system may include a video display device, such as by utilizing aspects of the video display systems described in U.S. Pat. Nos. 5,530,240; 6,329,925; 7,855,755; 7,626,749; 7,581,859; 7,446,650; 7,338,177; 7,274,501; 7,255,451; 7,195,381; 7,184,190; 5,668,663; 5,724,187; 6,690,268; 7,370,983; 7,329,013; 7,308,341; 7,289,037; 7,249,860; 7,004,593; 4,546,551; 5,699,044; 4,953,305; 5,576,687; 5,632,092; 5,708,410; 5,737,226; 5,802,727; 5,878,370; 6,087,953; 6,173,501; 6,222,460; 6,513,252 and/or 6,642,851, and/or U.S. Publication Nos. US-2014-0022390; US-2012-0162427; US-2006-0050018 and/or US-2006-0061008, which are all hereby incorporated herein by reference in their entireties.
Changes and modifications in the specifically described embodiments can be carried out without departing from the principles of the invention, which is intended to be limited only by the scope of the appended claims, as interpreted according to the principles of patent law including the doctrine of equivalents.
The present application is related to U.S. provisional applications, Ser. No. 62/952,748, filed Dec. 23, 2019, Ser. No. 62/938,411, filed Nov. 21, 2019, and Ser. No. 62/868,051, filed Jun. 28, 2019, which are hereby incorporated herein by reference in their entireties.