This application claims priority under 35 U.S.C. § 119 to patent application no. DE 10 2015 221 356.0, filed on Oct. 30, 2015 in Germany, the disclosure of which is incorporated herein by reference in its entirety.
The disclosure relates to a device and to a method for providing a vehicle surround view for a vehicle, in particular for an agricultural utility vehicle.
Driver assistance systems for vehicles increasingly offer the possibility of displaying a vehicle surround view to the driver of the vehicle on a display unit in order to assist the driver in carrying out various driving manoeuvres. For this purpose, vehicle cameras, which are attached to the vehicle, provide camera images of an environment of the vehicle. In order to generate a vehicle surround view, said camera images are projected onto a projection surface by an image data processing unit of the driver assistance system. The vehicle surround view generated in this way is subsequently displayed to the driver of the vehicle on a display unit or a screen.
The vehicle cameras which are attached to the vehicle body of the vehicle can be calibrated intrinsically and extrinsically and continuously transmit the camera images of the environment of the vehicle to the image data processing unit of the driver assistance system. The camera images obtained from the vehicle cameras are mapped or projected onto a projection surface by the image data processing unit to generate a vehicle surround view. In conventional driver assistance systems, the projection surface is provided for a horizontal driving plane.
Agricultural utility vehicles can also be used in locations on slopes. Furthermore, some construction vehicles comprise, for example, stabilisers which can be folded out or extended in order to increase the stability of the construction vehicle. Said stabilisers influence the inclination of the construction vehicle in relation to the ground. The stabilisers and/or other actuators, for example excavator gripper arms or shovels, are also used in part to tilt the construction vehicle in a controlled manner. This is especially helpful on fine ground, in order to produce an oblique side wall, in particular when excavating a trench.
If a vehicle is located on an inclined driving plane or slope plane, or if the vehicle is tilted, for example by extendable stabilisers, the vehicle cameras which are attached to the vehicle body change their location relative to the normal, substantially horizontally extending driving plane or standing plane of the vehicle. This change in location results in projection or image distortions, which reduce the image quality of the displayed vehicle surround view.
One problem addressed by the present disclosure is thus that of providing a method and a device for providing a vehicle surround view for a vehicle, in which sufficient image quality of the vehicle surround view is ensured in the case of any desired inclination of the vehicle or of the driving or standing plane.
This problem is solved according to the disclosure by a device for providing a vehicle surround view for a vehicle having the features described herein.
According to a first aspect, the disclosure thus provides a device for providing a vehicle surround view for a vehicle, comprising:
In one possible embodiment of the device according to the disclosure, the image data processing unit is designed to rotate the projection surface about one or more axes of rotation relative to the normal driving or standing plane and/or to shift said projection surface in a translational manner relative to a coordinate system origin, according to the detected change in location of the at least one vehicle camera.
In one possible embodiment of the device according to the disclosure, the projection surface used by the image data processing unit is a dish-shaped projection surface which is dynamically adapted according to the detected change in location of the at least one vehicle camera.
In one possible embodiment of the device according to the disclosure, an inclination-capture unit is provided, which captures a currently existing inclination of the vehicle relative to a normal driving or standing plane.
In another possible embodiment of the device according to the disclosure, a location-detection unit is provided, which detects a change in location of the at least one vehicle camera relative to the normal driving or standing plane of the vehicle according to the inclination captured by the inclination-capture unit.
In another possible embodiment of the device according to the disclosure, an inclination-compensation unit is provided, which compensates for the inclination captured by the inclination-capture unit in such a way that a driver's seat provided in a driver's cabin of the vehicle and/or a working assembly of the vehicle is oriented in a substantially horizontal manner.
In another possible embodiment of the device according to the disclosure, the image data processing unit is configured to generate a rotation matrix based on angles of inclination which are captured by the inclination-capture unit.
In another possible embodiment of the device according to the disclosure, the image data processing unit multiplies the projection surface points of the projection surface by the generated rotation matrix in order to dynamically adapt the projection surface.
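As a minimal sketch of these two embodiments, the rotation matrix can be assembled from the captured angles of inclination and applied to each projection surface point. The pitch/roll axis convention and the numpy implementation below are illustrative assumptions, not taken from the disclosure:

```python
import numpy as np

def rotation_matrix(pitch, roll):
    """Rotation matrix generated from the captured angles of inclination
    (pitch about the y-axis, roll about the x-axis, in radians).
    The axis convention is an assumption for illustration."""
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Ry = np.array([[cp, 0.0, sp],
                   [0.0, 1.0, 0.0],
                   [-sp, 0.0, cp]])
    Rx = np.array([[1.0, 0.0, 0.0],
                   [0.0, cr, -sr],
                   [0.0, sr, cr]])
    return Ry @ Rx

# Each projection surface point (one row per point) is multiplied
# by the generated rotation matrix:
points = np.array([[1.0, 0.0, 0.0],
                   [0.0, 2.0, 0.5]])
adapted_points = points @ rotation_matrix(np.radians(5), np.radians(2)).T
```

Because the matrix is a pure rotation, the adapted surface keeps its shape; only its orientation relative to the normal driving or standing plane changes.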
In another possible embodiment of the device according to the disclosure, the vehicle cameras are attached to a vehicle body of the vehicle and/or to a driver's cabin of the vehicle.
In another possible embodiment of the device according to the disclosure, the driver's cabin is mounted so as to be rotatable, together with the vehicle cameras which are attached thereto, relative to the vehicle body of the vehicle.
In another possible embodiment of the device according to the disclosure, a rotation-capture unit is provided, which captures a rotation of the driver's cabin relative to the vehicle body of the vehicle.
In another possible embodiment of the device according to the disclosure, the location-detection unit detects the change in location of the at least one vehicle camera according to the rotation of the driver's cabin relative to the vehicle body which is captured by the rotation-capture unit.
In another possible embodiment of the device according to the disclosure, a display unit is provided, which visually displays the generated vehicle surround view to a driver of the vehicle.
In one possible embodiment, the vehicle (F) is placed in a position which is inclined with respect to a normal, substantially horizontally extending, driving or standing plane by means of actuators or stabilisers. Alternatively, the vehicle is located on a slope plane which is tilted with respect to a normal, substantially horizontal, driving or standing plane.
According to another aspect, the disclosure further provides a method for providing a vehicle surround view for a vehicle, having the features disclosed herein.
The disclosure thus provides a method for providing a vehicle surround view for a vehicle, comprising the steps of:
According to another aspect, the disclosure further provides a driver assistance system having the features disclosed herein.
The disclosure thus provides a driver assistance system for a vehicle, comprising a device for providing a vehicle surround view for the vehicle, said device comprising:
According to another aspect, the disclosure further provides a vehicle comprising a driver assistance system of this type. The vehicle is preferably an agricultural vehicle, in particular a construction vehicle, an agricultural utility vehicle or a forestry vehicle.
Hereinafter, various embodiments of the method according to the disclosure and of the device according to the disclosure for providing a vehicle surround view will be described in greater detail with reference to the accompanying drawings, in which:
The block diagram shown in
The device 1 further comprises a location-detection unit 4 which detects a change in location of at least one or all of the vehicle cameras 2-i relative to a normal driving or standing plane of the vehicle F. The normal driving or standing plane preferably extends in a substantially horizontal manner. The image data processing unit 3 projects the camera images KB received from the vehicle cameras 2 onto a projection surface PF to generate the vehicle surround view FRA. Said projection surface PF is adapted by the image data processing unit 3 according to the detected change in location of the at least one vehicle camera 2 relative to the normal driving or standing plane. In this case, the projection surface PF is preferably a three-dimensional, dish-shaped projection surface, as shown in
The vehicle cameras 2 and the driver assistance system are preferably calibrated for the normal, substantially horizontally extending, driving or standing plane. The normal driving or standing plane is preferably the plane which the vehicle uses in normal operation. For most vehicles, the normal reference plane is a horizontally extending plane. For special vehicles, the normal reference plane can have a different orientation.
The image data processing unit 3 preferably comprises at least one processor which rotates the projection surface PF about one or more axes of rotation x, y, z relative to the driving plane FE and/or shifts said projection surface in a translational manner relative to a coordinate system origin O, according to the detected change in location of the vehicle cameras 2. In one preferred embodiment, the projection surface PF used is a dish-shaped projection surface, which is dynamically adapted according to the detected change in location of the vehicle cameras 2. Depending on the application, different projection surfaces can also be used; for example, the projection surface PF can also be formed so as to be elliptical or planar.
PF′ = DM · PF, where PF′ is the adapted projection surface and DM is the rotation matrix generated from the captured angles of inclination.
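The adaptation PF′ = DM · PF can be sketched as follows for a dish-shaped surface; the paraboloid shape, its dimensions, and the optional translation parameter are illustrative assumptions:

```python
import numpy as np

def adapt_projection_surface(PF, DM, t=None):
    """Applies PF' = DM . PF to every surface point (rows of PF),
    with an optional translational shift t relative to the
    coordinate system origin O."""
    PF_adapted = PF @ DM.T
    if t is not None:
        PF_adapted = PF_adapted + t
    return PF_adapted

# Illustrative dish-shaped projection surface: flat near the vehicle,
# with a rim curving upward with distance (shape and size assumed).
azimuth = np.linspace(0.0, 2.0 * np.pi, 72)
radius = np.linspace(0.0, 10.0, 20)
R, A = np.meshgrid(radius, azimuth)
PF = np.column_stack([(R * np.cos(A)).ravel(),
                      (R * np.sin(A)).ravel(),
                      0.05 * R.ravel() ** 2])

DM = np.eye(3)  # no inclination captured: the surface stays unchanged
PF_adapted = adapt_projection_surface(PF, DM)
```

A planar or elliptical surface can be substituted for PF without changing the adaptation step itself.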
In one possible embodiment of the device according to the disclosure, a driver's cabin KAB is mounted so as to be rotatable relative to the vehicle body KAR of the vehicle F, wherein a rotation-capture unit 8 captures a rotation of the driver's cabin KAB relative to the vehicle body KAR of the vehicle F. In one possible embodiment, the location-detection unit 4 detects the change in location of the at least one vehicle camera 2 relative to the driving plane FE or standing plane additionally according to the rotation of the driver's cabin KAB which is captured by the rotation-capture unit 8.
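Taking the captured cabin rotation into account additionally can be sketched as a composition of two rotations; the composition order and the assumption that the cabin rotates about the vertical axis are illustrative, not stated in the disclosure:

```python
import numpy as np

def cab_camera_rotation(yaw_cab, DM_inclination):
    """Total rotation applied to a camera mounted on the rotatable
    driver's cabin KAB: the cabin's rotation about the vertical axis
    (as captured by the rotation-capture unit) composed with the
    vehicle body's inclination matrix. Order of composition assumed."""
    c, s = np.cos(yaw_cab), np.sin(yaw_cab)
    Rz = np.array([[c, -s, 0.0],
                   [s, c, 0.0],
                   [0.0, 0.0, 1.0]])
    return DM_inclination @ Rz
```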
In the embodiment shown in
In the embodiment shown in
In one possible embodiment, the vehicle cameras 2-5, 2-6 which are attached to the vehicle body KAR are also fisheye cameras having an aperture angle of more than 170°, preferably of 175° or more. In one possible embodiment, the inclination-compensation unit can comprise a swivel apparatus which is provided on the vehicle wheels R and which keeps the vehicle F in a substantially horizontal position within certain limits. In this case, the swivel apparatus forms a connection between a drive source on the one hand and a wheel carrier of the wheel R on the other.
The location-detection unit 4 can comprise additional sensors. For example, the location-detection unit 4 can contain location sensors, in particular gyroscopic sensors, for determining the inclination of the driving plane FE and calculating therefrom the change in location of the vehicle cameras 2 relative to the driving plane FE or standing plane. In another possible embodiment, the location-detection unit 4 can use further data which is received, for example, by a receiver of the driver assistance system FAS. In one possible embodiment, the driver assistance system FAS comprises a GPS receiver for receiving GPS data which is evaluated by the location-detection unit 4. Furthermore, the driver assistance system FAS of the vehicle F can comprise a navigation system which transmits navigation data to the location-detection unit 4. In this case, the location-detection unit 4 additionally evaluates the obtained navigation data and/or GPS data to detect the change in location of the vehicle cameras 2 relative to the normal driving or standing plane. The adaptation of the projection surface PF by the image data processing unit 3 preferably takes place dynamically in order to take into consideration a driving plane FE which changes continuously when the vehicle F is moving. In this case, the recalculation of the projection surface PF is preferably carried out by the image data processing unit 3 in real time.
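The calculation performed by the location-detection unit can be sketched as follows: given a camera's mounting position on the vehicle body and the inclination expressed as a rotation matrix, the change in location is the difference between the rotated and the original position. The coordinate convention and the example roll angle are assumptions:

```python
import numpy as np

def camera_location_change(cam_pos, DM):
    """Change in location of a vehicle camera mounted at cam_pos
    (vehicle coordinates, relative to the coordinate system origin)
    when the vehicle body is rotated by DM relative to the normal
    driving or standing plane."""
    return DM @ cam_pos - cam_pos

# Example: a small roll of the vehicle body about the x-axis.
roll = np.radians(3.0)
Rx = np.array([[1.0, 0.0, 0.0],
               [0.0, np.cos(roll), -np.sin(roll)],
               [0.0, np.sin(roll), np.cos(roll)]])
delta = camera_location_change(np.array([0.0, 1.0, 2.5]), Rx)
```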
The camera images KB provided by the vehicle cameras 2 are projected onto the calculated projection surface PF to generate the vehicle surround view FRA, which is displayed to the driver FA on the display unit 7. In one possible embodiment, the displayed vehicle surround view FRA is enhanced with additional information, or additional information relating to the vehicle surround view FRA is superimposed thereon. For example, an expected driving trajectory of the vehicle F, resulting from the movement of the vehicle over the driving plane FE, is displayed to the driver FA in an overlay view on the display unit 7. By means of the continuous dynamic adaptation of the projection surface PF, not only is the image quality of the displayed vehicle surround view FRA considerably improved, but the quality of the additionally superimposed information data is also increased.
In a first step S1, camera images KB of the vehicle environment are generated by vehicle cameras 2.
In another step S2, a change in location of the vehicle cameras 2 relative to the normal driving plane FE or standing plane is detected.
Subsequently, in step S3, the projection surface PF is dynamically adapted according to the detected change in location of the vehicle cameras 2.
Lastly, in step S4, the camera images KB provided by the vehicle cameras 2 are projected onto the adapted projection surface PF′ to generate the vehicle surround view FRA. Said vehicle surround view FRA is subsequently displayed to the driver FA of the vehicle F on a display unit.
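The steps S1 to S4 above can be sketched as one pass of a processing loop. The function and parameter names below are hypothetical placeholders for the camera, sensor and rendering interfaces, and the rotation-matrix convention is assumed:

```python
import numpy as np

def rotation_matrix(pitch, roll):
    # DM from captured pitch/roll angles (radians); axis convention assumed.
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    return Ry @ Rx

def surround_view_step(capture_images, read_inclination, project, PF):
    """One pass through steps S1-S4; the three callables stand in for
    the camera, sensor and rendering interfaces (hypothetical names)."""
    camera_images = capture_images()                    # S1: camera images KB
    pitch, roll = read_inclination()                    # S2: change in location
    PF_adapted = PF @ rotation_matrix(pitch, roll).T    # S3: adapt surface PF
    return project(camera_images, PF_adapted)           # S4: surround view FRA
```

Running this step continuously while the vehicle moves corresponds to the dynamic, real-time recalculation of the projection surface described above.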
In one possible embodiment, the method shown in
Foreign Application Priority Data

| Number | Date | Country | Kind |
|---|---|---|---|
| 10 2015 221 356 | Oct 2015 | DE | national |

References Cited: U.S. Patent Documents

| Number | Name | Date | Kind |
|---|---|---|---|
| 7307655 | Okamoto et al. | Dec 2007 | B1 |
| 20080309784 | Asari et al. | Dec 2008 | A1 |
| 20120262580 | Huebner et al. | Oct 2012 | A1 |
| 20120287232 | Natroshvili et al. | Nov 2012 | A1 |
| 20130162830 | Mitsuta et al. | Jun 2013 | A1 |
| 20140036076 | Nerayoff | Feb 2014 | A1 |

References Cited: Foreign Patent Documents

| Number | Date | Country |
|---|---|---|
| 10253378 | Jun 2003 | DE |
| 102010041490 | Mar 2012 | DE |
| 11 2012 004 354 | Jul 2014 | DE |
| 1115250 | Jun 2012 | EP |
| 2 511 137 | Oct 2012 | EP |
| 2013074350 | Apr 2013 | JP |
| 2014225803 | Dec 2014 | JP |
| WO9305640 | Apr 1993 | WO |
| WO9516228 | Jun 1995 | WO |
| 2015048967 | Apr 2015 | WO |

References Cited: Other Publications

European Search Report corresponding to application No. 16195573.7, dated Mar. 22, 2017 (8 pages).

Publication Data

| Number | Date | Country |
|---|---|---|
| 20170120820 A1 | May 2017 | US |