The invention relates to a device and method for displaying objects in the surroundings of a vehicle, of the type specified in the preamble of claims 1 and 11, respectively.
EP 1 447 271 A2 has already disclosed such a device and an associated method. The known device comprises a display device for displaying to the driver, after a parking process, the position of the vehicle with respect to the lateral boundaries of the parking space. In order to determine the positions of the lateral boundaries, a sensor arrangement with two lateral distance sensors is provided on each side of the vehicle; these distance sensors each sense objects in the surroundings of the vehicle to the left and to the right and determine the distance of said objects from the vehicle. The lateral distance sensors are each embodied here as ultrasound sensors with a narrow detection range, such as are usually used to measure parking spaces. The distance sensors thus fulfil a double function and can be used both to measure the length of the parking space as the vehicle travels past before the parking process and to sense the parking space boundaries during the parking process.
Since the lateral distance sensors with their narrow detection range can only sense a partial area of the lateral surroundings of the vehicle, an evaluation device is provided which stores in memory the position, with respect to the vehicle, of objects or object areas which have already been sensed, and continuously updates said position on the basis of the route travelled along as the vehicle moves. In the course of the movement of the vehicle, the detection ranges of the lateral sensors therefore pass over the respective lateral surroundings of the vehicle, with the result that the lateral surroundings of the vehicle have been completely scanned after the parking process and can be displayed on the display device. The display device then displays a virtual plan view of the vehicle and the lateral surroundings of the vehicle with the lateral boundaries of the parking space.
This display can assist the driver, directly after the parking process, in making the decision as to which vehicle door can be opened to what extent and whether it is necessary to correct the parked position of the vehicle. However, the known display is suitable only to a limited degree for assisting the driver at a later time in carrying out the parking process itself or in carrying out a process for removing the vehicle from a parking space, since the surroundings of the vehicle may have changed since the parking process.
The object of the invention is to further develop a device and a method for displaying objects in the surroundings of a vehicle of the type mentioned in the preamble of claim 1 or claim 11, respectively, in such a way that the assistance provided to the driver in carrying out parking processes and processes for removing the vehicle from a parking space is improved.
This object is achieved according to the invention by means of the features of claim 1. Further features which advantageously refine the invention can be found in the dependent claims.
The advantage achieved with the invention is that information about the surroundings of the vehicle is already displayed on the display device even when said surroundings have not yet been completely sensed by the sensor arrangement. In this context, the area of the surroundings of the vehicle for which no object information is yet available, because said area has not yet been passed over by the detection range of the sensor arrangement, is continuously determined. The dimensions of this area are then additionally displayed on the display device in order to permit the driver to estimate the surroundings of the vehicle realistically despite the still incomplete information.
For a particularly simple and reliable sensing of objects in the surroundings of the vehicle, the sensor arrangement preferably comprises at least one ultrasound sensor.
The sensor arrangement can particularly easily and cost-effectively comprise a sensor of a parking-space-measuring device which is present in any case and which is arranged on a side wall of the vehicle and provided for measuring parking spaces. Furthermore, the sensor arrangement can comprise at least one distance sensor which is arranged on a front area or rear area of the vehicle and is a component of a parking aid which is present in any case.
In order to give the driver a rapid and informative overview, the display device can be designed to represent the surroundings of the vehicle in a plan view. In this context, in addition to the surroundings of the vehicle, a virtual or schematic plan view of the vehicle can also be represented on the display device.
The surroundings of the vehicle are represented on the display device in the form of a plurality of segments, wherein the segments can be represented in a marked fashion. The marking can then depend on whether or not at least one object has been determined in the respective segment or whether or not the segment has already been passed over by the detection range of the sensor arrangement.
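Purely as an illustration of the segment marking described above, and not as part of the claimed subject matter, a minimal sketch follows; the state names, the Segment structure and the mapping to a marking are assumptions chosen for the example.

```python
# Minimal sketch (not taken from the patent) of one way to encode the
# per-segment marking described above; all names here are assumptions.
from dataclasses import dataclass
from enum import Enum, auto


class SegmentMarking(Enum):
    NOT_YET_SCANNED = auto()   # detection range has not yet passed over the segment
    SCANNED_FREE = auto()      # already scanned, no object determined in it
    SCANNED_OCCUPIED = auto()  # already scanned, at least one object determined


@dataclass
class Segment:
    scanned: bool           # has the detection range already passed over this segment?
    object_detected: bool   # has at least one object been determined in this segment?

    def marking(self) -> SegmentMarking:
        if not self.scanned:
            return SegmentMarking.NOT_YET_SCANNED
        return (SegmentMarking.SCANNED_OCCUPIED if self.object_detected
                else SegmentMarking.SCANNED_FREE)
```

How the three markings are rendered (colour, hatching or the like) is left open in the text above; the sketch only distinguishes the cases on which the marking can depend.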
In order to improve even further the assistance for the driver, surroundings of the vehicle which completely enclose the vehicle are preferably displayed on the display device.
An exemplary embodiment of the invention is explained below in more detail on the basis of a graphic illustration.
The sensors 11-14 are each embodied here as ultrasound distance sensors which are known per se and are components of a parking aid of the vehicle 1, wherein the front distance sensors 11 and the rear distance sensors 12 are intended for determining the distance of the vehicle 1 from obstacles in front of or behind it and for outputting said distance to the driver, in conjunction with the parking aid, as an optical and/or acoustic signal. The detection ranges D1 and D2 of the front sensors 11 and rear sensors 12, respectively, each have, when viewed from above, approximately the shape of a circular segment, which is produced as the sum of the mutually overlapping, circular-segment-shaped detection ranges of the individual sensors 11 and 12. The detection ranges D1 and D2 cover the front and the rear of the vehicle, respectively, over its entire width and extend from the vehicle to a maximum extent of approximately 4-6 meters. In this context, the term maximum extent of the respective detection range D1-D4 is intended to mean the distance from the vehicle at which an object can still just be sensed by the associated sensor.
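By way of a hedged illustration of the circular-segment-shaped detection ranges just described, the following sketch tests whether a point given in vehicle coordinates lies inside such a sector; the coordinate convention, the sensor pose parameters and the opening angle are assumptions, and only the maximum extent of approximately 4-6 meters is taken from the description above.

```python
# Hypothetical geometric sketch: test whether a point (vehicle coordinates,
# x forward, y to the left) lies inside a circular-segment-shaped detection
# range such as D1-D4. Sensor position, boresight and half-angle are assumptions.
import math


def in_detection_range(px, py, sensor_x, sensor_y, boresight_rad,
                       half_angle_rad, max_range_m=5.0):
    """Return True if the point (px, py) lies inside the sensor's sector."""
    dx, dy = px - sensor_x, py - sensor_y
    distance = math.hypot(dx, dy)
    if distance > max_range_m:            # beyond the maximum extent (~4-6 m)
        return False
    bearing = math.atan2(dy, dx)
    # angular offset from the sensor boresight, wrapped to [-pi, pi]
    offset = (bearing - boresight_rad + math.pi) % (2 * math.pi) - math.pi
    return abs(offset) <= half_angle_rad
```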
The lateral sensors 13 and 14 are each arranged on the left-hand or right-hand side wall of the vehicle 1 in an associated corner region of the front bumper and provided for measuring parking spaces as the vehicle 1 travels past. The sensors 13 and 14 have, when viewed from above, a circular-segment-shaped detection range D3 and D4, respectively, with a maximum extent of approximately 4-6 meters, which extends from the side wall in the transverse direction of the vehicle. The detection ranges D3 and D4 cover the associated side wall of the vehicle 1 over only part of its longitudinal extent. In the exemplary embodiment according to
The sensor arrangement 10 composed of the sensors 11-14 has in total a detection range D which is formed by the sum of the detection ranges D1-D4, wherein the detection range D does not completely enclose the vehicle 1 but rather encloses it only partially. The sensor arrangement 10 is therefore not able to sense objects in the entire surroundings of the vehicle 1 by direct measurement. Instead, a blind spot B1 or B2, respectively, adjoins the vehicle 1 in the left-hand and right-hand lateral areas; each of these blind spots lies outside the detection range D, so that objects in the blind spots B1 and B2 cannot be sensed directly by the sensor arrangement 10.
In order also to acquire object information for the blind spots B1 and B2, the device for displaying objects also contains storage means for storing the positions of objects which have already been sensed by the sensor arrangement 10, a route-sensing device for sensing the route travelled along by the vehicle 1, and a processing device for continuously redetermining the positions of the objects relative to the vehicle 1 on the basis of the sensed route and the stored object positions. Once the position of a stationary object has been sensed by the sensor arrangement 10, said position can therefore also be displayed relative to the vehicle 1 even if the object has already left the detection range D again owing to a movement of the vehicle 1.
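A minimal sketch of the continuous redetermination described above, assuming that object positions are stored in a vehicle-fixed coordinate system (x forward, y to the left) and that the route-sensing device supplies the vehicle's incremental motion (dx, dy, dtheta); these conventions are assumptions, not specifications from the patent.

```python
# Minimal sketch, assuming object positions are stored in a vehicle-fixed
# frame: when the vehicle moves by (dx, dy) and turns by dtheta, previously
# stored positions are re-expressed in the new vehicle frame.
import math


def update_object_positions(objects, dx, dy, dtheta):
    """objects: list of (x, y) in the old vehicle frame; returns new-frame coords."""
    cos_t, sin_t = math.cos(dtheta), math.sin(dtheta)
    updated = []
    for ox, oy in objects:
        # shift by the vehicle's own displacement, then rotate into the new heading
        sx, sy = ox - dx, oy - dy
        updated.append((cos_t * sx + sin_t * sy,
                        -sin_t * sx + cos_t * sy))
    return updated
```

With such a transformation, a stationary object sensed once remains displayable relative to the vehicle 1 even after it has left the detection range D, as stated above.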
The route-sensing device comprises at least one wheel sensor for measuring the distance travelled by a vehicle wheel, and a steering angle sensor for measuring the steering angle which has been set, from which the route travelled along in the transverse direction is determined. In this context, the route-sensing device is also a component of the parking aid of the vehicle 1 and is intended for determining, during the measuring of a parking space, the route travelled along as the vehicle passes the parking space. The processing device is preferably embodied integrally with a control device of the parking aid.
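The combination of a wheel sensor and a steering angle sensor described above can be illustrated by a simple kinematic bicycle model; the wheelbase value and the rear-axle reference point are assumptions for the sketch and are not specified in the patent.

```python
# Hedged sketch of route sensing from a wheel sensor and a steering-angle
# sensor, using a kinematic bicycle model (assumed, not stated in the patent).
import math


def dead_reckon_step(ds, steering_angle_rad, wheelbase_m=2.7):
    """One odometry increment: distance ds travelled at a given steering angle.

    Returns (dx, dy, dtheta) of the vehicle expressed in its previous frame.
    """
    dtheta = ds * math.tan(steering_angle_rad) / wheelbase_m  # yaw change
    if abs(dtheta) < 1e-9:                  # practically straight ahead
        return ds, 0.0, 0.0
    radius = ds / dtheta                    # turning radius along the arc
    dx = radius * math.sin(dtheta)          # longitudinal displacement
    dy = radius * (1.0 - math.cos(dtheta))  # transverse displacement
    return dx, dy, dtheta
```

Feeding each increment (dx, dy, dtheta) into the position update sketched after the previous paragraph keeps the stored object positions consistent with the route travelled.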
In the process of removing a vehicle from a parking space which is shown in
In the further course of the process of removing the vehicle from a parking space, the vehicle moves further in a forward direction until the position shown in
The evaluation device is also designed to determine the region E in the surroundings of the vehicle which has already been passed over by the detection range D of the sensor arrangement 10. For this purpose, information about the geometrical extent of the detection range D is stored in the evaluation device. In this context, a continuously identical, pre-stored detection range D can be provided. Alternatively, the detection range D could also be adapted on the basis of the ambient conditions or the functional status of the sensor arrangement 10, for example by correspondingly reducing the assumed detection range D1 when sensors are not available, or are available only to a limited degree, owing to a functional fault. On the basis of the detection range D and the sensed route, the evaluation device then determines which region E of the surroundings of the vehicle has already been passed over by the detection range D of the sensor arrangement 10. Information about the position of stationary objects is then available for this region, while no information is yet available for the remaining area of the surroundings of the vehicle.
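One conceivable implementation of determining the region E, given here only as an assumed sketch: the detection range D is swept along the sensed route and accumulated on a coarse grid. The grid representation, cell size and coordinate frame are assumptions; the sketch uses the full, unobstructed detection range, corresponding to the first of the two alternatives described in the next paragraph.

```python
# Illustrative sketch (an assumed implementation): the region E already passed
# over by the detection range D is accumulated on a coarse grid in a
# world-fixed frame, by marking every cell inside any sensor sector at any
# pose along the sensed route.
import math


def accumulate_scanned_region(grid, poses, sensors, cell_size_m=0.2):
    """grid: dict mapping (ix, iy) -> True once the cell has been scanned.
    poses: (x, y, heading) tuples along the route, in a world-fixed frame.
    sensors: (offset_x, offset_y, boresight, half_angle, max_range) tuples
             describing each sector D1-D4 in vehicle coordinates (assumed)."""
    for vx, vy, heading in poses:
        cos_h, sin_h = math.cos(heading), math.sin(heading)
        for sx, sy, bore, half, rng in sensors:
            # sensor position and boresight in the world frame
            wx = vx + cos_h * sx - sin_h * sy
            wy = vy + sin_h * sx + cos_h * sy
            wbore = heading + bore
            reach = int(rng / cell_size_m) + 1
            cx, cy = round(wx / cell_size_m), round(wy / cell_size_m)
            for ix in range(cx - reach, cx + reach + 1):
                for iy in range(cy - reach, cy + reach + 1):
                    dx = ix * cell_size_m - wx
                    dy = iy * cell_size_m - wy
                    if math.hypot(dx, dy) > rng:
                        continue  # beyond the maximum extent of this sector
                    offset = (math.atan2(dy, dx) - wbore + math.pi) % (2 * math.pi) - math.pi
                    if abs(offset) <= half:
                        grid[(ix, iy)] = True
    return grid
```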
The extent of the area E can be defined here by the evaluation device in such a way that the maximum possible detection range D of the sensor arrangement 10 is always used as a basis, i.e. the detection range D which would be possible given a free line of sight of the sensor arrangement 10, irrespective of whether the detection range D is bounded by an object. Alternatively, the extent of the area E is, as indicated in
The objects in the surroundings of the vehicle and the extent of the area E for which object information is available are displayed to the driver of the vehicle 1 on a display device 20 which is arranged in the passenger compartment of the vehicle 1, for example on a dashboard. Furthermore, there may also be provision for the driver to be additionally warned in an optical or acoustic fashion if a predefined front, rear or lateral minimum distance from one of the objects is undershot or if the vehicle 1 leaves the area E for which object information is available.
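The optional warnings mentioned above could, for example, be checked as sketched below; the threshold value and the helper names are hypothetical and not taken from the description.

```python
# Hedged sketch of the optional warnings: undershooting a minimum distance to
# a sensed object, or leaving the area E for which object information exists.
def check_warnings(objects, distance_to_vehicle, in_scanned_area,
                   min_distance_m=0.3):
    """objects: sensed object positions in vehicle coordinates.
    distance_to_vehicle: callable returning the distance from the vehicle
    outline to a point (hypothetical helper).
    in_scanned_area: True while the vehicle 1 is still inside the area E."""
    warnings = []
    if any(distance_to_vehicle(obj) < min_distance_m for obj in objects):
        warnings.append("minimum distance to an object undershot")
    if not in_scanned_area:
        warnings.append("vehicle has left the area E with object information")
    return warnings
```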
The corresponding display device 20 is shown in
As an alternative to the display illustrated in
Number | Date | Country | Kind |
---|---|---|---|
10 2009 024 062 | Jun 2009 | DE | national |
Filing Document | Filing Date | Country | Kind | 371c Date |
---|---|---|---|---|
PCT/EP2010/003408 | 6/7/2010 | WO | 00 | 11/18/2011 |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2010/139487 | 12/9/2010 | WO | A |
Number | Name | Date | Kind |
---|---|---|---|
6704653 | Kuriya et al. | Mar 2004 | B2 |
6919917 | Janssen | Jul 2005 | B1 |
7155888 | Diekhans | Jan 2007 | B2 |
7366595 | Shimizu et al. | Apr 2008 | B1 |
7640108 | Shimizu et al. | Dec 2009 | B2 |
8013721 | Yamanaka et al. | Sep 2011 | B2 |
8022818 | la Tendresse | Sep 2011 | B2 |
8395490 | Yuda | Mar 2013 | B2 |
8874324 | Eggers | Oct 2014 | B2 |
8885045 | Yanagi | Nov 2014 | B2 |
8922394 | Choi | Dec 2014 | B2 |
9098928 | Mizuta | Aug 2015 | B2 |
20100220189 | Yanagi | Sep 2010 | A1 |
20100315215 | Yuda | Dec 2010 | A1 |
20110010041 | Wagner et al. | Jan 2011 | A1 |
20120065877 | Jecker et al. | Mar 2012 | A1 |
20120154592 | Mizuta | Jun 2012 | A1 |
20150043782 | Lucas | Feb 2015 | A1 |
20150332103 | Yokota | Nov 2015 | A1 |
Number | Date | Country |
---|---|---|
43 33 112 | Mar 1995 | DE |
102006000245 | Jan 2007 | DE |
10 2007 032698 | Sep 2008 | DE |
10 2008 001648 | Nov 2009 | DE |
102013006048 | Oct 2014 | DE |
1 447 271 | Aug 2004 | EP |
2438464 | Apr 2012 | EP |
2013191050 | Sep 2013 | JP |
2014002752 | Jan 2014 | JP |
2014094737 | May 2014 | JP |
2014063186 | May 2014 | KR |
20140084760 | Jul 2014 | KR |
2008055567 | May 2008 | WO |
WO 2010139487 | Dec 2010 | WO |
WO 2016020342 | Feb 2016 | WO |
Entry |
---|
Lay-Ekuakille, A.; Trotta, A.; Vendramin, G., "Beamforming-Based Acoustic Imaging for Distance Retrieval," IEEE Instrumentation and Measurement Technology Conference Proceedings (IMTC 2008), 2008, pp. 1466-1470, DOI: 10.1109/IMTC.2008.4547274. |
Rottmann, A. et al., "Autonomous blimp control using model-free reinforcement learning in a continuous state and action space," IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2007), 2007, pp. 1895-1900, DOI: 10.1109/IROS.2007.4399531. |
Alonso, L.; Oria, J.P.; Arce, J.; Fernandez, M., "Urban traffic avoiding car collisions fuzzy system based on ultrasound," World Automation Congress (WAC 2008), 2008, pp. 1-6. |
Gao, Y.; Cheung, G.; Maugey, T.; Frossard, P.; Liang, J., "3D geometry representation using multiview coding of image tiles," IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2014), 2014, pp. 6157-6161, DOI: 10.1109/ICASSP.2014.6854787. |
International Search Report issued in PCT/EP2010/003408, mailed on Nov. 5, 2010, with translation, 6 pages. |
German Search Report issued in the corresponding German application No. 10 2009 024 062.4, mailed on Apr. 26, 2012, 5 pages. |
Number | Date | Country | |
---|---|---|---|
20120065877 A1 | Mar 2012 | US |