The present application relates to vehicles moving relative to an object, and is particularly directed to a vehicle driver assistance apparatus for assisting a vehicle driver in maneuvering the vehicle relative to an object, such as a truck driver backing up a truck trailer to the dock edge of a truck loading dock.
In some trucks, a camera may be mounted on the truck to view a target on the loading dock to indicate to the truck driver the distance between the truck trailer and the edge of the loading dock. In some other trucks, a number of lasers may be used in conjunction with a number of cameras to determine the distance between the truck trailer and the edge of the loading dock. It would be desirable to provide a relatively simple and reliable vehicle driver assistance apparatus to assist a vehicle driver in maneuvering the vehicle (such as the truck trailer) relative to an object (such as the edge of the loading dock).
In accordance with one embodiment, a vehicle driver assistance apparatus is provided for a vehicle. The vehicle driver assistance apparatus comprises an image capture device arranged to capture image data which is representative of an object in vicinity of the vehicle. The vehicle driver assistance apparatus further comprises an electronic controller arranged to (i) process the captured image data to identify an edge line associated with the object, and (ii) process the captured image data further based upon the identified edge line to provide a signal which is indicative of a distance between the vehicle and the edge line of the object.
In accordance with another embodiment, a vehicle driver assistance apparatus is provided for a vehicle having an image capture device which captures successive images of an object in vicinity of the vehicle. The vehicle driver assistance apparatus comprises a driver display device, and means for presenting on the driver display device the distance between the vehicle and an edge line of the object based upon mathematical calculations involving at least one edge line contained in the captured successive images of the object in vicinity of the vehicle.
In accordance with yet another embodiment, a method is provided of processing images having a first edge line of an object in vicinity of a vehicle to determine a distance between the vehicle and the object. The method comprises the steps of receiving a first image containing the first edge line of the object in vicinity of the vehicle at a first time, receiving a second image containing the first edge line in vicinity of the vehicle at a second time which is after the first time, and determining the distance between the vehicle and the object based upon a comparison of the relative positions of the first edge line in the first image and the first edge line in the second image.
Referring to
Image capture device 130 having an optical axis L1 is mounted on trailer 110. Image capture device 130 is mounted near the top of trailer 110. Other mounting locations for image capture device 130 on trailer 110 are possible. As shown in
Also as shown in
Image capture device 130 may comprise any conventional type of high-speed digital video camera which captures image data in real time. Camera 130 may comprise a backward-looking camera. For example, camera 130 may comprise an imaging device including a Sunex® miniature ultra-wide-angle lens and an Omnivision® High Dynamic Range imaging chip. As another example, camera 130 may comprise part of a conventional vehicle visioning system, as is known. Certain components of the vehicle visioning system may depend upon the particular type of vehicle in which the visioning system is installed. Optional light source 140 may be provided on trailer 110 to illuminate the area in vicinity of loading dock 120 when trailer 110 is being backed up towards dock edge 122 of loading dock 120.
Referring to
Controller 202 provides one or more signals to driver display device 210 located in the driver compartment of the vehicle when controller 202 executes an application program stored in program storage memory 206. Controller 202 may also provide one or more signals to driver alert device 212. Driver alert device 212 may be located in the vehicle and may include any combination of visual, audible, and haptic devices, for example. Alternatively, or in addition, driver alert device 212 may be located outside of the vehicle and may include any combination of visual and audible devices, for example.
Components of apparatus 200 may be powered on when vehicle ignition is turned on. Components of apparatus 200 co-operate to provide an indication to the vehicle driver of the running distance between trailer 110 and dock edge 122 of loading dock 120 as trailer 110 is backing up to loading dock 120 as shown in
Referring to
Reference to
In step 302 in
In step 306, controller 202 determines angle θ1 (
A determination is then made in step 314 as to whether the calculated running distance value R1 from step 310 is less than a predetermined distance value. If determination in step 314 is negative (i.e., trailer 110 has not yet backed up to within a predetermined distance from dock edge DE1 appearing on driver display device 210 shown in
In
The above-described process repeats itself to provide updated running distances until the last updated running distance is within a predetermined distance value such as shown in the display screen 430 of
When the last updated running distance is within the predetermined distance value, the determination in step 314 in
Alternatively, or in addition to presenting a message on driver display device 210, driver alert device 212 may be activated to indicate to the vehicle driver that trailer 110 is within the predetermined distance from dock edge 122 of loading dock 120. Also, vehicle brakes may be automatically activated (i.e., without vehicle driver intervention) when the last updated running distance is determined to be within the predetermined distance value from dock edge 122 of loading dock 120. The process of
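The capture/estimate/compare loop of flow diagram 300 described above can be sketched as follows. This is a minimal illustrative skeleton, not the patent's stated implementation: `camera`, `estimate_distance`, `display`, `alert`, and `brake` are hypothetical callables standing in for the image capture device, the step-306/310 angle and distance calculations (not detailed in this excerpt), the driver display device 210, the driver alert device 212, and the optional automatic braking.

```python
def monitor_backing(camera, estimate_distance, display, alert,
                    brake=None, stop_distance_m=0.5):
    """Skeleton of the flow-diagram-300 loop: capture a frame,
    estimate the running distance, display it, and alert (and
    optionally brake) once the vehicle is within a predetermined
    distance of the dock edge."""
    while True:
        frame = camera()                     # step 302: capture image data
        distance = estimate_distance(frame)  # steps 306-310: running distance
        display(distance)                    # running-distance readout
        if distance < stop_distance_m:       # step 314: threshold check
            alert()                          # e.g., driver alert device 212
            if brake is not None:
                brake()                      # optional automatic braking
            return distance
```

In use, the loop simply polls the camera until the threshold check in step 314 becomes affirmative, at which point the alert (and any automatic braking) fires exactly once.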
Another embodiment is illustrated in flow diagram 600 of
It should be apparent that the last equation (i.e., Equation (8)) suggests that only the angular difference (i.e., Δα) needs to be known to solve for the unknown running distance R2, since the quantities Δh and D1 are known values. Accordingly, if a value for Δα can be determined, the value of R2 in Equation (8) above can be solved for by using the computing capability of controller 202. Moreover, the angles α1 and α2 need not be determined relative to only horizontal extension line HL, but can be determined relative to any arbitrary reference line. For purposes of describing the embodiment of
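The inversion that Equation (8) performs can be sketched numerically. Since Equation (8) itself is not reproduced in this excerpt, the sketch below assumes a simple geometry in which the dock edge sits a vertical offset Δh below the camera and so subtends an angle α = atan(Δh/R) at range R; with R1 = R2 + D1 this gives Δα = atan(Δh/R2) − atan(Δh/(R2 + D1)), which is inverted for R2 by bisection. The function name and the bisection approach are illustrative assumptions, not the patent's stated method.

```python
import math

def running_distance(delta_alpha, delta_h, d1, r_max=200.0, tol=1e-6):
    """Solve delta_alpha = atan(delta_h/R2) - atan(delta_h/(R2 + d1))
    for R2, under the assumed camera geometry described above."""
    def f(r2):
        # Residual: positive when the trial R2 is too small
        return (math.atan2(delta_h, r2)
                - math.atan2(delta_h, r2 + d1)
                - delta_alpha)

    lo, hi = 1e-6, r_max          # f is strictly decreasing on (0, r_max)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(mid) > 0.0:
            lo = mid              # angle change too large: R2 is farther
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Because the residual is strictly decreasing in R2, the bisection converges to a unique root whenever the measured Δα corresponds to a range inside (0, r_max).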
In step 602 of
A determination is made in step 608 as to whether the top and bottom lines from steps 604 and 606 are parallel to each other. If determination in step 608 is negative (i.e., the top and bottom lines are not parallel to each other), the process returns back to steps 604 and 606 to identify top and bottom horizontal lines again. However, if determination in step 608 is affirmative (i.e., the top and bottom lines are parallel to each other), the process proceeds to step 610. In step 610, a determination is made of the angle α1 of the higher contrast top line L2 relative to the optical axis L1 of the camera 130, as shown in
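Steps 604 and 606 call for identifying high-contrast horizontal lines in the frame. The patent does not specify a detection technique; one minimal sketch scores each image row by its mean vertical intensity gradient, which peaks at strong horizontal edges such as a dock edge. (With image rows as the candidate lines, the step-608 parallelism check is trivially satisfied; a Hough-transform variant would instead compare detected line angles.) The function name and the 5-row suppression window are illustrative assumptions.

```python
import numpy as np

def strongest_horizontal_lines(gray, n=2):
    """Return the row indices of the n highest-contrast horizontal
    edges in a grayscale image, strongest first."""
    grad = np.abs(np.diff(gray.astype(float), axis=0))  # row-to-row change
    row_score = grad.mean(axis=1)                       # contrast per row
    order = np.argsort(row_score)[::-1]                 # strongest first
    picked = []
    for r in order:
        # Non-maximum suppression: skip rows too close to a picked edge
        if all(abs(r - p) > 5 for p in picked):
            picked.append(int(r))
        if len(picked) == n:
            break
    return picked
```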
Then, in step 616, camera 130 captures the next frame of image data. In step 618, controller 202 processes the next captured image to identify the higher contrast top horizontal line DE2 as shown in
In step 626, a determination is made of the next angle α2 of the higher contrast top horizontal line L2 relative to the optical axis L1 of the camera 130, as shown in
The retrieved value of Δh from step 630, the known approximate distance value of D1 from step 614, and the calculated angular difference value of Δα from step 628 are plugged into Equation (8) above. As shown in step 632, Equation (8) is then solved for the unknown variable of R2 which represents the current running distance from trailer 110 to dock edge 122 of loading dock 120. After the value of R2 is determined in step 632, the R2 value is displayed as display screen 420 (
In step 636, the value of the angle α1 is set equal to the value of the angle α2, and the initial frame data is set equal to the next frame data. Then, in step 638, a determination is made as to whether the just calculated value of R2 is less than a predetermined distance value. If determination in step 638 is negative (i.e., the current running distance from the trailer 110 to the dock edge 122 has not yet fallen below the predetermined distance), the process proceeds back to step 614 to move trailer 110 back another approximate distance towards dock edge 122 and to capture the next frame of data. The above-described process in
When the last updated running distance is within the predetermined distance value, the determination in step 638 in
Alternatively, or in addition to presenting a message on driver display device 210, driver alert device 212 may be activated to indicate to the vehicle driver that trailer 110 is within the predetermined distance from dock edge 122 of loading dock 120. Also, vehicle brakes may be automatically activated (i.e., without vehicle driver intervention) when the last updated running distance is determined to be within the predetermined distance value from dock edge 122 of loading dock 120.
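The flow-diagram-600 control loop described above can be sketched as follows. All arguments are hypothetical callables standing in for hardware and calculations not detailed in this excerpt: the camera capture, the step-610/626 line-angle measurement, the step-614 trailer movement, the step-632 Equation (8) solver (mapping Δα to R2), and the driver display.

```python
def flow_600_loop(capture_frame, line_angle, move_step, solve_r2,
                  display, stop_distance_m=0.5):
    """Sketch of the flow-diagram-600 loop: measure the dock-edge
    line angle in successive frames, form the angular difference,
    solve for the new running distance R2, and stop once R2 falls
    below a predetermined distance."""
    alpha1 = line_angle(capture_frame())      # steps 602-610: first angle
    while True:
        move_step()                           # step 614: back up ~D1
        alpha2 = line_angle(capture_frame())  # steps 616-626: next angle
        r2 = solve_r2(alpha2 - alpha1)        # steps 628-632: Equation (8)
        display(r2)                           # step 634: show distance
        alpha1 = alpha2                       # step 636: roll the state
        if r2 < stop_distance_m:              # step 638: threshold check
            return r2
```

Note how step 636 reduces the state carried between iterations to a single angle: each new frame is compared only against the immediately preceding one.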
It should be apparent from the embodiment of
It should be apparent that the running distance of the vehicle from dock edge 122 of loading dock 120 is determined by performing frame-by-frame mathematical analysis and calculations of angles associated with a higher contrast horizontal line which is contained in successive image frames and is representative of dock edge 122 of loading dock 120. Each angle value may be based on the approximate height of dock edge 122 and the approximate distance the vehicle has moved between successive image frames. Accordingly, each corresponding running distance value is based on the approximate height of dock edge 122 and the approximate distance the vehicle has moved between successive image frames.
It should also be apparent that the vehicle driver is provided with an alert or a warning that the vehicle is closely approaching or has moved to within a predetermined distance from an object (such as a fixed object like dock edge 122 of loading dock 120). The alert or warning is provided without the use of any external device on loading dock 120. Accordingly, a relatively simple and reliable way of indicating the running distance between the vehicle and the dock edge 122 of loading dock 120 is provided to assist the vehicle driver in maneuvering the vehicle relative to the loading dock 120.
It should be understood that the three display screens 410, 420, 430 and the time spacing between times T1 and T2 and between times T2 and T3 shown in
It should also be understood that an assumption has been made that a higher contrast horizontal line can be identified to represent the dock edge. However, it is conceivable that such a higher contrast horizontal line may not be identifiable. If this should be the case, the controller 202 may provide a signal or message to driver display device 210 to indicate to the vehicle driver that an identifiable horizontal line is unable to be found in the frames of image data being analyzed. Accordingly, the vehicle driver is alerted of the absence of a dock edge line on the driver display device 210.
Although the above description describes either one or two horizontal lines in successive image frames being identified and compared with each other, it is conceivable that any number of horizontal lines in successive frames be identified and compared. Further, although the above description describes identified dock edge lines as being green, yellow, or red, it is conceivable that any combination of colors, including colors other than green, yellow, and red, is possible.
Also, although the above description describes use of one electronic controller, it is conceivable that any number of electronic controllers may be used. As an example, camera 130 may have its own dedicated electronic controller. Moreover, it is conceivable that any type of electronic controller may be used. Suitable electronic controllers for use in vehicles are known and, therefore, have not been described.
Also, although the above description describes apparatus 200 as comprising both driver display device 210 and driver alert device 212, it is conceivable that apparatus 200 may include only driver display device 210 and not a driver alert device. It is also conceivable that apparatus 200 may include only driver alert device 212 and not a driver display device.
Further, although the above description describes camera 130 as being a video camera which captures real-time image data, it is conceivable that a picture camera which captures still-frames be used. As an example, camera 130 may comprise a color camera which provides single screen shots of color pictures. It should be apparent that a single video frame from a video camera may be thought of as a single screen shot.
Although the above description describes a camera being mounted on back of the vehicle, it is conceivable that the camera may be mounted at a location which is other than on back of the vehicle. As examples, the camera may be mounted on one side of the vehicle or at front of the vehicle. It is also conceivable that the camera captures image data which is representative of objects in vicinity of other than back of the vehicle. As an example, the camera may capture image data which is representative of objects in front of the vehicle. In this example, rules contained in algorithms for determining running distance in the forward direction may need to be modified accordingly based on the fact that objects are now in front of the vehicle and not in back of the vehicle.
Also, although the above description describes apparatus 200 being used in a heavy vehicle such as a truck, it is conceivable that apparatus 200 may be used in a non-heavy vehicle such as a passenger car. It is also conceivable that apparatus 200 may be used in other types of “vehicles” such as buses, recreational vehicles, railway cars, bulldozers, forklifts, and the like where movement of the vehicle toward an object is possible.
While the present invention has been illustrated by the description of example processes and system components, and while the various processes and components have been described in detail, applicant does not intend to restrict or in any way limit the scope of the appended claims to such detail. Additional modifications will also readily appear to those skilled in the art. The invention in its broadest aspects is therefore not limited to the specific details, implementations, or illustrative examples shown and described. Accordingly, departures may be made from such details without departing from the spirit or scope of applicant's general inventive concept.
Number | Name | Date | Kind |
---|---|---|---|
4214266 | Myers | Jul 1980 | A |
4942533 | Kakinami | Jul 1990 | A |
5574426 | Shisgal | Nov 1996 | A |
6078849 | Brady | Jun 2000 | A |
6172601 | Wada | Jan 2001 | B1 |
6693524 | Payne | Feb 2004 | B1 |
6865138 | Li | Mar 2005 | B1 |
6923080 | Dobler et al. | Aug 2005 | B1 |
7375621 | Hines | May 2008 | B1 |
7620518 | Schmid | Nov 2009 | B2 |
8164628 | Stein | Apr 2012 | B2 |
8643724 | Schofield et al. | Feb 2014 | B2 |
20040016870 | Pawlicki | Jan 2004 | A1 |
20040155811 | Albero | Aug 2004 | A1 |
20050073433 | Gunderson | Apr 2005 | A1 |
20060256198 | Nishiuchi | Nov 2006 | A1 |
20070208482 | Thiede | Sep 2007 | A1 |
20080042865 | Shephard | Feb 2008 | A1 |
20080088707 | Iwaki | Apr 2008 | A1 |
20080167781 | Labuhn | Jul 2008 | A1 |
20090309710 | Kakinami | Dec 2009 | A1 |
20100045448 | Kakinami | Feb 2010 | A1 |
20110082613 | Oetiker | Apr 2011 | A1 |
20120170808 | Ogata | Jul 2012 | A1 |
20120327239 | Inoue | Dec 2012 | A1 |
20130138276 | Soderi | May 2013 | A1 |
20130138314 | Viittala | May 2013 | A1 |
20130242101 | Schneider | Sep 2013 | A1 |
20130297173 | Takagi | Nov 2013 | A1 |
20130322688 | Tsuchiya | Dec 2013 | A1 |
20140078302 | Hamdan | Mar 2014 | A1 |
20140297171 | Minemura | Oct 2014 | A1 |
20140300722 | Garcia | Oct 2014 | A1 |
20140375804 | Bulan | Dec 2014 | A1 |
20150014533 | Nakamura | Jan 2015 | A1 |
20150235091 | Kumano | Aug 2015 | A1 |
Number | Date | Country |
---|---|---|
2004110521 | Sep 2002 | JP |
WO2010044127 | Apr 2010 | WO |
Entry |
---|
International Searching Authority, International Search Report, Aug. 11, 2015, 14 pages, European Patent Office, Rijswijk, Netherlands. |
Number | Date | Country | |
---|---|---|---|
20150294166 A1 | Oct 2015 | US |