Vehicular control system with trailering assist function

Information

  • Patent Grant
  • Patent Number
    10,089,541
  • Date Filed
    Monday, October 2, 2017
  • Date Issued
    Tuesday, October 2, 2018
Abstract
A vehicular control system includes a camera having an exterior field of view at least rearward of the vehicle and operable to capture image data. A trailer is attached to the vehicle and image data captured by the camera includes image data captured when the vehicle is maneuvered with the trailer at an angle relative to the vehicle. The vehicular control system determines a trailer angle of the trailer and is operable to determine a path of the trailer responsive at least to a steering angle of the vehicle and the determined trailer angle of the trailer. The vehicular control system determines an object present exterior of the vehicle and the vehicular control system distinguishes a drivable surface from a prohibited space, and the vehicular control system plans a driving path for the vehicle that neither impacts the object nor violates the prohibited space.
Description
FIELD OF THE INVENTION

The present invention relates to vehicles with cameras mounted thereon and in particular to vehicles with one or more exterior-facing cameras, such as rearward facing cameras and/or the like.


BACKGROUND OF THE INVENTION

Use of imaging sensors in vehicle imaging systems is common and known. Examples of such known systems are described in U.S. Pat. Nos. 5,949,331; 5,670,935 and/or 5,550,677, which are hereby incorporated herein by reference in their entireties.


SUMMARY OF THE INVENTION

The present invention provides a camera for a vision system that utilizes one or more cameras or image sensors to capture image data of a scene exterior (such as forwardly) of a vehicle and provides a display of images indicative of or representative of the captured image data.


The vehicular vision system of the present invention includes at least one camera disposed at a vehicle and having an exterior field of view rearward of the vehicle. The camera is operable to capture image data. An image processor is operable to process captured image data. The vision system is operable to determine a trailer angle of a trailer that is towed by the vehicle, and the vision system is operable to determine a path of the trailer responsive to a steering angle of the vehicle. The vision system is operable to display information for viewing by the driver to assist the driver in driving the vehicle with the trailer.


The vision system may display images of a road in the direction of travel of the vehicle and trailer and may display an overlay to indicate to the driver of the vehicle a steering path for the vehicle. For example, the vision system may display images of a road in the forward direction of travel of the vehicle and trailer, and the overlay may indicate a steering path for the vehicle that tows the trailer around an obstacle, such as around a corner of an intersection or the like. As another example, the vision system may display images of a road in the rearward direction of travel of the vehicle and trailer and may indicate a steering path for the vehicle to drive the trailer into a selected location, such as a parking space or the like. Optionally, trailer data (such as physical characteristic data or the like) may be input into the vision system to provide data pertaining to physical characteristics of the trailer.


These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a plan view of a vehicle with a vision system that incorporates cameras in accordance with the present invention;



FIG. 2 is a plan view of a vehicle and trailer;



FIG. 3 is a plan view of the vehicle and trailer shown as the vehicle pulls the trailer along a curve;



FIG. 4 is an enlarged plan view of the vehicle and trailer of FIG. 3;



FIGS. 5-10 are schematics of the vehicle and trailer of FIGS. 3 and 4;



FIG. 11 is a perspective view of the side of the vehicle and trailer, such as viewed by a side camera of the vision system of the present invention;



FIG. 12 is another perspective view of the side of the vehicle and trailer as in FIG. 11, showing motion vectors;



FIGS. 13-15 are perspective views of an intersection on which predicted driving paths are mapped;



FIG. 16 is another perspective view of the side of the vehicle and trailer as in FIG. 11, showing predicted driving paths when the vehicle is backing up with the trailer;



FIGS. 17-19 are plan views of parking spaces at which a vehicle may park a trailer;



FIG. 20 is a schematic showing direction indicators that assist the driver in steering the vehicle and trailer in accordance with the present invention;



FIG. 21 is a schematic view of the vehicle camera detection ranges of the vision system of the present invention;



FIG. 22 is another schematic view similar to FIG. 21, showing how a trailer camera's captured images may serve to detect and warn of approaching vehicles entering the blind spot area;



FIG. 23 is an example of a trailer target sticker utilizing a Barker code of length seven, with seven concentric circles in black and white on a gray background, shown with plus or positive ones as white circles and minus or negative ones as black circles, whereby the coding is realized from the outer rings to the inner rings; and



FIG. 24 is a table showing exemplary caravan classes.





DETAILED DESCRIPTION OF THE INVENTION

Referring now to the drawings and the illustrative embodiments depicted therein, a vehicle 10 includes an imaging system or vision system 12 that includes one or more imaging sensors or cameras (such as a rearward facing imaging sensor or camera 14a, and/or a forwardly facing camera 14b at the front (or at the windshield) of the vehicle, and/or sidewardly/rearwardly facing cameras 14c, 14d at the sides of the vehicle), which capture images exterior of the vehicle, with the cameras having a lens for focusing images at or onto an imaging array or imaging plane of the camera (FIG. 1). The vision system 12 is operable to process image data captured by the cameras and may provide displayed images at a display device 16 for viewing by the driver of the vehicle. Optionally, the vision system may process image data to detect objects, such as objects to the rear of the subject or equipped vehicle during a reversing maneuver, or such as approaching or following vehicles or vehicles at a side lane adjacent to the subject or equipped vehicle or the like.


Driver assistant systems made to assist the driver when maneuvering a trailer are known. It is known in vehicle vision systems to overlay or map so-called 'driving tunnels' on top of the outside (the vehicle's) view captured by image capturing devices, especially cameras, to visualize the predicted path the vehicle will take if the chosen steering direction is kept. The steering direction may generally be detected by steering angle sensors on the steering column. When the steering angle changes, the driving tunnel may be adapted by the vision system algorithms. To do that correctly, the vehicle's maneuvering trajectories must be taken into account. The driving tunnels may be superimposed on a display when the vehicle is maneuvering backward, such as during parking maneuvers.
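By way of illustration only, the projection of such a driving tunnel from the current steering angle can be sketched with a kinematic single-track (bicycle) model; the wheelbase, track width and sampling step below are assumed example values, not parameters of the disclosed system:

    import math

    def predict_driving_tunnel(steering_angle_rad, wheelbase_m=2.8,
                               track_m=1.6, step_m=0.25, n_steps=40):
        """Project left/right tunnel borders for a held steering angle.

        Kinematic bicycle model: the rear axle center follows a circle of
        radius R = wheelbase / tan(steering_angle)."""
        x, y, heading = 0.0, 0.0, 0.0
        left, right = [], []
        for _ in range(n_steps):
            if abs(steering_angle_rad) > 1e-6:
                heading += step_m * math.tan(steering_angle_rad) / wheelbase_m
            x += step_m * math.cos(heading)
            y += step_m * math.sin(heading)
            # Tunnel borders: offset half a track width normal to the heading.
            nx, ny = -math.sin(heading), math.cos(heading)
            left.append((x + nx * track_m / 2, y + ny * track_m / 2))
            right.append((x - nx * track_m / 2, y - ny * track_m / 2))
        return left, right

When the steering angle sensor reports a new value, the two border polylines would be recomputed and re-projected into the camera view as the adapted tunnel overlay.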


As described in International Publication No. WO 2013/109869, published Jul. 25, 2013, which is hereby incorporated herein by reference in its entirety, overlays and display view modes may be displayed or shown for aiding the driver when trying to maneuver to a trailer hitch head or when driving close to curb stones or the like. In such situations, driving tunnel overlays come into use.


In U.S. patent application Ser. No. 13/774,317, filed Feb. 22, 2013, now U.S. Pat. No. 9,269,263, which is hereby incorporated herein by reference in its entirety, it is suggested to also use driving tunnels when the vehicle is driving in forward direction.


Driver assistant systems made to assist the driver when pulling or pushing a trailer without a specific trailer angle sensor are described, such as in International Publication No. WO 2012/103193, published Aug. 2, 2012, which is hereby incorporated herein by reference in its entirety. Such trailer angle sensing systems may detect the trailer nicking angle (the articulation angle of the trailer relative to the car) by means of targets on the trailer and the vision system's cameras, instead of employing a special angle sensor on the rear hatch or door of the vehicle. In some systems, when attaching a trailer to the vehicle, the driver has to enter its properties to put the trailer driving aid system in a position to calculate the driving aid overlays properly when backing up with a trailer attached.


Wireless camera data transmission, especially via WLAN, is already known. Attaching wireless cameras to vehicles or trailers at or after assembly is also known, and analog image transmission in particular is in common use.


The present invention provides a vision system that (a) enables the (re-)identification of a trailer that has just been hooked onto a trailer hitch by means of a trailer code sticker visible to the vision system, (b) is capable of determining the distance between the trailer's (effective turning) axle and the hitch's nicking point from the trailer's nicking behavior while the team (vehicle and trailer) is in motion, (c) is capable of determining the trailer's total length by trigonometric size comparison of image features of known size to those of unknown size, (d) is capable of determining the trailer's width by side camera image evaluation, (e) is capable of estimating or determining the trailer's tendency to oscillate when driving forward and of coping with that oscillation, and (f) is capable of storing the acknowledged properties of a certain trailer in an according data file, which may be reloaded when an already known trailer is re-identified. Another aspect of the present invention is the technical realization of how a wireless (such as, for example, via a BLUETOOTH® communication protocol) trailer camera (such as an aftermarket camera) can be integrated into the (OEM) vehicle vision system and utilized in the trailer driving aid.


For the (re-)identification of a trailer that has just been hooked onto a trailer hitch, it is herein suggested to fix a unique code sticker to the concerned trailer. This may be done by the vehicle and/or trailer owner or by the trailer manufacturer. Preferably, the sticker may be mounted in the center of the vehicle vision system's (rear) camera view. The sticker may consist of a one dimensional (1D) code or a two dimensional (2D) code or even a three dimensional (3D) hologram or the like, or may consist of a kind of display (such as an LCD or E-ink display or the like). As an aspect of the present invention, the sticker may be made of a durable material, which may have a dull surface with high contrast, either in black and white (or gray scale) or in color. The material may have fluorescent or self-illuminating properties in a visible or non-visible wavelength band. The material may better reflect light in a non-visible wavelength band, such as infrared or near infrared light or ultraviolet light or the like. The sticker may even have quite low visibility in visible wavelength light and may appear like the surrounding coating, which may make the sticker nearly indistinguishable to a normal viewer, while remaining highly visible to a camera filtering or emphasizing a particular wavelength or wavelengths. The code may at least in part be embodied by a new or known 2D code (Semacode), such as a QR-code, DataMatrix, Cool-Data-Matrix, Aztec-Code, UPCODE, Trillcode, Quickmark, ShotCode, mCode, Beetagg, High Capacity Color Barcode and/or the like. The sticker and detection system may utilize aspects of the trailer angle detection systems described in U.S. patent application Ser. No. 14/036,723, filed Sep. 25, 2013, now U.S. Pat. No. 9,446,713, and/or Ser. No. 13/979,871, filed Jul. 16, 2013, now U.S. Pat. No. 9,085,261, which are hereby incorporated herein by reference in their entireties.


The target sticker may have at least a region on which a code is placed that possesses minimal autocorrelation, such as a Barker code (known for use in synchronization methods in RADAR systems or in checking microchips) or the like, but heretofore not known in automotive vision systems. For use in a vision system as a target in a camera view, the minus ones and plus ones may be expressed as black and white concentric circles on a gray background. By that, the code is rotation invariant. An according example is shown in FIG. 23. When using a Barker code based target, no corner or edge discrimination or feature tracking may be necessary for finding the target (via image processing of captured image data). For finding the target during run time, it may be sufficient to run a maximum-signal search comparing the pattern match at every test position. The match may be rated higher when there is less difference between the tested area and the compared Barker code pattern. The search may run or scan over the whole captured image (such as captured by a rear viewing vehicle camera), though it may be preferred to run the search exclusively over an area in which the target is expected to be present. With a hooked-on trailer attached at the rear of the vehicle, the distance of a mounted target to the (vehicle's rear) camera is comparably steady. Thus, no scale variants may have to be considered in the maximum-signal search. There will typically be one substantial peak emerging out of the noise of the matching scores, largely regardless of the image content or illumination pattern surrounding the target.
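A minimal sketch of such a maximum-signal search follows; the template size, the ring rendering and the plain correlation score are illustrative assumptions, not the disclosed implementation:

    import numpy as np

    BARKER7 = np.array([+1, +1, +1, -1, -1, +1, -1])  # classic length-7 Barker code

    def barker_ring_template(size=64):
        """Render the rotation-invariant target: seven concentric rings,
        +1 -> white (+1.0), -1 -> black (-1.0), on neutral gray (0.0).
        The first code element is placed on the outermost ring."""
        yy, xx = np.mgrid[0:size, 0:size]
        r = np.hypot(yy - size / 2, xx - size / 2)
        ring_w = (size / 2) / len(BARKER7)
        tmpl = np.zeros((size, size))
        for i, bit in enumerate(BARKER7):
            ring = (r >= (len(BARKER7) - 1 - i) * ring_w) & \
                   (r < (len(BARKER7) - i) * ring_w)
            tmpl[ring] = float(bit)
        return tmpl

    def find_target(image, tmpl):
        """Maximum-signal search: score the pattern match at every test
        position and return the position of the single dominant peak."""
        H, W = image.shape
        h, w = tmpl.shape
        best, best_pos = -np.inf, (0, 0)
        for y in range(H - h):
            for x in range(W - w):
                patch = image[y:y + h, x:x + w]
                score = float(((patch - patch.mean()) * tmpl).sum())
                if score > best:
                    best, best_pos = score, (y, x)
        return best_pos, best

In practice the scan would be restricted to the expected target region, as the text notes, which removes most of the cost of the exhaustive search shown here.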


As an addition or alternative to the sticker that may be seen or read out by a vehicle camera (such as the rear vehicle camera), the sticker or identification element may have a wireless transponder, such as a passive or active RFID transponder or the like. Such a device may be low cost and may be uniquely codable and suitable for such external use at a vehicle and trailer.


Optionally, instead of having a sticker, the information may be stored by a control device attached to the trailer. The control device may be wired or wireless. Preferably, a wireless camera or other device may be in use, transmitting or communicating the trailer's identification and/or properties (and optionally camera image data) to a receiver in or at the attached vehicle.


The sticker's or transponder's code may be unique or at least very rare to exclude duplicate assignments, since the main purpose is to distinguish each trailer from another. The trailers may have properties which matter to the vision system and which are necessary to switch on or calculate the driving aids correctly. There may be properties which are collectable when hooking on the trailer (such as trailer color), and others collectable while driving (such as the trailer's cornering trajectories), but there may be other properties that stay undetected unless they are either provided by driver entry, which is quite inconvenient if the driver has to do it every time a trailer is hooked on, or provided by a database from a storage medium or from a remote device or system. The database may contain a static and a dynamic data set.


The static data set may be similar or identical for a group or type of similar or identical trailers. These data may be provided by the trailer's manufacturer, the vehicle vision system's manufacturer, the vehicle's manufacturer or a service provider. The static data may contain essentially the general data from the individual trailer's data sheet, such as, for example, dimensions, maximum load, count of axles, own weight, mass center when empty, suspension parameters and/or the like. The data (base) may be stored locally within the vehicle and updated from time to time, and/or may be retrieved any time a trailer is attached to the vehicle or during vehicle service, such as from a remote data storage/server or a cloud via any kind of data communication system (such as WiMax, WiBro, UMTS, HSPA, LTE, CDMA or the like) installed in the vehicle or attached to the system, or via an OEM car (garage) service device.
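A minimal sketch of such a per-trailer data file with static and dynamic data sets; the field names and the dictionary-backed store are illustrative assumptions:

    from dataclasses import dataclass
    from typing import Dict, Optional

    @dataclass
    class TrailerRecord:
        """Per-trailer data file keyed by the unique sticker/transponder code.
        Static fields mirror the trailer's data sheet; dynamic fields are
        acquired while driving."""
        code: str                               # unique sticker / RFID code
        # static data set (manufacturer / service provider)
        length_m: float = 0.0
        width_m: float = 0.0
        axle_count: int = 1
        empty_weight_kg: float = 0.0
        max_load_kg: float = 0.0
        # dynamic data set (acquired during driving)
        hitch_to_axle_m: Optional[float] = None     # 'It' in the text
        total_weight_kg: Optional[float] = None
        resonance_rad_s: Optional[float] = None     # omega0

    def load_or_create(db: Dict[str, TrailerRecord], code: str) -> TrailerRecord:
        """Reload a known trailer's record on re-identification, else start one."""
        return db.setdefault(code, TrailerRecord(code=code))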


The database's dynamic data set may contain parameters which may be acquired during driving. As discussed below, there may be a method or algorithm to determine the distance from the hitch to the trailer's axle center. Other dynamically acquired parameters may relate to the trailer load extension, the trailer total weight or the mass center when the trailer is loaded, and the dampening capabilities of the suspension system and the tires. Optionally, the trailer tires' inflation status may be monitored as well.


All of these data may serve as inputs to a driving assistance system which aids the driver in dampening the lateral swinging of the trailer when driving. This is mostly of interest when driving forward at higher speeds. A single-axis or single-axle trailer with two suspended tires largely behaves as a PT2 (second order lag) system, assuming the tires are not skidding laterally. System parameters are the total mass, the mass center point relative to the lateral turning point (axis), and the spring and dampening capabilities of the suspension in combination with the tires. The stimulus is mostly the curvature acceleration (speed, speed change ratio, turning angle and turning angle change ratio), which mostly depends on the pulling vehicle's driving style. A trailer stability assist system may operate to keep low in amplitude those stimulating frequencies (and their harmonics) which are close to the resonance frequency ω0 of the trailer PT2 system. A stability system may be capable of steadying an already swinging trailer by anticyclical stimulation (within the PT2 system's harmonics). Phase-advanced stimuli (e.g., about 90 degrees) may act best as anticyclical or becalming stimuli.
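As a rough numeric illustration of the PT2 behavior described above (all parameter values are assumptions chosen only to show the decaying swing and the resonance frequency the assist should avoid stimulating):

    import math

    def simulate_pt2_swing(m=750.0, D=4.0e4, c=900.0, theta0=0.05,
                           dt=0.01, t_end=10.0):
        """Damped second-order (PT2) trailer swing: m*th'' + c*th' + D*th = 0.
        m: effective mass term, D: spring rate, c: damping (assumed values).
        Returns (omega0, samples of the decaying swing angle)."""
        omega0 = math.sqrt(D / m)      # resonance frequency of the PT2 system
        theta, theta_dot, out = theta0, 0.0, []
        t = 0.0
        while t < t_end:
            theta_ddot = -(c * theta_dot + D * theta) / m
            theta_dot += theta_ddot * dt       # semi-implicit Euler step
            theta += theta_dot * dt
            out.append(theta)
            t += dt
        return omega0, out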


The system may be capable of estimating the trailer's weight by dividing the pulling force by the difference between the (average) acceleration when the vehicle is accelerating with the trailer and the (average) acceleration of the vehicle without a trailer attached, when the same force coming from the engine's torque is pulling on the team (m=F/(at−av)).
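A direct transcription of that weight estimate as a helper; the magnitude handling and the guard threshold are assumptions:

    def estimate_trailer_mass(force_n, accel_with_trailer, accel_without_trailer):
        """Weight estimate per the text's relation m = F/(at - av): the same
        engine force accelerates the team less than the vehicle alone."""
        da = accel_with_trailer - accel_without_trailer
        if abs(da) < 1e-3:
            raise ValueError("accelerations too similar for a mass estimate")
        return abs(force_n / da)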


Knowing the trailer mass, the system is capable of estimating the height of the trailer's mass center by observing the trailer's nicking angle while driving through a curve. The radius at which the trailer passes through the curve can be determined from the steering angle and the trailer equations shown below. The lateral force on the trailer's mass is given by the mass multiplied by the squared speed divided by the radius:

Fz=m·v²/r.

With Fd being a force that depends on Fz in accordance with the lever length and the turning angles, the spring rate D of the trailer system may be calculated:

D=Fd/y;

where y is the spring compression travel.


The resonance frequency of the system is given by ω0:







ω0=√(D/m)
Alternatively, ω0 may be observed directly from the dynamic swing oscillation of the trailer.
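Collecting the relations above (Fz=m·v²/r, D=Fd/y, ω0=√(D/m)) into a small helper sketch; the trailer-specific mapping from Fz to Fd via lever lengths and turning angles is omitted here and would have to be supplied per trailer:

    import math

    def lateral_force(mass_kg, speed_mps, radius_m):
        """Fz = m * v^2 / r: lateral force on the trailer mass in a curve."""
        return mass_kg * speed_mps ** 2 / radius_m

    def spring_rate(f_d_n, compression_m):
        """D = Fd / y, with y the measured spring compression travel."""
        return f_d_n / compression_m

    def resonance_rad_s(spring_rate_n_per_m, mass_kg):
        """omega0 = sqrt(D / m): the PT2 resonance frequency in rad/s."""
        return math.sqrt(spring_rate_n_per_m / mass_kg)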


The trailer's axle distance to the hitch (It) is calculatable in two particular cases. The first case is when the pulling or pushing (both are possible) vehicle is not changing its direction, which means its steering angle is zero, but the trailer has an angle γ2 relative to the car (at least at the beginning). Referring to FIG. 9A and FIG. 9B, since the length (b), which equates to the driving distance (d) traveled between measuring increments (tn) and (tn+1) between the points p(tn) and p(tn+1), is known, and since the center of the trailer axle is always pointing to the vehicle's hitch turning point, the triangle enclosed between the trailer axle center, the vehicle's hitch at measuring increment (tn) and the vehicle's hitch at measuring increment (tn+1) has two known angles, γ2 and α2. The turning angle of the trailer axle is described by the angle β2. The angle α2 at the point of time (tn) equates to π−γ2 of the consecutive measuring increment (tn+1) (see equation (2) below). The flank (a) of the triangle always has the length of the trailer. All angles can (finally) be described by the trailer angle γ2 at different consecutive points of time:








It=(Sin α2(tn)·d)/Sin β2(tn)  (1)

α2(tn)=π−γ2(tn+1) given that: α1=0; from (tn) to (tn+1)  (2)
β2=π−α2−γ2  (3a)
β2=π−(α2+γ2)  (3b)
β2(t)=π−(γ2(tn)+γ2(tn+1))  (3c)
d=p(tn)−p(tn+1)  (4)

It=(Sin(π−γ2(tn+1))·(p(tn)−p(tn+1)))/Sin(π−(γ2(tn)+γ2(tn+1)))  (5)

given that: α1(t0)=0
and α1(t1)=0
and (γ2(tn)+γ2(tn+1))≠π.





With reference to FIG. 9B, It is calculatable at the time stamps (t1), (t2) and (t3) given that: α1(t0 to t3)=0 and α2(t0 to t3)≠0 (the trivial case α2=0, which is identical to β2=0, would produce a division by zero).

α2(t0)=π−γ2(t1); given that: α1=0; from (t0) to (t1)
α2(t1)=π−γ2(t2); given that: α1=0; from (t1) to (t2)
α2(t2)=π−γ2(t3); given that: α1=0; from (t2) to (t3)
α2(t3)=π−γ2(t4); given that: α1=0; from (t3) to (t4)
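A minimal sketch of equation (5) as a function of two consecutive trailer-angle measurements taken while the vehicle drives straight; the guard threshold is an assumption:

    import math

    def hitch_to_axle_length(gamma_n, gamma_n1, p_n, p_n1):
        """Equation (5): It from the trailer angles gamma2(tn), gamma2(tn+1)
        (radians) measured while the steering angle alpha1 is zero;
        p_n - p_n1 is the driven distance d between the increments."""
        denom = math.sin(math.pi - (gamma_n + gamma_n1))
        if abs(denom) < 1e-9:
            raise ValueError("gamma2(tn) + gamma2(tn+1) may not equal pi")
        return math.sin(math.pi - gamma_n1) * (p_n - p_n1) / denom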


The second particular case in which the trailer's axle distance to the hitch (It) is calculatable is when the pulling vehicle is driving in a constant turn, which means angle α1 stays constant and unequal to zero over a certain time until (tn), and the trailer angle α2 is in a steady state (no longer changing between time increments). In fact, (tn) is given at the time angle α2 reaches steady state (α2(tn)=α2(tn−1)). Since pushed trailers are metastable in practice, a steady state cannot be reached without permanent changes of α1.



FIG. 10 shows an example of a vehicle-hitch-trailer system. A trailer is swinging into the driving direction along a curved driving path that the pulling vehicle describes. The path is given by the way points p(t0) to p(t2). At t0, the trailer has an angle α2(t0) relative to the vehicle. The vehicle's (front) steering wheels' angle α1(t0) is <0 (negative compared to the trailer's angle, depending on the reference system). The vehicle's axles or wheels have the length Iv, the hitch's (turning) head has the distance Ik to the rear vehicle axle, and the trailer axle (of a one-axle trailer) has the distance It to the hitch's head (see also FIG. 2).



FIGS. 7 and 8 show a similar scene with more waypoints, showing the trailer's nicking relative to the vehicle (staying focused on the vehicle); α1 stays constant over the whole time. In FIG. 7, the turning circles of the relevant axles are schematized, as are the triangles spanned between the trailer axle center, the turning center of the trailer and the hitch head. The trailer's turning center may be unascertainable before reaching tn. The angle α2(tn) is identical.



FIGS. 3 and 4 show the fully swung-in case α2(tn) when the vehicle steering angle is <0. In that case (tn), the extensions of the vehicle's axles (rear and, approximately, the average of both front) meet in the center of the turning circle. In the scheme of FIGS. 5 and 6, the same scene is schematized, showing that the front wheels each describe their own (different) circle with a wider or closer radius than the front wheels' center radius rv, respectively. It becomes apparent that the vehicle's rear axle center has a smaller radius rh with the same center as the front wheels' center radius rv. Since the hitch is an extension of the vehicle's center line, orthogonal to the rear axle, its head radius rk to its turning point is identical to that of the vehicle's rear axle center, but shifted sideward by the length of the hitch (to the axle) Ik.



FIG. 8 shows the dynamic change of the trailer angle α2 relative to the vehicle over consecutive time steps α2(t0) to α2(tn), combining FIGS. 6 and 8. The time stamps' properties are drawn lighter with increasing age. In this schematic, it is noticeable that the trailer's hitch radius is only identical to the rear axle radius of the car when the trailer is in the swung-in condition {α2(tn)} and α1 is kept constant. If α1 changed, the common center would be left.


The only calculatable case is the swung-in case as shown in FIG. 6. At that time the trailer length is given by the equation:

It=Sin α2(tn)·rk  (6)
given that: (α2(tn)=α2(tn−1)) {steady state}
and given that: (α1(tn)=α1(tn−1)) {steady state}
and given that: α2(tn)≠0 and α1(tn)≠0 {non-trivial case}.
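A direct transcription of equation (6), reading the operator between Sin α2(tn) and rk as a product for dimensional consistency with equations (1) and (5):

    import math

    def hitch_to_axle_steady_turn(alpha2_rad, r_k_m):
        """Equation (6): in the fully swung-in state of a constant turn
        (alpha1, alpha2 constant and nonzero), It = Sin(alpha2(tn)) * rk,
        with rk the hitch-head turning radius."""
        if abs(alpha2_rad) < 1e-9:
            raise ValueError("trivial case alpha2 = 0 excluded")
        return math.sin(alpha2_rad) * r_k_m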


Another aspect of the present invention may be to cumulate the acquired measurement results of It as an average of some or all (plausible) results, measured each time one of the two cases mentioned above appears. The average value may be stored within the system or may be provided remotely in a manner as mentioned above, as a property dedicated to a specific trailer which becomes reloaded from the storage medium at the time a known trailer becomes hooked onto the vehicle again.
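A minimal sketch of such cumulative averaging keyed by trailer code; the ±25 percent plausibility gate is an assumed heuristic, not part of this disclosure:

    class TrailerLengthStore:
        """Cumulates plausible It measurements per trailer code and serves
        the running average back when a known trailer is re-identified."""
        def __init__(self):
            self._samples = {}   # code -> list of accepted It values

        def add(self, code, it_m):
            vals = self._samples.setdefault(code, [])
            if vals:
                mean = sum(vals) / len(vals)
                if abs(it_m - mean) > 0.25 * mean:
                    return       # implausible outlier, ignore
            vals.append(it_m)

        def average(self, code):
            vals = self._samples.get(code)
            return sum(vals) / len(vals) if vals else None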


As an additional aspect of the present invention, the system may be capable of determining the trailer's (20) total length by trigonometric size comparison of the known size 'It' between the vehicle (10) rear axle (21) and the trailer's axle (22) in FIG. 11 to the unknown size 'Ie' from the trailer's rear end (23) to the trailer's axle (22) (for a one-axle trailer; the axles' common center accordingly when there is more than one axle). This may happen as soon as the system calculates or estimates the trailer axle distance It for the first time.
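A minimal sketch of that size comparison as a simple proportionality in the side-camera image; it holds only under the assumption that both spans lie at a comparable distance from the camera, and the pixel measurements are illustrative inputs:

    def rear_overhang_from_image(it_m, px_known_span, px_unknown_span):
        """Estimate Ie (trailer axle to rear end) by scaling its image
        extent with the known span It and its image extent."""
        meters_per_px = it_m / px_known_span
        return meters_per_px * px_unknown_span

    def total_trailer_length(it_m, ie_m):
        """Total trailer length as later used for the blind spot extension."""
        return it_m + ie_m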


Since the trailer is always following the pulling vehicle, it is steadily present within the vehicle vision system camera views 14a, 14c and 14d (but not in the front camera 14b). By object detection and tracking methods, such as, for example, image difference subtraction, the static items within each camera scene can be discriminated while the vehicle plus trailer is in motion. The system may include or provide methods to also distinguish the vehicle's own components within the view. One method is to group tracked points whose motion vectors are identical (within a specific tolerance band) into one object, while grouping tracked points with substantially different motion vectors into another. In the exemplary case shown in FIG. 12, the surrounding world's point motion vectors (30) have the common property of pointing to a common vanishing center. The motion vectors of points (31) which are dedicated to the hooked-on trailer have the common property of moving in a direction substantially different from the vanishing point; these vectors are comparably short, often sideward, and do not disappear over a high number of consecutive frames. The vehicle's own component points (32) have the common property of being nearly fully static and never disappearing when the vehicle is in motion.
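A minimal sketch of that grouping rule over tracked-point motion vectors; the tolerance thresholds and the tuple-based interface are illustrative assumptions:

    import math

    def group_motion_vectors(tracks, vanish_pt, angle_tol=0.3, static_tol=0.5):
        """Partition tracked points into world / trailer / ego-vehicle groups
        by the vector properties described above.

        tracks: list of ((x, y), (dx, dy)) point positions and motion vectors.
        Returns dict with 'world', 'trailer' and 'vehicle' point lists."""
        groups = {"world": [], "trailer": [], "vehicle": []}
        for (x, y), (dx, dy) in tracks:
            if math.hypot(dx, dy) < static_tol:
                groups["vehicle"].append((x, y))   # nearly static: own body parts
                continue
            # angle between the motion vector and the direction to the vanishing point
            to_vp = math.atan2(vanish_pt[1] - y, vanish_pt[0] - x)
            diff = abs((math.atan2(dy, dx) - to_vp + math.pi) % (2 * math.pi) - math.pi)
            if diff < angle_tol:
                groups["world"].append((x, y))     # flows toward the vanishing center
            else:
                groups["trailer"].append((x, y))   # short, sideward, persistent
        return groups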


As a use case for the trailer angle detection system, the system may calculate the paths that the vehicle front wheels, the rear wheels and the trailer's wheels will take when the driver continues in the driving direction according to the current steering angle. As a more useful and sophisticated solution, the vision system may be able to do a three dimensional (3D) world reconstruction, or at least a lateral object detection and distance estimation/calculation. An optimal system may also be able to do an object and road surface classification for interpreting the environmental conditions. The system may be able to distinguish the drivable surface from prohibited space and from objects which ought not to be hit by the vehicle and the trailer that is towed by the vehicle. This may happen by taking into account known or provided context information. Such information may include mapping information (such as, for example, OpenStreetMap® information or the like), visual data from a remote device (such as, for example, information or data from or captured by a parking lot camera with a wireless camera signal) or data from a parking space management system or the like (which provides the position of free parking spaces). Within the reconstructed 3D space, the system may plan a driving path for the vehicle and the trailer in such a way that no wheel runs over or scratches an object or violates the prohibited driving space (which may be a pedestrian banquette or the flower bed around the parking lot).
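A minimal sketch of the admissibility test such a path planner might apply to each candidate path; the occupancy-query interface onto the reconstructed 3D space is an assumed abstraction:

    def path_is_admissible(wheel_paths, occupancy):
        """Check a candidate driving path against the reconstructed scene.

        wheel_paths: iterable of wheel trajectories (front, rear, trailer),
        each a list of (x, y) ground points.
        occupancy: callable (x, y) -> 'drivable' | 'object' | 'prohibited',
        an assumed interface onto the 3D reconstruction / map data."""
        for path in wheel_paths:
            for (x, y) in path:
                if occupancy(x, y) != "drivable":
                    return False   # a wheel would hit an object or enter prohibited space
        return True

The planner would generate candidate front-wheel paths, derive the dependent rear-wheel and trailer-wheel paths from the trailer model, and keep the first (or best) candidate that passes this test.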



FIG. 13 shows a stylized scene of an intersection onto which predicted driving paths are mapped. The vehicle's front wheel pair is shown in dark gray, the rear wheels lighter and the trailer's wheels the lightest. In this example, the vehicle is driving forward and the steered wheels' path is chosen too narrow and thus does not prevent the rear wheel from scratching the stylized pedestrian banquette (of course, since the rear wheels' path and the trailer's path are dependent on the path the front wheels take). A possible or nearly ideal path within the same situation is shown in FIG. 14. Here, the vehicle strikes out before turning into the intersecting road, which means the front wheels first describe a curve to the left before bending to the right. Even so, the front wheels do not cross the center line of the road and thus do not encroach into the other lane. The rear wheels and the trailer's wheels describe narrower curves but do not contact the pedestrian banquette.


A trailer driving aid system may use any kind of overlays to highlight a possible or ideal path that the vehicle is supposed to follow. In FIG. 15, pylons are inserted into the scene's view by overlay in the displayed images to highlight the path suggested by the system. In a more progressive system setup, the system may control the steering wheel (such as by actuators or the like), in part to lead the driver onto the possible or ideal path, or fully for autonomous driving. FIG. 16 shows a stylized side rear view, generated by image skewing and distortion, with superimposed predicted driving paths when the vehicle is backing up with a trailer. As can be seen in FIG. 16, the vehicle's steered wheels first have to describe a curve against the desired driving direction of the trailer so that the trailer later turns into about the same curvature. Even so, the front wheels do not scratch the pedestrian banquette.


An additional aspect of the present invention may be to use an augmented view which shows the scene from a viewing angle which may or may not be generated by image skewing and distortion of the vehicle's on board surround view cameras, but which may be generated artificially out of the three dimensional (3D) reconstruction data of the scene. The scene may be generated by combining real time captured sensor data, which may cover a part of the scene which is close, with another part which may come from a record of the scene. FIG. 17 shows an example of a parking lot scene where a parking space is fully known, such that it can be shown in the augmented-vision top view. The choosable or selectable parking spaces for the trailer may have been detected and their sizes dedicated within the 3D reconstruction. The currently chosen trailer parking position is shown in solid black; the optional parking positions are shown in gray. Similar to the earlier figures, the predicted driving paths of the vehicle's front wheel pair are shown in dark gray, the rear wheels' lighter and the trailer's wheels' the lightest.



FIG. 18 shows the identical scene view with a different parking space chosen by the driver. The predicted driving path suggestion has changed accordingly. Optionally, when a trailer parking position is definitively chosen, the other possible positions may be hidden or removed or may disappear, such as shown in FIG. 19. The driving paths may be corrected while closing in on the parking space. The system may control the steering wheel (by actuators) in part or in full, and also the brake and the accelerator. Systems without a steering wheel actuator may have quite simple direction indicators (such as, for example, indicators such as shown in FIG. 20) for trailer pushing or pulling aid, indicating to the driver the direction he or she may turn or continue to turn the steering wheel to follow a suggested (ideal) driving path for maneuvering the trailer well.


As an additional aspect of the present invention, the vehicle may have just a rear camera or a full surround view system with front, rear and side cameras. The side cameras' images, such as shown in FIG. 15, and/or the rear camera's image may be used for extending the (known art) vehicle blind spot detection area to the full extension of vehicle plus trailer (Iv+Ik+It+Ie) (previously unknown) plus some safety margin, to aid the driver during lane changes in such a way that not just the vehicle but also the pulled trailer does not interfere with other vehicles. FIG. 21 shows a schematic view of the surround view vehicle camera detection ranges. The system may detect the black vehicle on the fast lane early enough to warn the driver that a vehicle is within the blind spot, even though the rear camera is partially masked by the presence of the trailer within the rear view. As mentioned earlier, the system may have an additional trailer camera attached which feeds camera image data into the vision system. The blind spot warning may benefit from that image data. The schematic of FIG. 22 shows how a trailer camera's captured image may serve to detect and warn the driver of approaching vehicles entering the blind spot area.
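A trivial sketch of the extended monitoring length; the safety margin value is an assumption:

    def blind_spot_zone_length(i_v, i_k, i_t, i_e, margin_m=2.0):
        """Length to monitor alongside the team: vehicle (Iv) + hitch (Ik)
        + hitch-to-trailer-axle (It) + axle-to-rear-end (Ie) + margin."""
        return i_v + i_k + i_t + i_e + margin_m

    def in_blind_spot(longitudinal_offset_m, zone_length_m):
        """True if a detected vehicle's offset behind the vehicle front
        falls within the extended blind spot zone."""
        return 0.0 <= longitudinal_offset_m <= zone_length_m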


Optionally, and as an aid to drivers of vehicles with trailers: because the system already determines the trailer's and the vehicle's weight by comparing F/at=mt versus F/av=mv, it is able to tell which driving license class is required for the specific combination. The information may be displayed immediately after the weights of the vehicle and trailer have been determined. The according class may be selected or provided by a look-up table which comes from a server or which is stored locally. The display/table may be adapted according to the nation the vehicle is driving in and/or plans to drive in (for example, when driving from the Netherlands to Italy with a caravan trailer, there are four nations' caravan rules to be considered). Other nation-specific rules for trailers may be stored and displayed as well. Table 1 (FIG. 24) is an example table of caravan classes valid in Germany from Jan. 19, 2013.
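An illustrative, simplified look-up in the spirit of FIG. 24; the thresholds paraphrase the German classes effective Jan. 19, 2013 (B, B96, BE) in simplified form, and the actual table or server data would govern:

    def required_license_class_de(vehicle_kg, trailer_kg):
        """Simplified German license class lookup for a vehicle-trailer
        combination (assumes a passenger vehicle up to 3,500 kg)."""
        combo = vehicle_kg + trailer_kg
        if trailer_kg <= 750 or combo <= 3500:
            return "B"
        if combo <= 4250:
            return "B96"
        if trailer_kg <= 3500:
            return "BE"
        return "C1E or higher"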


Therefore, the present invention provides a means for determining the trailer angle and determining a path of travel of a trailer that is towed behind a vehicle (or pushed by the vehicle, such as when the vehicle and trailer are reversing). The present invention determines the properties or characteristics of the trailer and then calculates the path of travel of the trailer. The system of the present invention may display the path of travel or proposed steering path on a display screen to indicate to the driver of the vehicle the selected or appropriate path of travel to follow with the steering wheels of the vehicle, such that the trailer follows a desired path.


The camera or sensor may comprise any suitable camera or sensor. Optionally, the camera may comprise a “smart camera” that includes the imaging sensor array and associated circuitry and image processing circuitry and electrical connectors and the like as part of a camera module, such as by utilizing aspects of the vision systems described in International Publication Nos. WO 2013/081984 and/or WO 2013/081985, which are hereby incorporated herein by reference in their entireties.


The system includes an image processor operable to process image data captured by the camera or cameras, such as for detecting objects or other vehicles or pedestrians or the like in the field of view of one or more of the cameras. For example, the image processor may comprise an EyeQ2 or EyeQ3 image processing chip available from Mobileye Vision Technologies Ltd. of Jerusalem, Israel, and may include object detection software (such as the types described in U.S. Pat. Nos. 7,855,755; 7,720,580 and/or 7,038,577, which are hereby incorporated herein by reference in their entireties), and may analyze image data to detect vehicles and/or other objects. Responsive to such image processing, and when an object or other vehicle is detected, the system may generate an alert to the driver of the vehicle and/or may generate an overlay at the displayed image to highlight or enhance display of the detected object or vehicle, in order to enhance the driver's awareness of the detected object or vehicle or hazardous condition during a driving maneuver of the equipped vehicle.


The vehicle may include any type of sensor or sensors, such as imaging sensors or radar sensors or lidar sensors or ladar sensors or ultrasonic sensors or the like. The imaging sensor or camera may capture image data for image processing and may comprise any suitable camera or sensing device, such as, for example, an array of a plurality of photosensor elements arranged in at least 640 columns and 480 rows (preferably a megapixel imaging array or the like), with a respective lens focusing images onto respective portions of the array. The photosensor array may comprise a plurality of photosensor elements arranged in a photosensor array having rows and columns. The logic and control circuit of the imaging sensor may function in any known manner, and the image processing and algorithmic processing may comprise any suitable means for processing the images and/or image data.


For example, the vision system and/or processing and/or camera and/or circuitry may utilize aspects described in U.S. Pat. Nos. 7,005,974; 5,760,962; 5,877,897; 5,796,094; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978; 7,859,565; 5,550,677; 5,670,935; 6,636,258; 7,145,519; 7,161,616; 7,230,640; 7,248,283; 7,295,229; 7,301,466; 7,592,928; 7,881,496; 7,720,580; 7,038,577; 6,882,287; 5,929,786 and/or 5,786,772, and/or International Publication Nos. WO 2011/028686; WO 2010/099416; WO 2012/061567; WO 2012/068331; WO 2012/075250; WO 2012/0145313; WO 2012/145818; WO 2012/145822; WO 2012/158167; WO 2012/075250; WO 2012/103193; WO 2012/0116043; WO 2012/0145501; WO 2012/0145343; WO 2012/154919; WO 2013/019707; WO 2013/016409; WO 2012/145822; WO 2013/067083; WO 2013/070539; WO 2013/043661; WO 2013/048994; WO 2013/063014, WO 2013/081984; WO 2013/081985; WO 2013/074604; WO 2013/086249; WO 2013/103548; WO 2013/109869; WO 2013/123161; WO 2013/126715; WO 2013/043661 and/or WO 2013/158592 and/or U.S. patent application Ser. No. 14/082,573, filed Nov. 18, 2013, and published May 22, 2014 as U.S. Publication No. US-2014-0139676; Ser. No. 14/082,574, filed Nov. 18, 2013, now U.S. Pat. No. 9,307,640; Ser. No. 14/082,575, filed Nov. 18, 2013, now U.S. Pat. No. 9,090,234; Ser. No. 14/082,577, filed Nov. 18, 2013, now U.S. Pat. No. 8,818,042; Ser. No. 14/071,086, filed Nov. 4, 2013, now U.S. Pat. No. 8,886,401; Ser. No. 14/076,524, filed Nov. 11, 2013, now U.S. Pat. No. 9,077,962; Ser. No. 14/052,945, filed Oct. 14, 2013, and published Apr. 17, 2014 as U.S. Publication No. US-2014-0104426; Ser. No. 14/046,174, filed Oct. 4, 2013, and published Apr. 10, 2014 as U.S. Publication No. US-2014-0098229; Ser. No. 14/016,790, filed Oct. 3, 2013, and published Mar. 6, 2014 as U.S. Publication No. US-2014-007206; Ser. No. 14/036,723, filed Sep. 25, 2013, now U.S. Pat. No. 9,446,713; Ser. No. 14/016,790, filed Sep. 3, 2013, and published Mar. 6, 2014 as U.S. Publication No. US-2014-0067206; Ser. No. 14/001,272, filed Aug. 23, 2013, now U.S. Pat. No. 9,233,641; Ser. No. 13/970,868, filed Aug. 20, 2013, now U.S. Pat. No. 9,365,162; Ser. No. 13/964,134, filed Aug. 12, 2013, now U.S. Pat. No. 9,340,227; Ser. No. 13/942,758, filed Jul. 16, 2013, and published Jan. 23, 2014 as U.S. Publication No. US-2014-0025240; Ser. No. 13/942,753, filed Jul. 16, 2013, and published Jan. 30, 2014 as U.S. Publication No. US-2014-0028852; Ser. No. 13/927,680, filed Jun. 26, 2013, and published Jan. 2, 2014 as U.S. Publication No. US-2014-0005907; Ser. No. 13/916,051, filed Jun. 12, 2013, now U.S. Pat. No. 9,077,098; Ser. No. 13/894,870, filed May 15, 2013, and published Nov. 28, 2013 as U.S. Publication No. US-2013-0314503; Ser. No. 13/887,724, filed May 6, 2013, and published Nov. 14, 2013 as U.S. Publication No. US-2013-0298866; Ser. No. 13/852,190, filed Mar. 28, 2013, and published Aug. 29, 2013 as U.S. Publication No. US-2013-0222593; Ser. No. 13/851,378, filed Mar. 27, 2013, now U.S. Pat. No. 9,319,637; Ser. No. 13/848,796, filed Mar. 22, 2012, and published Oct. 24, 2013 as U.S. Publication No. US-2013-0278769; Ser. No. 13/847,815, filed Mar. 20, 2013, and published Oct. 31, 2013 as U.S. Publication No. US-2013-0286193; Ser. No. 13/800,697, filed Mar. 13, 2013, and published Oct. 3, 2013 as U.S. Publication No. US-2013-0258077; Ser. No. 13/785,099, filed Mar. 5, 2013, and published Sep. 19, 2013 as U.S. 
Publication No. US-2013-0242099; Ser. No. 13/779,881, filed Feb. 28, 2013, now U.S. Pat. No. 8,694,224; Ser. No. 13/774,317, filed Feb. 22, 2013, now U.S. Pat. No. 9,269,263; Ser. No. 13/774,315, filed Feb. 22, 2013, and published Aug. 22, 2013 as U.S. Publication No. US-2013-0215271; Ser. No. 13/681,963, filed Nov. 20, 2012, now U.S. Pat. No. 9,264,673; Ser. No. 13/660,306, filed Oct. 25, 2012, now U.S. Pat. No. 9,146,898; Ser. No. 13/653,577, filed Oct. 17, 2012, now U.S. Pat. No. 9,174,574; and/or Ser. No. 13/534,657, filed Jun. 27, 2012, and published Jan. 3, 2013 as U.S. Publication No. US-2013-0002873, and/or U.S. provisional applications, Ser. No. 61/901,127, filed Nov. 7, 2013; Ser. No. 61/905,461, filed Nov. 18, 2013; Ser. No. 61/905,462, filed Nov. 18, 2013; Ser. No. 61/895,610, filed Oct. 25, 2013; Ser. No. 61/895,609, filed Oct. 25, 2013; Ser. No. 61/893,489, filed Oct. 21, 2013; Ser. No. 61/886,883, filed Oct. 4, 2013; Ser. No. 61/879,837, filed Sep. 19, 2013; Ser. No. 61/879,835, filed Sep. 19, 2013; Ser. No. 61/878,877, filed Sep. 17, 2013; Ser. No. 61/875,351, filed Sep. 9, 2013; Ser. No. 61/869,195, filed. Aug. 23, 2013; Ser. No. 61/864,835, filed Aug. 12, 2013; Ser. No. 61/864,836, filed Aug. 12, 2013; Ser. No. 61/864,837, filed Aug. 12, 2013; Ser. No. 61/864,838, filed Aug. 12, 2013; Ser. No. 61/856,843, filed Jul. 22, 2013, Ser. No. 61/845,061, filed Jul. 11, 2013; Ser. No. 61/844,630, filed Jul. 10, 2013; Ser. No. 61/844,173, filed Jul. 9, 2013; Ser. No. 61/844,171, filed Jul. 9, 2013; Ser. No. 61/842,644, filed Jul. 3, 2013; Ser. No. 61/840,542, filed Jun. 28, 2013; Ser. No. 61/838,619, filed Jun. 24, 2013; Ser. No. 61/838,621, filed Jun. 24, 2013; Ser. No. 61/837,955, filed Jun. 21, 2013; Ser. No. 61/836,900, filed Jun. 19, 2013; Ser. No. 61/836,380, filed Jun. 18, 2013; Ser. No. 61/834,129, filed Jun. 12, 2013; Ser. No. 61/833,080, filed Jun. 10, 2013; Ser. No. 61/830,375, filed Jun. 3, 2013; Ser. No. 61/830,377, filed Jun. 3, 2013; Ser. No. 61/825,752, filed May 21, 2013; Ser. No. 61/825,753, filed May 21, 2013; Ser. No. 61/823,648, filed May 15, 2013; Ser. No. 61/823,644, filed May 15, 2013; Ser. No. 61/821,922, filed May 10, 2013; Ser. No. 61/819,835, filed May 6, 2013; Ser. No. 61/819,033, filed May 3, 2013; Ser. No. 61/816,956, filed Apr. 29, 2013; Ser. No. 61/815,044, filed Apr. 23, 2013; Ser. No. 61/814,533, filed Apr. 22, 2013; Ser. No. 61/813,361, filed Apr. 18, 2013; Ser. No. 61/810,407, filed Apr. 10, 2013; Ser. No. 61/808,930, filed Apr. 5, 2013; Ser. No. 61/807,050, filed Apr. 1, 2013; Ser. No. 61/806,674, filed Mar. 29, 2013; Ser. No. 61/793,592, filed Mar. 15, 2013; Ser. No. 61/772,015, filed Mar. 4, 2013; Ser. No. 61/772,014, filed Mar. 4, 2013; Ser. No. 61/770,051, filed Feb. 27, 2013; Ser. No. 61/770,048, filed Feb. 27, 2013; Ser. No. 61/766,883, filed Feb. 20, 2013; Ser. No. 61/760,366, filed Feb. 4, 2013; Ser. No. 61/760,364, filed Feb. 4, 2013; Ser. No. 61/756,832, filed Jan. 25, 2013; Ser. No. 61/754,804, filed Jan. 21, 2013; Ser. No. 61/736,103, filed Dec. 12, 2012; Ser. No. 61/734,457, filed Dec. 7, 2012; Ser. No. 61/733,598, filed Dec. 5, 2012; and/or Ser. No. 61/733,093, filed Dec. 4, 2012, which are all hereby incorporated herein by reference in their entireties. The system may communicate with other communication systems via any suitable means, such as by utilizing aspects of the systems described in International Publication Nos. WO/2010/144900; WO 2013/043661 and/or WO 2013/081985, and/or U.S. patent application Ser. No. 
13/202,005, filed Aug. 17, 2011, now U.S. Pat. No. 9,126,525, which are hereby incorporated herein by reference in their entireties.


The imaging device and control and image processor and any associated illumination source, if applicable, may comprise any suitable components, and may utilize aspects of the cameras and vision systems described in U.S. Pat. Nos. 5,550,677; 5,877,897; 6,498,620; 5,670,935; 5,796,094; 6,396,397; 6,806,452; 6,690,268; 7,005,974; 7,123,168; 7,004,606; 6,946,978; 7,038,577; 6,353,392; 6,320,176; 6,313,454 and 6,824,281, and/or International Publication No. WO 2010/099416, published Sep. 2, 2010, and/or PCT Application No. PCT/US10/47256, filed Aug. 31, 2010 and published Mar. 10, 2011 as International Publication No. WO 2011/028686, and/or U.S. patent application Ser. No. 12/508,840, filed Jul. 24, 2009, and published Jan. 28, 2010 as U.S. Pat. Publication No. US 2010-0020170, and/or PCT Application No. PCT/US2012/048110, filed Jul. 25, 2012, and published Jan. 31, 2013 as International Publication No. WO 2013/016409, and/or U.S. patent application Ser. No. 13/534,657, filed Jun. 27, 2012, and published Jan. 3, 2013 as U.S. Publication No. US-2013/0002873, which are all hereby incorporated herein by reference in their entireties. The camera or cameras may comprise any suitable cameras or imaging sensors or camera modules, and may utilize aspects of the cameras or sensors described in U.S. patent application Ser. No. 12/091,359, filed Apr. 24, 2008 and published Oct. 1, 2009 as U.S. Publication No. US-2009-0244361, and/or Ser. No. 13/260,400, filed Sep. 26, 2011, now U.S. Pat. No. 8,542,451, and/or U.S. Pat. No. 7,965,336 and/or 7,480,149, which are hereby incorporated herein by reference in their entireties. The imaging array sensor may comprise any suitable sensor, and may utilize various imaging sensors or imaging array sensors or cameras or the like, such as a CMOS imaging array sensor, a CCD sensor or other sensors or the like, such as the types described in U.S. Pat. Nos. 5,550,677; 5,670,935; 5,760,962; 5,715,093; 5,877,897; 6,922,292; 6,757,109; 6,717,610; 6,590,719; 6,201,642; 6,498,620; 5,796,094; 6,097,023; 6,320,176; 6,559,435; 6,831,261; 6,806,452; 6,396,397; 6,822,563; 6,946,978; 7,339,149; 7,038,577; 7,004,606 and/or 7,720,580, and/or U.S. patent application Ser. No. 10/534,632, filed May 11, 2005, now U.S. Pat. No. 7,965,336; and/or PCT Application No. PCT/US2008/076022, filed Sep. 11, 2008 and published Mar. 19, 2009 as International Publication No. WO/2009/036176, and/or PCT Application No. PCT/US2008/078700, filed Oct. 3, 2008 and published Apr. 9, 2009 as International Publication No. WO/2009/046268, which are all hereby incorporated herein by reference in their entireties.


The camera module and circuit chip or board and imaging sensor may be implemented and operated in connection with various vehicular vision-based systems, and/or may be operable utilizing the principles of such other vehicular systems, such as a vehicle headlamp control system, such as the type disclosed in U.S. Pat. Nos. 5,796,094; 6,097,023; 6,320,176; 6,559,435; 6,831,261; 7,004,606; 7,339,149 and/or 7,526,103, which are all hereby incorporated herein by reference in their entireties, a rain sensor, such as the types disclosed in commonly assigned U.S. Pat. Nos. 6,353,392; 6,313,454; 6,320,176 and/or 7,480,149, which are hereby incorporated herein by reference in their entireties, a vehicle vision system, such as a forwardly, sidewardly or rearwardly directed vehicle vision system utilizing principles disclosed in U.S. Pat. Nos. 5,550,677; 5,670,935; 5,760,962; 5,877,897; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978 and/or 7,859,565, which are all hereby incorporated herein by reference in their entireties, a trailer hitching aid or tow check system, such as the type disclosed in U.S. Pat. No. 7,005,974, which is hereby incorporated herein by reference in its entirety, a reverse or sideward imaging system, such as for a lane change assistance system or lane departure warning system or for a blind spot or object detection system, such as imaging or detection systems of the types disclosed in U.S. Pat. Nos. 7,720,580; 7,038,577; 5,929,786 and/or 5,786,772, and/or U.S. patent application Ser. No. 11/239,980, filed Sep. 30, 2005, now U.S. Pat. No. 7,881,496, and/or U.S. provisional applications, Ser. No. 60/628,709, filed Nov. 17, 2004; Ser. No. 60/614,644, filed Sep. 30, 2004; Ser. No. 60/618,686, filed Oct. 14, 2004; Ser. No. 60/638,687, filed Dec. 23, 2004, which are hereby incorporated herein by reference in their entireties, a video device for internal cabin surveillance and/or video telephone function, such as disclosed in U.S. Pat. Nos. 5,760,962; 5,877,897; 6,690,268 and/or 7,370,983, and/or U.S. patent application Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. US-2006-0050018, which are hereby incorporated herein by reference in their entireties, a traffic sign recognition system, a system for determining a distance to a leading or trailing vehicle or object, such as a system utilizing the principles disclosed in U.S. Pat. No. 6,396,397 and/or 7,123,168, which are hereby incorporated herein by reference in their entireties, and/or the like.


Optionally, the circuit board or chip may include circuitry for the imaging array sensor and/or other electronic accessories or features, such as by utilizing compass-on-a-chip or EC driver-on-a-chip technology and aspects such as described in U.S. Pat. No. 7,255,451 and/or U.S. Pat. No. 7,480,149; and/or U.S. patent application Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. Publication No. US-2006-0061008, and/or Ser. No. 12/578,732, filed Oct. 14, 2009, now U.S. Pat. No. 9,487,144, which are hereby incorporated herein by reference in their entireties.


Optionally, the vision system may include a display for displaying images captured by one or more of the imaging sensors for viewing by the driver of the vehicle while the driver is normally operating the vehicle. Optionally, for example, the vision system may include a video display device disposed at or in the interior rearview mirror assembly of the vehicle, such as by utilizing aspects of the video mirror display systems described in U.S. Pat. No. 6,690,268 and/or U.S. patent application Ser. No. 13/333,337, filed Dec. 21, 2011, now U.S. Pat. No. 9,264,672, which are hereby incorporated herein by reference in their entireties. The video mirror display may comprise any suitable devices and systems and optionally may utilize aspects of the compass display systems described in U.S. Pat. Nos. 7,370,983; 7,329,013; 7,308,341; 7,289,037; 7,249,860; 7,004,593; 4,546,551; 5,699,044; 4,953,305; 5,576,687; 5,632,092; 5,677,851; 5,708,410; 5,737,226; 5,802,727; 5,878,370; 6,087,953; 6,173,508; 6,222,460; 6,513,252 and/or 6,642,851, and/or European patent application, published Oct. 11, 2000 under Publication No. EP 0 1043566, and/or U.S. patent application Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. Publication No. US-2006-0061008, which are all hereby incorporated herein by reference in their entireties. Optionally, the video mirror display screen or device may be operable to display images captured by a rearward viewing camera of the vehicle during a reversing maneuver of the vehicle (such as responsive to the vehicle gear actuator being placed in a reverse gear position or the like) to assist the driver in backing up the vehicle, and optionally may be operable to display the compass heading or directional heading character or icon when the vehicle is not undertaking a reversing maneuver, such as when the vehicle is being driven in a forward direction along a road (such as by utilizing aspects of the display system described in PCT Application No. PCT/US2011/056295, filed Oct. 14, 2011 and published Apr. 19, 2012 as International Publication No. WO 2012/051500, which is hereby incorporated herein by reference in its entirety).


Optionally, the vision system (utilizing the forward facing camera and a rearward facing camera and other cameras disposed at the vehicle with exterior fields of view) may be part of or may provide a display of a top-down view or birds-eye view system of the vehicle or a surround view at the vehicle, such as by utilizing aspects of the vision systems described in International Publication Nos. WO 2010/099416; WO 2011/028686; WO 2012/075250; WO 2013/019795; WO 2012/075250; WO 2012/145822; WO 2013/081985; WO 2013/086249 and/or WO 2013/109869, and/or U.S. patent application Ser. No. 13/333,337, filed Dec. 21, 2011, now U.S. Pat. No. 9,264,672, which are hereby incorporated herein by reference in their entireties.


Optionally, a video mirror display may be disposed rearward of and behind the reflective element assembly and may comprise a display such as the types disclosed in U.S. Pat. Nos. 5,530,240; 6,329,925; 7,855,755; 7,626,749; 7,581,859; 7,446,650; 7,370,983; 7,338,177; 7,274,501; 7,255,451; 7,195,381; 7,184,190; 5,668,663; 5,724,187 and/or 6,690,268, and/or in U.S. patent application Ser. No. 12/091,525, filed Apr. 25, 2008, now U.S. Pat. No. 7,855,755; Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. Publication No. US-2006-0061008; and/or Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. US-2006-0050018, which are all hereby incorporated herein by reference in their entireties. The display is viewable through the reflective element when the display is activated to display information. The display element may be any type of display element, such as a vacuum fluorescent (VF) display element, a light emitting diode (LED) display element, such as an organic light emitting diode (OLED) or an inorganic light emitting diode, an electroluminescent (EL) display element, a liquid crystal display (LCD) element, a video screen display element or backlit thin film transistor (TFT) display element or the like, and may be operable to display various information (as discrete characters, icons or the like, or in a multi-pixel manner) to the driver of the vehicle, such as passenger side inflatable restraint (PSIR) information, tire pressure status, and/or the like. The mirror assembly and/or display may utilize aspects described in U.S. Pat. Nos. 7,184,190; 7,255,451; 7,446,924 and/or 7,338,177, which are all hereby incorporated herein by reference in their entireties. The thicknesses and materials of the coatings on the substrates of the reflective element may be selected to provide a desired color or tint to the mirror reflective element, such as a blue colored reflector, such as is known in the art and such as described in U.S. Pat. Nos. 5,910,854; 6,420,036 and/or 7,274,501, which are hereby incorporated herein by reference in their entireties.


Optionally, the display or displays and any associated user inputs may be associated with various accessories or systems, such as, for example, a tire pressure monitoring system or a passenger air bag status or a garage door opening system or a telematics system or any other accessory or system of the mirror assembly or of the vehicle or of an accessory module or console of the vehicle, such as an accessory module or console of the types described in U.S. Pat. Nos. 7,289,037; 6,877,888; 6,824,281; 6,690,268; 6,672,744; 6,386,742 and 6,124,886, and/or U.S. patent application Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. US-2006-0050018, which are hereby incorporated herein by reference in their entireties.


While the above description constitutes a plurality of embodiments of the present invention, it will be appreciated that the present invention is susceptible to further modification and change without departing from the fair meaning of the accompanying claims.

Claims
  • 1. A vehicular control system, said vehicular control system comprising:
  a camera disposed at a vehicle and having an exterior field of view at least rearward of the vehicle;
  wherein said camera is operable to capture image data;
  wherein a trailer is attached to the vehicle;
  an image processor operable to process captured image data;
  wherein image data captured by said camera during maneuvering of the vehicle and the trailer includes image data captured by said camera when the vehicle is maneuvered with the trailer at an angle relative to the vehicle;
  wherein said vehicular control system, responsive at least in part to image processing by said image processor of image data captured by said camera, determines a trailer angle of the trailer;
  wherein said vehicular control system is operable to determine a path of the trailer responsive at least to a steering angle of the vehicle and the determined trailer angle of the trailer relative to the vehicle;
  wherein said vehicular control system determines an object present exterior of the vehicle which ought not to be impacted during maneuvering of the vehicle and the trailer;
  wherein said vehicular control system determines drivable surfaces and prohibited spaces and distinguishes drivable surfaces from prohibited spaces;
  wherein said vehicular control system, responsive to determination of objects present exterior of the vehicle which ought not to be impacted during maneuvering of the vehicle and trailer and distinguishing drivable surfaces from prohibited spaces, constructs a three dimensional model of the region encompassed by the field of view of said camera; and
  wherein, in accordance with the constructed three dimensional model, and based at least in part on image processing by said image processor of image data captured by said camera during maneuvering of the vehicle and the trailer, said vehicular control system plans a driving path for the vehicle and trailer that avoids detected objects and that avoids determined prohibited spaces so that wheels of the vehicle and trailer do not run over or contact detected objects and do not violate determined prohibited spaces during maneuvering of the vehicle and trailer.
  • 2. The vehicular control system of claim 1, wherein, based at least in part on image processing by said image processor of image data captured by said camera during maneuvering of the vehicle and the trailer, said vehicular control system classifies the determined object.
  • 3. The vehicular control system of claim 1, wherein said vehicular control system determines the object which ought not to be impacted during maneuvering of the vehicle and the trailer based at least in part on mapping information.
  • 4. The vehicular control system of claim 1, wherein said vehicular control system determines the object which ought not be impacted during maneuvering of the vehicle and the trailer based at least in part on image processing by said image processor of image data captured by said camera.
  • 5. The vehicular control system of claim 1, wherein said vehicular control system determines the object which ought not to be impacted during maneuvering of the vehicle and the trailer based at least in part on data wirelessly transmitted to the vehicle.
  • 6. The vehicular control system of claim 1, wherein, based at least in part on image processing by said image processor of image data captured by said camera, said vehicular control system estimates height of center of mass of the trailer.
  • 7. The vehicular control system of claim 6, wherein said vehicular control system estimates the height of the center of mass of the trailer during maneuvering of the vehicle and the trailer around a curve.
  • 8. The vehicular control system of claim 1, wherein, based at least in part on image processing by said image processor of image data captured by said camera, said vehicular control system estimates the trailer's nicking angle while the vehicle steers around a curve.
  • 9. The vehicular control system of claim 1, wherein, based at least in part on image processing by said image processor of image data captured by said camera during maneuvering of the vehicle and the trailer, said vehicular control system classifies a road surface viewed by said camera.
  • 10. The vehicular control system of claim 1, wherein said vehicular control system displays, at a display screen of the vehicle, images derived from image data captured by said camera, and wherein an overlay overlaid over the displayed images indicates a steering path for the vehicle that maneuvers the trailer around the determined object.
  • 11. The vehicular control system of claim 10, wherein the steering path for the vehicle is determined at least in part responsive to the steering angle of the vehicle.
  • 12. The vehicular control system of claim 1, wherein said vehicular control system, based at least in part on image processing by said image processor of image data captured by said camera during maneuvering of the vehicle and the trailer, is operable to determine a distance between the trailer turning axis and a nicking point of the trailer by the trailer's nicking behavior while the vehicle and trailer are in motion.
  • 13. The vehicular control system of claim 1, wherein said vehicular control system, based at least in part on image processing by said image processor of image data captured by said camera during maneuvering of the vehicle and the trailer, is operable to determine the trailer's width.
  • 14. The vehicular control system of claim 1, wherein said vehicular control system gathers trailer data pertaining to physical characteristics of the trailer, and wherein said vehicular control system is operable to determine the path of the trailer at least in part responsive to the trailer data gathered.
  • 15. The vehicular control system of claim 1, wherein said vehicular control system is operable to identify a certain trailer and to store properties of that certain trailer in a data file that is accessed when that certain trailer is connected at the vehicle and identified by said vehicular control system.
  • 16. The vehicular control system of claim 1, wherein, when the vehicle and the trailer are maneuvering, and based at least in part on image processing by said image processor of image data captured by said camera during maneuvering of the vehicle and the trailer, said vehicular control system estimates a dimension of the trailer.
  • 17. The vehicular control system of claim 16, wherein the dimension of the trailer comprises width of the trailer.
  • 18. The vehicular control system of claim 16, wherein the dimension of the trailer comprises length of the trailer.
  • 19. A vehicular control system, said vehicular control system comprising:
  a camera disposed at a vehicle and having an exterior field of view at least rearward of the vehicle;
  wherein said camera is operable to capture image data;
  wherein a trailer is attached to the vehicle;
  an image processor operable to process captured image data;
  wherein image data captured by said camera during maneuvering of the vehicle and the trailer includes image data captured by said camera when the vehicle is maneuvered with the trailer at an angle relative to the vehicle;
  wherein said vehicular control system, responsive at least in part to image processing by said image processor of image data captured by said camera, determines a trailer angle of the trailer;
  wherein said vehicular control system is operable to determine a path of the trailer responsive at least to a steering angle of the vehicle and the determined trailer angle of the trailer relative to the vehicle;
  wherein said vehicular control system determines an object present exterior of the vehicle which ought not be impacted during maneuvering of the vehicle and the trailer based at least in part on image processing by said image processor of image data captured by said camera;
  wherein said vehicular control system determines drivable surfaces and prohibited spaces and distinguishes drivable surfaces from prohibited spaces;
  wherein said vehicular control system, responsive to determination of objects present exterior of the vehicle which ought not to be impacted during maneuvering of the vehicle and trailer and distinguishing drivable surfaces from prohibited spaces, constructs a three dimensional model of the region encompassed by the field of view of said camera;
  wherein, in accordance with the constructed three dimensional model, and based at least in part on image processing by said image processor of image data captured by said camera during maneuvering of the vehicle and the trailer, said vehicular control system plans a driving path for the vehicle and trailer that avoids detected objects and that avoids determined prohibited spaces so that wheels of the vehicle and trailer do not run over or contact detected objects and do not violate determined prohibited spaces during maneuvering of the vehicle and trailer; and
  wherein said vehicular control system is operable to identify a certain trailer and to store properties of that certain trailer in a data file that is accessed when that certain trailer is connected at the vehicle and identified by said vehicular control system.
  • 20. The vehicular control system of claim 19, wherein, based at least in part on image processing by said image processor of image data captured by said camera during maneuvering of the vehicle and the trailer, said vehicular control system classifies the determined object.
  • 21. The vehicular control system of claim 20, wherein, when the vehicle and the trailer are maneuvering, and based at least in part on image processing by said image processor of image data captured by said camera during maneuvering of the vehicle and the trailer, said vehicular control system estimates a dimension of the trailer.
  • 22. A vehicular control system, said vehicular control system comprising:
  a camera disposed at a vehicle and having an exterior field of view at least rearward of the vehicle;
  wherein said camera is operable to capture image data;
  wherein a trailer is attached to the vehicle;
  an image processor operable to process image data captured by said camera;
  wherein image data captured by said camera during maneuvering of the vehicle and the trailer includes image data captured by said camera when the vehicle is maneuvered with the trailer at an angle relative to the vehicle;
  wherein said vehicular control system, responsive at least in part to image processing by said image processor of image data captured by said camera, determines a trailer angle of the trailer;
  wherein said vehicular control system is operable to determine a path of the trailer responsive at least to a steering angle of the vehicle and the determined trailer angle of the trailer relative to the vehicle;
  wherein said vehicular control system determines an object present exterior of the vehicle which ought not be impacted during maneuvering of the vehicle and the trailer based at least in part on image processing by said image processor of image data captured by said camera;
  wherein said vehicular control system determines drivable surfaces and prohibited spaces and distinguishes drivable surfaces from prohibited spaces;
  wherein said vehicular control system, responsive to determination of objects present exterior of the vehicle which ought not to be impacted during maneuvering of the vehicle and trailer and distinguishing drivable surfaces from prohibited spaces, constructs a three dimensional model of the region encompassed by the field of view of said camera;
  wherein, in accordance with the constructed three dimensional model, and based at least in part on image processing by said image processor of image data captured by said camera during maneuvering of the vehicle and the trailer, said vehicular control system plans a driving path for the vehicle and trailer that avoids detected objects and that avoids determined prohibited spaces so that wheels of the vehicle and trailer do not run over or contact detected objects and do not violate determined prohibited spaces during maneuvering of the vehicle and trailer;
  wherein said vehicular control system is operable to identify a certain trailer and to store properties of that certain trailer in a data file that is accessed when that certain trailer is connected at the vehicle and identified by said vehicular control system;
  wherein, based at least in part on image processing by said image processor of image data captured by said camera during maneuvering of the vehicle and the trailer, said vehicular control system classifies at least one of (i) the determined object and (ii) a road surface viewed by said camera; and
  wherein, when the vehicle and the trailer are maneuvering, and based at least in part on image processing by said image processor of image data captured by said camera during maneuvering of the vehicle and the trailer, said vehicular control system estimates width of the trailer.
  • 23. The vehicular control system of claim 22, wherein, based at least in part on image processing by said image processor of image data captured by said camera, said vehicular control system estimates height of center of mass of the trailer during maneuvering of the vehicle and the trailer around a curve.
  • 24. The vehicular control system of claim 22, wherein, based at least in part on image processing by said image processor of image data captured by said camera, said vehicular control system estimates the trailer's nicking angle while the vehicle steers around a curve.
  • 25. The vehicular control system of claim 22, wherein, based at least in part on image processing by said image processor of image data captured by said camera during maneuvering of the vehicle and the trailer, said vehicular control system classifies a road surface viewed by said camera.
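By way of illustration of the path determination recited in claim 1 (a path of the trailer responsive at least to the vehicle steering angle and the determined trailer angle), a textbook kinematic single-track car-trailer model can be forward-integrated as sketched below. All parameter values and names are assumptions for illustration; this is not the claimed system's implementation.

```python
# Minimal sketch (assumptions labeled): forward-integrating a standard
# kinematic car-trailer model, with the hitch at the vehicle rear axle,
# to predict the trailer axle's path from the vehicle steering angle and
# the measured trailer (hitch) angle.
import math

def predict_trailer_path(v, delta, hitch_angle, wheelbase=2.8,
                         trailer_len=4.0, dt=0.05, steps=100):
    """v: speed [m/s]; delta: steering angle [rad];
    hitch_angle: trailer angle relative to the vehicle [rad];
    trailer_len: hitch-to-trailer-axle distance [m] (assumed)."""
    x = y = 0.0                       # vehicle rear-axle position
    th_v = 0.0                        # vehicle heading
    th_t = th_v - hitch_angle         # trailer heading
    path = []
    for _ in range(steps):
        x += v * math.cos(th_v) * dt
        y += v * math.sin(th_v) * dt
        th_v += v * math.tan(delta) / wheelbase * dt
        th_t += v * math.sin(th_v - th_t) / trailer_len * dt
        # trailer axle sits trailer_len behind the hitch point
        path.append((x - trailer_len * math.cos(th_t),
                     y - trailer_len * math.sin(th_t)))
    return path
```

Under this model the predicted path depends only on the steering angle, the current hitch angle, and the trailer geometry, which is why the claims tie path determination to the determined trailer angle and to gathered or estimated trailer dimensions.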
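Similarly, the claimed requirement that the planned path avoid detected objects and prohibited spaces, with wheels neither contacting objects nor violating prohibited space, can be illustrated with a simple admissibility check over a labeled ground grid. This is a minimal sketch with assumed names, using a 2D grid as a stand-in for the claimed three dimensional model.

```python
# Minimal sketch (all names assumed): checking candidate wheel tracks of
# the vehicle and trailer against a grid whose cells are labeled as
# drivable, prohibited, or occupied by a detected object.
DRIVABLE, PROHIBITED, OBJECT = 0, 1, 2

def path_is_admissible(wheel_tracks, grid, cell_size=0.1):
    """wheel_tracks: iterable of (x, y) wheel-contact points [m] along
    the planned maneuver; grid: 2D list of cell labels, origin at (0, 0)."""
    for x, y in wheel_tracks:
        col, row = int(x / cell_size), int(y / cell_size)
        if row < 0 or col < 0 or row >= len(grid) or col >= len(grid[0]):
            return False              # leaving the modeled region
        if grid[row][col] != DRIVABLE:
            return False              # object contact or prohibited cell
    return True
```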
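And the trailer identification feature of claims 15, 19 and 22 (storing properties of an identified trailer in a data file that is accessed when that trailer is reconnected) can be illustrated with a small keyed store; the JSON file format, file name, and field names here are hypothetical.

```python
# Minimal sketch (hypothetical file format): persisting properties of an
# identified trailer so they can be reloaded when that same trailer is
# recognized at the hitch again.
import json
from pathlib import Path

STORE = Path("trailer_profiles.json")   # assumed storage location

def save_trailer(trailer_id, properties):
    data = json.loads(STORE.read_text()) if STORE.exists() else {}
    data[trailer_id] = properties        # e.g. width, length, axle count
    STORE.write_text(json.dumps(data, indent=2))

def load_trailer(trailer_id):
    if STORE.exists():
        return json.loads(STORE.read_text()).get(trailer_id)
    return None
```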
CROSS REFERENCE TO RELATED APPLICATIONS

The present application is a continuation of U.S. patent application Ser. No. 15/413,464, filed Jan. 24, 2017, now U.S. Pat. No. 9,779,313, which is a continuation of U.S. patent application Ser. No. 14/102,981, filed Dec. 11, 2013, now U.S. Pat. No. 9,558,409, which claims the filing benefits of U.S. provisional application Ser. No. 61/736,104, filed Dec. 12, 2012, which is hereby incorporated herein by reference in its entirety. U.S. patent application Ser. No. 14/102,981 is also a continuation-in-part of U.S. patent application Ser. No. 14/036,723, filed Sep. 25, 2013, now U.S. Pat. No. 9,446,713, which claims the filing benefits of U.S. provisional applications, Ser. No. 61/868,843, filed Aug. 22, 2013; Ser. No. 61/834,128, filed Jun. 12, 2013; Ser. No. 61/758,537, filed Jan. 30, 2013; and Ser. No. 61/705,877, filed Sep. 26, 2012, which are hereby incorporated herein by reference in their entireties.

US Referenced Citations (515)
Number Name Date Kind
4200361 Malvano Apr 1980 A
4214266 Myers Jul 1980 A
4218698 Bart et al. Aug 1980 A
4236099 Rosenblum Nov 1980 A
4247870 Gabel et al. Jan 1981 A
4249160 Chilvers Feb 1981 A
4266856 Wainwright May 1981 A
4277804 Robison Jul 1981 A
4281898 Ochiai Aug 1981 A
4288814 Talley et al. Sep 1981 A
4355271 Noack Oct 1982 A
4357558 Massoni et al. Nov 1982 A
4381888 Momiyama May 1983 A
4420238 Felix Dec 1983 A
4431896 Lodetti Feb 1984 A
4443057 Bauer Apr 1984 A
4460831 Oettinger et al. Jul 1984 A
4481450 Watanabe et al. Nov 1984 A
4491390 Tong-Shen Jan 1985 A
4512637 Ballmer Apr 1985 A
4529275 Ballmer Jul 1985 A
4529873 Ballmer Jul 1985 A
4546551 Franks Oct 1985 A
4549208 Kamejima et al. Oct 1985 A
4571082 Downs Feb 1986 A
4572619 Reininger Feb 1986 A
4580875 Bechtel Apr 1986 A
4600913 Caine Jul 1986 A
4603946 Kato Aug 1986 A
4614415 Hyatt Sep 1986 A
4620141 McCumber et al. Oct 1986 A
4623222 Itoh Nov 1986 A
4626850 Chey Dec 1986 A
4629941 Ellis Dec 1986 A
4630109 Barton Dec 1986 A
4632509 Ohmi Dec 1986 A
4638287 Umebayashi et al. Jan 1987 A
4647161 Müller Mar 1987 A
4653316 Fukuhara Mar 1987 A
4669825 Itoh Jun 1987 A
4669826 Itoh Jun 1987 A
4671615 Fukada Jun 1987 A
4672457 Hyatt Jun 1987 A
4676601 Itoh Jun 1987 A
4690508 Jacob Sep 1987 A
4692798 Seko et al. Sep 1987 A
4697883 Suzuki Oct 1987 A
4701022 Jacob Oct 1987 A
4713685 Nishimura et al. Dec 1987 A
4717830 Botts Jan 1988 A
4727290 Smith Feb 1988 A
4731669 Hayashi et al. Mar 1988 A
4741603 Miyagi May 1988 A
4768135 Kretschmer et al. Aug 1988 A
4772942 Tuck Sep 1988 A
4789904 Peterson Dec 1988 A
4793690 Gahan Dec 1988 A
4817948 Simonelli Apr 1989 A
4820933 Hong Apr 1989 A
4825232 Howdle Apr 1989 A
4838650 Stewart Jun 1989 A
4847772 Michalopoulos et al. Jul 1989 A
4855822 Narendra et al. Aug 1989 A
4862037 Farber et al. Aug 1989 A
4867561 Fujii et al. Sep 1989 A
4871917 O'Farrell et al. Oct 1989 A
4872051 Dye Oct 1989 A
4881019 Shiraishi et al. Nov 1989 A
4882565 Gallmeyer Nov 1989 A
4886960 Molyneux Dec 1989 A
4891559 Matsumoto et al. Jan 1990 A
4892345 Rachael, III Jan 1990 A
4895790 Swanson et al. Jan 1990 A
4896030 Miyaji Jan 1990 A
4907870 Brucker Mar 1990 A
4910591 Petrossian et al. Mar 1990 A
4916374 Schierbeek Apr 1990 A
4917477 Bechtel et al. Apr 1990 A
4937796 Tendler Jun 1990 A
4953305 Van Lente et al. Sep 1990 A
4956591 Schierbeek Sep 1990 A
4961625 Wood et al. Oct 1990 A
4967319 Seko Oct 1990 A
4970653 Kenue Nov 1990 A
4971430 Lynas Nov 1990 A
4974078 Tsai Nov 1990 A
4987357 Masaki Jan 1991 A
4991054 Walters Feb 1991 A
5001558 Burley et al. Mar 1991 A
5003288 Wilhelm Mar 1991 A
5012082 Watanabe Apr 1991 A
5016977 Baude et al. May 1991 A
5027001 Torbert Jun 1991 A
5027200 Petrossian et al. Jun 1991 A
5044706 Chen Sep 1991 A
5055668 French Oct 1991 A
5059877 Teder Oct 1991 A
5064274 Alten Nov 1991 A
5072154 Chen Dec 1991 A
5086253 Lawler Feb 1992 A
5096287 Kakinami et al. Mar 1992 A
5097362 Lynas Mar 1992 A
5121200 Choi Jun 1992 A
5124549 Michaels et al. Jun 1992 A
5130709 Toyama et al. Jul 1992 A
5148014 Lynam Sep 1992 A
5168378 Black Dec 1992 A
5170374 Shimohigashi et al. Dec 1992 A
5172235 Wilm et al. Dec 1992 A
5177685 Davis et al. Jan 1993 A
5182502 Slotkowski et al. Jan 1993 A
5184956 Langlais et al. Feb 1993 A
5189561 Hong Feb 1993 A
5193000 Lipton et al. Mar 1993 A
5193029 Schofield Mar 1993 A
5204778 Bechtel Apr 1993 A
5208701 Maeda May 1993 A
5245422 Borcherts et al. Sep 1993 A
5253109 O'Farrell Oct 1993 A
5276389 Levers Jan 1994 A
5285060 Larson et al. Feb 1994 A
5289182 Brillard et al. Feb 1994 A
5289321 Secor Feb 1994 A
5305012 Faris Apr 1994 A
5307136 Saneyoshi et al. Apr 1994 A
5309137 Kajiwara May 1994 A
5313072 Vachss May 1994 A
5325096 Pakett Jun 1994 A
5325386 Jewell et al. Jun 1994 A
5329206 Slotkowski et al. Jul 1994 A
5331312 Kudoh Jul 1994 A
5336980 Levers Aug 1994 A
5341437 Nakayama Aug 1994 A
5351044 Mathur et al. Sep 1994 A
5355118 Fukuhara Oct 1994 A
5374852 Parkes Dec 1994 A
5386285 Asayama Jan 1995 A
5394333 Kao Feb 1995 A
5406395 Wilson et al. Apr 1995 A
5410346 Saneyoshi et al. Apr 1995 A
5414257 Stanton May 1995 A
5414461 Kishi et al. May 1995 A
5416313 Larson et al. May 1995 A
5416318 Hegyi May 1995 A
5416478 Morinaga May 1995 A
5424952 Asayama Jun 1995 A
5426294 Kobayashi et al. Jun 1995 A
5430431 Nelson Jul 1995 A
5434407 Bauer et al. Jul 1995 A
5440428 Hegg et al. Aug 1995 A
5444478 Lelong et al. Aug 1995 A
5451822 Bechtel et al. Sep 1995 A
5457493 Leddy et al. Oct 1995 A
5461357 Yoshioka et al. Oct 1995 A
5461361 Moore Oct 1995 A
5469298 Suman et al. Nov 1995 A
5471515 Fossum et al. Nov 1995 A
5475494 Nishida et al. Dec 1995 A
5487116 Nakano et al. Jan 1996 A
5498866 Bendicks et al. Mar 1996 A
5500766 Stonecypher Mar 1996 A
5510983 Iino Apr 1996 A
5515448 Nishitani May 1996 A
5517419 Lanckton May 1996 A
5521633 Nakajima et al. May 1996 A
5528698 Kamei et al. Jun 1996 A
5529138 Shaw et al. Jun 1996 A
5530240 Larson et al. Jun 1996 A
5530420 Tsuchiya et al. Jun 1996 A
5535314 Alves et al. Jul 1996 A
5537003 Bechtel et al. Jul 1996 A
5539397 Asanuma et al. Jul 1996 A
5541590 Nishio Jul 1996 A
5550677 Schofield et al. Aug 1996 A
5555312 Shima et al. Sep 1996 A
5555555 Sato et al. Sep 1996 A
5568027 Teder Oct 1996 A
5574443 Hsieh Nov 1996 A
5581464 Woll et al. Dec 1996 A
5594222 Caldwell Jan 1997 A
5614788 Mullins Mar 1997 A
5619370 Guinosso Apr 1997 A
5634709 Iwama Jun 1997 A
5642299 Hardin et al. Jun 1997 A
5648835 Uzawa Jul 1997 A
5650764 McCullough Jul 1997 A
5650944 Kise Jul 1997 A
5660454 Mori et al. Aug 1997 A
5661303 Teder Aug 1997 A
5666028 Bechtel et al. Sep 1997 A
5668663 Varaprasad et al. Sep 1997 A
5670935 Schofield Sep 1997 A
5675489 Pomerleau Oct 1997 A
5677851 Kingdon et al. Oct 1997 A
5699044 Van Lente et al. Dec 1997 A
5724187 Varaprasad et al. Mar 1998 A
5724316 Brunts Mar 1998 A
5737226 Olson et al. Apr 1998 A
5757949 Kinoshita et al. May 1998 A
5760826 Nayer Jun 1998 A
5760828 Cortes Jun 1998 A
5760931 Saburi et al. Jun 1998 A
5760962 Schofield et al. Jun 1998 A
5761094 Olson et al. Jun 1998 A
5765116 Wilson-Jones et al. Jun 1998 A
5781437 Wiemer et al. Jul 1998 A
5786772 Schofield et al. Jul 1998 A
5790403 Nakayama Aug 1998 A
5790973 Blaker et al. Aug 1998 A
5793308 Rosinski et al. Aug 1998 A
5793420 Schmidt Aug 1998 A
5796094 Schofield et al. Aug 1998 A
5798575 O'Farrell et al. Aug 1998 A
5835255 Miles Nov 1998 A
5837994 Stam Nov 1998 A
5844505 Van Ryzin Dec 1998 A
5844682 Kiyomoto et al. Dec 1998 A
5845000 Breed et al. Dec 1998 A
5848802 Breed et al. Dec 1998 A
5850176 Kinoshita et al. Dec 1998 A
5850254 Takano et al. Dec 1998 A
5861814 Clayton Jan 1999 A
5867591 Onda Feb 1999 A
5877707 Kowalick Mar 1999 A
5877897 Schofield et al. Mar 1999 A
5878370 Olson Mar 1999 A
5883739 Ashihara et al. Mar 1999 A
5884212 Lion Mar 1999 A
5890021 Onoda Mar 1999 A
5896085 Mori et al. Apr 1999 A
5899956 Chan May 1999 A
5914815 Bos Jun 1999 A
5923027 Stam et al. Jul 1999 A
5929786 Schofield et al. Jul 1999 A
5940120 Frankhouse et al. Aug 1999 A
5949331 Schofield et al. Sep 1999 A
5956181 Lin Sep 1999 A
5959367 O'Farrell et al. Sep 1999 A
5959555 Furuta Sep 1999 A
5963247 Banitt Oct 1999 A
5964822 Alland et al. Oct 1999 A
5971552 O'Farrell et al. Oct 1999 A
5986796 Miles Nov 1999 A
5990469 Bechtel et al. Nov 1999 A
5990649 Nagao et al. Nov 1999 A
6001486 Varaprasad et al. Dec 1999 A
6009336 Harris et al. Dec 1999 A
6020704 Buschur Feb 2000 A
6049171 Stam Apr 2000 A
6066933 Ponziana May 2000 A
6084519 Coulling et al. Jul 2000 A
6087953 DeLine et al. Jul 2000 A
6097023 Schofield et al. Aug 2000 A
6097024 Stam et al. Aug 2000 A
6116743 Hoek Sep 2000 A
6124647 Marcus et al. Sep 2000 A
6124886 DeLine et al. Sep 2000 A
6139172 Bos et al. Oct 2000 A
6144022 Tenenbaum et al. Nov 2000 A
6172613 DeLine et al. Jan 2001 B1
6175164 O'Farrell et al. Jan 2001 B1
6175300 Kendrick Jan 2001 B1
6176505 Capik Jan 2001 B1
6198409 Schofield et al. Mar 2001 B1
6201642 Bos Mar 2001 B1
6222447 Schofield et al. Apr 2001 B1
6222460 DeLine et al. Apr 2001 B1
6243003 DeLine et al. Jun 2001 B1
6250148 Lynam Jun 2001 B1
6259412 Duroux Jul 2001 B1
6266082 Yonezawa et al. Jul 2001 B1
6266442 Laumeyer et al. Jul 2001 B1
6285393 Shimoura et al. Sep 2001 B1
6291906 Marcus et al. Sep 2001 B1
6294989 Schofield et al. Sep 2001 B1
6297781 Turnbull et al. Oct 2001 B1
6302545 Schofield et al. Oct 2001 B1
6310611 Caldwell Oct 2001 B1
6313454 Bos et al. Nov 2001 B1
6317057 Lee Nov 2001 B1
6320176 Schofield et al. Nov 2001 B1
6320282 Caldwell Nov 2001 B1
6326613 Heslin et al. Dec 2001 B1
6329925 Skiver et al. Dec 2001 B1
6333759 Mazzilli Dec 2001 B1
6341523 Lynam Jan 2002 B2
6353392 Schofield et al. Mar 2002 B1
6366213 DeLine et al. Apr 2002 B2
6370329 Teuchert Apr 2002 B1
6396397 Bos et al. May 2002 B1
6411204 Bloomfield et al. Jun 2002 B1
6411328 Franke et al. Jun 2002 B1
6420975 DeLine et al. Jul 2002 B1
6424273 Gutta et al. Jul 2002 B1
6428172 Hutzel et al. Aug 2002 B1
6430303 Naoi et al. Aug 2002 B1
6433676 DeLine et al. Aug 2002 B2
6433817 Guerra Aug 2002 B1
6442465 Breed et al. Aug 2002 B2
6477464 McCarthy et al. Nov 2002 B2
6480104 Wall et al. Nov 2002 B1
6483429 Yasui et al. Nov 2002 B1
6485155 Duroux et al. Nov 2002 B1
6497503 Dassanayake et al. Dec 2002 B1
6498620 Schofield et al. Dec 2002 B2
6513252 Schierbeek et al. Feb 2003 B1
6516664 Lynam Feb 2003 B2
6523964 Schofield et al. Feb 2003 B2
6534884 Marcus et al. Mar 2003 B2
6539306 Turnbull Mar 2003 B2
6547133 DeVries, Jr. et al. Apr 2003 B1
6553130 Lemelson et al. Apr 2003 B1
6559435 Schofield et al. May 2003 B2
6559761 Miller et al. May 2003 B1
6574033 Chui et al. Jun 2003 B1
6578017 Ebersole et al. Jun 2003 B1
6587573 Stam Jul 2003 B1
6589625 Kothari et al. Jul 2003 B1
6593565 Heslin et al. Jul 2003 B2
6594583 Ogura et al. Jul 2003 B2
6611202 Schofield et al. Aug 2003 B2
6611610 Stam Aug 2003 B1
6612394 Wessman Sep 2003 B2
6627918 Getz et al. Sep 2003 B2
6631994 Suzuki et al. Oct 2003 B2
6636258 Strumolo Oct 2003 B2
6648477 Hutzel et al. Nov 2003 B2
6650233 DeLine et al. Nov 2003 B2
6650455 Miles Nov 2003 B2
6672731 Schnell et al. Jan 2004 B2
6674562 Miles Jan 2004 B1
6678056 Downs Jan 2004 B2
6678614 McCarthy et al. Jan 2004 B2
6680792 Miles Jan 2004 B2
6690268 Schofield et al. Feb 2004 B2
6693524 Payne Feb 2004 B1
6700605 Toyoda et al. Mar 2004 B1
6703925 Steffel Mar 2004 B2
6704621 Stein et al. Mar 2004 B1
6710908 Miles et al. Mar 2004 B2
6711474 Treyz et al. Mar 2004 B1
6714331 Lewis et al. Mar 2004 B2
6717610 Bos et al. Apr 2004 B1
6721659 Stopczynski Apr 2004 B2
6735506 Breed et al. May 2004 B2
6741377 Miles May 2004 B2
6744353 Sjönell Jun 2004 B2
6757109 Bos Jun 2004 B2
6762867 Lippert et al. Jul 2004 B2
6794119 Miles Sep 2004 B2
6795221 Urey Sep 2004 B1
6801125 McGregor Oct 2004 B1
6802617 Schofield et al. Oct 2004 B2
6806452 Bos et al. Oct 2004 B2
6822563 Bos et al. Nov 2004 B2
6823241 Shirato et al. Nov 2004 B2
6824281 Schofield et al. Nov 2004 B2
6831261 Schofield et al. Dec 2004 B2
6847487 Burgner Jan 2005 B2
6882287 Schofield Apr 2005 B2
6889161 Winner et al. May 2005 B2
6891563 Schofield May 2005 B2
6909753 Meehan et al. Jun 2005 B2
6946978 Schofield Sep 2005 B2
6953253 Schofield et al. Oct 2005 B2
6956468 Lee Oct 2005 B2
6968736 Lynam Nov 2005 B2
6975775 Rykowski et al. Dec 2005 B2
7004593 Weller et al. Feb 2006 B2
7004606 Schofield Feb 2006 B2
7005974 McMahon et al. Feb 2006 B2
7006127 Mizusawa et al. Feb 2006 B2
7038577 Pawlicki et al. May 2006 B2
7046448 Burgner May 2006 B2
7062300 Kim Jun 2006 B1
7065432 Moisel et al. Jun 2006 B2
7085637 Breed et al. Aug 2006 B2
7092548 Laumeyer et al. Aug 2006 B2
7116246 Winter et al. Oct 2006 B2
7123168 Schofield Oct 2006 B2
7133661 Hatae et al. Nov 2006 B2
7149613 Stam et al. Dec 2006 B2
7158015 Rao et al. Jan 2007 B2
7167796 Taylor et al. Jan 2007 B2
7195381 Lynam et al. Mar 2007 B2
7202776 Breed Apr 2007 B2
7224324 Quist et al. May 2007 B2
7227459 Bos et al. Jun 2007 B2
7227611 Hull et al. Jun 2007 B2
7249860 Kulas et al. Jul 2007 B2
7253723 Lindahl et al. Aug 2007 B2
7255451 McCabe et al. Aug 2007 B2
7311406 Schofield et al. Dec 2007 B2
7325934 Schofield et al. Feb 2008 B2
7325935 Schofield et al. Feb 2008 B2
7338177 Lynam Mar 2008 B2
7339149 Schofield et al. Mar 2008 B1
7344261 Schofield et al. Mar 2008 B2
7360932 Uken et al. Apr 2008 B2
7370983 DeWind et al. May 2008 B2
7375803 Bamji May 2008 B1
7380948 Schofield et al. Jun 2008 B2
7388182 Schofield et al. Jun 2008 B2
7402786 Schofield et al. Jul 2008 B2
7423248 Schofield et al. Sep 2008 B2
7423821 Bechtel et al. Sep 2008 B2
7425076 Schofield et al. Sep 2008 B2
7425889 Widmann Sep 2008 B2
7459664 Schofield et al. Dec 2008 B2
7483058 Frank et al. Jan 2009 B1
7526103 Schofield et al. Apr 2009 B2
7541743 Salmeen et al. Jun 2009 B2
7561181 Schofield et al. Jul 2009 B2
7565006 Stam et al. Jul 2009 B2
7616781 Schofield et al. Nov 2009 B2
7619508 Lynam et al. Nov 2009 B2
7633383 Dunsmoir et al. Dec 2009 B2
7639149 Katoh Dec 2009 B2
7676087 Dhua et al. Mar 2010 B2
7690737 Lu Apr 2010 B2
7720580 Higgins-Luthman May 2010 B2
7792329 Schofield et al. Sep 2010 B2
7843451 Lafon Nov 2010 B2
7855778 Yung et al. Dec 2010 B2
7859565 Schofield et al. Dec 2010 B2
7881496 Camilleri Feb 2011 B2
7914187 Higgins-Luthman et al. Mar 2011 B2
7930160 Hosagrahara et al. Apr 2011 B1
8010252 Getman Aug 2011 B2
8017898 Lu et al. Sep 2011 B2
8038166 Piesinger Oct 2011 B1
8063752 Oleg Nov 2011 B2
8094170 Kato et al. Jan 2012 B2
8095310 Taylor et al. Jan 2012 B2
8098142 Schofield et al. Jan 2012 B2
8164628 Stein et al. Apr 2012 B2
8218007 Lee et al. Jul 2012 B2
8224031 Saito Jul 2012 B2
8260518 Englert Sep 2012 B2
8411998 Huggett et al. Apr 2013 B2
8755984 Rupp Jun 2014 B2
8838353 Wu Sep 2014 B2
8909426 Rhode Dec 2014 B2
9085261 Lu Jul 2015 B2
9102272 Trombley Aug 2015 B2
9126525 Lynam Sep 2015 B2
9156496 Greenwood Oct 2015 B2
9233710 Lavoie Jan 2016 B2
9264672 Lynam Feb 2016 B2
9283892 Trombley Mar 2016 B2
9315212 Kyrtsos Apr 2016 B1
9335162 Kyrtsos May 2016 B2
9342747 Kuehnle May 2016 B2
9446713 Lu Sep 2016 B2
9555803 Pawlicki Jan 2017 B2
9558409 Pliefke et al. Jan 2017 B2
9607242 Lavoie Mar 2017 B2
9610975 Hu Apr 2017 B1
9779313 Pliefke et al. Oct 2017 B2
20010001563 Tomaszewski May 2001 A1
20020113873 Williams Aug 2002 A1
20020145662 Mizusawa Oct 2002 A1
20020145663 Mizusawa Oct 2002 A1
20020149673 Hirama Oct 2002 A1
20030133014 Mendoza Jul 2003 A1
20030137586 Lewellen Jul 2003 A1
20030222982 Hamdan et al. Dec 2003 A1
20030234512 Holub Dec 2003 A1
20040130441 Lee Jul 2004 A1
20050000738 Gehring Jan 2005 A1
20050074143 Kawai Apr 2005 A1
20050206225 Offerle Sep 2005 A1
20050219852 Stam et al. Oct 2005 A1
20050236894 Lu Oct 2005 A1
20050236896 Offerle Oct 2005 A1
20050237385 Kosaka Oct 2005 A1
20060018511 Stam et al. Jan 2006 A1
20060018512 Stam et al. Jan 2006 A1
20060050018 Hutzel Mar 2006 A1
20060091813 Stam et al. May 2006 A1
20060098094 Lott May 2006 A1
20060103727 Tseng May 2006 A1
20060152351 Daura Luna Jul 2006 A1
20060244579 Raab Nov 2006 A1
20060250501 Wildmann et al. Nov 2006 A1
20070104476 Yasutomi et al. May 2007 A1
20070109406 Schofield et al. May 2007 A1
20070120657 Schofield et al. May 2007 A1
20070242339 Bradley Oct 2007 A1
20080147321 Howard et al. Jun 2008 A1
20080158357 Connell Jul 2008 A1
20080192132 Bechtel et al. Aug 2008 A1
20080231701 Greenwood Sep 2008 A1
20090005932 Lee Jan 2009 A1
20090045924 Roberts, Sr. Feb 2009 A1
20090113509 Tseng et al. Apr 2009 A1
20090143967 Lee et al. Jun 2009 A1
20090160987 Bechtel et al. Jun 2009 A1
20090190015 Bechtel et al. Jul 2009 A1
20090256938 Bechtel et al. Oct 2009 A1
20100097519 Byrne Apr 2010 A1
20110050903 Vorobiev Mar 2011 A1
20120045112 Lundblad Feb 2012 A1
20120200706 Greenwood Aug 2012 A1
20120265416 Lu Oct 2012 A1
20140085472 Lu et al. Mar 2014 A1
20140160276 Pliefke Jun 2014 A1
20140172232 Rupp Jun 2014 A1
20140200759 Lu Jul 2014 A1
20140218506 Trombley Aug 2014 A1
20140249691 Hafner Sep 2014 A1
20140277942 Kyrtsos Sep 2014 A1
20140303849 Hafner Oct 2014 A1
20140343793 Lavoie et al. Nov 2014 A1
20150002670 Bajpai Jan 2015 A1
Foreign Referenced Citations (13)
Number Date Country
102009046676 May 2011 DE
59114139 Jul 1984 JP
6080953 May 1985 JP
6414700 Jan 1989 JP
4114587 Apr 1992 JP
05050883 Mar 1993 JP
6227318 Aug 1994 JP
0769125 Mar 1995 JP
07105496 Apr 1995 JP
2630604 Jul 1997 JP
2003083742 Mar 2003 JP
WO2011014497 Feb 2011 WO
WO2012103193 Aug 2012 WO
Non-Patent Literature Citations (8)
Entry
Borenstein et al., “Where am I? Sensors and Method for Mobile Robot Positioning”, University of Michigan, Apr. 1996, pp. 2, 125-128.
Bow, Sing T., “Pattern Recognition and Image Preprocessing (Signal Processing and Communications)”, CRC Press, Jan. 15, 2002, pp. 557-559.
Vlacic et al., (Eds), “Intelligent Vehicle Technologies, Theory and Applications”, Society of Automotive Engineers Inc., edited by SAE International, 2001.
Van Leuven et al., “Real-Time Vehicle Tracking in Image Sequences”, IEEE, US, vol. 3, May 21, 2001, pp. 2049-2054, XP010547308.
Van Leeuwen et al., “Requirements for Motion Estimation in Image Sequences for Traffic Applications”, IEEE, US, vol. 1, May 24, 1999, pp. 145-150, XP010340272.
Van Leeuwen et al., “Motion Estimation with a Mobile Camera for Traffic Applications”, IEEE, US, vol. 1, Oct. 3, 2000, pp. 58-63.
Van Leeuwen et al., “Motion Interpretation for In-Car Vision Systems”, IEEE, US, vol. 1, Sep. 30, 2002, pp. 135-140.
Pratt, “Digital Image Processing, Passage—ED.3”, John Wiley & Sons, US, Jan. 1, 2001, pp. 657-659, XP002529771.
Related Publications (1)
Number Date Country
20180025237 A1 Jan 2018 US
Provisional Applications (5)
Number Date Country
61736104 Dec 2012 US
61868843 Aug 2013 US
61834128 Jun 2013 US
61758537 Jan 2013 US
61705877 Sep 2012 US
Continuations (2)
Number Date Country
Parent 15413464 Jan 2017 US
Child 15722150 US
Parent 14102981 Dec 2013 US
Child 15413464 US
Continuation in Parts (1)
Number Date Country
Parent 14036723 Sep 2013 US
Child 14102981 US