Trailering assist system with trailer angle detection

Information

  • Patent Grant
  • Patent Number
    11,820,424
  • Date Filed
    Monday, December 7, 2020
  • Date Issued
    Tuesday, November 21, 2023
Abstract
A method for backing up a trailer hitched to a vehicle includes mounting a rear backup camera at a vehicle and hitching a trailer at the hitch of the vehicle. Responsive to processing captured image data by a processor of the control, at least a portion of the trailer tongue of the trailer is detected using an edge detection algorithm. Responsive to processing at the control of image data captured by the rear backup camera during a backing up maneuver of the trailer hitched to the vehicle, trailer angle of the trailer relative to a longitudinal axis of the vehicle is determined. The control, based at least in part on trailer angle of the trailer, controls a steering system of the vehicle during the backing up maneuver of the trailer hitched to the vehicle to back up the trailer hitched to the vehicle along a rearward trajectory.
Description
FIELD OF THE INVENTION

The present invention relates generally to rear vision systems for vehicles and, more particularly, to rear vision systems having a rearward facing camera at a rear portion of a vehicle.


BACKGROUND OF THE INVENTION

Rear backup cameras and vision systems are known for use in vehicles. Examples of such systems are described in U.S. Pat. Nos. 7,859,565; 6,611,202; 6,222,447; 5,949,331; 5,670,935 and/or 5,550,677, which are hereby incorporated herein by reference in their entireties. Such systems may display images for viewing by the driver of the vehicle that provide a view rearward of the vehicle.


SUMMARY OF THE INVENTION

The present invention provides a means for detecting the angle of a trailer being pulled behind a vehicle by using a rear view camera or multi-camera surround view system or the like. A camera is mounted at a rear end or rear portion of the pulling vehicle, with the camera having a rearward field of view (such as a wide angle rearward field of view) rearward of the vehicle. A processor, such as a digital processor or FPGA or digital signal processor (DSP) or ASIC or camera imager SOC or other suitable processing means or the like, may process the images or image data (as captured by the rearward facing camera) of the trailer being pulled or towed by the vehicle and may determine the angle of the trailer in relation to the pulling vehicle in real time. Optionally, the control or system may, responsive to processing of the captured images, generate an alert to the driver of the vehicle and/or control or operate one or more accessories or systems of the trailer or vehicle (such as a brake system or steering system or display system or the like), such as in response to the determination of the angle of the trailer.


These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a perspective view of a trailer attached to a vehicle equipped with a rear vision system in accordance with the present invention;



FIG. 2 is a perspective view of another trailer attached to a vehicle equipped with a rear vision system in accordance with the present invention;



FIG. 3 is a perspective view of another trailer attached to a vehicle equipped with a rear vision system in accordance with the present invention;



FIG. 4 is a graphical depiction of an output of a histogram algorithm for detecting a vertical portion of the cross-shaped target on the trailer of FIG. 3;



FIG. 5 is a graphical depiction of an output of the histogram algorithm for detecting a horizontal portion of the cross-shaped target on the trailer of FIG. 3;



FIG. 6 is a perspective view of another trailer attached to a vehicle equipped with a rear vision system in accordance with the present invention, showing colored patterns at the tongue of the trailer for detection by the rear vision system;



FIG. 7 is another perspective view of the trailer of FIG. 6, showing detection of the colored patterns at the tongue of the trailer;



FIG. 8 is a perspective view of an image captured of another trailer attached to a vehicle equipped with a rear vision system in accordance with the present invention, with a feature extraction and matching algorithm running, and with each feature point having a descriptor that can be matched from frame to frame;



FIGS. 9A and 9B illustrate a hitch ball detection algorithm, with FIG. 9A showing a raw image and FIG. 9B showing an image after segmenting the image into similar regions in accordance with the present invention;



FIG. 10 is an image showing the result of detecting edges of the segmented image of FIG. 9B in accordance with the present invention; and



FIG. 11 shows an image that represents the final result of the hitch ball detection algorithm of the present invention.





DESCRIPTION OF THE PREFERRED EMBODIMENTS

Referring now to the drawings and the illustrative embodiments depicted therein, a rear vision system for a vehicle is operable to detect the angle of a trailer 10 that is pulled behind a vehicle 12 by using a rear view camera or a multi-camera surround view system. A camera is mounted at the rear end portion of the pulling vehicle 12. An image processor (such as a digital processor or FPGA or DSP or ASIC or camera imager SOC or other suitable processor or processing means) is operable to process the images of the trailer and determine the angle of the trailer in relation to the pulling vehicle in real time.


The detection of the trailer angle relative to the vehicle is accomplished by detecting a portion of the trailer and determining the location of the detected portion relative to the towing vehicle, whereby the angle can be determined or calculated based on known geometries of the trailer and vehicle and the location of the camera on the vehicle. For example, the system may operate to track and determine a sideward movement of a trailer portion or target and, utilizing the known geometries, such as the distance of the trailer portion or target from the camera and/or the distance of the trailer portion or target from the pivot point or joint at which the trailer tongue is attached to the trailer hitch of the vehicle, determine the angular movement of the trailer about the trailer hitch and relative to the vehicle, such as to determine a sway or swing of the trailer relative to the towing vehicle or to determine a rearward trajectory or path of travel of the trailer during a reversing maneuver of the vehicle and trailer, or the like.
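The geometric relation described above can be sketched in code. This is a minimal illustration (not the patent's formulas), assuming the tracked trailer point lies a known, calibrated distance behind the hitch pivot:

```python
import math

def trailer_angle_deg(lateral_offset_m, target_to_pivot_m):
    """Trailer angle about the hitch, relative to the vehicle's
    longitudinal axis. lateral_offset_m is the sideways displacement of
    the tracked trailer point or target; target_to_pivot_m is its known
    distance from the pivot joint at the hitch."""
    return math.degrees(math.atan2(lateral_offset_m, target_to_pivot_m))
```

For example, a tracked point 2 m behind the pivot that has moved sideways by 0.35 m implies a trailer angle of roughly 10 degrees.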


The detection of the trailer portion or target can be done with an added target 14 on the trailer 10 or without an added target on the trailer (whereby the camera and processor may operate to detect a particular known or selected portion of the trailer). If a target or icon or indicia or the like is added to the trailer, the added target may, for example, be printed on a paper sheet, a plastic sheet or a reflective sheet or the like, which may be adhered to or otherwise disposed at or attached to a forward portion of the trailer, or optionally, the target may be otherwise established at or on the trailer, such as, for example, by painting or etching or otherwise establishing a target or icon or the like on a wall or panel or other forward portion of the trailer (such as a portion or structure of the tongue or frame of the trailer) or the like.


An example of using an added target on a trailer is shown in FIG. 1, where a target 14 (such as a cross-shaped figure as illustrated) is added to a forward wall 10a of a trailer 10. By processing the images captured by the rearward facing camera and recognizing or identifying the target in the captured image and finding or determining the coordinates of the target in a captured image, the trailer angle relative to the vehicle can be determined. By repeating the above steps on multiple frames of the continuous video images, the change of trailer angle can be tracked continuously and in real time.


The target 14 may comprise any suitable shape or icon or indicia or the like. For example, the target may be in the shape of a cross, a triangle, a circle, or any other shape or shapes, or multiples of any suitable shape or shapes, or any suitable combination of shapes (preferably having sharp lines or structure that can be readily detected and identified via image processing). The target can be mounted on a wall or a surface or a structure of the trailer that is near or facing the camera at the rear end of the pulling vehicle. Optionally, the target can be mounted at or established on the tongue 10b of the trailer, which is the structure that connects the trailer to the trailer hitch 12a of the towing or pulling vehicle 12.


The target may be detected and recognized via any suitable image processing and algorithms. For example, suitable or preferred target detecting algorithms include a regressive 2-D histogram that searches for and detects the center of a pattern (such as a pattern like the cross or intersecting lines in FIG. 1). Another suitable type of algorithm is an edge detection algorithm. Optionally, and desirably, a target with designed high contrast edges may be used. With such an application, the processor can detect and track the trailer angle at the rear of the vehicle by detecting and locating and tracking or monitoring the coordinates of these signature edges. Optionally, the edge detection function or algorithm may comprise a Sobel gradient edge detection algorithm or other edge detection algorithms commercially available, and/or edge detection algorithms of the types described in U.S. Pat. Nos. 7,720,580; 7,038,577; 6,353,392 and/or 6,313,454, which are hereby incorporated herein by reference in their entireties. Another suitable type of algorithm is image pattern matching, which will be described in detail below.
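As an illustration of the Sobel gradient approach mentioned above (a self-contained sketch, not the patented implementation, operating on a 2-D list of grayscale values):

```python
def sobel_magnitude(img):
    """Apply the 3x3 Sobel kernels to a grayscale image (2-D list of
    numbers) and return the gradient magnitude image. Border pixels are
    left at zero. High-contrast target edges show up as large values."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # Horizontal gradient (responds to vertical edges)
            gx = (img[y-1][x+1] + 2*img[y][x+1] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y][x-1] - img[y+1][x-1])
            # Vertical gradient (responds to horizontal edges)
            gy = (img[y+1][x-1] + 2*img[y+1][x] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y-1][x] - img[y-1][x+1])
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out
```

Thresholding the magnitude image then yields the "signature edges" whose coordinates can be tracked from frame to frame.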


In the case of detection without an added target, a part or structure of the trailer may be identified and used as the “target” for image processing. Because an added target (such as described above) can be worn off, blown away by wind, or soiled by dirt, the addition of a separate target may affect or reduce the trailer angle detection accuracy in certain situations. Thus, a trailer angle detection system that does not include an add-on target on the trailer may be preferred. An example of such a trailer angle detection and tracking system is shown in FIG. 2, where a part or portion or region or structure of the trailer tongue 10b serves as the “target” and is used as an image pattern or pattern matching template. As shown in FIG. 2, a small triangle structure on the tongue (shown as outlined by a polygon at 16) may be used as the pattern or structure that is to be matched to the template. An image processing algorithm, such as an image pattern matching algorithm running on the digital processor, finds and matches the template with the structure in the captured image. If the pattern is matched, the matched pattern's coordinates in the image and the rotation angle of the matched pattern are determined by the algorithm. The trailer angle can then be calculated from these values. By repeated performance of the above mentioned pattern matching on the continuous or sequential or subsequent frames from the camera video output, the trailer angle can be continuously tracked and/or monitored.
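A minimal form of such template matching is an exhaustive position search; the sketch below scores candidates by sum of absolute differences (a simplification on my part — practical systems typically use normalized cross-correlation and also search over rotation, as described later in this document):

```python
def match_template(img, tpl):
    """Slide tpl over img (both 2-D lists of grayscale values) and
    return the (row, col) of the best match, scored by sum of absolute
    differences (lower is better)."""
    th, tw = len(tpl), len(tpl[0])
    best_pos, best_sad = None, float("inf")
    for y in range(len(img) - th + 1):
        for x in range(len(img[0]) - tw + 1):
            sad = sum(abs(img[y + j][x + i] - tpl[j][i])
                      for j in range(th) for i in range(tw))
            if sad < best_sad:
                best_pos, best_sad = (y, x), sad
    return best_pos
```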


The coordinates of the detected target in a captured image can be further transformed to the angle of trailer by applying a set of formulas. The formula can be implemented in the processor in the form of a set of arithmetic formulas, or may be implemented in the form of a look up table or tables. The formula is formed and determined by the dimensional characteristics of the trailer, the distance between the trailer body (or the location of the target or detected pattern or portion of the trailer) and the pulling vehicle (and/or the distance between the target and the pivoting joint at which the trailer is attached to the vehicle), the camera mounting position and angle, and the camera lens distortion characteristics.
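A look-up-table form of that transformation might look like the sketch below. The calibration pairs are purely illustrative; in practice they would be derived from the trailer geometry, the camera mounting position and angle, and the lens distortion characteristics mentioned above:

```python
def angle_from_pixel(px, table):
    """table: calibration pairs (pixel x-coordinate of the detected
    target, trailer angle in degrees), sorted by pixel coordinate.
    Linearly interpolates between entries; clamps outside the range."""
    if px <= table[0][0]:
        return table[0][1]
    if px >= table[-1][0]:
        return table[-1][1]
    for (x0, a0), (x1, a1) in zip(table, table[1:]):
        if x0 <= px <= x1:
            t = (px - x0) / (x1 - x0)
            return a0 + t * (a1 - a0)

# Illustrative calibration for a hypothetical 640-pixel-wide image:
CAL = [(100, -30.0), (320, 0.0), (540, 30.0)]
```

A denser table (or an arithmetic formula fitted to it) would capture the non-linearity introduced by wide-angle lens distortion.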


Applications of Trailer Detection:


The rear view camera-based trailer angle detection can be used in, but is not limited to, several applications, including a trailer sway detection system (that detects a sway or swing of the trailer while the vehicle is towing the trailer in a forward direction along a road or highway), a rearward backup assist system (that detects the angle of the trailer and determines a reversing path or trajectory of the trailer during a reversing maneuver of the vehicle and trailer), and a trailer hitching system (that detects the trailer and guides the driver during backing up of the vehicle towards a trailer so that the trailer hitch of the vehicle is generally aligned with the tongue of the trailer).


Trailer Sway Detection and Alert/Compensation System. When a trailer is pulled behind a vehicle at a relatively high speed, a lateral swing or sway of the trailer can cause instability of the trailer and its pulling vehicle. By detecting the trailer angle in real time, the system of the present invention can detect the onset of lateral swing or sway of the trailer and may, responsive to such a detection, alert the driver of the swing or sway or control one or more vehicle or trailer systems or accessories to compensate for the detected swing or sway of the trailer. For example, the system may, responsive to a detection of a threshold degree of lateral swing or sway of the trailer relative to the vehicle, be operable to generate an alert signal to the driver of the vehicle (such as an audible alert or visual alert or haptic alert or the like) to alert the driver of a potentially hazardous situation. Optionally, the system may control the brakes and/or steering of the vehicle and/or trailer to control the vehicle and trailer, such as in response to a detection of a threshold degree of sway or swing of the trailer relative to the vehicle. For example, the system may provide a closed loop control of the trailer angle by using individual braking of the pulling vehicle wheels and/or the trailer wheels to control or adjust or correct for the trailer swing or sway. Optionally, a steering wheel angle control (that may control or adjust the steering angle of the vehicle's wheels) or the like can also be part of the closed loop control of trailer sway.


The trailer angle detection based on real time target or target-less image processing and/or algorithms can provide high speed and real time reading of the trailer angle of the trailer being towed by the pulling or towing vehicle. This reading can be used in real time trailer sway control. Optionally, the threshold level or degree of sway or swing of the trailer relative to the vehicle may be selected or preset, or may be dynamic, whereby the threshold degree may vary responsive to the speed of the vehicle and/or load of the trailer and/or the like. Optionally, and desirably, the system may only generate the alert and/or control the vehicle/trailer system or systems responsive to the detected swing or sway reaching or exceeding the threshold level and while the vehicle is traveling forwardly along the road.
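One crude realization of such threshold logic is sketched below; the threshold values and the peak-to-peak criterion are my own illustrative choices, not values from the patent:

```python
def sway_threshold_deg(speed_mps):
    """Illustrative dynamic threshold: tighter tolerance at highway
    speed, more permissive at low speed (values are assumptions)."""
    return 10.0 if speed_mps < 15.0 else 4.0

def sway_alert(recent_angles_deg, speed_mps, moving_forward=True):
    """Flag sway when the peak-to-peak swing of recent trailer-angle
    readings exceeds the speed-dependent threshold, and only while the
    vehicle is traveling forward (per the condition described above)."""
    if not moving_forward:
        return False
    swing = max(recent_angles_deg) - min(recent_angles_deg)
    return swing > sway_threshold_deg(speed_mps)
```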


Projection of Trailer Position During Trailer Backing Up. The normal view of a backup camera on a trailer pulling vehicle is typically blocked by the trailer, and thus such a backup camera cannot provide visual backup assistance to the driver when the trailer is attached to the vehicle. However, the camera system of the present invention is operable to detect the angle of the trailer axis with respect to the pulling vehicle, and with the knowledge of the trailer dimensional characteristics (such as wheel position and distance from the vehicle and the like), the processor can calculate and project a trajectory or reversing path of the trailer in the form of graphic overlay on the camera display or video display (typically disposed in the vehicle, such as at or in or near an interior rearview mirror assembly of the vehicle) to indicate to the driver viewing the video display a path or trajectory of where the trailer is backing up to. In addition, when the trailer pulling or towing vehicle is equipped with side view cameras, the added views provided by the side cameras (typically having fields of view directed generally rearwardly and sidewardly with respect to the direction of forward travel of the equipped vehicle) can provide additional scene information of the trailer to assist the driver of the vehicle (viewing the images at a video display of the vehicle) during a reversing or backing up maneuver. The calculated graphical trailer path can be overlaid to the side camera image to further assist the driver of the vehicle during a reversing or backing up maneuver.
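The trajectory projection can be illustrated with a simplified kinematic sketch. The model below is my own simplification (straight-line reversing, a single hitch-to-axle length L standing in for the trailer dimensional characteristics), not the patent's computation:

```python
import math

def project_trailer_path(theta0_deg, hitch_to_axle_m, speed_mps,
                         dt=0.1, steps=30):
    """While the vehicle backs straight up, the trailer angle theta
    approximately obeys d(theta)/dt = (v / L) * sin(theta), so any
    initial angle grows (the jackknife tendency) unless steering
    corrects it. Returns projected trailer-axle positions for a graphic
    overlay, in a frame where the vehicle axis is +x and the hitch
    starts at the origin."""
    theta = math.radians(theta0_deg)
    hitch_x = 0.0
    path = []
    for _ in range(steps):
        hitch_x -= speed_mps * dt                 # hitch moves rearward
        theta += (speed_mps / hitch_to_axle_m) * math.sin(theta) * dt
        path.append((hitch_x - hitch_to_axle_m * math.cos(theta),
                     -hitch_to_axle_m * math.sin(theta)))
    return path
```

The resulting point list would be distortion-corrected and drawn as the overlay curve on the video display.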


Optionally, the system may provide an alert (such as an audible alert or visual alert or haptic alert or the like) to alert the driver of a potentially hazardous situation during the reversing maneuver, such as responsive to detection of an object rearward of the trailer and in the path of the trailer (such as via processing of images captured by sideview cameras of the towing vehicle and/or processing of images captured by a rearward viewing camera at the rear of the trailer or the like). The alert may comprise any suitable alert, such as an alarm or tone or audible alert or a visual alert such as a highlighting of the displayed video images or the like in response to a detection of an object rearward of or at or near the rearward path of the trailer. Optionally, the system may control the brakes of the vehicle and/or trailer to slow or stop rearward movement of the vehicle and trailer in response to detection of an object in the rearward path of travel of the trailer and a determination that a collision may occur between the trailer and object.


Trailer Hitching. Backing up a vehicle to hitch a trailer is not always an intuitive process. If the position of the trailer hitching part is detected by identifying the tongue of the trailer that is to be attached to the vehicle, the processor can calculate a trajectory of the vehicle's hitch and guide the driver to turn the steering wheel of the vehicle and follow the trajectory to back the vehicle up to and in alignment with the trailer tongue for hitching the trailer to the vehicle. It is also envisioned that the control system may automatically turn the steering wheel of the vehicle to follow the calculated trajectory to position the vehicle's hitch at the trailer tongue for hitching the trailer to the vehicle. During the backing up process, a real time detection and tracking of a target at or on the trailer provides feedback and adjustment to the turning or control of the steering wheel of the vehicle.


Thus, the present invention provides a trailer monitoring system that may detect the angle of a trailer being towed by a vehicle relative to a longitudinal axis of the towing vehicle. The trailer angle detection and monitoring system thus can detect and monitor the angle of the trailer relative to the vehicle while the trailer is being towed along a road or highway, and may be operable to adjust or control one or more systems of the vehicle and/or trailer (such as a brake system of the vehicle and/or trailer and/or a steering system of the vehicle or the like) to correct for or adjust responsive to a detected sway or swing of the trailer during forward towing of the trailer. Optionally, the trailer angle detection system may assist the driver in backing up the trailer, such as via providing a graphic overlay at a video display of the vehicle, so as to guide the driver during a reversing maneuver with the trailer rearward of the vehicle. Optionally, the trailer angle detection system may assist the driver in backing up to an unhitched trailer to assist the driver in aligning the vehicle hitch with the tongue of the trailer.


Cross Shaped Target Histogram Algorithm:


As discussed above, a pattern searching image algorithm based on image histogram can be used to detect the target. A cross-shaped target (such as a cross-shaped target 114 at the trailer 110 of FIG. 3) may be chosen as one type of the target for the histogram algorithm. The target can be a white cross on a black background (such as shown in FIG. 3) or a black cross on a white background (such as shown in FIGS. 1 and 2). Optionally, a colored cross shape pattern may also be used, but it is preferred that the colors be selected so that the cross and the background have good contrast in the captured images.


An initial pattern search may be conducted in the image to roughly locate the cross shaped target. This may be achieved by a similar pattern matching that involves a pre-recorded pattern template. After the rough location of the center of the cross is located, a smaller window is drawn around the cross center. The window may be, for example, about 32 by 32 pixels in size. The histogram is separately done in rows (vertical histogram) and columns (horizontal histogram). Each row and column is one pixel wide. The pixel brightness is summed in each row and column. The row and column with maximum summed values represent or are indicative of the center of the target or cross. The center of the target or the pixel coordinate of the center of the cross is used to represent the trailer angle. The histogram computation of the center of the target may be performed on every frame that the rear view camera generates. The target center coordinate may be used as the center of the histogram window for the next frame. This provides a good tracking of the target when the trailer turns and the target moves in the image. A graphical depiction of the histogram is shown in FIGS. 4 and 5, with FIG. 4 showing a vertical histogram and FIG. 5 showing a horizontal histogram.
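The row/column histogram step can be sketched as follows, assuming a bright cross on a dark background inside the search window:

```python
def cross_center(win):
    """win: 2-D grayscale search window (e.g. 32x32) drawn around the
    rough cross location. The row and the column with the maximum
    summed brightness intersect at the cross center."""
    row_sums = [sum(row) for row in win]
    col_sums = [sum(col) for col in zip(*win)]
    return (row_sums.index(max(row_sums)),
            col_sums.index(max(col_sums)))
```

For a dark cross on a bright background, the same search would use the minimum sums instead.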


Sub-Pixel Resolution:


In order to reach a higher or enhanced accuracy for a trailer angle reading from the cross-shaped target, the sub-pixel resolution of the target location may be utilized. In one example of such sub-pixel searching, once the histogram of the rows and columns in the search window produces the cross target center coordinates, a sub-pixel center is computed using the center of mass method. The center of mass computation is performed separately on columns and rows. In the column center of mass computation, the following steps are taken:

    • 1. The average column sum is first subtracted from each column;
    • 2. Treating the column sums as mass, the center of mass is computed by the following equation:








        Center of Mass = Σᵢ(pᵢ · sᵢ) / Σᵢ sᵢ ;






    • where pᵢ is the pixel location and sᵢ is the column sum; and

    • 3. The numerator and denominator are computed separately and converted from integer to floating point numbers, and the center of mass in column dimension coordinates is computed accordingly in the form of floating point number.





The center of mass in the row dimension is computed by the same method as the column center of mass. The new and higher resolution center of target coordinate is thus calculated. With this approach, about one degree of trailer angle resolution and accuracy may be achievable.
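The three steps above can be sketched as follows. Note that subtracting the average makes the signed sums total exactly zero, so this sketch additionally clamps negative values at zero (an assumption on my part) to keep the denominator meaningful:

```python
def subpixel_center(sums):
    """sums: per-column (or per-row) brightness sums from the histogram
    window. Returns the center-of-mass location as a floating point
    coordinate, or None if no mass remains after subtraction."""
    mean = sum(sums) / len(sums)                    # step 1: subtract average
    mass = [max(s - mean, 0.0) for s in sums]       # (clamped at zero)
    num = sum(p * m for p, m in enumerate(mass))    # step 2: sum of p_i * s_i
    den = sum(mass)                                 # ... over sum of s_i
    return num / den if den else None               # step 3: floating point
```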


Some variations of the above formulas and computations can also produce the same or similar results. For example, instead of subtracting the column average from each column, the minimum sum across all columns could be subtracted. Methods other than the center of mass method described above can also be used to reach sub-pixel resolution, in the same spirit of using a plurality of pixels to reach sub-pixel resolution.


Color Pattern Matching:


As described above, the system may utilize pattern matching of a feature on a trailer tongue to measure the trailer angle with a rear view camera of the trailer pulling vehicle. A pattern template may be pre-generated and stored in the system memory and may be searched and matched in the real time camera images. When a pattern is matched, the position and angle of the pattern are calculated. The position and angle information of the matched pattern in the image can be used to calculate the trailer angle. However, using a native feature on the trailer sometimes may not produce accurate enough position and angle information, because the feature to be matched does not always stand out distinctly from its surroundings, and therefore the pattern matching algorithm may produce inaccurate results. In order to address this potential issue, a simple and effective technique of color pattern matching is discussed below.


The color pattern matching technique of the present invention has the advantage of having one more matching criterion (color in addition to pattern) than the pattern matching technique described above, which relies on only monochrome geometric features and patterns. As a result, the matching reliability and accuracy of color pattern matching is improved and enhanced as compared to that of monochrome pattern matching. In the implementation of this invention, certain colors are preferable to others. For example, red and green color patterns are more readily discernible and processable as compared to, for example, blue, yellow and orange colors, since there are often dividing lines and signs with blue, yellow and orange colors on the road surface and in parking areas, yet there are relatively few red and green dividing lines and signs in those settings. In addition, a pattern with a mix of different colors can be used to further increase pattern matching accuracy.


Single Target:


In order to increase the pattern matching accuracy of the system, a simple color target may be disposed at or on the trailer tongue structure and the system may process the images to detect the orientation of the color target instead of using existing or native features or patterns on the trailer tongue. The color target may comprise a color tape applied to certain portions of the trailer or trailer tongue (such as colored duct tape or the like that one can easily purchase from a hardware store). Optionally, the color target may comprise a colored feature or structure that is designed in and embedded into the trailer tongue structure by the manufacturer of the trailer, or the color target may be provided as a special part supplied to the customers from automakers, such as the automaker of the towing vehicle that has the imaging system of the present invention installed therein. For example, one part of a trailer tongue that can be used to put a colored tape on may comprise the lever above the hitch ball that is used to lift and unlock the hitch, while another part of the trailer tongue that may have the tape applied thereto is the pulling arm or arms of the trailer. The tape may comprise any suitable colors, such as, for example, red or green, or other colors or a mix of colors. The tape preferably has a rectangular shape and may have, for example, a width of about one inch and a length of about six inches or longer. Other dimensional sizes may also be used in accordance with the present invention. Optionally, the rectangular target can also be a combination of two or more colors to increase pattern matching accuracy. The target may be attached to the trailer structure via any suitable attaching means, such as by gluing, bolting, printing or other means.


Multiple Targets:


Optionally, and with reference to FIGS. 6 and 7, multiple targets 214a, 214b, 214c can be used at or on the trailer 210 or trailer tongue 210b or trailer pulling arms 210c to increase pattern matching accuracy. When a trailer swings to a large angle, certain vertical parts of the trailer may partially or entirely block a target that is placed on one arm of the trailer. However, when multiple targets are used on both sides of the trailer or on both pulling arms 210c, at least one target is available and viewable at all trailer angles. For normal and small trailer angles, both targets 214b, 214c can be simultaneously used to increase accuracy. In certain unfavorable lighting conditions, like strong reflection of sun light or street light from a target to camera, the target may appear to lose some of its color due to washout effect. However, because of the nature of the narrow angular reflection of strong light from the sun or a street light, one target may be washed out but another target on a different pulling arm is most likely not washed out. This will limit or substantially preclude or avoid the loss of pattern tracking and detection accuracy in such unfavorable lighting conditions.


Matching a Rectangular Target:


A target 214a, 214b, 214c with an elongated rectangular shape may be selected for a pattern matching target for use with the system of the present invention. For example, a piece of tape of a selected color (such as, for example, green tape for targets 214a and 214c, and red tape for target 214b, or green or red tape may be used for all three targets, or other colored tape or other coatings or markers may be used for one or more of the targets) may be placed on a horizontal part of trailer tongue that is visible to the pulling vehicle's rear view camera. The tape target shows as a rectangle or trapezoidal color object (such as highlighted in FIG. 7) in the rear view camera's image. When the trailer turns to a different angle relative to pulling vehicle, the tape target turns in the camera image as well.


During such movements, the primary characteristics of each of the tape targets, such as shape and dimensional size, change very slightly because the rear view camera usually has a wide angle field of view and thus optical distortion of the field of view exists. However, one property of the target, the center line of the tape pattern, still reflects the angle of the tape in the captured images. In the pattern matching technique of the present invention, the algorithm rotates the pattern template with fine steps and finds the best match of the template to the target in the real time image. If the pattern matching template 215 (FIG. 7) is extracted from the image when the trailer is in its straight position, or zero trailer angle, the angle of template rotation is the angle of tape target rotation from its straight or non-angled orientation. The pattern matching computation can be realized in the form of cross-correlation of the pattern matching template and the image to be matched. Other types of algorithmic computation can also be used to realize the same result.
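The fine-step rotation search can be sketched with point sets standing in for full image cross-correlation (this simplification is mine; the patent describes correlation over pixel data):

```python
import math

def rotate(points, deg):
    """Rotate a set of (x, y) points about the origin and snap the
    result back to integer pixel coordinates."""
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    return {(round(x * c - y * s), round(x * s + y * c)) for x, y in points}

def best_rotation(template_pts, target_pts, step=0.5, span=45.0):
    """Rotate the zero-angle template in fine steps and return the
    angle whose rotated point set overlaps the detected target pixels
    best; that angle is the tape (and hence trailer) rotation."""
    best_angle, best_score = 0.0, -1
    n = int(round(2 * span / step))
    for k in range(n + 1):
        a = -span + k * step
        score = len(rotate(template_pts, a) & target_pts)
        if score > best_score:
            best_angle, best_score = a, score
    return best_angle
```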


Thus, for example, the system may process the captured images to determine the location of the colored targets at the trailer (where the target or targets may comprise the same color or different colors, such as one red target and two green targets or the like), and when the targets are detected, the system may calculate the angles of the targets and/or may match the detected targets to a template that is indicative of the angles of the targets, whereby the angle of the trailer relative to the vehicle may be determined by the system. Thus, the present invention provides a target pattern design (such as shape and color) at the trailer or trailer tongue that the imaging system can readily discern and process to determine the trailer angle relative to the vehicle. The advantage of this target pattern design includes the following:

    • 1. The target is relatively simple and inexpensive—one can buy colored duct or painting or masking tape from a hardware store, cut a piece six inches or longer, and stick it to the pulling arm of the trailer. Other types of off-the-shelf tapes or paints may be used. Such a colored target thus is not specially designed and made just for the trailer and imaging system, so the cost may be low.
    • 2. The processing is performed via a simpler algorithm—a pattern matching template 215 (FIG. 7) may be readily constructed as a rectangle with a colored rectangle shape in the middle. The simplicity of the pattern template reduces the template size in the system memory and the computation requirement of the pattern matching algorithm.
    • 3. Because of its rectangular nature, the angle of the matched pattern directly correlates to the angle of pulling arm in the camera image and to the trailer angle, thus reducing computational complexity and increasing trailer angle detection accuracy.


Color Image Components Used in Pattern Matching:


A color image captured by the camera or image sensor contains both color and brightness information, which sometimes are also called chrominance and luminance information. In color pattern matching algorithms such as are described herein, both chrominance and luminance information of images are used to increase the matching performance (such as the accuracy and speed of matching) of the system of the present invention.


As one type of representation, a color image can be represented by color hue (H), color saturation (S) and luminance (L). Hue indicates the color type of an object (green, blue or yellow), while saturation indicates the vividness of the color, and luminance indicates the brightness of the object in the image. One can extract one or more layers of the above color information from a color image and process them individually. Another type of color representation uses the red (R), green (G) and blue (B) components of the color image. The R, G, B color components are usually affected directly by a change in brightness, which is a common condition for vehicle systems running in real world conditions. In contrast, color hue is not affected by a change in image brightness. Therefore, in real world vehicle running conditions, one can reliably use color hue as a critical matching criterion. Color hue and saturation are also typically more sensitive than the RGB components at distinguishing small color differences. A small and subtle color difference between two adjacent objects, such as a real target and a nearby object of similar shape and color, can be better distinguished by color hue than by the RGB components. Color saturation and luminance may also play an important role in accurate color pattern matching. Optionally, the system may combine a green component of the RGB color space with components of the HSL color space to produce an optimized color pattern matching result. The following are several examples of different implementations of algorithms suitable for use in the trailer detecting system of the present invention.
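The brightness-invariance of hue noted above is easy to demonstrate with Python's standard colorsys module (the pixel values here are illustrative): halving every RGB component of a pixel, as a shadow might, changes R, G and B, but leaves the hue unchanged.

```python
import colorsys

def hue_of(r, g, b):
    """Hue component (0..1) of an RGB pixel with components in 0..1."""
    h, _l, _s = colorsys.rgb_to_hls(r, g, b)
    return h

# A green target pixel under full illumination...
bright = (0.2, 0.8, 0.2)
# ...and the same pixel in shadow (every RGB component halved).
shadow = tuple(c * 0.5 for c in bright)

# The hue is identical in both conditions, even though R, G, B all changed.
print(hue_of(*bright))
print(hue_of(*shadow))
```

This is why hue serves well as a matching criterion under the varying illumination of real world driving, while raw RGB components do not.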


Example #1

In this example, luminance and saturation are used as the primary components, while hue is the secondary component, for image processing and target matching. The pattern template is extracted from the luminance and saturation components of the target. Color hue is the secondary criterion, extracted from the match template (the color tape), that is used to narrow down or verify that a matched pattern is the true target. In processing, the luminance layer and the saturation layer of the image are extracted out of the original image and then summed together as a new image. The pattern matching of the L+S template in the new image is then performed. One or several potential matches may be reached with different scores, of which only one should be the real target. For each potential match, the color hue value is checked against the color hue value of the original color template. In other words, the color hue is used to narrow down the potential matches to a match with the real target. In situations where only one target is matched to the L+S template, the hue is applied to check the confidence level of the match.


Example #2

In this example, color saturation is used as the primary component, while color hue is the secondary component, for image processing and target matching. The pattern template is extracted from the saturation (S) component of the target. The color hue (H) is the secondary criterion, extracted from the match template (the color tape), that is used to narrow down or verify that a matched pattern is the true target. In processing, the saturation layer of the image is extracted out of the original image as a new image. The pattern matching of the S template in the new image is then performed. One or several potential matches may be reached with different scores, of which only one should be the real target. For each potential match, the color hue value is checked against the color hue value of the original color template. In other words, the color hue is used to narrow down the potential matches to a match with the real target. In situations where only one target is matched to the S template, the hue (H) is applied to check the confidence level of the match.


Example #3

In this example, such as for applications with a known and defined color tape target, one can use a direct primary color component (R, G or B) as the primary component of pattern matching. For example, when the tape is green, the green component is used with the luminance component as the primary matching components. The pattern template is extracted from the luminance and green components of the target. The color hue is the secondary criterion, extracted from the match template (the color tape), that is used to narrow down or verify that a matched pattern is the true target. In processing, the luminance layer and the green layer of the image are extracted out of the original image and then summed together as a new image. The pattern matching of the L+G template in the new image is then performed. One or several potential matches may be reached with different scores, of which only one should be the real target. For each potential match, the color hue value is checked against the color hue value of the original color template. In other words, the color hue is used to narrow down the potential matches to a match with the real target. In situations where only one target is matched to the L+G template, the hue is applied to check the confidence level of the match.
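The three examples above share a common two-stage structure: match on an image built from the primary component(s), then use hue as the secondary criterion to confirm the true target among the candidate matches. The sketch below illustrates that structure for the L+S variant of Example #1, using the standard colorsys module; the function names, the candidate format and the hue tolerance are illustrative assumptions.

```python
import colorsys
import numpy as np

def lum_sat_hue(rgb):
    """Return the summed L+S image (primary matching layer) and the hue
    image (secondary criterion) for an HxWx3 float RGB array in 0..1."""
    ls_img = np.zeros(rgb.shape[:2])
    hue_img = np.zeros(rgb.shape[:2])
    for y in range(rgb.shape[0]):
        for x in range(rgb.shape[1]):
            h, l, s = colorsys.rgb_to_hls(*rgb[y, x])
            ls_img[y, x] = l + s      # luminance + saturation, as in Example #1
            hue_img[y, x] = h
    return ls_img, hue_img

def verify_by_hue(candidates, hue_img, template_hue, tol=0.05):
    """Secondary stage: keep only candidate match regions (y, x, h, w)
    whose mean hue is close to the hue of the original color template."""
    confirmed = []
    for (y, x, h, w) in candidates:
        mean_hue = hue_img[y:y + h, x:x + w].mean()
        if abs(mean_hue - template_hue) < tol:
            confirmed.append((y, x))
    return confirmed
```

Note that a red region and a green region of equal vividness can score identically on the summed L+S layer; it is the hue check that separates the real target from such look-alike candidates.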


Thus, the system of the present invention may provide one, two or three (or more) color targets disposed at a trailer surface, such as at or on one or more portions of the trailer tongue and/or pull arms, whereby the system processes the image data captured by the imaging device to determine when a detected target or trailer portion matches a target template. For example, and with reference to FIGS. 6 and 7, three color tape targets 214a, 214b, 214c are placed on the trailer tongue 210b and pulling arms 210c, and the system processes captured image data to determine if any portions of the captured image data are indicative of the pattern template 215. As shown in FIG. 7, the system may achieve pattern matching of one or all three color patterns to determine the angle of the trailer relative to the towing vehicle.


Real Time Control of Camera Exposure and Color:


In real world driving conditions, illumination at the scene, and especially at the target, is not controlled, because light illuminates the scene from various sources, including sunlight, streetlights and the like. In some conditions, for example, sunlight or street light at night is reflected directly by the target toward the camera. The resulting glare-like reflection may cause the target to be saturated and washed out, such that the target loses color information. The pattern matching accuracy thus may be degraded by target saturation. In other conditions, a shadow of the trailer or of the pulling vehicle cast on the target region may darken the target, which may also affect the pattern matching accuracy.


A method to remedy these issues may include actively controlling the camera's exposure to maintain a targeted or desired target image quality. Such an active camera exposure control can be achieved at the camera level or via a communication link between the camera and controller or ECU or the like. The communication link can be a CAN, LIN and/or other forms of serial and parallel communication protocols or channels. For example, the system may use an existing LIN bus connection between the rear view camera and the ECU of the vehicle, such as by utilizing aspects of the camera and communication systems produced by Magna Electronics Inc. of Mich. The control algorithm continuously monitors pixel values of the target area in the captured images, and once the control algorithm detects that the target image is near saturation, the system may lower the image pixel values by reducing exposure time or gain or both. Likewise, when a shadow is cast over the target, the active exposure control can detect the darkening of the target and may raise the pixel value levels by increasing exposure time or gain or both.
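The exposure control loop described above can be sketched as follows. The thresholds, step size and exposure limits are illustrative assumptions, and a real implementation would issue the adjustment to the camera over the LIN or CAN link each frame rather than simply return it.

```python
NEAR_SATURATION = 230   # 8-bit pixel level at which the target washes out
TOO_DARK = 40           # 8-bit pixel level indicating a shadowed target
EXPOSURE_STEP = 0.9     # multiplicative exposure adjustment per frame

def adjust_exposure(target_pixels, exposure, lo=0.05, hi=1.0):
    """Return an updated exposure setting for the next frame, based on the
    monitored pixel values of the target region in the current frame."""
    mean_level = sum(target_pixels) / len(target_pixels)
    if mean_level > NEAR_SATURATION:
        exposure *= EXPOSURE_STEP       # back off before color information is lost
    elif mean_level < TOO_DARK:
        exposure /= EXPOSURE_STEP       # brighten a shadowed target
    return min(max(exposure, lo), hi)   # clamp to the camera's usable range
```

The same closed-loop structure applies whether the adjustment is made to exposure time, to gain, or to both, as the text notes.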


When the camera is running in different scene environments, the auto white balancing algorithm in the camera may actively adjust the color of the image to reach a white balance across the whole image. However, if a scene contains a significantly large and colorful object, for example, a large area of green grass or a large brown colored building, the auto white balancing algorithm will adjust the color parameters of the camera to make the whole image white balanced. As a result, the color of some local objects may be changed significantly from their real color. This will change the detected color of the target and possibly affect the color pattern matching accuracy. Similar to the above active exposure control, active camera control, either through camera-to-ECU communication or within the camera itself, can actively control the camera color parameters to maintain the color of the target region constant or within a predetermined or selected range, and thus maintain the pattern matching accuracy.


The active control of the camera exposure, white balance and color could result in overall image quality that is unfavorable for a driver viewing the images on a vehicle display. However, this is not typically a concern since, when a trailer is hitched to the pulling vehicle, the rear view image is rarely used because the trailer blocks much of the camera's field of view. Furthermore, even on the rare occasions when the driver desires or needs to view the space between the vehicle and the trailer, that area remains viewable, since the control of exposure and color is tailored to provide the best image quality around the target region, which is close to that area.


Feature Extraction and Matching Implemented in Hardware:


The present invention may provide an algorithm that detects features in a reference frame for detecting the trailer in subsequent frames. The algorithm first detects corners in a reference frame. A descriptor is built for each corner found that can be statistically matched in subsequent frames. As can be seen with reference to FIG. 8, a trailer scene can be imaged or shown with the feature extraction algorithm running. Each feature point has a descriptor that can be matched from frame to frame.
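The descriptor-matching step described above can be illustrated with a toy sketch: each detected corner is described by a normalized patch of surrounding pixels, and each descriptor is matched to its nearest neighbor in the next frame. A production system would use a dedicated corner detector and more robust descriptors; all names here are illustrative.

```python
import numpy as np

def patch_descriptor(img, y, x, r=2):
    """Flattened, mean-removed, unit-norm (2r+1)x(2r+1) patch around a
    corner point, serving as a simple frame-to-frame descriptor."""
    patch = img[y - r:y + r + 1, x - r:x + r + 1].astype(float).ravel()
    patch -= patch.mean()
    n = np.linalg.norm(patch)
    return patch / n if n else patch

def match_descriptors(desc_a, desc_b):
    """For each descriptor in desc_a, the index of the closest descriptor
    in desc_b (nearest-neighbor matching by Euclidean distance)."""
    matches = []
    for d in desc_a:
        dists = [np.linalg.norm(d - e) for e in desc_b]
        matches.append(int(np.argmin(dists)))
    return matches
```

Tracking the matched points from frame to frame is what allows the system to follow the trailer's natural features without any applied target.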


The algorithm can be run at, for example, about 30 frames per second at 640×480 resolution when implemented on a PC. The algorithm may be ported to run on a single-board computer, such as one with an ARM Cortex-A8 processor and a Texas Instruments C64x+ DSP and/or the like. The algorithm can also be ported to other digital processors, such as a DSP, FPGA, ASIC or system-on-chip (SOC) in a separate ECU, with the video streamed into the ECU through a signal cable from a separate camera. Optionally, such digital processors, being small and consuming low power, may be integrated into a compact camera, eliminating or reducing the need for a separate ECU and cable.


The algorithm is well suited for use in the trailer angle detection system of the present invention because, instead of asking the customer to affix a target to their trailer, the system can use existing, natural features of the trailer. This eliminates the need for any added targets affixed to or incorporated on the trailer, such as the targets described above, and also eliminates potential user errors arising from the customer measuring distances and entering numbers into the system graphical user interface.


Trailer Straight Algorithm:


Optionally, the present invention may provide a trailer straight algorithm to determine when the vehicle and trailer are straight, in order to calibrate and apply an offset correction to the angle calculation. Such an algorithm or approach combines vehicle dynamics with computer vision techniques. The trailer angle detection system may be on the vehicle network, which allows it to receive vehicle information, such as individual wheel speeds, steering wheel angle, and/or the like. When the vehicle pulls the trailer and drives in a generally or substantially straight path, the trailer is at its zero (or substantially zero) degree angle, and the system detects an offset angle to perform a calibration of the system. The following describes how the algorithms run and perform the calibration.


The first part of the algorithm looks at the average wheel speed for the left and right sides of the vehicle. When the mean speed of both sides is greater than zero and the difference between the two sides is within a given tolerance value, the second part of the algorithm engages. The second part of the algorithm looks at the angular movement of the trailer. This may be done using a target that is affixed to the trailer, but it could be extended to also look at the feature points, such as discussed above. If the angular movement of the trailer is within a tolerance level (in other words, not moving very much), and the first part still holds true (that there is straight or substantially straight movement of the vehicle), the angle calculation is averaged over a given period of time or distance traveled to calculate the offset, which is stored in the system memory and applied to subsequent angle calculations.
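The two-part check described above can be condensed into a sketch like the following, where part one tests for straight driving from the left and right wheel speeds, and part two averages the trailer angle into a stored offset only when the angle is nearly stationary. The tolerance values are illustrative assumptions.

```python
def straight_driving(left_speeds, right_speeds, tol=0.2):
    """Part 1: both sides of the vehicle are moving, and the difference
    between the left and right average wheel speeds is within tolerance."""
    left = sum(left_speeds) / len(left_speeds)
    right = sum(right_speeds) / len(right_speeds)
    return left > 0 and right > 0 and abs(left - right) < tol

def calibrate_offset(angle_samples, movement_tol=0.5):
    """Part 2: if the measured trailer angle is nearly stationary over the
    sample window, average it into the zero-angle offset; otherwise return
    None and skip calibration for this window."""
    if max(angle_samples) - min(angle_samples) > movement_tol:
        return None                     # trailer still swinging; do not calibrate
    return sum(angle_samples) / len(angle_samples)
```

The returned offset would be stored in system memory and subtracted from subsequent angle calculations, as the text describes.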


An alternative and simpler algorithm may function to read vehicle steering wheel angle through a vehicle network or bus. When the steering wheel angle is within a tolerance range of zero degrees, and vehicle wheel speed is greater than zero over a given period of time or distance traveled, the trailer angle is read and averaged as the offset that is stored in the system memory and applied to subsequent angle calculations.


Hitch Ball Detection:


Optionally, the present invention may provide a hitch ball detection algorithm for detecting the hitch ball at the rear of the vehicle. The purpose of this algorithm is to locate the hitch ball to potentially assist the customer in hitching the trailer to the vehicle. The first step is to segment the image, such as can be seen with reference to FIGS. 9A and 9B (and such as by utilizing aspects of the system described in Felzenszwalb, P. F. and D. P. Huttenlocher, Efficient graph-based image segmentation. International Journal of Computer Vision, 2004. 59(2), p. 167-181, which is hereby incorporated herein by reference in its entirety). Once the image has been segmented, edges are detected, such as shown in FIG. 10. After edges are found, the Hough transform (such as by utilizing aspects of the system described in Duda, R. O. and P. E. Hart, Use of the Hough transformation to detect lines and curves in pictures. Communications of the ACM, 1972. 15(1), p. 11-15, which is hereby incorporated herein by reference in its entirety) is then run to detect circles. The Hough transform may be selected because of its ability to be invariant to missing data points, which could be the case with the image quality due to outdoor operating conditions. The search space is limited to look at the region where the hitch could possibly be located using the CAD data for the particular vehicle. The limited search space reduces false positives and improves the runtime of the algorithm.
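The circle-detection step can be illustrated with a minimal fixed-radius circular Hough transform, in which each edge pixel votes for all candidate centers at the known radius and the accumulator peak is taken as the hitch ball center. This is a simplified sketch; a real implementation would search a range of radii and, as noted above, restrict the search region using the vehicle CAD data.

```python
import numpy as np

def hough_circle_center(edge_mask, radius, n_angles=90):
    """Return (row, col) of the strongest circle center for a known radius,
    given a boolean edge mask from a prior edge-detection step."""
    h, w = edge_mask.shape
    acc = np.zeros((h, w), dtype=np.int32)
    thetas = np.linspace(0, 2 * np.pi, n_angles, endpoint=False)
    ys, xs = np.nonzero(edge_mask)
    for y, x in zip(ys, xs):
        # Each edge point votes for every center lying at distance `radius`.
        cy = np.round(y - radius * np.sin(thetas)).astype(int)
        cx = np.round(x - radius * np.cos(thetas)).astype(int)
        ok = (cy >= 0) & (cy < h) & (cx >= 0) & (cx < w)
        np.add.at(acc, (cy[ok], cx[ok]), 1)
    # The accumulator peak is robust to missing edge points, which is the
    # property that motivates using the Hough transform here.
    return np.unravel_index(np.argmax(acc), acc.shape)
```

Because every surviving edge point contributes an independent vote, the peak remains strong even when outdoor image quality leaves gaps in the detected circle edge.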


Thus, the trailer monitoring or trailer angle detection system of the present invention may detect a target on or of a trailer or a portion of the trailer and may, such as via image processing and algorithmic processing, calculate and/or determine the angle of the trailer relative to the vehicle, such as relative to the longitudinal axis of the vehicle. The image processing and algorithmic processing may comprise any suitable means for processing the images and/or image data. For example, the processing may utilize aspects described in U.S. Pat. Nos. 7,005,974; 5,760,962; 5,877,897; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978; 7,859,565; 5,550,677; 5,670,935; 6,636,258; 7,145,519; 7,161,616; 7,230,640; 7,248,283; 7,295,229; 7,301,466; 7,592,928; 7,881,496; 7,720,580; 7,038,577; 6,882,287; 5,929,786 and/or 5,786,772, and/or PCT Application No. PCT/US2010/047256, filed Aug. 31, 2010 and published Mar. 10, 2011 as International Publication No. WO 2011/028686 and/or International Publication No. WO 2010/099416, published Sep. 2, 2010, and/or PCT Application No. PCT/US10/25545, filed Feb. 26, 2010 and published Sep. 2, 2010 as International Publication No. WO 2010/099416, and/or U.S. provisional applications, Ser. No. 61/583,381, filed Jan. 5, 2012; Ser. No. 61/579,682, filed Dec. 23, 2011; Ser. No. 61/570,017, filed Dec. 13, 2011; Ser. No. 61/568,791, filed Dec. 9, 2011; Ser. No. 61/567,446, filed Dec. 6, 2011; Ser. No. 61/567,150, filed Dec. 6, 2011; Ser. No. 61/565,713, filed Dec. 1, 2011; Ser. No. 61/563,965, filed Nov. 28, 2011; Ser. No. 61/559,970, filed Nov. 15, 2011; Ser. No. 61/556,556, filed Nov. 7, 2011; Ser. No. 61/554,663, filed Nov. 2, 2011; Ser. No. 61/550,664, filed Oct. 24, 2011; Ser. No. 61/552,167, filed Oct. 27, 2011; Ser. No. 61/548,902, filed Oct. 19, 2011; Ser. No. 61/540,256, filed Sep. 28, 2011; Ser.
No. 61/539,049, filed Sep. 26, 2011; Ser. No. 61/537,279, filed Sep. 21, 2011; Ser. No. 61/513,745, filed Aug. 1, 2011; Ser. No. 61/511,738, filed Jul. 26, 2011; Ser. No. 61/503,098, filed Jun. 30, 2011, which are all hereby incorporated herein by reference in their entireties.


Typically, a rearward facing camera for a rear vision system or backup assist system is activated responsive to the driver of the equipped vehicle shifting the gear actuator into a reverse gear position, whereby video images captured by the camera are displayed at the video display screen. When the reversing maneuver is completed, such as when the driver of the vehicle finally shifts the gear actuator out of the reverse gear position (and into either a park or neutral position or a forward gear position), display of the images captured by the camera ceases and the camera is often deactivated. The vision display system may operate to display the rearward images at the video mirror display responsive to the driver of the vehicle shifting the vehicle into a reverse gear, such as by utilizing aspects of the vision systems described in U.S. Pat. Nos. 5,550,677; 5,670,935; 6,498,620; 6,222,447 and/or 5,949,331, and/or PCT Application No. PCT/US2011/056295, filed Oct. 14, 2011 and published Apr. 19, 2012 as International Publication No. WO 2012/051500, and/or U.S. patent application Ser. No. 13/333,337, filed Dec. 21, 2011 and published Jun. 28, 2012 as U.S. Publication No. US-2012-0162427, which are hereby incorporated herein by reference in their entireties.


During forward travel of the vehicle, such as when the vehicle shifter is shifted to a forward or drive gear position, the rear camera may capture images of the trailer for determining and monitoring the trailer angle, as discussed above. Such operation of the rear camera during forward travel (and associated processing of the captured images and the like) may be responsive to the vehicle speed reaching a threshold level and a signal indicative of the vehicle towing a trailer (such as a signal indicative of a connection of a trailer wiring harness to a vehicle wiring harness or the like), such that the activation of the rear camera and subsequent or corresponding monitoring of the trailer angle only occurs in situations when it is desired or appropriate.


The rearward facing camera or camera module may comprise any suitable camera or imaging sensor, and may utilize aspects of the cameras or sensors described in U.S. Pat. Nos. 7,965,336 and/or 7,480,149, and/or U.S. patent application Ser. No. 12/091,359, filed Apr. 24, 2008 and published Oct. 1, 2009 as U.S. Publication No. US-2009-0244361, which are hereby incorporated herein by reference in their entireties. The imaging array sensor may comprise any suitable sensor, and may utilize various imaging sensors or imaging array sensors or cameras or the like, such as a CMOS imaging array sensor, a CCD sensor or other sensors or the like, such as the types described in U.S. Pat. Nos. 5,550,677; 5,670,935; 5,760,962; 5,715,093; 5,877,897; 6,922,292; 6,757,109; 6,717,610; 6,590,719; 6,201,642; 6,498,620; 5,796,094; 6,097,023; 6,320,176; 6,559,435; 6,831,261; 6,806,452; 6,396,397; 6,822,563; 6,946,978; 7,720,580; 7,965,336; 7,339,149; 7,038,577 and 7,004,606, and/or PCT Application No. PCT/US2008/076022, filed Sep. 11, 2008 and published Mar. 19, 2009 as International Publication No. WO 2009/036176, and/or PCT Application No. PCT/US2008/078700, filed Oct. 3, 2008 and published Apr. 9, 2009 as International Publication No. WO 2009/046268, which are all hereby incorporated herein by reference in their entireties.


Optionally, the rearward facing camera may have a wide angle rearward field of view, such as a wide angle rearward field of view that encompasses about 185 degrees (fields of view larger and smaller than this may be contemplated while remaining within the spirit and scope of the present invention). Thus, during a reversing maneuver, the rearward facing camera and video processor and video display screen can operate to display entire images (or substantially entire images) captured by the rearward facing camera (such as, for example, images encompassed by the about 185 degree field of view of the camera), in order to provide video images to the driver of the vehicle of a wide area or region or blind zone immediately rearward of the vehicle to assist the driver of the vehicle in making the reversing maneuver. The rearward facing camera and/or video processor and/or video display screen and/or backup assist system may utilize aspects of the systems described in U.S. Pat. Nos. 5,550,677; 5,760,962; 5,670,935; 6,201,642; 6,396,397; 6,498,620; 6,717,610; 6,757,109; 7,005,974 and/or 7,265,656, which are hereby incorporated herein by reference in their entireties.


The camera module and circuit chip or board and imaging sensor and processor may be implemented and operated in connection with various vehicular vision-based systems, and/or may be operable utilizing the principles of such other vehicular systems, such as a vehicle headlamp control system, such as the type disclosed in U.S. Pat. Nos. 5,796,094; 6,097,023; 6,320,176; 6,559,435; 6,831,261; 7,004,606; 7,339,149 and/or 7,526,103, which are all hereby incorporated herein by reference in their entireties, a rain sensor, such as the types disclosed in commonly assigned U.S. Pat. Nos. 6,353,392; 6,313,454; 6,320,176 and/or 7,480,149, which are hereby incorporated herein by reference in their entireties, a vehicle vision system, such as a forwardly, sidewardly or rearwardly directed vehicle vision system utilizing principles disclosed in U.S. Pat. Nos. 5,550,677; 5,670,935; 5,760,962; 5,877,897; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978 and/or 7,859,565, which are all hereby incorporated herein by reference in their entireties, a trailer hitching aid or tow check system, such as the type disclosed in U.S. Pat. No. 7,005,974, which is hereby incorporated herein by reference in its entirety, a reverse or sideward imaging system, such as for a lane change assistance system or lane departure warning system or for a blind spot or object detection system, such as imaging or detection systems of the types disclosed in U.S. Pat. Nos. 7,881,496; 7,720,580; 7,038,577; 5,929,786 and/or 5,786,772, and/or U.S. provisional application Ser. No. 60/618,686, filed Oct. 14, 2004, which are hereby incorporated herein by reference in their entireties, a video device for internal cabin surveillance and/or video telephone function, such as disclosed in U.S. Pat. Nos. 5,760,962; 5,877,897; 6,690,268 and/or 7,370,983, and/or U.S. patent application Ser. No. 
10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. US-2006-0050018, which are hereby incorporated herein by reference in their entireties, a traffic sign recognition system, a system for determining a distance to a leading or trailing vehicle or object, such as a system utilizing the principles disclosed in U.S. Pat. Nos. 6,396,397 and/or 7,123,168, which are hereby incorporated herein by reference in their entireties, and/or the like.


Optionally, the circuit board or chip (such as of the display or camera system or image processor or the like) may include circuitry for the imaging array sensor and/or other electronic accessories or features, such as by utilizing compass-on-a-chip or EC driver-on-a-chip technology and aspects such as described in U.S. Pat. No. 7,255,451 and/or U.S. Pat. No. 7,480,149, and/or U.S. patent applications, Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. Publication No. US-2006-0061008, and/or Ser. No. 12/578,732, filed Oct. 14, 2009 and published Apr. 22, 2010 as U.S. Publication No. US-2010-0097469, which are hereby incorporated herein by reference in their entireties.


The display is operable to display the captured rearward images and may comprise a video display and may utilize aspects of the video display devices or modules described in U.S. Pat. Nos. 6,690,268; 7,184,190; 7,274,501; 7,370,983; 7,446,650 and/or 7,855,755, and/or U.S. patent application Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. US-2006-0050018, which are all hereby incorporated herein by reference in their entireties. The video display may be operable to display images captured by one or more imaging sensors or cameras at the vehicle. The imaging device and control and image processor and any associated illumination source, if applicable, may comprise any suitable components, and may utilize aspects of the cameras and vision systems described in U.S. Pat. Nos. 5,550,677; 5,877,897; 6,498,620; 5,670,935; 5,796,094; 6,396,397; 6,806,452; 6,690,268; 6,198,409; 7,005,974; 7,123,168; 7,004,606; 6,946,978; 7,038,577; 6,353,392; 6,320,176; 6,313,454 and 6,824,281, which are all hereby incorporated herein by reference in their entireties.


The video display screen may be disposed at an interior rearview mirror assembly of the vehicle (such as in a mirror casing and behind a reflective element of a mirror assembly such that displayed information is viewable through the reflective element of the mirror assembly). The interior mirror assembly may comprise an electro-optic reflective element, such as an electrochromic reflective element, having a transflective mirror reflector (such as one or more thin metallic films or coatings disposed on a surface of a substrate of the reflective element, such as disposed on the front surface of the rear substrate, commonly referred to as the third surface of the mirror reflective element) that is partially transmissive of visible light therethrough and partially reflectant of visible light incident thereon, such as a mirror reflective element of the types described in U.S. Pat. Nos. 7,274,501; 7,255,451; 7,195,381; 7,184,190; 5,668,663; 5,724,187 and/or 6,690,268, which are all hereby incorporated herein by reference in their entireties. Thus, the video display screen, when operating to display video images or the like, is viewable through the transflective mirror reflector and the mirror reflective element by the driver of the vehicle and, when the video display screen is not operating to display video images or the like, the video display screen is not readily viewable or observable or discernible to the driver of the vehicle, such that the presence of the video display screen is rendered covert by the transflective mirror reflector and the driver of the vehicle normally views the mirror reflector and reflective element to view the reflected rearward image at the mirror reflective element. 
Optionally, the video display screen may be disposed elsewhere in the vehicle, such as at or in an accessory module or windshield electronics module or overhead console or center stack region of the instrument panel or elsewhere at the instrument panel or other areas of the vehicle, while remaining within the spirit and scope of the present invention.


Optionally, the vision display system may operate to display the rearward images at the video mirror display and the bird's-eye or top down or panoramic images/view at the navigation or infotainment screen, and may do so responsive to the driver of the vehicle shifting the vehicle into a reverse gear (such as by utilizing aspects of the vision systems described in U.S. Pat. Nos. 5,550,677; 5,670,935; 6,498,620; 6,222,447 and/or 5,949,331, and/or PCT Application No. PCT/US2011/056295, filed Oct. 14, 2011 and published Apr. 19, 2012 as International Publication No. WO 2012/051500, and/or PCT Application No. PCT/US2010/047256, filed Aug. 31, 2010 and published Mar. 10, 2011 as International Publication No. WO 2011/028686, which are hereby incorporated herein by reference in their entireties).


Optionally, the mirror assembly may include one or more displays, such as the types disclosed in U.S. Pat. Nos. 5,530,240 and/or 6,329,925, which are hereby incorporated herein by reference in their entireties, and/or display-on-demand transflective type displays, such as the types disclosed in U.S. Pat. Nos. 7,855,755; 7,626,749; 7,581,859; 7,338,177; 7,274,501; 7,255,451; 7,195,381; 7,184,190; 5,668,663; 5,724,187 and/or 6,690,268, and/or in U.S. patent applications, Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. Publication No. US-2006-0061008, and/or Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. US-2006-0050018, which are all hereby incorporated herein by reference in their entireties, so that the displays are viewable through the reflective element, while the display area still functions to substantially reflect light, in order to provide a generally uniform prismatic reflective element even in the areas that have display elements positioned behind the reflective element. The thicknesses and materials of the coatings on the substrates, such as on the third surface of the reflective element assembly, may be selected to provide a desired color or tint to the mirror reflective element, such as a blue colored reflector, such as is known in the art and such as described in U.S. Pat. Nos. 5,910,854; 6,420,036 and/or 7,274,501, which are all hereby incorporated herein by reference in their entireties.


Optionally, the vehicle may include one or more other accessories at or within the mirror assembly or otherwise associated with or near the mirror assembly, such as one or more electrical or electronic devices or accessories, such as a blind spot detection system, such as disclosed in U.S. Pat. Nos. 5,929,786; 8,058,977; 5,786,772; 7,720,580; 7,492,281; 7,038,577 and 6,882,287, a communication module, such as disclosed in U.S. Pat. No. 5,798,688, a voice recorder, microphones, such as disclosed in U.S. Pat. Nos. 7,657,052; 6,243,003; 6,278,377 and/or 6,420,975, speakers, antennas, including global positioning system (GPS) or cellular phone antennas, such as disclosed in U.S. Pat. No. 5,971,552, transmitters and/or receivers, such as a garage door opener or the like or a vehicle door unlocking system or the like (such as a remote keyless entry system), a digital network, such as described in U.S. Pat. No. 5,798,575, a high/low headlamp controller, such as a camera-based headlamp control, such as disclosed in U.S. Pat. Nos. 5,796,094 and/or 5,715,093 and/or U.S. patent application Ser. No. 12/781,119, filed May 17, 2010 and published Nov. 17, 2011 as U.S. Publication No. US 2011-0280026, a memory mirror system, such as disclosed in U.S. Pat. No. 5,796,176, a hands-free phone attachment, a video device for internal cabin surveillance and/or video telephone function, such as disclosed in U.S. Pat. Nos. 5,760,962 and/or 5,877,897, a remote keyless entry receiver, lights, such as map reading lights or one or more other lights or illumination sources, such as disclosed in U.S. Pat. Nos. 6,690,268; 5,938,321; 5,813,745; 5,820,245; 5,673,994; 5,649,756; 5,178,448; 5,671,996; 4,646,210; 4,733,336; 4,807,096; 6,042,253; 5,669,698; 7,195,381; 6,971,775 and/or 7,249,860, an imaging system or components or circuitry or display thereof, such as an imaging and/or display system of the types described in U.S. Pat. Nos. 
7,881,496; 7,526,103; 7,400,435; 6,690,268 and 6,847,487, and/or U.S. patent application Ser. No. 12/578,732, filed Oct. 14, 2009 and published Apr. 22, 2010 as U.S. Publication No. US-2010-0097469 and/or Ser. No. 12/508,840, filed Jul. 24, 2009 and published Jan. 28, 2010 as U.S. Publication No. US-2010-0020170, an alert system, such as an alert system of the types described in PCT Application No. PCT/US2010/25545, filed Feb. 26, 2010 and published Sep. 2, 2010 as International Publication No. WO 2010/099416, a video device for internal cabin surveillance (such as for sleep detection or driver drowsiness detection or the like) and/or video telephone function, such as disclosed in U.S. Pat. Nos. 5,760,962 and/or 5,877,897, a remote keyless entry receiver, a seat occupancy detector, a remote starter control, a yaw sensor, a clock, a carbon monoxide detector, status displays, such as displays that display a status of a door of the vehicle, a transmission selection (4wd/2wd or traction control (TCS) or the like), an antilock braking system, a road condition (that may warn the driver of icy road conditions) and/or the like, a trip computer, a tire pressure monitoring system (TPMS) receiver (such as described in U.S. Pat. Nos. 6,124,647; 6,294,989; 6,445,287; 6,472,979; 6,731,205 and/or 7,423,522), and/or an ONSTAR® system, a compass, such as disclosed in U.S. Pat. Nos. 5,924,212; 4,862,594; 4,937,945; 5,131,154; 5,255,442 and/or 5,632,092, a control system, such as a control system of the types described in PCT Application No. PCT/US10/38477, filed Jun. 14, 2010 and published Dec. 16, 2010 as International Publication No. WO 2010/144900, and/or any other accessory or circuitry or the like (with the disclosures of the above-referenced patents and patent applications and PCT applications being hereby incorporated herein by reference in their entireties).


The accessory or accessories may be positioned at or within a mirror casing of the interior rearview mirror assembly and may be included on or integrated in the printed circuit board positioned within the mirror casing, such as along a rear surface of the reflective element or elsewhere within a cavity defined by the casing, without affecting the scope of the present invention. The user actuatable inputs described above may be actuatable to control and/or adjust the accessories of the mirror assembly/system and/or an overhead console and/or an accessory module/windshield electronics module and/or the vehicle. The connection or link between the controls and the systems or accessories may be provided via vehicle electronic or communication systems and the like, and may be connected via various protocols or nodes, such as BLUETOOTH®, SCP, UBP, J1850, CAN J2284, Fire Wire 1394, MOST, LIN, FLEXRAY™, Byte Flight and/or the like, or other vehicle-based or in-vehicle communication links or systems (such as WIFI and/or IRDA) and/or the like, depending on the particular application of the mirror/accessory system and the vehicle. Optionally, the connections or links may be provided via wireless connectivity or links, such as via a wireless communication network or system, such as described in U.S. Pat. No. 7,004,593, which is hereby incorporated herein by reference in its entirety, without affecting the scope of the present invention.


Optionally, a display and any associated user inputs may be associated with various accessories or systems, such as, for example, a tire pressure monitoring system or a passenger air bag status or a garage door opening system or a telematics system or any other accessory or system of the mirror assembly or of the vehicle or of an accessory module or console of the vehicle, such as an accessory module or console of the types described in U.S. Pat. Nos. 7,289,037; 6,877,888; 6,824,281; 6,690,268; 6,672,744; 6,386,742 and 6,124,886, and/or U.S. patent application Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. US-2006-0050018, which are hereby incorporated herein by reference in their entireties.


Changes and modifications in the specifically described embodiments may be carried out without departing from the principles of the present invention, which is intended to be limited only by the scope of the appended claims as interpreted according to the principles of patent law.

Claims
  • 1. A method for backing up a trailer hitched to a vehicle, said method comprising:
    mounting a rear backup camera at an exterior rear portion of a vehicle equipped with a trailer hitch;
    wherein the rear backup camera comprises a CMOS imaging array sensor;
    wherein the rear backup camera, when mounted at the exterior rear portion of the vehicle, is operable to capture frames of image data;
    hitching a trailer tongue of a trailer at the hitch of the vehicle to form a pivoting joint attaching the trailer to the vehicle;
    wherein, with the trailer hitched to the vehicle, the rear backup camera at least partially views the trailer tongue of the trailer;
    providing a control at the vehicle;
    the control comprising a processor operable to process image data captured by the rear backup camera;
    capturing image data using the rear backup camera during a backing up maneuver of the trailer hitched to the vehicle;
    providing to the control image data captured by the rear backup camera;
    with the trailer hitched to the vehicle and during the backing up maneuver of the trailer hitched to the vehicle, processing at the control image data captured by the rear backup camera;
    responsive to processing at the control of image data captured by the rear backup camera, detecting at least a portion of the trailer tongue of the trailer that is viewed by the rear backup camera, wherein the detected at least a portion of the trailer tongue of the trailer comprises structure of the trailer tongue of the trailer;
    wherein processing of image data captured by the rear backup camera at the control to detect the at least a portion of the trailer tongue comprises use of an edge detection algorithm;
    responsive to processing at the control of image data captured by the rear backup camera, determining during the backing up maneuver of the trailer hitched to the vehicle trailer angle of the trailer relative to a longitudinal axis of the vehicle; and
    wherein the control, based at least in part on trailer angle of the trailer relative to the longitudinal axis of the vehicle, controls a steering system of the vehicle during the backing up maneuver of the trailer hitched to the vehicle to back up the trailer hitched to the vehicle along a rearward trajectory.
  • 2. The method of claim 1, wherein the processor of the control comprises a field-programmable gate array (FPGA).
  • 3. The method of claim 1, wherein the processor of the control comprises a digital signal processor (DSP).
  • 4. The method of claim 1, wherein the processor of the control comprises a system on chip (SOC).
  • 5. The method of claim 1, wherein the edge detection algorithm comprises a Sobel edge detection algorithm.
  • 6. The method of claim 1, wherein processing of image data captured by the rear backup camera at the control comprises use of pattern matching.
  • 7. The method of claim 1, wherein processing of image data captured by the rear backup camera at the control to detect the at least a portion of the trailer tongue comprises use of pattern matching.
  • 8. The method of claim 1, comprising providing to the control distance between the pivoting joint and the detected at least a portion of the trailer tongue of the trailer.
  • 9. The method of claim 1, comprising providing steering angle of the vehicle to the control during the backing up maneuver of the trailer hitched to the vehicle.
  • 10. The method of claim 1, wherein the rear backup camera mounted at the exterior rear portion of the vehicle has a field of view of at least 180 degrees.
  • 11. The method of claim 10, comprising providing to the control location of the rear backup camera at the exterior rear portion of the vehicle.
  • 12. The method of claim 1, comprising alerting of a potentially hazardous situation during backing up of the trailer hitched to the vehicle along the rearward trajectory.
  • 13. The method of claim 1, wherein the control comprises an electronic control unit (ECU) that is located within the vehicle separate from the exterior rear portion of the vehicle at which the rear backup camera is mounted, and wherein the ECU comprises the processor operable to process image data captured by the rear backup camera, and wherein image data captured by the rear backup camera is provided to the ECU.
  • 14. The method of claim 13, comprising providing a vehicle communication bus at the vehicle, and further comprising providing to the ECU steering angle of the vehicle during the backing up maneuver of the trailer hitched to the vehicle via the vehicle communication bus of the vehicle.
  • 15. The method of claim 14, wherein the vehicle communication bus of the vehicle comprises a controller area network (CAN) communication bus.
  • 16. The method of claim 14, wherein the vehicle communication bus of the vehicle comprises a local interconnect network (LIN) communication bus.
  • 17. The method of claim 13, wherein the ECU communicates with the rear backup camera via a serial data link.
  • 18. The method of claim 17, wherein the ECU controls exposure of the rear backup camera via the serial data link.
  • 19. The method of claim 17, wherein the ECU controls white balance of the rear backup camera via the serial data link.
  • 20. The method of claim 13, wherein the rear backup camera is part of a multi-camera bird's-eye top down view system of the vehicle.
  • 21. The method of claim 13, comprising providing at least one trailer characteristic of the trailer hitched to the vehicle to the ECU via a vehicle communication bus of the vehicle.
  • 22. The method of claim 1, wherein during the backing up maneuver of the trailer hitched to the vehicle, the rear backup camera captures at least 30 frames of image data per second.
  • 23. The method of claim 1, wherein the control comprising the processor is integrated into the rear backup camera.
  • 24. The method of claim 1, comprising providing to the control distance from the vehicle to a wheel of the trailer.
  • 25. The method of claim 1, comprising equipping the vehicle with side view cameras, each having a field of view at least rearward and sideward of the vehicle.
  • 26. The method of claim 25, wherein the side view cameras view the trailer during the backing up maneuver, and wherein image data captured by the side view cameras during the backing up maneuver is provided to the control and is processed at the control.
  • 27. The method of claim 25, wherein the side view cameras and the rear backup camera are part of a multi-camera bird's-eye top down view system of the vehicle.
  • 28. The method of claim 27, wherein the side view cameras comprise a first side view camera disposed in an exterior rearview mirror assembly attached at a driver side of the vehicle and a second side view camera disposed in an exterior rearview mirror assembly attached at a passenger side of the vehicle.
  • 29. The method of claim 28, wherein during the backing up maneuver of the trailer hitched to the vehicle, a video display screen viewable by a driver of the vehicle and disposed in an interior cabin of the vehicle displays bird's-eye top down video images that are at least partially derived from at least one selected from the group consisting of (i) image data captured by the rear backup camera and image data captured by the first side view camera disposed in the exterior rearview mirror assembly attached at the driver side of the vehicle and (ii) image data captured by the rear backup camera and image data captured by the second side view camera disposed in the exterior rearview mirror assembly attached at the passenger side of the vehicle.
  • 30. The method of claim 28, wherein during the backing up maneuver of the trailer hitched to the vehicle, a video display screen viewable by a driver of the vehicle and disposed in an interior cabin of the vehicle displays bird's-eye top down video images that are at least partially derived from (i) image data captured by the rear backup camera, (ii) image data captured by the first side view camera disposed in the exterior rearview mirror assembly attached at the driver side of the vehicle and (iii) image data captured by the second side view camera disposed in the exterior rearview mirror assembly attached at the passenger side of the vehicle.
  • 31. The method of claim 1, comprising providing to the control frames of image data captured by the rear backup camera during the backing up maneuver of the trailer hitched to the vehicle, and further comprising, during the backing up maneuver of the trailer hitched to the vehicle, processing image data captured by the rear backup camera at the control to detect, in real time, change in trailer angle of the trailer relative to the longitudinal axis of the vehicle.
  • 32. The method of claim 31, wherein the control, based at least in part on change in trailer angle of the trailer relative to the longitudinal axis of the vehicle during the backing up maneuver of the trailer hitched to the vehicle, controls the steering system of the vehicle during the backing up maneuver of the trailer hitched to the vehicle.
  • 33. The method of claim 32, comprising during the backing up maneuver of the trailer hitched to the vehicle to back up the trailer hitched to the vehicle, providing steering angle of the vehicle to the control via a vehicle communication bus of the vehicle.
  • 34. The method of claim 33, wherein the vehicle communication bus of the vehicle comprises a controller area network (CAN) communication bus.
  • 35. The method of claim 33, wherein the vehicle communication bus of the vehicle comprises a local interconnect network (LIN) communication bus.
  • 36. The method of claim 1, comprising providing steering angle of the vehicle to the control during the backing up maneuver of the trailer hitched to the vehicle, and wherein the control, based at least in part on steering angle of the vehicle, controls the steering system of the vehicle during the backing up maneuver of the trailer hitched to the vehicle to back up the trailer hitched to the vehicle along the rearward trajectory.
  • 37. The method of claim 36, wherein steering angle of the vehicle is provided to the control via a vehicle communication bus of the vehicle.
  • 38. The method of claim 37, wherein the vehicle communication bus of the vehicle comprises a controller area network (CAN) communication bus.
  • 39. The method of claim 1, comprising adhering a target at the at least a portion of the trailer tongue of the trailer that is viewed by the rear backup camera.
  • 40. The method of claim 39, wherein the target comprises at least one selected from the group consisting of (i) an icon viewed by the rear backup camera, (ii) a shape viewed by the rear backup camera and (iii) indicia viewed by the rear backup camera.
  • 41. The method of claim 39, wherein the target comprises at least one selected from the group consisting of (i) an icon viewed by the rear backup camera, (ii) a shape viewed by the rear backup camera, (iii) indicia viewed by the rear backup camera, (iv) a pattern viewed by the rear backup camera and (v) a colored pattern viewed by the rear backup camera.
  • 42. The method of claim 39, wherein processing at the control of image data captured by the rear backup camera comprises at least one selected from the group consisting of (a) pattern matching and (b) color matching.
  • 43. The method of claim 1, comprising providing to the control location at the vehicle of the rear backup camera mounted at the exterior rear portion of the vehicle.
  • 44. The method of claim 43, wherein location at the vehicle of the rear backup camera mounted at the exterior rear portion of the vehicle is derived, at least in part, from processing at the control of image data captured by the rear backup camera.
  • 45. The method of claim 1, comprising providing steering angle of the vehicle to the control during the backing up maneuver of the trailer hitched to the vehicle, and further comprising calculating at the control a reversing path of the trailer during the backing up maneuver of the trailer hitched to the vehicle based at least on (i) processing at the control of image data captured by the rear backup camera, (ii) trailer angle of the trailer relative to the longitudinal axis of the vehicle and (iii) steering angle of the vehicle.
  • 46. The method of claim 1, wherein during the backing up maneuver of the trailer hitched to the vehicle, a video display screen viewable by a driver of the vehicle and disposed in an interior cabin of the vehicle displays rear backup video images derived, at least in part, from image data captured by the rear backup camera, and wherein a graphic overlay overlays the rear backup video images displayed on the video display screen, the graphic overlay indicating to the driver of the vehicle at least a backing up direction of the trailer hitched to the vehicle.
  • 47. A method for backing up a trailer hitched to a vehicle, said method comprising:
    mounting a rear backup camera at an exterior rear portion of a vehicle equipped with a trailer hitch;
    wherein the rear backup camera comprises a CMOS imaging array sensor;
    wherein the rear backup camera, when mounted at the exterior rear portion of the vehicle, is operable to capture frames of image data;
    equipping the vehicle with side view cameras, each having a field of view at least rearward and sideward of the vehicle;
    wherein the rear backup camera and the side view cameras are part of a multi-camera bird's-eye top down view system of the vehicle;
    hitching a trailer tongue of a trailer at the hitch of the vehicle to form a pivoting joint attaching the trailer to the vehicle;
    wherein, with the trailer hitched to the vehicle, the rear backup camera at least partially views the trailer tongue of the trailer;
    providing a control at the vehicle;
    the control comprising a processor operable to process image data captured by the rear backup camera;
    capturing image data using the rear backup camera during a backing up maneuver of the trailer hitched to the vehicle;
    providing to the control image data captured by the rear backup camera;
    with the trailer hitched to the vehicle and during the backing up maneuver of the trailer hitched to the vehicle, processing at the control image data captured by the rear backup camera;
    responsive to processing at the control of image data captured by the rear backup camera, detecting at least a portion of the trailer tongue of the trailer that is viewed by the rear backup camera, wherein the detected at least a portion of the trailer tongue of the trailer comprises structure of the trailer tongue of the trailer;
    wherein processing of image data captured by the rear backup camera at the control to detect the at least a portion of the trailer tongue comprises use of an edge detection algorithm;
    responsive to processing at the control of image data captured by the rear backup camera, determining during the backing up maneuver of the trailer hitched to the vehicle trailer angle of the trailer relative to a longitudinal axis of the vehicle; and
    wherein the control, based at least in part on trailer angle of the trailer relative to the longitudinal axis of the vehicle, controls a steering system of the vehicle during the backing up maneuver of the trailer hitched to the vehicle to back up the trailer hitched to the vehicle along a rearward trajectory.
  • 48. The method of claim 47, wherein the control comprises an electronic control unit (ECU) that is located within the vehicle separate from the exterior rear portion of the vehicle at which the rear backup camera is mounted, and wherein the ECU comprises the processor operable to process image data captured by the rear backup camera, and wherein image data captured by the rear backup camera is provided to the ECU.
  • 49. The method of claim 48, wherein the ECU communicates with the rear backup camera via a serial data link.
  • 50. The method of claim 49, wherein the rear backup camera mounted at the exterior rear portion of the vehicle has a field of view of at least 180 degrees.
  • 51. The method of claim 48, comprising adhering a target at the at least a portion of the trailer tongue of the trailer that is viewed by the rear backup camera.
  • 52. The method of claim 51, wherein the target comprises at least one selected from the group consisting of (i) an icon viewed by the rear backup camera, (ii) a shape viewed by the rear backup camera and (iii) indicia viewed by the rear backup camera.
  • 53. The method of claim 48, comprising providing steering angle of the vehicle to the control during the backing up maneuver of the trailer hitched to the vehicle, and wherein the control, based at least in part on steering angle of the vehicle, controls the backing up maneuver of the trailer hitched to the vehicle to back up the trailer hitched to the vehicle.
  • 54. The method of claim 48, wherein processing at the ECU of image data captured by the rear backup camera to detect the at least a portion of the trailer tongue comprises use of pattern matching.
  • 55. The method of claim 48, wherein image data captured by the side view cameras is provided to the ECU of the control and is processed at the ECU of the control.
  • 56. The method of claim 55, wherein the side view cameras view the trailer during the backing up maneuver, and wherein image data captured by the side view cameras during the backing up maneuver is provided to the ECU of the control and is processed at the ECU of the control.
  • 57. The method of claim 55, wherein the side view cameras comprise a first side view camera disposed in an exterior rearview mirror assembly attached at a driver side of the vehicle and a second side view camera disposed in an exterior rearview mirror assembly attached at a passenger side of the vehicle.
  • 58. The method of claim 55, wherein the rear backup camera mounted at the exterior rear portion of the vehicle has a field of view of 185 degrees.
  • 59. The method of claim 47, comprising providing steering angle of the vehicle to the control during the backing up maneuver of the trailer hitched to the vehicle via a vehicle communication bus of the vehicle.
  • 60. The method of claim 59, comprising calculating at the control a reversing path of the trailer during the backing up maneuver of the trailer hitched to the vehicle based at least on (i) processing at an electronic control unit (ECU) of image data captured by the rear backup camera, (ii) trailer angle of the trailer relative to the longitudinal axis of the vehicle and (iii) steering angle of the vehicle.
  • 61. The method of claim 59, wherein the vehicle communication bus of the vehicle comprises a controller area network (CAN) communication bus.
  • 62. The method of claim 61, comprising providing at least one trailer characteristic of the trailer hitched to the vehicle to the control via the CAN communication bus of the vehicle.
  • 63. The method of claim 47, comprising providing to the control location at the vehicle of the rear backup camera mounted at the exterior rear portion of the vehicle.
  • 64. The method of claim 63, wherein location at the vehicle of the rear backup camera mounted at the exterior rear portion of the vehicle is derived, at least in part, from processing at the control of image data captured by the rear backup camera.
  • 65. The method of claim 47, wherein the side view cameras comprise a first side view camera disposed in an exterior rearview mirror assembly attached at a driver side of the vehicle and a second side view camera disposed in an exterior rearview mirror assembly attached at a passenger side of the vehicle.
  • 66. The method of claim 65, wherein during the backing up maneuver of the trailer hitched to the vehicle, a video display screen viewable by a driver of the vehicle and disposed in an interior cabin of the vehicle displays bird's-eye top down video images that are at least partially derived from at least one selected from the group consisting of (i) image data captured by the rear backup camera and image data captured by the first side view camera disposed in the exterior rearview mirror assembly attached at the driver side of the vehicle and (ii) image data captured by the rear backup camera and image data captured by the second side view camera disposed in the exterior rearview mirror assembly attached at the passenger side of the vehicle.
  • 67. The method of claim 65, wherein during the backing up maneuver of the trailer hitched to the vehicle, a video display screen viewable by a driver of the vehicle and disposed in an interior cabin of the vehicle displays bird's-eye top down video images that are at least partially derived from (i) image data captured by the rear backup camera, (ii) image data captured by the first side view camera disposed in the exterior rearview mirror assembly attached at the driver side of the vehicle and (iii) image data captured by the second side view camera disposed in the exterior rearview mirror assembly attached at the passenger side of the vehicle.
  • 68. The method of claim 47, wherein processing of image data captured by the rear backup camera at the control comprises use of pattern matching.
  • 69. The method of claim 47, comprising providing to the control distance between the pivoting joint and the detected at least a portion of the trailer tongue of the trailer.
  • 70. The method of claim 47, comprising providing steering angle of the vehicle to the control during the backing up maneuver of the trailer hitched to the vehicle.
  • 71. The method of claim 47, wherein the processor of the control comprises a field-programmable gate array (FPGA).
  • 72. The method of claim 47, wherein the processor of the control comprises a digital signal processor (DSP).
  • 73. The method of claim 47, wherein the processor of the control comprises a system on chip (SOC).
  • 74. The method of claim 47, comprising providing to the control frames of image data captured by the rear backup camera during the backing up maneuver of the trailer hitched to the vehicle, and further comprising, during the backing up maneuver of the trailer hitched to the vehicle, processing image data captured by the rear backup camera at the control to detect, in real time, change in trailer angle of the trailer relative to the longitudinal axis of the vehicle.
  • 75. The method of claim 74, wherein the control, based at least in part on change in trailer angle of the trailer relative to the longitudinal axis of the vehicle during the backing up maneuver of the trailer hitched to the vehicle, controls the steering system of the vehicle during the backing up maneuver of the trailer hitched to the vehicle.
  • 76. The method of claim 47, wherein during the backing up maneuver of the trailer hitched to the vehicle, a video display screen viewable by a driver of the vehicle and disposed in an interior cabin of the vehicle displays rear backup video images derived, at least in part, from image data captured by the rear backup camera, and wherein a graphic overlay overlays the rear backup video images displayed on the video display screen, the graphic overlay indicating to the driver of the vehicle at least a backing up direction of the trailer hitched to the vehicle.
  • 77. A method for backing up a trailer hitched to a vehicle, said method comprising:
    mounting a rear backup camera at an exterior rear portion of a vehicle equipped with a trailer hitch;
    wherein the rear backup camera comprises a CMOS imaging array sensor;
    wherein the rear backup camera, when mounted at the exterior rear portion of the vehicle, is operable to capture frames of image data;
    wherein the rear backup camera is part of a multi-camera bird's-eye top down view system of the vehicle;
    hitching a trailer tongue of a trailer at the hitch of the vehicle to form a pivoting joint attaching the trailer to the vehicle;
    wherein, with the trailer hitched to the vehicle, the rear backup camera at least partially views the trailer tongue of the trailer;
    providing a control at the vehicle;
    wherein the control comprises an electronic control unit (ECU) that is located within the vehicle separate from the exterior rear portion of the vehicle at which the rear backup camera is mounted;
    the ECU comprising a processor operable to process image data captured by the rear backup camera;
    capturing image data using the rear backup camera during a backing up maneuver of the trailer hitched to the vehicle;
    providing to the ECU image data captured by the rear backup camera;
    providing steering angle of the vehicle to the ECU during the backing up maneuver of the trailer hitched to the vehicle via a vehicle communication bus of the vehicle;
    with the trailer hitched to the vehicle and during the backing up maneuver of the trailer hitched to the vehicle, processing at the ECU image data captured by the rear backup camera;
    responsive to processing at the ECU of image data captured by the rear backup camera, detecting at least a portion of the trailer tongue of the trailer that is viewed by the rear backup camera, wherein the detected at least a portion of the trailer tongue of the trailer comprises structure of the trailer tongue of the trailer;
    wherein processing at the ECU of image data captured by the rear backup camera to detect the at least a portion of the trailer tongue comprises use of an edge detection algorithm;
    responsive to processing at the ECU of image data captured by the rear backup camera, determining during the backing up maneuver of the trailer hitched to the vehicle trailer angle of the trailer relative to a longitudinal axis of the vehicle; and
    wherein the control, based at least in part on trailer angle of the trailer relative to the longitudinal axis of the vehicle, controls a steering system of the vehicle during the backing up maneuver of the trailer hitched to the vehicle to back up the trailer hitched to the vehicle along a rearward trajectory.
  • 78. The method of claim 77, wherein the vehicle communication bus of the vehicle comprises a controller area network (CAN) communication bus.
  • 79. The method of claim 78, comprising providing at least one trailer characteristic of the trailer hitched to the vehicle to the ECU via the CAN communication bus of the vehicle.
  • 80. The method of claim 77, wherein the ECU communicates with the rear backup camera via a serial data link.
  • 81. The method of claim 80, wherein the rear backup camera mounted at the exterior rear portion of the vehicle has a field of view of at least 180 degrees.
  • 82. The method of claim 77, wherein the processor of the ECU comprises a field-programmable gate array (FPGA).
  • 83. The method of claim 77, wherein the processor of the ECU comprises a digital signal processor (DSP).
  • 84. The method of claim 77, wherein the processor of the ECU comprises a system on chip (SOC).
  • 85. The method of claim 77, wherein processing at the ECU of image data captured by the rear backup camera comprises use of pattern matching.
  • 86. The method of claim 77, wherein processing at the ECU of image data captured by the rear backup camera to detect the at least a portion of the trailer tongue comprises use of pattern matching.
  • 87. The method of claim 77, comprising providing to the control distance between the pivoting joint and the detected at least a portion of the trailer tongue of the trailer.
  • 88. The method of claim 77, wherein location at the vehicle of the rear backup camera mounted at the exterior rear portion of the vehicle is derived, at least in part, from processing at the ECU of image data captured by the rear backup camera.
  • 89. The method of claim 77, comprising adhering a target at the at least a portion of the trailer tongue of the trailer that is viewed by the rear backup camera.
  • 90. The method of claim 89, wherein the target comprises at least one selected from the group consisting of (i) an icon viewed by the rear backup camera, (ii) a shape viewed by the rear backup camera and (iii) indicia viewed by the rear backup camera.
  • 91. The method of claim 77, comprising equipping the vehicle with side view cameras, each having a field of view at least rearward and sideward of the vehicle.
  • 92. The method of claim 91, wherein the side view cameras view the trailer during the backing up maneuver, and wherein image data captured by the side view cameras during the backing up maneuver is provided to the ECU of the control and is processed at the ECU of the control.
  • 93. The method of claim 91, wherein the side view cameras are part of the multi-camera bird's-eye top down view system of the vehicle.
  • 94. The method of claim 93, wherein the side view cameras comprise a first side view camera disposed in an exterior rearview mirror assembly attached at a driver side of the vehicle and a second side view camera disposed in an exterior rearview mirror assembly attached at a passenger side of the vehicle.
  • 95. The method of claim 94, wherein during the backing up maneuver of the trailer hitched to the vehicle, a video display screen viewable by a driver of the vehicle and disposed in an interior cabin of the vehicle displays bird's-eye top down video images that are at least partially derived from at least one selected from the group consisting of (i) image data captured by the rear backup camera and image data captured by the first side view camera disposed in the exterior rearview mirror assembly attached at the driver side of the vehicle and (ii) image data captured by the rear backup camera and image data captured by the second side view camera disposed in the exterior rearview mirror assembly attached at the passenger side of the vehicle.
  • 96. The method of claim 94, wherein during the backing up maneuver of the trailer hitched to the vehicle, a video display screen viewable by a driver of the vehicle and disposed in an interior cabin of the vehicle displays bird's-eye top down video images that are at least partially derived from (i) image data captured by the rear backup camera, (ii) image data captured by the first side view camera disposed in the exterior rearview mirror assembly attached at the driver side of the vehicle and (iii) image data captured by the second side view camera disposed in the exterior rearview mirror assembly attached at the passenger side of the vehicle.
  • 97. The method of claim 94, wherein the ECU communicates with the rear backup camera via a serial data link.
  • 98. The method of claim 97, wherein the ECU controls exposure of the rear backup camera via the serial data link.
  • 99. The method of claim 97, wherein the ECU controls white balance of the rear backup camera via the serial data link.
  • 100. The method of claim 77, wherein the ECU communicates with the rear backup camera via a serial data link, and wherein processing at the ECU of image data captured by the rear backup camera to detect the at least a portion of the trailer tongue comprises use of pattern matching.
  • 101. The method of claim 77, comprising providing to the ECU frames of image data captured by the rear backup camera during the backing up maneuver of the trailer hitched to the vehicle, and further comprising, during the backing up maneuver of the trailer hitched to the vehicle, processing image data captured by the rear backup camera at the ECU to detect, in real time, change in trailer angle of the trailer relative to the longitudinal axis of the vehicle.
  • 102. The method of claim 101, wherein the control, based at least in part on change in trailer angle of the trailer relative to the longitudinal axis of the vehicle during the backing up maneuver of the trailer hitched to the vehicle, controls the steering system of the vehicle during the backing up maneuver of the trailer hitched to the vehicle.
  • 103. The method of claim 77, comprising calculating at the control a reversing path of the trailer during the backing up maneuver of the trailer hitched to the vehicle based at least on (i) processing at the ECU of image data captured by the rear backup camera, (ii) trailer angle of the trailer relative to the longitudinal axis of the vehicle and (iii) steering angle of the vehicle.
  • 104. The method of claim 77, wherein during the backing up maneuver of the trailer hitched to the vehicle, a video display screen viewable by a driver of the vehicle and disposed in an interior cabin of the vehicle displays rear backup video images derived, at least in part, from image data captured by the rear backup camera, and wherein a graphic overlay overlays the rear backup video images displayed on the video display screen, the graphic overlay indicating to the driver of the vehicle at least a backing up direction of the trailer hitched to the vehicle.
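Claims 77 and 101–103 recite detecting at least a portion of the trailer tongue with an edge detection algorithm and determining, in real time, the trailer angle relative to the vehicle's longitudinal axis. The patent does not disclose source code, so the following is only a minimal illustrative sketch of one plausible realization (not the patented implementation): Sobel edge detection followed by a principal-axis fit of the edge pixels, run on a synthetic rear-camera frame. The function names (`draw_tongue`, `estimate_trailer_angle`), the synthetic imagery, and the use of the image's vertical axis as a stand-in for the vehicle's longitudinal axis are all assumptions made for illustration.

```python
import numpy as np

def draw_tongue(angle_deg, size=200):
    """Synthetic rear-camera frame: the trailer tongue rendered as a bright
    line pivoting from a hypothetical hitch point at the bottom-center."""
    img = np.zeros((size, size))
    theta = np.deg2rad(angle_deg)
    hitch_r, hitch_c = size - 1, size // 2
    for t in np.linspace(0.0, size * 0.8, 600):
        r = int(hitch_r - t * np.cos(theta))  # up the image = away from hitch
        c = int(hitch_c + t * np.sin(theta))
        if 0 <= r < size and 0 <= c < size:
            img[r, c] = 1.0
    return img

def sobel_magnitude(img):
    """Edge detection: 3x3 Sobel gradients via explicit sliding-window sums."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    pad = np.pad(img, 1, mode="edge")
    gx = np.zeros_like(img)
    gy = np.zeros_like(img)
    for i in range(3):
        for j in range(3):
            win = pad[i:i + img.shape[0], j:j + img.shape[1]]
            gx += kx[i, j] * win
            gy += ky[i, j] * win
    return np.hypot(gx, gy)

def estimate_trailer_angle(img, thresh=1.0):
    """Trailer angle (degrees, signed) relative to the image's vertical axis,
    used here as a stand-in for the vehicle's longitudinal axis."""
    mag = sobel_magnitude(img)
    rows, cols = np.nonzero(mag > thresh)
    pts = np.stack([rows, cols], axis=1).astype(float)
    pts -= pts.mean(axis=0)
    # The principal axis of the edge pixels approximates the tongue direction.
    _, _, vt = np.linalg.svd(pts, full_matrices=False)
    dr, dc = vt[0]
    if dr > 0:                      # resolve PCA sign: point up the image
        dr, dc = -dr, -dc
    return float(np.degrees(np.arctan2(dc, -dr)))
```

For a frame rendered at 15 degrees, `estimate_trailer_angle(draw_tongue(15.0))` recovers the angle to within roughly a degree; a deployed system would additionally need lens-distortion correction, a calibrated mapping from image axes to the vehicle's longitudinal axis, and frame-to-frame tracking to report change in trailer angle as in claim 101.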
CROSS REFERENCE TO RELATED APPLICATIONS

The present application is a continuation of U.S. patent application Ser. No. 15/959,769, filed Apr. 23, 2018, now U.S. Pat. No. 10,858,042, which is a continuation of U.S. patent application Ser. No. 14/803,147, filed Jul. 20, 2015, now U.S. Pat. No. 9,950,738, which is a continuation of U.S. patent application Ser. No. 13/979,871, filed Jul. 16, 2013, now U.S. Pat. No. 9,085,261, which is a 371 national phase application of PCT Application No. PCT/US2012/022517, filed Jan. 25, 2012, which claims the priority benefit of U.S. provisional applications, Ser. No. 61/496,090, filed Jun. 13, 2011 and Ser. No. 61/436,397, filed Jan. 26, 2011.

US Referenced Citations (480)
Number Name Date Kind
4200361 Malvano et al. Apr 1980 A
4214266 Myers Jul 1980 A
4218698 Bart et al. Aug 1980 A
4236099 Rosenblum Nov 1980 A
4247870 Gabel et al. Jan 1981 A
4249160 Chilvers Feb 1981 A
4266856 Wainwright May 1981 A
4277804 Robison Jul 1981 A
4281898 Ochiai et al. Aug 1981 A
4288814 Talley et al. Sep 1981 A
4355271 Noack Oct 1982 A
4357558 Massoni et al. Nov 1982 A
4381888 Momiyama May 1983 A
4420238 Felix Dec 1983 A
4431896 Lodetti Feb 1984 A
4443057 Bauer et al. Apr 1984 A
4460831 Oettinger et al. Jul 1984 A
4481450 Watanabe et al. Nov 1984 A
4491390 Tong-Shen Jan 1985 A
4512637 Ballmer Apr 1985 A
4529275 Ballmer Jul 1985 A
4529873 Ballmer et al. Jul 1985 A
4546551 Franks Oct 1985 A
4549208 Kamejima et al. Oct 1985 A
4571082 Downs Feb 1986 A
4572619 Reininger et al. Feb 1986 A
4580875 Bechtel et al. Apr 1986 A
4600913 Caine Jul 1986 A
4603946 Kato et al. Aug 1986 A
4614415 Hyatt Sep 1986 A
4620141 McCumber et al. Oct 1986 A
4623222 Itoh et al. Nov 1986 A
4626850 Chey Dec 1986 A
4629941 Ellis et al. Dec 1986 A
4630109 Barton Dec 1986 A
4632509 Ohmi et al. Dec 1986 A
4638287 Umebayashi et al. Jan 1987 A
4647161 Muller Mar 1987 A
4653316 Fukuhara Mar 1987 A
4669825 Itoh et al. Jun 1987 A
4669826 Itoh et al. Jun 1987 A
4671615 Fukada et al. Jun 1987 A
4672457 Hyatt Jun 1987 A
4676601 Itoh et al. Jun 1987 A
4690508 Jacob Sep 1987 A
4692798 Seko et al. Sep 1987 A
4697883 Suzuki et al. Oct 1987 A
4701022 Jacob Oct 1987 A
4713685 Nishimura et al. Dec 1987 A
4717830 Botts Jan 1988 A
4727290 Smith et al. Feb 1988 A
4731669 Hayashi et al. Mar 1988 A
4741603 Miyagi et al. May 1988 A
4768135 Kretschmer et al. Aug 1988 A
4772942 Tuck Sep 1988 A
4789904 Peterson Dec 1988 A
4793690 Gahan et al. Dec 1988 A
4817948 Simonelli Apr 1989 A
4820933 Hong et al. Apr 1989 A
4825232 Howdle Apr 1989 A
4838650 Stewart et al. Jun 1989 A
4847772 Michalopoulos et al. Jul 1989 A
4855822 Narendra et al. Aug 1989 A
4862037 Farber et al. Aug 1989 A
4867561 Fujii et al. Sep 1989 A
4871917 O'Farrell et al. Oct 1989 A
4872051 Dye Oct 1989 A
4881019 Shiraishi et al. Nov 1989 A
4882565 Gallmeyer Nov 1989 A
4886960 Molyneux et al. Dec 1989 A
4891559 Matsumoto et al. Jan 1990 A
4892345 Rachael, III Jan 1990 A
4895790 Swanson et al. Jan 1990 A
4896030 Miyaji Jan 1990 A
4907870 Brucker Mar 1990 A
4910591 Petrossian et al. Mar 1990 A
4916374 Schierbeek et al. Apr 1990 A
4917477 Bechtel et al. Apr 1990 A
4937796 Tendler Jun 1990 A
4953305 Van Lente et al. Sep 1990 A
4956591 Schierbeek et al. Sep 1990 A
4961625 Wood et al. Oct 1990 A
4967319 Seko Oct 1990 A
4970653 Kenue Nov 1990 A
4971430 Lynas Nov 1990 A
4974078 Tsai Nov 1990 A
4987357 Masaki Jan 1991 A
4991054 Walters Feb 1991 A
5001558 Burley et al. Mar 1991 A
5003288 Wilhelm Mar 1991 A
5012082 Watanabe Apr 1991 A
5016977 Baude et al. May 1991 A
5027001 Torbert Jun 1991 A
5027200 Petrossian et al. Jun 1991 A
5044706 Chen Sep 1991 A
5055668 French Oct 1991 A
5059877 Teder Oct 1991 A
5064274 Alten Nov 1991 A
5072154 Chen Dec 1991 A
5086253 Awler Feb 1992 A
5096287 Kakinami et al. Mar 1992 A
5097362 Lynas Mar 1992 A
5121200 Choi Jun 1992 A
5124549 Michaels et al. Jun 1992 A
5130709 Toyama et al. Jul 1992 A
5148014 Lynam et al. Sep 1992 A
5168378 Black Dec 1992 A
5170374 Shimohigashi et al. Dec 1992 A
5172235 Wilm et al. Dec 1992 A
5177685 Davis et al. Jan 1993 A
5182502 Slotkowski et al. Jan 1993 A
5184956 Langlais et al. Feb 1993 A
5189561 Hong Feb 1993 A
5193000 Lipton et al. Mar 1993 A
5193029 Schofield et al. Mar 1993 A
5204778 Bechtel Apr 1993 A
5208701 Maeda May 1993 A
5245422 Borcherts et al. Sep 1993 A
5253109 O'Farrell et al. Oct 1993 A
5276389 Levers Jan 1994 A
5285060 Larson et al. Feb 1994 A
5289182 Brillard et al. Feb 1994 A
5289321 Secor Feb 1994 A
5305012 Faris Apr 1994 A
5307136 Saneyoshi Apr 1994 A
5309137 Kajiwara May 1994 A
5313072 Vachss May 1994 A
5325096 Pakett Jun 1994 A
5325386 Jewell et al. Jun 1994 A
5329206 Slotkowski et al. Jul 1994 A
5331312 Kudoh Jul 1994 A
5336980 Levers Aug 1994 A
5341437 Nakayama Aug 1994 A
5351044 Mathur et al. Sep 1994 A
5355118 Fukuhara Oct 1994 A
5374852 Parkes Dec 1994 A
5386285 Asayama Jan 1995 A
5394333 Kao Feb 1995 A
5406395 Wilson et al. Apr 1995 A
5410346 Saneyoshi et al. Apr 1995 A
5414257 Stanton May 1995 A
5414461 Kishi et al. May 1995 A
5416313 Larson et al. May 1995 A
5416318 Hegyi May 1995 A
5416478 Morinaga May 1995 A
5424952 Asayama Jun 1995 A
5426294 Kobayashi et al. Jun 1995 A
5430431 Nelson Jul 1995 A
5434407 Bauer et al. Jul 1995 A
5440428 Hegg et al. Aug 1995 A
5444478 Elong et al. Aug 1995 A
5451822 Bechtel et al. Sep 1995 A
5457493 Eddy et al. Oct 1995 A
5461357 Yoshioka et al. Oct 1995 A
5461361 Moore Oct 1995 A
5469298 Suman et al. Nov 1995 A
5471515 Fossum et al. Nov 1995 A
5475494 Nishida et al. Dec 1995 A
5487116 Nakano et al. Jan 1996 A
5498866 Bendicks et al. Mar 1996 A
5500766 Stonecypher Mar 1996 A
5510983 Lino Apr 1996 A
5515448 Nishitani May 1996 A
5521633 Nakajima et al. May 1996 A
5528698 Kamei et al. Jun 1996 A
5529138 Shaw et al. Jun 1996 A
5530240 Larson et al. Jun 1996 A
5530420 Tsuchiya et al. Jun 1996 A
5535314 Alves et al. Jul 1996 A
5537003 Bechtel et al. Jul 1996 A
5539397 Asanuma et al. Jul 1996 A
5541590 Nishio Jul 1996 A
5550677 Schofield et al. Aug 1996 A
5555312 Shima et al. Sep 1996 A
5555555 Sato et al. Sep 1996 A
5568027 Teder Oct 1996 A
5574443 Hsieh Nov 1996 A
5581464 Woll et al. Dec 1996 A
5594222 Caldwell Jan 1997 A
5614788 Mullins Mar 1997 A
5619370 Guinosso Apr 1997 A
5634709 Iwama Jun 1997 A
5642299 Hardin et al. Jun 1997 A
5648835 Uzawa Jul 1997 A
5650944 Kise Jul 1997 A
5660454 Mori et al. Aug 1997 A
5661303 Teder Aug 1997 A
5666028 Bechtel et al. Sep 1997 A
5668663 Varaprasad et al. Sep 1997 A
5670935 Schofield et al. Sep 1997 A
5675489 Pomerleau Oct 1997 A
5677851 Kingdon et al. Oct 1997 A
5699044 Van Lente et al. Dec 1997 A
5724187 Varaprasad et al. Mar 1998 A
5724316 Brunts Mar 1998 A
5737226 Olson et al. Apr 1998 A
5757949 Kinoshita et al. May 1998 A
5760826 Nayar Jun 1998 A
5760828 Cortes Jun 1998 A
5760931 Saburi et al. Jun 1998 A
5760962 Schofield et al. Jun 1998 A
5761094 Olson et al. Jun 1998 A
5765116 Wilson-Jones et al. Jun 1998 A
5781437 Wiemer et al. Jul 1998 A
5786772 Schofield et al. Jul 1998 A
5790403 Nakayama Aug 1998 A
5790973 Blaker et al. Aug 1998 A
5793308 Rosinski et al. Aug 1998 A
5793420 Schmidt Aug 1998 A
5796094 Schofield et al. Aug 1998 A
5798575 O'Farrell et al. Aug 1998 A
5835255 Miles Nov 1998 A
5837994 Stam et al. Nov 1998 A
5844505 Van Ryzin Dec 1998 A
5844682 Kiyomoto et al. Dec 1998 A
5845000 Breed et al. Dec 1998 A
5848802 Breed et al. Dec 1998 A
5850176 Kinoshita et al. Dec 1998 A
5850254 Takano et al. Dec 1998 A
5867591 Onda Feb 1999 A
5877707 Kowalick Mar 1999 A
5877897 Schofield et al. Mar 1999 A
5878370 Olson Mar 1999 A
5883739 Ashihara et al. Mar 1999 A
5884212 Lion Mar 1999 A
5890021 Onoda Mar 1999 A
5896085 Mori et al. Apr 1999 A
5899956 Chan May 1999 A
5914815 Bos Jun 1999 A
5923027 Stam et al. Jul 1999 A
5929786 Schofield et al. Jul 1999 A
5940120 Frankhouse et al. Aug 1999 A
5949331 Schofield et al. Sep 1999 A
5956181 Lin Sep 1999 A
5959367 O'Farrell et al. Sep 1999 A
5959555 Furuta Sep 1999 A
5963247 Banitt Oct 1999 A
5964822 Alland et al. Oct 1999 A
5971552 O'Farrell et al. Oct 1999 A
5986796 Miles Nov 1999 A
5990469 Bechtel et al. Nov 1999 A
5990649 Nagao et al. Nov 1999 A
6001486 Varaprasad et al. Dec 1999 A
6009336 Harris et al. Dec 1999 A
6020704 Buschur Feb 2000 A
6049171 Stam et al. Apr 2000 A
6066933 Ponziana May 2000 A
6084519 Coulling et al. Jul 2000 A
6087953 DeLine et al. Jul 2000 A
6097023 Schofield et al. Aug 2000 A
6097024 Stam et al. Aug 2000 A
6116743 Hoek Sep 2000 A
6124647 Marcus et al. Sep 2000 A
6124886 DeLine et al. Sep 2000 A
6139172 Bos et al. Oct 2000 A
6144022 Tenenbaum et al. Nov 2000 A
6172613 DeLine et al. Jan 2001 B1
6175164 O'Farrell et al. Jan 2001 B1
6175300 Kendrick Jan 2001 B1
6198409 Schofield et al. Mar 2001 B1
6201642 Bos Mar 2001 B1
6222447 Schofield et al. Apr 2001 B1
6222460 DeLine et al. Apr 2001 B1
6243003 DeLine et al. Jun 2001 B1
6250148 Lynam Jun 2001 B1
6259412 Duroux Jul 2001 B1
6266082 Yonezawa et al. Jul 2001 B1
6266442 Aumeyer et al. Jul 2001 B1
6285393 Shimoura et al. Sep 2001 B1
6291906 Marcus et al. Sep 2001 B1
6294989 Schofield et al. Sep 2001 B1
6297781 Turnbull et al. Oct 2001 B1
6302545 Schofield et al. Oct 2001 B1
6310611 Caldwell Oct 2001 B1
6313454 Bos et al. Nov 2001 B1
6317057 Lee Nov 2001 B1
6320176 Schofield et al. Nov 2001 B1
6320282 Caldwell Nov 2001 B1
6326613 Heslin et al. Dec 2001 B1
6329925 Skiver et al. Dec 2001 B1
6333759 Mazzilli Dec 2001 B1
6341523 Lynam Jan 2002 B2
6353392 Schofield et al. Mar 2002 B1
6366213 DeLine et al. Apr 2002 B2
6370329 Teuchert Apr 2002 B1
6396397 Bos et al. May 2002 B1
6411204 Bloomfield et al. Jun 2002 B1
6411328 Franke et al. Jun 2002 B1
6420975 DeLine et al. Jul 2002 B1
6424273 Gutta et al. Jul 2002 B1
6428172 Hutzel et al. Aug 2002 B1
6430303 Naoi et al. Aug 2002 B1
6433676 DeLine et al. Aug 2002 B2
6433817 Guerra Aug 2002 B1
6442465 Breed et al. Aug 2002 B2
6477464 McCarthy et al. Nov 2002 B2
6480104 Wall et al. Nov 2002 B1
6483429 Yasui et al. Nov 2002 B1
6485155 Duroux et al. Nov 2002 B1
6497503 Dassanayake et al. Dec 2002 B1
6498620 Schofield et al. Dec 2002 B2
6513252 Schierbeek Feb 2003 B1
6516664 Lynam Feb 2003 B2
6523964 Schofield et al. Feb 2003 B2
6534884 Marcus et al. Mar 2003 B2
6539306 Turnbull Mar 2003 B2
6547133 Devries, Jr. et al. Apr 2003 B1
6553130 Emelson et al. Apr 2003 B1
6559435 Schofield et al. May 2003 B2
6559761 Miller et al. May 2003 B1
6574033 Chui et al. Jun 2003 B1
6578017 Ebersole et al. Jun 2003 B1
6587573 Stam et al. Jul 2003 B1
6587760 Okamoto Jul 2003 B2
6589625 Kothari et al. Jul 2003 B1
6593565 Heslin et al. Jul 2003 B2
6594583 Ogura et al. Jul 2003 B2
6611202 Schofield et al. Aug 2003 B2
6611610 Stam et al. Aug 2003 B1
6627918 Getz et al. Sep 2003 B2
6631994 Suzuki et al. Oct 2003 B2
6636258 Strumolo Oct 2003 B2
6648477 Hutzel et al. Nov 2003 B2
6650233 DeLine et al. Nov 2003 B2
6650455 Miles Nov 2003 B2
6672731 Schnell et al. Jan 2004 B2
6674562 Miles Jan 2004 B1
6678056 Downs Jan 2004 B2
6678614 McCarthy et al. Jan 2004 B2
6680792 Miles Jan 2004 B2
6690268 Schofield et al. Feb 2004 B2
6700605 Toyoda et al. Mar 2004 B1
6703925 Steffel Mar 2004 B2
6704621 Stein et al. Mar 2004 B1
6710908 Miles et al. Mar 2004 B2
6711474 Treyz et al. Mar 2004 B1
6714331 Lewis et al. Mar 2004 B2
6717610 Bos et al. Apr 2004 B1
6721659 Stopczynski Apr 2004 B2
6735506 Breed et al. May 2004 B2
6741377 Miles May 2004 B2
6744353 Sjonell Jun 2004 B2
6757109 Bos Jun 2004 B2
6762867 Lippert et al. Jul 2004 B2
6794119 Miles Sep 2004 B2
6795221 Urey Sep 2004 B1
6802617 Schofield et al. Oct 2004 B2
6806452 Bos et al. Oct 2004 B2
6812971 Terane Nov 2004 B2
6822563 Bos et al. Nov 2004 B2
6823241 Shirato et al. Nov 2004 B2
6824281 Schofield et al. Nov 2004 B2
6831261 Schofield et al. Dec 2004 B2
6847487 Burgner Jan 2005 B2
6882287 Schofield Apr 2005 B2
6889161 Winner et al. May 2005 B2
6891563 Schofield et al. May 2005 B2
6909753 Meehan et al. Jun 2005 B2
6946978 Schofield Sep 2005 B2
6953253 Schofield et al. Oct 2005 B2
6968736 Lynam Nov 2005 B2
6975775 Rykowski et al. Dec 2005 B2
7004593 Weller et al. Feb 2006 B2
7004606 Schofield Feb 2006 B2
7005974 McMahon et al. Feb 2006 B2
7006127 Mizusawa et al. Feb 2006 B2
7038577 Pawlicki et al. May 2006 B2
7046448 Burgner May 2006 B2
7062300 Kim Jun 2006 B1
7065432 Moisel et al. Jun 2006 B2
7085637 Breed et al. Aug 2006 B2
7092548 Aumeyer et al. Aug 2006 B2
7116246 Winter et al. Oct 2006 B2
7123168 Schofield Oct 2006 B2
7133661 Tatae et al. Nov 2006 B2
7149613 Stam et al. Dec 2006 B2
7158015 Rao et al. Jan 2007 B2
7167796 Taylor et al. Jan 2007 B2
7195381 Lynam et al. Mar 2007 B2
7202776 Breed Apr 2007 B2
7224324 Quist et al. May 2007 B2
7227459 Bos et al. Jun 2007 B2
7227611 Hull et al. Jun 2007 B2
7249860 Kulas et al. Jul 2007 B2
7253723 Lindahl et al. Aug 2007 B2
7255451 McCabe et al. Aug 2007 B2
7311406 Schofield et al. Dec 2007 B2
7325934 Schofield et al. Feb 2008 B2
7325935 Schofield et al. Feb 2008 B2
7338177 Lynam Mar 2008 B2
7339149 Schofield et al. Mar 2008 B1
7344261 Schofield et al. Mar 2008 B2
7360932 Uken et al. Apr 2008 B2
7370983 DeWind et al. May 2008 B2
7375803 Bamji May 2008 B1
7380948 Schofield et al. Jun 2008 B2
7388182 Schofield et al. Jun 2008 B2
7402786 Schofield et al. Jul 2008 B2
7423821 Bechtel et al. Sep 2008 B2
7425076 Schofield et al. Sep 2008 B2
7432248 Roberts et al. Oct 2008 B2
7459664 Schofield et al. Dec 2008 B2
7483058 Frank et al. Jan 2009 B1
7526103 Schofield et al. Apr 2009 B2
7541743 Salmeen et al. Jun 2009 B2
7561181 Schofield et al. Jul 2009 B2
7565006 Stam et al. Jul 2009 B2
7616781 Schofield et al. Nov 2009 B2
7619508 Lynam et al. Nov 2009 B2
7633383 Dunsmoir et al. Dec 2009 B2
7639149 Katoh Dec 2009 B2
7676087 Dhua et al. Mar 2010 B2
7690737 Lu Apr 2010 B2
7720580 Higgins-Luthman May 2010 B2
7792329 Schofield et al. Sep 2010 B2
7843451 Lafon Nov 2010 B2
7855778 Yung et al. Dec 2010 B2
7859565 Schofield et al. Dec 2010 B2
7881496 Camilleri et al. Feb 2011 B2
7914187 Higgins-Luthman et al. Mar 2011 B2
7930160 Hosagrahara et al. Apr 2011 B1
8010252 Getman et al. Aug 2011 B2
8017898 Lu et al. Sep 2011 B2
8038166 Piesinger Oct 2011 B1
8063752 Oleg Nov 2011 B2
8094170 Kato et al. Jan 2012 B2
8095310 Taylor et al. Jan 2012 B2
8098142 Schofield et al. Jan 2012 B2
8164628 Stein et al. Apr 2012 B2
8218007 Lee et al. Jul 2012 B2
8224031 Saito Jul 2012 B2
8260518 Englert Sep 2012 B2
8411998 Huggett et al. Apr 2013 B2
8831831 Headley Sep 2014 B2
9189670 Moed Nov 2015 B2
9633566 Skvarce et al. Apr 2017 B2
9688306 McClain et al. Jun 2017 B2
9950738 Lu et al. Apr 2018 B2
10858042 Lu et al. Dec 2020 B2
20010001563 Tomaszewski May 2001 A1
20020113873 Williams Aug 2002 A1
20030137586 Lewellen Jul 2003 A1
20030160428 Lindell et al. Aug 2003 A1
20030222982 Hamdan et al. Dec 2003 A1
20050074143 Kawai Apr 2005 A1
20050206225 Offerle et al. Sep 2005 A1
20050219852 Stam et al. Oct 2005 A1
20050237385 Kosaka et al. Oct 2005 A1
20060018511 Stam et al. Jan 2006 A1
20060018512 Stam et al. Jan 2006 A1
20060050018 Hutzel et al. Mar 2006 A1
20060091813 Stam et al. May 2006 A1
20060103727 Tseng May 2006 A1
20060250501 Wildmann et al. Nov 2006 A1
20070104476 Yasutomi et al. May 2007 A1
20070109406 Schofield et al. May 2007 A1
20070120657 Schofield et al. May 2007 A1
20070242339 Bradley Oct 2007 A1
20080147321 Howard et al. Jun 2008 A1
20080192132 Bechtel et al. Aug 2008 A1
20080231701 Greenwood Sep 2008 A1
20090113509 Tseng et al. Apr 2009 A1
20090143967 Lee et al. Jun 2009 A1
20090160987 Bechtel et al. Jun 2009 A1
20090190015 Bechtel et al. Jul 2009 A1
20090256938 Bechtel et al. Oct 2009 A1
20110050903 Vorobiev Mar 2011 A1
20110063425 Tieman Mar 2011 A1
20110113219 Golshan May 2011 A1
20110125457 Lee et al. May 2011 A1
20120045112 Lundblad et al. Feb 2012 A1
20120154591 Baur Jun 2012 A1
20120185131 Headley Jul 2012 A1
20120265416 Lu et al. Oct 2012 A1
20140200759 Lu et al. Jul 2014 A1
20140218506 Trombley et al. Aug 2014 A1
20160049020 Kuehnle et al. Feb 2016 A1
20160264046 Bochenek et al. Sep 2016 A1
20170174128 Hu et al. Jun 2017 A1
20170240204 Raad et al. Aug 2017 A1
Foreign Referenced Citations (15)
Number Date Country
59114139 Jul 1984 JP
6080953 May 1985 JP
6414700 Jan 1989 JP
4114587 Apr 1992 JP
05050883 Mar 1993 JP
6227318 Aug 1994 JP
0769125 Mar 1995 JP
07105496 Apr 1995 JP
2630604 Jul 1997 JP
200383742 Mar 2003 JP
2000044605 Aug 2000 WO
2004007232 Jan 2004 WO
2011014497 Feb 2011 WO
2012103193 Aug 2012 WO
2019202317 Oct 2019 WO
Non-Patent Literature Citations (9)
Entry
International Search Report and Written Opinion dated May 25, 2012 from corresponding PCT Application No. PCT/US2012/022517.
J. Borenstein et al., “Where am I? Sensors and Methods for Mobile Robot Positioning”, University of Michigan, Apr. 1996, pp. 2, 125-128.
Bow, Sing T., “Pattern Recognition and Image Preprocessing (Signal Processing and Communications)”, CRC Press, Jan. 15, 2002, pp. 557-559.
Vlacic et al., (Eds), “Intelligent Vehicle Technologies, Theory and Applications”, Society of Automotive Engineers Inc., edited by SAE International, 2001.
Van Leeuwen et al., “Real-Time Vehicle Tracking in Image Sequences”, IEEE, US, vol. 3, May 21, 2001, pp. 2049-2054, XP010547308.
Van Leeuwen et al., “Requirements for Motion Estimation in Image Sequences for Traffic Applications”, IEEE, US, vol. 1, May 24, 1999, pp. 145-150, XP010340272.
Van Leeuwen et al., “Motion Estimation with a Mobile Camera for Traffic Applications”, IEEE, US, vol. 1, Oct. 3, 2000, pp. 58-63.
Van Leeuwen et al., “Motion Interpretation for In-Car Vision Systems”, IEEE, US, vol. 1, Sep. 30, 2002, pp. 135-140.
Pratt, “Digital Image Processing, Passage—ED.3”, John Wiley & Sons, US, Jan. 1, 2001, pp. 657-659, XP002529771.
Related Publications (1)
Number Date Country
20210114657 A1 Apr 2021 US
Provisional Applications (2)
Number Date Country
61496090 Jun 2011 US
61436397 Jan 2011 US
Continuations (3)
Number Date Country
Parent 15959769 Apr 2018 US
Child 17247270 US
Parent 14803147 Jul 2015 US
Child 15959769 US
Parent 13979871 US
Child 14803147 US