Vehicular trailer backup assist system

Information

  • Patent Grant
  • Patent Number
    10,493,917
  • Date Filed
    Thursday, December 20, 2018
  • Date Issued
    Tuesday, December 3, 2019
Abstract
A vehicular trailer backup assist system is operable to steer a vehicle, with a trailer connected at the rear of the vehicle, during a reversing maneuver of the vehicle and trailer. The system electronically overlays a plurality of zones over displayed video images, with each individual zone of the overlayed plurality of zones being indicative of a respective individual region rearward of the vehicle and trailer. Responsive to a selected zone of travel for the trailer, the control controls steering of the vehicle to steer the vehicle to guide the trailer towards the respective individual region that is encompassed by the selected zone of travel during reversing of the vehicle and trailer. The driver of the vehicle manually adjusts steering of the vehicle to more accurately steer the vehicle towards a target location within the respective individual region that is encompassed by the selected zone of travel.
Description
FIELD OF THE INVENTION

The present invention relates generally to a vehicle vision system for a vehicle and, more particularly, to a vehicle vision system that utilizes one or more cameras at a vehicle.


BACKGROUND OF THE INVENTION

Use of imaging sensors in vehicle imaging systems is common and known. Examples of such known systems are described in U.S. Pat. Nos. 5,949,331; 5,670,935 and/or 5,550,677, which are hereby incorporated herein by reference in their entireties.


SUMMARY OF THE INVENTION

The present invention provides a camera for a vision system that utilizes one or more cameras or image sensors to capture image data of a scene exterior (such as forwardly) of a vehicle and provides a display of images indicative of or representative of the captured image data.


The vehicular vision system or trailer backup assist system of the present invention includes at least one camera disposed at a vehicle and having an exterior field of view rearward of the vehicle. The camera is operable to capture image data. An image processor is operable to process captured image data. Responsive to image processing of captured image data, the system is operable to determine a trailer angle of a trailer that is towed by the vehicle. The system is responsive to the determined trailer angle and to a user input (that indicates a desired or selected path of the trailer) to adjust or control the steering of the vehicle to guide the trailer along the selected path. The system steers the vehicle to guide or back up the trailer in a direction that generally corresponds to the selected path or direction, and that is within a selected zone or range of angles or directions rearward of the vehicle. The driver of the vehicle may manually adjust the steering of the vehicle to more accurately or precisely steer the vehicle in the desired direction, with the system steering the vehicle generally in the desired direction so that only minor or slight adjustments of vehicle steering may be needed or provided by the driver of the vehicle.
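
By way of illustration only, the following minimal sketch outlines that assist flow; all interfaces (camera, image_processor, display, steering, knob) are hypothetical placeholders, not the system's actual implementation:

```python
# Minimal sketch of the summarized assist flow, under assumed interfaces.

def trailer_backup_assist_cycle(camera, image_processor, display, steering, knob):
    frame = camera.capture()                              # rear image data
    trailer_angle = image_processor.trailer_angle(frame)  # via image processing
    display.show(frame)                                   # video for the driver
    selected_path = knob.selected_direction()             # driver-selected path
    # Coarse automatic steering toward the selected direction; the driver's own
    # steering wheel input (not modeled here) supplies the fine correction.
    steering.steer_toward(selected_path, trailer_angle)
```

The point of the sketch is the division of labor: the system supplies coarse steering toward the selected direction, while fine aiming remains with the driver.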


These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1A is a plan view of a vehicle with a vision system that incorporates a camera and trailer backup assist system in accordance with the present invention;



FIG. 1B is a plan view of a vehicle with another vision system that incorporates a camera and trailer backup assist system in accordance with the present invention;



FIG. 2 is a perspective view and illustration of different angles of the trailer relative to the vehicle that the system may determine during operation of the vehicle and trailer backup assist system of the present invention;



FIG. 3 is a perspective view of a trailer being pulled by a vehicle with a target at the trailer;



FIG. 4 is a view of the target as viewed by the camera when the trailer is turned relative to the vehicle;



FIGS. 5A-11A are perspective views of a trailer at different degrees of turning relative to the towing vehicle;



FIGS. 5B-11B are top plan views showing the target at the trailer of FIGS. 5A-11A, respectively;



FIG. 12 shows the plan views of FIGS. 6B, 11B, 8B, 9B, 7B and 10B side by side; and



FIG. 13 is a schematic showing operation of a search algorithm of the present invention.





DESCRIPTION OF THE PREFERRED EMBODIMENTS

A vehicle vision system and/or driver assist system and/or object detection system and/or alert system operates to capture images exterior of the vehicle and may process the captured image data to display images and to detect objects at or near the vehicle and in the predicted path of the vehicle, such as to assist a driver of the vehicle in maneuvering the vehicle in a rearward direction. The vision system includes an image processor or image processing system that is operable to receive image data from one or more cameras and provide an output to a display device for displaying images representative of the captured image data. Optionally, the vision system may provide a top down or bird's eye or surround view display and may provide a displayed image that is representative of the subject vehicle, and optionally with the displayed image being customized to at least partially correspond to the actual subject vehicle.


Referring now to the drawings and the illustrative embodiments depicted therein, a vehicle 10 includes an imaging system or vision system 12 that includes at least one exterior facing imaging sensor or camera, such as a rearward facing imaging sensor or camera 14a (and the system may optionally include multiple exterior facing imaging sensors or cameras, such as a forwardly facing camera 14b at the front (or at the windshield) of the vehicle, and a sidewardly/rearwardly facing camera 14c, 14d at respective sides of the vehicle), which captures images exterior of the vehicle, with the camera having a lens for focusing images at or onto an imaging array or imaging plane or imager of the camera (FIG. 1A). The vision system 12 includes a control or electronic control unit (ECU) or processor 18 that is operable to process image data captured by the cameras and may provide displayed images at a display device 16 for viewing by the driver of the vehicle (although shown in FIG. 1A as being part of or incorporated in or at an interior rearview mirror assembly 20 of the vehicle, the control 18 and/or the display device may be disposed elsewhere at or in the vehicle). For example, and such as shown in FIG. 1B, the control may be close to or attached at or incorporated in a camera, such as the rear viewing camera 14a. The data transfer or signal communication from the camera to the ECU may comprise any suitable data or communication link, such as a vehicle network bus or the like of the equipped vehicle.


Driver assist systems that assist the driver when pulling or pushing a trailer, without using a specific trailer angle sensor, are described in International Publication No. WO 2012/103193, published Aug. 2, 2012, which is hereby incorporated herein by reference in its entirety. Such trailer angle sensing systems may detect the trailer angle (relative to the vehicle) by determining the location and/or angle of targets on the trailer via image processing of image data captured by the vision system's rearward viewing camera or cameras. In some systems, when attaching a trailer to the vehicle, the driver has to enter its properties so that the trailer driving aid system can properly calculate the driving aid overlays when backing up with the trailer attached. Such measurements and entries would need to be done for each different trailer that is connected to the vehicle.


The present invention provides a trailer backup assist system that assists the driver of the vehicle 10 when reversing the vehicle with a trailer 22 connected thereto (FIGS. 1A, 1B and 2). The system is responsive to a user input 24 that, responsive to adjustment or selection or setting of the input by the user or driver of the vehicle, sets or establishes a rearward direction or path for the trailer to travel along (such as via a steering knob or dial that “steers” the trailer along its rearward path). The system determines the angle of the trailer relative to the vehicle (such as via image processing of image data captured by the rear camera 14a of the vehicle, which may include determination of a location of a target 26 on the trailer) and, responsive to the determined angle and responsive to the user input (which may set or establish a selected location or zone of the target 26 on the trailer to set a selected or desired direction of travel of the trailer), the system adjusts or controls the steering of the vehicle to generally follow the selected path of the trailer to assist the driver in reversing the vehicle and trailer.


For example, if zone 3 were selected (in FIG. 2), the system would steer the vehicle and trailer (during a reversing maneuver) to guide the trailer towards a region (such as towards the region or area occupied by the vehicle shown in FIG. 2) that is encompassed by the selected zone, and may adjust the vehicle steering to guide and maintain the trailer travel direction towards the determined region or regions encompassed by the selected zone. The driver then may only have to slightly adjust the steering of the vehicle as the vehicle and trailer are reversing to more accurately steer the trailer to the desired location. The system may utilize aspects of the systems described in U.S. patent application Ser. No. 14/102,981, filed Dec. 11, 2013, now U.S. Pat. No. 9,558,409; Ser. No. 14/036,723, filed Sep. 25, 2013, now U.S. Pat. No. 9,446,713, and/or Ser. No. 13/979,871, filed Jul. 16, 2013, now U.S. Pat. No. 9,085,261, which are hereby incorporated herein by reference in their entireties.


The present invention thus may simplify customer interaction and the human machine interface (HMI) during setup of the system, with the goal of eliminating user measurements during system and/or trailer setup. The system is operable to modify the trailer backup assist (TBA) system steering algorithm in order to accommodate detection or measurement error from the trailer angle detection (TAD) sensor or system. The system may provide steering control of the vehicle that results in the trailer being pushed or driven generally in the selected or desired rearward direction, whereby the driver may use the steering wheel of the vehicle or the user input to further adjust or refine the steering direction to further control or guide the trailer in a more precise desired or selected direction. The system of the present invention thus provides a simplified steering algorithm for controlling the vehicle steering, and does not require detailed measurements during setup of the system on a vehicle with a trailer connected thereto.


The system of the present invention changes the assist system feature approach from absolute trailer steering (such as an attempt to drive any trailer on a defined trajectory or curve radius via the knob, utilizing measurements of the vehicle and trailer in an attempt to more accurately or precisely know where the trailer will be for any given vehicle steering adjustment) to relative trailer steering (where the system may steer the vehicle to generally position or guide the trailer towards a desired or selected direction of travel). This moves more of the control to the driver of the vehicle, who may provide a finer adjustment of the steering of the vehicle as the vehicle and trailer are reversed.


The system reduces or eliminates user measurements for TAD, and may provide a display box as an overlay on the display screen (in the vehicle) and may instruct the user to move the trailer to place the trailer target inside the box overlay. The TAD sensor then may output the zone in which the trailer is located. For example, and such as shown in FIG. 2, the trailer target (such as at or on the tongue of the trailer) is located at one of the center regions or zones of the overlay. The number of zones may vary depending on the particular application of the system. For example, the overlay may provide 20 zones over a roughly 180 degree wide angle field of view, such that each zone sweeps or encompasses about nine degrees or thereabouts.
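
As a rough illustration of such a zone layout (assuming, per the example above, 20 zones spanning a 180 degree field of view; the numbers and the centering convention are illustrative assumptions only), the mapping from a trailer direction to a zone index might look like this:

```python
NUM_ZONES = 20
FOV_DEG = 180.0
ZONE_WIDTH_DEG = FOV_DEG / NUM_ZONES  # about nine degrees per zone

def zone_for_direction(trailer_dir_deg):
    """Map a trailer direction (-90..+90 deg, 0 = straight behind the vehicle)
    to a zone index 0..19. The zero-offset calibration described below may
    shift the boundaries so that the zones are centered around zero degrees."""
    clamped = max(-FOV_DEG / 2, min(FOV_DEG / 2, trailer_dir_deg))
    idx = int((clamped + FOV_DEG / 2) // ZONE_WIDTH_DEG)
    return min(idx, NUM_ZONES - 1)  # keep +90 deg inside the last zone
```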


The zone indicates in which general direction (or angle or range of angles relative to the vehicle) the trailer is moving. Because the user measurements are eliminated, a precise angle measurement is not needed; the TAD nevertheless allows detection of the direction in which the trailer is moving. The detected or determined or reported zone may not be the same for two different trailer setups even if the trailers have the same trailer angle, but since the steering is relative, the impact is minimized.


The trailer backup assist system user input range or steering knob range may be divided into zones. The requested zone on the knob is mapped to the reported zones of the TAD sensor. The steering algorithm attempts to steer the trailer so that the selected angle of the knob matches the zone of the TAD. The TAD zone at zero degrees can still be identified using the zero offset calibration, and the zones can be centered around zero degrees (or straight behind the vehicle).
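
A minimal sketch of that zone-to-zone matching, assuming zone indices as in the earlier sketch and a simple proportional rule (the gain and sign conventions are illustrative assumptions, not the actual steering algorithm):

```python
def steering_command(requested_knob_zone, reported_tad_zone, k_p=0.4):
    """Relative steering on zone indices: steer so that the zone reported by
    the TAD sensor approaches the zone requested on the knob. Both indices are
    assumed already referenced to the zero-offset-calibrated center zone
    (straight behind the vehicle)."""
    error = requested_knob_zone - reported_tad_zone
    # Positive command turns the steered wheels so that, while reversing, the
    # trailer swings toward higher zone indices (sign convention illustrative).
    return k_p * error
```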


Due to variations in vehicles and steering systems and trailers, the same knob angle may result in a different curve radius for different trailers. This can be compared with different vehicles: different vehicles drive on different curve radii for the same steering wheel angle, depending on the vehicle wheel base and steering rack parameters and the like.
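
This dependence can be made concrete with the standard single-track ("bicycle") model, a textbook approximation that is not taken from the patent; the vehicle parameters below are made up for illustration:

```python
import math

def turn_radius_m(wheelbase_m, steering_wheel_deg, steering_ratio):
    """Road-wheel angle = steering-wheel angle / steering ratio; the turn
    radius is then approximately wheelbase / tan(road-wheel angle)."""
    road_wheel_rad = math.radians(steering_wheel_deg / steering_ratio)
    return wheelbase_m / math.tan(road_wheel_rad)

# The same 90 degree steering-wheel input on two different vehicles:
print(turn_radius_m(2.7, 90.0, 16.0))  # ~27 m for a short-wheelbase car
print(turn_radius_m(3.7, 90.0, 20.0))  # ~47 m for a long-wheelbase truck
```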


The trailer backup assist (TBA) system may not be able to steer the trailer at exactly zero degrees. The driver may have to correct the steering if the trailer drifts slightly off to one side. This may be compared to driving a vehicle on a road that leans slightly to one side: the steering wheel has to be held at an angle to drive straight. Thus, the driver input or steering adjustment is small and is no different from the small adjustments that drivers typically have to make during normal driving conditions.


Thus, the present invention will decrease setup complexity for trailer backup assist systems, and will allow for less accuracy from the TAD sensor, but may receive more control input from the driver. The system of the present invention thus actively controls the vehicle and trailer during a reversing maneuver. The rear camera and image processor determine the target and the angle of the trailer relative to the tow vehicle, and an output is generated indicative of that angle. The system of the present invention adjusts the vehicle steering for the particular trailer and vehicle configuration and for the selected direction of travel of the vehicle and trailer. Because the system does not require complicated measurements at the time of setup, the system is simplified and provides a simplified steering input to the vehicle. The driver selects the segment or zone that corresponds to the desired direction of travel of the vehicle and trailer, and the system generally steers the trailer toward that direction. During the reversing maneuver, the driver can then override or adjust or slightly adjust or refine the vehicle steering (such as via the vehicle steering wheel or the user input or dial) to steer the trailer more accurately in the selected direction or towards the target location (such as a selected boat launch ramp or parking space or garage stall or loading/unloading station or the like). The system operates to guide or steer the trailer close to the desired direction of travel (within the selected zone or region or general direction or range of angles), and the driver may then adjust the steering accordingly as the trailer is moved rearwardly in that general direction. The steering algorithm of the control is thus simplified, and the system provides a trailer backup assist that can be implemented on a wide range of vehicles and trailers without complex setup measurements and processes.


Both when using a TBA system as described above with decreased setup complexity, and when using a target-based TAD algorithm without it, finding the trailer target by image processing (of image data captured by a rear camera of the vehicle) in the rear image is important. This is not always a trivial problem. Typically, these targets should be as small as about 2 square inches. Due to the wide angle lens, and because the rear camera is typically equipped with a low resolution imager (such as a 1.2 MP camera with an about 180 degree fisheye lens), the checkerboard target 26 such as shown in FIGS. 3-13 is captured by just about 10×20 pixels. Typically, the rear vehicle scene is mostly illuminated by scattered light or stray light, or is brindled (such as seen in exemplary FIGS. 11A and 11B), or the background itself is brindled, such as at times when the ground is covered with leaves. For having a vehicle trailer angle detection target 26 with minimal auto correlation and rotation invariance, a Baker target may be used, such as described in U.S. patent application Ser. No. 14/102,981, filed Dec. 11, 2013, now U.S. Pat. No. 9,558,409, which is hereby incorporated herein by reference in its entirety. Due to the limited target size, the resolution may be too low for using a Baker target. Because of that, less sophisticated targets with conventional patterns, such as a black and white chessboard pattern, may find use, such as shown in the exemplary cases in FIGS. 3-13 and especially in FIG. 4. When using cameras with higher resolution or bigger targets, the use of Baker targets instead of chessboard targets may be preferred.


Algorithms for finding target patterns within camera views, such as rear camera fisheye views or demorphed rear camera views, typically check the whole image, or check the image portion that is below the horizon, or may check a limited region of interest, for finding the target pattern. Since the trailer 22 turns around a turning point at the head of the hitch 28, the target turns with the trailer. Because the viewing angle from the camera to the target is comparably flat, the checkerboard squares appear as rhombi in the two dimensional (2D) camera image, due to the perspective, when the target is turned relative to the camera view (see FIG. 4). As can be seen in FIG. 2, a trailer target fixedly attached to a trailer turns radially, relative to the vehicle, around the turning point of the hitch head. When the trailer-vehicle train stands on a crowned (convex) surface, the target turns a little downwardly within the camera view; when the trailer-vehicle train stands in a dip, the target turns a little upwardly; but the area swept by the target stays substantially curved (an arch-shaped area).


As an aspect of the present invention, the system may execute a few transformations of just the area in which the trailer target can possibly be located (the arch-shaped area), plus a small tolerance area. The transformations may be of a nature that the substantially curved (arch-shaped) area is bent (transformed) into a strip-shaped area, such as via a 'parallel projection' transformation. That kind of view transformation is different from known-art view point transformations, such as top view (or top down or bird's eye or surround view) transformations. Straight lines become curved and curved lines may straighten: compare the two straight (real scene) lane markings 40a and 40b in the demorphed fisheye view of FIG. 5A to the parallel projection transformed view of the target turning curve (arch-shaped) area in FIG. 5B. FIGS. 6A-11A show more scenes with the trailer hooked onto the hitch at different turning angles. FIGS. 6B-11B show the corresponding transformed views of the target area according to the invention. In FIG. 12, FIGS. 6B-11B are arranged side by side according to the turning angle, starting from the left (FIG. 6B), with the trailer turned to the right, to the far right (FIG. 10B), with the trailer turned to the left (relative to the vehicle's middle axis (x-dimension)).
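
One way to realize such an arch-to-strip transformation is a polar resampling around the hitch point. The following sketch assumes a calibrated (demorphed) image and illustrative geometry; it is not the patent's exact transformation:

```python
import numpy as np

def unwrap_target_arch(image, hitch_xy, r_inner, r_outer,
                       angle_range_deg=(-90.0, 90.0), n_angles=720, n_radii=32):
    """Sample the arch-shaped band in which the target can travel around the
    hitch point and lay it out as a straight strip. Columns of the strip then
    correspond to trailer turning angles, so the target keeps essentially one
    orientation and only shifts sideways as the trailer turns."""
    h, w = image.shape[:2]
    cx, cy = hitch_xy
    angles = np.deg2rad(np.linspace(angle_range_deg[0], angle_range_deg[1], n_angles))
    radii = np.linspace(r_inner, r_outer, n_radii)
    # Polar sampling grid centered on the hitch point (nearest-neighbor lookup).
    xs = np.clip((cx + radii[:, None] * np.sin(angles[None, :])).round().astype(int), 0, w - 1)
    ys = np.clip((cy + radii[:, None] * np.cos(angles[None, :])).round().astype(int), 0, h - 1)
    return image[ys, xs]  # strip: rows = radius, columns = trailer angle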


As can be seen with reference to FIG. 12, the benefit of the suggested view transformation and pattern search area becomes clear. The pattern is always oriented in the same way, but is shifted sideways within the view area. By using such transformed views, a pattern search and compare algorithm can easily find the target's position within the view, which can be directly related to the trailer's turning angle.


A typical pattern search method is convolution. A 1×10 convolution may find use here. The search may run over the search area from left to right in lines, such as shown in the example in FIG. 13 (at I). There, the convolution search may run through line a, then through line b, and then through line c. FIG. 13 (at II) shows a possible transformed input image with a target visible in the left third. A position at which the convolution search detects a match with the reference target is marked with a lightning shape in line b of FIG. 13 (at III). Optionally, more sophisticated search strategies may take previously found positions into account: the search may begin at, and hover around, the position at which an earlier search found the reference target match, for a fast and low performance consuming search.
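
A minimal sketch of such a line-wise search over the unwrapped strip, with the optional seeding around a previously found position; the normalized-correlation scoring is an assumed stand-in for whatever matching the system actually uses:

```python
import numpy as np

def find_target_column(strip_row, reference, prev_col=None, window=40):
    """Search one line of the strip for the best match with a short 1-D
    reference pattern (e.g. a 1x10 slice of the target). If prev_col is given,
    only a window around the earlier match is searched, which speeds up the
    search as suggested above."""
    n = len(strip_row) - len(reference) + 1
    lo, hi = (0, n) if prev_col is None else (max(0, prev_col - window),
                                              min(n, prev_col + window))
    ref = (reference - reference.mean()) / (reference.std() + 1e-9)
    best_col, best_score = lo, -np.inf
    for c in range(lo, hi):
        seg = strip_row[c:c + len(reference)]
        seg = (seg - seg.mean()) / (seg.std() + 1e-9)
        score = float(np.dot(seg, ref))  # normalized cross-correlation
        if score > best_score:
            best_col, best_score = c, score
    return best_col, best_score  # column maps directly to a trailer angle
```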


The camera is disposed at a rear portion of the vehicle and has a field of view generally rearward of the vehicle. The camera may capture image data for use by the trailer backup assist system of the present invention, and may also capture image data for use by other vehicle systems, such as a rear backup assist system or a surround view system or the like.


The camera or sensor may comprise any suitable camera or sensor. Optionally, the camera may comprise a “smart camera” that includes the imaging sensor array and associated circuitry and image processing circuitry and electrical connectors and the like as part of a camera module, such as by utilizing aspects of the vision systems described in International Publication Nos. WO 2013/081984 and/or WO 2013/081985, which are hereby incorporated herein by reference in their entireties.


The system includes an image processor operable to process image data captured by the camera or cameras, such as for detecting objects or other vehicles or pedestrians or the like in the field of view of one or more of the cameras. For example, the image processor may comprise an EYEQ2 or EYEQ3 image processing chip available from Mobileye Vision Technologies Ltd. of Jerusalem, Israel, and may include object detection software (such as the types described in U.S. Pat. Nos. 7,855,755; 7,720,580 and/or 7,038,577, which are hereby incorporated herein by reference in their entireties), and may analyze image data to detect vehicles and/or other objects. Responsive to such image processing, and when an object or other vehicle is detected, the system may generate an alert to the driver of the vehicle and/or may generate an overlay at the displayed image to highlight or enhance display of the detected object or vehicle, in order to enhance the driver's awareness of the detected object or vehicle or hazardous condition during a driving maneuver of the equipped vehicle.


The vehicle may include any type of sensor or sensors, such as imaging sensors or radar sensors or lidar sensors or ladar sensors or ultrasonic sensors or the like. The imaging sensor or camera may capture image data for image processing and may comprise any suitable camera or sensing device, such as, for example, a two dimensional array of a plurality of photosensor elements arranged in at least 640 columns and 480 rows (at least a 640×480 imaging array, such as a megapixel imaging array or the like), with a respective lens focusing images onto respective portions of the array. The photosensor array may comprise a plurality of photosensor elements arranged in a photosensor array having rows and columns. Preferably, the imaging array has at least 300,000 photosensor elements or pixels, more preferably at least 500,000 photosensor elements or pixels and more preferably at least 1 million photosensor elements or pixels. The imaging array may capture color image data, such as via spectral filtering at the array, such as via an RGB (red, green and blue) filter or via a red/red complement filter or such as via an RCC (red, clear, clear) filter or the like. The logic and control circuit of the imaging sensor may function in any known manner, and the image processing and algorithmic processing may comprise any suitable means for processing the images and/or image data.


For example, the vision system and/or processing and/or camera and/or circuitry may utilize aspects described in U.S. Pat. Nos. 7,005,974; 5,760,962; 5,877,897; 5,796,094; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978; 7,859,565; 5,550,677; 5,670,935; 6,636,258; 7,145,519; 7,161,616; 7,230,640; 7,248,283; 7,295,229; 7,301,466; 7,592,928; 7,881,496; 7,720,580; 7,038,577; 6,882,287; 5,929,786 and/or 5,786,772, and/or International Publication Nos. WO 2011/028686; WO 2010/099416; WO 2012/061567; WO 2012/068331; WO 2012/075250; WO 2012/103193; WO 2012/0116043; WO 2012/0145313; WO 2012/0145501; WO 2012/145818; WO 2012/145822; WO 2012/158167; WO 2012/075250; WO 2012/0116043; WO 2012/0145501; WO 2012/154919; WO 2013/019707; WO 2013/016409; WO 2013/019795; WO 2013/067083; WO 2013/070539; WO 2013/043661; WO 2013/048994; WO 2013/063014, WO 2013/081984; WO 2013/081985; WO 2013/074604; WO 2013/086249; WO 2013/103548; WO 2013/109869; WO 2013/123161; WO 2013/126715; WO 2013/043661 and/or WO 2013/158592 and/or PCT Application No. PCT/US2014/042229, filed Jun. 13, 2014, and published on Dec. 24, 2014 as International Publication No. WO 2014/204794, and/or U.S. patent application Ser. No. 14/573,307, filed Dec. 17, 2014, and published on Jun. 25, 2015 as U.S. Patent Publication No. US-2015/0179074; Ser. No. 14/573,306, filed Dec. 17, 2014, now U.S. Pat. No. 10,095,935; Ser. No. 14/572,018, filed Dec. 16, 2014, and published on Jun. 25, 2015 as U.S. Patent Publication No. 2015/0176596; Ser. No. 14/572,017, filed Dec. 16, 2014, and published Jun. 25, 2015 as U.S. Patent Publication No. 2015/0175072; Ser. No. 14/568,177, filed Dec. 12, 2014, now U.S. Pat. No. 9,988,047; Ser. No. 14/561,794, filed Dec. 5, 2014, now U.S. Pat. No. 9,499,139; Ser. No. 14/558,981, filed Dec. 3, 2014, and published Jun. 4, 2015 as U.S. Patent Publication No. 2015/0156383; Ser. No. 14/535,739, filed Nov. 7, 2014, now U.S. Pat. No. 9,451,138; Ser. No. 14/524,203, filed Oct. 27, 2014, now U.S. Pat. No. 9,457,717; Ser. No. 14/519,469, filed Oct. 21, 2014, now U.S. Pat. No. 9,881,220; Ser. No. 14/391,841, filed Oct. 10, 2014, now U.S. Pat. No. 9,751,465; Ser. No. 14/489,659, filed Sep. 18, 2014, and published on Apr. 2, 2015 as U.S. Patent Publication No. 2015/0092042; Ser. No. 14/446,099, filed Aug. 22, 2014, now U.S. Pat. No. 9,343,245; Ser. No. 14/377,940, filed Aug. 11, 2014, and published Jan. 22, 2015 as U.S. Patent Publication No. US-2015/0022665; Ser. No. 14/377,939, filed Aug. 11, 2014, now U.S. Pat. No. 9,871,971; Ser. No. 14/456,164, filed Aug. 11, 2014, now U.S. Pat. No. 9,619,716; Ser. No. 14/456,163, filed Aug. 11, 2014, and published on Feb. 12, 2015 as U.S. Patent Publication No. 2015/0042807; Ser. No. 14/456,162, filed Aug. 11, 2014, and published on Feb. 12, 2015 as U.S. Patent Publication No. US-2015/0042806; Ser. No. 14/373,501, filed Jul. 21, 2014, and published on Jan. 29, 2015 as U.S. Patent Publication No. US-2015/0028781; Ser. No. 14/372,524, filed Jul. 16, 2014, and published on Jan. 22, 2015 as U.S. Patent Publication No. US-2015/0022664; Ser. No. 14/324,696, filed Jul. 7, 2014, now U.S. Pat. No. 9,701,258; Ser. No. 14/369,229, filed Jun. 27, 2014, now U.S. Pat. No. 9,491,342; Ser. No. 14/316,940, filed Jun. 27, 2014, and published on Jan. 8, 2015 as U.S. Patent Publication No. 2015/0009010; Ser. No. 14/316,939, filed Jun. 27, 2014, and published on Jan. 1, 2015 as U.S. 
Patent Publication No. 2015/0002670; Ser. No. 14/303,696, filed Jun. 13, 2014, now U.S. Pat. No. 9,609,757; Ser. No. 14/303,695, filed Jun. 13, 2014, and published Dec. 25, 2014 as U.S. Patent Publication No. US-2014-0375476; Ser. No. 14/303,694, filed Jun. 13, 2014, now U.S. Pat. No. 9,260,095; Ser. No. 14/303,693, filed Jun. 13, 2014, and published Dec. 18, 2014 as U.S. Patent Publication No. US-2014/0368654; Ser. No. 14/297,663, filed Jun. 6, 2014, and published Dec. 11, 2014 as U.S. Patent Publication No. US-2014/0362209; Ser. No. 14/290,028, filed May 29, 2014, now U.S. Pat. No. 9,800,794; Ser. No. 14/290,026, filed May 29, 2014, now U.S. Pat. No. 9,476,398; Ser. No. 14/282,029, filed May 20, 2014, now U.S. Pat. No. 9,205,776; Ser. No. 14/282,028, filed May 20, 2014, now U.S. Pat. No. 9,563,951; Ser. No. 14/358,232, filed May 15, 2014, now U.S. Pat. No. 9,491,451; Ser. No. 14/272,834, filed May 8, 2014, now U.S. Pat. No. 9,280,202; Ser. No. 14/356,330, filed May 5, 2014, now U.S. Pat. No. 9,604,581; Ser. No. 14/269,788, filed May 5, 2014, now U.S. Pat. No. 9,508,014; Ser. No. 14/268,169, filed May 2, 2014, and published Nov. 6, 2014 as U.S. Patent Publication No. US-2014/0327772; Ser. No. 14/264,443, filed Apr. 29, 2014, and published Oct. 30, 2014 as U.S. Patent Publication No. US-2014/0320636; Ser. No. 14/354,675, filed Apr. 28, 2014, now U.S. Pat. No. 9,580,013; Ser. No. 14/248,602, filed Apr. 9, 2014, now U.S. Pat. No. 9,327,693; Ser. No. 14/242,038, filed Apr. 1, 2014, now U.S. Pat. No. 9,487,159; Ser. No. 14/229,061, filed Mar. 28, 2014, now U.S. Pat. No. 10,027,930; Ser. No. 14/343,937, filed Mar. 10, 2014, now U.S. Pat. No. 9,681,062; Ser. No. 14/343,936, filed Mar. 10, 2014, and published Aug. 7, 2014 as U.S. Patent Publication No. US-2014/0218535; Ser. No. 14/195,135, filed Mar. 3, 2014, now U.S. Pat. No. 9,688,200; Ser. No. 14/195,136, filed Mar. 3, 2014, now U.S. Pat. No. 10,057,544; Ser. No. 14/191,512, filed Feb. 27, 2014, and published on Sep. 4, 2014 as U.S. Patent Publication No. US-2014/0247352; Ser. No. 14/183,613, filed Feb. 19, 2014, now U.S. Pat. No. 9,445,057; Ser. No. 14/169,329, filed Jan. 31, 2014, and published Aug. 7, 2014 as U.S. Patent Publication No. US-2014/0218529; Ser. No. 14/169,328, filed Jan. 31, 2014, now U.S. Pat. No. 9,092,986; Ser. No. 14/163,325, filed Jan. 24, 2014, and published Jul. 31, 2014 as U.S. Patent Publication No. US-2014/0211009; Ser. No. 14/159,772, filed Jan. 21, 2014, now U.S. Pat. No. 9,068,390; Ser. No. 14/107,624, filed Dec. 16, 2013, now U.S. Pat. No. 9,140,789; Ser. No. 14/102,981, filed Dec. 11, 2013, now U.S. Pat. No. 9,558,409; Ser. No. 14/102,980, filed Dec. 11, 2013, and published Jun. 19, 2014 as U.S. Patent Publication No. US-2014/0168437; Ser. No. 14/098,817, filed Dec. 6, 2013, and published on Jun. 19, 2014 as U.S. Patent Publication No. US-2014/0168415; Ser. No. 14/097,581, filed Dec. 5, 2013, now U.S. Pat. No. 9,481,301; Ser. No. 14/093,981, filed Dec. 2, 2013, now U.S. Pat. No. 8,917,169; Ser. No. 14/093,980, filed Dec. 2, 2013, now U.S. Pat. No. 10,025,994; Ser. No. 14/082,573, filed Nov. 18, 2013, now U.S. Pat. No. 9,743,002; Ser. No. 14/082,574, filed Nov. 18, 2013, now U.S. Pat. No. 9,307,640; Ser. No. 14/082,575, filed Nov. 18, 2013, now U.S. Pat. No. 9,090,234; Ser. No. 14/082,577, filed Nov. 18, 2013, now U.S. Pat. No. 8,818,042; Ser. No. 14/071,086, filed Nov. 4, 2013, now U.S. Pat. No. 8,886,401; Ser. No. 14/076,524, filed Nov. 11, 2013, now U.S. Pat. No. 9,077,962; Ser. No. 14/052,945, filed Oct. 
14, 2013, now U.S. Pat. No. 9,707,896; Ser. No. 14/046,174, filed Oct. 4, 2013, now U.S. Pat. No. 9,723,272; Ser. No. 14/036,723, filed Sep. 25, 2013, now U.S. Pat. No. 9,446,713; Ser. No. 14/016,790, filed Sep. 3, 2013, now U.S. Pat. No. 9,761,142; Ser. No. 14/001,272, filed Aug. 23, 2013, now U.S. Pat. No. 9,233,641; Ser. No. 13/970,868, filed Aug. 20, 2013, now U.S. Pat. No. 9,365,162; Ser. No. 13/964,134, filed Aug. 12, 2013, now U.S. Pat. No. 9,340,227; Ser. No. 13/942,758, filed Jul. 16, 2013, and published on Jan. 23, 2014 as U.S. Patent Publication No. US-2014/0025240; Ser. No. 13/942,753, filed Jul. 16, 2013, and published Jan. 30, 2014 as U.S. Patent Publication No. 2014/0028852; Ser. No. 13/927,680, filed Jun. 26, 2013, and published Jan. 2, 2014 as U.S. Patent Publication No. US-2014/0005907; Ser. No. 13/916,051, filed Jun. 12, 2013, now U.S. Pat. No. 9,077,098; Ser. No. 13/894,870, filed May 15, 2013, now U.S. Pat. No. 10,089,537; Ser. No. 13/887,724, filed May 6, 2013, now U.S. Pat. No. 9,670,895; Ser. No. 13/852,190, filed Mar. 28, 2013, and published on Aug. 29, 2013 as U.S. Patent Publication No. 2013/0222593; Ser. No. 13/851,378, filed Mar. 27, 2013, now U.S. Pat. No. 9,319,637; Ser. No. 13/848,796, filed Mar. 22, 2012, and published Oct. 24, 2013 as U.S. Patent Publication No. US-2013/0278769; Ser. No. 13/847,815, filed Mar. 20, 2013, and published Oct. 31, 2013 as U.S. Patent Publication No. 2013/0286193; Ser. No. 13/800,697, filed Mar. 13, 2013, and published Oct. 3, 2013 as U.S. Patent Publication No. 2013/0258077; Ser. No. 13/785,099, filed Mar. 5, 2013, now U.S. Pat. No. 9,565,342; Ser. No. 13/779,881, filed Feb. 28, 2013, now U.S. Pat. No. 8,694,224; Ser. No. 13/774,317, filed Feb. 22, 2013, now U.S. Pat. No. 9,269,263; Ser. No. 13/774,315, filed Feb. 22, 2013, and published Aug. 22, 2013 as U.S. Patent Publication No. 2013/0215271; Ser. No. 13/681,963, filed Nov. 20, 2012, now U.S. Pat. No. 9,264,673; Ser. No. 13/660,306, filed Oct. 25, 2012, now U.S. Pat. No. 9,146,898; Ser. No. 13/653,577, filed Oct. 17, 2012, now U.S. Pat. No. 9,174,574; and/or Ser. No. 13/534,657, filed Jun. 27, 2012, and published on Jan. 3, 2013 as U.S. Patent Publication No. US-2013/0002873, which are all hereby incorporated herein by reference in their entireties. The system may communicate with other communication systems via any suitable means, such as by utilizing aspects of the systems described in International Publication Nos. WO/2010/144900; WO 2013/043661 and/or WO 2013/081985, and/or U.S. patent application Ser. No. 13/202,005, filed Aug. 17, 2011, now U.S. Pat. No. 9,126,525, which are hereby incorporated herein by reference in their entireties.


The imaging device and control and image processor and any associated illumination source, if applicable, may comprise any suitable components, and may utilize aspects of the cameras and vision systems described in U.S. Pat. Nos. 5,550,677; 5,877,897; 6,498,620; 5,670,935; 5,796,094; 6,396,397; 6,806,452; 6,690,268; 7,005,974; 7,937,667; 7,123,168; 7,004,606; 6,946,978; 7,038,577; 6,353,392; 6,320,176; 6,313,454 and 6,824,281, and/or International Publication Nos. WO 2010/099416; WO 2011/028686 and/or WO 2013/016409, and/or U.S. Pat. Publication No. US 2010-0020170, and/or U.S. patent application Ser. No. 13/534,657, filed Jun. 27, 2012, and published on Jan. 3, 2013 as U.S. Patent Publication No. US-2013/0002873, which are all hereby incorporated herein by reference in their entireties. The camera or cameras may comprise any suitable cameras or imaging sensors or camera modules, and may utilize aspects of the cameras or sensors described in U.S. patent application Ser. No. 13/260,400, filed Sep. 26, 2011, now U.S. Pat. No. 8,542,451, and/or U.S. Publication No. US-2009-0244361, and/or U.S. Pat. Nos. 7,965,336 and/or 7,480,149, which are hereby incorporated herein by reference in their entireties. The imaging array sensor may comprise any suitable sensor, and may utilize various imaging sensors or imaging array sensors or cameras or the like, such as a CMOS imaging array sensor, a CCD sensor or other sensors or the like, such as the types described in U.S. Pat. Nos. 5,550,677; 5,670,935; 5,760,962; 5,715,093; 5,877,897; 6,922,292; 6,757,109; 6,717,610; 6,590,719; 6,201,642; 6,498,620; 5,796,094; 6,097,023; 6,320,176; 6,559,435; 6,831,261; 6,806,452; 6,396,397; 6,822,563; 6,946,978; 7,339,149; 7,038,577; 7,004,606; 7,720,580 and/or 7,965,336, and/or International Publication Nos. WO/2009/036176 and/or WO/2009/046268, which are all hereby incorporated herein by reference in their entireties.


The camera module and circuit chip or board and imaging sensor may be implemented and operated in connection with various vehicular vision-based systems, and/or may be operable utilizing the principles of such other vehicular systems, such as a vehicle headlamp control system, such as the type disclosed in U.S. Pat. Nos. 5,796,094; 6,097,023; 6,320,176; 6,559,435; 6,831,261; 7,004,606; 7,339,149 and/or 7,526,103, which are all hereby incorporated herein by reference in their entireties, a rain sensor, such as the types disclosed in commonly assigned U.S. Pat. Nos. 6,353,392; 6,313,454; 6,320,176 and/or 7,480,149, which are hereby incorporated herein by reference in their entireties, a vehicle vision system, such as a forwardly, sidewardly or rearwardly directed vehicle vision system utilizing principles disclosed in U.S. Pat. Nos. 5,550,677; 5,670,935; 5,760,962; 5,877,897; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978 and/or 7,859,565, which are all hereby incorporated herein by reference in their entireties, a trailer hitching aid or tow check system, such as the type disclosed in U.S. Pat. No. 7,005,974, which is hereby incorporated herein by reference in its entirety, a reverse or sideward imaging system, such as for a lane change assistance system or lane departure warning system or for a blind spot or object detection system, such as imaging or detection systems of the types disclosed in U.S. Pat. Nos. 7,881,496; 7,720,580; 7,038,577; 5,929,786 and/or 5,786,772, and/or U.S. provisional applications, Ser. No. 60/628,709, filed Nov. 17, 2004; Ser. No. 60/614,644, filed Sep. 30, 2004; Ser. No. 60/618,686, filed Oct. 14, 2004; Ser. No. 60/638,687, filed Dec. 23, 2004, which are hereby incorporated herein by reference in their entireties, a video device for internal cabin surveillance and/or video telephone function, such as disclosed in U.S. Pat. Nos. 5,760,962; 5,877,897; 6,690,268 and/or 7,370,983, and/or U.S. Publication No. US-2006-0050018, which are hereby incorporated herein by reference in their entireties, a traffic sign recognition system, a system for determining a distance to a leading or trailing vehicle or object, such as a system utilizing the principles disclosed in U.S. Pat. Nos. 6,396,397 and/or 7,123,168, which are hereby incorporated herein by reference in their entireties, and/or the like.


Optionally, the circuit board or chip may include circuitry for the imaging array sensor and/or other electronic accessories or features, such as by utilizing compass-on-a-chip or EC driver-on-a-chip technology and aspects such as described in U.S. Pat. Nos. 7,255,451 and/or 7,480,149, and/or U.S. Publication No. US-2006-0061008, and/or U.S. patent application Ser. No. 12/578,732, filed Oct. 14, 2009, now U.S. Pat. No. 9,487,144, which are hereby incorporated herein by reference in their entireties.


Optionally, the vision system may include a display for displaying images captured by one or more of the imaging sensors for viewing by the driver of the vehicle while the driver is normally operating the vehicle. Optionally, for example, the vision system may include a video display device disposed at or in the interior rearview mirror assembly of the vehicle, such as by utilizing aspects of the video mirror display systems described in U.S. Pat. No. 6,690,268 and/or U.S. patent application Ser. No. 13/333,337, filed Dec. 21, 2011, now U.S. Pat. No. 9,264,672, which are hereby incorporated herein by reference in their entireties. The video mirror display may comprise any suitable devices and systems and optionally may utilize aspects of the compass display systems described in U.S. Pat. Nos. 7,370,983; 7,329,013; 7,308,341; 7,289,037; 7,249,860; 7,004,593; 4,546,551; 5,699,044; 4,953,305; 5,576,687; 5,632,092; 5,677,851; 5,708,410; 5,737,226; 5,802,727; 5,878,370; 6,087,953; 6,173,508; 6,222,460; 6,513,252 and/or 6,642,851, and/or European patent application, published Oct. 11, 2000 under Publication No. EP 0 1043566, and/or U.S. Publication No. US-2006-0061008, which are all hereby incorporated herein by reference in their entireties. Optionally, the video mirror display screen or device may be operable to display images captured by a rearward viewing camera of the vehicle during a reversing maneuver of the vehicle (such as responsive to the vehicle gear actuator being placed in a reverse gear position or the like) to assist the driver in backing up the vehicle, and optionally may be operable to display the compass heading or directional heading character or icon when the vehicle is not undertaking a reversing maneuver, such as when the vehicle is being driven in a forward direction along a road (such as by utilizing aspects of the display system described in International Publication No. WO 2012/051500, which is hereby incorporated herein by reference in its entirety).


Optionally, the vision system (utilizing the forward facing camera and a rearward facing camera and other cameras disposed at the vehicle with exterior fields of view) may be part of or may provide a display of a top-down view or birds-eye view system of the vehicle or a surround view at the vehicle, such as by utilizing aspects of the vision systems described in International Publication Nos. WO 2010/099416; WO 2011/028686; WO 2012/075250; WO 2013/019795; WO 2012/075250; WO 2012/145822; WO 2013/081985; WO 2013/086249 and/or WO 2013/109869, and/or U.S. patent application Ser. No. 13/333,337, filed Dec. 21, 2011, now U.S. Pat. No. 9,264,672, which are hereby incorporated herein by reference in their entireties.


Optionally, a video mirror display may be disposed rearward of and behind the reflective element assembly and may comprise a display such as the types disclosed in U.S. Pat. Nos. 5,530,240; 6,329,925; 7,855,755; 7,626,749; 7,581,859; 7,446,650; 7,370,983; 7,338,177; 7,274,501; 7,255,451; 7,195,381; 7,184,190; 5,668,663; 5,724,187 and/or 6,690,268, and/or in U.S. Publication Nos. US-2006-0061008 and/or US-2006-0050018, which are all hereby incorporated herein by reference in their entireties. The display is viewable through the reflective element when the display is activated to display information. The display element may be any type of display element, such as a vacuum fluorescent (VF) display element, a light emitting diode (LED) display element, such as an organic light emitting diode (OLED) or an inorganic light emitting diode, an electroluminescent (EL) display element, a liquid crystal display (LCD) element, a video screen display element or backlit thin film transistor (TFT) display element or the like, and may be operable to display various information (as discrete characters, icons or the like, or in a multi-pixel manner) to the driver of the vehicle, such as passenger side inflatable restraint (PSIR) information, tire pressure status, and/or the like. The mirror assembly and/or display may utilize aspects described in U.S. Pat. Nos. 7,184,190; 7,255,451; 7,446,924 and/or 7,338,177, which are all hereby incorporated herein by reference in their entireties. The thicknesses and materials of the coatings on the substrates of the reflective element may be selected to provide a desired color or tint to the mirror reflective element, such as a blue colored reflector, such as is known in the art and such as described in U.S. Pat. Nos. 5,910,854; 6,420,036 and/or 7,274,501, which are hereby incorporated herein by reference in their entireties.


Optionally, the display or displays and any associated user inputs may be associated with various accessories or systems, such as, for example, a tire pressure monitoring system or a passenger air bag status or a garage door opening system or a telematics system or any other accessory or system of the mirror assembly or of the vehicle or of an accessory module or console of the vehicle, such as an accessory module or console of the types described in U.S. Pat. Nos. 7,289,037; 6,877,888; 6,824,281; 6,690,268; 6,672,744; 6,386,742 and/or 6,124,886, and/or U.S. Publication No. US-2006-0050018, which are hereby incorporated herein by reference in their entireties.


Changes and modifications in the specifically described embodiments can be carried out without departing from the principles of the invention, which is intended to be limited only by the scope of the appended claims, as interpreted according to the principles of patent law including the doctrine of equivalents.

Claims
  • 1. A vehicular trailer backup assist system, said vehicular trailer backup assist system comprising: a camera disposed at a vehicle and having a field of view exterior and at least rearward of the vehicle; wherein, with a trailer hitched at a trailer hitch of the vehicle, the field of view of said camera at least partially encompasses the trailer; a control comprising an image processor that processes image data captured by said camera; wherein image data captured by said camera is processed at said control; a display screen disposed in the vehicle and viewable by a driver of the vehicle when operating the vehicle, wherein, during a reversing maneuver of the vehicle and trailer, said display screen displays video images derived at least in part from image data captured by said camera; wherein, during the reversing maneuver of the vehicle and trailer, said vehicular trailer backup assist system electronically overlays a plurality of zones over the video images displayed at said display screen, and wherein each individual zone of the overlayed plurality of zones is indicative of a respective individual region rearward of the vehicle and trailer; a user input operable by a driver of the vehicle, wherein the driver operates the user input to select an individual zone of the overlayed plurality of zones as a zone of travel for the trailer during the reversing maneuver of the vehicle; wherein, responsive to the selected zone of travel for the trailer, said control controls, during the reversing maneuver of the vehicle and trailer, steering of the vehicle to steer the vehicle such that, during the reversing maneuver, the trailer travels towards the respective individual region displayed in the displayed video images that is encompassed by the overlay of the selected zone of travel; wherein said control controls steering of the vehicle to move the trailer relative to the overlayed plurality of zones in a manner that guides the trailer towards the respective individual region that is encompassed by the overlay of the selected zone of travel during reversing of the vehicle and trailer; and wherein, during the reversing maneuver of the vehicle and trailer, the driver of the vehicle manually adjusts steering of the vehicle to more accurately steer the vehicle towards a target location within the respective individual region that is encompassed by the overlay of the selected zone of travel.
  • 2. The vehicular trailer backup assist system of claim 1, wherein said control, responsive at least in part to processing at said control of image data captured by said camera, determines a trailer angle of the trailer relative to a longitudinal axis of the vehicle.
  • 3. The vehicular trailer backup assist system of claim 2, wherein said control, via processing at said control of image data captured by said camera, determines the trailer angle of the trailer via determining a location of a target disposed at the trailer.
  • 4. The vehicular trailer backup assist system of claim 3, wherein image data captured by said camera is demorphed by processing at said control before said control determines the location of the target disposed at the trailer.
  • 5. The vehicular trailer backup assist system of claim 4, wherein said image data, when demorphed by processing at said control, results in straightened lines in regions of image data captured by said camera.
  • 6. The vehicular trailer backup assist system of claim 1, wherein said control, via processing at said control of image data captured by said camera, determines an orientation of the trailer relative to a longitudinal axis of the vehicle via determining a location of a target disposed at the trailer.
  • 7. The vehicular trailer backup assist system of claim 6, wherein, with the trailer hitched at the trailer hitch of the vehicle, said control determines a zone of the overlayed plurality of zones where the target disposed at the trailer is located, and wherein steering of the vehicle is controlled responsive at least in part to the determined zone.
  • 8. The vehicular trailer backup assist system of claim 7, wherein the overlayed plurality of zones comprises an overlay of a plurality of pie-shaped zones electronically overlayed over the video images displayed at said display screen so as to appear over the displayed video images and partially surrounding the trailer hitch of the vehicle.
  • 9. The vehicular trailer backup assist system of claim 8, wherein said control controls steering of the vehicle to maintain the target disposed at the trailer in the selected zone of travel.
  • 10. The vehicular trailer backup assist system of claim 1, wherein the overlayed plurality of zones comprises an overlay of a plurality of pie-shaped zones electronically overlayed over the video images displayed at said display screen so as to appear over the displayed video images and partially surrounding the trailer hitch of the vehicle.
  • 11. The vehicular trailer backup assist system of claim 10, wherein said control controls steering of the vehicle to maintain a portion of the trailer in the selected zone of travel during reversing of the vehicle and trailer.
  • 12. A vehicular trailer backup assist system, said vehicular trailer backup assist system comprising: a camera disposed at a vehicle and having a field of view exterior and at least rearward of the vehicle; wherein, with a trailer hitched at a trailer hitch of the vehicle, the field of view of said camera at least partially encompasses the trailer; a control comprising an image processor that processes image data captured by said camera; wherein image data captured by said camera is processed at said control; a display screen disposed in the vehicle and viewable by a driver of the vehicle when operating the vehicle, wherein, during a reversing maneuver of the vehicle and trailer, said display screen displays video images derived at least in part from image data captured by said camera; wherein, during the reversing maneuver of the vehicle and trailer, said vehicular trailer backup assist system electronically overlays a plurality of zones over the video images displayed at said display screen, and wherein each individual zone of the overlayed plurality of zones is indicative of a respective individual region rearward of the vehicle and trailer; wherein the overlayed plurality of zones comprises an overlay of a plurality of pie-shaped zones electronically overlayed over the video images displayed at said display screen so as to appear over the displayed video images and partially surrounding the trailer hitch of the vehicle; wherein said control, via processing at said control of image data captured by said camera, determines an orientation of the trailer relative to a longitudinal axis of the vehicle via determining a location of a target disposed at the trailer; a user input operable by a driver of the vehicle, wherein the driver operates the user input to select an individual zone of the overlayed plurality of zones as a zone of travel for the trailer during the reversing maneuver of the vehicle; wherein, responsive to the selected zone of travel for the trailer, said control controls, during the reversing maneuver of the vehicle and trailer, steering of the vehicle to steer the vehicle such that, during the reversing maneuver, the trailer travels towards the respective individual region displayed in the displayed video images that is encompassed by the overlay of the selected zone of travel; wherein said control controls steering of the vehicle to move the trailer relative to the overlayed plurality of zones in a manner that guides the trailer towards the respective individual region that is encompassed by the overlay of the selected zone of travel during reversing of the vehicle and trailer; and wherein, during the reversing maneuver of the vehicle and trailer, the driver of the vehicle manually adjusts steering of the vehicle to more accurately steer the vehicle towards a target location within the respective individual region that is encompassed by the overlay of the selected zone of travel.
  • 13. The vehicular trailer backup assist system of claim 12, wherein image data captured by said camera is demorphed by processing at said control before said control determines the location of the target disposed at the trailer.
  • 14. The vehicular trailer backup assist system of claim 13, wherein said image data, when demorphed by processing at said control, results in straightened lines in regions of image data captured by said camera.
  • 15. The vehicular trailer backup assist system of claim 12, wherein, with the trailer hitched at the trailer hitch of the vehicle, said control determines a zone of the overlayed plurality of zones where the target disposed at the trailer is located, and wherein steering of the vehicle is controlled responsive at least in part to the determined zone.
  • 16. The vehicular trailer backup assist system of claim 15, wherein said control controls steering of the vehicle to maintain the target disposed at the trailer in the selected zone of travel.
  • 17. A vehicular trailer backup assist system, said vehicular trailer backup assist system comprising: a camera disposed at a vehicle and having a field of view exterior and at least rearward of the vehicle; wherein, with a trailer hitched at a trailer hitch of the vehicle, the field of view of said camera at least partially encompasses the trailer; a control comprising an image processor that processes image data captured by said camera; wherein image data captured by said camera is processed at said control; a display screen disposed in the vehicle and viewable by a driver of the vehicle when operating the vehicle, wherein, during a reversing maneuver of the vehicle and trailer, said display screen displays video images derived at least in part from image data captured by said camera; wherein, during the reversing maneuver of the vehicle and trailer, said vehicular trailer backup assist system electronically overlays a plurality of zones over the video images displayed at said display screen, and wherein each individual zone of the overlayed plurality of zones is indicative of a respective individual region rearward of the vehicle and trailer; wherein said control, via processing at said control of image data captured by said camera, determines an orientation of the trailer relative to a longitudinal axis of the vehicle via determining a location of a target disposed at the trailer; wherein image data captured by said camera is demorphed by processing at said control before said control determines the location of the target disposed at the trailer; a user input operable by a driver of the vehicle, wherein the driver operates the user input to select an individual zone of the overlayed plurality of zones as a zone of travel for the trailer during the reversing maneuver of the vehicle; wherein, responsive to the selected zone of travel for the trailer, said control controls, during the reversing maneuver of the vehicle and trailer, steering of the vehicle to steer the vehicle such that, during the reversing maneuver, the trailer travels towards the respective individual region displayed in the displayed video images that is encompassed by the overlay of the selected zone of travel; wherein said control controls steering of the vehicle to move the trailer relative to the overlayed plurality of zones in a manner that guides the trailer towards the respective individual region that is encompassed by the overlay of the selected zone of travel during reversing of the vehicle and trailer; wherein, with the trailer hitched at the trailer hitch of the vehicle, said control determines a zone of the overlayed plurality of zones where the target disposed at the trailer is located, and wherein steering of the vehicle is controlled responsive at least in part to the determined zone where the target disposed at the trailer is located; and wherein, during the reversing maneuver of the vehicle and trailer, the driver of the vehicle manually adjusts steering of the vehicle to more accurately steer the vehicle towards a target location within the respective individual region that is encompassed by the overlay of the selected zone of travel.
  • 18. The vehicular trailer backup assist system of claim 17, wherein said image data, when demorphed by processing at said control, results in straightened lines in regions of image data captured by said camera.
  • 19. The vehicular trailer backup assist system of claim 18, wherein the overlayed plurality of zones comprises an overlay of a plurality of pie-shaped zones electronically overlayed over the video images displayed at said display screen so as to appear over the displayed video images and partially surrounding the trailer hitch of the vehicle.
  • 20. The vehicular trailer backup assist system of claim 19, wherein said control controls steering of the vehicle to maintain the target disposed at the trailer in the selected zone of travel.
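Claims 17 and 18 recite that captured image data is demorphed, with straightened lines in the result, before the control locates the target. In practice this corresponds to lens-distortion correction of the wide-angle rear camera. Below is a minimal sketch using OpenCV's undistort; the function name demorph and the camera matrix K and distortion coefficients D are illustrative placeholders (real values come from per-camera calibration, not from the patent):

```python
import cv2
import numpy as np

def demorph(frame, camera_matrix, dist_coeffs):
    """Lens-distortion correction of a wide-angle rear-camera frame so
    that straight scene edges (curbs, lane markings, the trailer tongue)
    render as straight lines in the result -- one common way to realize
    the claimed demorphing step; the patent does not mandate this API.
    """
    return cv2.undistort(frame, camera_matrix, dist_coeffs)

# Illustrative placeholder intrinsics; real values come from a
# per-camera calibration, not from the patent.
K = np.array([[400.0, 0.0, 640.0],
              [0.0, 400.0, 360.0],
              [0.0, 0.0, 1.0]])
D = np.array([-0.30, 0.08, 0.0, 0.0, 0.0])  # k1, k2, p1, p2, k3
```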
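Claims 15 and 17 recite determining which overlayed zone contains the target disposed at the trailer. A minimal geometric sketch, assuming the zones are equal angular slices of a fan centered on straight-back; the zone count, angular span, and the helper name zone_of_target are assumptions for illustration:

```python
import math

def zone_of_target(target_xy, hitch_xy, num_zones=5, span_deg=150.0):
    """Index (0..num_zones-1) of the pie-shaped zone containing the
    trailer target, with the fan of zones centered on straight-back.
    Coordinates are pixel positions in the demorphed image, +y down
    the frame (rearward in a rear-camera view).
    """
    dx = target_xy[0] - hitch_xy[0]
    dy = target_xy[1] - hitch_xy[1]
    # Deviation of the hitch-to-target ray from straight-back, in
    # degrees; sign follows the image x axis.
    angle = math.degrees(math.atan2(dx, dy))
    half = span_deg / 2.0
    angle = max(-half, min(half, angle))  # clamp into the fan
    width = span_deg / num_zones
    return min(int((angle + half) // width), num_zones - 1)
```

With num_zones=5 and a 150 degree span, each zone covers 30 degrees and the middle zone corresponds to straight-back travel; both values are arbitrary choices here, since the claims leave the zone geometry open.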
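Claim 19 recites pie-shaped zones electronically overlayed so as to partially surround the trailer hitch in the displayed video. A sketch of one way to render such an overlay; the fan geometry, radius, colors, and the highlighting of the selected zone are assumptions, not claim limitations:

```python
import math
import cv2

def draw_zone_overlay(frame, hitch_px, radius_px, num_zones=5,
                      span_deg=150.0, selected=None):
    """Overlay a fan of pie-shaped zones behind the hitch point of a
    displayed rear-camera frame. hitch_px is an (x, y) tuple of ints.
    """
    half = span_deg / 2.0
    width = span_deg / num_zones
    for i in range(num_zones + 1):
        # Boundary rays: 90 deg in image coordinates points straight
        # down the frame, i.e. rearward in a rear-camera view.
        theta = math.radians(90.0 - half + i * width)
        end = (int(hitch_px[0] + radius_px * math.cos(theta)),
               int(hitch_px[1] + radius_px * math.sin(theta)))
        cv2.line(frame, hitch_px, end, (255, 255, 255), 2)
    for i in range(num_zones):
        start = 90.0 - half + i * width
        # Outer arc of each wedge; the selected zone is highlighted.
        color = (0, 255, 0) if i == selected else (255, 255, 255)
        cv2.ellipse(frame, hitch_px, (radius_px, radius_px), 0.0,
                    start, start + width, color, 2)
    return frame
```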
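Claims 16 and 20 recite controlling steering to maintain the target in the selected zone of travel, while the final limitation of claim 17 has the driver manually adjusting steering toward a target location within that zone. A toy proportional law illustrating how those two inputs could combine; the gain, sign convention, and the driver_trim_deg term are assumptions for illustration only, not the patent's control method:

```python
def steering_command(target_angle_deg, selected_zone, num_zones=5,
                     span_deg=150.0, gain=0.8, driver_trim_deg=0.0):
    """Toy proportional law: steer so the target's angle converges on
    the angular center of the driver-selected zone, with the driver's
    manual trim superimposed for fine aiming at a spot inside the zone.
    """
    width = span_deg / num_zones
    zone_center_deg = -span_deg / 2.0 + (selected_zone + 0.5) * width
    error_deg = zone_center_deg - target_angle_deg
    # Output is a steering request in degrees; saturation, rate limits,
    # and the vehicle/trailer kinematics are omitted from this sketch.
    return gain * error_deg + driver_trim_deg
```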
CROSS REFERENCE TO RELATED APPLICATIONS

The present application is a continuation of U.S. patent application Ser. No. 14/613,441, filed Feb. 4, 2015, now U.S. Pat. No. 10,160,382, which claims the filing benefits of U.S. provisional applications, Ser. No. 61/972,708, filed Mar. 31, 2014, and Ser. No. 61/935,485, filed Feb. 4, 2014, which are hereby incorporated herein by reference in their entireties.

US Referenced Citations (370)
Number Name Date Kind
4200361 Malvano Apr 1980 A
4214266 Myers Jul 1980 A
4218698 Bart et al. Aug 1980 A
4236099 Rosenblum Nov 1980 A
4247870 Gabel et al. Jan 1981 A
4249160 Chilvers Feb 1981 A
4266856 Wainwright May 1981 A
4277804 Robison Jul 1981 A
4281898 Ochiai Aug 1981 A
4288814 Talley et al. Sep 1981 A
4355271 Noack Oct 1982 A
4357558 Massoni et al. Nov 1982 A
4381888 Momiyama May 1983 A
4420238 Felix Dec 1983 A
4431896 Lodetti Feb 1984 A
4443057 Bauer Apr 1984 A
4460831 Oettinger et al. Jul 1984 A
4481450 Watanabe et al. Nov 1984 A
4491390 Tong-Shen Jan 1985 A
4512637 Ballmer Apr 1985 A
4529275 Ballmer Jul 1985 A
4529873 Ballmer Jul 1985 A
4546551 Franks Oct 1985 A
4549208 Kamejima et al. Oct 1985 A
4571082 Downs Feb 1986 A
4572619 Reininger Feb 1986 A
4600913 Caine Jul 1986 A
4603946 Kato Aug 1986 A
4614415 Hyatt Sep 1986 A
4620141 McCumber et al. Oct 1986 A
4623222 Itoh Nov 1986 A
4626850 Chey Dec 1986 A
4629941 Ellis Dec 1986 A
4630109 Barton Dec 1986 A
4632509 Ohmi Dec 1986 A
4638287 Umebayashi et al. Jan 1987 A
4647161 Müller Mar 1987 A
4653316 Fukuhara Mar 1987 A
4669825 Itoh Jun 1987 A
4669826 Itoh Jun 1987 A
4671615 Fukada Jun 1987 A
4672457 Hyatt Jun 1987 A
4676601 Itoh Jun 1987 A
4690508 Jacob Sep 1987 A
4692798 Seko et al. Sep 1987 A
4697883 Suzuki Oct 1987 A
4701022 Jacob Oct 1987 A
4713685 Nishimura et al. Dec 1987 A
4717830 Botts Jan 1988 A
4727290 Smith Feb 1988 A
4731669 Hayashi et al. Mar 1988 A
4741603 Miyagi May 1988 A
4768135 Kretschmer et al. Aug 1988 A
4772942 Tuck Sep 1988 A
4789904 Peterson Dec 1988 A
4793690 Gahan Dec 1988 A
4817948 Simonelli Apr 1989 A
4820933 Hong Apr 1989 A
4825232 Howdle Apr 1989 A
4838650 Stewart Jun 1989 A
4847772 Michalopoulos et al. Jul 1989 A
4855822 Narendra et al. Aug 1989 A
4862037 Farber et al. Aug 1989 A
4867561 Fujii et al. Sep 1989 A
4871917 O'Farrell et al. Oct 1989 A
4872051 Dye Oct 1989 A
4881019 Shiraishi et al. Nov 1989 A
4882565 Gallmeyer Nov 1989 A
4886960 Molyneux Dec 1989 A
4891559 Matsumoto et al. Jan 1990 A
4892345 Rachael, III Jan 1990 A
4895790 Swanson et al. Jan 1990 A
4896030 Miyaji Jan 1990 A
4907870 Brucker Mar 1990 A
4910591 Petrossian et al. Mar 1990 A
4937796 Tendler Jun 1990 A
4953305 Van Lente et al. Sep 1990 A
4961625 Wood et al. Oct 1990 A
4967319 Seko Oct 1990 A
4970653 Kenue Nov 1990 A
4971430 Lynas Nov 1990 A
4974078 Tsai Nov 1990 A
4987357 Masaki Jan 1991 A
4991054 Walters Feb 1991 A
5001558 Burley et al. Mar 1991 A
5003288 Wilhelm Mar 1991 A
5012082 Watanabe Apr 1991 A
5016977 Baude et al. May 1991 A
5027001 Torbert Jun 1991 A
5027200 Petrossian et al. Jun 1991 A
5044706 Chen Sep 1991 A
5055668 French Oct 1991 A
5059877 Teder Oct 1991 A
5064274 Alten Nov 1991 A
5072154 Chen Dec 1991 A
5096287 Kakinami et al. Mar 1992 A
5097362 Lynas Mar 1992 A
5121200 Choi Jun 1992 A
5124549 Michaels et al. Jun 1992 A
5130709 Toyama et al. Jul 1992 A
5168378 Black Dec 1992 A
5170374 Shimohigashi et al. Dec 1992 A
5172235 Wilm et al. Dec 1992 A
5177685 Davis et al. Jan 1993 A
5182502 Slotkowski et al. Jan 1993 A
5184956 Langlais et al. Feb 1993 A
5189561 Hong Feb 1993 A
5193000 Lipton et al. Mar 1993 A
5193029 Schofield Mar 1993 A
5208701 Maeda May 1993 A
5245422 Borcherts et al. Sep 1993 A
5253109 O'Farrell Oct 1993 A
5276389 Levers Jan 1994 A
5285060 Larson et al. Feb 1994 A
5289182 Brillard et al. Feb 1994 A
5289321 Secor Feb 1994 A
5305012 Faris Apr 1994 A
5307136 Saneyoshi et al. Apr 1994 A
5309137 Kajiwara May 1994 A
5313072 Vachss May 1994 A
5325096 Pakett Jun 1994 A
5325386 Jewell et al. Jun 1994 A
5329206 Slotkowski et al. Jul 1994 A
5331312 Kudoh Jul 1994 A
5336980 Levers Aug 1994 A
5341437 Nakayama Aug 1994 A
5351044 Mathur et al. Sep 1994 A
5355118 Fukuhara Oct 1994 A
5374852 Parkes Dec 1994 A
5386285 Asayama Jan 1995 A
5394333 Kao Feb 1995 A
5406395 Wilson et al. Apr 1995 A
5410346 Saneyoshi et al. Apr 1995 A
5414257 Stanton May 1995 A
5414461 Kishi et al. May 1995 A
5416313 Larson et al. May 1995 A
5416318 Hegyi May 1995 A
5416478 Morinaga May 1995 A
5424952 Asayama Jun 1995 A
5426294 Kobayashi et al. Jun 1995 A
5430431 Nelson Jul 1995 A
5434407 Bauer et al. Jul 1995 A
5440428 Hegg et al. Aug 1995 A
5444478 Lelong et al. Aug 1995 A
5457493 Leddy et al. Oct 1995 A
5461357 Yoshioka et al. Oct 1995 A
5461361 Moore Oct 1995 A
5469298 Suman et al. Nov 1995 A
5471515 Fossum et al. Nov 1995 A
5475494 Nishida et al. Dec 1995 A
5487116 Nakano et al. Jan 1996 A
5498866 Bendicks et al. Mar 1996 A
5500766 Stonecypher Mar 1996 A
5510983 Iino Apr 1996 A
5515448 Nishitani May 1996 A
5521633 Nakajima et al. May 1996 A
5528698 Kamei et al. Jun 1996 A
5529138 Shaw et al. Jun 1996 A
5530240 Larson et al. Jun 1996 A
5530420 Tsuchiya et al. Jun 1996 A
5535314 Alves et al. Jul 1996 A
5539397 Asanuma et al. Jul 1996 A
5541590 Nishio Jul 1996 A
5550677 Schofield et al. Aug 1996 A
5555312 Shima et al. Sep 1996 A
5555555 Sato et al. Sep 1996 A
5568027 Teder Oct 1996 A
5574443 Hsieh Nov 1996 A
5581464 Woll et al. Dec 1996 A
5594222 Caldwell Jan 1997 A
5614788 Mullins Mar 1997 A
5619370 Guinosso Apr 1997 A
5634709 Iwama Jun 1997 A
5642299 Hardin et al. Jun 1997 A
5648835 Uzawa Jul 1997 A
5650944 Kise Jul 1997 A
5660454 Mori et al. Aug 1997 A
5661303 Teder Aug 1997 A
5670935 Schofield et al. Sep 1997 A
5675489 Pomerleau Oct 1997 A
5677851 Kingdon et al. Oct 1997 A
5699044 Van Lente et al. Dec 1997 A
5724316 Brunts Mar 1998 A
5737226 Olson et al. Apr 1998 A
5757949 Kinoshita et al. May 1998 A
5760826 Nayer Jun 1998 A
5760828 Cortes Jun 1998 A
5760931 Saburi et al. Jun 1998 A
5760962 Schofield et al. Jun 1998 A
5761094 Olson et al. Jun 1998 A
5765116 Wilson-Jones et al. Jun 1998 A
5781437 Wiemer et al. Jul 1998 A
5786772 Schofield et al. Jul 1998 A
5790403 Nakayama Aug 1998 A
5790973 Blaker et al. Aug 1998 A
5793308 Rosinski et al. Aug 1998 A
5793420 Schmidt Aug 1998 A
5796094 Schofield et al. Aug 1998 A
5798575 O'Farrell et al. Aug 1998 A
5844505 Van Ryzin Dec 1998 A
5844682 Kiyomoto et al. Dec 1998 A
5845000 Breed et al. Dec 1998 A
5848802 Breed et al. Dec 1998 A
5850176 Kinoshita et al. Dec 1998 A
5850254 Takano et al. Dec 1998 A
5867591 Onda Feb 1999 A
5877707 Kowalick Mar 1999 A
5877897 Schofield et al. Mar 1999 A
5878370 Olson Mar 1999 A
5883739 Ashihara et al. Mar 1999 A
5884212 Lion Mar 1999 A
5890021 Onoda Mar 1999 A
5896085 Mori et al. Apr 1999 A
5899956 Chan May 1999 A
5929786 Schofield et al. Jul 1999 A
5940120 Frankhouse et al. Aug 1999 A
5949331 Schofield et al. Sep 1999 A
5956181 Lin Sep 1999 A
5959367 O'Farrell et al. Sep 1999 A
5959555 Furuta Sep 1999 A
5963247 Banitt Oct 1999 A
5964822 Alland et al. Oct 1999 A
5971552 O'Farrell et al. Oct 1999 A
5990649 Nagao et al. Nov 1999 A
6009336 Harris et al. Dec 1999 A
6020704 Buschur Feb 2000 A
6084519 Coulling et al. Jul 2000 A
6097023 Schofield et al. Aug 2000 A
6116743 Hoek Sep 2000 A
6144022 Tenenbaum et al. Nov 2000 A
6175164 O'Farrell et al. Jan 2001 B1
6175300 Kendrick Jan 2001 B1
6198409 Schofield et al. Mar 2001 B1
6222447 Schofield et al. Apr 2001 B1
6266082 Yonezawa et al. Jul 2001 B1
6266442 Laumeyer et al. Jul 2001 B1
6285393 Shimoura et al. Sep 2001 B1
6294989 Schofield et al. Sep 2001 B1
6297781 Turnbull et al. Oct 2001 B1
6302545 Schofield et al. Oct 2001 B1
6310611 Caldwell Oct 2001 B1
6317057 Lee Nov 2001 B1
6320176 Schofield et al. Nov 2001 B1
6320282 Caldwell Nov 2001 B1
6333759 Mazzilli Dec 2001 B1
6353392 Schofield et al. Mar 2002 B1
6370329 Teuchert Apr 2002 B1
6411204 Bloomfield et al. Jun 2002 B1
6411328 Franke et al. Jun 2002 B1
6424273 Gutta et al. Jul 2002 B1
6430303 Naoi et al. Aug 2002 B1
6433817 Guerra Aug 2002 B1
6442465 Breed et al. Aug 2002 B2
6480104 Wall et al. Nov 2002 B1
6483429 Yasui et al. Nov 2002 B1
6497503 Dassanayake et al. Dec 2002 B1
6498620 Schofield et al. Dec 2002 B2
6523964 Schofield et al. Feb 2003 B2
6539306 Turnbull Mar 2003 B2
6547133 DeVries, Jr. et al. Apr 2003 B1
6553130 Lemelson et al. Apr 2003 B1
6559435 Schofield et al. May 2003 B2
6559761 Miller et al. May 2003 B1
6574033 Chui et al. Jun 2003 B1
6578017 Ebersole et al. Jun 2003 B1
6589625 Kothari et al. Jul 2003 B1
6594583 Ogura et al. Jul 2003 B2
6611202 Schofield et al. Aug 2003 B2
6631994 Suzuki et al. Oct 2003 B2
6636258 Strumolo Oct 2003 B2
6672731 Schnell et al. Jan 2004 B2
6678056 Downs Jan 2004 B2
6690268 Schofield et al. Feb 2004 B2
6700605 Toyoda et al. Mar 2004 B1
6704621 Stein et al. Mar 2004 B1
6711474 Treyz et al. Mar 2004 B1
6714331 Lewis et al. Mar 2004 B2
6721659 Stopczynski Apr 2004 B2
6735506 Breed et al. May 2004 B2
6744353 Sjönell Jun 2004 B2
6762867 Lippert et al. Jul 2004 B2
6795221 Urey Sep 2004 B1
6802617 Schofield et al. Oct 2004 B2
6823241 Shirato et al. Nov 2004 B2
6824281 Schofield et al. Nov 2004 B2
6831261 Schofield et al. Dec 2004 B2
6882287 Schofield Apr 2005 B2
6889161 Winner et al. May 2005 B2
6891563 Schofield et al. May 2005 B2
6909753 Meehan et al. Jun 2005 B2
6946978 Schofield Sep 2005 B2
6953253 Schofield et al. Oct 2005 B2
6975775 Rykowski et al. Dec 2005 B2
7004606 Schofield Feb 2006 B2
7062300 Kim Jun 2006 B1
7065432 Moisel et al. Jun 2006 B2
7085637 Breed et al. Aug 2006 B2
7092548 Laumeyer et al. Aug 2006 B2
7116246 Winter et al. Oct 2006 B2
7123168 Schofield Oct 2006 B2
7133661 Hatae et al. Nov 2006 B2
7158015 Rao et al. Jan 2007 B2
7202776 Breed Apr 2007 B2
7227611 Hull et al. Jun 2007 B2
7311406 Schofield et al. Dec 2007 B2
7325934 Schofield et al. Feb 2008 B2
7325935 Schofield et al. Feb 2008 B2
7339149 Schofield et al. Mar 2008 B1
7344261 Schofield et al. Mar 2008 B2
7375803 Bamji May 2008 B1
7380948 Schofield et al. Jun 2008 B2
7388182 Schofield et al. Jun 2008 B2
7402786 Schofield et al. Jul 2008 B2
7425076 Schofield et al. Sep 2008 B2
7432248 Schofield et al. Sep 2008 B2
7459664 Schofield et al. Dec 2008 B2
7483058 Frank et al. Jan 2009 B1
7526103 Schofield et al. Apr 2009 B2
7541743 Salmeen et al. Jun 2009 B2
7561181 Schofield et al. Jul 2009 B2
7616781 Schofield et al. Nov 2009 B2
7633383 Dunsmoir et al. Dec 2009 B2
7639149 Katoh Dec 2009 B2
7676087 Dhua et al. Mar 2010 B2
7690737 Lu Apr 2010 B2
7792329 Schofield et al. Sep 2010 B2
7843451 Lafon Nov 2010 B2
7855778 Yung et al. Dec 2010 B2
7859565 Schofield et al. Dec 2010 B2
7881496 Camilleri Feb 2011 B2
7930160 Hosagrahara et al. Apr 2011 B1
8010252 Getman et al. Aug 2011 B2
8017898 Lu et al. Sep 2011 B2
8063752 Oleg Nov 2011 B2
8094170 Kato et al. Jan 2012 B2
8098142 Schofield et al. Jan 2012 B2
8164628 Stein et al. Apr 2012 B2
8218007 Lee et al. Jul 2012 B2
8224031 Saito Jul 2012 B2
8260518 Englert et al. Sep 2012 B2
8411998 Huggett et al. Apr 2013 B2
9085261 Lu et al. Jul 2015 B2
9495876 Lu et al. Nov 2016 B2
10160382 Pliefke et al. Dec 2018 B2
20010001563 Tomaszewski May 2001 A1
20020113873 Williams Aug 2002 A1
20030137586 Lewellen Jul 2003 A1
20030210807 Sato Nov 2003 A1
20030222982 Hamdan et al. Dec 2003 A1
20050000738 Gehring et al. Jan 2005 A1
20050237385 Kosaka et al. Oct 2005 A1
20060103727 Tseng May 2006 A1
20060250501 Wildmann et al. Nov 2006 A1
20070104476 Yasutomi et al. May 2007 A1
20070109406 Schofield et al. May 2007 A1
20070120657 Schofield et al. May 2007 A1
20070242339 Bradley Oct 2007 A1
20080147321 Howard et al. Jun 2008 A1
20080231701 Greenwood et al. Sep 2008 A1
20090113509 Tseng et al. Apr 2009 A1
20090143967 Lee et al. Jun 2009 A1
20120045112 Lundblad et al. Feb 2012 A1
20120200706 Greenwood et al. Aug 2012 A1
20120265416 Lu et al. Oct 2012 A1
20130158863 Skvarce et al. Jun 2013 A1
20140085472 Lu et al. Mar 2014 A1
20140160276 Pliefke et al. Jun 2014 A1
20140343793 Lavoie et al. Nov 2014 A1
20150002670 Bajpai Jan 2015 A1
20150217693 Pliefke et al. Aug 2015 A1
Related Publications (1)
Number Date Country
20190143895 A1 May 2019 US
Provisional Applications (2)
Number Date Country
61972708 Mar 2014 US
61935485 Feb 2014 US
Continuations (1)
Number Date Country
Parent 14613441 Feb 2015 US
Child 16227069 US