Method of inputting a path for a vehicle and trailer

Information

  • Patent Grant
  • Patent Number
    9,506,774
  • Date Filed
    Thursday, August 14, 2014
  • Date Issued
    Tuesday, November 29, 2016
Abstract
A method of inputting a path is provided. The method includes the steps of generating an aerial view of a vehicle and a trailer based on at least one of image data and satellite image data, displaying the aerial view on a display having a touch screen, and registering a touch event on the touch screen that inputs an intended path for the vehicle and the trailer.
Description
FIELD OF THE INVENTION

The present invention generally relates to driver assist and active safety technologies in vehicles, and more particularly to methods for inputting an intended path of a vehicle and a trailer using a display.


BACKGROUND OF THE INVENTION

Operating a vehicle that is connected to a trailer is very challenging for many drivers. Thus, there is a need for a system allowing a user to input an intended path in a simple yet intuitive manner.


SUMMARY OF THE INVENTION

According to one aspect of the present invention, a method of inputting a backing path is provided. The method includes the steps of generating an aerial view of a vehicle and a trailer based on at least one of image data and satellite image data, displaying the aerial view on a display having a touch screen, and registering a touch event on the touch screen that inputs an intended backing path for the vehicle and the trailer.


According to another aspect of the present invention, a method of inputting a backing path is provided. The method includes the steps of generating an aerial view of a vehicle and a trailer, displaying the aerial view on a touch screen, registering a first touch event on the touch screen that inputs an intended backing path for the vehicle and the trailer, and registering a second touch event on the touch screen that inputs a modification of the intended backing path.


According to yet another aspect of the invention, a method of inputting a backing path is provided. The method includes the steps of generating an aerial view of a vehicle and a trailer, displaying the aerial view on a touch screen, and performing a first touch event on the touch screen that inputs an intended backing path for the vehicle and the trailer.


These and other aspects, objects, and features of the present invention will be understood and appreciated by those skilled in the art upon studying the following specification, claims, and appended drawings.





BRIEF DESCRIPTION OF THE DRAWINGS

In the drawings:



FIG. 1 is a schematic diagram illustrating imaging devices located on a vehicle or a trailer that is attached to the vehicle;



FIG. 2 illustrates an imaging device according to one embodiment;



FIG. 3 is a top plan view of the vehicle connected to the trailer demonstrating a plurality of fields of view corresponding to the imaging devices;



FIG. 4 is a block diagram of a controller in communication with the imaging devices and other equipment;



FIG. 5 is a diagram of an aerial view of the vehicle and the trailer displayed on a display located within the vehicle;



FIG. 6 illustrates a path input screen displayed on the display;



FIG. 7 illustrates a touch event being registered on the path input screen to input an intended backing path;



FIG. 8 illustrates an intended backing path crossing through an obstacle;



FIG. 9 illustrates a touch event that modifies the intended backing path;



FIG. 10 illustrates a path selection screen displaying suggested backing paths; and



FIG. 11 is a flow diagram for a method of inputting a backing path.





DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

As required, detailed embodiments of the present invention are disclosed herein. However, it is to be understood that the disclosed embodiments are merely exemplary of the invention that may be embodied in various and alternative forms. The figures are not necessarily to a detailed design and some schematics may be exaggerated or minimized to show function overview. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a representative basis for teaching one skilled in the art to variously employ the present invention.


As used herein, the term “and/or,” when used in a list of two or more items, means that any one of the listed items can be employed by itself, or any combination of two or more of the listed items can be employed. For example, if a composition is described as containing components A, B, and/or C, the composition can contain A alone; B alone; C alone; A and B in combination; A and C in combination; B and C in combination; or A, B, and C in combination.


Backing and maneuvering a trailer can be a difficult task due to challenges in vision and path prediction. Challenges may vary based on vehicle dimensions, trailer dimensions, and environmental conditions. With large trailers, the field of view behind the trailer may be completely occluded. With smaller trailers, small changes in steering can cause the hitch angle between the tow vehicle and the trailer to inflect quickly. In view of these and other concerns, the following improvements provide various implementations to bolster the functionality of a trailer backup assist system.


Referring to FIG. 1, a vehicle 10 is shown connected to a trailer 12 and equipped with a plurality of imaging devices C1-C5 configured to capture image data corresponding to an operating environment 14 surrounding the vehicle 10 and the trailer 12. The captured image data may be processed and presented to an operator on a display 13 located within the vehicle 10.


As shown in FIG. 1, the imaging devices C1-C5 may be arranged in various locations such that each field of view of the imaging devices C1-C5 is configured to capture a significantly different area of the operating environment 14. Each of the imaging devices C1-C5 may include any form of imaging device configured to capture image data such as, but not limited to, charge coupled device (CCD) and complementary metal oxide semiconductor (CMOS) image sensors. Although five imaging devices C1-C5 are discussed in reference to FIG. 1, the number of imaging devices may vary based on the particular operating specifications of the particular imaging devices implemented and the proportions and/or exterior profiles of a particular vehicle and trailer. For example, large vehicle and trailer combinations may require additional imaging devices to capture image data corresponding to a larger operating environment. The imaging devices may also vary in viewing angle and range of a field of view corresponding to a particular vehicle and trailer combination.


The imaging devices C1, C3, C4, and C5 are disposed on the vehicle 10, each oriented to have a field of view directed towards a substantially different region of the operating environment 14. Imaging device C1 is disposed centrally on a rear portion 16 (e.g. a tailgate) of the vehicle 10 and may employ object detection to monitor the position of a target 18 disposed on the trailer 12 so that a hitch angle γ between the vehicle 10 and the trailer 12 can be determined. As used herein, the hitch angle γ is defined as the angle between a longitudinal centerline axis 20 of the vehicle 10 and the longitudinal centerline axis 22 of the trailer 12. In addition to imaging device C1, or alternatively thereto, imaging device C1′ may be disposed centrally on a rear facing portion 24 of the vehicle 10 proximate a roof portion 26.


Imaging device C3 is disposed centrally on a front facing portion 28 of the vehicle 10 proximate a front grill portion 30. In addition to imaging device C3, or alternatively thereto, imaging device C3′ may be disposed centrally on a front facing portion 32 of the vehicle proximate the roof portion 26. Imaging devices C1 (and/or C1′) and C3 (and/or C3′) are oriented such that the corresponding fields of view encompass substantially the entire operating environment 14 in the aft and fore directions relative to the vehicle 10.


Imaging devices C4 and C5 are disposed on a passenger side 34 and a driver side 36, respectively, and are configured to capture image data corresponding to the operating environment 14 to the sides of the vehicle 10. In some implementations, imaging device C4 is disposed proximate a passenger side mirror 38 and imaging device C5 is disposed proximate a driver side mirror 40. Imaging devices C4 and C5, in combination with imaging devices C1 and C3, are configured to capture image data corresponding to approximately the entire operating environment 14 surrounding the vehicle 10. However, when the vehicle is towing the trailer 12, the trailer 12 may occlude a large portion of a rearward facing field of view from the vehicle 10.


Imaging device C2 may be configured to operate in combination with the imaging devices C1 and C3-C5 to provide a combined field of view of the operating environment 14 surrounding the vehicle 10 and the trailer 12. Imaging device C2 may be disposed on a rear structure 42 of the trailer 12. Imaging device C2 may be located centrally in an upper portion 44 of the trailer 12 and have a rearward facing field of view relative to the trailer 12. Imaging device C2 can be variously located depending on trailer type and trailer geometry. In various implementations, the imaging device C2 may have a substantially rearward facing field of view configured to capture image data corresponding to the operating environment 14 that may be occluded from imaging devices C1 and C3-C5 by the trailer 12.


Referring to FIG. 2, imaging device C2 is shown according to one implementation as a portable electronic device 46 with a built-in camera 48 and corresponding image capture settings. Portable electronic device 46 may correspond to a smart device such as, but not limited to, a smart phone or a tablet. As shown, the portable electronic device 46 is provided in a housing 50 coupled to the rear structure 42 of the trailer 12. The housing 50 may be constructed from a transparent rigid material (e.g. plastic) to enable the portable electronic device 46 to accurately capture image data rearward of the trailer 12. The portable electronic device 46 can be supported within the housing 50 via a cradle 52 and/or other support mechanism and may be powered via a corresponding charging cord 54 that is electrically coupled to an electrical system of the trailer 12. The portable electronic device 46 may have an integrated navigational system 56 that includes a GPS receiver 58 for assisted GPS functionality and/or one or more integrated inertial sensors 60, which may include tri-axial gyroscopes, tri-axial accelerometers, tri-axial magnetometers, barometers, the like, or a combination thereof. Image data from the camera 48 along with data from the GPS receiver 58 and/or the inertial sensors 60 may be communicated to a vehicle system via wired or wireless (e.g. Bluetooth®) connections. While one implementation of imaging device C2 has been described herein as being a portable electronic device 46, it should not be seen as limiting. Further, it should be appreciated that imaging devices C1 and C3-C5 may also be implemented as portable electronic devices, but are not limited thereto.


Referring to FIG. 3, a top plan view of the vehicle 10 connected to the trailer 12 is shown demonstrating a plurality of fields of view of imaging devices C1-C5. In the illustrated embodiment, imaging device C1 is shown having field of view 62, imaging device C2 is shown having field of view 64, imaging device C3 is shown having field of view 66, imaging device C4 is shown having field of view 68, and imaging device C5 is shown having field of view 70. In this implementation, each of fields of view 62, 64, and 66 includes a horizontal viewing angle of approximately 170 degrees or greater, and each of the corresponding imaging devices C1, C2, and C3 is configured to capture image data corresponding to the fore and aft directions relative to the vehicle 10 and the trailer 12. Imaging devices C4 and C5 are configured to capture image data corresponding to the operating area to each side of the vehicle 10 and the trailer 12 and have corresponding fields of view 68 and 70 that may include viewing angles of approximately 170 degrees or greater. As shown, field of view 68 may form an overlapping portion 72 with field of view 66 and an overlapping portion 74 with field of view 62. Similarly, field of view 70 may also form an overlapping portion 76 with field of view 66 and an overlapping portion 78 with field of view 62. While not shown, each of fields of view 62, 68, and 70 may further form overlapping portions with field of view 64. The overlapping portions may be combined in some implementations to form an expanded view or an aerial view of the vehicle 10 and the trailer 12. The imaging devices C1-C5 are configured to capture image data corresponding to objects and terrain in the surrounding operating environment 14 of the vehicle 10 and the trailer 12.


In the various implementations discussed herein, each of the fields of view 62-70 may be combined in any combination to form various expanded fields of view and corresponding viewing angles based on operating states and relative orientations of the vehicle 10 and the trailer 12. The operating states and relative orientations of the vehicle 10 and the trailer 12 may be determined from the heading of the vehicle 10, the velocity of the vehicle 10, the steering angle δ, and the hitch angle γ between the vehicle 10 and the trailer 12. In some implementations, the fields of view 62-70 may also be combined to form a composite aerial view or bird's eye view of the vehicle 10 and the trailer 12. Information related to the operating state and orientation of the vehicle 10 relative to the trailer 12 may also be utilized to generate a simulated aerial view of the vehicle 10 and the trailer 12 demonstrating the hitch angle γ about point 80.
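

As a rough illustration of how a simulated aerial view might place the trailer model relative to the vehicle model using the hitch angle γ about point 80, the following sketch computes a trailer pose from a vehicle pose and hitch angle. It is not taken from the patent; the hitch offset, trailer length, and function names are illustrative assumptions.

```python
import math

def trailer_pose(veh_x, veh_y, veh_heading, hitch_angle,
                 hitch_offset=1.2, trailer_length=4.0):
    """Place a trailer model behind a vehicle model for a simulated aerial view.

    veh_heading and hitch_angle are in radians; hitch_offset is the distance from
    the vehicle reference point back to the hitch (point 80), and trailer_length
    is the hitch-to-axle distance.  All values are illustrative assumptions.
    """
    # Hitch point sits behind the vehicle along its longitudinal centerline.
    hitch_x = veh_x - hitch_offset * math.cos(veh_heading)
    hitch_y = veh_y - hitch_offset * math.sin(veh_heading)

    # Trailer centerline is rotated from the vehicle centerline by the hitch angle.
    trailer_heading = veh_heading + hitch_angle

    # Trailer reference point (e.g. axle center) extends rearward from the hitch.
    trl_x = hitch_x - trailer_length * math.cos(trailer_heading)
    trl_y = hitch_y - trailer_length * math.sin(trailer_heading)
    return trl_x, trl_y, trailer_heading

# Example: vehicle at the origin heading "up" the view with a 15 degree hitch angle.
print(trailer_pose(0.0, 0.0, math.pi / 2, math.radians(15.0)))
```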


The various views of the vehicle 10 and the trailer 12, as discussed herein, may be generated and displayed by a controller on the display 13 such that an operator of the vehicle 10 may view the information corresponding to the vehicle 10, the trailer 12, and the surrounding operating environment 14. The display 13 may be implemented in the vehicle 10 as a center stack monitor, rear view display mirror, gauge cluster monitor, a heads-up display, or any other device configured to present the image data processed from the imaging devices C1-C5. The image data from the imaging devices C1-C5 may be raw image data, lens corrected camera image data, composite image data, or any other form of image data captured by the imaging devices C1-C5 or any other form of imaging device.


Referring to FIG. 4, a block diagram of a controller 82 is shown. The controller 82 may be combined or in communication with a trailer backup assist system. The controller 82 may receive and process image data from imaging devices C1-C5 to generate a variety of views for display on display 13. Display 13 may include a plurality of user inputs 84 to enable the controller 82 to receive selections from an operator of the vehicle 10. Display 13 may also include a screen 86 for showing one or more views, which may be selected by the operator and/or autonomously generated. According to one implementation, the screen 86 can be configured as a touch screen for registering one or more touch events. The screen 86 may employ resistive sensing, capacitive sensing, surface acoustic wave sensing, or any other sensing means capable of registering a single or multi-touch event for allowing an operator to input a variety of user commands related to trailer backup assist functionality.


The controller 82 may also be in communication with a first navigational system 88 that includes a GPS device 90, a compass 92, and one or more inertial sensors 94, each of which can be equipment already on-board the vehicle 10. The GPS device 90 can include GPS receiver 91 and is operable to determine a global position and location of the vehicle 10 and communicate the position and location to the controller 82. The compass 92 can be operable to determine the heading direction of the vehicle 10 relative to a geographic compass direction and communicate the heading direction to the controller 82. The inertial sensors 94 can be operable to determine the motion and rotation of the vehicle 10. They may include one or more motion sensors 96 (e.g. an accelerometer) and rotation sensors 98 (e.g. a gyroscope).


The controller 82 may further be in communication with a second navigational system 100, which can include a GPS receiver 102 and one or more inertial sensors 104. According to one implementation, GPS receiver 102 is integrated with imaging device C2. Optionally, inertial sensors 104 may also be integrated with imaging device C2, which can be configured as the portable electronic device 46 shown in FIG. 2. However, it should be appreciated that imaging device C2 can be implemented as a dedicated piece of equipment that is fixed to the trailer 12. Further, GPS receiver 102 and inertial sensors 104 can be provided elsewhere on the trailer 12 and may be incorporated with other equipment and/or structures on the trailer 12.


GPS receiver 102 may be operable to determine a global position and location of the trailer 12 and communicate the position and location to the controller 82. Inertial sensors 104 may be operable to determine the motion and rotation of the trailer 12 and can include any sensor configurations described herein. By providing a navigational system 100 on the trailer 12, the hitch angle γ between the vehicle 10 and the trailer 12 can be determined without the need for image based target recognition. This would also eliminate the need for an operator to attach a target (e.g. target 18) on the trailer 12 or perform vehicle/trailer measurements related to setting up an image based target detection system.


In one implementation, the controller 82 can calculate the hitch angle γ by comparing the vehicle position to the trailer position using vehicle position data received from GPS receiver 91 and trailer position data received from GPS receiver 102. In another implementation, the controller 82 may include a hitch angle detection module 106 configured to alternate between receiving vehicle position data outputted from GPS receiver 91 and trailer position data outputted from GPS receiver 102. The hitch angle detection module 106 can include a Kalman filter 108 for smoothing and extrapolating a vehicle position and a trailer position from the vehicle position data and the trailer position data and subsequently computing a hitch angle γ based on the extrapolated vehicle position and the extrapolated trailer position. In yet another implementation, the controller 82 may calculate the hitch angle γ based on data received from the inertial sensors 94 associated with the vehicle 10 and the inertial sensors 104 associated with the trailer 12. For instance, inertial sensors 94 and 104 can provide the controller 82 with data related to an instantaneous vehicle direction and an instantaneous trailer direction, respectively, which the controller 82 can use to calculate the hitch angle γ. In yet another implementation, the controller 82 may utilize position data for the vehicle 10 as a reference to compute differential position biases for the trailer 12 and vice versa. Doing so may result in more accurate relative position calculations between the vehicle 10 and the trailer 12, thereby resulting in more precise hitch angle γ calculations. It should be appreciated that each of the abovementioned implementations can be combined or performed separately.
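

The following minimal sketch illustrates one of the approaches described above: deriving vehicle and trailer headings from successive GPS fixes (e.g. from GPS receivers 91 and 102) and taking their difference as the hitch angle γ. It uses a simple flat-earth projection and no smoothing in place of the Kalman filter 108; the function names and sample coordinates are illustrative assumptions.

```python
import math

EARTH_RADIUS_M = 6_371_000.0

def local_xy(lat, lon, lat0, lon0):
    """Project (lat, lon) to a flat local frame centered at (lat0, lon0)."""
    x = math.radians(lon - lon0) * EARTH_RADIUS_M * math.cos(math.radians(lat0))
    y = math.radians(lat - lat0) * EARTH_RADIUS_M
    return x, y

def heading(prev_fix, curr_fix, origin):
    """Heading (radians, east = 0) implied by two successive GPS fixes."""
    x0, y0 = local_xy(*prev_fix, *origin)
    x1, y1 = local_xy(*curr_fix, *origin)
    return math.atan2(y1 - y0, x1 - x0)

def hitch_angle(vehicle_fixes, trailer_fixes, origin):
    """Hitch angle as the difference between vehicle and trailer headings."""
    gamma = heading(*trailer_fixes, origin) - heading(*vehicle_fixes, origin)
    # Normalize to (-pi, pi].
    return math.atan2(math.sin(gamma), math.cos(gamma))

veh = ((42.3000, -83.2000), (42.3001, -83.2000))    # two fixes, e.g. GPS receiver 91
trl = ((42.2999, -83.2000), (42.29998, -83.20003))  # two fixes, e.g. GPS receiver 102
print(math.degrees(hitch_angle(veh, trl, veh[0])))
```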


As is further shown in FIG. 4, the controller 82 can be configured to communicate with one or more vehicle systems, shown as powertrain system 110, steering system 112, brake system 114, and a gear selection device (PRDNL) 116. Jointly, the powertrain system 110, steering system 112, brake system 114, and gear selection device 116 may cooperate to control the vehicle 10 and the trailer 12 during a backing procedure. According to one implementation, the controller 82 may send instructions to any one of the powertrain system 110, steering system 112, brake system 114, and gear selection device 116 based on input received from a steering input apparatus 118, which may include information defining a path of travel of the vehicle 10 and the trailer 12. The steering input apparatus 118 can be configured as a rotatable device (e.g. knob, steering wheel) that allows an operator of the vehicle 10 to steer the vehicle 10 during a backing maneuver.


The controller 82 may include a memory 120 coupled to one or more processors 122 for executing instructions 124 stored in the memory 120. The memory 120 and instructions 124 together define an example of a non-transient processor-readable medium. The controller 82 may further include a plurality of modules for combining the image data received from the imaging devices C1-C5 with satellite image data (e.g. from GPS device 90) to form various composite views of the operating environment 14 surrounding the vehicle 10 and the trailer 12. The plurality of modules may include a distortion correction module 126, a view conversion module 128, an image trimming/scaling module 130, an image reference identification module 132, and an image compositor 134.


To generate a composite view combining imaging data corresponding to two or more of the imaging devices C1-C5, the controller 82 may receive image data from the imaging devices C1-C5 and correct any distortion in the image data with the distortion correction module 126. Distortion in the image data may be the result of lens distortion, viewpoint correction, or any other form of distortion common in imaging devices. The view conversion module 128 may then convert a viewpoint of the image data. A viewpoint correction may correspond to altering the orientation of a perspective of the image data corresponding to a field of view of an imaging device. For example, the image data may be adjusted from a side view to an aerial view. The image data from each of the two or more imaging devices may then be trimmed and scaled by the image trimming/scaling module 130 and combined in the image compositor 134. The composite image data output by the compositor 134 may form an expanded field of view, an aerial view, or any combination of the image data received from the imaging devices C1-C5.
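

A minimal sketch of such a compositing chain is shown below, assuming OpenCV-style calibration data and ground-plane homographies; the function name, parameter names, and compositing rule are illustrative assumptions and do not correspond to the modules' actual interfaces.

```python
import cv2
import numpy as np

def composite_aerial_view(frames, calibrations, homographies, canvas_size):
    """Sketch of the distortion-correct / view-convert / trim-scale / composite chain.

    frames        : dict of camera id -> BGR image (e.g. {"C1": img1, ...})
    calibrations  : dict of camera id -> (camera_matrix, dist_coeffs)
    homographies  : dict of camera id -> 3x3 ground-plane homography mapping the
                    corrected camera image into the shared top-down canvas
    canvas_size   : (width, height) of the composite aerial view
    Calibration and homography values would come from an offline procedure.
    """
    canvas = np.zeros((canvas_size[1], canvas_size[0], 3), dtype=np.uint8)
    for cam_id, frame in frames.items():
        K, dist = calibrations[cam_id]
        undistorted = cv2.undistort(frame, K, dist)   # cf. distortion correction module 126
        top_down = cv2.warpPerspective(undistorted,   # cf. view conversion module 128
                                       homographies[cam_id],
                                       canvas_size)
        # Trimming/scaling is implicit in the warp; composite by keeping the
        # brightest contribution per pixel (a stand-in for real blending/stitching).
        canvas = np.maximum(canvas, top_down)
    return canvas
```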


In some implementations, the relative location of the image data received from the two or more imaging devices may further be aligned by the image reference identification module 132. The image reference identification module 132 may be operable to detect and identify objects in the image data received from each of the imaging devices C1-C5 and utilize objects in different fields of view to align and accurately combine the image data. The image compositor 134 may further be able to identify occluded and/or missing image data and request satellite image data or other feature data from the GPS device 90 to further supplement and enhance the composite image data. The resulting enhanced composite image data may then be output to the screen 86 for display to the operator of the vehicle 10.


Referring to FIG. 5, an aerial view of the vehicle 10 and the trailer 12 is displayed on the screen 86 of display 13. A vehicle model 136 of the vehicle 10 and a trailer model 138 of the trailer 12 may be incorporated into the aerial view by the controller 82 as sample image data and/or rendered graphics. The sample image data may include stock images of the vehicle 10 and a library of trailer images that may be incorporated in the aerial view to demonstrate the proportions and position of the vehicle 10 relative to the trailer 12. In some implementations, the dimensions of the trailer 12 may be input by the vehicle operator via the user inputs 84 of display 13. The controller 82 may also be operable to estimate the dimensions of the trailer 12 based on known relationships of the positions of each of the imaging devices C1-C5. For example, the controller 82 may be operable to detect the trailer 12 in the image data with the image reference identification module 132. Based on the known relationships of the positions of the imaging devices C1-C5 and the corresponding fields of view 62-70, the controller 82 may be operable to determine the proportions, approximate dimensions, and shape of the trailer 12 to generate the trailer model 138.
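

As a simple illustration of estimating a trailer dimension from image data and known camera geometry, the following pinhole-camera sketch recovers an approximate trailer width from its pixel extent; the numbers and parameter names are assumptions for illustration only.

```python
def estimate_trailer_width(pixel_width, focal_length_px, distance_m):
    """Rough pinhole-camera estimate of trailer width from rear-camera image data.

    pixel_width      : width of the detected trailer front in image pixels
    focal_length_px  : camera focal length expressed in pixels (from calibration)
    distance_m       : camera-to-trailer distance, e.g. derived from the known
                       hitch geometry of the vehicle/trailer combination
    """
    return pixel_width * distance_m / focal_length_px

# Example: a 600 px wide trailer face seen at 2.5 m with an 800 px focal length.
print(estimate_trailer_width(600, 800, 2.5))  # ~1.9 m
```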


The controller 82 may further utilize the hitch angle γ to process and compare image data of the trailer 12 in different positions relative to the vehicle 10 to gain additional image data to determine the proportions, approximate dimensions, and shape of the trailer 12. The hitch angle γ may further be utilized by the controller 82 to display the trailer model 138 relative to the vehicle model 136 at the corresponding hitch angle γ. By demonstrating the vehicle model 136 and the trailer model 138, the controller 82 may provide useful information to the operator of the vehicle 10. In some implementations, a graphic outline simulating the trailer 12 may also be included in the image data displayed on the screen 86 for a reference to the operator of the vehicle 10 to demonstrate the position of the trailer model 138 relative to the vehicle model 136 and an operating environment model 140. Based on the determined proportions, approximate dimensions, and shape of the trailer 12, the controller 82 may automatically select a trailer graphic or a stock image of a trailer model 138 from a library of trailer images or graphics via memory 120.


A plurality of environmental features 142 may also be displayed on the screen 86 by the controller 82. The environmental features 142 may be incorporated in the image data displayed on the screen 86 to demonstrate a location of the environmental features 142 relative to the vehicle model 136 and the trailer model 138. The locations of the environmental features 142 may be extrapolated from the composite image data captured by the imaging devices C1-C5 by the image reference identification module 132 of the controller 82. Each of the environmental features 142 may be identified based on one or more feature identification algorithms configured to identify various natural and man-made features that may obstruct the path of the vehicle 10 and the trailer 12. Additionally or alternatively, sensors and/or radar may be used for detecting environmental features that may be in the path of the vehicle 10 and the trailer 12.


The environmental features 142 may be identified and incorporated in the aerial view based on image data, satellite image data, and any other data corresponding to the position and heading of the vehicle 10. Based on the position and heading of the vehicle 10, the environmental features 142 may be added to the composite image data and located on the screen 86 relative to the vehicle model 136 and the trailer model 138 by utilizing global positions of each of the environmental features 142. The location of the environmental features 142 may be determined by the controller 82 from the GPS device 90 and the compass 92. By enhancing the aerial view with satellite image data, the controller 82 may provide additional information that may be used in addition to the information identified from the imaging devices C1-C5. In some implementations, satellite image data may further be utilized by the controller 82 to provide information corresponding to a region that may be occluded from the fields of view 62-70 of the imaging devices C1-C5.
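

A minimal sketch of placing a geolocated environmental feature 142 on the aerial view relative to the vehicle model is given below, assuming a flat-earth approximation, a known vehicle position and compass heading, and an illustrative pixels-per-meter zoom factor; none of these names come from the patent.

```python
import math

def feature_to_screen(feature_latlon, vehicle_latlon, vehicle_heading_deg,
                      px_per_meter, screen_center):
    """Map a geolocated environmental feature into aerial-view screen coordinates.

    The vehicle model is assumed to sit at screen_center with the vehicle heading
    pointing "up" on the screen; px_per_meter sets the zoom level.
    """
    lat0, lon0 = vehicle_latlon
    lat, lon = feature_latlon
    # Small-area flat-earth offsets in meters (east, north).
    east = math.radians(lon - lon0) * 6_371_000.0 * math.cos(math.radians(lat0))
    north = math.radians(lat - lat0) * 6_371_000.0
    # Rotate into the vehicle frame so the heading points up the screen.
    h = math.radians(vehicle_heading_deg)
    right = east * math.cos(h) - north * math.sin(h)
    fwd = east * math.sin(h) + north * math.cos(h)
    cx, cy = screen_center
    return int(cx + right * px_per_meter), int(cy - fwd * px_per_meter)

# Example: a feature slightly north-east of a north-facing vehicle at 10 px/m zoom.
print(feature_to_screen((42.3002, -83.1998), (42.3000, -83.2000), 0.0, 10, (400, 300)))
```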


The screen 86 of display 13 may be configured as a touchscreen of any type, such as a resistive, capacitive, surface acoustic wave, infrared, or optical type. The plurality of user inputs 84 may be implemented as soft keys and provide options for the operator of the vehicle 10 to alter a view displayed by the controller 82. The soft keys may allow the operator of the vehicle 10 to view the operating environment model 140 and select a view corresponding to each of the imaging devices C1-C5, a combination of the imaging devices C1-C5, or the composite aerial view. The soft keys may further provide an option for a manual mode to manually control the view displayed on the screen 86 or an automatic mode to automatically control the view displayed on the screen 86.


While the composite aerial view is selected, an operator of the vehicle 10 may touch soft key 144 to enter a path input mode. When the path input mode is activated, the controller 82 may prompt the display 13 to display a path input screen 145 as shown in FIG. 6. The path input screen 145 can correspond to a “zoomed out” aerial view of the aerial view shown in FIG. 5. Additionally, the zoomed out aerial view may show other environmental features 142 that were not shown in the previous aerial view. By providing a zoom feature, the operator of the vehicle 10 is afforded a greater view of the operating environment. However, it is contemplated that the operator may decrease the amount of zoom should the operator desire a magnified view. The amount of zoom may be adjusted automatically or manually via zoom out key 146 and zoom in key 148. It is also contemplated that the vehicle model 136 and the trailer model 138 may be initially located at the center of the screen 86 by default. An operator may later adjust the screen center by touching the edit center soft key 150 and then dragging the aerial view in a desired direction. Once the aerial view is located in the desired position, the operator may touch the edit center soft key 150 to lock the new aerial view in place. The controller 82 may fit the new aerial view to the screen 86 and analyze image data and/or satellite image data to display any new environmental features 142 arising therefrom.


According to one implementation, the screen 86 is configured to register a touch event that inputs an intended backing path for the vehicle 10 and the trailer 12. Before the intended backing path can be inputted, certain prerequisite conditions may be required. For instance, it may be required for the gear selection device 116 of the vehicle 10 to be in either a park or a reverse position and that the vehicle 10 and the trailer 12 be aligned with one another. To input an intended backing path, the operator of the vehicle 10 may touch the new path soft key 151 and then trace the intended backing path on the screen 86. At any point, the operator may touch the exit soft key 153 to exit the path input mode.


An intended backing path 152 is exemplarily shown in FIG. 7. The intended backing path 152 may be traced (e.g. using a finger) from the rear of the trailer model 138, terminating at a final position 154 selected by the operator. Once traced, the intended backing path 152 is represented as an overlay on the screen 86. The intended backing path 152 may be curved, straight, or a combination thereof. Preferably, the intended backing path 152 is traced to avoid obstacles displayed on the screen 86, which may include environmental features 142 and any objects that would impede the vehicle 10 and trailer 12 from being backed along the intended backing path 152. The obstructions can be natural and/or man-made and may be detected using one or a combination of imaging devices C1-C5 in addition to satellite image data. It is also contemplated that sensors and/or radar may be employed for object detection.
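

The traced path can be thought of as a polyline of touch points; the sketch below resamples such a polyline at an even spacing so that the overlay and later path checks operate on uniform samples. The function name and spacing value are illustrative assumptions, not part of the patent.

```python
import math

def resample_path(points, step=5.0):
    """Resample a finger-traced path (list of (x, y) screen points) at a fixed
    spacing so later overlay drawing and path checks work on even samples."""
    if len(points) < 2:
        return list(points)
    # Cumulative arc length along the traced polyline.
    dists = [0.0]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dists.append(dists[-1] + math.hypot(x1 - x0, y1 - y0))
    total = dists[-1]
    out, target, i = [points[0]], step, 1
    while target < total:
        while dists[i] < target:
            i += 1
        (x0, y0), (x1, y1) = points[i - 1], points[i]
        t = (target - dists[i - 1]) / (dists[i] - dists[i - 1])
        out.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
        target += step
    out.append(points[-1])
    return out

print(resample_path([(0, 0), (10, 0), (10, 10)], step=4.0))
```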


When the intended backing path 152 has been traced, the operator may touch soft key 156 to accept the intended backing path 152 or otherwise touch soft key 158 to trace a new one. While the intended backing path 152 is being traced or afterwards, the controller 82 may determine if any unacceptable path conditions are present. An unacceptable path condition may arise if any traced portions of the intended backing path 152 would result in a hitch angle γ between the vehicle 10 and the trailer 12 exceeding a maximum hitch angle γ, thereby creating a possible jackknife condition. Another unacceptable path condition may arise if one or more obstacles block the intended backing path 152. If one or more unacceptable path conditions arise, the controller 82 may generate a warning to the operator indicating that the intended backing path 152 requires revision or that a new intended backing path needs to be inputted. The warning may be of any type intended to stimulate the senses of an operator and may include warnings that are visual, auditory, tactile, or a combination thereof.
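

The sketch below illustrates one way such checks could be expressed: it estimates the turn radius along the traced path, converts it to a steady-state hitch angle using the simplified relation sin γ = L/R (hitch assumed at the towing vehicle's rear axle), and flags samples that exceed a maximum hitch angle or fall inside an obstacle. The limits, trailer length, obstacle representation, and function names are illustrative assumptions.

```python
import math

MAX_HITCH_ANGLE = math.radians(60.0)   # illustrative jackknife limit
TRAILER_LENGTH_M = 4.0                 # illustrative hitch-to-axle length

def turn_radius(p0, p1, p2):
    """Radius of the circle through three consecutive path samples (meters)."""
    a, b, c = math.dist(p0, p1), math.dist(p1, p2), math.dist(p0, p2)
    area = abs((p1[0] - p0[0]) * (p2[1] - p0[1]) -
               (p2[0] - p0[0]) * (p1[1] - p0[1])) / 2.0
    return float("inf") if area == 0 else (a * b * c) / (4.0 * area)

def path_conditions(path_m, obstacles):
    """Flag traced samples that imply an excessive hitch angle or hit an obstacle.

    path_m    : list of (x, y) path samples in meters (vehicle frame)
    obstacles : list of (x, y, radius) discs standing in for detected features
    Returns one boolean per sample, True where the path is acceptable there.
    """
    ok = [True] * len(path_m)
    for i in range(1, len(path_m) - 1):
        r = turn_radius(path_m[i - 1], path_m[i], path_m[i + 1])
        required = math.asin(min(1.0, TRAILER_LENGTH_M / r))
        if required > MAX_HITCH_ANGLE:
            ok[i] = False
    for i, (x, y) in enumerate(path_m):
        if any(math.dist((x, y), (ox, oy)) < orad for ox, oy, orad in obstacles):
            ok[i] = False
    return ok

# Example: last sample lands inside an obstacle and would be drawn as obstructed.
print(path_conditions([(0, 0), (1, -2), (1.5, -4), (1.5, -6)], [(1.5, -6.0, 1.0)]))
```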


In FIG. 8, an intended backing path 152 is exemplarily shown crossing through an obstacle 160. As shown, the intended backing path 152 has various traced portions corresponding to unobstructed portions 162 and an obstructed portion 164. To alert the operator of the unacceptable path condition, the controller 82 may display the unobstructed portions 162 of the intended backing path 152 in a first color (e.g. green) and display the obstructed portion 164 in a second color (e.g. red) that is visually distinguishable from the first color. In response to the unacceptable path condition, another touch event may be performed to modify the intended backing path 152.


According to one implementation shown in FIG. 9, the operator may manipulate the obstructed portion 164 of the intended backing path 152 by dragging the obstructed portion 164 away from the obstacle 160. It is contemplated that the operator may drag any area of the obstructed portion 164 in any direction. While this occurs, the controller 82 may automatically adjust the curvature of the modified intended backing path 166 as needed to ensure allowable hitch angles γ and/or avoid any other potential obstacles. If no unacceptable backing conditions arise from the modified intended backing path 166, soft key 168 may become available to allow the operator to accept the modified intended backing path 166. Otherwise, the operator may touch soft key 170 to exit back to the path input screen 145. Additionally or alternatively, the controller 82 may automatically generate one or more possible backing paths A, B, as shown in FIG. 10, and the operator may be given the option of selecting one of the suggested backing paths A, B via a corresponding soft key 172, 174.


Once a backing path has been entered via soft key 168, the controller 82 may extrapolate GPS coordinates for all points along the backing path. The controller 82 may work in conjunction with the GPS device 90 and send instructions to the powertrain system 110, steering system 112, and/or brake system 114 to back the vehicle 10 and the trailer 12 along the inputted backing path. Depending on which systems 110, 112, 114 are employed, the backing maneuver may be completely autonomous or require some actions on the part of the operator of the vehicle 10. While the vehicle 10 and the trailer 12 are backing along the backing path, the operator may adjust the path curvature using the steering input apparatus 118 and/or by performing another touch event on the screen 86 (e.g. dragging a traced portion of the backing path). The final resulting backing path may be saved to the GPS device 90 or other location, either manually or automatically. Additionally, the GPS coordinates along with the orientation of the vehicle 10 and trailer 12 may also be saved to the GPS device 90 and/or other location. In this manner, an operator performing repetitive backing maneuvers can simply retrieve and order a saved backing path to be performed instead of having to manually input the backing path each time. Similarly, when an operator pulls out of a parking spot, the corresponding pull out path may also be saved accordingly and may be subsequently or concurrently displayed as an overlay on the screen 86. It should be appreciated that an operator may input a pull out path via one or more touch events in a similar way to inputting a backing path, as described herein. Furthermore, saved backing and/or pull out paths may be offered as suggested paths when applicable.
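

The sketch below illustrates extrapolating GPS coordinates for the accepted on-screen path and saving the result for reuse, by inverting the screen mapping and writing a named path to a small JSON store; the mapping parameters, file format, and function names are assumptions, not the patent's implementation.

```python
import json
import math

def path_to_gps(path_px, vehicle_latlon, vehicle_heading_deg, px_per_meter, screen_center):
    """Convert an accepted on-screen backing path to GPS coordinates for storage.

    This inverts the screen mapping used for display: screen_center is where the
    vehicle model sits, px_per_meter is the current zoom, and the vehicle heading
    points up the screen.  All parameter names are illustrative.
    """
    lat0, lon0 = vehicle_latlon
    h = math.radians(vehicle_heading_deg)
    cx, cy = screen_center
    coords = []
    for x, y in path_px:
        right = (x - cx) / px_per_meter
        fwd = (cy - y) / px_per_meter
        east = right * math.cos(h) + fwd * math.sin(h)
        north = -right * math.sin(h) + fwd * math.cos(h)
        lat = lat0 + math.degrees(north / 6_371_000.0)
        lon = lon0 + math.degrees(east / (6_371_000.0 * math.cos(math.radians(lat0))))
        coords.append((lat, lon))
    return coords

def save_backing_path(name, coords, filename="saved_paths.json"):
    """Persist a named backing path so repetitive maneuvers can be recalled later."""
    try:
        with open(filename) as f:
            store = json.load(f)
    except FileNotFoundError:
        store = {}
    store[name] = coords
    with open(filename, "w") as f:
        json.dump(store, f, indent=2)

coords = path_to_gps([(400, 300), (410, 250)], (42.3000, -83.2000), 0.0, 10, (400, 300))
save_backing_path("driveway", coords)
```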


Referring now to FIG. 11, a method 175 for inputting a backing path is shown. The method may be implemented using the controller 82 and other equipment described herein and shown in FIGS. 4-10. More specifically, the method can be embodied as instructions 124 stored in memory 120 and executable by processor 122. The method 175 may begin at step 176, which includes generating an aerial view of a vehicle 10 and a trailer 12 based on at least one of image data and satellite image data. Step 178 includes displaying the aerial view on a display 13 having a touch screen 86. Step 180 includes registering a touch event on the touch screen 86 that inputs an intended backing path for the vehicle 10 and the trailer 12. Step 182 includes determining whether an unacceptable path condition is present. If an unacceptable path condition exists, step 184 includes modifying the intended backing path to overcome the unacceptable path condition or selecting a backing path from one or more suggested backing paths to overcome the unacceptable path condition. Step 186 includes backing the vehicle 10 and the trailer 12 along either the intended backing path or the backing path selected in step 184.


The systems and methods described herein may offer improvements to the functionality of a trailer backup assist system. Though the systems and methods were described and illustrated herein as being implemented on a specific vehicle and trailer, it should be appreciated that the systems and methods described herein may be utilized with any vehicle and trailer combination in accordance with the disclosure.


It is to be understood that variations and modifications can be made on the aforementioned structure without departing from the concepts of the present invention, and further it is to be understood that such concepts are intended to be covered by the following claims unless these claims by their language expressly state otherwise.

Claims
  • 1. A method of inputting a backing path, comprising: displaying an aerial view of a vehicle and a trailer on a display having a touch screen; tracing, on the touch screen, an intended backing path for the vehicle and the trailer, wherein unobstructed portions and obstructed portions of the intended backing path are displayed in different colors; and dragging an obstructed portion of the intended backing path to modify a path curvature of the intended backing path.
  • 2. The method of claim 1, further comprising determining whether an unacceptable path condition is present and generating a warning indicating the presence of the unacceptable path condition.
  • 3. The method of claim 2, wherein the unacceptable path condition comprises one of a traced portion of the intended backing path resulting in a hitch angle between the vehicle and the trailer exceeding a maximum hitch angle and one or more obstacles blocking a traced portion of the intended backing path.
  • 4. The method of claim 1, further comprising automatically adjusting a path curvature of the intended backing path to avoid an unacceptable path condition.
  • 5. The method of claim 1, further comprising generating one or more possible backing paths and selecting between the intended backing path and the one or more possible backing paths.
  • 6. A method of inputting a backing path, comprising the steps of: displaying an aerial view of a vehicle and a trailer on a touch screen; tracing, on the touch screen, an intended backing path for the vehicle and the trailer; and dragging an obstructed portion of the intended backing path to modify the intended backing path.
  • 7. The method of claim 6, further comprising determining whether an unacceptable path condition is present and generating a warning indicating the presence of the unacceptable path condition.
  • 8. The method of claim 7, wherein the unacceptable path condition comprises one of a traced portion of the intended backing path resulting in a hitch angle between the vehicle and the trailer exceeding a maximum hitch angle and one or more obstacles blocking a traced portion of the intended backing path.
  • 9. The method of claim 6, further comprising automatically adjusting a path curvature of the intended backing path to avoid an unacceptable path condition.
  • 10. The method of claim 6, further comprising generating one or more possible backing paths and selecting between the intended backing path and the one or more possible backing paths.
  • 11. The method of claim 6, further comprising saving the intended backing path for later use.
  • 12. The method of claim 6, wherein the obstructed portion of the intended backing path is displayed in a color that is visually distinguishable from a color used to display an unobstructed portion of the intended backing path.
  • 13. A method of inputting a backing path, comprising the steps of: displaying an aerial view of a vehicle and a trailer on a touch screen; tracing, on the touch screen, an intended backing path for the vehicle and the trailer; determining whether an unacceptable path condition is present; generating a warning indicating the presence of the unacceptable path condition; and dragging an obstructed portion of the intended backing path to modify a path curvature of the intended backing path.
  • 14. The method of claim 13, wherein the obstructed portion of the intended backing path is displayed in a color that is visually distinguishable from a color used to display an unobstructed portion of the intended backing path.
  • 15. The method of claim 13, wherein the unacceptable path condition comprises one of a traced portion of the intended backing path resulting in a hitch angle between the vehicle and the trailer exceeding a maximum hitch angle and one or more obstacles blocking a traced portion of the intended backing path.
  • 16. The method of claim 13, further comprising automatically adjusting a path curvature of the intended backing path to avoid an unacceptable path condition.
  • 17. The method of claim 13, further comprising generating one or more possible backing paths and selecting between the intended backing path and the one or more possible backing paths.
  • 18. The method of claim 13, further comprising saving the intended backing path for later use.
CROSS-REFERENCE TO RELATED APPLICATIONS

This patent application is continuation-in-part of U.S. patent application Ser. No. 14/289,888, which was filed on May 29, 2014, entitled “DISPLAY SYSTEM UTILIZING VEHICLE AND TRAILER DYNAMICS,” which is a continuation-in-part of U.S. patent application Ser. No. 14/256,427, which was filed on Apr. 18, 2014, entitled “CONTROL FOR TRAILER BACKUP ASSIST SYSTEM,” which is a continuation-in-part of U.S. patent application Ser. No. 14/249,781, which was filed on Apr. 10, 2014, entitled “SYSTEM AND METHOD FOR CALCULATING A HORIZONTAL CAMERA TO TARGET DISTANCE,” which is a continuation-in-part of U.S. patent application Ser. No. 14/188,213, which was filed on Feb. 24, 2014, entitled “SENSOR SYSTEM AND METHOD FOR MONITORING TRAILER HITCH ANGLE,” which is a continuation-in-part of U.S. patent application Ser. No. 13/847,508, which was filed on Mar. 20, 2013, entitled “HITCH ANGLE ESTIMATION.” U.S. patent application Ser. No. 14/188,213 is also a continuation-in-part of U.S. patent application Ser. No. 14/068,387, which was filed on Oct. 31, 2013, entitled “TRAILER MONITORING SYSTEM AND METHOD,” which is a continuation-in-part of U.S. patent application Ser. No. 14/059,835, which was filed on Oct. 22, 2013, entitled “TRAILER BACKUP ASSIST SYSTEM,” which is a continuation-in-part of U.S. patent application Ser. No. 13/443,743 which was filed on Apr. 10, 2012, entitled “DETECTION OF AND COUNTERMEASURES FOR JACKKNIFE ENABLING CONDITIONS DURING TRAILER BACKUP ASSIST,” which is a continuation-in-part of U.S. patent application Ser. No. 13/336,060, which was filed on Dec. 23, 2011, entitled “TRAILER PATH CURVATURE CONTROL FOR TRAILER BACKUP ASSIST,” which claims benefit from U.S. Provisional Patent Application No. 61/477,132, which was filed on Apr. 19, 2011, entitled “TRAILER BACKUP ASSIST CURVATURE CONTROL.” U.S. patent application Ser. No. 14/249,781 is also a continuation-in-part of U.S. patent application Ser. No. 14/161,832 which was filed Jan. 23, 2014, entitled “SUPPLEMENTAL VEHICLE LIGHTING SYSTEM FOR VISION BASED TARGET DETECTION,” which is a continuation-in-part of U.S. patent application Ser. No. 14/059,835 which was filed on Oct. 22, 2013, entitled “TRAILER BACKUP ASSIST SYSTEM.” Furthermore, U.S. patent application Ser. No. 14/249,781 is a continuation-in-part of U.S. application Ser. No. 14/201,130 which was filed on Mar. 7, 2014, entitled “SYSTEM AND METHOD OF CALIBRATING A TRAILER BACKUP ASSIST SYSTEM,” which is a continuation-in-part of U.S. patent application Ser. No. 14/068,387, which was filed on Oct. 31, 2013, entitled “TRAILER MONITORING SYSTEM AND METHOD.” The aforementioned related applications are hereby incorporated by reference in their entirety.

2015074027 May 2015 WO
Non-Patent Literature Citations (58)
Hwang et al., “Mobile Robots at Your Fingertip: Bezier Curve On-line Trajectory Generation for Supervisory Control”, Proceedings of the 2003 IEEE/RSJ Intl. Conference on Intelligent Robots and Systems, Oct. 2003, pp. 1444-1449.
Khatib et al., “Dynamic Path Modification for Car-Like Nonholonomic Mobile Robots”, Proceedings of the 1997 IEEE International Conference on Robotics and Automation, Apr. 1997, pp. 2920-2925.
“Ford Super Duty: Truck Technology”, Brochure, www.media.ford.com, Sep. 2011, pp. 1-2.
“Ford Guide to Towing”, Trailer Life, Magazine, 2012, pp. 1-38.
“Dodge Dart: The Hot Compact Car”, Brochure, www.dart-mouth.com/enginerring-development.html, pp. 1-6; date unknown.
M. Wagner, D. Zoebel, and A. Meroth, “Adaptive Software and Systems Architecture for Driver Assistance Systems” International Journal of Machine Learning and Computing, Oct. 2011, vol. 1, No. 4, pp. 359-365.
Christian Lundquist, Wolfgang Reinelt, Olof Enqvist, “Back Driving Assistant for Passenger Cars with Trailer”, SAE Int'l, ZF Lenksysteme Gmbh, Schwaebisch Gmuend, Germany, 2006, pp. 1-8.
“Understanding Tractor-Trailer Performance”, Caterpillar, 2006, pp. 1-28.
Divelbiss, A.W.; Wen, J.T.; “Trajectory Tracking Control of a Car-Trailer System”, IEEE, Control Systems Technology, Aug. 6, 2002, vol. 5, No. 3, ISSN: 1063-6536, pp. 269-278.
Stahn, R.; Heiserich, G.; Stopp, A., “Laser Scanner-Based Navigation for Commercial Vehicles”, IEEE, Intelligent Vehicles Symposium, Jun. 2007, pp. 969-974, print ISBN: 1931-0587.
Widrow, B.; Lamego, M.M., “Neurointerfaces: Applications”, IEEE, Adaptive Systems for Signal Processing, Communications, and Control Symposium, Oct. 2000, pp. 441-444.
Dieter Zoebel, David Polock, Philipp Wojke, “Steering Assistance for Backing Up Articulated Vehicles”, Systemics, Cybernetics and Informatics, Universitaet Koblenz-Landau, Germany, vol. 1, No. 5, pp. 101-106; date unknown.
Stephen K. Young, Carol A. Eberhard, Philip J. Moffa, “Development of Performance Specifications for Collision Avoidance Systems for Lane Change, Merging and Backing”, TRW Space and Electronics Group, Feb. 1995, pp. 1-31.
Ford Motor Company, “09 F-150”, Brochure, www.fordvehicles.com, pp. 1-30; date unknown.
Michael Paine, “Heavy Vehicle Object Detection Systems”, Vehicle Design and Research Pty Limited for VicRoads, Jun. 2003, pp. 1-22.
Claudio Altafini, Alberto Speranzon, and Karl Henrik Johansson, “Hybrid Control of a Truck and Trailer Vehicle”, Springer-Verlag Berlin Heidelberg, HSCC 2002, LNCS 2289; 2002, pp. 21-34.
“2012 Edge—Trailer Towing Selector”, Brochure, Preliminary 2012 RV & Trailer Towing Guide Information, pp. 1-3.
“Meritor Wabco Reverse Detection Module for Trailers with 12-Volt Constant Power Systems”, Technical Bulletin, TP-02172, Revised Oct. 2004, pp. 1-8.
Simonoff, Adam J., “USH0001469 Remotely Piloted Vehicle Control and Interface System”, Aug. 1, 1995, pp. 1-7.
“Range Rover Evoque's Surround Camera System”, MSN, Douglas Newcomb, Jun. 15, 2012, pp. 1-2.
“Electronic Trailer Steering”, VSE, Advanced Steering & Suspension Solutions, Brochure, 2009, The Netherlands, pp. 1-28.
“WABCO Electronic Braking System—New Generation”, Vehicle Control Systems—An American Standard Company, www.wabco-auto.com, 2004, pp. 1-8.
T. Wang, “Reverse-A-Matic-Wheel Direction Sensor System Operation and Installation Manual”, Dec. 15, 2005, pp. 1-9.
“Wireless-Enabled Microphone, Speaker and User Interface for a Vehicle”, The IP.com, Aug. 26, 2004, pp. 1-5, IP.com disclosure No. IPCOM000030782D.
“RFID Read/Write Module”, Grand Idea Studio, 2013, pp. 1-3, website, http://www.grandideastudio.com/portfolio/rfid-read-write-module/.
Laszlo Palkovics, Pal Michelberger, Jozsef Bokor, Peter Gaspar, “Adaptive Identification for Heavy-Truck Stability Control”, Vehicle Systems Dynamics Supplement, vol. 25, No. sup1, 1996, pp. 502-518.
“Convenience and Loadspace Features” Jaguar Land Rover Limited, 2012, pp. 1-15, http://www.landrover.com/us/en/lr/all-new-range-rover/explore/.
“Delphi Lane Departure Warning”, Delphi Corporation, Troy, Michigan, pp. 1-2; date unknown.
Micah Steele, R. Brent Gillespie, “Shared Control Between Human and Machine: Using a Haptic Steering Wheel to Aid in Land Vehicle Guidance”, University of Michigan, pp. 1-5; date unknown.
“Electric Power Steering”, Toyota Hybrid System Diagnosis-Course 072, Section 7, pp. 1-10; date unknown.
“Telematics Past, Present, and Future,” Automotive Service Association, www.ASAshop.org, May 2008, 20 pgs.
“Fully Automatic Trailer Tow Hitch With LIN Bus,” https://webista.bmw.com/webista/show?id=1860575499&lang=engb&print=1, pp. 1-5; date unknown.
Nüsser, René; Pelz, Rodolfo Mann, “Bluetooth-based Wireless Connectivity in an Automotive Environment”, VTC, 2000, pp. 1935-1942.
Whitfield, Kermit, “A Hitchhiker's Guide to the Telematics Ecosystem”, Automotive Design & Production, Oct. 1, 2003, 3 pgs.
Narasimhan, N.; Janssen, C.; Pearce, M.; Song, Y., “A Lightweight Remote Display Management Protocol for Mobile Devices”, 2007, IEEE, pp. 711-715.
Microsoft, Navigation System, Sync Powered by Microsoft, Ford Motor Company, Jul. 2007, 164 pgs.
Microsoft, Supplemental Guide, Sync Powered by Microsoft, Ford Motor Company, Nov. 2007, 86 pgs.
Voelcker, J., “Top 10 Tech Cars: It's the Environment, Stupid”, IEEE Spectrum, Apr. 2008, pp. 26-35.
Microsoft, Navigation System, Sync Powered by Microsoft, Ford Motor Company, Oct. 2008, 194 pgs.
Microsoft, Supplemental Guide, Sync Powered by Microsoft, Ford Motor Company, Oct. 2008, 83 pgs.
Chantry, Darryl, “Mapping Applications to the Cloud”, Microsoft Corporation, Jan. 2009, 20 pgs.
Yarden, Raam; Surage Jr., Chris; Kim, Chong Il; Doboli, Alex; Voisan, Emil; Purcaru, Constantin, “TUKI: A Voice-Activated Information Browser”, 2009, IEEE, pp. 1-5.
Gil-Castiñeira, Felipe; Chaves-Diéguez, David; González-Castaño, Francisco J., “Integration of Nomadic Devices with Automotive User Interfaces”, IEEE Transactions on Consumer Electronics, Feb. 2009, vol. 55, Issue 1, pp. 34-41.
Microsoft, Navigation System, Sync Powered by Microsoft, Ford Motor Company, Jul. 2009, 196 pgs.
Microsoft, Supplemental Guide, Sync Powered by Microsoft, Ford Motor Company, Aug. 2009, 87 pgs.
Goodwin, Antuan, “Ford Unveils Open-Source Sync Developer Platform”, The Car Tech Blog, Oct. 29, 2009, 5 pgs. [Retrieved from http://reviews.cnet.com/8301-13746_7-10385619-48.html on Feb. 15, 2011].
Lamberti, Ralf, “Full Circle: The Rise of Vehicle-Installed Telematics”, Telematics Munich, Nov. 10, 2009, 12 pgs.
“Apple Files Patent Which Could Allow You to Control Your Computer Remotely Using iPhone”, Dec. 18, 2009, 7 pgs [Retrieved from www.iphonehacks.com on Jun. 22, 2010].
Newmark, Zack, “Student develop in-car cloud computing apps; envision the future of in-car connectivity”, May 4, 2010, 3 pgs [Retrieved from www.worldcarfans.com on Jun. 18, 2010].
“Service Discovery Protocol (SDP)”, Palo Wireless Bluetooth Resource Center, 7 pgs [Retrieved from http://palowireless.com/infotooth/tutorial/sdp.asp on Aug. 3, 2010].
Sonnenberg, Jan, “Service and User Interface Transfer from Nomadic Devices to Car Infotainment Systems”, Second International Conference on Automotive User Interfaces and Interactive Vehicular Applications (Automotive UI), Nov. 11-12, 2010, pp. 162-165.
“MobileSafer makes it easy to keep connected and safe”, ZoomSafer Inc., 2010, 5 pgs. [Retrieved from http://zoomsafer.com/products/mobilesafer on Dec. 28, 2010].
“PhonEnforcer FAQs”, Turnoffthecellphone.com, 3 pgs. [Retrieved from http://turnoffthecellphone.com/faq.html on Dec. 28, 2010].
“How PhonEnforcer Works”, Turnoffthecellphone.com, 2 pgs. [Retrieved from http://turnoffthecellphone.com/howitworks.htm on Dec. 28, 2010].
European Patent Office, European Search Report for Application No. EP11151623, Feb. 15, 2011, 7 pgs.
Wikipedia, “X Window System”, Wikipedia, The Free Encyclopedia, date unknown, 19 pgs. [Retrieved from http://en.wikipedia.org/w/index.php?title=X_Window_System&oldid=639253038].
Jung-Hoon Hwang, Ronald C. Arkin, and Dong-Soo Kwon; “Mobile robots at your fingertip: Bezier curve on-line trajectory generation for supervisory control,” IEEE/RSJ, International Conference on Intelligent Robots and Systems, Las Vegas, Nevada, Oct. 2003, 6 pages.
M. Khatib, H. Jaouni, R. Chatila, and J.P. Laumond; “Dynamic Path Modification for Car-Like Nonholonomic Mobile Robots,” IEEE, International Conference on Robotics and Automation, Albuquerque, New Mexico, Apr. 1997, 6 pages.
Related Publications (1)
Number Date Country
20140358429 A1 Dec 2014 US
Provisional Applications (1)
Number Date Country
61477132 Apr 2011 US
Continuation in Parts (13)
Number Date Country
Parent 14289888 May 2014 US
Child 14459926 US
Parent 14256427 Apr 2014 US
Child 14289888 US
Parent 14249781 Apr 2014 US
Child 14256427 US
Parent 14188213 Feb 2014 US
Child 14249781 US
Parent 13847508 Mar 2013 US
Child 14188213 US
Parent 14068387 Oct 2013 US
Child 13847508 US
Parent 14059835 Oct 2013 US
Child 14068387 US
Parent 13443743 Apr 2012 US
Child 14059835 US
Parent 13336060 Dec 2011 US
Child 13443743 US
Parent 14161832 Jan 2014 US
Child 14249781 US
Parent 14059835 US
Child 14161832 US
Parent 14201130 Mar 2014 US
Child 14249781 US
Parent 14068387 US
Child 14201130 US