This disclosure relates to a method and device for user-adjustable trajectories for automated vehicle reversing.
Trailers are usually unpowered vehicles that are pulled by a powered tow vehicle. A trailer may be a utility trailer, a popup camper, a travel trailer, a livestock trailer, a flatbed trailer, an enclosed car hauler, or a boat trailer, among others. The tow vehicle may be a car, a crossover, a truck, a van, a sports-utility-vehicle (SUV), a recreational vehicle (RV), or any other vehicle configured to attach to the trailer and pull the trailer. The trailer may be attached to the tow vehicle using a trailer hitch. A receiver hitch mounts on the tow vehicle and connects to the trailer hitch to form a connection. The trailer hitch may be a ball and socket, a fifth wheel and gooseneck, or a trailer jack. Other attachment mechanisms may also be used.
Recent advancements in computing and sensor technology have led to improved vehicle autonomous driving. As such, it is desirable to provide an automated vehicle reversing system that is capable of planning a path from the tow vehicle to the trailer, allowing the tow vehicle to autonomously maneuver towards the trailer.
One aspect of the disclosure provides a method for autonomously maneuvering a vehicle in a rearward direction towards a point of interest. The method includes receiving, at data processing hardware, one or more images from a camera positioned on a back portion of the vehicle and in communication with the data processing hardware. The method also includes overlaying, at the data processing hardware, a path on the one or more images. The method also includes receiving, at the data processing hardware, a command by way of a user interface in communication with the data processing hardware. The command includes instructions to adjust the path. The method also includes adjusting, at the data processing hardware, the path based on the received command. The method also includes transmitting, from the data processing hardware to a drive system in communication with the data processing hardware, a drive command causing the vehicle to autonomously maneuver along the adjusted path.
Implementations of the disclosure may include one or more of the following optional features. In some implementations, the command includes instructions to adjust a distance of the path, instructions to adjust an angle of the path, and/or instructions to adjust an angle of an end portion of the path. In some examples, the point of interest is a trailer. Adjusting the angle of the end portion of the path causes a fore-aft axis of the vehicle to be aligned with a fore-aft axis of the trailer.
In some implementations, before transmitting a drive command, the method includes receiving an action from a driver causing the data processing hardware to transmit the drive command. The method may further include instructing the user interface to display a position of the vehicle relative to the path during autonomous maneuvering of the vehicle in the rearward direction.
Another aspect of the disclosure provides a system for autonomously maneuvering a vehicle in a rearward direction towards a point of interest. The system includes: data processing hardware; and memory hardware in communication with the data processing hardware. The memory hardware stores instructions that when executed on the data processing hardware cause the data processing hardware to perform operations that include the method described above.
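Purely as an illustration of the operations recited above, the following is a minimal sketch of the receive–overlay–adjust–transmit loop. Every helper name (capture_rear_image, render_overlay, read_command, adjust, send_drive_command) is a hypothetical placeholder for the camera, user interface, planner, and drive-system interfaces; none of these identifiers comes from the disclosure.

```python
# A minimal sketch of the receive-overlay-adjust-transmit loop described above.
# All helper names are hypothetical placeholders, not identifiers from the
# disclosure, and the objects passed in are assumed to expose these methods.

def hitch_assist_step(camera, user_interface, planner, drive_system):
    image = camera.capture_rear_image()          # one or more rear images
    path = planner.initial_path()                # proposed path overlaid below
    user_interface.render_overlay(image, path)   # path superimposed on the image

    command = user_interface.read_command()      # knob/button input
    while command.kind != "confirm":             # e.g., angle/distance/end-angle change
        path = planner.adjust(path, command)     # adjust the path per the command
        user_interface.render_overlay(image, path)
        command = user_interface.read_command()

    drive_system.send_drive_command(path)        # vehicle reverses along adjusted path
```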
The details of one or more implementations of the disclosure are set forth in the accompanying drawings and the description below. Other aspects, features, and advantages will be apparent from the description and drawings, and from the claims.
Like reference symbols in the various drawings indicate like elements.
A tow vehicle, such as, but not limited to, a car, a crossover, a truck, a van, a sports-utility-vehicle (SUV), or a recreational vehicle (RV), may be configured to tow a trailer. The tow vehicle connects to the trailer by way of a trailer hitch. It is desirable to have a tow vehicle that is capable of autonomously backing up towards a driver-specified position, for example a trailer, that is identified from an image of the rearward environment of the vehicle and displayed on a user interface, such as a user display.
Referring to the drawings, in some implementations, a driver of a tow vehicle 100 wants to autonomously maneuver the tow vehicle 100 in a rearward direction towards a trailer 200 positioned behind the tow vehicle 100.
The tow vehicle 100 may include a drive system 110 that maneuvers the tow vehicle 100 across a road surface based on drive commands having x, y, and z components, for example. As shown, the drive system 110 includes a front right wheel 112, 112a, a front left wheel 112, 112b, a rear right wheel 112, 112c, and a rear left wheel 112, 112d. The drive system 110 may include other wheel configurations as well. The drive system 110 may also include a brake system 114 that includes brakes associated with each wheel 112, 112a-d, and an acceleration system 116 that is configured to adjust a speed and direction of the tow vehicle 100. In addition, the drive system 110 may include a suspension system 118 that includes tires associated with each wheel 112, 112a-d, tire air, springs, shock absorbers, and linkages that connect the tow vehicle 100 to its wheels 112, 112a-d and allow relative motion between the tow vehicle 100 and the wheels 112, 112a-d. The suspension system 118 may be configured to adjust a height of the tow vehicle 100, allowing a tow vehicle hitch 120 (e.g., a tow vehicle hitch ball 122) to align with a trailer hitch 210 (e.g., a trailer hitch coupler 212), which allows for autonomous connection between the tow vehicle 100 and the trailer 200.
The tow vehicle 100 may move across the road surface by various combinations of movements relative to three mutually perpendicular axes defined by the tow vehicle 100: a transverse axis X, a fore-aft axis Y, and a central vertical axis Z. The transverse axis X extends between a right side and a left side of the tow vehicle 100. A forward drive direction along the fore-aft axis Y is designated as F, also referred to as a forward motion. In addition, an aft or rearward drive direction along the fore-aft axis Y is designated as R, also referred to as a rearward motion. When the suspension system 118 adjusts the suspension of the tow vehicle 100, the tow vehicle 100 may tilt about the X axis and/or Y axis, or move along the central vertical axis Z.
The tow vehicle 100 may include a user interface 130. The user interface 130 may include a display 132, a knob 134, and a button 136, which are used as input mechanisms. In some examples, the display 132 may show the knob 134 and the button 136, while in other examples the knob 134 and the button 136 are combined into a single knob-button. In some examples, the user interface 130 receives one or more driver commands from the driver via one or more input mechanisms or a touch screen display 132 and/or displays one or more notifications to the driver. The user interface 130 is in communication with a vehicle controller 150, which is in turn in communication with a sensor system 140. In some examples, the display 132 displays an image of an environment of the tow vehicle 100, leading to one or more commands being received by the user interface 130 (from the driver) that initiate execution of one or more behaviors. In some examples, the display 132 displays an image of the rearward environment of the vehicle 100. In this case, the driver can select a position within the image that the driver wants the vehicle to autonomously maneuver towards. In some examples, the display 132 displays one or more representations of trailers 200 positioned behind the vehicle 100. In this case, the driver selects one representation of a trailer 200 for the vehicle 100 to autonomously maneuver towards.
The display 132 displays a planned path 182 of the vehicle 100 that is superimposed on the camera image 143 of the rearward environment of the vehicle 100. The driver may change the planned path 182 using the user interface 130. For example, the driver may turn the knob 134, which simulates a virtual steering wheel. As the driver turns the knob 134, the planned path 182 shown on the display 132 is updated. The driver adjusts the displayed path 182 until an updated planned path 182 displayed on the display 132 intersects the trailer representation 138 or another object that the driver wants the vehicle 100 to drive towards. Once the driver is satisfied with the displayed planned path 182, the driver executes an action indicative of finalizing the path 182, which allows the vehicle 100 to autonomously follow the planned path 182.
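As an illustration of the knob acting as a virtual steering wheel, one plausible mapping from knob position to the curvature of the displayed path 182 is sketched below using a simple bicycle-model relation. The 35-degree steering limit and 2.5 m wheelbase are assumed example values, not parameters given in the disclosure.

```python
import math

# Illustrative only: a possible mapping from knob position to the curvature of
# the displayed path 182, using curvature = tan(steering angle) / wheelbase.
# The steering limit and wheelbase below are assumed example values.
MAX_STEER_RAD = math.radians(35.0)
WHEELBASE_M = 2.5

def knob_to_curvature(knob_fraction: float) -> float:
    """Map a knob position in [-1.0, 1.0] to a signed path curvature in 1/m."""
    knob_fraction = max(-1.0, min(1.0, knob_fraction))
    return math.tan(knob_fraction * MAX_STEER_RAD) / WHEELBASE_M
```

Each time the knob moves, the mapped curvature would be recomputed and the overlaid arc redrawn until it intersects the trailer representation 138.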
The tow vehicle 100 may include a sensor system 140 to provide reliable and robust driving. The sensor system 140 may include different types of sensors that may be used separately or with one another to create a perception of the environment of the tow vehicle 100 that is used by the tow vehicle 100 to drive and that aids the driver in making intelligent decisions based on objects and obstacles detected by the sensor system 140. The sensor system 140 may include one or more cameras 142. In some implementations, the tow vehicle 100 includes a rear camera 142 that is mounted to provide a view of a rear driving path for the tow vehicle 100. The rear camera 142 may include a fisheye lens, i.e., an ultra-wide-angle lens that produces strong visual distortion intended to create a wide panoramic or hemispherical image. Fisheye cameras capture images having an extremely wide angle of view. Moreover, images captured by the fisheye camera have a characteristic convex non-rectilinear appearance. Other types of cameras may also be used to capture images of the rear of the vehicle 100.
The sensor system 140 may include other sensors such as, but not limited to, an inertial measurement unit (IMU), radar, sonar, LIDAR (Light Detection and Ranging, which can entail optical remote sensing that measures properties of scattered light to find the range and/or other information of a distant target), LADAR (Laser Detection and Ranging), ultrasonic sensors, etc.
The vehicle controller 150 includes a computing device (or processor) 152 (e.g., central processing unit having one or more computing processors) in communication with non-transitory memory 154 (e.g., a hard disk, flash memory, random-access memory, memory hardware) capable of storing instructions executable on the computing processor(s) 152.
The vehicle controller 150 executes a hitch assist system 160 that receives images 143 from the camera 142 and superimposes the vehicle path 182 on the received image 143. In some implementations, the driver may adjust the path 182 selection based on one or more path modes 170. In some examples, the path modes 170 include an arc mode 172 having an angle sub-mode 174 and a distance sub-mode 176. In some examples, the path modes 170 may include a bi-arc mode 178. Therefore, the driver may select between the angle sub-mode 174, the distance sub-mode 176, and/or the bi-arc mode 178 for determining and adjusting the path 182 to a trailer 200 or an object.
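The mode structure described above could be represented as follows; the enum names are illustrative only and do not appear in the disclosure.

```python
from enum import Enum, auto

# Illustrative representation of the selectable planning modes; the names
# below are assumptions, not identifiers from the disclosure.
class PathMode(Enum):
    ARC = auto()     # arc mode 172: a single circular arc
    BI_ARC = auto()  # bi-arc mode 178: two tangent-continuous arcs

class ArcSubMode(Enum):
    ANGLE = auto()     # angle sub-mode 174: adjust the curvature of the arc
    DISTANCE = auto()  # distance sub-mode 176: adjust the length of the arc
```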
In some examples, the angle sub-mode 174 and the distance sub-mode 176 are part of the arc mode 172.
The angle sub-mode 174 is configured to adjust a curvature angle of the path 182, while the distance sub-mode 176 is configured to adjust a distance of the path 182.
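For illustration, the arc path 182 produced by the arc mode 172 can be sampled from the two quantities the driver controls: the signed curvature set in the angle sub-mode 174 and the arc length set in the distance sub-mode 176. The sketch below assumes a vehicle-centered frame with the rearward direction R taken as −Y; the coordinate choice and step size are assumptions, not details from the disclosure.

```python
import math

def sample_arc(curvature: float, length_m: float, step_m: float = 0.1):
    """Sample points along a rearward constant-curvature arc in a
    vehicle-centered frame (rearward direction R assumed to be -Y).

    `curvature` is the signed value set in the angle sub-mode and `length_m`
    the arc length set in the distance sub-mode (both assumptions about units).
    """
    points = []
    theta0 = -math.pi / 2.0                      # initial heading: straight back
    n = max(1, int(round(length_m / step_m)))
    for i in range(n + 1):
        s = i * length_m / n                     # arc length travelled so far
        if abs(curvature) < 1e-9:                # straight-line limit
            x, y = 0.0, -s
        else:
            theta = theta0 + curvature * s       # heading after arc length s
            x = (math.sin(theta) - math.sin(theta0)) / curvature
            y = -(math.cos(theta) - math.cos(theta0)) / curvature
        points.append((x, y))
    return points
```

The sampled points could then be projected into the camera image 143 for the overlay or handed to the path following behaviors.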
In some implementations, where the bi-arc mode 178 is optional, if the driver is satisfied with the path 182 based on the arc mode 172 selection, the driver may finalize the path 182 by pressing the button 136. Otherwise, the driver adjusts the knob 134 a third time to change the shape of a bi-arc or other suitable path 182. This allows the final approach angle to the trailer 200 or other object to be adjusted. Once the driver is satisfied with the choice of approach angle, he/she presses the button 136 to finalize the path choice.
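One standard way to build a two-arc path whose endpoint stays fixed while the final approach angle is varied is an equal-parameter bi-arc construction, sketched below. It is offered only as an example of bi-arc geometry under the stated assumptions (unit tangents, a non-degenerate start/end configuration); the disclosure does not specify how its bi-arc path is computed.

```python
import math

def biarc(p0, t0, p1, t1):
    """Equal-parameter bi-arc from pose (p0, t0) to pose (p1, t1).

    p0/p1 are (x, y) points and t0/t1 unit tangent vectors. Returns the joint
    point and the signed curvatures of the two arcs. Assumes a non-degenerate
    start/end configuration.
    """
    vx, vy = p1[0] - p0[0], p1[1] - p0[1]
    a = 2.0 * (1.0 - (t0[0] * t1[0] + t0[1] * t1[1]))
    b = 2.0 * (vx * (t0[0] + t1[0]) + vy * (t0[1] + t1[1]))
    c = -(vx * vx + vy * vy)
    if abs(a) < 1e-9:                             # parallel start/end tangents
        d = -c / b
    else:                                         # positive root of a*d^2 + b*d + c = 0
        d = (-b + math.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)
    tmx = (vx - d * (t0[0] + t1[0])) / (2.0 * d)  # unit tangent at the joint
    tmy = (vy - d * (t0[1] + t1[1])) / (2.0 * d)
    pm = (p0[0] + d * (t0[0] + tmx), p0[1] + d * (t0[1] + tmy))
    return pm, arc_curvature(p0, t0, pm), arc_curvature(pm, (tmx, tmy), p1)

def arc_curvature(start, tangent, end):
    """Signed curvature of the circular arc leaving `start` along unit
    `tangent` and reaching `end`: kappa = 2 * cross(tangent, chord) / |chord|^2."""
    cx, cy = end[0] - start[0], end[1] - start[1]
    return 2.0 * (tangent[0] * cy - tangent[1] * cx) / (cx * cx + cy * cy)
```

Holding p1 fixed and sweeping the direction of t1 (the final approach angle) regenerates both curvatures while the endpoint of the path 182 stays constant, which mirrors the bi-arc adjustment described above.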
In some implementations, the driver parks the tow vehicle 100 in a location where the trailer 200, or other object or point of interest, is within a field of view of the rear camera 142 of the vehicle 100. The engine of the tow vehicle 100 may be idling, and the transmission in the Park position. The driver may initiate the hitch assist system 160 by pressing the button 136 and/or making a selection on the display 132. In some examples, the display 132 shows a selectable option or button 136 allowing the driver to initiate the arc mode 172. The hitch assist system 160 begins by executing the angle sub-mode 174 of the arc mode 172.
In some implementations, the final approach angle to the trailer 200 or the point of interest is important, for example, for aligning the vehicle fore-aft axis Y with the trailer fore-aft axis Y. In this case, the driver may select or press the “Arc/Bi-Arc Mode” button 136 (displayed on the display 132) and switch to the bi-arc mode 178. In the bi-arc mode 178, the previously set endpoint of the path 182 stays constant, and the driver adjusts the final approach angle with the knob 134. When the driver is satisfied with the final approach angle and with the complete trajectory or path 182, the driver may confirm the selected path 182 by executing an action. In some examples, the driver switches the transmission into reverse, which indicates that the driver is satisfied with the displayed path 182. In some examples, the driver switches the transmission into reverse with the brake on and then releases the brake, and the vehicle 100 follows the selected path 182. In some examples, while the vehicle is autonomously maneuvering in the rearward direction R along the path 182, the driver may stop the tow vehicle 100 by, for example, pressing the brake. This causes the controller 150 to exit the hitch assist system 160.
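The interaction flow described above can be summarized as a small state machine; the state names, event strings, and transition choices below are illustrative assumptions rather than a prescribed implementation.

```python
from enum import Enum, auto

# Illustrative state machine for the driver interaction flow described above;
# the states and event strings are assumptions, not terms from the disclosure.
class AssistState(Enum):
    ADJUST_ANGLE = auto()      # arc mode 172, angle sub-mode 174
    ADJUST_DISTANCE = auto()   # arc mode 172, distance sub-mode 176
    ADJUST_APPROACH = auto()   # bi-arc mode 178 (optional)
    FOLLOW_PATH = auto()       # autonomous reversing along the path 182
    EXITED = auto()

def next_state(state: AssistState, event: str) -> AssistState:
    """Advance the assist flow on a driver event."""
    if state is AssistState.FOLLOW_PATH and event == "brake_pressed":
        return AssistState.EXITED                     # driver stops the vehicle
    if state is AssistState.ADJUST_ANGLE and event == "button":
        return AssistState.ADJUST_DISTANCE
    if state is AssistState.ADJUST_DISTANCE and event == "bi_arc_button":
        return AssistState.ADJUST_APPROACH
    if state in (AssistState.ADJUST_DISTANCE, AssistState.ADJUST_APPROACH) \
            and event == "reverse_engaged":
        return AssistState.FOLLOW_PATH                # path confirmed
    return state
```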
In some implementations, the hitch assist system 160 sets the path distance at a default, which allows the driver to adjust only the steering angle until the path 182 intersects the trailer 200 or other point of interest.
In some implementations, the final approach angle is not adjusted. Instead, the final approach angle is always the same as the initial vehicle departure angle, so the final vehicle fore-aft axis Y is parallel to the initial vehicle fore-aft axis Y. In this case, the driver adjusts the final location of the path 182 to intersect with the trailer 200.
In some examples, while the tow vehicle 100 is maneuvering in the rearward direction R along the path 182, the display 132 may show the progress of the vehicle 100 along the path 182. For example, the display 132 may show the original trajectory projected on the ground, updated as the vehicle's position changes. The display 132 may also show an indication of how well the vehicle is following this trajectory.
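As an example of how such a progress indication could be computed, the sketch below reports the fraction of the path passed and the cross-track distance to the nearest sampled path point; this particular metric is an assumption, not one given in the disclosure.

```python
import math

def path_progress(vehicle_xy, path_points):
    """Return (fraction_complete, cross_track_error_m) for display.

    `path_points` is the sampled planned path in the same frame as
    `vehicle_xy`; the nearest-point metric here is an illustrative choice.
    """
    best_i, best_d = 0, float("inf")
    for i, (px, py) in enumerate(path_points):
        dist = math.hypot(vehicle_xy[0] - px, vehicle_xy[1] - py)
        if dist < best_d:
            best_i, best_d = i, dist
    fraction = best_i / max(1, len(path_points) - 1)
    return fraction, best_d
```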
Referring to the drawings, a trajectory planning system 180 determines the planned path 182 based on the driver adjustments received by way of the user interface 130.
Once the trajectory planning system 180 determines the planned path 182, the vehicle controller 150 executes a drive assist system 190, which in turn includes path following behaviors 192. The path following behaviors 192 receive the planned path 182 and execute one or more behaviors 192a-c that send commands 191 to the drive system 110, causing the vehicle 100 to autonomously drive along the planned path 182 in the rearward direction R.
The path following behaviors 192a-c may include one or more behaviors, such as, but not limited to, a braking behavior 192a, a speed behavior 192b, and a steering behavior 192c. Each behavior 192a-c causes the vehicle 100 to take an action, such as driving backward, turning at a specific angle, braking, speeding up, or slowing down, among others. The vehicle controller 150 may maneuver the vehicle 100 in any direction across the road surface by controlling the drive system 110, more specifically by issuing commands 191 to the drive system 110.
The braking behavior 192a may be executed to either stop the vehicle 100 or slow down the vehicle 100 based on the planned path 182. The braking behavior 192a sends a signal or command 191 to the drive system 110, e.g., the brake system 114, to either stop the vehicle 100 or reduce the speed of the vehicle 100.
The speed behavior 192b may be executed to change the speed of the vehicle 100 by either accelerating or decelerating based on the planned path 182. The speed behavior 192b sends a signal or command 191 to the brake system 114 for decelerating or the acceleration system 116 for accelerating.
The steering behavior 192c may be executed to change the direction of the vehicle 100 based on the planned path 182. As such, the steering behavior 192c sends the drive system 110 a signal or command 191 indicative of an angle of steering, causing the drive system 110 to change direction.
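As a rough illustration of how the three behaviors could be composed into drive commands 191, the sketch below emits a steering curvature, a target speed, and a brake flag from the planned-path state. The command fields, the fixed creep speed, and the stopping margin are illustrative assumptions rather than details from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class DriveCommand:             # a stand-in for command 191
    steering_curvature: float   # steering behavior 192c (1/m)
    target_speed: float         # speed behavior 192b (m/s, rearward)
    brake: bool                 # braking behavior 192a

def follow_step(remaining_distance_m: float, path_curvature: float,
                creep_speed: float = 0.5, stop_margin_m: float = 0.1) -> DriveCommand:
    """Produce one command for the drive system 110 from the planned-path state."""
    if remaining_distance_m <= stop_margin_m:
        return DriveCommand(0.0, 0.0, brake=True)    # stop at the end of the path
    speed = min(creep_speed, remaining_distance_m)   # slow down on approach
    return DriveCommand(path_curvature, speed, brake=False)
```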
In some examples, the command includes at least one of: instructions to adjust a distance of the path, instructions to adjust an angle of the path, and instructions to adjust an angle of an end portion of the path. In some examples, where the point of interest is a trailer 200, adjusting the angle of the end portion of the path causes a fore-aft axis Y of the vehicle 100 to be aligned with a fore-aft axis Y of the trailer 200.
In some examples, before transmitting a drive command, the method 800 includes receiving an action from a driver causing the data processing hardware 152 to transmit the drive command 191. The method 800 may also include, during autonomous maneuvering of the vehicle 100 in the rearward direction R, instructing the user interface 130 to display a position of the vehicle relative to the path 182.
Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, specially designed ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various implementations can include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, at least one input device, and at least one output device.
These computer programs (also known as programs, software, software applications or code) include machine instructions for a programmable processor, and can be implemented in a high-level procedural and/or object-oriented programming language, and/or in assembly/machine language. As used herein, the terms “machine-readable medium” and “computer-readable medium” refer to any computer program product, apparatus and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term “machine-readable signal” refers to any signal used to provide machine instructions and/or data to a programmable processor.
Implementations of the subject matter and the functional operations described in this specification can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them. Moreover, subject matter described in this specification can be implemented as one or more computer program products, i.e., one or more modules of computer program instructions encoded on a computer readable medium for execution by, or to control the operation of, data processing apparatus. The computer readable medium can be a machine-readable storage device, a machine-readable storage substrate, a memory device, a composition of matter effecting a machine-readable propagated signal, or a combination of one or more of them. The terms “data processing apparatus”, “computing device” and “computing processor” encompass all apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, or multiple processors or computers. The apparatus can include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, or a combination of one or more of them. A propagated signal is an artificially generated signal, e.g., a machine-generated electrical, optical, or electromagnetic signal that is generated to encode information for transmission to suitable receiver apparatus.
Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multi-tasking and parallel processing may be advantageous. Moreover, the separation of various system components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.
A number of implementations have been described. Nevertheless, it will be understood that various modifications may be made without departing from the spirit and scope of the disclosure. Accordingly, other implementations are within the scope of the following claims.
This U.S. patent application claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Application 62/668,629, filed on May 8, 2018, which is hereby incorporated by reference in its entirety.
Publication:

Number | Date | Country
---|---|---
20190346858 A1 | Nov 2019 | US

Related U.S. Application Data:

Number | Date | Country
---|---|---
62668629 | May 2018 | US