This disclosure relates to a tractor configured to identify one or more pieces of farm equipment positioned proximate the tractor and drive to one of the one or more pieces of farm equipment for the purpose of attaching the tractor to the equipment.
Trailers are usually unpowered vehicles, tools, or equipment that are pulled by a powered tow vehicle. When referring to farming equipment, the powered tow vehicle is typically referred to as a tractor (rear or front loading) or any other vehicle configured to attach to the trailer and pull the trailer. The trailer may be any type of farm implement, such as a baler, a trailer bed, a plow, a seeder, a harvester, etc. In the instance of front loading tractors, the trailer (i.e., the unpowered vehicle) may actually be located at the front of the tow vehicle (i.e., the vehicle providing power).
The trailer may be attached to a tractor using a trailer hitch. A receiver hitch mounts on the tractor and connects to the trailer hitch to form a connection. When dealing with farm equipment the connection point, i.e. hitch, between the tractor and the trailer may vary depending on the type of implement of the trailer.
The trailer hitch may be a ball and socket, a fifth wheel and gooseneck, trailer jack, eye and ring, pintle hitch, drawbar, or three-point hitch. Other attachment mechanisms may also be used. In addition to the mechanical connection between the trailer and the tractor, in some examples, the trailer is electrically connected to the tractor. As such, the electrical connection allows the trailer to take the feed from the powered tractor's rear light circuit, allowing the trailer to have lights that are in sync with the powered tractor's lights.
One of the challenges facing tractor operators is connecting the tractor to the trailer, because more than one person is typically needed. For example, one person, e.g., the operator, drives the tractor, and another one or more people are needed to view the tractor and the trailer and provide the operator with direction regarding the path the tractor has to take to align with the hitch. If the people providing directions to the operator are not accustomed to hitching a tractor to a trailer, they may have difficulty providing efficient instructions for directing the path of the tractor.
Recent advancements in sensor technology have led to improved safety systems for vehicles. Arrangements and methods for detecting and avoiding collisions are becoming available. Such operator assistance systems use sensors located on the vehicle to detect an impending collision. In some examples, the system may warn the operator of one or more driving situations to prevent or minimize collisions. Additionally, sensors and cameras may also be used to alert an operator of possible obstacles when the vehicle is traveling in a forward direction. Therefore, it is desirable to provide a system that includes sensors to overcome the challenges faced by operators of tractors.
One general aspect includes a method of maneuvering a tractor in reverse for attachment to a trailer using a hitch assist system, the method including: entering a hitch assist mode. The method also includes displaying a camera image on a user interface, where the displayed image shows a view captured by at least one camera. The method also includes receiving, at the user interface in communication with a computing device, an indication of a selected trailer. The method also includes determining, at the computing device, a tractor path from an initial position to a final position adjacent the trailer, the tractor path including maneuvers configured to move the tractor along the tractor path from the initial position to the final position. The method also includes autonomously following, at a drive system in communication with the computing device, the tractor path from the initial position to the final position.
Implementations may include one or more of the following features. The method further including: continuously detecting, at the neural network, one or more objects within the tractor path as the tractor is moving along the tractor path. The method may also include, when detecting an object, altering the tractor path at the computing device. The method where detecting one or more trailers includes: capturing, at one or more imaging devices in communication with the neural network, one or more images, at least one of the one or more imaging devices positioned on a back side of the tractor facing a rearward direction. The method may also include determining, at the neural network, the one or more trailers within the one or more images. The method where displaying on the user interface further includes receiving, at a controller, one or more images from one or more cameras positioned on a back portion of the tractor and in communication with the controller, and overlaying, at the controller, a path representation indicative of an expected path the tractor drives along, the expected path starting at a tractor hitch. The method where selecting the trailer further includes receiving, at the controller, a first command by way of a user interface, the first command indicative of a change in the path representation such that the path representation ends at a point of interest, and adjusting, at the controller, the path representation based on the first command, where the path planning is completed by a controller for the hitch assist system. The method further including: stopping, at the drive system, the tractor at an intermediate position before reaching the final position, the intermediate position being closer to the final position than the initial position. The method may also include modifying, at the drive system, one or more tractor suspensions associated with the tractor to align a tractor hitch with a trailer hitch.
The method may also include autonomously following, at the drive system, the tractor path from the intermediate position to the final position. The method may also include connecting, at the drive system, the tractor hitch with the trailer hitch. The method where connecting a tractor hitch with a trailer hitch includes modifying one or more tractor suspensions associated with the tractor to align a tractor hitch with a trailer hitch.
One aspect of the disclosure provides a method of autonomously driving a vehicle in a rearward direction towards a point of interest. The method includes receiving, at data processing hardware, one or more images from a camera positioned on a back portion of the vehicle and in communication with the data processing hardware. The method also includes receiving, at the data processing hardware, an operator planned path from a user interface in communication with the data processing hardware. The operator planned path includes a plurality of waypoints. The method includes transmitting, from the data processing hardware to a drive system in communication with the data processing hardware, one or more commands causing the vehicle to autonomously maneuver along the operator planned path. The method includes determining, at the data processing hardware, a current vehicle position. In addition, the method includes determining, at the data processing hardware, an estimated subsequent vehicle position based on the operator planned path. The estimated subsequent vehicle position is at a subsequent waypoint along the operator planned path from the current vehicle position. The method also includes determining, at the data processing hardware, a path adjustment from the current vehicle position to the estimated subsequent vehicle position. Additionally, the method includes transmitting, from the data processing hardware to the drive system, instructions causing the vehicle to autonomously maneuver towards the estimated subsequent vehicle position based on the path adjustment.
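The waypoint-following step described above can be sketched as a small geometric computation. The function name, the tuple layout for poses, and the use of a single heading correction are illustrative assumptions, not details taken from the disclosure:

```python
import math

def path_adjustment(current, waypoint):
    """Compute the signed heading correction (radians) needed to steer
    the vehicle from its current pose toward the next waypoint.

    `current` is an assumed (x, y, heading) tuple; `waypoint` is an
    assumed (x, y) tuple.  Both layouts are hypothetical.
    """
    x, y, heading = current
    wx, wy = waypoint
    # Bearing from the current position to the waypoint.
    bearing = math.atan2(wy - y, wx - x)
    # Smallest signed angle between the current heading and that bearing.
    return (bearing - heading + math.pi) % (2 * math.pi) - math.pi

# A vehicle at the origin heading along +X with the waypoint straight
# ahead needs no correction.
correction = path_adjustment((0.0, 0.0, 0.0), (5.0, 0.0))
```

In practice the data processing hardware would repeat this computation as each waypoint is reached, feeding the correction to the drive system as a steering command.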
Another aspect of the disclosure provides a system for autonomously maneuvering a vehicle in a rearward direction towards a point of interest. The system includes: data processing hardware; and memory hardware in communication with the data processing hardware. The memory hardware stores instructions that when executed on the data processing hardware cause the data processing hardware to perform operations that include the methods described above.
Implementations of the aspects of the disclosure may include one or more of the following optional features. In some implementations, the method includes overlaying a path on the one or more images and receiving a command by way of the user interface in communication with the data processing hardware. The command including instructions to adjust the path as the operator planned path. The command may include instructions to adjust a distance of the path. In some examples, the command includes instructions to adjust an angle of the path. The command may include instructions to adjust an angle of an end portion of the path.
In some examples, determining the current vehicle position includes receiving wheel encoder sensor data associated with one or more wheels and receiving steering angle sensor data. The current vehicle position is based on the wheel encoder sensor data and the steering angle sensor data.
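Determining the current vehicle position from wheel-encoder and steering-angle data can be sketched as simple dead reckoning with a bicycle model. The bicycle-model form, the function name, and the default wheelbase are assumptions for illustration only:

```python
import math

def update_position(pose, distance, steering_angle, wheelbase=2.5):
    """Dead-reckon the next vehicle pose from the signed travel
    distance reported by the wheel encoders and the steering-angle
    sensor reading.

    `pose` is an assumed (x, y, heading) tuple in meters and radians;
    the 2.5 m wheelbase is a placeholder value.
    """
    x, y, heading = pose
    # Heading change over an arc of the given length and steering angle.
    heading += distance * math.tan(steering_angle) / wheelbase
    x += distance * math.cos(heading)
    y += distance * math.sin(heading)
    return (x, y, heading)

# Straight-line reverse travel: the heading is unchanged and x decreases.
pose = update_position((0.0, 0.0, 0.0), -1.0, 0.0)
```

Integrating these increments between sensor samples yields the current vehicle position used by the path adjustment.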
The details of one or more implementations of the disclosure are set forth in the accompanying drawings and the description below. Other aspects, features, and advantages will be apparent from the description and drawings, and from the claims.
Like reference symbols in the various drawings indicate like elements.
A tractor, such as, but not limited to, a rear load tractor or a front load tractor, hereinafter referred to as a tractor, may be configured to tow a trailer. The tractor may be other types of vehicles and farm equipment that are powered and configured to pull other farm implements. The trailer may be any type of farm implement, such as a baler, a trailer bed, a plow, a seeder, a harvester, etc. In the instance of front loading tractors, the trailer (i.e., the unpowered vehicle) may actually be located at the front of the tractor (i.e., the vehicle providing power). Hereinafter all such equipment will be referred to as the trailer.
The tractor connects to the trailer by way of a trailer hitch. It is desirable to have a tractor that is capable of autonomously maneuvering towards a trailer and attaching to the trailer, thus eliminating the need for an operator to drive the tractor (e.g., in a rearward direction) while another one or more people provide the operator with directions regarding the path that the tractor has to take to align with the trailer and ultimately a hitch of the trailer. As such, a tractor with an autonomous driving and hitching feature provides an operator with a safer and faster experience when hitching the tractor to the trailer.
Referring to
The tractor 100 may move across the surface by various combinations of movements relative to three mutually perpendicular axes defined by the tractor 100: a transverse axis X, a fore-aft axis Y, and a central vertical axis Z. The transverse axis X extends between a right side R and a left side of the tractor 100. A forward drive direction along the fore-aft axis Y is designated as F, also referred to as a forward motion. In addition, an aft or rearward drive direction along the fore-aft direction Y is designated as R, also referred to as rearward motion. When the suspension system 132 adjusts the suspension of the tractor 100, the tractor 100 may tilt about the X axis and/or Y axis, or move along the central vertical axis Z. The example shown herein is a trailer 200 that is located in a rear position of the tractor 100. However, this is merely exemplary in nature and one skilled in the art would be able to apply this embodiment to a trailer 200 that is located on a fore or generally aft position of the tractor 100 as well. Alternatively, one skilled in the art would be able to apply this embodiment to a tractor 100 that would have a front-loading tractor hitch 160. Further, the tractor 100 and trailer 200 which are shown are using a hitch with a ball style hitch and receiver for the sake of drawing simplicity. Other styles of hitches may be used with the exemplary embodiments.
The tractor 100 may include a user interface 140, such as, a display. The user interface 140 receives one or more user commands from the operator via one or more input mechanisms or a touch screen display 142 and/or displays one or more notifications to the operator. The user interface 140 is in communication with a controller 300, which is in turn in communication with a sensor system 400. In some examples, the user interface 140 displays an image of an environment of the tractor 100 leading to one or more commands being received by the user interface 140 (from the operator) that initiate execution of one or more behaviors. The controller 300 includes a computing device (or processor) 302 (e.g., central processing unit having one or more computing processors) in communication with non-transitory memory 304 (e.g., a hard disk, flash memory, random-access memory) capable of storing instructions executable on the computing processor(s).
The controller 300 executes an operator assistance system 310, which in turn includes a path following sub-system 320. The path following sub-system 320 receives a planned path 552 (
The path following sub-system 320 includes a braking behavior 322, a speed behavior 324, a steering behavior 326, a hitch connect behavior 328, and a suspension adjustment behavior 330. Each of the behaviors 322-330 causes the tractor 100 to take an action, such as driving backward, turning at a specific angle, braking, speeding up, or slowing down, among others. The controller 300 may maneuver the tractor 100 in any direction across the surface by controlling the drive system 110, more specifically by issuing commands 301 to the drive system 110. For example, the controller 300 may maneuver the tractor 100 from an initial position (as shown in
The tractor 100 may include a sensor system 400 to provide reliable and robust autonomous driving. The sensor system 400 may include different types of sensors that may be used separately or with one another to create a perception of the tractor's environment that is used for the tractor 100 to autonomously drive and make intelligent decisions based on objects and obstacles detected by the sensor system 400. The sensors may include, but are not limited to, one or more imaging devices (such as cameras) 410, and sensors 420 such as, but not limited to, radar, sonar, LIDAR (Light Detection and Ranging, which can entail optical remote sensing that measures properties of scattered light to find range and/or other information of a distant target), LADAR (Laser Detection and Ranging), etc. In addition, the camera(s) 410 and the sensor(s) 420 may be used to alert the operator of possible obstacles when the tractor 100 is traveling in the forward direction F or in the rearward direction R, by way of audible alerts and/or visual alerts via the user interface 140. Therefore, the sensor system 400 is especially useful for increasing safety in tractors 100 which operate under semi-autonomous or autonomous conditions.
In some implementations, the tractor 100 includes a rear camera 410, 410a that is mounted to provide a view of a rear driving path for the tractor 100. Additionally, in some examples, the tractor 100 includes a front camera 410, 410b to provide a view of a front driving path for the tractor 100, a right camera 410, 410c positioned on the right side of the tractor 100, and a left camera 410, 410d positioned on the left side of the tractor 100. The left and right cameras 410, 410c, 410d provide additional side views of the tractor 100. In this case, the tractor 100 may detect objects and obstacles positioned on either side of the tractor 100, in addition to the objects and obstacles detected along the front and rear driving paths. The camera(s) 410, 410a-d may be a monocular camera, a binocular camera, or another type of sensing device capable of providing a view of the rear travelling path of the tractor 100.
In some implementations, the tractor 100 includes one or more Neural Networks (NN) 500, for example, Deep Neural Networks (DNN), to improve the autonomous driving of the tractor 100. DNNs 500 are computational approaches used in computer science, among other disciplines, and are based on a large collection of neural units, loosely imitating the way a biological brain solves problems with large clusters of biological neurons connected by axons. DNNs 500 are self-learning and trained, rather than programmed, and excel in areas where the solution or feature detection is difficult to express in a traditional computer program. In other words, DNNs 500 are a set of algorithms that are designed to recognize patterns. DNNs 500 interpret sensor system data 402 (e.g., from the sensor system 400) through a machine perception, labeling or clustering raw input. The recognized patterns are numerical vectors, into which all real-world data, such as images, text, sound, or time series, is translated. The DNN 500 includes multiple layers of nonlinear processing units 502 in communication with DNN non-transitory memory 504. The DNN non-transitory memory 504 stores instructions that when executed on the nonlinear processing units 502 cause the DNN 500 to provide an output 506, 508. Each nonlinear processing unit 502 is configured to transform an input or signal (e.g., sensor system data 402) using parameters that are learned through training. A series of transformations from input (e.g., sensor system data 402) to outputs 506, 508 occurs at the multiple layers of the nonlinear processing units 502. Therefore, the DNN 500 is capable of determining the location based on images 412 or sensor data 422, eliminating the need to have a DGPS or a GPS.
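The layered transformation described above can be illustrated with a toy forward pass. The layer structure, the ReLU nonlinearity, and the placeholder weights below are assumptions chosen for illustration; a real DNN 500 would have learned parameters and many more units:

```python
def relu(vector):
    """Elementwise nonlinearity (ReLU is an assumed choice)."""
    return [max(0.0, x) for x in vector]

def layer(inputs, weights, bias):
    """One layer of nonlinear processing units: a learned affine
    transform followed by the nonlinearity."""
    out = []
    for row, b in zip(weights, bias):
        out.append(sum(w * x for w, x in zip(row, inputs)) + b)
    return relu(out)

def forward(x, layers):
    """Pass an input vector (standing in for sensor system data 402)
    through the stacked layers; the final vector stands in for the
    network output (e.g., identified trailer locations)."""
    for weights, bias in layers:
        x = layer(x, weights, bias)
    return x

# Toy two-layer network; these weights are placeholders, not trained.
net = [
    ([[1.0, -1.0], [0.5, 0.5]], [0.0, 0.0]),
    ([[1.0, 1.0]], [0.1]),
]
out = forward([2.0, 1.0], net)
```

The point of the sketch is only the structure: each layer applies learned parameters to its input, and the series of transformations maps raw sensor data to a task-specific output.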
The DNN 500 receives sensor system data 402 (including images 412 and/or sensor data 422) and based on the received data 402 provides an image output 506 to the user interface 140 and/or a data output 508 to the controller 300. In some examples, the DNN 500 receives image(s) 412 of a rear view of the tractor 100 from the camera 410 in communication with the DNN 500. The DNN 500 analyzes the image 412 and identifies one or more trailers 200 in the received image 412. The DNN 500 may also receive sensor data 422 from the sensors 420 in communication with the DNN 500, and analyze the received sensor data 422. Based on the analyzed images 412 (or the analyzed images 412 and the sensor data 422), the DNN 500 identifies the location of each identified trailer 200 relative to the tractor 100, for example by way of a coordinate system. As such, the DNN 500 displays on the user interface 140 the received images 412 displaying trailer representations 146, 146a-c of the identified trailers 200, 200a-c located at a distance behind the tractor 100. As shown in
The operator may select one of the trailer representations 146, 146a-c indicating that the operator wants the tractor 100 to autonomously drive and connect to the trailer 200, 200a-c associated with the selected trailer representation 146, i.e., the operator selection 144. In some examples, the user interface is a touch screen display 142; as such, the operator may select the trailer representation 146 by touching it. In other examples, the user interface 140 is not a touchscreen and the operator may use an input device, such as, but not limited to, a rotary knob or a mouse to select one of the trailer representations 146, 146a-c.
When the operator selects which trailer 200, 200a-c he/she wants the tractor 100 to connect to, a path planning system 550 plans a path 552 (
In some examples, the path planning system 550 is part of the controller 300 as shown in
With continued reference to
Referring back to
Referring back to
The braking behavior 322 may be executed to either stop the tractor 100 or to slow down the tractor based on the planned path 552. The braking behavior 322 sends a signal or command 301 to the drive system 110, e.g., the brake system 120, to either stop the tractor 100 or reduce the speed of the tractor 100.
The speed behavior 324 may be executed to change the speed of the tractor 100 by either accelerating or decelerating based on the planned path 552. The speed behavior 324 sends a signal or command 301 to the brake system 120 for decelerating or the acceleration system 130 for accelerating.
The steering behavior 326 may be executed to change the direction of the tractor 100 based on the planned path. As such, the steering behavior 326 sends the acceleration system 130 a signal or command 301 indicative of an angle of steering causing the drive system 110 to change direction.
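The three behaviors above share a common pattern: each one translates a decision from the planned path into a command 301 issued to the drive system 110. A minimal sketch of that pattern follows; the class, the dictionary command format, and the function names are all illustrative assumptions, not the disclosure's actual interfaces:

```python
class DriveSystem:
    """Stand-in for the drive system 110; it simply records the
    commands 301 it is issued.  This class is hypothetical."""
    def __init__(self):
        self.commands = []

    def issue(self, command):
        self.commands.append(command)

def steering_behavior(drive, angle):
    """Send a steering-angle command, causing a change of direction."""
    drive.issue({"target": "steering", "angle": angle})

def speed_behavior(drive, delta):
    """Accelerate (positive delta) or decelerate (negative delta)."""
    target = "brake" if delta < 0 else "acceleration"
    drive.issue({"target": target, "delta": delta})

def braking_behavior(drive, stop=True):
    """Stop the tractor, or merely slow it, based on the planned path."""
    drive.issue({"target": "brake", "action": "stop" if stop else "slow"})

drive = DriveSystem()
steering_behavior(drive, 12.5)
braking_behavior(drive)
```

A controller executing the path following sub-system would call these behaviors in sequence as the planned path demands.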
Referring to
The controller 300 determines a relative height HR between a top portion of the tractor hitch coupler 162 and a bottom portion of the trailer hitch coupler 212. To connect the tractor 100 and the selected trailer 200a, the trailer hitch coupler 212 releasably receives the tractor hitch coupler 162. Therefore, to connect the tractor hitch coupler 162 to the trailer hitch coupler 212, the relative height HR has to equal zero allowing the tractor hitch coupler 162 to move under and be inserted in the trailer hitch coupler 212. Therefore, when the hitch connect behavior 328 receives the relative height HR that is greater than zero between the tractor hitch coupler 162 and the trailer hitch coupler 212 from the controller 300, the hitch connect behavior 328 sends a command to the suspension adjustment behavior 330 to execute and issue a command 301 to the suspension system 132 to adjust the height of the tractor 100 reducing the relative height HR based on the measurements from the controller 300. When the hitch connect behavior 328 receives the relative height HR that is equal to zero, then the hitch connect behavior 328 issues a command 301 to the drive system 110 to maneuver along the remainder of the path 552, i.e., from the intermediate position PM to the final position PF (
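The relative-height logic described above can be sketched as a small control loop: while HR is greater than zero, the suspension is lowered; once HR reaches zero, the tractor may proceed to the final position. The function name, the fixed adjustment step, and the units are assumptions for illustration:

```python
def align_hitch(tractor_hitch_top, trailer_hitch_bottom, step=0.01):
    """Lower the tractor suspension until the relative height HR
    between the top of the tractor hitch coupler and the bottom of
    the trailer hitch coupler reaches zero.

    Heights are in assumed meters; the 0.01 m step is a placeholder
    for the suspension system's actual adjustment increment.
    """
    height = tractor_hitch_top
    while height - trailer_hitch_bottom > 0:          # HR > 0: keep lowering
        height = max(trailer_hitch_bottom, height - step)
    hr = height - trailer_hitch_bottom
    # HR == 0: the tractor coupler can now slide under the trailer coupler.
    return hr, height

hr, final_height = align_hitch(0.55, 0.50)
```

Only when the returned HR is zero would the hitch connect behavior issue the command to drive from the intermediate position to the final position.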
Referring to
The tractor 100 may include a user interface 400. The user interface 400 may include the display 142, a knob 143, and a button 145, which are used as input mechanisms. In some examples, the display 142 may show the knob 143 and the button 145, while in other examples, the knob 143 and the button 145 are a knob-button combination. In some examples, the user interface 400 receives one or more operator commands from the operator via one or more input mechanisms or a touch screen display 142 and/or displays one or more notifications to the operator. The user interface 400 is in communication with a tractor controller 300, which is in turn in communication with a sensor system 400. In some examples, the display 142 displays an image of an environment of the tractor 100 leading to one or more commands being received by the user interface 400 (from the operator) that initiate execution of one or more behaviors. In some examples, the display 142 displays a representation image of the rearward environment of the tractor 100. In this case, the operator can select a position within the representation of the image that is indicative of the environment that the operator wants the vehicle to autonomously maneuver towards. In some examples, the display 142 displays one or more representations of trailers 200 positioned behind the tractor 100. In this case, the operator selects which representation of a trailer 200 the tractor 100 should autonomously maneuver towards.
The display 142 displays an expected path 554 of the tractor 100 that is superimposed on a camera image 412 of the rearward environment of the tractor 100. The operator may change the expected path 554 using the user interface 400. For example, the operator may turn the knob 143, which simulates a virtual steering wheel. As the operator is turning the knob 143, the expected path shown on the display 142 updates. The operator adjusts the displayed path 554 until an expected path displayed on the display 142 intersects the trailer representation or other object that the user wants the tractor 100 to drive towards. Once the user is satisfied with the expected path 554 displayed, then the operator executes an action indicative of finalizing the path 554 which allows the tractor 100 to autonomously follow the planned path 554. The planned path 554 is the path from the tractor hitch coupler 162 to the trailer hitch receiver/coupler 212. The wheel path 552 is the estimated path the wheels will follow as the tractor 100 moves along the planned path 554.
In some implementations, the user may adjust three path modes 312, including an angle mode 314 for adjusting a curvature angle of the path as shown in
The tractor controller 300 includes a computing device (or processor) 302 (e.g., central processing unit having one or more computing processors) in communication with non-transitory memory 304 (e.g., a hard disk, flash memory, random-access memory) capable of storing instructions executable on the computing processor(s) 302.
The tractor controller 300 executes an operator assistance system 310 that receives a planned path 552 from a path system 550 and executes behaviors 330, 330a-330c that send commands 301 to the drive system 110, leading to the tractor 100 autonomously driving along the planned path 554 in a rearward direction R (in the example shown).
The path following behaviors 330 include a braking behavior 330a, a speed behavior 330b, and a steering behavior 330c. In some examples, the path following behaviors 330 also include a hitch connect behavior, and a suspension adjustment behavior. Each behavior 330, 330a-330c causes the tractor 100 to take an action, such as driving backward, turning at a specific angle, braking, speeding up, or slowing down, among others. The tractor controller 300 may maneuver the tractor 100 in any direction across the surface by controlling the drive system 110, more specifically by issuing commands 301 to the drive system 110.
The tractor 100 may include a sensor system 400 to provide reliable and robust driving. The sensor system 400 may include different types of sensors that may be used separately or with one another to create a perception of the environment of the tractor 100 that is used for the tractor 100 to drive and aid the operator in making intelligent decisions based on objects and obstacles detected by the sensor system 400. The sensor system 400 may include the one or more cameras 410. In some implementations, the tractor 100 includes a rear camera 410, 410a that is mounted to provide a view of a rear driving path for the tractor 100. The rear camera 410 may include a fisheye lens, i.e., an ultra wide-angle lens that produces strong visual distortion intended to create a wide panoramic or hemispherical image. Fisheye cameras capture images having an extremely wide angle of view. Moreover, images captured by the fisheye camera have a characteristic convex non-rectilinear appearance. Other types of cameras may also be used to capture images of the rear of the tractor 100.
The sensor system 400 may also include the IMU (inertial measurement unit) 420 configured to measure the tractor's linear acceleration (using one or more accelerometers) and rotational rate (using one or more gyroscopes). In some examples, the IMU 420 also determines a heading reference of the tractor 100. Therefore, the IMU 420 determines the pitch, roll, and yaw of the tractor 100.
The sensor system 400 may include other sensors such as, but not limited to, radar, sonar, LIDAR (Light Detection and Ranging, which can entail optical remote sensing that measures properties of scattered light to find range and/or other information of a distant target), LADAR (Laser Detection and Ranging), etc.
The tractor controller 300 executes a hitch assist system 310 that receives images 412 from the camera 410 and superimposes a tractor path 552 on the received image 412.
Referring back to
In some implementations, the hitch assist system 160 includes a trajectory generator 560, a motion estimator 570, and a path tracker 580. The trajectory generator 560 determines an estimated position Pe of the tractor 100 based on the operator selected path 552. The motion estimator 570 determines an actual position Pa of the tractor 100, and the path tracker 580 determines an error 562 based on the estimated position Pe and the actual position Pa and adjusts the planned path 552 of the tractor 100 to eliminate the error 562 between the actual position Pa and the estimated position Pe.
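One path-tracker iteration can be sketched as computing the error 562 between the estimated and actual positions and producing a proportional correction. The proportional-gain form, the tuple layout, and the gain value are illustrative assumptions rather than the disclosure's control law:

```python
def track_path(estimated, actual, gain=0.5):
    """One path-tracker step: measure the error between the estimated
    position Pe (from the trajectory generator) and the actual
    position Pa (from the motion estimator), and return a correction
    that nudges the path to eliminate that error.

    Positions are assumed (x, y) tuples; the 0.5 gain is a placeholder.
    """
    error = tuple(pe - pa for pe, pa in zip(estimated, actual))
    correction = tuple(gain * e for e in error)
    return error, correction

error, correction = track_path((4.0, 2.0), (3.0, 2.5))
```

Repeating this step as the tractor moves drives the actual position toward the estimated position along the planned path.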
In some implementations, the trajectory generator 560 receives images 412 from the camera 410 and superimposes the vehicle path 552 on the received image 412. The operator may adjust the path 552 selection based on one or more path modes 312. In some examples, the path modes 312 include an arc mode 312 having an angle sub-mode 314 and a distance sub-mode 316. In some examples, the path modes 312 may include a bi-arc mode 318. Therefore, the operator may select between the angle sub-mode 314, the distance sub-mode 316, and/or the bi-arc mode 318 for determining and adjusting the path 552 to a trailer 200 or an object.
Referring to
The angle sub-mode 314 is configured to adjust a curvature angle of the path 552 as shown in
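The sub-mode behavior can be sketched as applying each knob turn to one parameter of the current path settings. The dictionary layout, the function name, and the mode strings are illustrative assumptions:

```python
def adjust_path(path, sub_mode, knob_delta):
    """Apply a knob turn under the arc sub-modes: the angle sub-mode
    changes the curvature angle of the path, and the distance
    sub-mode changes the path length.

    `path` is an assumed {"angle": ..., "distance": ...} dict; the
    string keys and sub-mode names are hypothetical.
    """
    path = dict(path)  # leave the caller's settings untouched
    if sub_mode == "angle":
        path["angle"] += knob_delta
    elif sub_mode == "distance":
        path["distance"] += knob_delta
    else:
        raise ValueError("unknown sub-mode: " + sub_mode)
    return path

settings = {"angle": 0.0, "distance": 4.0}
settings = adjust_path(settings, "angle", 0.2)
settings = adjust_path(settings, "distance", 1.0)
```

Pressing the button between adjustments, as described above, would switch which parameter subsequent knob turns modify.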
In some implementations, where the bi-arc mode 318 is optional, if the operator is satisfied with the path 552 based on the arc mode 312 selection, then the operator may finalize the path 552 by pressing the button 145. Otherwise, the operator adjusts the knob 143 for the third time to change the shape of a bi-arc or other suitable path 552. This allows for adjusting the final approach angle to the trailer 200 or other object. Once the operator is satisfied with the choice of approach angle, he/she presses the button 145 to finalize the path choice.
In some implementations, the operator parks the tractor 100 in a location where the trailer 200, or other object or point of interest, is within a field of view of the rear camera 410 of the tractor 100. The engine of the tractor 100 may be idling, and the transmission in Park position. The operator may initiate the trajectory generator 560 by pressing the button 145 and/or making a selection on the display 142. In some examples, the display 142 shows a selectable option or button 145 allowing the operator to initiate the Arc mode 312. The trajectory generator 560 begins by executing the angle sub-mode 314 of the arc mode 312, as shown in
In some implementations, the final approach angle to the trailer 200 or the point of interest is important, for example, for aligning the vehicle fore-aft axis Y with the trailer fore-aft axis Y. In this case, the operator may select or press the “Arc/Bi-Arc Mode” button 145 (displayed on the display 142) and switch to the bi-arc mode 318. In the bi-arc mode 318 the previously set endpoint of the path 552 stays constant, and the operator adjusts the final approach angle with the knob 143. When the operator is satisfied with the final approach angle and with the complete trajectory or path 552, the operator may confirm the selected path 552 by executing an action. In some examples, the operator switches the transmission to reverse which is indicative that the operator is satisfied with the displayed path 552. In some examples, the operator switches the transmission into reverse with the brake on, then releases the brake, and the tractor 100 follows the selected path 552. In some examples, while the vehicle is autonomously maneuvering in the rearward direction R along the path 552, the operator may stop the tractor 100 by, for example, pressing the brake. This causes the vehicle controller 150 to exit the hitch assist system 160.
In some implementations, the trajectory generator 560 sets the path distance at a default, which allows the operator to only adjust the steering angle until it intersects the trailer 200 or other point of interest.
In some implementations, the final approach angle is not adjusted. Instead, the final approach angle is always the same as the initial vehicle departure angle, so the final vehicle fore-aft axis Y is parallel to the initial vehicle fore-aft axis Y. In this case, the operator adjusts the final location of the path 552 to intersect with the trailer 200.
In some examples, while the tractor 100 is maneuvering in the rearward direction R along the path 552, the display 142 may show the progress of the tractor 100 along the path 552. For example, the display 142 may show the original trajectory projected on the ground, updated by the vehicle's changing position. The display 142 may also show an indication of how well the vehicle is following this trajectory.
In some implementations, the trajectory generator 560 receives data from other vehicle systems to generate the path 552. In some examples, the trajectory generator 560 receives vehicle pose data defined by (x, y, θ), where x is the position of a center of the tractor 100 along the transverse axis X in an X-Y plane, y is the position of the center of the tractor 100 along the fore-aft axis Y in the X-Y plane, and θ is the heading of the tractor 100. In addition, the trajectory generator 560 may receive a position of the knob 143, e.g., a knob angle, from the knob 143. The trajectory generator 560 may also receive a mode button state (i.e., arc mode 312 or bi-arc mode 318) and a sub-mode button state (i.e., angle sub-mode 314 or distance sub-mode 316). Based on the received data, the trajectory generator 560 adjusts the path 552 and instructs the display 142 to display the path 552. In some examples, the path 552 includes outer boundaries 554 and a tow ball path 166, which is the estimated path of the tow ball 122. The trajectory generator 560 may also instruct the display 142 to show the current mode or sub-mode status indicative of the mode/sub-mode the operator has selected to adjust the path 552.
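The path adjustment described above can be sketched in code. The following is a minimal illustration, not the patented implementation: a hypothetical arc_path helper maps a knob-derived curvature and a path distance to sampled points of a constant-curvature arc starting at the vehicle pose (x, y, θ), similar in spirit to the path 552 that the trajectory generator 560 displays.

```python
import math

def arc_path(x, y, theta, curvature, length, n=20):
    """Sample points along a constant-curvature arc starting at pose (x, y, theta).

    curvature and length stand in for the knob-adjusted steering angle and the
    path distance; a straight segment is produced when curvature is near zero.
    """
    pts = []
    for i in range(n + 1):
        s = length * i / n  # arc length traveled so far
        if abs(curvature) < 1e-9:
            # Degenerate case: zero curvature gives a straight line.
            px = x + s * math.cos(theta)
            py = y + s * math.sin(theta)
        else:
            # Closed-form integration of a unicycle along a circular arc.
            px = x + (math.sin(theta + curvature * s) - math.sin(theta)) / curvature
            py = y - (math.cos(theta + curvature * s) - math.cos(theta)) / curvature
        pts.append((px, py))
    return pts
```

In a bi-arc mode such as 318, two such arcs with different curvatures could be joined so the endpoint and the final approach angle can be set independently.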
Referring back to
Referring back to
In some implementations, the motion estimator 570 uses an Extended Kalman Filter (EKF). The EKF equations are also provided below as equations (3)-(7). The state vector has nine elements shown in equation 1:
μ = [x, y, θ, v, ω, dlr, drr, dlf, drf]^T.  (1)
The first three are the “pose” of the vehicle, (x, y, θ). The next two are the linear and angular speeds, (v, ω). The final four are the distances traveled by the four tires, (dlr, drr, dlf, drf), left-rear, right-rear, left-front, right-front, respectively.
The complete measurement vector has five elements shown in equation 2:
z = [dlr, drr, dlf, drf, φ]^T.  (2)
The first four are, again, the distances traveled by the four tires, (dlr, drr, dlf, drf). The final element φ is the average front wheel angle (not the steering wheel angle but the average angle of the front tires with respect to the longitudinal axis).
The motion estimator 570 provides a vehicle speed estimate. It is common to compute an approximate speed by dividing distance change by time change (Δd/Δt); however, this would be very noisy for the situation here, where there are relatively few wheel encoder counts and the vehicle is moving relatively slowly. Thus, to avoid a direct calculation involving dividing distance change by time change (Δd/Δt), the motion estimator 570 estimates the vehicle linear speed v by using the EKF, based on the measured wheel accumulated distances, and there is no explicit rate calculation involving division.
The Extended Kalman Filter may be written as two Prediction equations and three Measurement equations. The Prediction equations are:
μ̄t = g(ut, μt−1),  (3)
Σ̄t = Gt Σt−1 Gt^T + Rt.  (4)
Equation (3) provides an update to the state μ. Equation (4) provides an update to the covariance Σ. The covariance provides an estimate of the current uncertainty of the states. The matrix R is the noise covariance for the state μ.
The measurement update equations are:
Kt = Σ̄t Ht^T (Ht Σ̄t Ht^T + Qt)^−1,  (5)
μt = μ̄t + Kt (zt − h(μ̄t)),  (6)
Σt = (I − Kt Ht) Σ̄t.  (7)
Equation (5) sets the value of the optimal Kalman gain K. Equation (6) provides an update to the state μ. Equation (7) provides an update to the covariance Σ. The matrix Q is the noise covariance for the measurement z.
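Equations (3)-(7) can be expressed compactly as a generic predict/update pair. The sketch below assumes NumPy; the function names and the callback-style interface (user-supplied g, G, h, and H) are illustrative, not from the disclosure:

```python
import numpy as np

def ekf_predict(mu, Sigma, u, g, G_fn, R):
    """Prediction step: equations (3) and (4)."""
    mu_bar = g(u, mu)                      # eq. (3): nonlinear state propagation
    G = G_fn(u, mu)                        # Jacobian of g at the current state
    Sigma_bar = G @ Sigma @ G.T + R        # eq. (4): covariance propagation
    return mu_bar, Sigma_bar

def ekf_update(mu_bar, Sigma_bar, z, h, H_fn, Q):
    """Measurement update: equations (5)-(7)."""
    H = H_fn(mu_bar)                       # Jacobian of h at the predicted state
    K = Sigma_bar @ H.T @ np.linalg.inv(H @ Sigma_bar @ H.T + Q)  # eq. (5)
    mu = mu_bar + K @ (z - h(mu_bar))                             # eq. (6)
    Sigma = (np.eye(len(mu_bar)) - K @ H) @ Sigma_bar             # eq. (7)
    return mu, Sigma
```

For the motion estimator 570, μ would be the nine-element state of equation (1) and z the five-element measurement of equation (2); the speed v then emerges from the filter without any explicit Δd/Δt division.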
For prediction the nonlinear vector function g needs to be defined. The matrix G is the derivative of this vector function. For convenience, it will be provided as well. The vector function g is given by:
Here, w is the “track width” of the vehicle. More specifically, it is the lateral distance from the center of a left tire to the center of a right tire. It is assumed that the front and rear distances between tires are the same. The wheelbase is denoted by l. Note that in the last two elements, there is a minus sign where expressions are subtracted from dlf and drf. This minus sign assumes backward motion. Thus, this prediction equation could not be used for forward motion. However, if there were some measurement of the direction of the vehicle (forward or backward), then it would be a simple matter to change the sign of the last two elements (positive for forward, negative for backward) to make the equations valid for both forward and backward directions.
The matrix G has nine rows and nine columns and may be written as G = I + ΔG, where I is the 9×9 identity matrix. The non-zero elements of ΔG are the partial derivatives of the elements of g with respect to the state elements; all other elements are zero.
The complete update assumes measurements of the wheel ticks and the wheel angle are all available at the same time. If only the wheel ticks are available, then they may be incorporated separately, and if only the wheel angle is available, it may be incorporated separately. For the complete update, the vector h is defined. The matrix H is the derivative of this vector function. For convenience, it will be provided as well. The vector function h is given by:
The H matrix is the derivative of h and is 5×9. The non-zero elements are

H11 = H22 = H33 = H44 = 1,

H54 = −lω/(v² + l²ω²),  H55 = lv/(v² + l²ω²),  v² + l²ω² ≠ 0.
Note that certain quantities, h5, H54, H55, involve divisors which could easily be zero. Thus, implementation requires testing that these divisors are not zero before performing the divisions.
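The zero-divisor guard can be illustrated as follows. The wheel-angle model φ = atan(lω/v) is an assumption chosen so that its partial derivatives reproduce the stated H54 and H55 entries, and the function names are hypothetical:

```python
import math

def measurement_h(mu, l):
    """Predicted measurement z = [d_lr, d_rr, d_lf, d_rf, phi].

    State ordering follows equation (1): [x, y, theta, v, omega, dlr, drr, dlf, drf].
    phi = atan(l*omega / v) is an assumed model consistent with H54 and H55.
    """
    x, y, theta, v, omega, dlr, drr, dlf, drf = mu
    denom = v * v + l * l * omega * omega
    if denom == 0.0:
        phi = 0.0  # guard, mirroring the zero-divisor test the text requires
    else:
        phi = math.atan2(l * omega, v)
    return [dlr, drr, dlf, drf, phi]

def wheel_angle_jacobian(v, omega, l):
    """Non-zero H entries for the wheel-angle row: (H54, H55)."""
    denom = v * v + l * l * omega * omega
    if denom == 0.0:
        return 0.0, 0.0  # skip this row's update rather than divide by zero
    return -l * omega / denom, l * v / denom
```

Differentiating atan(lω/v) with respect to v and ω yields −lω/(v² + l²ω²) and lv/(v² + l²ω²), matching the H54 and H55 entries above.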
A measurement consisting of wheel ticks alone can be accommodated, as can a measurement consisting of steering angle alone. However, these variations are not included because they are straightforward, given the information that has been provided.
During the vehicle's autonomous maneuvering in the rearward direction R along the planned path 552, the trajectory generator 560 determines where the tractor 100 should be based on the planned path 552, i.e., an estimated position Pe, while the motion estimator 570 determines an actual position Pa of the tractor 100. The path tracker 580 then determines an error 562 based on the estimated position Pe and the actual position Pa, and adjusts the vehicle's current position Pa based on the error 562 such that the tractor 100 continues to follow the planned path 552.
Referring to
In some examples, the controller includes an object detection system (not shown) that identifies one or more objects along the planned path 552. In this case, the hitch assist system 160 adjusts the path 552 to avoid the detected one or more objects. In some examples, the hitch assist system 160 determines a probability of collision and if the probability of collision exceeds a predetermined threshold, the hitch assist system 160 adjusts the path 552 and sends it to the operator assistance system 320.
Once the operator has indicated that the selected path 552 is entered, the vehicle controller 150 executes an operator assistance system 320, which in turn includes path following behaviors 330. The path following behaviors 330 receive the selected path 552 and execute one or more behaviors 330a-c that send commands 194 to the drive system 110, causing the tractor 100 to autonomously drive along the planned path in the rearward direction R.
The path following behaviors 330a-c may include one or more behaviors, such as, but not limited to, a braking behavior 330a, a speed behavior 330b, and a steering behavior 330c. Each behavior 330a-c causes the tractor 100 to take an action, such as driving backward, turning at a specific angle, braking, speeding up, or slowing down, among others. The vehicle controller 150 may maneuver the tractor 100 in any direction across the road surface by controlling the drive system 110, more specifically by issuing commands 194 to the drive system 110.
The braking behavior 330a may be executed to either stop the tractor 100 or to slow down the tractor 100 based on the planned path. The braking behavior 330a sends a signal or command 194 to the drive system 110, e.g., the brake system (not shown), to either stop the tractor 100 or reduce the speed of the tractor 100.
The speed behavior 330b may be executed to change the speed of the tractor 100 by either accelerating or decelerating based on the planned path 552. The speed behavior 330b sends a signal or command 194 to the brake system 120 for decelerating or the acceleration system 1320 for accelerating.
The steering behavior 330c may be executed to change the direction of the tractor 100 based on the planned path 552. As such, the steering behavior 330c sends the acceleration system 140 a signal or command 194 indicative of an angle of steering causing the drive system 110 to change direction.
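A minimal dispatch sketch of the behaviors 330a-c might look as follows; the thresholds, command strings, and steering convention are illustrative assumptions:

```python
def path_following_commands(error_cross, speed, target_speed, near_end):
    """Choose which behaviors (braking 330a, speed 330b, steering 330c) to run
    and the command 194 each would send to the drive system.
    """
    commands = []
    if near_end:
        commands.append(("braking", "stop"))        # braking behavior 330a
    elif speed < target_speed:
        commands.append(("speed", "accelerate"))    # speed behavior 330b
    elif speed > target_speed:
        commands.append(("speed", "decelerate"))
    if abs(error_cross) > 0.05:
        # steering behavior 330c: steer back toward the planned path
        commands.append(("steering", "left" if error_cross > 0 else "right"))
    return commands
```

In practice each behavior would emit a numeric command (brake pressure, throttle, steering angle) rather than a label, but the selection logic is the same.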
In some examples, the method 900, 1000 includes overlaying a path on the one or more images 143 and receiving a command by way of the user interface 130 in communication with the data processing hardware 152. The command includes instructions to adjust the path as the operator planned path 162. In some examples, the command includes instructions to adjust a distance of the path. The command may include instructions to adjust an angle of the path and/or instructions to adjust an angle of an end portion of the path.
In some implementations, determining the current vehicle position includes receiving wheel encoder sensor data 145 associated with one or more wheels 112 and receiving steering angle sensor data 145. The current vehicle position Pa is based on the wheel encoder sensor data 145 and the steering angle sensor data 145.
As previously discussed, the proposed algorithm is designed to work in real time on a standard CPU with or without the use of GPUs, graphics accelerators, training, or FPGAs. In addition, the proposed approach provides an automated method that only needs initial input from the operator. In addition, the described system provides a compromise between providing guidelines for the operator and automating all rearward functions.
Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, specially designed ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various implementations can include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, at least one input device, and at least one output device.
These computer programs (also known as programs, software, software applications or code) include machine instructions for a programmable processor, and can be implemented in a high-level procedural and/or object-oriented programming language, and/or in assembly/machine language. As used herein, the terms “machine-readable medium” and “computer-readable medium” refer to any computer program product, apparatus and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term “machine-readable signal” refers to any signal used to provide machine instructions and/or data to a programmable processor.
Implementations of the subject matter and the functional operations described in this specification can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them. Moreover, subject matter described in this specification can be implemented as one or more computer program products, i.e., one or more modules of computer program instructions encoded on a computer readable medium for execution by, or to control the operation of, data processing apparatus. The computer readable medium can be a machine-readable storage device, a machine-readable storage substrate, a memory device, a composition of matter effecting a machine-readable propagated signal, or a combination of one or more of them. The terms “data processing apparatus”, “computing device” and “computing processor” encompass all apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, or multiple processors or computers. The apparatus can include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, or a combination of one or more of them. A propagated signal is an artificially generated signal, e.g., a machine-generated electrical, optical, or electromagnetic signal that is generated to encode information for transmission to suitable receiver apparatus.
Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multi-tasking and parallel processing may be advantageous. Moreover, the separation of various system components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.
A number of implementations have been described. Nevertheless, it will be understood that various modifications may be made without departing from the spirit and scope of the disclosure. Accordingly, other implementations are within the scope of the following claims.
This U.S. patent application claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Application 62/733,458, filed on Sep. 19, 2018, which is hereby incorporated by reference in its entirety.