The present disclosure relates generally to an agricultural vehicle. More specifically, the present disclosure relates to a steering control system for agricultural vehicles.
One embodiment relates to an agricultural vehicle. The agricultural vehicle includes a chassis, a plurality of tractive elements, a steering input device configured to steer the agricultural vehicle to perform a turn, and a steering control system configured to operate the steering input device. The chassis includes a first chassis portion and a second chassis portion pivotably coupled to the first chassis portion. Each of the tractive elements is coupled to the first chassis portion or the second chassis portion. The steering control system includes processing circuitry configured to obtain steering condition data corresponding to steering conditions of the steering input device, obtain partial curvature data corresponding to curvatures of the first chassis portion, determine, based on the partial curvature data, curvature data corresponding to curvatures of the agricultural vehicle, generate, based on the steering condition data and the curvature data, a primary curvature model that determines steering condition data given commanded curvature data, and operate the steering input device using (1) the primary curvature model and (2) a given command curvature.
Another embodiment relates to a steering control system configured to operate a steering input device of an agricultural vehicle to perform a turn. The steering control system includes processing circuitry configured to obtain steering condition data corresponding to steering conditions of the steering input device, obtain partial curvature data corresponding to curvatures of a first chassis portion of the agricultural vehicle, determine, based on the partial curvature data, curvature data corresponding to combined curvatures of the first chassis portion and a second chassis portion pivotably coupled to the first chassis portion, generate, based on the steering condition data and the curvature data, a primary curvature model that determines steering condition data given commanded curvature data, and operate the steering input device using (1) the primary curvature model and (2) a given command curvature.
Still another embodiment relates to a method for controlling a steering input device of an agricultural vehicle. The method includes obtaining steering condition data corresponding to steering conditions of the steering input device, obtaining partial curvature data corresponding to curvatures of a first chassis portion of the agricultural vehicle, determining, based on the partial curvature data, curvature data corresponding to combined curvatures of the first chassis portion and a second chassis portion pivotably coupled to the first chassis portion, generating, based on the steering condition data and the curvature data, a primary curvature model that determines steering condition data given commanded curvature data, and operating the steering input device using (1) the primary curvature model and (2) a given command curvature.
This summary is illustrative only and is not intended to be in any way limiting. Other aspects, inventive features, and advantages of the devices or processes described herein will become apparent in the detailed description set forth herein, taken in conjunction with the accompanying figures, wherein like reference numerals refer to like elements.
The disclosure will become more fully understood from the following detailed description, taken in conjunction with the accompanying figures, wherein like reference numerals refer to like elements, in which:
Before turning to the figures, which illustrate certain exemplary embodiments in detail, it should be understood that the present disclosure is not limited to the details or methodology set forth in the description or illustrated in the figures. It should also be understood that the terminology used herein is for the purpose of description only and should not be regarded as limiting.
As shown in
According to an exemplary embodiment, the vehicle 10 is an off-road machine or vehicle. In some embodiments, the off-road machine or vehicle is an agricultural machine or vehicle such as a tractor, a telehandler, a front loader, a combine harvester, a grape harvester, a forage harvester, a sprayer vehicle, a speedrower, and/or another type of agricultural machine or vehicle. In some embodiments, the off-road machine or vehicle is a construction machine or vehicle such as a skid steer loader, an excavator, a backhoe loader, a wheel loader, a bulldozer, a telehandler, a motor grader, and/or another type of construction machine or vehicle. In some embodiments, the vehicle 10 includes an implement system which may include one or more attached implements and/or trailed implements such as a front mounted mower, a rear mounted mower, a trailed mower, a tedder, a rake, a baler, a plough, a cultivator, a rotavator, a tiller, a harvester, and/or another type of attached implement or trailed implement. The implements of the implement system may couple to the front or rear of the vehicle 10 through various means, including, but not limited to, hydraulic hoses, electrical wires, PTO connection, three-point hitch, ball hitch, front forks, etc.
According to an exemplary embodiment, the cab 30 is configured to provide seating for an operator (e.g., a driver, etc.) of the vehicle 10. In some embodiments, the cab 30 is configured to provide seating for one or more passengers of the vehicle 10. According to an exemplary embodiment, the operator interface 40 is configured to provide an operator with the ability to control one or more functions of and/or provide commands to the vehicle 10 and the components thereof (e.g., turn on, turn off, drive, turn, brake, engage various operating modes, raise/lower an implement, etc.). The operator interface 40 may include one or more displays and one or more input devices. The one or more displays may be or include a touchscreen, an LCD display, an LED display, a speedometer, gauges, warning lights, etc. The one or more input devices may be or include a steering wheel, a joystick, buttons, switches, knobs, levers, an accelerator pedal, an accelerator lever, a plurality of brake pedals, etc.
According to an exemplary embodiment, the driveline 50 is configured to propel the vehicle 10. As shown in
As shown in
As shown in
As shown in
In some embodiments, the driveline 50 includes a plurality of prime movers 52. By way of example, the driveline 50 may include a first prime mover 52 that drives the front tractive assembly 70 and a second prime mover 52 that drives the rear tractive assembly 80. By way of another example, the driveline 50 may include a first prime mover 52 that drives a first one of the front tractive elements 78, a second prime mover 52 that drives a second one of the front tractive elements 78, a third prime mover 52 that drives a first one of the rear tractive elements 88, and/or a fourth prime mover 52 that drives a second one of the rear tractive elements 88. By way of still another example, the driveline 50 may include a first prime mover that drives the front tractive assembly 70, a second prime mover 52 that drives a first one of the rear tractive elements 88, and a third prime mover 52 that drives a second one of the rear tractive elements 88. By way of yet another example, the driveline 50 may include a first prime mover that drives the rear tractive assembly 80, a second prime mover 52 that drives a first one of the front tractive elements 78, and a third prime mover 52 that drives a second one of the front tractive elements 78. In such embodiments, the driveline 50 may not include the transmission 56 or the transfer case 58.
As shown in
According to an exemplary embodiment, the braking system 92 includes one or more brakes (e.g., disc brakes, drum brakes, in-board brakes, axle brakes, etc.) positioned to facilitate selectively braking (i) one or more components of the driveline 50 and/or (ii) one or more components of a trailed implement. In some embodiments, the one or more brakes include (i) one or more front brakes positioned to facilitate braking one or more components of the front tractive assembly 70 and (ii) one or more rear brakes positioned to facilitate braking one or more components of the rear tractive assembly 80. In some embodiments, the one or more brakes include only the one or more front brakes. In some embodiments, the one or more brakes include only the one or more rear brakes. In some embodiments, the one or more front brakes include two front brakes, one positioned to facilitate braking each of the front tractive elements 78. In some embodiments, the one or more front brakes include at least one front brake positioned to facilitate braking the front axle 76. In some embodiments, the one or more rear brakes include two rear brakes, one positioned to facilitate braking each of the rear tractive elements 88. In some embodiments, the one or more rear brakes include at least one rear brake positioned to facilitate braking the rear axle 86. Accordingly, the braking system 92 may include one or more brakes to facilitate braking the front axle 76, the front tractive elements 78, the rear axle 86, and/or the rear tractive elements 88. In some embodiments, the one or more brakes additionally include one or more trailer brakes of a trailed implement attached to the vehicle 10. The trailer brakes are positioned to facilitate selectively braking one or more axles and/or one or more tractive elements (e.g., wheels, etc.) of the trailed implement.
As shown in
As shown in
As shown in
As shown in
Referring to
As shown in
In some embodiments, a relationship between adjusting the steering condition 306 of the steering input device 302 and adjusting the orientation of the one or more of the front tractive elements 78 or the rear tractive elements 88 to steer the vehicle 10 may be non-linear (e.g., the adjustment of the one or more of the front tractive elements 78 or the rear tractive elements 88 is not proportional to the adjustment of the steering condition 306 of the steering input device 302, etc.). For example, as the steering condition 306 of the steering input device 302 is adjusted further away from a center steering position (e.g., a position of the steering input device 302 that results in the vehicle 10 driving straight, etc.), the front tractive elements 78 may turn at a decreasing rate.
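By way of illustration, a non-linear relationship of the kind described above, in which the tractive elements turn at a decreasing rate as the steering input device moves away from the center steering position, may be sketched as follows. This is a hypothetical saturating map; the tanh shape, sensitivity, and maximum wheel angle are illustrative assumptions, not the actual relationship of the vehicle 10.

```python
import math

def wheel_angle_from_steering(steering_angle, max_wheel_angle=0.6, sensitivity=1.5):
    """Illustrative non-linear steering map: the wheel angle (rad) grows at a
    decreasing rate as the steering input (rad from center) moves away from
    the center position, saturating at max_wheel_angle."""
    return max_wheel_angle * math.tanh(sensitivity * steering_angle)

# Near center, a small steering adjustment changes the wheel angle more
# than the same adjustment made far from center.
near_center_gain = wheel_angle_from_steering(0.1) - wheel_angle_from_steering(0.0)
far_from_center_gain = wheel_angle_from_steering(1.1) - wheel_angle_from_steering(1.0)
```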
In some embodiments, the steering condition 306 of the steering input device 302 may be adjusted outside of an operational range that corresponds with a maximum orientation of the one or more of the front tractive elements 78 or the rear tractive elements 88 to steer the vehicle 10, such that the steering condition 306 of the steering input device 302 may be adjusted without adjusting the orientation of the one or more of the front tractive elements 78 or the rear tractive elements 88. For example, the steering condition 306 of the steering input device 302 could continue to be adjusted away from the center steering position in a direction after the front tractive elements 78 have reached a maximum orientation and the orientation of the front tractive elements 78 can no longer be adjusted any further in the direction. For example, referring to
The steering system 300 is operable by a controller 402 of the steering control system 400, according to some embodiments. In some embodiments, the controller 402 is configured to receive a steering input 416 from a remote system 412 (e.g., a remote operating system, etc.) or an operator and provide the steering control to the steering control device 304. The steering input 416 indicates at least one of a desired degree, a desired radius, or a desired rate of turn, or may indicate a commanded curvature of the vehicle 10. In some embodiments, the steering input 416 corresponds to a turn that the vehicle should perform. In some embodiments, the controller 402 may receive sensor inputs from a sensor that corresponds with the steering input device 302. In some embodiments, the sensor inputs may be encoder values (e.g., encoder position, encoder feedback, encoder signals, etc.) from an encoder 414 that is configured to detect a position, rate of change, etc., of the steering input device 302. In some embodiments, the encoder 414 is a sensor that is provided as a component of the steering control device 304. The controller 402 is configured to use the steering input 416 to determine and output the steering control to the steering control device 304. In some embodiments, referring to
In some embodiments, the encoder 414 may continue to generate encoder values corresponding to the steering condition 306 of the steering input device 302 when the steering condition 306 is adjusted outside of the operational band 308. For example, referring to
Referring still to
It should be understood that any of the functionality, model generation techniques, regression techniques, autonomous controls, model training techniques, controls, etc., of the controller 402 as described herein with reference to
Referring to
The process 500 includes receiving a steering input (e.g., a first input, a commanded input, a turn request, etc.) indicating a curvature to be performed by a vehicle (step 502), according to some embodiments. In some embodiments, step 502 is performed by the controller 402 by receiving the steering input 416 from an operator, the remote system 412, or another control system. In some embodiments, the curvature is a turn radius for the vehicle. In some embodiments, the curvature is a path that includes multiple turn radii for the agricultural vehicle. For example, the curvature may be a 180 degree turn to change the direction of the agricultural vehicle.
The process 500 includes operating the steering input device using a primary curvature model of the agricultural vehicle and the steering input (step 504), according to some embodiments. For example, an electric motor coupled to a steering wheel may receive a steering control to rotate the steering wheel to a steering condition that would result in the agricultural vehicle performing the turn corresponding to the curvature of the steering input. The electric motor may then rotate the steering wheel to the steering condition that results in the agricultural vehicle performing the turn corresponding to the curvature of the steering input. In some embodiments, step 504 is performed by the controller 402 and includes operating the steering input device 302 using the steering control device 304 based on the steering control.
In some embodiments, the operation of the steering input device at step 504 includes feedback from encoder values of the encoder. For example, an electric motor coupled to a steering wheel may receive a steering control associated with a prescribed encoder value of the encoder that would result in the agricultural vehicle performing the turn corresponding to the curvature of the steering input. The electric motor may then rotate the steering wheel until the encoder provides an encoder value that corresponds to the prescribed encoder value. In some embodiments, the operation of the steering input device may be a closed-loop control system that includes feedback from the encoder.
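A closed-loop operation of this kind may be sketched as follows. The read_encoder and command_motor callables, the proportional gain, and the tolerance are hypothetical placeholders for the actual hardware interfaces and tuning; this is a sketch of the feedback idea, not the controller's actual implementation.

```python
def drive_to_encoder_value(read_encoder, command_motor, target,
                           gain=0.5, tolerance=1.0, max_steps=100):
    """Closed-loop sketch: command the steering motor proportionally to the
    encoder error until the prescribed encoder value is reached (or the
    step budget runs out). Returns True on success."""
    for _ in range(max_steps):
        error = target - read_encoder()
        if abs(error) <= tolerance:
            return True  # encoder feedback matches the prescribed value
        command_motor(gain * error)
    return False

# Usage with a toy plant whose position integrates the motor effort:
state = {"pos": 0.0}
reached = drive_to_encoder_value(lambda: state["pos"],
                                 lambda effort: state.__setitem__("pos", state["pos"] + effort),
                                 target=100.0)
```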
In some embodiments, step 504 is performed by the controller 402 and includes modeling the steering input 416 with the primary curvature model 420 to generate the steering condition 306 of the steering input device 302 that results in the vehicle 10 performing a turn that corresponds to the curvature of the steering input 416. For example, the controller 402 may model the steering input 416 including a turn with the primary curvature model to determine an angle of a steering wheel that will result in the vehicle 10 completing the turn. The controller 402 may then provide a steering control to the steering control device 304 that will adjust the steering wheel so that the steering wheel is at the angle that will result in the vehicle 10 completing the turn. In some embodiments, the primary curvature model includes a non-linear relationship between the steering input and adjustment of the steering condition. For example, tractive elements of the agricultural vehicle may turn at a decreasing rate relative to the adjustment of the steering condition 306 of the steering input device 302 as the steering condition 306 of the steering input device 302 is adjusted away from a center position.
In some embodiments, the steering control generated at step 504 corresponds to encoder values of an encoder configured to detect a position, rate of change, etc., of the steering input device. For example, the steering control may correspond to adjusting the steering input device until the encoder detects a control position of the steering input device. In some embodiments, the encoder may be a retrofit component on the steering input device. In some embodiments, the encoder values correspond to the encoder 414 configured to detect a position, rate of change, etc., of the steering input device 302.
Referring to
Referring to
The process 600 includes obtaining steering condition data corresponding to the steering conditions of the steering input device (step 602), according to some embodiments. In some embodiments, the steering condition data relates to the steering condition of the steering input device of the agricultural vehicle. For example, the steering condition data may include angles of a steering wheel of the steering system of the agricultural vehicle, positions of the steering input device, etc. In some embodiments, step 602 is performed by the controller 402 by obtaining the steering condition data over a time period (e.g., a learning time period). In some embodiments, the steering condition data includes the encoder values of the encoder 414 of the steering system 300. For example, the steering condition data may include encoder values of the encoder 414 that correspond with the steering condition 306 of the steering input device 302.
In some embodiments, step 602 may occur during normal operation of the agricultural vehicle (e.g., while the agricultural vehicle is performing a function, operation of the agricultural vehicle outside of a controlled environment, etc.). During normal operation of the agricultural vehicle, an operator may manually adjust a steering condition of a steering input device to adjust an orientation of one or more pairs of tractive elements, resulting in a turn of the agricultural vehicle. For example, the steering condition data of the steering input device may be obtained while an operator is manually controlling the steering input device to operate the agricultural vehicle to perform a function such as plowing a field, baling hay, etc.
In some embodiments, the steering condition data is batched to only include extreme steering conditions (e.g., maximum steering conditions, minimum steering conditions, etc.) and the remaining steering conditions are eliminated from the steering condition data. In some embodiments, the steering condition data is batched by the controller 402. For example, the controller 402 may receive steering conditions from the encoder 414. Due to memory size restrictions associated with the vehicle 10, the controller 402 may batch the extreme steering conditions into the steering condition data and eliminate the remaining steering conditions. The controller 402 may then continue to batch additional extreme steering conditions into the steering condition data over the time period without exceeding the memory size restrictions.
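Retaining only the extreme steering conditions under a fixed memory budget may be sketched as follows. The class name and the min/max-only retention policy are illustrative assumptions; the controller's actual batching scheme is not specified here.

```python
class ExtremaBatcher:
    """Keep only the running minimum and maximum steering conditions,
    discarding intermediate samples so memory use stays constant."""

    def __init__(self):
        self.minimum = None
        self.maximum = None

    def add(self, steering_condition):
        # Update the running extremes; non-extreme samples are dropped.
        if self.minimum is None or steering_condition < self.minimum:
            self.minimum = steering_condition
        if self.maximum is None or steering_condition > self.maximum:
            self.maximum = steering_condition

# Usage: stream five samples, retain only the two extremes.
batcher = ExtremaBatcher()
for sample in [0.1, -0.4, 0.9, 0.3, -0.2]:
    batcher.add(sample)
```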
The process 600 includes obtaining curvature data corresponding to the curvatures of the vehicle (step 604), according to some embodiments. In some embodiments, the curvature data relates to actual curvatures of the vehicle (e.g., paths, turns of the vehicle, etc.). In some embodiments, step 604 is performed by the controller 402 by receiving an actual curvature taken by the vehicle 10 that results from the steering condition of the steering input device 302 over the time period. For example, the controller 402 may receive a GNSS curvature (e.g., a series of GNSS coordinates forming a curvature, etc.) corresponding to the actual curvature of the vehicle 10 from a GPS sensor of the vehicle 10. In other embodiments, step 604 is performed by the controller 402 by receiving sensor data associated with the vehicle 10 and determining the curvature data corresponding to the curvatures of the vehicle 10 based on the sensor data. For example, the controller 402 may receive a GNSS curvature from a GPS sensor positioned on the front frame portion 14 of the vehicle 10. The GNSS curvature may correspond to the front frame curvature CF of the front frame portion 14 of the vehicle 10 and the controller 402 may determine the curvature data corresponding to the curvatures of the vehicle 10 based on the front frame curvature CF. As another example, the controller 402 may receive sensor data associated with the driveline 50 and may determine the curvature data corresponding to the curvatures of the vehicle 10 based on the sensor data. The controller 402 may receive data corresponding to a forward travel velocity ν of the vehicle 10 and a heading rate θ̇ associated with a rate of change of a heading (e.g., direction, etc.) of the vehicle 10 and may determine the curvature data corresponding to the front frame curvature CF based on the equation: CF = θ̇/ν.
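The curvature computation described above, with the front frame curvature obtained as the heading rate divided by the forward travel velocity, may be sketched as follows. The function name and units (rad/s and m/s, yielding curvature in 1/m) are illustrative assumptions.

```python
def front_frame_curvature(velocity, heading_rate):
    """Estimate the front frame curvature CF (1/m) as the heading rate
    (rad/s) divided by the forward travel velocity (m/s)."""
    if abs(velocity) < 1e-6:
        raise ValueError("velocity too small to estimate curvature")
    return heading_rate / velocity

# A vehicle traveling at 5 m/s while its heading changes at 0.5 rad/s
# follows an arc of curvature 0.1 1/m, i.e., a 10 m turn radius.
curvature = front_frame_curvature(5.0, 0.5)
```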
In some embodiments, similar to step 602, step 604 may occur during normal operation of the vehicle. In some embodiments, the curvature data is batched to only include extreme curvatures (e.g., maximum curvatures, minimum curvatures, etc.) and the remaining curvatures are eliminated from the curvature data, similar to the steering condition data obtained in step 602.
The process 600 includes using the steering condition data and the curvature data to obtain a primary curvature model that predicts steering condition data given commanded curvature data (step 606), according to some embodiments. In some embodiments, step 606 includes performing a regression (e.g., a first regression, etc.) based on the steering condition data of the steering input device and the curvature data of the vehicle to generate a primary curvature model of the vehicle. In some embodiments, step 606 includes feeding the steering condition data of the steering input device and the curvature data of the vehicle into a primary curvature perceptron to generate the primary curvature model as the primary curvature perceptron using machine learning techniques. For example, linear regression techniques may be used to create and train the primary curvature perceptron that predicts the steering condition data given the curvature data.
In some embodiments, step 606 includes comparing the primary curvature model to the curvature data to generate estimation errors. In some embodiments, the controller 402 may compare the curvature data of the vehicle 10 with the primary curvature model generated by the regression to generate estimation errors between the curvature data and the primary curvature model. In some embodiments, the regression includes adjusting the primary curvature model to reduce the estimation errors. In some embodiments, the primary curvature perceptron may train using the estimation errors to improve the primary curvature model.
In some embodiments, the primary curvature model is a primary polynomial function that predicts the steering condition data of the steering input device associated with the commanded curvature data. In some embodiments, referring to
In some embodiments, the primary polynomial function may relate the curvatures of the agricultural vehicle to an encoder value associated with a steering input device. For example, the primary polynomial function may receive commanded curvature data and determine the encoder values of the encoder that are required such that the agricultural vehicle turns according to the commanded curvature data. In some embodiments, the primary polynomial function may be generated by the controller 402 to relate the curvature of the vehicle 10 to the encoder value of the encoder 414 that corresponds to the steering input device 302. For example, the primary polynomial function may receive commanded curvature data for the vehicle 10 and determine steering condition data of the steering input device 302 that is required such that the vehicle 10 turns according to the commanded curvature data.
In some embodiments, an initial form of the primary polynomial function is generated by setting weights corresponding to variables of the primary polynomial function to random values, by setting the weights corresponding to variables of the primary polynomial function to small random values, or using other techniques of creating polynomial functions. For example, the primary polynomial function may have the form Ax3+Bx2+Cx+D=y and the values of A, B, C, and D may be set equal to random values between −1 and 1 to form the initial form of the polynomial function, which provides a starting point for the regression of the polynomial function. In some embodiments, the regression adjusts the primary polynomial function by modifying the constant and the weights corresponding to the variables of the polynomial function. For example, the constant and the weights may be adjusted to reduce the estimation errors by following a method of gradient descent.
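One way the regression described above might be realized is a plain gradient descent over the weights of the cubic Ax³+Bx²+Cx+D, starting from small random values. The learning rate, iteration count, and seed below are illustrative assumptions; this is a sketch of the technique, not the controller's actual regression.

```python
import random

def fit_cubic(xs, ys, lr=0.05, iterations=20000, seed=0):
    """Fit y = A*x**3 + B*x**2 + C*x + D by gradient descent on the mean
    squared estimation error, starting from random weights in [-1, 1]."""
    rng = random.Random(seed)
    a, b, c, d = (rng.uniform(-1.0, 1.0) for _ in range(4))
    n = len(xs)
    for _ in range(iterations):
        ga = gb = gc = gd = 0.0
        for x, y in zip(xs, ys):
            err = (a * x**3 + b * x**2 + c * x + d) - y  # estimation error
            ga += err * x**3
            gb += err * x**2
            gc += err * x
            gd += err
        # Step each weight against its gradient to reduce the error.
        a -= lr * 2.0 * ga / n
        b -= lr * 2.0 * gb / n
        c -= lr * 2.0 * gc / n
        d -= lr * 2.0 * gd / n
    return a, b, c, d
```

In practice the (x, y) pairs would be the batched curvature data and the corresponding steering condition data collected over the learning time period.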
The process 600 includes determining that the primary curvature model has converged (step 608), according to some embodiments. In some embodiments, the primary curvature model is considered to have converged after the regression has run for a specified amount of time, after the estimation error is below a predetermined threshold, after the primary curvature model has stabilized, or through other methods of determining convergence for regressions. In some embodiments, the predetermined threshold corresponding to the estimation error of the primary curvature model may be set by an operator, be determined by operating conditions, or be selected using a different method. In some embodiments, the predetermined threshold may include a value of the estimation errors, a trend in the estimation errors, the estimation errors holding below a value for a set number of iterations of the primary curvature model, or other methods of determining that the regression has converged based on estimation errors.
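A simple convergence test of the kind described, in which the estimation error must hold below a predetermined threshold for a set number of consecutive iterations, may be sketched as follows. The function name and parameters are illustrative.

```python
def has_converged(estimation_errors, threshold, hold_iterations):
    """Declare convergence once the estimation error has stayed below the
    predetermined threshold for hold_iterations consecutive iterations."""
    if len(estimation_errors) < hold_iterations:
        return False  # not enough history yet
    return all(abs(e) < threshold for e in estimation_errors[-hold_iterations:])
```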
The process 600 includes operating a steering control system using the primary curvature model for a given commanded curvature (step 610), according to some embodiments. For example, a steering control system may receive curvature data that includes a commanded curvature for an agricultural vehicle to perform. The steering control system may predict steering condition data for the agricultural vehicle to follow the commanded curvature and operate a steering input device such that the agricultural vehicle performs a turn corresponding to the commanded curvature. In some embodiments, step 610 may be implemented by performing the process 500.
In some embodiments, the steering control system may be partially operated using the primary curvature model for the given command curvature and may partially rely on another means to operate the steering control system (e.g., an operator input, sensor data from the agricultural vehicle, etc.). In some embodiments, step 610 may include converting the commanded curvature into steering control data using the primary curvature model. In some embodiments, step 610 may include converting steering control data into encoder values and operating a steering control device to achieve the encoder values. For example, the controller 402 may convert the steering curvature data into an encoder value of the encoder 414 and then operate the steering control device 304 until the encoder generates the encoder value such that the vehicle 10 performs a turn associated with the commanded curvature. In some embodiments, the controller 402 may convert the steering curvature data into time-series data of positions of the encoder 414 and then operate the steering control device 304 so that the encoder 414 generates the time-series data of the positions of the encoder 414 such that the vehicle 10 performs a turn associated with the commanded curvature.
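Converting a commanded curvature into a prescribed encoder value by evaluating a fitted polynomial model may be sketched as follows. The highest-order-first coefficient ordering and the Horner-style evaluation are illustrative assumptions about how the model might be stored and applied.

```python
def curvature_to_encoder_value(commanded_curvature, coefficients):
    """Evaluate the fitted polynomial model (coefficients highest-order
    first) at the commanded curvature, via Horner's method, to obtain the
    prescribed encoder value for the steering control device."""
    value = 0.0
    for coefficient in coefficients:
        value = value * commanded_curvature + coefficient
    return value

# Usage: a hypothetical cubic model x**3 + 3*x + 4 evaluated at curvature 2.0.
target_encoder_value = curvature_to_encoder_value(2.0, [1.0, 0.0, 3.0, 4.0])
```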
In some embodiments, step 610 may include activating the primary curvature model so that the primary curvature model may be used to operate the steering control system. In some embodiments, the primary curvature model may not be used by the vehicle to operate the steering control device of the agricultural vehicle according to the steering inputs until the primary curvature model has been activated. For example, the agricultural vehicle may not be operated by the steering control device receiving steering controls to adjust the steering condition of the steering input device until the primary curvature model has been activated. In some embodiments, step 610 is performed by the controller 402 by activating the primary curvature model for the steering system 300 such that the controller 402 may receive a commanded curvature and may autonomously turn the vehicle 10 to follow the commanded curvature. For example, the controller 402 may not utilize the primary curvature model to generate a steering condition of the steering input device 302 and an associated steering control for the steering control device 304 that results in the vehicle 10 performing a turn based on a commanded curvature until the primary curvature model has been activated.
Referring to
Referring to
The process 700 includes operating a steering system of a vehicle for a given commanded curvature using a primary curvature model (step 702), according to some embodiments. In some embodiments, step 702 includes operating the steering system of the vehicle following process 500 using the primary curvature model obtained through process 600.
The process 700 includes obtaining steering condition data of the steering input device and curvature data of curvatures of the vehicle (step 704), according to some embodiments. In some embodiments, the steering condition data relates to the steering condition of the steering input device of the agricultural vehicle, similar to the steering condition data obtained during step 602 of the process 600. In some embodiments, at least a portion of the steering condition data is generated while the steering system of the vehicle is operated for the given commanded curvature using the primary curvature model. For example, at least a portion of the steering condition data of the steering input device may be obtained while the agricultural vehicle is being autonomously controlled using the primary curvature model. In some embodiments, the curvature data relates to the actual curvature of the vehicle, similar to the curvature data obtained during step 604 of the process 600. In some embodiments, step 704 is performed by the controller 402 by receiving an actual curvature taken by the vehicle 10 that results from the steering condition of the steering input device 302 over the time period. For example, the controller 402 may receive a GNSS curvature corresponding to the actual curvature of the vehicle 10 from a GPS sensor of the vehicle 10. In other embodiments, step 704 is performed by the controller 402 by receiving sensor data associated with the vehicle 10 and determining the curvature data corresponding to the curvatures of the vehicle 10 based on the sensor data. For example, the controller 402 may receive a GNSS curvature from a GPS sensor positioned on the front frame portion 14 of the vehicle 10. The GNSS curvature may correspond to the front frame curvature CF of the front frame portion 14 of the vehicle 10. The controller 402 may determine the curvature data corresponding to the curvatures of the vehicle 10 based on the front frame curvature CF.
As another example, the controller 402 may receive sensor data associated with the driveline 50 and may determine the curvature data corresponding to the curvatures of the vehicle 10 based on the sensor data. In some embodiments, similar to step 702, at least a portion of the curvature data is generated while the steering system of the vehicle is operated for the given commanded curvature using the primary curvature model.
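One way a curvature may be derived from GNSS positional data, as described above, is the circumscribed-circle (Menger) curvature of three consecutive positions. The following sketch is illustrative only; the local x/y coordinates, the function name, and the use of three-point curvature are assumptions, not part of this disclosure.

```python
import math

# Hypothetical sketch: estimate a signed path curvature (1/m) from three
# consecutive GNSS positions expressed in local x/y coordinates (meters),
# using the circumscribed-circle (Menger) curvature.
def menger_curvature(p1, p2, p3):
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Twice the signed area of the triangle formed by the three positions.
    area2 = (x2 - x1) * (y3 - y1) - (y2 - y1) * (x3 - x1)
    d12 = math.dist(p1, p2)
    d23 = math.dist(p2, p3)
    d13 = math.dist(p1, p3)
    if d12 * d23 * d13 == 0.0:
        return 0.0  # degenerate (coincident points)
    # Menger curvature: 2 * signed area / product of side lengths.
    return 2.0 * area2 / (d12 * d23 * d13)
```

Three points lying on a circle of radius R yield a curvature magnitude of 1/R, while collinear points yield zero curvature (straight travel).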
The process 700 includes using the steering condition data and the curvature data to obtain a calibration curvature model that predicts steering condition data given commanded curvature data (step 706), according to some embodiments. In some embodiments, step 706 includes performing a regression (e.g., a second regression, etc.) based on the steering condition data of the steering input device and the curvature data of the vehicle to generate a calibration curvature model of the vehicle that determines values of a steering condition of the steering input device given a command curvature. In some embodiments, step 706 includes feeding the steering condition data of the steering input device and the curvature data of the vehicle into a calibration curvature perceptron to generate the calibration curvature model as the calibration curvature perceptron using machine learning techniques, similar to step 606 of the process 600. In some embodiments, an initial iteration of the calibration curvature model may be set as equivalent to the primary curvature model generated by process 600 after the primary curvature model has been activated. For example, an initial iteration of the calibration curvature model may be the same as (e.g., equivalent to, etc.) the primary curvature model determined to be converged in the process 600 such that the calibration curvature model may continue to improve upon the primary curvature model. In some embodiments, the regression may occur while the steering system is operating the steering input device of the agricultural vehicle to the steering condition according to the steering control generated by modeling the input with the primary curvature model obtained by process 600.
In some embodiments, the calibration curvature model is a calibration polynomial function that determines values of the steering condition of the steering input device associated with a curvature, similar to the primary curvature model generated by the process 600. In some embodiments, referring to
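The regression of step 706 can be sketched as a cubic polynomial fit of observed steering conditions against observed curvatures. This is a minimal sketch under assumptions: the sample data values, the cubic form, and the use of a least-squares fit are illustrative, not taken from this disclosure.

```python
import numpy as np

# Hypothetical observations: curvatures (1/m) and the steering conditions
# (e.g., steering wheel angles in degrees) recorded while they occurred.
curvatures = np.array([-0.10, -0.05, 0.0, 0.05, 0.10])
steer_angles = np.array([-32.0, -15.5, 0.4, 16.1, 31.8])

# Fit A*x**3 + B*x**2 + C*x + D by least squares; polyfit returns [A, B, C, D].
coeffs = np.polyfit(curvatures, steer_angles, deg=3)
calibration_model = np.poly1d(coeffs)

# Predict the steering condition for a commanded curvature of 0.08 1/m.
predicted_angle = float(calibration_model(0.08))
```

The fitted `calibration_model` then serves the same role as the calibration polynomial function described above: given a commanded curvature, it returns a steering condition.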
The process 700 includes determining that the steering condition is within a center range (step 708), according to some embodiments. In some embodiments, the center range may be an operational band of the steering input device where the agricultural vehicle is considered to be driving straight forward. In some embodiments, the center range may include the operational band of the steering input device where the agricultural vehicle is driving within an angle of driving straight forward (e.g., within 5 degrees of driving straight forward, within 2 degrees of driving straight forward, within 1 degree of driving straight forward, etc.).
The process 700 includes adjusting the primary curvature model based on the calibration curvature model and returning to step 702 if the steering condition is within the center range (step 710), according to some embodiments. In some embodiments, the adjustment of the primary curvature model includes updating a constant of the primary curvature model to a calibration constant of the calibration curvature model if the steering condition is within the center range. For example, if the primary curvature model is the polynomial function with the form A1x^3+B1x^2+C1x+D1=y and the calibration curvature model is the calibration polynomial function with the form A2x^3+B2x^2+C2x+D2=y, the value of D1 may be updated to the value of D2 when the steering condition is within the center range. By adjusting the primary curvature model based on the calibration curvature model when the steering condition is within the center range, the primary curvature model may be updated when the primary curvature model is not being modeled with an input to generate a steering control, as detailed in process 500.
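The constant update of step 710 can be sketched as follows. The center-range width, the coefficient values, and the function name are assumptions used for illustration only.

```python
# Hypothetical sketch of step 710: when the steering condition sits within the
# center range, copy the constant term of the calibration model (D2) into the
# primary model (D1) so the straight-ahead offset stays calibrated.
CENTER_RANGE_DEG = 2.0  # assumed band around straight-ahead, in degrees

def update_primary_model(primary, calibration, steering_angle_deg):
    """primary/calibration are [A, B, C, D] cubic coefficient lists."""
    if abs(steering_angle_deg) <= CENTER_RANGE_DEG:
        primary = primary[:3] + [calibration[3]]  # D1 <- D2
    return primary

primary = [1.0, 0.5, 300.0, 0.8]
calibration = [1.1, 0.4, 305.0, 0.2]
primary = update_primary_model(primary, calibration, steering_angle_deg=0.5)
```

Because only the constant term changes, the update shifts the straight-ahead offset without disturbing the curve-following behavior encoded in the higher-order terms.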
The process 700 includes determining if a deviation between the calibration curvature model and the primary curvature model is above a threshold if the steering condition is not within the center range (step 712), according to some embodiments. In some embodiments, the deviation may be a difference between the calibration curvature model and the primary curvature model that results in a different steering condition of the steering input device when a curvature is inputted into the calibration curvature model and the primary curvature model. In some embodiments, the threshold may be a value related to a maximum difference between the steering condition outputted by the calibration curvature model based on a curvature and the steering condition outputted by the primary curvature model based on the curvature. For example, the threshold may be a maximum difference between a first angle of a steering wheel outputted by the primary curvature model and a second angle of the steering wheel outputted by the calibration curvature model. In some embodiments, the determination that the deviation between the calibration curvature model and the primary curvature model is above the threshold may indicate that the primary curvature model is no longer accurate for the steering system of the vehicle and that the primary curvature model should be updated.
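The deviation check of step 712 can be sketched by evaluating both cubic models at the same curvature and comparing the resulting steering conditions. The threshold value and function names are assumptions for illustration.

```python
# Hypothetical sketch of step 712: evaluate both cubic models at a commanded
# curvature and flag a deviation above an assumed angle threshold.
def eval_cubic(coeffs, x):
    a, b, c, d = coeffs
    return a * x**3 + b * x**2 + c * x + d

DEVIATION_THRESHOLD_DEG = 3.0  # assumed maximum allowed angle difference

def deviation_exceeded(primary, calibration, curvature):
    primary_angle = eval_cubic(primary, curvature)
    calibration_angle = eval_cubic(calibration, curvature)
    return abs(primary_angle - calibration_angle) > DEVIATION_THRESHOLD_DEG
```

When `deviation_exceeded` returns true, the process proceeds to step 714 (alert and/or deactivation); otherwise it returns to step 702.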
The process 700 includes performing at least one of generating an alert or deactivating the primary curvature model for the steering system if the deviation is above the threshold (step 714), according to some embodiments. In some embodiments, the alert may include at least one of an audio or visual alert provided to an operator of the agricultural vehicle to indicate that the deviation between the calibration curvature model and the primary curvature model is above the threshold. For example, the alert may be provided to a display of the agricultural vehicle and may alert the operator that the steering control of the steering control device may no longer be accurate and needs to be updated. In some embodiments, the alert may indicate that the primary curvature model should be updated based on the calibration curvature model. In some embodiments, the deactivation of the primary curvature model may prevent the primary curvature model from being used by the steering system to operate the steering control device of the agricultural vehicle. In some embodiments, if the primary curvature model is being used by the steering system to operate the steering control device of the agricultural vehicle, the deactivation of the primary curvature model may result in stopping the operation of the steering control device. For example, if the primary curvature model is deactivated while being used to autonomously operate the steering input device, the autonomous operation may be stopped. In some embodiments, the deactivation of the primary curvature model may result in stopping the operation of the agricultural vehicle.
In some embodiments, the process 700 includes returning to step 702 if the deviation is not above the threshold. For example, if the deviation between the primary curvature model and the calibration curvature model is below the threshold, then the controller 402 will continue to operate the steering control device 304 for the commanded curvature data using the primary curvature model.
As shown in
The steering control system 400 may be configured to utilize the curvature data corresponding to the combined curvature Cc of the vehicle 10 to operate the vehicle 10. For example, during the process 600, the steering control system 400 may utilize the curvature data corresponding to the combined curvature Cc of the vehicle 10 obtained in step 604 during step 606 to generate estimation errors. As another example, during process 700, the steering control system 400 may utilize the curvature data corresponding to the combined curvature Cc of the vehicle 10 obtained in step 704 during step 706 to obtain the calibration curvature model and/or during step 708 to determine if the vehicle 10 is traveling straight. As a result, the steering control system 400 may use the curvature data corresponding to the combined curvature Cc of the vehicle 10 that is associated with an overall curvature of the vehicle 10 instead of the front frame curvature CF of the front frame portion 14 or the rear frame curvature CR of the rear frame portion 16 (e.g., when the front frame curvature CF and the rear frame curvature CR are not equal, etc.).
According to an exemplary embodiment, the steering control system 400 is configured to determine the combined curvature Cc of the vehicle 10 configured as the articulated chassis vehicle based on the partial curvature data corresponding to the front frame curvature CF of the front frame portion 14. In some embodiments, the partial curvature data relates to actual front frame curvatures CF of the front frame portion 14 (e.g., paths of the front frame portion 14, turns of the front frame portion 14, etc.). In some embodiments, the partial curvature data received by the controller 402 corresponds to the front frame curvatures CF of the front frame portion 14 that result from the steering condition of the steering input device 302. For example, the controller 402 may receive a GNSS curvature corresponding to the actual front frame curvatures CF of the front frame portion 14 from a GPS sensor of the vehicle 10 that is configured to move with the front frame portion 14 (e.g., coupled to the front frame portion 14, coupled to a portion of the body 20 coupled to the front frame portion 14, coupled to the cab 30, etc.). In other embodiments, the controller 402 is configured to determine the partial curvature data corresponding to the front frame curvatures CF of the front frame portion 14 based on sensor data associated with the operation of the vehicle 10. For example, the controller 402 may receive sensor data associated with the driveline 50 and may determine the partial curvature data corresponding to the front frame curvatures CF of the front frame portion 14 of the vehicle 10 based on the sensor data. In other embodiments, the steering control system 400 is configured to determine the combined curvature Cc of the vehicle 10 configured as the articulated chassis vehicle based on the rear frame curvature CR.
For example, the controller 402 may receive a GNSS curvature corresponding to the actual rear frame curvatures CR of the rear frame portion 16 from a GPS sensor of the vehicle 10 that is configured to move with the rear frame portion 16 (e.g., coupled to the rear frame portion 16, coupled to a portion of the body 20 coupled to the rear frame portion 16, etc.).
As shown in
where L1 is a first length of the front frame portion, shown as front length L1, L2 is a second length of the rear frame portion, shown as rear length L2, γ̇ is a rate of change of the articulation angle γ (e.g., a derivative of the articulation angle γ, etc.), and ν is a forward travel velocity ν of the vehicle 10 (e.g., a forward velocity of the vehicle 10, a velocity of the vehicle 10 in a direction of travel of the vehicle 10, etc.). In some embodiments, the controller 402 determines (e.g., estimates, etc.) the rate of change γ̇ based on past values of the articulation angle γ (e.g., historical values of the articulation angle γ, etc.). For example, if the articulation angle γ is equal to 15 degrees at a first time and 20 degrees at a second time, the rate of change γ̇ may be estimated by dividing the difference between the articulation angle γ at the second time and the articulation angle γ at the first time (e.g., 5 degrees, etc.) by the difference between the second time and the first time.
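The finite-difference estimate described above can be sketched directly; the function name and time values are illustrative assumptions.

```python
# Sketch of the rate-of-change estimate: the articulation angle moves from
# 15 degrees at a first time to 20 degrees at a second time one second later,
# so the estimated rate is (20 - 15) / (1 - 0) = 5 degrees per second.
def articulation_rate(angle_prev_deg, angle_curr_deg, t_prev_s, t_curr_s):
    return (angle_curr_deg - angle_prev_deg) / (t_curr_s - t_prev_s)

rate = articulation_rate(15.0, 20.0, 0.0, 1.0)
```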
The articulation angle γ may be determined by applying the condition:
to determine the minimum absolute value of the articulation angle γ that allows for the equation to be solved (e.g., minimizing the absolute value of the articulation angle γ, etc.). In some embodiments, the equation may be solved under the condition that the articulation angle γ is greater than or equal to zero and less than or equal to a maximum allowable value of the articulation angle γ. For example, the maximum allowable value of the articulation angle γ may be a value of the articulation angle γ that causes the rear frame portion 16 to gooseneck relative to the front frame portion 14. As another example, the maximum allowable value of the articulation angle γ may be a value of the articulation angle γ where the rear frame portion 16 contacts the front frame portion 14, preventing the articulation angle γ from increasing any further. In other embodiments, the controller 402 determines the articulation angle γ based on sensor data received by the controller 402. For example, the vehicle 10 may include a sensor configured to generate sensor data corresponding to the articulation angle γ between the front frame portion 14 and the rear frame portion 16. The controller 402 may receive the sensor data and determine the articulation angle γ based on the sensor data.
The controller 402 determines the curvature data corresponding to the combined curvature CC of the vehicle 10 based on the articulation angle γ using the equation below:
to determine the combined curvature CC of the vehicle 10 based on the articulation angle γ, the front length L1 of the front frame portion 14, and the rear length L2 of the rear frame portion 16. In some embodiments, the controller 402 is configured to filter out noise from the curvature data that may be formed from noise in the partial curvature data corresponding to the front frame curvature CF of the front frame portion 14. For example, the controller 402 may apply a low pass filter on the curvature data to filter out the noise from the curvature data.
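The computation and filtering described above can be sketched as follows. The disclosure's exact equation is not reproduced here; the curvature relation below is a commonly used steady-state kinematic approximation for articulated vehicles (articulation rate assumed zero), and the frame lengths and filter constant are assumed values, not taken from this disclosure.

```python
import math

L1 = 1.8  # assumed front length (m), articulation joint to front axle
L2 = 1.6  # assumed rear length (m), articulation joint to rear axle

def combined_curvature(gamma_rad):
    # Steady-state approximation of the combined curvature (1/m) of an
    # articulated vehicle from the articulation angle and frame lengths.
    return math.sin(gamma_rad) / (L1 * math.cos(gamma_rad) + L2)

def low_pass(prev_filtered, new_sample, alpha=0.2):
    # First-order low-pass filter applied to successive curvature samples to
    # suppress noise propagated from the partial curvature data.
    return prev_filtered + alpha * (new_sample - prev_filtered)
```

A zero articulation angle yields zero curvature (straight travel), and repeated application of `low_pass` smooths a noisy curvature sequence at the cost of some lag.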
As shown in
The process 800 includes obtaining partial curvature data corresponding to partial curvatures of at least one of a front portion of a vehicle or a rear portion of a vehicle (step 802), according to some embodiments. In some embodiments, the partial curvature data relates to actual partial curvatures of the vehicle (e.g., a path of a first portion of the vehicle, a turn of a first portion of the vehicle, etc.). In some embodiments, step 802 is performed by the controller 402 by obtaining an actual front frame curvature CF of the front frame portion 14 or an actual rear frame curvature CR of the rear frame portion 16 of the vehicle 10. For example, the controller 402 may receive a GNSS curvature (e.g., positional data, GNSS coordinate data, etc.) corresponding to the actual front frame curvature CF of the front frame portion 14 or the actual rear frame curvature CR of the rear frame portion 16 from a GPS sensor of the vehicle 10 configured to move with the front frame portion 14 or the rear frame portion 16, respectively. In other embodiments, step 802 is performed by the controller 402 by receiving sensor data associated with the vehicle 10 and determining the partial curvature data corresponding to the front frame curvature CF of the front frame portion 14 or the rear frame curvature CR of the rear frame portion 16.
The process 800 includes determining an articulation angle between the front portion and the rear portion (step 804), according to some embodiments. In some embodiments, the articulation angle may be determined based on the partial curvature data obtained during step 802. For example, the articulation angle may be determined using the partial curvature data and static data associated with the front portion and the rear portion of the vehicle (e.g., lengths of the front portion and the rear portion of the vehicle, drivetrain attributes of the front portion and the rear portion of the vehicle, etc.). In some embodiments, step 804 is performed by the controller 402 to determine the articulation angle γ between the front frame portion 14 and the rear frame portion 16 of the vehicle 10 based on the partial curvature data corresponding to the front frame curvature CF of the front frame portion 14 or the rear frame curvature CR of the rear frame portion 16.
The process 800 includes determining curvature data corresponding to curvatures of the vehicle (step 806), according to some embodiments. In some embodiments, the curvature data may be determined based on the articulation angle determined in step 804. In some embodiments, the curvature data corresponds to the curvatures of an entirety of the vehicle (e.g., all of the vehicle, overall curvatures of the vehicle, etc.). For example, the curvature data may correspond to curvatures that are a composite of a first curvature of the first portion of the vehicle and a second curvature of the second portion of the vehicle. In some embodiments, step 806 is performed by the controller 402 to determine curvature data corresponding to the combined curvature CC of the vehicle 10 based on the articulation angle γ between the front frame portion 14 and the rear frame portion 16. As a result, the controller 402 may determine the curvature data that corresponds to an entirety of the vehicle 10, which may be utilized to more accurately operate the vehicle 10. For example, when the vehicle 10 is operated according to the curvature that corresponds to the entirety of the vehicle 10, the vehicle 10 may be operated based on curvatures that the vehicle 10 will follow instead of front curvatures followed by the front frame portion 14 or rear curvatures followed by the rear frame portion 16, which may be different from the curvatures that the vehicle 10 will follow.
In some embodiments, step 806 includes filtering the curvature data to generate filtered curvature data. For example, a low pass filter may be applied to the curvature data to generate the filtered curvature data that includes less noise than the curvature data. The noise in the curvature data may be a result of noise in the partial curvature data. As a result, the filtered curvature data may be utilized to operate the vehicle in a smoother manner. In some embodiments, the controller 402 filters the curvature data to generate the filtered curvature data such that the controller 402 may operate the vehicle 10 based on the filtered curvature data, which may result in smoother performance than when the vehicle 10 is operated based on the curvature data.
As utilized herein with respect to numerical ranges, the terms “approximately,” “about,” “substantially,” and similar terms generally mean+/−10% of the disclosed values, unless specified otherwise. As utilized herein with respect to structural features (e.g., to describe shape, size, orientation, direction, relative position, etc.), the terms “approximately,” “about,” “substantially,” and similar terms are meant to cover minor variations in structure that may result from, for example, the manufacturing or assembly process and are intended to have a broad meaning in harmony with the common and accepted usage by those of ordinary skill in the art to which the subject matter of this disclosure pertains. Accordingly, these terms should be interpreted as indicating that insubstantial or inconsequential modifications or alterations of the subject matter described and claimed are considered to be within the scope of the disclosure as recited in the appended claims.
It should be noted that the term “exemplary” and variations thereof, as used herein to describe various embodiments, are intended to indicate that such embodiments are possible examples, representations, or illustrations of possible embodiments (and such terms are not intended to connote that such embodiments are necessarily extraordinary or superlative examples).
The term “coupled” and variations thereof, as used herein, means the joining of two members directly or indirectly to one another. Such joining may be stationary (e.g., permanent or fixed) or moveable (e.g., removable or releasable). Such joining may be achieved with the two members coupled directly to each other, with the two members coupled to each other using a separate intervening member and any additional intermediate members coupled with one another, or with the two members coupled to each other using an intervening member that is integrally formed as a single unitary body with one of the two members. If “coupled” or variations thereof are modified by an additional term (e.g., directly coupled), the generic definition of “coupled” provided above is modified by the plain language meaning of the additional term (e.g., “directly coupled” means the joining of two members without any separate intervening member), resulting in a narrower definition than the generic definition of “coupled” provided above. Such coupling may be mechanical, electrical, or fluidic.
References herein to the positions of elements (e.g., “top,” “bottom,” “above,” or “below”) are merely used to describe the orientation of various elements in the figures. It should be noted that the orientation of various elements may differ according to other exemplary embodiments, and that such variations are intended to be encompassed by the present disclosure.
The present disclosure contemplates methods, systems, and program products on any machine-readable media for accomplishing various operations. The embodiments of the present disclosure may be implemented using existing computer processors, or by a special purpose computer processor for an appropriate system, incorporated for this or another purpose, or by a hardwired system. Embodiments within the scope of the present disclosure include program products comprising machine-readable media for carrying or having machine-executable instructions or data structures stored thereon. Such machine-readable media can be any available media that can be accessed by a general purpose or special purpose computer or other machine with a processor. By way of example, such machine-readable media can comprise RAM, ROM, EPROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to carry or store desired program code in the form of machine-executable instructions or data structures, and which can be accessed by a general purpose or special purpose computer or other machine with a processor. Combinations of the above are also included within the scope of machine-readable media. Machine-executable instructions include, for example, instructions and data which cause a general-purpose computer, special purpose computer, or special purpose processing machines to perform a certain function or group of functions.
Although the figures and description may illustrate a specific order of method steps, the order of such steps may differ from what is depicted and described, unless specified differently above. Also, two or more steps may be performed concurrently or with partial concurrence, unless specified differently above. Such variation may depend, for example, on the software and hardware systems chosen, and on designer choice. All such variations are within the scope of the disclosure. Likewise, software implementations of the described methods could be accomplished with standard programming techniques with rule-based logic and other logic to accomplish the various connection steps, processing steps, comparison steps, and decision steps.
The terms “client” or “server” include all kinds of apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, a system on a chip, or multiple ones, or combinations, of the foregoing. The apparatus may include special purpose logic circuitry, e.g., a field programmable gate array (FPGA) or an application specific integrated circuit (ASIC). The apparatus may also include, in addition to hardware, a code that creates an execution environment for the computer program in question (e.g., a code that constitutes processor firmware, a protocol stack, a database management system, an operating system, a cross-platform runtime environment, a virtual machine, or a combination of one or more of them). The apparatus and execution environment may realize various different computing model infrastructures, such as web services, distributed computing, and grid computing infrastructures.
The systems and methods of the present disclosure may be completed by any computer program. A computer program (also known as a program, software, software application, script, or code) may be written in any form of programming language, including compiled or interpreted languages, declarative or procedural languages, and it may be deployed in any form, including as a stand-alone program or as a module, component, subroutine, object, or other unit suitable for use in a computing environment. A computer program may, but need not, correspond to a file in a file system. A program may be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub programs, or portions of code). A computer program may be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
The processes and logic flows described in this specification may be performed by one or more programmable processors executing one or more computer programs to perform actions by operating on input data and generating output. The processes and logic flows may also be performed by, and apparatus may also be implemented as, special purpose logic circuitry (e.g., an FPGA or an ASIC).
Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read only memory or a random access memory or both. The essential elements of a computer are a processor for performing actions in accordance with instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data (e.g., magnetic, magneto-optical disks, or optical disks). However, a computer need not have such devices. Moreover, a computer may be embedded in another device (e.g., a vehicle, a Global Positioning System (GPS) receiver, etc.). Devices suitable for storing computer program instructions and data include all forms of non-volatile memory, media, and memory devices, including by way of example semiconductor memory devices (e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto-optical disks; and CD ROM and DVD-ROM disks). The processor and the memory may be supplemented by, or incorporated in, special purpose logic circuitry.
To provide for interaction with a user, implementations of the subject matter described in this specification may be implemented on a computer having a display device (e.g., a CRT (cathode ray tube), LCD (liquid crystal display), OLED (organic light emitting diode), TFT (thin-film transistor), or other flexible configuration, or any other monitor for displaying information to the user). Other kinds of devices may be used to provide for interaction with a user as well; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback).
Implementations of the subject matter described in this disclosure may be implemented in a computing system that includes a back end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front end component (e.g., a client computer) having a graphical user interface or a web browser through which a user may interact with an implementation of the subject matter described in this disclosure, or any combination of one or more such back end, middleware, or front end components. The components of the system may be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a LAN and a WAN, an inter-network (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks).
It is important to note that the construction and arrangement of the vehicle 10 and the systems and components thereof (e.g., the driveline 50, the braking system 92, the control system 96, etc.) as shown in the various exemplary embodiments are illustrative only. Additionally, any element disclosed in one embodiment may be incorporated or utilized with any other embodiment disclosed herein.
This application is a continuation-in-part of U.S. patent application Ser. No. 18/467,544 filed Sep. 14, 2023, which is incorporated herein by reference in its entirety.
 | Number | Date | Country
---|---|---|---
Parent | 18467544 | Sep 2023 | US
Child | 18883713 | | US