The present description relates to controlling agricultural harvesters. More specifically, the present description relates to controlling end dividers on a head of an agricultural harvester.
There are several different types of agricultural harvesters. One type of agricultural harvester is a combine harvester which can have different heads attached to harvest different types of crops.
In one example, a corn head can be attached to the combine harvester in order to harvest corn. A corn head may have row dividers and gathering chains. The row dividers help to divide the rows of corn and the gathering chains pull the corn stalks into a set of snap rolls that separate the ears of the corn plant from the stalks. The ears are then moved by an auger toward the center of the corn head where the ears enter the feeder house of the combine harvester. The ears are then further processed within the combine harvester to remove the kernels of corn from the cobs.
During a harvesting operation, after the ears of corn are separated from the stalk, the ears can bounce around on the head and can bounce off of the head onto the field and be lost. In order to address this type of loss, some corn heads have end dividers on the ends of the corn head. The end dividers can be raised manually to inhibit ear loss over the sides of the corn head. The end dividers can also be lowered manually.
The discussion above is merely provided for general background information and is not intended to be used as an aid in determining the scope of the claimed subject matter.
An agricultural system includes a head configured to be mounted to an agricultural harvester, an end divider, and an actuator configured to actuate the end divider. The agricultural system further includes an actuator controller that identifies a control action, corresponding to the end divider, to take based on an end divider action criterion detected by an input mechanism. The agricultural system also includes a control signal generation system that automatically generates a control signal to control the actuator to actuate the end divider based on the identified control action.
Example 1 is an agricultural system comprising:
Example 2 is the agricultural system of any or all previous examples, wherein the input mechanism comprises:
Example 3 is the agricultural system of any or all previous examples, wherein the input mechanism comprises:
Example 4 is the agricultural system of any or all previous examples, wherein the end divider comprises a plurality of end dividers and wherein the control signal generation system comprises:
Example 5 is the agricultural system of any or all previous examples, wherein the sensor comprises a sensor configured to sense, as the end divider action criterion, vegetation that is wrapped around the end divider and generate, as the criterion signal, a wrapping signal, and
Example 6 is the agricultural system of any or all previous examples, wherein the sensor comprises a sensor configured to detect, as the end divider action criterion, a crop state characteristic of crop proximate the agricultural harvester and generate, as the criterion signal, a crop state signal indicative of the crop state characteristic of crop proximate the agricultural harvester, and
Example 7 is the agricultural system of any or all previous examples, wherein the sensor comprises a sensor configured to detect, as the end divider action criterion, a harvest state of crop proximate the agricultural harvester and generate, as the criterion signal, a harvest state signal indicative of the harvest state of crop proximate the agricultural harvester, and
Example 8 is the agricultural system of any or all previous examples, wherein the sensor comprises a sensor configured to detect, as the end divider action criterion, material flow and generate, as the criterion signal, a material flow signal indicative of the material flow, and
Example 9 is the agricultural system of any or all previous examples, wherein the sensor comprises a sensor configured to detect, as the end divider action criterion, an orientation of ears of corn proximate the agricultural harvester and generate, as the criterion signal, an ear orientation signal indicative of the orientation of ears proximate the agricultural harvester, and
Example 10 is the agricultural system of any or all previous examples, wherein the sensor comprises a sensor configured to detect, as the end divider action criterion, a direction of travel of the agricultural harvester and generate, as the criterion signal, a heading signal indicative of the direction of travel of the agricultural harvester, and
Example 11 is the agricultural system of any or all previous examples and further comprising:
Example 12 is a method of controlling an end divider on a head of an agricultural harvester, the method comprising:
Example 13 is the method of any or all previous examples, wherein detecting the end divider action criterion comprises detecting, as the end divider action criterion, an operator input command on an operator interface mechanism.
Example 14 is the method of any or all previous examples, wherein detecting the end divider action criterion comprises:
Example 15 is the method of any or all previous examples, wherein detecting an end divider action criterion comprises:
Example 16 is the method of any or all previous examples, wherein detecting an end divider action criterion comprises:
Example 17 is the method of any or all previous examples, wherein detecting an end divider action criterion comprises:
Example 18 is the method of any or all previous examples, wherein detecting the end divider action criterion comprises:
Example 19 is the method of any or all previous examples, wherein detecting an end divider action criterion comprises:
Example 20 is an agricultural system comprising:
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter. The claimed subject matter is not limited to implementations that solve any or all disadvantages noted in the background.
For the purposes of promoting an understanding of the principles of the present disclosure, reference will now be made to the examples illustrated in the drawings, and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of the disclosure is intended. Any alterations and further modifications to the described devices, systems, methods, and any further application of the principles of the present disclosure are fully contemplated as would normally occur to one skilled in the art to which the disclosure relates. In particular, it is fully contemplated that the features, components, steps, or a combination thereof described with respect to one example may be combined with the features, components, steps, or a combination thereof described with respect to other examples of the present disclosure.
As shown, head 104 is a rigid head, meaning that head 104 is not foldable. Head 104 has a plurality of row dividers 106 and augers 108 and 110. Row dividers 106 separate the corn rows as agricultural harvester 100 moves through a field. The stalks are guided between row dividers 106 where gathering chains move the stalks into a set of snap rolls that remove the ears from the stalks. The ears are then moved toward a central portion of head 104 by augers 108 and 110, where the ears enter a feeder house, which feeds the ears into the combine harvester 102 for further processing.
As discussed above, after the ears are separated from the stalks, the ears can bounce around on head 104 and bounce over the end 112 of head 104 in the direction indicated by arrow 116. The ears can also bounce over end 114 of head 104 in the direction indicated by arrow 118. If the ears bounce over either end 112 or end 114, the ears fall to the ground and are lost.
Head 122 has opposite ends 140 and 142. Once ears of corn are separated from the stalks by the head 122 shown in
In some current systems, end dividers 146 and 150 are manually movable between the raised position and the retracted position. Therefore, in order to change the position of an end divider 146 or 150, the operator of the agricultural harvester 100 or 120 must exit the operator compartment 103 in order to effectuate a positional change of the end dividers 146 and 150. For instance, if the operator wishes to lower end divider 146 to the retracted position, the operator must exit the operator compartment 103 and manually lower end divider 146 into its retracted position. Similarly, if the operator then wishes to raise end divider 146, the operator, in current systems, must exit operator compartment 103 and manually raise end divider 146. Additionally, in current systems, the end dividers 146 and 150 are only positionable at either the fully retracted position, in which the end divider is fully retracted, or the fully raised position, in which the end divider is fully raised.
The present description thus proceeds with respect to a system in which the end dividers 146 and 150 are automatically movable between the fully retracted position and the fully raised position. In some examples, positions of the end dividers 146 and 150 are selectable to any of a plurality of different positions between the fully retracted position and the fully raised position. Also, in some examples, the end dividers 146 and 150 are movable to a position based upon an operator input, such as an operator input made from within the operator compartment 103 of the combine harvester 102. Also, in some examples, the position of the end dividers 146 and 150 is automatically controlled based upon sensor inputs, operator inputs, or other inputs.
Referring again to
An actuator controller 160 generates control signals to control actuator 158 and actuator 159 based upon inputs from one or more input mechanisms 162. Input mechanisms 162 may include one or more sensors 164, one or more operator interface mechanisms 166, and one or more other input mechanisms 168. The operator interface mechanisms 166 may be one or more of pedals, levers, joysticks, a steering wheel, buttons, switches, keypads, keyboards, a point and click device, a touch sensitive display device, an actuator displayed on a user interface, a speaker, speech synthesis and speech recognition functionality, and other audible, visible, and haptic operator input and output devices. An operator 170 may therefore provide an input through operator interface mechanisms 166 to command end divider 146, or end divider 150, or both end dividers 146 and 150, to move to a desired position. The operator interface mechanisms 166 may detect the command from operator 170 and provide an indication of the command to actuator controller 160. Actuator controller 160 generates control signals to control actuator 158 to control the position of end divider 146 in response to the provided command indication. Similarly, actuator controller 160 generates control signals to control actuator 159 to control the position of end divider 150 based on the command from operator 170. In some examples, actuator controller 160 generates separate control signals for each of the actuators 158 and 159. Consequently, in some instances, actuator 158 and actuator 159 are independently controllable relative to one another. Therefore, in some implementations, the position of end divider 146 is independently controllable relative to the position of end divider 150.
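By way of illustration only, the following sketch shows, at a high level, how an operator command identifying a particular end divider and a desired position could be translated into a control signal for only the corresponding actuator. The names used (e.g., EndDividerCommand, ActuatorController) and the 0.0 to 1.0 position scale are illustrative assumptions and do not limit the present description.

```python
# Illustrative sketch only: an actuator controller that maps an operator command
# to an independent control signal for the commanded end divider actuator.
from dataclasses import dataclass


@dataclass
class EndDividerCommand:
    divider: str      # e.g., "left" for end divider 146, "right" for end divider 150
    set_point: float  # 0.0 = fully retracted, 1.0 = fully raised (illustrative scale)


class ActuatorController:
    """Generates separate control signals for each end divider actuator."""

    def __init__(self, actuators):
        # actuators maps a divider identifier to a callable that drives that actuator.
        self.actuators = actuators

    def handle(self, command: EndDividerCommand):
        # Clamp the commanded set point to the allowable range and drive only the
        # actuator associated with the commanded end divider, leaving the other alone.
        set_point = min(max(command.set_point, 0.0), 1.0)
        self.actuators[command.divider](set_point)


# Example usage: raise the left end divider fully; the right actuator is not driven.
controller = ActuatorController({
    "left": lambda sp: print(f"left actuator -> set point {sp:.2f}"),
    "right": lambda sp: print(f"right actuator -> set point {sp:.2f}"),
})
controller.handle(EndDividerCommand(divider="left", set_point=1.0))
```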
Also, in some examples, sensors 164 include a plurality of different sensors that generate sensor signals. The sensor signals are used by actuator controller 160 to automatically generate control signals to control actuator 158 and actuator 159 to thereby control the position of end divider 146 and end divider 150 based upon the sensor signals. Some examples of different types of sensor signals that are within the scope of signals generated by sensors 164 and used by actuator controller 160 to generate control signals to control actuator 158 and actuator 159 are discussed in greater detail below. In one example, sensors 164 can include observation sensor system 117.
In some examples, the rotation speed and direction of the end dividers 147 are automatically controlled based upon sensor inputs, operator inputs, or other inputs. For instance, sensors 164 can include a plurality of different sensors that generate sensor signals. The sensor signals are used by actuator controller 163 to automatically generate control signals to control actuator 161 to thereby control the speed and direction of rotation of end divider 147 based upon the sensor signals. Some examples of different types of sensor signals that are within the scope of signals generated by sensors 164 and used by actuator controller 163 to generate control signals to control actuator 161 are discussed in greater detail below. While some examples are discussed in the context of actuator 158 and end dividers 146 and 150, it is expressly contemplated that these examples are also applicable to actuator 161 and end divider 147.
Also, as illustrated in
In other examples, the operator 170 may be a remote operator 170, and thus some of the input mechanisms 162 (e.g., operator interface mechanisms 166) are located remotely from agricultural harvester 100, 120. Alternatively, even where an operator 170 is remote, some input mechanisms 162 (e.g., operator interface mechanisms 166) may remain local to the agricultural harvester 100, 120 and a remote operator interface may include its own respective operator interface mechanisms that provide similar functionality as operator interface mechanisms 166. The inputs into the remote operator interface mechanisms can be communicated to agricultural system 172 over a communication network. In yet other examples, the operator 170 may be an automated system. The automated system operator may be onboard or remote from agricultural harvester 100, 120. The automated system operator may provide inputs through input mechanisms 162 for the control of agricultural harvester 100, 120, such as control inputs to provide operating settings.
In
Also, it will be noted that, while head 144 is shown as a rigid head, head 144 could be a foldable head such as head 122 shown in
Some of the sensors 164 will now be described by way of example only. Geographic position sensor 173 senses a position of agricultural harvester 100, 120. Geographic position sensor 173 may be a global navigation satellite system (GNSS) receiver, a cellular triangulation sensor, or another type of sensor that senses the position of agricultural harvester 100, 120 in a global or local coordinate system.
Ear loss sensor 174 illustratively detects ear loss over the ends 148 and 152 of head 144, as well as ear loss from contact between head 144, or a component of head 144 (e.g., an end divider 146, 150, or 147), and crop in an adjacent crop row. Ear loss sensor 174 may include optical sensors, such as an image capture device (e.g., a camera) that captures one or more images of an area proximate the ends 148 and 152 of head 144. In some implementations, the ear loss sensor 174 also includes image processing systems, such as an image processing system that processes the one or more captured images to identify any ears that are lost over the ends 148 and 152 of head 144. In some implementations, ear loss sensor 174 includes, for example, mechanical sensors, such as deflectable fingers that extend above the ends 148 and 152 of head 144 and are deflected by ears traveling over the top of head 144. In still other implementations, ear loss sensor 174 can be or include another type of sensor as well. Ear loss sensor 174 generates a signal indicative of detected ear loss. In one example, observation sensor system 117 is or includes an ear loss sensor.
The terrain sensor 176 detects the terrain over which agricultural harvester 100, 120 is traveling, the terrain ahead of agricultural harvester 100, 120 in the direction of travel, or both. Therefore, in some instances, terrain sensor 176 includes, for example, one or more accelerometers, one or more inertial measurement units, an optical sensor that senses the slope of the terrain in front of agricultural harvester 100, 120, or any of a variety of other terrain sensors. Additionally, or alternatively, terrain sensor 176 may include (or utilize sensor data from) sensors on the head that detect a distance of the head (at various points along the width of the head) from the surface of the field. Terrain sensor 176 generates a signal indicative of the terrain. In one example, observation sensor system 117 is or includes a terrain sensor.
Heading sensor 178 detects the heading of agricultural harvester 100, 120. In some implementations, heading sensor 178 includes, for example, a GNSS receiver that detects a current location of agricultural harvester 100, 120. Two measurements can be taken from the GNSS receiver to determine a direction of travel of agricultural harvester 100, 120. In some instances, heading sensor 178 includes, for example, a compass or other heading sensor that detects the heading of agricultural harvester 100, 120. Heading sensor 178 generates a signal indicative of the heading.
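By way of illustration only, the following sketch shows one way a direction of travel could be derived from two successive GNSS position fixes. The function name, the spherical-earth approximation, and the example coordinates are illustrative assumptions.

```python
# Illustrative sketch: deriving a heading from two successive GNSS position fixes.
import math


def heading_from_fixes(lat1, lon1, lat2, lon2):
    """Return the approximate heading, in degrees clockwise from north,
    of travel from the first fix (lat1, lon1) to the second fix (lat2, lon2)."""
    # Convert the small latitude/longitude differences to local north/east
    # displacements in meters, correcting longitude spacing for the latitude.
    d_north = math.radians(lat2 - lat1) * 6_371_000.0
    d_east = math.radians(lon2 - lon1) * 6_371_000.0 * math.cos(math.radians(lat1))
    # atan2(east, north) yields 0 degrees for due north and 90 degrees for due east.
    return math.degrees(math.atan2(d_east, d_north)) % 360.0


# Example: two fixes taken a moment apart while traveling roughly due east.
print(round(heading_from_fixes(41.5000, -93.6000, 41.5000, -93.5999), 1))  # ~90.0
```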
In some implementations, map input mechanism 180 is a computer system through which one or more different maps can be downloaded and stored or otherwise accessed by agricultural system 172. In some implementations, map input mechanism 180 is an interactive computer system that can obtain or access maps that are stored in a remote location. Map input mechanism 180 generates a signal indicative of information on the map. Thus, map input mechanism 180 can detect characteristics on the one or more maps and generate signals indicative of those characteristics. For instance, map input mechanism 180 can detect crop state based on a crop state map and generate a crop state signal indicative of the crop state. Map input mechanism 180 can detect a harvest state based on a harvest map and generate a harvest state signal indicative of the harvest state. Map input mechanism 180 can detect weed characteristic(s) based on a weed map or a vegetative index map, or both, and generate a weed signal indicative of the weed characteristic(s). Map input mechanism 180 can detect a variety of other characteristics from a variety of other types of maps and generate a variety of other corresponding signals indicative of the variety of other characteristics.
Road mode sensor 182 detects when agricultural harvester 100, 120 is in, or is changing to, a road mode in which agricultural harvester 100, 120 is about to travel out of a field. Road mode sensor 182 may detect that agricultural harvester 100, 120 is in road mode by detecting that agricultural harvester 100, 120 is on a road, is no longer in a field, or is about to leave a field. Road mode sensor 182 may take a variety of different forms. For instance, in some implementations, road mode sensor 182 receives an input from a geographic position sensor 173 to identify a current position of agricultural harvester 100, 120. Road mode sensor 182 compares that geographic position against a map that is downloaded or received by map input mechanism 180 to determine where agricultural harvester 100, 120 is located on the map. The fields on the map and the roads on the map are identified beforehand or identified during runtime processing. Therefore, road mode sensor 182 determines whether agricultural harvester 100, 120 is in a field or at a location other than a field, such as on a road. In some implementations, if agricultural harvester 100, 120 is on a road (or at a location other than a field), then road mode sensor 182 detects that agricultural harvester 100, 120 is in road mode. In another example, road mode sensor 182 receives an input indicative of the ground speed of agricultural harvester 100, 120. If the ground speed of agricultural harvester 100, 120 exceeds a threshold level, this may indicate that agricultural harvester 100, 120 is in road mode. In such an instance, the road mode sensor 182 interprets a speed in excess of the threshold level as an indication that agricultural harvester 100, 120 is traveling along a road and, thus, is in road mode. Further, in some implementations where head 144 is a foldable head, road mode sensor 182 detects the position of the foldable portions 124 and 126 or the position of actuators 132 and 138 to determine whether the head 144 is in the deployed position, is in the folded position, is being commanded to move from the deployed position to the folded position, or is being moved from the deployed position to the folded position. When the head is in the folded position, is being commanded to move to the folded position, or is being moved to the folded position, this indicates that agricultural harvester 100, 120 is in the road mode or is about to be placed in the road mode. In other examples, such as where the head 144 is a rigid head, road mode sensor 182 may detect the operation or other characteristics of other components of the harvester 100, 120 to determine whether the harvester 100, 120 is in road mode. For instance, grain tank covers 177 being folded, being moved to a folded position, or being commanded to fold may indicate that the agricultural harvester 100, 120 is on the road or is about to leave the field and, thus, is in road mode. In some instances, road mode sensor 182 also detects an operator input through an operator interface mechanism 166 to determine whether agricultural harvester 100, 120 is in the road mode. For instance, the operator may depress a button or actuate another operator input mechanism, such as any of the operator interface mechanisms 166, to place agricultural harvester 100, 120 in the road mode. The operator input is detected by road mode sensor 182 to determine whether agricultural harvester 100, 120 is in road mode.
Road mode sensor 182 generates a signal indicative of whether agricultural harvester 100, 120 is in road mode. In some examples, an imaging system (e.g., observation sensor system 117) may detect when agricultural harvester 100, 120 is in, or is changing to, a road mode.
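By way of illustration only, the following sketch shows one way several of the criteria described above (ground speed, map-based location, head fold state, and an operator input) could be combined into a single road mode determination. The function name and the speed threshold value are illustrative assumptions.

```python
# Illustrative sketch: combining several inputs into a road mode determination.

ROAD_SPEED_THRESHOLD_KPH = 30.0  # illustrative threshold, not a required value


def in_road_mode(ground_speed_kph, on_road_per_map, head_folded, operator_road_request):
    """Return True when any of the monitored criteria indicates road mode."""
    return (
        ground_speed_kph > ROAD_SPEED_THRESHOLD_KPH  # traveling faster than typical field speeds
        or on_road_per_map                           # geographic position lies on a road
        or head_folded                               # foldable head is folded or folding
        or operator_road_request                     # operator placed the machine in road mode
    )


# Example: a folded head alone is enough to indicate road mode.
print(in_road_mode(8.0, on_road_per_map=False, head_folded=True, operator_road_request=False))
```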
Field mode sensor 184 may detect whether agricultural harvester 100, 120 is in the field mode. Agricultural harvester 100, 120 is in the field mode when agricultural harvester 100, 120 is in a field and is configured to perform a harvesting operation or is performing a harvesting operation. For instance, field mode sensor 184 detects whether the crop processing mechanisms in combine harvester 102, 120 are operating (such as whether threshing and separating mechanisms are operating, whether the gathering chains and augers 108 and 110 on the head 144 are operating, among other things). For example, if the crop processing mechanisms are operating, then field mode sensor 184 detects that agricultural harvester 100, 120 is in the field mode. In some examples, field mode sensor 184 detects whether the crop processing mechanisms in combine harvester 102, 120 are being commanded to operate, such as through operator interface mechanisms 166. In some implementations, field mode sensor 184 compares a current geographic location of agricultural harvester 100, 120 against a map to determine whether agricultural harvester 100, 120 is in a field or in an area other than a field (such as on a road). Field mode sensor 184 determines that, if agricultural harvester 100, 120 is in a field, agricultural harvester 100, 120 is in field mode. In some instances, field mode sensor 184 receives an operator input through operator interface mechanisms 166 indicating that operator 170 has placed the agricultural harvester 100, 120 in field mode. In other examples, field mode sensor 184 may detect the operation or other characteristics of other components of the harvester 100, 120 to determine whether the harvester 100, 120 is in field mode. For instance, grain tank covers 177 being opened, being moved to an opened position, or being commanded to open may indicate that the agricultural harvester 100, 120 is in the field or is about to enter a field and, thus, is in field mode. Field mode sensor 184 generates a signal indicative of whether agricultural harvester 100, 120 is in field mode.
Adjacent pass harvest state sensor 186 detects whether crops in the field adjacent the current position of agricultural harvester 100, 120 have been harvested or are still unharvested. For instance, adjacent pass harvest state sensor 186 can determine whether the crops in the area of the field immediately adjacent the left-hand side of head 144 have been harvested as well as whether the crops in the field immediately adjacent the right-hand side of head 144 have been harvested. Adjacent pass harvest state sensor 186 can thus include a processor that processes a harvest map that maps where crops in a field have already been harvested. Based upon the harvested locations on the harvest map, and the current location of agricultural harvester 100, 120, adjacent pass harvest state sensor 186 may generate an output indicating whether the crops have been harvested adjacent the sides of head 144. Adjacent pass harvest state sensor 186 may also include an image capture device, such as a camera, along with an image processing computer system that receives images captured by the image capture device and processes those images to identify items in the images, such as crop stalks, standing crops, harvested crops, or other items. Images of the field adjacent the sides of head 144 can be captured and image processing can be performed to determine whether crop is still standing or has been harvested. Adjacent pass harvest state sensor 186 can detect whether the crop adjacent the sides of head 144 has been harvested in other ways as well.
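By way of illustration only, the following sketch shows one way a georeferenced harvest map could be queried to determine whether the crop adjacent each end of head 144 has already been harvested. The grid representation of the harvest map and the names used are illustrative assumptions.

```python
# Illustrative sketch: checking a georeferenced harvest map on each side of the head.


def adjacent_pass_state(harvest_map, position, side_offsets):
    """harvest_map maps (row, col) grid cells to True (harvested) or False.
    position is the harvester's current grid cell; side_offsets gives the grid
    offsets of the cells just beyond the left and right ends of the head."""
    states = {}
    for side, (d_row, d_col) in side_offsets.items():
        cell = (position[0] + d_row, position[1] + d_col)
        # Cells absent from the map are treated as unharvested.
        states[side] = harvest_map.get(cell, False)
    return states


# Example: the cell to the right of the head is marked harvested, the left is not.
harvest_map = {(10, 11): True}
print(adjacent_pass_state(harvest_map, (10, 10), {"left": (0, -1), "right": (0, 1)}))
# {'left': False, 'right': True}
```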
Weed sensor 188 detects characteristics of weeds, such as weed type and the intensity of weeds. Without limitation, weed intensity may include at least one of weed presence, weed population, weed growth stage, weed biomass, weed moisture, weed density, a height of weeds, a size of a weed plant, an age of weeds, or health condition of weeds at a location within an area. The measure of weed intensity may be a binary value (such as weed presence or weed absence), or a continuous value (such as a percentage of weeds in a defined area or volume) or a set of discrete values (such as low, medium, or high weed intensity values). Without limitation, weed type may include categorization of weeds, such as identification of species or a broader classification, such as vine type weeds vs. non-vine type weeds. In one example, observation sensor system 117 is or includes adjacent pass harvest state sensor 186.
A vegetative index map illustratively maps vegetative index values (which may be indicative of vegetative growth) across different geographic locations in a field of interest. One example of a vegetative index includes a normalized difference vegetation index (NDVI). There are many other vegetative indices that are within the scope of the present disclosure. In some examples, a vegetative index may be derived from sensor readings of one or more bands of electromagnetic radiation reflected by the plants. Without limitation, these bands may be in the microwave, infrared, visible, or ultraviolet portions of the electromagnetic spectrum.
In some implementations, a vegetative index map is used to identify the presence and location of vegetation. In some examples, these maps enable weeds to be identified and georeferenced in the presence of bare soil, crop residue, or other plants, including crop or other weeds. For instance, at the end of a growing season, when a crop is mature, the crop plants may show a reduced level of live, growing vegetation. However, weeds often persist in a growing state after the maturity of the crop. Therefore, if a vegetative index map is generated relatively late in the growing season, the vegetative index map may be indicative of the location of weeds in the field. In some instances, though, the vegetative index map may be less useful (or not at all useful) in identifying an intensity of weeds in a weed patch or the types of weeds in a weed patch. Thus, in some instances, a vegetative index map may have a reduced usefulness in predicting how to control an agricultural harvester as the agricultural harvester moves through the field.
Weed sensor 188 may, thus, include an image capture device that captures images of the field immediately forward of agricultural harvester 100, 120, along with an image processing system that processes the image to identify the intensity of weeds. Weed sensor 188 may also include a map accessing system that obtains vegetative index values from a vegetative index map, such as an NDVI map, along with the current location of agricultural harvester 100, 120, to determine the intensity of weeds. Weed sensor 188 can include other weed sensors as well. In one example, observation sensor system 117 is or includes weed sensor 188.
Ground speed sensor 190 may detect the ground speed of agricultural harvester 100, 120. Ground speed sensor 190 may thus be a sensor that senses the rotational speed of an axle or another sensor that generates an output indicative of the ground speed of agricultural harvester 100, 120.
Hair pinning sensor 181 detects hair pinning on the end dividers 146, 150. Hair pinning is the accumulation of vegetation on an edge of the end dividers, such as the front edge of the end dividers. Hair pinning sensor 181, in one example, includes a camera or other optical sensor that captures images including the end dividers. The captured images are then processed to obtain the vegetation hair pinning information. Hair pinning sensor 181, in one example, includes one or more operator interface mechanisms 166 that allow a user to provide hair pinning information. In one example, observation sensor system 117 is or includes hair pinning sensor 181.
Wrapping sensor 183 detects wrapping on the end dividers. Wrapping is the accumulation of vegetation (or other objects) around an active end divider. Wrapping sensor 183, in one example, includes a camera or other optical sensor that captures images including the end dividers. The captured images are then processed to obtain the wrapping information. Wrapping sensor 183, in one example, includes one or more operator interface mechanisms 166 that allow a user to provide wrapping information. In some examples, wrapping sensor 183 includes a force sensor. In one example, observation sensor system 117 is or includes wrapping sensor 183.
Crop state sensor 185 detects the crop state proximate agricultural system 172. Crop state sensor 185, in one example, includes a camera or other optical sensor that captures images of the field proximate agricultural machine 100, 120. The captured images are then processed to determine the crop state adjacent to agricultural machine 100, 120 (e.g., in front of, behind, or to the sides of agricultural machine 100, 120). Thus, crop state sensor 185 can detect the crop state of crop in the current row that is being harvested as well as in the row(s) adjacent to agricultural machine 100, 120. Crop state sensor 185, in one example, includes one or more operator interface mechanisms 166 that allow a user to provide crop state information. Crop state information can be indicative of the amount of downed crop, the magnitude of the downing (e.g., fully down, half down, slightly down, etc.), the direction of downing, etc. In one example, observation sensor system 117 is or includes crop state sensor 185.
In other examples, the crop state can be derived from a map of the field, such as a crop state map that maps crop state values to different geographic locations across the field. The crop state values can be indicative of the locations of downed crop, the amount of downed crop, the magnitude of downing, and the direction of downing.
Material flow sensor 187 detects material flowing over the side of head 144. Material flow sensor 187, in one example, includes a camera or other optical sensor that captures images proximate head 144. The captured images are then processed to detect the material flow over the sides of head 144. Material flow sensor 187, in one example, includes one or more operator interface mechanisms 166 that allow a user to provide material flow information. In some examples, material flow sensor 187 includes radar, sonar, or lidar systems. In one example, observation sensor system 117 is or includes material flow sensor 187.
Ear orientation sensor 189 detects the orientation of ears of corn on the crop to be harvested. Ear orientation sensor 189, in one example, includes a camera or other optical sensor that captures images of the crop to be harvested. The captured images are then processed to obtain the ear orientation information. Ear orientation sensor 189, in one example, includes one or more operator interface mechanisms 166 that allow a user to provide ear orientation information. In one example, observation sensor system 117 is or includes ear orientation sensor 189.
Operator interface mechanisms 166 may include a wide variety of different operator interface mechanisms that can be used to provide information to operator 170 and receive inputs from operator 170. Therefore, operator interface mechanisms 166 include, for example, a steering wheel, one or more joysticks, buttons, levers, linkages, pedals, or an operator interface display screen that generates displays for operator 170. An operator interface display screen may also display a graphical user interface with operator actuatable mechanisms (such as links, buttons, icons, etc.) that can be actuated by operator 170 to provide an input to agricultural system 172. The operator actuatable mechanisms can be actuated using a point and click device, such as a mouse or trackball, or by a touch gesture where the operator interface display mechanism is a touch sensitive display screen. The operator interface mechanisms 166 may include a microphone and speaker where speech recognition and speech synthesis are provided. The operator interface mechanisms 166 may also include other audio, visual, or haptic devices.
Before describing actuator controller 1600 in more detail, and by way of overview, data store 206 stores maps and data values that can be used by sensor signal/operator input signal processing system 208. Sensor signal/operator input signal processing system 208 receives the signals from sensors 164 and processes the signals to detect variables indicated by the sensor signals. Sensor signal/operator input signal processing system 208 provides an output indicative of the detected variables to control signal generation system 210. Control signal generation system 210 then generates control signals that are output from actuator controller 1600 (e.g., 160) and transmitted to actuators 158 and 159 to control actuators 158 and 159 to move end dividers 146 and 150 to the desired positions, or generates control signals that are output from actuator controller 1600 (e.g., 163) and transmitted to actuator 161 to control actuator 161 to control rotation of end dividers 147.
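By way of illustration only, the following sketch shows the general flow described above, in which sensor or operator signals are processed into detected variables, a control action is identified, and a control signal is generated. The function names are illustrative assumptions.

```python
# Illustrative sketch: one pass through the signal processing / control pipeline.


def run_controller_cycle(raw_signals, signal_processors, identify_action, generate_signal):
    """Process raw signals, identify a control action, and generate a control signal."""
    # 1. Each signal processor turns a raw sensor/operator signal into a detected variable.
    detected = {name: proc(raw_signals.get(name)) for name, proc in signal_processors.items()}
    # 2. The control action identification step maps detected variables to an end divider command.
    action = identify_action(detected)
    # 3. The control signal generator produces the signal sent to the actuator(s).
    return generate_signal(action)


# Example: a detected ear-loss condition on the left end produces a "raise left" command.
signal = run_controller_cycle(
    raw_signals={"ear_loss": "left"},
    signal_processors={"ear_loss": lambda s: s},
    identify_action=lambda d: ("raise", "left") if d.get("ear_loss") == "left" else None,
    generate_signal=lambda a: f"actuate {a[1]} divider: {a[0]}" if a else "no action",
)
print(signal)  # actuate left divider: raise
```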
Data store 206 includes information that is used by one or more of sensor signal/operator input signal processing system 208 or control signal generation system 210. Therefore, as an example, data store 206 may include default end divider set point values 214 (e.g., default position values, default rotation speed values, etc.), prior end divider set point values 216 (e.g., prior position values, prior rotation speed values, etc.), maps 218, and other items 220.
Sensor signal/operator input signal processing system 208 may include ear loss signal processor 222, terrain signal processor 224, prior set point processor 226, direction of travel processor 228, map processor 230, road mode signal processor 232, field mode signal processor 234, harvested/unharvested signal processor 236, weed signal processor 238, ground speed signal processor 240, operator input processor 242, other input processor 244, and combination signal processor 246. Control signal generation system 210 may include control action identification system 248, control signal generator 250, and other items 252. Control action identification system 248 may include end divider identifier 254, raise/lower action identifier 256, set point identifier 258, speed identifier 259, direction identifier 261, and other items 260.
Ear loss signal processor 222 receives a signal from ear loss sensor 174 and processes that signal to determine whether ear loss is occurring. Further, if ear loss is occurring, ear loss signal processor 222 determines the location on head 144 where the ear loss is occurring. The ear loss may be occurring over either end or both ends of head 144, for example. When ear loss is detected over one or both ends of the head 144, ear loss signal processor 222 generates a signal indicative thereof and control action identification system 248 identifies an end divider command commanding one or both of end dividers 146 and 150 to be raised to prevent ear loss at the end of the head 144 where ear loss is detected. In another example, ear loss, in the form of ears being knocked off of crop plants in adjacent crop row(s), may be occurring due to contact between the end dividers 146 or 150, or both, and the crop plants in the adjacent crop row(s). When ear loss, in the form of ears being knocked off of crop plants, is detected, ear loss signal processor 222 generates a signal indicative thereof and control action identification system 248 identifies an end divider command commanding one or both of end dividers 146 and 150 to be lowered to prevent ears from being knocked off of crop plants in adjacent crop row(s). In another example, when ear loss is detected, ear loss signal processor 222 generates a signal indicative thereof and control action identification system 248 identifies an end divider command for one or both end dividers 147, such as a command to adjust a speed of rotation (including stopping or starting rotation) of one or both end dividers 147.
Terrain signal processor 224 receives the signal from terrain sensor 176 and determines whether the terrain is sloping, so that one of the ends of head 144 is lower than the other end of head 144. Terrain signal processor 224 also determines whether agricultural harvester 100, 120 is approaching a trench or other terrain feature. If the terrain is sloping so that one of the ends of head 144 is lower than the other end of head 144, or if agricultural harvester 100, 120 is approaching a trench, then terrain signal processor 224 outputs a signal to control signal generation system 210 indicating the direction of slope or the location of the trench, and control action identification system 248 identifies an end divider command so that the position of end dividers 146 and 150 can be controlled to avoid ear loss due to the terrain. Control action identification system 248 may also identify an end divider command commanding that one or both end dividers 146 and 150 be raised. For instance, if the terrain slopes so that the left end of head 144 is lower than the right end of head 144, then end divider 146 is raised to avoid ear loss over the left end of head 144. If agricultural harvester 100, 120 is about to traverse a trench, then both end dividers 146 and 150 can be raised to avoid ear loss over both ends of the head 144 while agricultural harvester 100, 120 traverses the trench.
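By way of illustration only, the following sketch shows one way a sensed cross slope or an upcoming trench could be mapped to raise commands for one or both end dividers. The sign convention and the slope threshold value are illustrative assumptions.

```python
# Illustrative sketch: mapping terrain conditions to end divider raise commands.

CROSS_SLOPE_THRESHOLD_DEG = 3.0  # illustrative threshold, not a required value


def terrain_commands(cross_slope_deg, trench_ahead):
    """Positive cross_slope_deg means the left end of the head is lower than the right."""
    if trench_ahead:
        # Raise both end dividers while the trench is traversed.
        return {"left": "raise", "right": "raise"}
    if cross_slope_deg > CROSS_SLOPE_THRESHOLD_DEG:
        return {"left": "raise"}    # left end lower: raise the left end divider
    if cross_slope_deg < -CROSS_SLOPE_THRESHOLD_DEG:
        return {"right": "raise"}   # right end lower: raise the right end divider
    return {}                       # terrain is roughly level: no terrain-based command


print(terrain_commands(4.2, trench_ahead=False))  # {'left': 'raise'}
```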
Prior set point processor 226 accesses the prior set point values 216 in data store 206. The prior set point values 216 indicate the position to which end dividers 146 and 150 have been set in the past or the speeds of rotation at which end dividers 147 have been set in the past. The prior set point values 216 are also geo-referenced values to indicate the location of agricultural harvester 100, 120 corresponding to the prior set point value 216. These geo-referenced set point values 216 may be used in automatically controlling the positions of end dividers 146 and 150. For instance, where operator 170 disengages the automatic control of end dividers 146 and 150, and then re-engages the automated control of end dividers 146 and 150, then prior set point processor 226 can obtain the prior set point values 216 for the end dividers 146 and 150 just prior to disengaging the automatic control of the end dividers 146 and 150. Prior set point processor 226 may then generate an output signal to control signal generation system 210 indicating the prior set point values. As a result, the end dividers 146 and 150 can be automatically set to the prior positions.
Direction of travel processor 228 receives a signal from heading sensor 178 and identifies the direction of travel of agricultural harvester 100, 120. By way of example, it may be that the operator 170 had controlled the end divider position so that the right end divider 150 was in the raised position while the left end divider 146 was in the retracted position. This may happen, for instance, because the crop to the right of the head 144 has already been harvested while the crop to the left of head 144 has not been harvested, and a raised end divider may dislodge or otherwise separate ears from the unharvested row adjacent the left side of head 144. However, once the agricultural harvester 100, 120 makes a headland turn, then direction of travel processor 228 determines that the agricultural harvester is now heading in the opposite direction from the last pass and control action identification system 248 identifies that now the left end divider 146 should be moved to its raised position and the right end divider 150 should be moved to its retracted position because now the unharvested crop is to the right of head 144. Direction of travel processor 228 may provide an output indicative of the direction of travel of agricultural harvester 100, 120 and the desired end divider positions based upon the direction of travel. In another example, based on an indication of the direction of travel of agricultural harvester 100, 120 from direction of travel processor 228, control action identification system 248 identifies speeds of rotation for the end dividers 147, such as to slow rotation or stop rotation of one end divider 147 and to increase rotation or start rotation of the other end divider 147.
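By way of illustration only, the following sketch shows one way the raised and retracted end divider positions could be mirrored after a headland turn reverses the direction of travel. The names used are illustrative assumptions.

```python
# Illustrative sketch: mirroring end divider positions after a heading reversal.


def positions_after_turn(previous_positions, heading_reversed):
    """previous_positions maps 'left'/'right' to 'raised' or 'retracted'."""
    if not heading_reversed:
        return dict(previous_positions)
    # After a reversal of the direction of travel, the harvested and unharvested
    # sides swap, so the two end divider positions are mirrored.
    return {"left": previous_positions["right"], "right": previous_positions["left"]}


# Example: the right divider was raised on the last pass; after the turn the left is raised.
print(positions_after_turn({"left": "retracted", "right": "raised"}, heading_reversed=True))
# {'left': 'raised', 'right': 'retracted'}
```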
Map processor 230 may process any maps that are received through map input mechanism 180. Map processor 230 may receive, for instance, a harvest map to determine what portions of the current field are harvested and where those portions lie relative to the current position of agricultural harvester 100, 120 and relative to the direction of travel of agricultural harvester 100, 120. Map processor 230 may determine the location of agricultural harvester 100, 120 relative to the edges of the field, relative to fence lines or tree lines, or relative to other features, such as other features noted on a map. Map processor 230 may process a vegetative index map to identify the location of weed patches. Map processor 230 may process a crop state map to identify the locations of downed crop, the amounts of downed crop, the magnitude of downing, and the direction of downing. Map processor 230 may process a harvest map to identify the harvest state (e.g., harvested or unharvested) of crop in adjacent rows. Map processor 230 may process a weed map to identify weed characteristics, such as weed presence, weed type, weed intensity, as well as various other weed characteristics. Map processor 230 may process a map to identify whether agricultural harvester 100, 120 is currently located in a field or on a road or located elsewhere. In some implementations, map processor 230 may be used as the road mode sensor 182 and generate an output signal indicating that agricultural harvester 100, 120 is located on a road. Map processor 230 may generate an output signal indicative of the features on one or more maps and control action identification system 248 can identify commanded end divider actions based on the features on the one or more maps.
Road mode signal processor 232 receives an input from road mode sensor 182 and determines whether agricultural harvester 100, 120 is in road mode or is being moved into the road mode from the field mode. Agricultural harvester 100, 120 is in road mode when it is physically configured to travel on the road as opposed to through a field. For instance, if road mode sensor 182 detects that the actuators 132 and 138 on a foldable head are being moved from a position in which the head is unfolded to a position in which the head is folded, road mode sensor 182 may provide an output indicative of the changing positions of actuators 132 and 138 to road mode signal processor 232. Based upon the signal from road mode sensor 182, road mode signal processor 232 may determine that agricultural harvester 100, 120 is being moved to the road mode and provide an output indicating that agricultural harvester 100, 120 is being moved to the road mode, and control action identification system 248 identifies a command, provided to control signal generator 250, to lower the end dividers 146 and 150 or to stop rotation of end dividers 147.
Field mode signal processor 234 receives an input from field mode sensor 184 and determines whether agricultural harvester 100, 120 is in the field mode and generates an output indicating whether agricultural harvester 100, 120 is in the field mode. Agricultural harvester 100, 120 is in field mode when it is physically configured to travel through a field as opposed to on a road. For instance, if field mode sensor 184 provides an output indicating that the crop processing systems in agricultural harvester 100, 120 are operating, field mode signal processor 234 may determine that agricultural harvester 100, 120 is in the field mode. Field mode signal processor 234 may then generate an output indicating that agricultural harvester 100, 120 is in field mode, and control action identification system 248 identifies a commanded end divider action, such as a command that end dividers 146, 150 be raised or a command that end dividers 147 begin rotation.
Harvested/unharvested signal processor 236 may receive a signal from adjacent pass harvest state sensor 186 and determine whether the crop adjacent the sides of head 144 has been harvested or is still unharvested. Based upon the signal from adjacent pass harvest state sensor 186, harvested/unharvested signal processor 236 may determine, for instance, that the crop on the right side of head 144 has already been harvested, while the crop on the left side of head 144 has not been harvested. Thus, control action identification system 248 may command the end divider 150 to be raised and end divider 146 to be lowered or may command one end divider 147 to begin or increase rotation and the other end divider 147 to stop or decrease rotation.
Weed signal processor 238 may receive a signal from weed sensor 188 and determine whether weeds are currently being encountered by head 144. Weed signal processor 238 may also determine the intensity of the weeds. Weed signal processor 238 may also determine the type of weeds (e.g., viny or non-viny). For instance, where weed sensor 188 is an optical sensor and provides an output indicative of the presence of weeds over a pre-defined area (e.g., the field of view of the sensor), weed signal processor 238 may process that signal to indicate that weeds are present, the type of weeds, and that the intensity of the weeds is at a certain intensity level. Control action identification system 248 may then command the end dividers 146, 150 to be lowered to avoid entanglement in heavy weeds, command that the end dividers 146 and 150 be moved to another position, or command one or more end dividers 147 to adjust their rotation based on the weed characteristics.
Ground speed signal processor 240 may receive an input signal from ground speed sensor 190 indicative of the ground speed of agricultural harvester 100, 120. Ground speed signal processor 240 may process that signal to determine the ground speed of agricultural harvester 100, 120, which can be used to determine that agricultural harvester 100, 120 is in road mode or field mode, for example. Control action identification system 248 can then identify a command commanding whether the end dividers 146 and 150 should be raised or lowered or whether the rotation of end dividers 147 should be changed. Thus, control action identification system 248 may identify an end divider command (to control end dividers 146, 150 or end dividers 147) based upon the ground speed.
Operator input processor 242 may receive a signal from operator interface mechanisms 166 indicative of an input from operator 170. Operator input processor 242 may process the operator input to indicate the desired positions of end dividers 146 and 150 or desired rotations of end dividers 147 based upon the operator input. By way of example, it may be that operator 170 provides an input through operator interface mechanisms 166 commanding that end divider 146 be raised to the fully raised position and commanding end divider 150 to be raised only to a halfway point between the fully retracted position and the fully raised position. In another example, it may be that the operator 170 provides an input through operator interface mechanisms 166 commanding that rotational speed of one end divider 147 be increased while the rotational speed of another end divider 147 be decreased. These are merely examples. Control action identification system 248 may then identify an end divider command that commands an end divider position or an end divider rotation based upon the detected operator input. Other input processor 244 may receive inputs from other sensors.
Hair pinning signal processor 241 may receive a signal from hair pinning sensor 181 indicative of vegetation hair pinning on one or more surfaces of the agricultural harvester 100, 120. Hair pinning signal processor 241 may process that signal to determine that vegetation is hair pinning, for example, on one or more of end dividers 146, 150. Hair pinning signal processor 241 may also determine the intensity of the hair pinning. For instance, where hair pinning sensor 181 is an optical sensor and provides an output indicative of the presence of hair pinning on a portion of agricultural machine 100, 120, hair pinning signal processor 241 may process that signal to indicate that hair pinning is present, and that the intensity or amount of hair pinning is at a certain intensity level. Hair pinning signal processor 241 may also determine the type of vegetation causing the hair pinning. For instance, where hair pinning sensor 181 is an optical sensor and provides an output indicative of the presence of hair pinning on a portion of agricultural machine 100, 120, hair pinning signal processor 241 may process that signal to indicate that hair pinning is present, and that the hair pinning is being caused by a specific type of vegetation (e.g., a specific type of weed, the crop plants, or a specific portion of a plant). Control action identification system 248 may then identify an end divider command to control the end dividers 146, 150 based on the detected hair pinning, such as to lower one or both of end dividers 146, 150.
Wrapping signal processor 243 may receive a signal from wrapping sensor 183 indicative of vegetation wrapping on one or more surfaces of agricultural machine 100, 120. Wrapping signal processor 243 may process that signal to determine that an object is wrapping, for example, on end dividers 147. Wrapping signal processor 243 may also determine the intensity of the wrapping. For instance, where wrapping sensor 183 is an optical sensor and provides an output indicative of the presence of wrapping on a portion of agricultural machine 100, 120, wrapping signal processor 243 may process that signal to indicate that wrapping is present, and that the intensity or amount of wrapping is at a certain intensity level. Wrapping signal processor 243 may also determine the type of vegetation or object causing the wrapping. For instance, where wrapping sensor 183 is an optical sensor and provides an output indicative of the presence of wrapping on a portion of agricultural machine 100, 120, wrapping signal processor 243 may process that signal to indicate that wrapping is present, and that the wrapping is being caused by a specific object (e.g., a piece of wire or twine, a specific type of weed, the crop plants, or a specific portion of a plant). Control action identification system 248 may then identify an end divider command to control the end dividers 147 based on the detected wrapping, such as to reduce, reverse, or stop rotation of one or both of end dividers 147.
Crop state signal processor 245 may receive a signal from crop state sensor 185 indicative of the crop state of crop proximate agricultural machine 100, 120. Crop state signal processor 245 may process that signal to determine that crop proximate to agricultural machine 100, 120 is in some state of being downed. Crop state signal processor 245 may also determine the intensity (magnitude) of the downing. For instance, where crop state sensor 185 is an optical sensor and provides an output indicative of the presence of downed crop proximate agricultural machine 100, 120, crop state signal processor 245 may process that signal to indicate that downed crop is present, and that the intensity (e.g., not downed, partially downed, fully downed, etc.) or amount of downed crop is at a certain intensity level. Crop state signal processor 245 may also determine the direction (e.g., compass direction, or direction relative to the agricultural harvester) of the downed crop. For instance, where crop state sensor 185 provides an output indicative of the presence of downed crop proximate agricultural machine 100, 120, crop state signal processor 245 may process that signal to indicate that downed crop is present, and that the crop is downed to the east. Control action identification system 248 may then identify an end divider command to control the end dividers 146, 150 (e.g., raise or lower) or the end dividers 147 (e.g., adjust rotation) based on the detected crop state.
Material flow signal processor 247 may receive a signal from material flow sensor 187 indicative of material flow over the side of head 144. Material flow signal processor 247 may process that signal to determine that material is flowing over the side of head 144. Material flow signal processor 247 may also determine the intensity of the material flow. For instance, where material flow sensor 187 is an optical sensor and provides an output indicative of the presence of material flow over the side of head 144, material flow signal processor 247 may process that signal to indicate that material flow over the side is present, and that the intensity or amount of material flow over the side of head 144 is at a certain intensity level. Material flow signal processor 247 may also determine the type of material flowing over the side. For instance, where material flow sensor 187 is an optical sensor and provides an output indicative of the presence of material flow over the side of head 144, material flow signal processor 247 may process that signal to indicate that material flow over the side is present, and that the material flowing over the side comprises specific types of material (e.g., weeds, the crop plants, or a specific portion of a plant). Control action identification system 248 may then identify an end divider command to control the end dividers 146, 150 (e.g., raise or lower) or the end dividers 147 (e.g., adjust rotation) based on the detected material flow.
Ear orientation signal processor 249 may receive a signal from ear orientation sensor 189 indicative of ear orientation proximate agricultural machine 100, 120. Ear orientation signal processor 249 may process that signal to determine the orientation of ears of corn. Ear orientation signal processor 249 may also determine the distribution of varying ear orientations. For instance, where ear orientation sensor 189 is an optical sensor and provides an output indicative of ear orientation, ear orientation signal processor 249 may process that signal to indicate that a first percent of ears are in a first orientation, a second percent of ears are in a second orientation, etc. Control action identification system 248 may then identify an end divider command to control the end dividers 146, 150 (e.g., raise or lower) or the end dividers 147 (e.g., adjust rotation) based on the detected ear orientation.
Combination signal processor 246 may receive inputs from a combination of the different sensors 164 and operator interface mechanisms 166 and generate an output indicative of the desired position of end dividers 146 and 150. For instance, combination signal processor 246 may receive an input from geographic position sensor 173 identifying the geographic position of agricultural harvester 100, 120. Combination signal processor 246 may also receive an input from map input mechanism 180 that includes a map of field boundaries with fences. Combination signal processor 246 may also receive an input from heading sensor 178 that identifies the heading of agricultural harvester 100, 120. Based upon the location of harvester 100, 120 relative to the fences identified in the map, and based upon the heading of agricultural harvester 100, 120 detected by heading sensor 178, control action identification system 248 may identify an end divider command indicating that the end divider 146, 150 closest to the fence line should be moved to the retracted position in order to avoid being caught on the fence, or indicating that the end divider 147 should stop or slow rotation to avoid being caught in the fence. Control action identification system 248 may identify an end divider command indicative of a commanded position or a commanded rotation of the end divider(s) based upon the combination of inputs.
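One simplified, non-limiting way to combine geographic position, the mapped fence boundary, and heading into an end divider command is sketched below. The geometry is reduced to a signed lateral distance from the fence line, and the clearance value and helper names are assumptions for the example.

```python
from dataclasses import dataclass

@dataclass
class EndDividerCommand:
    divider: str        # "left" or "right" end of head 144 (illustrative)
    action: str         # e.g., "retract", "slow_rotation"

def fence_avoidance_command(lateral_offset_m: float,
                            fence_clearance_m: float = 2.0):
    """Return commands for the end divider closest to a fence line.

    lateral_offset_m is the signed distance from the head centerline to the
    fence (positive = fence on the right), assumed to be derived from the
    geographic position, the mapped fence boundary, and the machine heading.
    """
    commands = []
    if abs(lateral_offset_m) <= fence_clearance_m:
        side = "right" if lateral_offset_m > 0 else "left"
        # Retract the passive end divider nearest the fence and slow the
        # active end divider on that side to avoid snagging the fence.
        commands.append(EndDividerCommand(divider=side, action="retract"))
        commands.append(EndDividerCommand(divider=side, action="slow_rotation"))
    return commands

# Example: fence 1.5 m to the right of the head.
print(fence_avoidance_command(lateral_offset_m=1.5))
```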
Once control signal generation system 210 receives one or more inputs from sensor signal/operator input signal processing system 208, control action identification system 248 identifies the control action (e.g., end divider command) that is to be taken and control signal generator 250 generates control signals to execute the identified control action. For example, assume that the end divider command from control action identification system 248 indicates that end divider 146 should be moved to the fully retracted position and end divider 150 should be raised to a position midway between the fully retracted position and the fully raised position. End divider identifier 254 then identifies which of end dividers 146 and 150 is affected by the end divider command. In the present example, both end dividers 146 and 150 will be affected by the end divider command. Raise/lower action identifier 256 determines whether the end divider command is to raise or lower a particular end divider, and set point identifier 258 identifies the set point (which may be indicative of the desired end divider position) for the end divider that is to be raised or lowered. Continuing with the present example in which end divider 150 is to be raised to the midway point between the fully retracted and fully raised positions, end divider identifier 254 identifies the affected end divider as end divider 150. Raise/lower action identifier 256 identifies that end divider 150 is to be raised, and set point identifier 258 identifies, from the end divider command, that the set point for end divider 150 is the midway point between the fully retracted position and the fully raised position. Control action identification system 248 provides an output to control signal generator 250 indicating that end divider 150 is to be raised to the midpoint position. Control signal generator 250 then generates control signals to control actuator 159 to raise end divider 150 to the midpoint position. Position/height sensor 194 may sense the position or height of end divider 150 and provide a feedback signal to control signal generator 250. In another example, control signal generator 250 generates control signals in an open loop fashion in which the set point is commanded, and no feedback is used.
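By way of illustration only, the closed-loop behavior described in this example (command a set point and use the feedback from position/height sensor 194) might be sketched as a simple proportional loop. The gain, tolerance, and actuator interface below are assumptions for the sketch; in the open-loop example, the set point would simply be commanded once and the feedback loop omitted.

```python
def drive_end_divider_to_set_point(read_position, command_actuator,
                                   set_point: float,
                                   gain: float = 0.5,
                                   tolerance: float = 0.01,
                                   max_steps: int = 200) -> float:
    """Simple proportional loop toward a commanded end divider position.

    Positions are normalized: 0.0 = fully retracted, 1.0 = fully raised, so a
    midway set point is 0.5. `read_position` stands in for position/height
    sensor 194 and `command_actuator` for actuator 159 (both assumptions).
    """
    position = read_position()
    for _ in range(max_steps):
        error = set_point - position
        if abs(error) <= tolerance:
            break
        command_actuator(gain * error)   # signed actuation effort
        position = read_position()
    return position

# Minimal simulated example: raise end divider 150 to the midway point.
state = {"pos": 0.0}
final = drive_end_divider_to_set_point(
    read_position=lambda: state["pos"],
    command_actuator=lambda effort: state.__setitem__("pos", state["pos"] + effort),
    set_point=0.5,
)
print(round(final, 3))
```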
In another example, assume that the end divider command from control action identification system 248 indicates that the rotation of one end divider 147 should be increased to 80% (or given RPMs) speed and the rotation of another end divider 147 should be decreased to 50% (or given RPMs) speed. End divider identifier 254 then identifies which end dividers 147 are affected by the end divider command. In the present example, both end dividers 147 will be affected by the end divider command. Speed identifier 259 determines whether the end divider command is to increase or decrease the rotational speed of a particular end divider, and set point identifier 258 identifies the set point (which may be indicative of the desired end divider rotational speed) for the end divider that is to be increased or decreased in speed. Continuing with the present example in which the rotational speed of one end divider 147 is to be increased to 80%, end divider identifier 254 identifies the affected end divider 147 (e.g., the end divider 147 on a first end of head 144). Speed identifier 259 identifies that the rotational speed of the end divider 147 on the first end is to be increased, and set point identifier 258 identifies, from the end divider command, that the set point for end divider 147 is 80% (or given RPMs). Control action identification system 248 provides an output to control signal generator 250 indicating that the rotation of the end divider 147 on the first end is to be increased to the set point. Control signal generator 250 then generates control signals to control actuator 161 to increase the speed of the identified end divider 147 to the set point. Speed sensor 197 may sense the speed of the end divider 147 on the first end and provide a feedback signal to control signal generator 250. In another example, control signal generator 250 generates control signals in an open loop fashion in which the set point is commanded, and no feedback is used.
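Similarly, and again only as a hypothetical sketch, commanding a rotational speed set point could be represented as a rate-limited update toward the set point, with speed sensor 197 supplying the measured speed when feedback is used. The maximum speed and step values are assumptions for the example.

```python
def command_end_divider_speed(current_rpm: float, set_point_rpm: float,
                              max_step_rpm: float = 50.0) -> float:
    """Step the rotational speed of an active end divider toward a set point.

    A hypothetical rate-limited update standing in for actuator 161; speed
    sensor 197 would supply current_rpm when closed-loop feedback is used.
    """
    delta = set_point_rpm - current_rpm
    delta = max(-max_step_rpm, min(max_step_rpm, delta))
    return current_rpm + delta

# Example: one divider commanded to 80% of an assumed 500 RPM maximum
# (400 RPM), the other decreased toward 50% (250 RPM).
rpm_first, rpm_second = 300.0, 400.0
for _ in range(5):
    rpm_first = command_end_divider_speed(rpm_first, 0.80 * 500.0)
    rpm_second = command_end_divider_speed(rpm_second, 0.50 * 500.0)
print(rpm_first, rpm_second)
```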
The present description will proceed with respect to head 144, but it will be appreciated that the description could be applied to head 104 shown in
Operator 170 then provides an operator input to actuate the end dividers, such as to raise the end dividers 146, 150 or to initiate rotation of end dividers 147. In the example illustrated in
In one example, the end dividers 146 and 150 are each configured with an actuator 158 and 159, respectively, so that each of end dividers 146 and 150 can be controlled individually as indicated by block 278. In one example, the end dividers 147 are each configured with a respective actuator 161, so that each end divider 147 can be controlled individually as indicated by block 278. In one example, block 280 shows that the actuators 158 and 159 can be set to multiple different positions between the fully retracted and fully raised positions so that end dividers 146 and 150 can be set to any of multiple different heights. In one example, block 280 shows that actuators 161 can be set to multiple different settings such that end dividers 147 can be set to any of a variety of different speeds of rotation. In one example, block 282 shows that operator 170 can provide an input to actuator controller 160 so that actuator controller 160 is placed in an auto-control mode in which actuator controller 160 automatically controls the height of end dividers 146 and 150 based upon inputs from input mechanism(s) 162. In one example, block 282 shows that operator 170 can provide an input to actuator controller 163 so that actuator controller 163 is placed in an auto-control mode in which actuator controller 163 automatically controls the rotation of end dividers 147. In one example, automatically means that the operation is performed without further operator involvement except, perhaps, to initiate or authorize the operation. Block 284 shows that operator 170 may provide an operator input in other ways, and the operator input may be detected in other ways as well.
In one example, control action identification system 248 then identifies the end dividers that are being commanded, and determines that the commanded action is to raise the end dividers 146 and 150. Set point identifier 258 identifies the set point of the command, indicating the particular position of end dividers 146 and 150 relative to their fully retracted or fully raised positions. Block 286 shows that control signal generator 250 then generates control signals to control the actuators 158 and 159 to thereby move end dividers 146 and 150, respectively, to the commanded positions.
Keeping with the above example, block 288 shows that actuator controller 160 then begins to automatically control the position of end dividers 146 and 150 based upon the inputs from input mechanisms 162. In one example, the automated control of the position of end dividers 146 and 150 can continue until the harvesting operation is complete or until some other end criteria are met, as indicated by block 290 in the flow diagram of
In another example, control action identification system 248 then identifies the end dividers that are being commanded, and determines that the commanded action is to control rotation of the end dividers 147. Set point identifier 258 identifies the set point of the command, indicating the particular rotation of end dividers 147 relative to their range of rotation. Block 286 shows that control signal generator 250 then generates control signals to control the actuators 161 to thereby actuate end dividers 147 based on the commanded rotation.
Keeping with the above example, block 288 shows that actuator controller 163 then begins to automatically control the rotation (e.g., speed and/or direction) of end dividers 147 based upon the inputs from input mechanisms 162. In one example, the automated control of the rotation of end dividers 147 can continue until the harvesting operation is complete or until some other end criteria are met, as indicated by block 290 in the flow diagram of
Control signal generation system 210 then generates control signals to actuate one or more of the end dividers based upon the identified end divider command. For instance, terrain signal processor 224 may generate a signal identifying a downhill end divider and control action identification system 248 may identify a command indicating that the downhill end divider should be raised. In another example, terrain signal processor 224 may also determine that the terrain signal output by terrain sensor 176 shows an upcoming ditch. Control action identification system 248 may then determine that both end dividers should be raised while the agricultural harvester 100, 120 traverses the ditch and generate an end divider command. In another example, terrain signal processor 224 may also determine that the terrain signal output by terrain sensor 176 shows that the harvester (or head) will be rolled (e.g., one side of the head will be lower than another side of the head) due to upcoming terrain. Control action identification system 248 may then determine that the end divider on the lower end should be actuated (e.g., increase rotational speed) and generate an end divider command. Block 298 shows that control signal generator 250 then generates control signals to actuate the end dividers based upon the signal from terrain signal processor 224 and the action identified by control action identification system 248.
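As a non-limiting illustration of the terrain-driven examples above, the mapping from a roll signal and an upcoming-ditch signal to end divider commands might be sketched as follows; the roll threshold and the sign convention are assumptions made for the sketch.

```python
def terrain_based_commands(roll_deg: float, ditch_ahead: bool,
                           roll_threshold_deg: float = 3.0):
    """Illustrative mapping from terrain signals to end divider commands.

    roll_deg > 0 is assumed to mean the right side of the head is lower.
    The threshold value is an assumption chosen for the example.
    """
    commands = []
    if ditch_ahead:
        # Raise both passive end dividers while traversing the ditch.
        commands.append(("left", "raise"))
        commands.append(("right", "raise"))
    elif abs(roll_deg) >= roll_threshold_deg:
        lower_side = "right" if roll_deg > 0 else "left"
        # Raise the downhill passive divider and speed up the active
        # divider on the lower end of the head.
        commands.append((lower_side, "raise"))
        commands.append((lower_side, "increase_rotation"))
    return commands

print(terrain_based_commands(roll_deg=5.0, ditch_ahead=False))
```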
In one example, if the weed intensity exceeds a threshold, the end dividers can be lowered. When the end dividers are lowered, the likelihood of hair pinning weeds is reduced. Similarly, if the weed intensity falls below a threshold, the end dividers can be raised. In one example, the end dividers may be lowered or may be raised depending on the weed type. For instance, some weed types may be more or less likely to result in hair pinning. Similarly, the active end dividers may be slowed (or stopped) or increased in rotational speed depending on the weed type. For instance, some weed types (e.g., viny weeds) may be more likely to wrap, whereas other weed types (e.g., non-viny weeds) may be less likely to wrap.
In one example, if the weed intensity exceeds a threshold, the active end dividers can be slowed down. When the active end dividers are slowed down, wrapping can be reduced or prevented, and the degree of slowing may differ depending on weed type. In particular, viny weeds are more prone to wrap, and the reduction of speed may need to be more aggressive. Some tall, coarse, stiff weeds, such as common ragweed, may cause the active divider speed to be increased to prevent the ragweed from falling outward and pulling corn plants with them.
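By way of illustration only, the weed-driven behavior of the preceding two paragraphs might be sketched as below, with the passive end dividers raised or lowered based on weed intensity thresholds and the active end dividers slowed or sped up based on weed type. The threshold values and weed-type categories are assumptions for the sketch.

```python
def weed_based_commands(weed_intensity: float, weed_type: str,
                        lower_threshold: float = 0.3,
                        raise_threshold: float = 0.1):
    """Illustrative weed-driven commands for passive and active end dividers."""
    commands = []
    # Passive end dividers 146, 150: lower when weeds are heavy to reduce
    # hairpinning, raise again when weed intensity falls off.
    if weed_intensity > lower_threshold:
        commands.append(("passive", "lower"))
    elif weed_intensity < raise_threshold:
        commands.append(("passive", "raise"))
    # Active end dividers 147: viny weeds wrap more readily, so slow more
    # aggressively; tall, stiff weeds such as ragweed may call for more speed.
    if weed_type == "viny":
        commands.append(("active", "slow_aggressively"))
    elif weed_type == "tall_stiff":
        commands.append(("active", "increase_speed"))
    return commands

print(weed_based_commands(weed_intensity=0.5, weed_type="viny"))
```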
These are merely some examples.
If the ear orientation is hanging down, control signal generator 250 generates control signals to slow down the rotation of active end dividers 147. If the ear orientation is up, control signal generator 250 generates control signals to speed up the rotation of the active end dividers or keep the active end dividers at a normal rotation speed. If the ear orientation is outward or perpendicular to the stalk, control signal generator 250 generates control signals to slow down (or perhaps even stop) the rotation of the active end dividers 147.
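As a final non-limiting sketch, the ear orientation rules above amount to a simple mapping from the dominant ear orientation to a rotation command for the active end dividers 147; the label strings are hypothetical.

```python
EAR_ORIENTATION_TO_ROTATION = {
    "hanging_down": "slow",
    "upright": "normal_or_faster",
    "outward": "slow_or_stop",
}

def rotation_command_for_ears(dominant_orientation: str) -> str:
    """Map the dominant ear orientation to an active end divider command.

    The mapping mirrors the description above; unknown orientations fall back
    to normal rotation as an assumption for this sketch.
    """
    return EAR_ORIENTATION_TO_ROTATION.get(dominant_orientation, "normal_or_faster")

print(rotation_command_for_ears("hanging_down"))
```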
It can thus be seen that the present description describes a system in which the end dividers 146 and 150 can be controlled through an operator input from an operator compartment 103 of an agricultural harvester 100, 120. The present description also describes a system in which the position of the end dividers 146 and 150 can be automatically controlled based upon a wide variety of different sensed inputs or operator inputs.
The present discussion has mentioned processors and servers. In one example, the processors and servers include computer processors with associated memory and timing circuitry, not separately shown. The processors or servers are functional parts of the systems or devices to which they belong and are activated by, and facilitate the functionality of, the other components or items in those systems.
Also, a number of user interface displays have been discussed. The user interface displays can take a wide variety of different forms and can have a wide variety of different user actuatable input mechanisms disposed thereon. For instance, the user actuatable input mechanisms can be text boxes, check boxes, icons, links, drop-down menus, search boxes, etc. The user actuatable input mechanisms can also be actuated in a wide variety of different ways. For instance, they can be actuated using a point and click device (such as a track ball or mouse). The user actuatable input mechanisms can be actuated using hardware buttons, switches, a joystick or keyboard, thumb switches or thumb pads, etc. The user actuatable input mechanisms can also be actuated using a virtual keyboard or other virtual actuators. In addition, where the screen on which the user actuatable input mechanisms are displayed is a touch sensitive screen, they can be actuated using touch gestures. Also, where the device that displays them has speech recognition components, they can be actuated using speech commands.
A number of data stores have also been discussed. It will be noted that the data stores can each be broken into multiple data stores. All can be local to the systems accessing them, all can be remote, or some can be local while others are remote. All of these configurations are contemplated herein.
Also, the figures show a number of blocks with functionality ascribed to each block. It will be noted that fewer blocks can be used so the functionality is performed by fewer components. Also, more blocks can be used with the functionality distributed among more components.
It will be noted that the above discussion has described a variety of different systems, components and/or logic. It will be appreciated that such systems, components and/or logic can be comprised of hardware items (such as processors and associated memory, or other processing components, some of which are described below) that perform the functions associated with those systems, components and/or logic. In addition, the systems, components and/or logic can be comprised of software that is loaded into a memory and is subsequently executed by a processor or server, or other computing component, as described below. The systems, components and/or logic can also be comprised of different combinations of hardware, software, firmware, etc., some examples of which are described below. These are only some examples of different structures that can be used to form the systems, components and/or logic described above. Other structures can be used as well.
In the example shown in
It will also be noted that the elements of the previous FIGS. (e.g.,
In other examples, applications can be received on a removable Secure Digital (SD) card that is connected to an interface 15. Interface 15 and communication links 13 communicate with a processor 17 (which can also embody processors or servers from the previous FIGS.) along a bus 19 that is also connected to memory 21 and input/output (I/O) components 23, as well as clock 25 and location system 27.
I/O components 23, in one example, are provided to facilitate input and output operations. I/O components 23 for various examples of the device 16 can include input components such as buttons, touch sensors, optical sensors, microphones, touch screens, proximity sensors, accelerometers, and orientation sensors, and output components such as a display device, a speaker, and/or a printer port. Other I/O components 23 can be used as well.
Clock 25 illustratively comprises a real time clock component that outputs a time and date. It can also, illustratively, provide timing functions for processor 17.
Location system 27 illustratively includes a component that outputs a current geographical location of device 16. This can include, for instance, a global positioning system (GPS) receiver, a LORAN system, a dead reckoning system, a cellular triangulation system, or other positioning system. Memory 21 can also include, for example, mapping software or navigation software that generates desired maps, navigation routes and other geographic functions.
Memory 21 stores operating system 29, network settings 31, applications 33, application configuration settings 35, data store 37, communication drivers 39, and communication configuration settings 41. Memory 21 can include all types of tangible volatile and non-volatile computer-readable memory devices. Memory 21 can also include computer storage media (described below). Memory 21 stores computer readable instructions that, when executed by processor 17, cause the processor to perform computer-implemented steps or functions according to the instructions. Processor 17 can be activated by other components to facilitate their functionality as well.
Note that other forms of the devices 16 are possible.
Computer 810 typically includes a variety of computer readable media. Computer readable media can be any available media that can be accessed by computer 810 and includes both volatile and nonvolatile media, removable and non-removable media. By way of example, and not limitation, computer readable media may comprise computer storage media and communication media. Computer storage media is different from, and does not include, a modulated data signal or carrier wave. It includes hardware storage media including both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computer 810. Communication media may embody computer readable instructions, data structures, program modules or other data in a transport mechanism and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
The system memory 830 includes computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) 831 and random access memory (RAM) 832. A basic input/output system 833 (BIOS), containing the basic routines that help to transfer information between elements within computer 810, such as during start-up, is typically stored in ROM 831. RAM 832 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 820. By way of example, and not limitation,
The computer 810 may also include other removable/non-removable volatile/nonvolatile computer storage media. By way of example only,
Alternatively, or in addition, the functionality described herein can be performed, at least in part, by one or more hardware logic components. For example, and without limitation, illustrative types of hardware logic components that can be used include Field-programmable Gate Arrays (FPGAs), Application-specific Integrated Circuits (e.g., ASICs), Application-specific Standard Products (e.g., ASSPs), System-on-a-chip systems (SOCs), Complex Programmable Logic Devices (CPLDs), etc.
The drives and their associated computer storage media discussed above and illustrated in
A user may enter commands and information into the computer 810 through input devices such as a keyboard 862, a microphone 863, and a pointing device 861, such as a mouse, trackball or touch pad. Other input devices (not shown) may include a joystick, game pad, satellite dish, scanner, or the like. These and other input devices are often connected to the processing unit 820 through a user input interface 860 that is coupled to the system bus, but may be connected by other interface and bus structures. A visual display 891 or other type of display device is also connected to the system bus 821 via an interface, such as a video interface 890. In addition to the monitor, computers may also include other peripheral output devices such as speakers 897 and printer 896, which may be connected through an output peripheral interface 895.
The computer 810 is operated in a networked environment using logical connections (such as a controller area network—CAN, local area network—LAN, or wide area network WAN) to one or more remote computers, such as a remote computer 880.
When used in a LAN networking environment, the computer 810 is connected to the LAN 871 through a network interface or adapter 870. When used in a WAN networking environment, the computer 810 typically includes a modem 872 or other means for establishing communications over the WAN 873, such as the Internet. In a networked environment, program modules may be stored in a remote memory storage device.
It should also be noted that the different examples described herein can be combined in different ways. That is, parts of one or more examples can be combined with parts of one or more other examples. All of this is contemplated herein.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.