The present disclosure generally relates to systems and methods for providing a visual indicator of the surface profile of a field and, more particularly, to systems and methods for providing a visual indicator of the surface profile of a field in association with displayed image data of the field during the performance of an agricultural operation.
Agricultural implements, such as planters, seeders, tillage implements, and/or the like, are typically configured to perform an agricultural operation within a field, such as a planting/seeding operation, a tillage operation, and/or the like. When performing such agricultural operations, it is desirable to be able to adjust the operation of the implement to account for variations in the surface profile of the field that could potentially impact the effectiveness and/or efficiency of the operation. In this regard, systems have been developed that allow the surface profile of the field to be determined as the implement is traveling across the field. Adjustments to the operation of the implement may then be made based on the determined surface profile. These systems typically include cameras that capture images of the field. The images are then automatically analyzed using image processing techniques to assess the surface profile of the portion of the field present within the captured images. However, such image processing often requires significant processing power.
Accordingly, an improved system and method for providing a visual indicator of the surface profile of a field would be welcomed in the technology.
Aspects and advantages of the technology will be set forth in part in the following description, or may be obvious from the description, or may be learned through practice of the technology.
In one aspect, the present subject matter is directed to a system for providing a visual indication of field surface profile as an agricultural implement is moved across a field by a work vehicle. The system may include a user interface and an imaging device provided in operative association with one of the work vehicle or the agricultural implement such that the imaging device has a field of view directed to a portion of the field. Furthermore, the system may include a profile sensor provided in operative association with one of the work vehicle or the implement such that the profile sensor is configured to capture profile data indicative of a surface profile of the portion of the field present within the field of view of the imaging device. Additionally, the system may include a controller communicatively coupled to the user interface, the imaging device, and the profile sensor. As such, the controller may be configured to receive, from the imaging device, image data associated with an imaged portion of the field. Moreover, the controller may be configured to determine the surface profile of the imaged portion of the field based on the profile data. In addition, the controller may be configured to control the user interface to display a visual indicator in association with the image data, with the visual indicator providing a visual reference indicative of the determined surface profile of the imaged portion of the field.
In another aspect, the present subject matter is directed to a method for providing a visual indication of field surface profile as an agricultural implement is moved across a field by a work vehicle. The method may include receiving, with one or more computing devices, image data associated with an imaged portion of the field. Furthermore, the method may include receiving, with the one or more computing devices, profile data indicative of a surface profile of the imaged portion of the field. Moreover, the method may include determining, with the one or more computing devices, the surface profile of the imaged portion of the field based on the received profile data. Additionally, the method may include controlling, with the one or more computing devices, a user interface to display a visual indicator in association with the image data, with the visual indicator providing a visual reference indicative of the determined surface profile of the imaged portion of the field.
These and other features, aspects and advantages of the present technology will become better understood with reference to the following description and appended claims. The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the technology and, together with the description, serve to explain the principles of the technology.
A full and enabling disclosure of the present technology, including the best mode thereof, directed to one of ordinary skill in the art, is set forth in the specification, which makes reference to the appended figures, in which:
Repeat use of reference characters in the present specification and drawings is intended to represent the same or analogous features or elements of the present technology.
Reference now will be made in detail to embodiments of the invention, one or more examples of which are illustrated in the drawings. Each example is provided by way of explanation of the invention, not limitation of the invention. In fact, it will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the scope or spirit of the invention. For instance, features illustrated or described as part of one embodiment can be used with another embodiment to yield a still further embodiment. Thus, it is intended that the present invention covers such modifications and variations as come within the scope of the appended claims and their equivalents.
In general, the present subject matter is directed to systems and methods for providing a visual indication of the surface profile of a field to an operator as an agricultural implement is moved across the field by a work vehicle. Specifically, in several embodiments, a controller of the disclosed system may be configured to receive image data associated with an imaged portion of the field from an imaging device (e.g., a camera) provided in operative association with the implement or the vehicle. Furthermore, the controller may be configured to receive profile data associated with the imaged portion of the field from a profile sensor (e.g., a LIDAR sensor). Thereafter, the controller may be configured to determine the surface profile of the imaged portion of the field based on the received profile data.
In accordance with aspects of the present subject matter, the controller may be configured to control the operation of a user interface to display a visual indicator in association with the received image data. Specifically, the visual indicator may generally be an object, such as a line or number, that provides a visual reference indicative of the determined surface profile of the imaged portion of the field. For example, in one embodiment, the controller may be configured to layer the visual indicator (e.g., a line showing the surface profile of the imaged portion of the field) onto the received image data to form layered image data. In such an embodiment, the controller may then transmit the layered image data to the user interface such that the visual indicator is superimposed over the image data being displayed on the user interface. Alternatively, the visual indicator may be displayed on the user interface adjacent to the image data. Thereafter, the operator may determine from the combination of the displayed image data and the visual indicator whether the surface profile of the field is acceptable. In instances when the surface profile is unacceptable, the operator may provide an input to the controller (e.g., via the user interface) associated with adjusting one or more operating parameters of the implement and/or vehicle.
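By way of illustration only, the layering operation described above may be sketched as follows, assuming the image data is an RGB pixel array and the determined surface profile has already been scaled to pixel-row coordinates; the function name and array conventions are merely assumptions of the sketch and do not limit the present subject matter.

```python
import numpy as np

def layer_profile_indicator(image: np.ndarray, profile_rows: np.ndarray,
                            color=(255, 255, 0)) -> np.ndarray:
    """Superimpose a profile line onto an H x W x 3 uint8 image.

    `profile_rows` holds one surface-height sample per sampled column,
    already converted to pixel-row coordinates (larger = lower in frame).
    """
    layered = image.copy()
    h, w, _ = layered.shape
    cols = np.arange(w)
    # Resample the profile so there is one row value per image column.
    rows = np.interp(cols, np.linspace(0, w - 1, len(profile_rows)),
                     profile_rows)
    rows = np.clip(np.round(rows).astype(int), 0, h - 1)
    layered[rows, cols] = color  # draw the indicator line over the image
    return layered
```

The layered array may then be transmitted to the user interface in place of the raw image, which is what superimposing the indicator over the displayed image data amounts to in practice.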
Thus, the disclosed systems and methods enable an operator to efficiently and more accurately assess field surface conditions, which, in turn, improves control of the work vehicle and/or implement to obtain the desired surface conditions within the field and, as a result, leads to superior agricultural outcomes. Additionally, by determining the surface profile based on the profile data rather than the image data, the disclosed systems and methods require less processing power and memory than vision-based approaches.
Referring now to the drawings,
As particularly shown in
Moreover, as shown in
As particularly shown in
Additionally, as shown in
Moreover, similar to the central and forward frames 40, 42, the aft frame 44 may also be configured to support a plurality of ground-engaging tools. For instance, in the illustrated embodiment, the aft frame 44 is configured to support a plurality of leveling blades 52 and rolling (or crumbler) basket assemblies 54. However, in other embodiments, any other suitable ground-engaging tools may be coupled to and supported by the aft frame 44, such as a plurality of closing disks.
In addition, the implement 12 may also include any number of suitable actuators (e.g., hydraulic cylinders) for adjusting the relative positioning, penetration depth, and/or down force associated with the various ground-engaging tools 46, 50, 52, 54. For instance, the implement 12 may include one or more first actuators 56 coupled to the central frame 40 for raising or lowering the central frame 40 relative to the ground, thereby allowing the penetration depth of and/or the force being applied to the shanks 46 to be adjusted. Similarly, the implement 12 may include one or more second actuators 58 coupled to the forward frame 42 to adjust the penetration depth and/or the down pressure of the disk blades 50. Moreover, the implement 12 may include one or more third actuators 60 coupled to the aft frame 44 to allow the aft frame 44 to be moved relative to the central frame 40, thereby allowing the relevant operating parameters of the ground-engaging tools 52, 54 supported by the aft frame 44 (e.g., the down pressure and/or the penetration depth) to be adjusted.
It should be appreciated that the configuration of the work vehicle 10 described above and shown in
It should also be appreciated that the configuration of the implement 12 described above and shown in
Additionally, in accordance with aspects of the present subject matter, the vehicle/implement 10/12 may include one or more imaging devices coupled thereto and/or mounted thereon. As will be described below, each imaging device may be configured to capture image data associated with a portion of the field across which the vehicle/implement 10/12 is traveling. Such image data may, in turn, be indicative of the visual appearance of the field. As such, in several embodiments, the imaging device(s) may be provided in operative association with the vehicle/implement 10/12 such that the associated device(s) has a field of view or detection range directed towards a portion(s) of the field adjacent to the vehicle/implement 10/12. For example, as shown in
Furthermore, the vehicle/implement 10/12 may include one or more profile sensors coupled thereto and/or mounted thereon. As will be described below, each profile sensor may be configured to capture profile data associated with a portion of the field across which the vehicle/implement 10/12 is traveling. Such profile data may, in turn, be indicative of the surface profile of the field. As such, in several embodiments, the profile sensor(s) may be provided in operative association with the vehicle/implement 10/12 such that the associated sensor(s) has a field of view or sensor detection range directed towards a portion(s) of the field adjacent to the vehicle/implement 10/12. For example, as shown in
Referring now to
Moreover, in several embodiments, the profile sensor 104 may correspond to any suitable device(s) configured to capture data indicative of the surface profile of the field. Specifically, in several embodiments, the profile sensor 104 may be configured as a light detection and ranging (LIDAR) sensor. In such embodiments, as the vehicle/implement 10/12 travels across the field, the profile sensor 104 may be configured to emit one or more light/laser output signals (e.g., as indicated by arrows 110 in
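Although the remainder of this passage refers to the drawings, a LIDAR-based profile sensor generally determines range from the round-trip travel time of each emitted signal. The following is a minimal illustrative sketch of that computation, assuming each beam's angle is measured from vertical and the sensor's mounting height above a reference datum is known; these names and conventions are assumptions for illustration only.

```python
import math

SPEED_OF_LIGHT_M_S = 299_792_458.0  # propagation speed of the laser pulse

def scan_line_heights(flight_times_s, beam_angles_rad, sensor_height_m):
    """Convert per-beam round-trip flight times into surface heights.

    Each beam's one-way range is (time * c) / 2; projecting that range
    along the beam angle yields the vertical drop from the sensor to the
    soil surface, and subtracting the drop from the mounting height gives
    the surface height relative to the sensor's datum.
    """
    heights = []
    for t, angle in zip(flight_times_s, beam_angles_rad):
        one_way_range = t * SPEED_OF_LIGHT_M_S / 2.0
        vertical_drop = one_way_range * math.cos(angle)
        heights.append(sensor_height_m - vertical_drop)
    return heights
```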
Referring now to
In several embodiments, the system 100 may include a controller 114 and various other components configured to be communicatively coupled to and/or controlled by the controller 114, such as one or more imaging devices 102, one or more profile sensors 104, and/or various components of the work vehicle 10 and/or the implement 12. As will be described in greater detail below, the controller 114 may be configured to receive images or other image data from the imaging device(s) 102 that depict portions of the field as an operation (e.g., a tillage operation) is being performed within the field. Furthermore, the controller 114 may be configured to receive profile data from the profile sensor(s) that is indicative of the surface profile of the imaged portion(s) of the field as the operation is being performed. In this regard, the controller 114 may be configured to analyze the received profile data to estimate or determine the surface profile(s) of the imaged portion(s) of the field. Thereafter, the controller 114 may be configured to control a user interface 116 to display a visual indicator(s) in association with the received image data, with such visual indicator(s) providing a visual reference(s) indicative of the determined surface profile(s) of the imaged portion(s) of the field. For example, in one embodiment, the visual indicator(s) may be superimposed onto the image data. Alternatively, the visual indicator(s) may be displayed adjacent to the image data. The operator may determine from the combination of the displayed image data and visual indicator(s) whether the surface profile of the field is acceptable. In instances when the surface profile is unacceptable, the operator may provide an input to the controller 114 (e.g., via the user interface 116) associated with adjusting one or more operating parameters of the vehicle 10 and/or implement 12.
In several embodiments, the user interface 116 may be configured to display the image data depicting the field and the visual indicator(s) associated with such data to the operator of the vehicle/implement 10/12. As such, the user interface 116 may include one or more feedback devices (not shown), such as display screens, configured to display the image data and visual indicator(s) to the operator. In addition, some embodiments of the user interface 116 may include one or more input devices (not shown), such as touchscreens, keypads, touchpads, knobs, buttons, sliders, switches, mice, microphones, and/or the like, which are configured to receive user inputs from the operator. In one embodiment, the user interface 116 may be mounted or otherwise positioned within the cab 20 of the vehicle 10. However, in alternative embodiments, the user interface 116 may be mounted at any other suitable location.
In general, the controller 114 may correspond to any suitable processor-based device(s), such as a computing device or any combination of computing devices. Thus, as shown in
In several embodiments, the data 121 may be stored in one or more databases. Specifically, the memory 120 may include an image database 124 for storing images or other image data received from the imaging device(s) 102. For example, the imaging device(s) 102 may be configured to continuously or periodically capture images of adjacent portion(s) of the field as an agricultural operation is being performed on the field. In such an embodiment, the images transmitted to the controller 114 from the imaging device(s) 102 may be stored within the image database 124 for subsequent processing and transmission to the user interface 116. It should be appreciated that, as used herein, the term image data may include any suitable type of data received from the imaging device(s) 102 that depicts the visual appearance of the field, including photographs and other image-related data.
In addition, as shown in
Moreover, in several embodiments, the memory 120 may also include a location database 128 storing location information about the work vehicle/implement 10/12 and/or information about the field being processed (e.g., a field map). Specifically, as shown in
Additionally, in several embodiments, the location data stored within the location database 128 may also be correlated to the images stored within the image database 124 and/or the profile data stored within the surface profile database 126. For instance, in one embodiment, the location coordinates derived from the positioning device(s) 130, the images captured by the imaging device(s) 102, and the profile data captured by the profile sensor(s) 104 may each be time-stamped. In such an embodiment, the time-stamped data may allow each image captured by the imaging device(s) 102 and each surface profile measurement (e.g., each captured data point scan line) captured by the profile sensor(s) 104 to be matched or correlated to a corresponding set of location coordinates received from the positioning device(s) 130, thereby allowing the precise location of the portion of the field depicted within a given image as well as the precise location of a given surface profile determination to be known (or at least capable of calculation) by the controller 114.
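A minimal sketch of such time-stamp matching is provided below for illustration; it assumes each data stream is buffered as (timestamp, payload) tuples sorted by timestamp, which is a convenience of the sketch rather than a requirement of the system.

```python
import bisect

def match_by_timestamp(target_ts, stamped_records):
    """Return the (timestamp, payload) record closest in time to target_ts.

    `stamped_records` must be non-empty and sorted by timestamp, e.g.,
    location fixes from the positioning device(s) 130 or scan lines from
    the profile sensor(s) 104.
    """
    timestamps = [ts for ts, _ in stamped_records]
    i = bisect.bisect_left(timestamps, target_ts)
    # The nearest record is just before or just after the insertion point.
    candidates = stamped_records[max(0, i - 1):i + 1]
    return min(candidates, key=lambda rec: abs(rec[0] - target_ts))
```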
Moreover, by matching each image and each surface profile measurement to a corresponding set of location coordinates, the controller 114 may also be configured to generate or update a corresponding field map associated with the field being processed. For example, in instances in which the controller 114 already includes a field map stored within its memory 120 that includes location coordinates associated with various points across the field, the surface profile value(s) determined from the profile data captured by the profile sensor(s) 104 may be mapped or correlated to a given location within the field map. Alternatively, based on the location data and the associated profile data, the controller 114 may be configured to generate a field map for the field that includes the geo-located surface profile measurements associated therewith.
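For illustration only, one very simple bookkeeping scheme for geo-locating profile measurements is sketched below; a production system would more likely use a dedicated GIS layer, and the rounded-coordinate grid cells here are purely an assumption of the sketch.

```python
def update_field_map(field_map, latitude, longitude, profile_value,
                     cell_decimals=5):
    """Record a geo-located surface-profile value in a grid-based field map.

    Coordinates are rounded to form grid cells (five decimal places is
    roughly one meter at mid-latitudes); each cell keeps the latest value.
    """
    cell = (round(latitude, cell_decimals), round(longitude, cell_decimals))
    field_map[cell] = profile_value
    return field_map
```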
Additionally, the controller 114 may also be configured to match or correlate each surface profile determination with the corresponding image of the field. As mentioned above, the profile sensor(s) 104 may capture profile data that is indicative of the surface profile of the portion(s) of the field imaged by the imaging device(s) 102. In this regard, each surface profile determination may correspond to a specific captured image. As such, by matching each image and each surface profile measurement to a corresponding set of location coordinates, the controller 114 may be configured to match or correlate each surface profile determination to the corresponding captured image. Thereafter, a visual indicator(s) indicative of the determined surface profile(s) may be displayed in association with the corresponding image data on the user interface 116 to the operator.
Referring still to
Furthermore, the visual indicator display module 132 may be configured to generate one or more visual indicators based on the determined surface profile(s). In general, the visual indicator(s) may provide a visual reference(s) indicative of the determined surface profile(s) of the imaged portion of the field. For example, in one embodiment, the visual indicator(s) may be superimposed onto the image data for display on the user interface 116. Specifically, in such an embodiment, the visual indicator display module 132 may be configured to layer the visual indicator(s) onto the image data and transmit the layered data to the user interface 116 for display thereon. Alternatively, the visual indicator(s) may be displayed adjacent to the image data on the user interface 116. Additionally, the statistical parameter(s) may be displayed on the user interface 116 adjacent to the displayed image data and/or visual indicator(s).
It should be appreciated that the visual indicator(s) may correspond to any suitable visual display object(s) configured to provide the operator with an indication of the surface profile of the portion of the field depicted in the displayed image data. For example, in one embodiment, the visual indicator(s) may be a line(s) depicting the determined profile(s) of the portion(s) of the field depicted in the displayed image data. That is, the line(s) may have the same shape as the determined surface profile(s). Moreover, such line(s) may be superimposed onto the image data or displayed adjacent to the image data. In another embodiment, the visual indicator(s) may be numbers or values associated with the determined surface profile(s). For instance, such numbers/values may be the minimum, maximum, and/or average amplitude of the surface profile; the frequency of the surface profile; the period of the surface profile; and/or the like. However, in alternative embodiments, the visual indicator(s) may correspond to any suitable visual display object(s), such as a color(s) or symbol(s).
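By way of example, the amplitude, frequency, and period values mentioned above might be computed from a sampled profile as in the following sketch, which assumes uniformly spaced height samples; the FFT-based estimate of the dominant period is one possible approach among many, not a required implementation.

```python
import numpy as np

def profile_statistics(heights: np.ndarray, sample_spacing_m: float) -> dict:
    """Summarize a sampled surface profile as simple numeric indicators."""
    centered = heights - heights.mean()
    stats = {
        "min_amplitude_m": float(centered.min()),
        "max_amplitude_m": float(centered.max()),
        "avg_amplitude_m": float(np.abs(centered).mean()),
    }
    # Estimate the dominant spatial frequency of the ridge/valley pattern.
    spectrum = np.abs(np.fft.rfft(centered))
    freqs = np.fft.rfftfreq(len(centered), d=sample_spacing_m)
    peak = freqs[np.argmax(spectrum[1:]) + 1]  # skip the zero-frequency bin
    stats["frequency_cycles_per_m"] = float(peak)
    stats["period_m"] = float(1.0 / peak) if peak > 0 else float("inf")
    return stats
```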
Referring now to
Referring now to
Additionally, as shown in
Referring again to
It should be appreciated that the controller 114 may be configured to implement various control actions to adjust the operation of the work vehicle 10 and/or the implement 12 in a manner that adjusts the surface profile of the field. In one embodiment, the controller 114 may be configured to increase or decrease the operational or ground speed of the implement 12 to effect an adjustment in the surface profile of the field. For instance, as shown in
In addition to adjusting the ground speed of the vehicle/implement 10/12 (or as an alternative thereto), the controller 114 may also be configured to adjust one or more operating parameters associated with the ground-engaging tools of the implement 12. For instance, as shown in
Moreover, as shown in
Referring now to
As shown in
In some embodiments, the image data obtained at (202) may include a single image frame. Thus, in some embodiments, the method 200 may be performed iteratively for each new image frame as such image frame is received. For example, method 200 may be performed iteratively in real-time as new images are received from the imaging device(s) 102, while the imaging device(s) 102 are moved throughout the field (e.g., as a result of being installed on the vehicle 10 or the implement 12).
In other embodiments, the image data obtained at (202) may include a plurality of image frames. In such embodiments, the plurality of image frames may be concatenated or otherwise combined and processed as a single batch (e.g., by way of a single performance of method 200 over the batch). For example, in one embodiment, image frames from several imaging devices 102 may be concatenated to form a single image frame depicting a portion of the field aft of the implement 12 and extending the entire width of the implement 12 along a lateral direction.
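As an illustrative sketch, such lateral concatenation can be as simple as the following, assuming each frame is an H x W x 3 pixel array of the same height and the imaging devices 102 are ordered left to right across the implement 12:

```python
import numpy as np

def concatenate_frames(frames):
    """Stack side-by-side camera frames into one wide frame along the
    lateral (width) axis; all frames must share the same pixel height."""
    return np.concatenate(frames, axis=1)
```

The combined frame can then flow through the remainder of the method 200 exactly as a single-camera frame would.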
Furthermore, at (204), the method 200 may include receiving, from a profile sensor, profile data associated with the imaged portion of the field as the agricultural implement is moved across the field by the work vehicle. As described above, the vehicle/implement 10/12 may include one or more profile sensor(s) 104 (e.g., a LIDAR sensor(s)), with each profile sensor 104 configured to capture profile data indicative of the surface profile of the portion of the field within the field of view 108 of one of the imaging devices 102. In this regard, as the vehicle/implement 10/12 travels across the field to perform the agricultural operation thereon (e.g., a tillage operation), the controller 114 may be configured to receive the captured profile data from the profile sensor(s) 104 (e.g., via the communicative link 158). As will be described below, the controller 114 may be configured to analyze the received profile data to determine the surface profile of the imaged portion(s) of the field.
In some embodiments, the profile data received at (204) may be a plurality of individual data point scan lines (e.g., each representing the vertical profile of a 2-D swath of the field associated with the scan line). Thus, in some embodiments, the method 200 may be performed iteratively for each new data scan line as such scan line is received. For example, method 200 may be performed iteratively in real-time as new data scan lines are received from the profile sensor(s) 104, while the profile sensor(s) 104 is moved throughout the field (e.g., as a result of being installed on the vehicle 10 or the implement 12). Alternatively, the profile data received at (204) may include a data point cloud associated with a 3-D portion of the field (e.g., from a profile sensor 104 capable of scanning a 3-D portion of the field).
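For the 3-D case, one illustrative way to reduce a point cloud to a scan-line-style profile is to bin the points laterally and average the heights within each bin, as sketched below under the assumption of an N x 3 array ordered (lateral, forward, height); the binning scheme is an example only.

```python
import numpy as np

def point_cloud_to_profile(points: np.ndarray, n_bins: int = 100) -> np.ndarray:
    """Collapse an N x 3 point cloud into a 2-D profile of mean heights.

    Points are grouped into `n_bins` lateral bins; bins that receive no
    returns come back as NaN so downstream code can interpolate over them.
    """
    x, z = points[:, 0], points[:, 2]
    edges = np.linspace(x.min(), x.max(), n_bins + 1)
    idx = np.clip(np.digitize(x, edges) - 1, 0, n_bins - 1)
    sums = np.bincount(idx, weights=z, minlength=n_bins)
    counts = np.bincount(idx, minlength=n_bins)
    with np.errstate(invalid="ignore"):
        return sums / counts  # mean height per bin (NaN where empty)
```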
Additionally, as shown in
Furthermore, at (208), the method 200 may include controlling a user interface to display a visual indicator in association with the image data. For example, as described above, the visual indicator display module 132 of the controller 114 may, in accordance with aspects of the present subject matter, be configured to determine one or more visual indicators based on the determined surface profile(s) and control the user interface 116 to display the visual indicator(s) in association with the image data. Moreover, in several embodiments, at (208), the controller 114 may be configured to receive one or more inputs from the operator (e.g., via the interface elements 144, 146, 148) associated with adjusting the type, size, and/or position of the visual indicator(s).
In addition, as shown in
Furthermore, at (212), the method may include controlling the operation of at least one of the agricultural implement or the work vehicle based on the received control action input. For example, as indicated above, in several embodiments, the control module 150 of the controller 114 may be configured to adjust one or more operating parameters of the vehicle 10 and/or the implement 12, such as the ground speed of the vehicle/implement 10/12 and/or the force(s) applied to the ground-engaging tool(s) (e.g., the leveling blades 52) of the implement 12, in a manner that adjusts the surface profile of the field.
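A minimal sketch of how such a control action input might map onto parameter adjustments is given below; the state fields, action names, and step sizes are hypothetical placeholders for the drive and valve interfaces, which this description does not specify.

```python
from dataclasses import dataclass

@dataclass
class OperatingState:
    ground_speed_kph: float   # vehicle/implement ground speed
    down_pressure_kpa: float  # force applied to the leveling tools

def apply_control_action(state: OperatingState, action: str,
                         speed_step=0.5, pressure_step=5.0) -> OperatingState:
    """Nudge one operating parameter in response to an operator input.

    A real controller would command the drives/valves through the
    communicative links described above, with limits and ramping.
    """
    if action == "decrease_speed":
        state.ground_speed_kph = max(0.0, state.ground_speed_kph - speed_step)
    elif action == "increase_speed":
        state.ground_speed_kph += speed_step
    elif action == "increase_down_pressure":
        state.down_pressure_kpa += pressure_step
    elif action == "decrease_down_pressure":
        state.down_pressure_kpa = max(0.0, state.down_pressure_kpa - pressure_step)
    return state
```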
It is to be understood that the steps of the method 200 are performed by the controller 114 upon loading and executing software code or instructions which are tangibly stored on a tangible computer readable medium, such as on a magnetic medium, e.g., a computer hard drive, an optical medium, e.g., an optical disc, solid-state memory, e.g., flash memory, or other storage media known in the art. Thus, any of the functionality performed by the controller 114 described herein, such as the method 200, is implemented in software code or instructions which are tangibly stored on a tangible computer readable medium. The controller 114 loads the software code or instructions via a direct interface with the computer readable medium or via a wired and/or wireless network. Upon loading and executing such software code or instructions by the controller 114, the controller 114 may perform any of the functionality of the controller 114 described herein, including any steps of the method 200 described herein.
The term “software code” or “code” used herein refers to any instructions or set of instructions that influence the operation of a computer or controller. They may exist in a computer-executable form, such as machine code, which is the set of instructions and data directly executed by a computer's central processing unit or by a controller, a human-understandable form, such as source code, which may be compiled in order to be executed by a computer's central processing unit or by a controller, or an intermediate form, such as object code, which is produced by a compiler. As used herein, the term “software code” or “code” also includes any human-understandable computer instructions or set of instructions, e.g., a script, that may be executed on the fly with the aid of an interpreter executed by a computer's central processing unit or by a controller.
This written description uses examples to disclose the technology, including the best mode, and also to enable any person skilled in the art to practice the technology, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the technology is defined by the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they include structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.