The present subject matter relates generally to systems and methods for real time imaging and surveilling of agricultural machines and, more particularly, to a system and method for producing virtualized views external to a cabin of a work vehicle to aid in ascertaining performance of an agricultural implement.
Currently, operators of work vehicles, such as tractors and other agricultural vehicles, have a field of view from a cabin of the work vehicle that is relatively fixed. Namely, the operator can adjust his or her seating position or move his or her head to obtain a slightly different perspective, but generally the field of view is limited by the size and shape of the windows and view ports on the machine. In many instances, an operator may desire an alternate view of external equipment or of the performance of the work vehicle. In these instances, the operator typically must cease work, exit the cabin, and physically move about the agricultural machine.
Accordingly, a system and method for increasing visibility to aid in ascertaining performance while limiting the need for an operator to exit a cabin of a work vehicle would be welcomed in the technology.
Aspects and advantages of the invention will be set forth in part in the following description, or may be obvious from the description, or may be learned through practice of the invention.
In one aspect, the present subject matter is directed to a system for surveilling an agricultural machine. The system can include the agricultural machine. The agricultural machine can include a work vehicle having a cabin and being coupled to an agricultural implement. The system can also include an imaging system proximate the work vehicle. The imaging system can include at least two imaging devices configured to generate a plurality of images of the agricultural implement from at least two perspectives. The system can also include one or more processors configured to process the plurality of images to create a virtualized 3D view of the agricultural implement, and a display system within the cabin. The display system can be configured to display the virtualized 3D view of the agricultural implement.
In another aspect, the present subject matter is directed to a method of surveilling agricultural machines. The method can include generating a plurality of images of an agricultural implement coupled to a work vehicle. The plurality of images can be taken from at least two perspectives of the agricultural implement. The method can also include processing the plurality of images to create a virtualized 3D view of the agricultural implement. Furthermore, the method can include displaying the virtualized 3D view on a display within a cabin of the work vehicle.
According to another aspect, the present subject matter is directed to an apparatus for surveilling agricultural machines. The apparatus can include an imaging system configured to be arranged proximate a work vehicle. The imaging system can include at least two imaging devices configured to generate a plurality of images of an agricultural implement coupled to the work vehicle from at least two perspectives. The apparatus can further include one or more processors configured to process the plurality of images to create a virtualized 3D view of the agricultural implement. The apparatus can also include a display system configured to be mounted within a cabin of the work vehicle, the display system configured to display the virtualized 3D view of the agricultural implement.
These and other features, aspects and advantages of the present invention will become better understood with reference to the following description and appended claims. The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention.
A full and enabling disclosure of the present invention, including the best mode thereof, directed to one of ordinary skill in the art, is set forth in the specification, which makes reference to the appended figures, in which:
Reference now will be made in detail to embodiments of the invention, one or more examples of which are illustrated in the drawings. Each example is provided by way of explanation of the invention, not limitation of the invention. In fact, it will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the scope or spirit of the invention. For instance, features illustrated or described as part of one embodiment can be used with another embodiment to yield a still further embodiment. Thus, it is intended that the present invention covers such modifications and variations as come within the scope of the appended claims and their equivalents.
In general, the present subject matter is directed to systems, apparatuses, and methods for surveilling agricultural machines. For example, a system can include an agricultural machine. The agricultural machine can include a work vehicle having a cabin and being coupled to an agricultural implement. The particular work vehicle and agricultural implement are variable, but, in one embodiment, can include at least a work vehicle being operated by an operator, and an agricultural implement being coupled to and towed behind the work vehicle. In this manner, the operator may not have a forward view of the agricultural implement from the cabin during normal operation of the work vehicle.
The system can also include an imaging system proximate the work vehicle. The imaging system can include at least two imaging devices configured to generate a plurality of images of the agricultural implement from at least two perspectives. The imaging devices can be cameras mounted rearward of the cabin such that the at least two perspectives include overlapping fields of view. Accordingly, the overlapping fields of view may provide a larger viewing angle as compared to a single imaging device when considered together. This larger viewing angle may be virtualized or rendered in a virtual 3D view, such as an axonometric projection. The rendered axonometric projection can include any appropriate projection, including an isometric projection or a projection from other viewing angles of the agricultural implement.
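By way of illustration only, and not as part of the present subject matter, the final rendering step of such an axonometric projection can be sketched as follows. The function name and the use of NumPy are assumptions; a full pipeline would first reconstruct 3D structure from the overlapping camera views before projecting it onto the display plane.

```python
import numpy as np

def isometric_projection(points_3d):
    """Project 3D points onto a 2D plane using a standard isometric projection.

    Illustrative sketch only: a real imaging system would first recover 3D
    structure from the two overlapping views before this projection step.
    """
    # Classic isometric viewing angles: yaw the scene 45 degrees about the
    # vertical axis, then tilt by arcsin(tan(30 deg)) ~= 35.264 degrees.
    tilt = np.arcsin(np.tan(np.radians(30)))
    yaw = np.radians(45)
    rot_tilt = np.array([[1, 0, 0],
                         [0, np.cos(tilt), np.sin(tilt)],
                         [0, -np.sin(tilt), np.cos(tilt)]])
    rot_yaw = np.array([[np.cos(yaw), 0, -np.sin(yaw)],
                        [0, 1, 0],
                        [np.sin(yaw), 0, np.cos(yaw)]])
    rotated = points_3d @ (rot_tilt @ rot_yaw).T
    # Orthographic projection: drop the depth (z) coordinate.
    return rotated[:, :2]
```

An isometric projection is one convenient choice because all three axes are foreshortened equally; other axonometric angles would simply use different rotation parameters.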
The system can also include one or more processors configured to process the plurality of images to create the virtualized 3D view of the agricultural implement, and a display system within the cabin. The processors may process the images based on preconfigured algorithms configured to generate axonometric views, wireframe plots, or other views automatically. The display system can be configured to display the virtualized 3D view of the agricultural implement. Generally, the display system can include a display, such as a touchscreen display, allowing some interaction with the display and/or the virtualized 3D view.
In one embodiment, a computer-implemented graphical user interface may be provided to allow interaction with the virtualized 3D view. For example, at least partial rotation, change of perspective, and/or increase/decrease of the zoom level associated with the virtualized 3D view may be possible.
In another embodiment, an apparatus for surveilling an agricultural machine may also be provided. The apparatus may include the imaging system and display system described above, and may be configured to be installed on a work vehicle. In this manner, work vehicles and agricultural machines of many forms may be altered to include the features described herein.
Referring now to the drawings,
As shown in
In the particular arrangement illustrated in
As further shown in
With reference to both
The imaging devices 122 can be cameras mounted rearward R1 of the cabin 102 such that the at least two perspectives include overlapping fields of view 124. The overlapping fields of view may provide a larger viewing angle as compared to a single imaging device when considered together, which is described more fully with reference to
The system can also include an image processor 126 configured to process the plurality of images to create the virtualized 3D view of the agricultural implement, and a display system 128 within the cabin. The image processor 126 may include one or more processors or other units configured to perform various methods and operations. The image processor 126 or processors may process the images based on preconfigured algorithms configured to generate axonometric views, wireframe plots, or other views automatically. The display system 128 can be configured to display the virtualized 3D view of the agricultural implement 104 for viewing by an operator within the cabin 102. Generally, the display system 128 can include a display, such as a touchscreen display, allowing some interaction with the display and/or the virtualized 3D view by the operator. The display system 128 may also include a standalone display, a heads-up display, a display projected on windows of a cabin of the work vehicle, a flexible/foldable touchable layer overlaid on windows of a cabin of the work vehicle, or any other suitable display system.
As described above, a system for surveilling an agricultural machine may include an imaging system arranged to generate a plurality of images of an agricultural implement from at least two perspectives. The two perspectives may include at least partially overlapping fields of view such that a virtualized 3D view of the implement may be generated by the image processor or processors 126. Hereinafter, virtualized 3D views are discussed in detail with reference to
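As one hypothetical illustration of why overlapping fields of view enable 3D reconstruction, the pinhole stereo model recovers the depth Z of a point seen in both views from its pixel disparity d, the focal length f, and the camera baseline B, as Z = f·B/d. The helper below is a sketch under that textbook model, not a disclosure of any particular algorithm; all names and values are illustrative.

```python
def triangulate_depth(focal_px, baseline_m, disparity_px):
    """Depth (in meters) of a point seen in both overlapping views.

    Pinhole stereo model: Z = f * B / d, with focal length f in pixels,
    baseline B in meters, and disparity d in pixels. Illustrative only.
    """
    if disparity_px <= 0:
        # A point visible in both views must appear shifted between them.
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```

For example, with a 700-pixel focal length, a 0.5 m baseline between the two imaging devices, and a 35-pixel disparity, the point lies 10 m away.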
Through image processing, an axonometric view can be generated having a portion of the field of view 308′ and 308″ projected upon a two-dimensional viewing surface, such as a display. For example,
Accordingly, instead of having a ‘fish-eye’ or distorted view of the object 304, a relatively undistorted view 312 can be provided that can be at least partially manipulated to aid in operating the object 304 or an implement being surveilled. For example,
Through the image processing described above, an axonometric view can be generated having a portion of the field of view 408′ and 408″ projected upon a two-dimensional viewing surface 410. Thus, when viewed on a display, such as the display system 128, only a portion of the entire field of view 408′ and 408″ is provided. However, this virtualized 3D view 412 may be at least partially rotated about the power coupling 108. This provides an operator with views of the implement 104, the hitch assembly 106, the surface 112, and the worked surface features 420 and 422.
Accordingly, instead of having a ‘fish-eye’ or distorted view of the implement 104, a relatively undistorted view 412 can be provided. The view 412 can aid an operator in ascertaining the performance of the implement 104 through views of the worked surface features 420 and 422, as well as determine if there are disconnections or assembly issues in the power coupling 108 and the hitch assembly 106 during operation of the working vehicle 101.
It is further noted that the plurality of images can include a video feed of the agricultural implement 104. For example, the imaging devices 122′ and 122″ may be cameras generating video of the implement 104. Thus, the virtualized 3D view 412 can also include a virtualized 3D video of the agricultural implement during operation. It should be readily understood that techniques similar to those used to provide static virtualized 3D views may be used to generate the virtualized 3D video.
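In other words, a virtualized 3D video can be obtained by applying the same static-view processing to each synchronized frame pair. A minimal sketch of that per-frame reuse, with all names hypothetical and the processing step abstracted as a callable:

```python
def virtualize_video(left_frames, right_frames, process_pair):
    """Yield a virtualized 3D video stream from two synchronized feeds.

    Applies the same per-pair processing used for static virtualized views
    to each synchronized frame pair. `process_pair` stands in for whatever
    static-view virtualization step is used; names are illustrative.
    """
    for left, right in zip(left_frames, right_frames):
        yield process_pair(left, right)
```

Because this is a generator, frames are processed lazily as the feeds produce them, which suits a live video stream from the imaging devices.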
Although particularly described as being a virtualized 3D view using images from at least two imaging devices, further virtualization may be possible. For example, the images received from the at least two imaging devices may be processed to generate a view in a format similar to a wireframe plot or computer-aided design (CAD) format.
The plane or viewing area 410 including the view 412 (or virtualized 3D video) may be displayed on a display system, such as display system 128 from within a cabin of the vehicle 101. Additionally, the virtualized 3D view may be streamed or transmitted to other devices for display, such as a smartphone, tablet, or another computing device. These displays may be provided with a graphical user interface, as described more fully below.
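Such transmission to a companion device could take many forms; as one minimal sketch, assuming a simple length-prefixed JSON framing over TCP (the framing, function name, and parameters are illustrative assumptions, not part of the specification):

```python
import json
import socket

def stream_view(view_data, host, port):
    """Send a serialized virtualized 3D view to a companion device.

    Hypothetical sketch: the JSON payload is framed with a 4-byte
    big-endian length prefix so the receiver knows how much to read.
    """
    payload = json.dumps(view_data).encode("utf-8")
    with socket.create_connection((host, port)) as sock:
        sock.sendall(len(payload).to_bytes(4, "big") + payload)
```

A production system would more likely stream compressed video frames over an established protocol, but the same push-to-client shape applies.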
Generally, the performance metrics view 522 may include performance metrics arranged on a pane or interface portion 520, and may display metrics gathered from sensors or devices on one or both of the vehicle 101 and the implement 104. The metrics may include speed, depth, downward pressure, downward force, or any other suitable performance metrics. If displayed on a touch-screen interface, the portion 520 may provide for an increase or decrease in size of the metrics view 522 based on input received from the touch-screen display. The input may include gestures, voice commands, or face recognition. Other manipulation may include at least scrolling up/down the list of metrics, increasing/decreasing the font size, or other manipulations.
Similarly, the implement view 524 may be provided on a pane or interface portion 510, and may display a view similar to the view 412 described above. Furthermore, if displayed on a touch-screen interface, the portion 510 may provide for an increase or decrease in size of the virtualized 3D view 524 based on input received from the touch-screen display. The input may include gestures, voice commands, or face recognition. Other manipulation may include at least partial rotation of the view, as described above, or other manipulations.
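One way such touch manipulation could be modeled is as a small state update per gesture; the sketch below is purely illustrative (the state dictionary, gesture format, clamp limits, and drag sensitivity are all assumptions):

```python
def apply_gesture(view_state, gesture):
    """Update a hypothetical virtualized-view state from one touch gesture.

    `view_state` is assumed to be {'zoom': float, 'yaw_deg': float}.
    Pinch gestures scale the zoom level (clamped) and horizontal drags
    rotate the view about a vertical axis.
    """
    kind = gesture.get("type")
    if kind == "pinch":
        # Clamp zoom so repeated pinches cannot run away in either direction.
        view_state["zoom"] = min(8.0, max(0.25, view_state["zoom"] * gesture["scale"]))
    elif kind == "drag":
        # Assumed sensitivity: half a degree of rotation per pixel of drag.
        view_state["yaw_deg"] = (view_state["yaw_deg"] + gesture["dx"] * 0.5) % 360.0
    return view_state
```

Voice commands or other inputs could feed the same state update, keeping the rendering code independent of the input modality.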
The virtualized 3D view may be generated based on any suitable algorithm, including computer-aided design algorithms configured to generate axonometric views of objects surveilled from at least two different perspectives with at least partially overlapping fields of view. Thus, relatively undistorted views of equipment, objects, and implements may be displayed as shown in the interface 500, for use by an operator. Hereinafter, the operation of the imaging system 120 and the image processor 126 is described with reference to
The method 600 further includes processing the plurality of images to create a virtualized 3D view of the agricultural implement, at block 604. For example, the image processor 126 may process the images based on preconfigured algorithms configured to generate axonometric views, wireframe plots, or other views automatically. The preconfigured algorithms may include readily available computer-aided design algorithms configured to generate axonometric views of objects surveilled from at least two different perspectives with at least partially overlapping fields of view.
The method 600 further includes displaying the virtualized 3D view on a display within a cabin 102 of the work vehicle 101, at block 606. For example, the virtualized 3D view may be provided on a graphical user interface, such as the interface 500.
The method 600 can also include receiving sensor data related to performance metrics of the agricultural implement 104, at block 608. The metrics gathered from the sensors or devices may be received from one or both of the vehicle 101 and the implement 104. The metrics may include speed, depth, downward pressure, downward force, or any suitable performance metrics.
The method 600 can also include displaying the performance metrics with the virtualized 3D view, at block 610. For example, a performance metric view 522 can be provided through the graphical user interface 500.
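The steps of the method 600 can be sketched as a single pass of a surveillance loop. The sketch below is illustrative only; the callables, the (name, read) sensor pairs, and the shared `display` callback are all assumptions rather than the claimed implementation.

```python
def surveil_once(cameras, process_images, display, sensors=()):
    """One illustrative pass of the surveillance method 600.

    All names are hypothetical: `cameras` are callables returning images,
    `process_images` builds the virtualized 3D view, `display` renders to
    the in-cabin display system, and `sensors` are (name, read) pairs.
    """
    images = [capture() for capture in cameras]             # generate images
    view = process_images(images)                           # block 604: create 3D view
    display(view)                                           # block 606: display view
    if sensors:
        metrics = {name: read() for name, read in sensors}  # block 608: sensor data
        display(metrics)                                    # block 610: display metrics
    return view
```

In practice this pass would repeat for each frame interval, with the virtualized view and metrics refreshed together on the interface 500.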
Thus, a plurality of systems and methods for surveilling agricultural machines have been described above. The systems and methods may be facilitated through two or more imaging devices, an image processor, and a display system. The image processor may include one or more processors or a computer apparatus configured to process images to create virtualized 3D views. The computer apparatus may be a general or specialized computer apparatus configured to perform various functions related to image or video manipulation and processing.
For example,
The one or more memory device(s) 706 can store information accessible by the one or more processor(s) 704, including computer-readable instructions 708 that can be executed by the one or more processor(s) 704. The instructions 708 can be any set of instructions that when executed by the one or more processor(s) 704, cause the one or more processor(s) 704 to perform operations. The instructions 708 can be software written in any suitable programming language or can be implemented in hardware. In some embodiments, the instructions 708 can be executed by the one or more processor(s) 704 to cause the one or more processor(s) 704 to perform operations, such as the operations for surveilling agricultural machines, as described with reference to
The memory device(s) 706 can further store data 710 that can be accessed by the processors 704. For example, the data 710 can include prior tool adjustment data, current tool adjustment data, wireframe examples of virtualized 3D views, instructions for generating virtualized 3D views from two or more images or video feeds, user interface wireframes or graphical data, and other suitable data, as described herein. The data 710 can include one or more table(s), function(s), algorithm(s), model(s), equation(s), etc. for presenting virtualized 3D views according to example embodiments of the present disclosure.
The one or more computing device(s) 702 can also include a communication interface 712 used to communicate, for example, with the other components of the system and/or other computing devices. The communication interface 712 can include any suitable components for interfacing with one or more network(s), including for example, transmitters, receivers, ports, controllers, antennas, or other suitable components.
It is also to be understood that the steps of the method 600 are performed by the controller 126 upon loading and executing software code or instructions which are tangibly stored on a tangible computer-readable medium, such as a magnetic medium (e.g., a computer hard drive), an optical medium (e.g., an optical disc), solid-state memory (e.g., flash memory), or other storage media known in the art. Thus, any of the functionality performed by the controller 126 described herein, such as the method 600, is implemented in software code or instructions which are tangibly stored on a tangible computer-readable medium. The controller 126 loads the software code or instructions via a direct interface with the computer-readable medium or via a wired and/or wireless network. Upon loading and executing such software code or instructions, the controller 126 may perform any of the functionality of the controller 126 described herein, including any steps of the method 600 described herein.
The term “software code” or “code” used herein refers to any instructions or set of instructions that influence the operation of a computer or controller. They may exist in a computer-executable form, such as machine code, which is the set of instructions and data directly executed by a computer's central processing unit or by a controller; in a human-understandable form, such as source code, which may be compiled in order to be executed by a computer's central processing unit or by a controller; or in an intermediate form, such as object code, which is produced by a compiler. As used herein, the term “software code” or “code” also includes any human-understandable computer instructions or set of instructions, e.g., a script, that may be executed on the fly with the aid of an interpreter executed by a computer's central processing unit or by a controller.
The technology discussed herein makes reference to computer-based systems and actions taken by and information sent to and from computer-based systems. One of ordinary skill in the art will recognize that the inherent flexibility of computer-based systems allows for a great variety of possible configurations, combinations, and divisions of tasks and functionality between and among components. For instance, processes discussed herein can be implemented using a single computing device or multiple computing devices working in combination. Databases, memory, instructions, and applications can be implemented on a single system or distributed across multiple systems. Distributed components can operate sequentially or in parallel.
Although specific features of various embodiments may be shown in some drawings and not in others, this is for convenience only. In accordance with the principles of the present disclosure, any feature of a drawing may be referenced and/or claimed in combination with any feature of any other drawing.
This written description uses examples to disclose the invention, including the best mode, and also to enable any person skilled in the art to practice the invention, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the invention is defined by the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they include structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal languages of the claims.