The present disclosure relates to management systems for agricultural machines.
Research and development has been directed to the automation of agricultural machines to be used in agricultural fields. For example, work vehicles, such as tractors, combines, and rice transplanters, which automatically travel within fields by utilizing a positioning system, e.g., a GNSS (Global Navigation Satellite System), are coming into practical use. Research and development is also under way for work vehicles which automatically travel not only within fields, but also outside the fields (including public roads).
Japanese Laid-Open Patent Publication No. 2018-132326 discloses a system that manages a signal strength distribution of satellite radio waves received by work vehicles that automatically travel in work fields by satellite navigation. In this system, in the case where the reception strength of radio waves detected by a satellite positioning module is lower than a predetermined threshold, a control unit of such a work vehicle generates information indicating that the reception strength is lowered. This information includes, for example, the reception strengths of signals transmitted from a plurality of satellites, position data about the satellites (angles of elevation, angles of direction, etc.), information about the position of the work vehicle itself, the work field ID, and the like. The control unit transmits the information to a management computer. Based on the information, the management computer generates information indicating the distribution of the reception strengths of the satellite radio waves for each of the work fields. For example, the management computer generates map information indicating regions where the reception strengths are lowered in each work field by date and time. Such map information may be referred to so that a work plan can be generated that allows the work vehicle to perform tasked travel efficiently even in a work field in which regions of lowered reception strength, which vary with the positions of the satellites and geographical conditions, are scattered.
According to the technique disclosed in Japanese Laid-Open Patent Publication No. 2018-132326, information indicating the distribution of the reception strengths of the satellite radio waves in each work field by date and time may be generated. Based on the information, a path estimated not to have the reception strengths lowered may be generated, so that the work vehicle can automatically travel along the path. However, if an obstacle that was not present at the time of the generation of the information indicating the distribution of the reception strengths of the satellite radio waves is now present on or near the path, the reception strengths of the satellite radio waves are lowered, which may prevent positioning from being performed.
Example embodiments of the present invention provide techniques to, in a case where reception interference such as a decrease in reception strength of a satellite signal occurs in an agricultural machine including a GNSS receiver, easily identify a cause of such reception interference.
This specification discloses the following techniques.
A management system according to an example embodiment of the present disclosure includes a storage to store, while an agricultural machine including a GNSS receiver is traveling, GNSS data output from the GNSS receiver and sensing data output from a sensor to sense a surrounding environment of the agricultural machine in association with each other, and a processor configured or programmed to generate and output visualized data, which represents the surrounding environment of the agricultural machine, based on the GNSS data and the sensing data, when reception interference of a satellite signal occurs.
According to another example embodiment of the present disclosure, the processor may be configured or programmed to further generate visualized data, which represents the surrounding environment of the agricultural machine, based on the GNSS data and the sensing data, when the reception interference is not occurring, and output the visualized data when the reception interference is occurring and the visualized data when the reception interference is not occurring, in distinguishable forms.
According to another example embodiment of the present disclosure, the management system may further include a display to display the visualized data when the reception interference is occurring and the visualized data when the reception interference is not occurring.
According to another example embodiment of the present disclosure, when the reception interference occurs, the processor may be configured or programmed to further generate and output visualized data, which represents the surrounding environment of the agricultural machine, based on the GNSS data and the sensing data, at least before or after a period when the reception interference is occurring.
According to another example embodiment of the present disclosure, the management system may further include a display to display the visualized data when the reception interference is occurring and the visualized data at least before or after the period when the reception interference is occurring.
According to another example embodiment of the present disclosure, the storage may be configured to store the GNSS data and the sensing data in association with points of time while the agricultural machine is traveling, and the processor may be configured or programmed to generate, as the visualized data, data representing a motion picture of the surrounding environment of the agricultural machine, based on the GNSS data and the sensing data at each of the points of time, in a period when the reception interference is occurring.
According to another example embodiment of the present disclosure, the processor may be configured or programmed to identify a position of a satellite when the satellite signal is received, based on the GNSS data or satellite position data acquired from an external device, and generate, as the visualized data, data representing an image including one or more marks, each representing the position of the satellite, superimposed on an image of the surrounding environment of the agricultural machine.
According to another example embodiment of the present disclosure, the visualized data may include information representing a reception level of the satellite signal received by the GNSS receiver.
According to another example embodiment of the present disclosure, the agricultural machine may be configured to automatically travel based on the GNSS data, the storage may be configured to store the GNSS data and the sensing data during a portion of a time period when the agricultural machine is automatically traveling, the portion including a time period when the reception interference is occurring, and the processor may be configured or programmed to generate the visualized data during the portion of the time period.
According to another example embodiment of the present disclosure, the GNSS data may include information representing a reception level of the satellite signal and an estimated position of the agricultural machine, the sensing data may include information representing an image of the surrounding environment of the agricultural machine, and the storage may be configured to store a database including the reception level, the estimated position, the image, and information about a point of time.
According to another example embodiment of the present disclosure, the processor may be configured or programmed to send a notice to an external terminal when the reception interference occurs.
According to another example embodiment of the present disclosure, the sensor may include at least one of a camera or a LiDAR sensor provided in the agricultural machine.
According to another example embodiment of the present disclosure, the sensor may include a plurality of cameras provided in the agricultural machine, and the processor may be configured or programmed to generate the visualized data by a process including synthesizing images output from the plurality of cameras.
According to another example embodiment of the present disclosure, the processor may be configured or programmed to recognize, based on the sensing data, an object that causes the reception interference in the surrounding environment, and output an alert signal when recognizing the object.
A method according to an example embodiment of the present disclosure, to be executed by a computer, includes causing, while an agricultural machine including a GNSS receiver is traveling, a storage to store GNSS data output from the GNSS receiver and sensing data output from a sensor sensing a surrounding environment of the agricultural machine in association with each other, and generating and outputting visualized data, which represents the surrounding environment of the agricultural machine, based on the GNSS data and the sensing data, when reception interference of a satellite signal occurs.
General or specific aspects of the present disclosure may be implemented using a device, a system, a method, an integrated circuit, a computer program, a non-transitory computer-readable storage medium, or any combination thereof. The computer-readable storage medium may be inclusive of a volatile storage medium or a non-volatile storage medium. The device may include a plurality of devices. In the case where the device includes two or more devices, the two or more devices may be disposed within a single apparatus, or divided over two or more separate apparatuses.
According to example embodiments of the present disclosure, it is possible to, in a case where reception interference such as a decrease in reception strength of a satellite signal occurs with regard to an agricultural machine including a GNSS receiver, easily identify a cause of the reception interference based on visualized data.
The above and other elements, features, steps, characteristics and advantages of the present invention will become more apparent from the following detailed description of the example embodiments with reference to the attached drawings.
Hereinafter, example embodiments of the present disclosure will be described more specifically. Note, however, that unnecessarily detailed descriptions may be omitted. For example, detailed descriptions on what is well known in the art or redundant descriptions on what is substantially the same configuration may be omitted. This is to avoid lengthy description, and facilitate the understanding of those skilled in the art. The accompanying drawings and the following description, which are provided by the present inventors so that those skilled in the art can sufficiently understand the present disclosure, are not intended to limit the scope of the claims. In the following description, components or elements having identical or similar functions are denoted by identical reference numerals.
The following example embodiments are only exemplary, and the techniques according to the present disclosure are not limited to the following example embodiments. For example, numerical values, shapes, materials, steps, orders of steps, layout of a display screen, etc., that are indicated in the following example embodiments are only exemplary, and admit of various modifications so long as they make technological sense. Any one implementation may be combined with another so long as it makes technological sense to do so.
A management system for agricultural machines according to an illustrative example embodiment of the present disclosure includes a storage and a processor. While an agricultural machine including a GNSS receiver is traveling, the storage stores GNSS data output from the GNSS receiver and sensing data output from a sensor to sense a surrounding environment of the agricultural machine in association with each other. Based on the GNSS data and the sensing data, the processor is configured or programmed to generate and output visualized data representing the surrounding environment of the agricultural machine when reception interference of a satellite signal is occurring.
In the present disclosure, an “agricultural machine” means a machine for agricultural applications. Examples of agricultural machines include tractors, harvesters, rice transplanters, vehicles for crop management, vegetable transplanters, mowers, seeders, spreaders, and mobile robots for agriculture. Not only may a work vehicle (such as a tractor) function as an “agricultural machine” alone by itself, but also a combination of a work vehicle and an implement that is attached to, or towed by, the work vehicle may function as an “agricultural machine”. On the ground surface within a field, an agricultural machine performs agricultural work such as tilling, seeding, preventive pest control, manure spreading, planting of crops, or harvesting. Such agricultural work or tasks may be referred to as “groundwork”, or simply as “work” or “tasks”. Travel of a vehicle-type agricultural machine performed while the agricultural machine also performs agricultural work may be referred to as “tasked travel”.
A “GNSS receiver” is configured or programmed to receive radio waves transmitted from a plurality of satellites in the Global Navigation Satellite System (GNSS) and perform positioning based on a signal superimposed on the radio waves. GNSS is the general term for satellite positioning systems such as GPS (Global Positioning System), QZSS (Quasi-Zenith Satellite System), GLONASS, Galileo, and BeiDou. As used herein, satellites in these positioning systems will be referred to as “GNSS satellites”. A signal transmitted from a GNSS satellite will be referred to as a “satellite signal”. “GNSS data” is data output from the GNSS receiver. The GNSS data may be generated in a predetermined format such as, for example, the NMEA-0183 format. GNSS data may include, for example, information indicating receiving states of satellite signals received from individual satellites. For example, the GNSS data may include an identification number, an angle of elevation, an angle of direction, and a value indicating the reception strength of each of the satellites from which the satellite signals are received. Hereinafter, the reception strength may be referred to as a “reception level”. The reception strength is a numerical value indicating the strength of the received satellite signal. The reception strength may be expressed by a value such as, for example, the Carrier to Noise Density Ratio (C/N0). The GNSS data may include positional information on the GNSS receiver or the agricultural machine as calculated based on the received plurality of satellite signals. The positional information may be expressed by, for example, the latitude, the longitude, the altitude above the mean sea level, and the like. The GNSS data may further include information indicating the reliability of the positional information.
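For example, the per-satellite receiving-state information described above can be extracted from an NMEA-0183 GSV (“satellites in view”) sentence. The following is a minimal sketch; the helper name and the omission of checksum verification are illustrative assumptions, not part of the disclosure.

```python
def parse_gsv(sentence: str):
    """Parse one NMEA-0183 GSV sentence into per-satellite records.

    Returns a list of dicts with the satellite identification number (PRN),
    angle of elevation (deg), angle of direction/azimuth (deg), and the
    reception strength as C/N0 (dB-Hz); a field is None when the receiver
    reports no value for it. Checksum verification is omitted here.
    """
    body = sentence.split("*")[0]      # strip the trailing checksum
    fields = body.split(",")
    satellites = []
    # Satellite information starts at field 4, in groups of four values:
    # PRN, elevation, azimuth, C/N0.
    for i in range(4, len(fields) - 3, 4):
        prn, elevation, azimuth, cn0 = fields[i:i + 4]
        satellites.append({
            "prn": prn,
            "elevation": int(elevation) if elevation else None,
            "azimuth": int(azimuth) if azimuth else None,
            "cn0": int(cn0) if cn0 else None,
        })
    return satellites
```

A GNSS receiver typically emits several GSV sentences per epoch (the first two fields give the sentence count and index), so a complete snapshot is obtained by concatenating the records from each sentence in the group.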
A “sensor” is a device including one or more sensors. Such a sensor is disposed so as to sense a surrounding environment of the agricultural machine. The sensor may include, for example, at least one of a camera including an image sensor (i.e., an imager) or a LiDAR (Light Detection and Ranging) sensor. “Sensing data” is data output from the sensor. The sensing data may include, for example, image data generated by the camera, or point cloud data or distance distribution data generated by the LiDAR sensor. The point cloud data represents a distribution of a plurality of points, the distance between which is measured by the LiDAR sensor. The sensing data may be used, for example, by the agricultural machine traveling by self-driving to detect an obstacle, or by a user performing remote monitoring of the agricultural machine to check the surrounding environment of the agricultural machine. The sensor may be provided in the agricultural machine or outside the agricultural machine. The sensor provided outside the agricultural machine may include, for example, a plurality of cameras or a plurality of LiDAR sensors disposed along a path along which the agricultural machine travels.
“Reception interference of a satellite signal” means a state where the reliability of positioning is lowered, as compared with a normal state, by deterioration in the receiving state of the satellite signal. The reception interference may occur, for example, in the case where the number of detected satellites is small (e.g., three or fewer), in the case where the reception strength of each of the satellite signals is low, or in the case where multipath propagation occurs. The processor can determine whether the reception interference is occurring based on, for example, information on the satellites included in the GNSS data, such as the value of the reception strength of each of the satellites or the value of DOP (Dilution of Precision) indicating the positional arrangement of the satellites.
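The determination described above can be sketched as a simple rule over the satellite count, the per-satellite reception strengths, and the DOP value. The thresholds below are illustrative assumptions (positioning generally needs at least four satellites, a C/N0 below roughly 35 dB-Hz is weak, and a large HDOP indicates poor satellite geometry), not values fixed by the disclosure.

```python
def reception_interference(num_satellites: int,
                           cn0_values,
                           hdop: float,
                           min_satellites: int = 4,
                           cn0_threshold: float = 35.0,
                           hdop_threshold: float = 2.5) -> bool:
    """Return True when the receiving state suggests reception interference.

    Interference is flagged when too few satellites are detected, when too
    few of them are received at a usable strength, or when the satellite
    geometry (HDOP) is poor. All thresholds are illustrative defaults.
    """
    if num_satellites < min_satellites:
        return True
    strong = [c for c in cn0_values if c >= cn0_threshold]
    if len(strong) < min_satellites:
        return True
    return hdop > hdop_threshold
```

In practice such a rule would be evaluated each time a new GNSS epoch arrives, and hysteresis or time filtering may be added so that a single noisy epoch does not toggle the interference state.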
“Visualized data representing the surrounding environment of the agricultural machine” is data by which an image allowing the user to check the surrounding environment of the agricultural machine is displayed on a display. As used herein, displaying an image based on visualized data may be expressed as “displaying visualized data”.
“Storing the GNSS data and the sensing data in association with each other” means storing the GNSS data and the sensing data corresponding to the GNSS data in a format readable by another device. For example, in order to associate GNSS data and sensing data generated at substantially the same point of time, the storage may store such data and information on the point of time of the acquisition of such data in association with each other. Storing the GNSS data is not limited to storing the entirety of the GNSS data, but may be storing a portion of the GNSS data or storing data generated based on the GNSS data. Similarly, storing the sensing data is not limited to storing the entirety of the sensing data, but may be storing a portion of the sensing data or storing data generated based on the sensing data.
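The association by point of time described above can be sketched as a nearest-timestamp pairing between the two data streams. The record layout and the maximum allowed time gap below are illustrative assumptions.

```python
import bisect


def associate(gnss_records, sensing_records, max_gap: float = 0.1):
    """Pair each sensing record with the GNSS record closest in time.

    Both inputs are lists of (timestamp_seconds, payload) tuples, with
    gnss_records sorted by timestamp. Pairs whose time difference exceeds
    max_gap seconds are dropped, since the two records would then not have
    been generated at substantially the same point of time.
    """
    gnss_times = [t for t, _ in gnss_records]
    pairs = []
    for ts, frame in sensing_records:
        i = bisect.bisect_left(gnss_times, ts)
        # The nearest neighbour is either the record just before or just
        # after the insertion point.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(gnss_times)]
        if not candidates:
            continue
        best = min(candidates, key=lambda j: abs(gnss_times[j] - ts))
        if abs(gnss_times[best] - ts) <= max_gap:
            pairs.append((gnss_records[best], (ts, frame)))
    return pairs
```

Storing the paired records (rather than two independent streams) is one way to make the GNSS data and the corresponding sensing data readable together by another device, as the definition above requires.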
A “management system” is a computer system to manage the agricultural machine. The processor in the management system may be a computer including, for example, one or more processors and one or more memories. In this case, the processor can consecutively execute computer programs stored in the memory to realize desired processing. The processor may be mounted on the agricultural machine, or may be installed in a place far from the agricultural machine, for example, at the home or an office of the user monitoring the agricultural machine. One of a plurality of electronic control units (ECU) mounted on the agricultural machine may define and function as the processor. Alternatively, a server computer or an edge computer communicating with the agricultural machine via a network may define and function as the processor. Still alternatively, a terminal used by the user may have at least a portion of functions of the processor. Examples of the processor include a stationary computer, a smartphone, a tablet computer, a laptop computer, and the like.
A “storage” is a device that includes one or more storage media and is capable of storing data. Such a storage medium may be, for example, a semiconductor storage medium, a magnetic storage medium, or an optical storage medium. The storage and the processor may be disposed inside one device. Alternatively, the storage and the processor may be disposed separately from each other and connected to each other via a network. The storage may be provided in the agricultural machine.
According to an example embodiment of the present disclosure, in the case where reception interference of a satellite signal is experienced by the GNSS receiver, the user can check the state of the surroundings of the agricultural machine from the image displayed based on the visualized data. This allows the user to confirm that, for example, an obstacle present around the agricultural machine hinders reception of the satellite signal.
An example of an obstacle hindering the reception of a satellite signal is branches and leaves of a tree or a tall crop. The tree or the crop, even though not hindering the reception of a satellite signal at a certain moment, grows over time and may eventually cause reception interference. The above-described configuration allows the user to find out that such an obstacle is present on or near the path. Therefore, the user can cut a branch of the tree or remove the crop that causes the reception interference, and thus can prevent reception interference from occurring in the future. Another example of an obstacle that hinders the reception of a satellite signal is a large vehicle such as a truck. Such a large vehicle may define and function as an obstacle causing reception interference in the case where the agricultural machine travels, for example, at or near an edge of a field that is at a lower level than that of surrounding roads.
An agricultural machine according to an example embodiment of the present disclosure may have a self-driving function. As used herein, “self-driving” means controlling the movement of an agricultural machine by the action of a controller, rather than through manual operations of a driver. An agricultural machine that performs self-driving may be referred to as a “self-driving agricultural machine” or a “robotic agricultural machine”. During self-driving, not only the movement of the agricultural machine, but also the operation of agricultural work may be controlled automatically. In the case where the agricultural machine is a vehicle-type machine, traveling of the agricultural machine via self-driving will be referred to as “self-traveling”. The controller may be configured or programmed to control at least one of: steering that is required in the movement of the agricultural machine; adjustment of the speed; and beginning and ending a travel. In the case of controlling a work vehicle having an implement attached thereto, the controller may be configured or programmed to control raising or lowering of the implement, beginning and ending of an operation of the implement, and so on. A move based on self-driving may include not only moving of an agricultural machine that goes along a predetermined path toward a destination, but also moving of an agricultural machine that follows a target of tracking. An agricultural machine that performs self-driving may also have the function of moving partly based on the user's instructions. Moreover, an agricultural machine that performs self-driving may operate not only in a self-driving mode but also in a manual driving mode, where the agricultural machine moves through manual operations of the driver. When performed not manually but through the action of a controller, the steering of a movable body will be referred to as “automatic steering”. A portion of the controller may reside outside the agricultural machine. 
Control signals, commands, data, etc., may be communicated between the agricultural machine and a controller residing outside the agricultural machine. An agricultural machine that performs self-driving may move autonomously while sensing the surrounding environment, without any person being involved in the controlling of the movement of the agricultural machine. An agricultural machine that is capable of autonomous movement is able to travel within the field or outside the field (e.g., on roads) in an unmanned manner. During an autonomous move, operations of detecting and avoiding obstacles may be performed.
Now, an example configuration of the management system will be described in more detail with reference to
The server 40 includes a storage 42 to store information on a position or an orbit of each of the GNSS satellites. The server 40 transmits satellite position data representing the positions of the satellites to a processor 24 in response to a request from the processor 24.
The agricultural machine 10 includes a GNSS receiver 12 and a sensor 14. The GNSS receiver 12 includes an antenna to receive satellite signals transmitted from the plurality of satellites, and a processing circuit to generate GNSS data by computation based on the satellite signals. The GNSS data may include positional information on the agricultural machine 10 and information on a receiving state of the satellite signal from each of the satellites (e.g., reception strength, etc.). The sensor 14 may include, for example, a camera or a LiDAR sensor. The sensor 14 senses a surrounding environment of the agricultural machine 10 and generates sensing data. The sensing data may be, for example, image data generated by the camera or point cloud data generated by the LiDAR sensor. In the example of
During travel of the agricultural machine 10, the GNSS data and the sensing data are transmitted to a storage 22 and stored therein. The storage 22 may be configured to store the GNSS data and the sensing data only while the agricultural machine 10 is traveling by self-driving. Alternatively, the storage 22 may be configured to store the GNSS data and the sensing data also while the agricultural machine 10 is traveling by manual driving. During travel, the agricultural machine 10 may transmit the GNSS data and the sensing data to the storage 22 at, for example, a certain cycle. Such GNSS data and sensing data may define and function as travel log data on the agricultural machine 10.
The storage 22 stores the GNSS data and the sensing data transmitted from the agricultural machine 10 in association with each other. The storage 22 may store the GNSS data and the sensing data in association with, for example, points of time. This makes it easy for the processor 24 to acquire the GNSS data and the sensing data corresponding to a specific point of time. Based on the GNSS data and the sensing data acquired from the storage 22, the processor 24 generates visualized data representing the surrounding environment of the agricultural machine 10 when the reception interference of the satellite signals is occurring. It may be determined whether the reception interference is occurring based on the information on the receiving states of the satellite signals included in the GNSS data as described above. For example, the processor 24 may generate, as the visualized data, image data including an image based on the sensing data and information indicating that the reception interference is occurring (e.g., a letter(s), a graphic pattern(s), a symbol(s), etc.). Based on the GNSS data and the sensing data at each point of time, the processor 24 may generate, as the visualized data, data representing a motion picture of the surrounding environment of the agricultural machine 10 in a period when the reception interference is occurring.
The processor 24 transmits the generated visualized data to the terminal 30. The terminal 30 is a device used by a user remotely monitoring the agricultural machine 10. The terminal 30 may be, for example, a computer such as a personal computer, a laptop computer, a tablet computer, a smartphone or the like. The terminal 30 shown in
The processor 24 may generate visualized data when no reception interference is occurring as well as visualized data when reception interference is occurring. The processor 24 may output visualized data when reception interference is occurring and visualized data when no reception interference is occurring, in distinguishable forms. For example, the processor 24 may generate, as visualized data when reception interference is occurring, image data including information in the form of a letter(s), a symbol(s) or a graphic pattern(s) indicating that reception interference is occurring. By contrast, the processor 24 may generate, as visualized data when no reception interference is occurring, image data not including such information or image data including information indicating that the receiving state is good. The user can easily determine whether reception interference is occurring or not based on the image displayed on the terminal 30.
When reception interference occurs, the processor 24 may generate and output visualized data representing the surrounding environment of the agricultural machine 10 at least before or after the period when the reception interference is occurring, based on the GNSS data and the sensing data. This allows a cause of the reception interference to be identified more easily by a comparison between the visualized data in the period when the reception interference is occurring and the visualized data before or after the period.
The processor 24 can acquire satellite position data representing the positions of the GNSS satellites from the server 40. The processor 24 may identify, based on the satellite position data, the positions of the satellites when the satellite signals are received, and generate visualized data representing an image in which one or more marks representing the positions of the satellites are superimposed on an image of the surrounding environment of the agricultural machine 10. In the case where the GNSS data includes the information indicating the position of each of the satellites, the processor 24 can also generate similar visualized data based on the GNSS data. It should be noted that when reception interference is occurring, there are often cases where sufficient information on the positions of the satellites is not obtained from the GNSS data. Therefore, in the present example embodiment, in the case where the information on the positions of the satellites is to be included in the visualized data when reception interference is occurring, the processor 24 acquires the satellite position data from an external device such as the server 40 or the like.
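Superimposing a satellite mark on a camera image amounts to mapping the satellite's azimuth and elevation into pixel coordinates. The sketch below assumes a forward-facing camera with a simple linear (equirectangular-like) projection and a known heading; the camera model and its parameters are illustrative assumptions, not part of the disclosure.

```python
def satellite_mark_position(sat_azimuth: float, sat_elevation: float,
                            camera_heading: float,
                            h_fov: float = 90.0, v_fov: float = 60.0,
                            width: int = 1280, height: int = 720):
    """Map a satellite's azimuth/elevation (degrees) to pixel coordinates
    in a forward-facing camera image.

    Returns (x, y) pixel coordinates, or None when the satellite lies
    outside the assumed field of view. Angles increase clockwise from
    north for azimuth/heading; elevation is measured up from the horizon,
    which is assumed to sit at the bottom edge of the image.
    """
    # Signed horizontal offset of the satellite from the optical axis,
    # wrapped into [-180, 180).
    d_az = (sat_azimuth - camera_heading + 180.0) % 360.0 - 180.0
    if abs(d_az) > h_fov / 2 or not (0.0 <= sat_elevation <= v_fov):
        return None
    x = int((d_az / h_fov + 0.5) * width)
    y = int((1.0 - sat_elevation / v_fov) * height)
    return x, y
```

A mark (e.g., a circle labeled with the satellite's identification number) can then be drawn at the returned coordinates on top of the camera image, so that the user can see which satellite direction is blocked by an obstacle visible in the image.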
Visualized data may include information indicating reception levels of the satellite signals received by the GNSS receiver. The information indicating the reception levels may be, for example, a numerical value of the reception strength of each of the satellite signals included in the GNSS data, or information such as a numerical value or a letter obtained by a calculation performed based on the above-mentioned numerical value. The information indicating the reception levels may indicate the reception levels by a plurality of stages. For example, the reception levels may be expressed by numerical values or letters of a three-stage system including, for example, “3 (high)”, “2 (intermediate)” and “1 (low)”. Alternatively, the reception levels may be expressed by a gauge.
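The three-stage system mentioned above can be sketched as a simple mapping from the C/N0 value to a staged level; the stage boundaries used here are illustrative assumptions.

```python
def reception_level(cn0: float) -> int:
    """Map a C/N0 value (dB-Hz) onto an illustrative three-stage
    reception level: 3 (high), 2 (intermediate), or 1 (low).

    The boundaries (40 and 30 dB-Hz) are assumed values for illustration;
    a deployed system would tune them to the receiver in use.
    """
    if cn0 >= 40.0:
        return 3
    if cn0 >= 30.0:
        return 2
    return 1
```

The returned stage can be rendered in the visualized data as a numeral, a letter, or a gauge segment next to each satellite mark.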
The agricultural machine 10 may be configured to automatically travel based on the GNSS data. For example, the agricultural machine 10 may be configured to automatically travel based on positional information indicating results of positioning included in the GNSS data and information on a preset target path. The storage 22 may store the GNSS data and the sensing data during only a portion of a period when the agricultural machine 10 is automatically traveling, the portion including a time period when reception interference is occurring. In this case, the processor 24 generates visualized data during only the portion of the time period. This can reduce or minimize the amount of the data to be stored, and thus reduce or minimize the capacity of the storage 22.
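Selecting the portion of the travel log to retain can be sketched as keeping only the records within a margin around the interference period; the record layout and margin value are illustrative assumptions.

```python
def retained_window(records, margin: float = 5.0):
    """Keep only the portion of the travel log that includes the period
    when reception interference occurred, plus a margin before and after.

    `records` is a list of (timestamp_seconds, interference_flag) tuples.
    Returns the retained records; an empty list when no interference
    was flagged, in which case nothing need be stored.
    """
    flagged = [t for t, interfered in records if interfered]
    if not flagged:
        return []
    start = min(flagged) - margin
    end = max(flagged) + margin
    return [(t, f) for t, f in records if start <= t <= end]
```

Discarding the records outside this window is what reduces the amount of data to be stored and hence the required capacity of the storage 22.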
As described above, the GNSS data may include information indicating the reception levels of the satellite signals and an estimated position of the agricultural machine 10. In the case where the sensor 14 includes a camera, the sensing data may include information indicating an image of the surrounding environment of the agricultural machine 10. The storage 22 may store a database including the reception levels of the satellite signals, the estimated position of the agricultural machine 10, the image of the surrounding environment of the agricultural machine 10, and information on the points of time. The storage of such a database allows the processor 24 to refer to the database and thus to generate visualized data efficiently.
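Such a database can be sketched, for example, with SQLite; the table name, column names, and the use of a file path to reference the stored image are illustrative assumptions, not part of the disclosure.

```python
import sqlite3

# One row per logged point of time: estimated position and staged
# reception level from the GNSS data, plus a reference to the camera
# image from the sensing data.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE travel_log (
        timestamp       REAL PRIMARY KEY,  -- point of time (Unix seconds)
        latitude        REAL,              -- estimated position
        longitude       REAL,
        reception_level INTEGER,           -- staged reception level (1-3)
        image_path      TEXT               -- camera image of surroundings
    )
""")
conn.execute(
    "INSERT INTO travel_log VALUES (?, ?, ?, ?, ?)",
    (1700000000.0, 35.6812, 139.7671, 1, "frames/000123.jpg"),
)

# To generate visualized data for an interference period, the processor
# can query the rows whose staged level dropped to the lowest stage.
rows = conn.execute(
    "SELECT timestamp, image_path FROM travel_log "
    "WHERE reception_level <= 1 ORDER BY timestamp"
).fetchall()
```

Indexing the rows by timestamp is what lets the processor efficiently pull the images and positions for exactly the period in which the interference occurred.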
When reception interference occurs, the processor 24 may transmit a notice of the reception interference to the external terminal 30. The processor 24 can detect the occurrence of the reception interference based on the GNSS data. Upon receipt of the notice, the terminal 30 may display information such as a message indicating that the reception interference has occurred on the display 32. This allows the user to promptly notice that the reception interference of a satellite signal has occurred. The user can check the displayed image to promptly identify the cause of the reception interference.
The sensor 14 may include at least one of a camera or a LiDAR sensor provided in the agricultural machine 10. In this case, the sensor 14 can generate sensing data based on image data output from the camera and/or point cloud data output from the LiDAR sensor. Based on such sensing data, the processor 24 can generate visualized data that clearly represents the surrounding environment of the agricultural machine 10.
The sensor 14 may include a plurality of cameras provided in the agricultural machine 10. In this case, the processor 24 may generate visualized data by a process including synthesizing images output from the plurality of cameras. This allows the processor 24 to generate visualized data representing a synthesized image covering a wide range (e.g., 180°, 270°, 360°, etc.) around the agricultural machine 10. This allows the user to check the state of the surroundings of the agricultural machine 10 in more detail, and thus to identify the cause of the reception interference easily.
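The synthesis step can be sketched in its simplest form as side-by-side concatenation of same-height images, here represented as row-major lists of pixel rows. This is only a toy sketch; an actual wide-range synthesis (e.g., 180°, 270°, 360°) would warp and blend the overlapping fields of view of the cameras.

```python
def synthesize_panorama(images):
    """Concatenate camera images (lists of equal-length pixel rows) horizontally.

    A placeholder for the image-synthesizing process; real stitching would
    align and blend overlapping regions instead of simple concatenation.
    """
    heights = {len(img) for img in images}
    if len(heights) != 1:
        raise ValueError("all camera images must share the same height")
    height = heights.pop()
    # Join row r of every camera image into one long row of the panorama.
    return [sum((img[r] for img in images), []) for r in range(height)]
```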
The processor 24 may recognize an object that causes reception interference in the environment based on sensing data, and present an alert signal when recognizing the object. For example, the processor 24 can recognize an object that causes reception interference based on an image or a point cloud represented by the sensing data, satellite position data acquired from the server 40, and information indicating the receiving states of satellite signals included in GNSS data. The processor 24 may transmit an alert signal to, for example, the terminal 30. Upon receipt of the alert signal, the terminal 30 can display a warning message on the display 32 or cause a loudspeaker to output an alarm sound. This allows the user to recognize that there is an object that causes reception interference. The processor 24 may transmit an alert signal to a controller of the agricultural machine 10. Upon receipt of the alert signal, the controller of the agricultural machine 10 may sound a buzzer, activate an alarm light, or stop traveling. This allows the agricultural machine 10 to stop safely or allows people around the agricultural machine 10 to be alerted by the buzzer or the alarm light when reception interference is occurring and thus the reliability of positioning is low.
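A hedged sketch of the alert decision follows: an alert is raised when an object detected in the sensing data lies in roughly the same direction as a satellite whose signal is weak, which is one plausible way to combine the sensing data, the satellite position data, and the receiving states named above. The angle tolerance and the input representation are assumptions.

```python
def should_alert(detected_azimuths_deg, weak_sat_azimuths_deg, tol_deg=15.0):
    """True if any detected object lies within tol_deg of a weak satellite's azimuth.

    detected_azimuths_deg: azimuths of objects recognized in the sensing data
    weak_sat_azimuths_deg: azimuths of satellites whose reception is degraded
    """
    for obj in detected_azimuths_deg:
        for sat in weak_sat_azimuths_deg:
            # Smallest absolute angular difference, handling 360-degree wrap-around.
            diff = abs((obj - sat + 180.0) % 360.0 - 180.0)
            if diff <= tol_deg:
                return True
    return False
```

On a True result, the processor 24 could transmit the alert signal to the terminal 30 or to the controller of the agricultural machine 10 as described above.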
A method for managing an agricultural machine according to another example embodiment of the present disclosure is a method to be executed by a computer. The method includes: causing, during travel of the agricultural machine including a GNSS receiver, a storage to store GNSS data output from the GNSS receiver and sensing data output from a sensor sensing a surrounding environment of the agricultural machine in association with each other; and generating and outputting visualized data representing the surrounding environment of the agricultural machine based on the GNSS data and the sensing data when reception interference of a satellite signal is occurring.
A computer program according to still another example embodiment of the present disclosure is stored on a non-transitory computer-readable storage medium. The computer program causes a computer to execute: causing, during travel of an agricultural machine including a GNSS receiver, a storage to store GNSS data output from the GNSS receiver and sensing data output from a sensor sensing a surrounding environment of the agricultural machine in association with each other; and generating and outputting visualized data representing the surrounding environment of the agricultural machine based on the GNSS data and the sensing data when reception interference of a satellite signal is occurring.
Hereinafter, an example embodiment will be described where the techniques according to the present disclosure are applied to a work vehicle (e.g., a tractor) as an example of an agricultural machine. The techniques according to the present disclosure are applicable not only to work vehicles such as tractors, but also to other types of agricultural machines.
In the present example embodiment, the work vehicle 100 is a tractor, for example. The tractor can have an implement attached to its rear and/or its front. While performing agricultural work according to the particular type of implement, the tractor is able to automatically travel within a field. The techniques according to the present example embodiment are similarly applicable to agricultural machines other than tractors so long as such application is practicable.
The work vehicle 100 has a self-driving function. In other words, the work vehicle 100 travels by the action of a controller, rather than manually. The controller according to the present example embodiment is provided inside the work vehicle 100, and is able to control both the speed and steering of the work vehicle 100.
The work vehicle 100 includes a positioning device 110 including a GNSS receiver. Based on the position of the work vehicle 100 as identified by the positioning device 110 and a target path previously stored in a storage, the controller causes the work vehicle 100 to automatically travel. In addition to controlling the travel of the work vehicle 100, the controller also controls the operation of the implement. As a result, while automatically traveling, the work vehicle 100 is able to perform a task (work) by using the implement.
The work vehicle 100 also includes a sensor to detect obstacles. The sensor may include, for example, a camera 120. The camera 120 generates image data for use in remote monitoring. The controller of the work vehicle 100 consecutively transmits the image data acquired by the camera 120 to the management computer 500. The sensor may include a LiDAR sensor. The controller of the work vehicle 100 may consecutively transmit point cloud data or distance distribution data generated by the LiDAR sensor to the management computer 500. Hereinafter, an example in which the sensor includes the camera 120 and outputs image data generated by the camera 120 as sensing data will be mainly described.
The management computer 500 is a computer to realize the above-described functions of the management system. The management computer 500 may be, for example, a server computer that centrally manages information regarding the field on the cloud and supports agriculture by use of such data. The management computer 500 receives GNSS data output from the positioning device 110 of the work vehicle 100 and sensing data output from the sensor, and causes the storage to store the GNSS data and the sensing data. Based on the GNSS data and the sensing data, the management computer 500 generates visualized data representing the surrounding environment of the work vehicle 100. Based on the satellite position data acquired from the streaming server 600, the management computer 500 may generate visualized data including the information on the positions of the satellites. The management computer 500 transmits the generated visualized data to the terminal 400.
The terminal 400 is a computer that is used by a user who is at a remote place from the work vehicle 100. The terminal 400 may be provided at the home or an office of the user, for example. The terminal 400 may be a mobile terminal such as, for example, a laptop computer, a smartphone or a tablet computer, or a stationary computer such as a desktop PC (Personal Computer). The terminal 400 causes a display to display a video based on the visualized data transmitted from the management computer 500. By watching the video on the display, the user is able to grasp the state of the surroundings of the work vehicle 100.
Hereinafter, the configuration and operation of a system according to the present example embodiment will be described in more detail.
As shown in
The work vehicle 100 shown in
The work vehicle 100 further includes the positioning device 110. The positioning device 110 includes a GNSS receiver. The GNSS receiver includes an antenna to receive a signal(s) from a GNSS satellite(s) and a processor to calculate the position of the work vehicle 100 based on the signal(s) received by the antenna. The positioning device 110 receives satellite signals transmitted from the plurality of GNSS satellites, and performs positioning based on the satellite signals. Although the positioning device 110 in the present example embodiment is disposed above the cabin 105, it may be disposed at any other position.
The positioning device 110 may include an inertial measurement unit (IMU). Signals from the inertial measurement unit (IMU) can be utilized to complement position data. The IMU can measure a tilt or a small motion of the work vehicle 100. The data acquired by the IMU can be used to complement the position data based on the satellite signals, so as to improve the performance of positioning.
The work vehicle 100 shown in
The positioning device 110 may utilize the data acquired by the cameras 120 or the LiDAR sensor 140 for positioning. When objects serving as characteristic points exist in the environment that is traveled by the work vehicle 100, the position of the work vehicle 100 can be estimated with a high accuracy based on data that is acquired by the cameras 120 or the LiDAR sensor 140 and an environment map that is previously recorded in the storage. By correcting or complementing position data based on the satellite signals using the data acquired by the cameras 120 or the LiDAR sensor 140, it becomes possible to identify the position of the work vehicle 100 with a higher accuracy.
The work vehicle 100 further includes a plurality of obstacle sensors 130. In the example shown in
The prime mover 102 may be a diesel engine, for example. Instead of a diesel engine, an electric motor may be used. The transmission 103 can change the propulsion and the moving speed of the work vehicle 100 through a speed changing mechanism. The transmission 103 can also switch between forward travel and backward travel of the work vehicle 100.
The steering device 106 includes a steering wheel, a steering shaft connected to the steering wheel, and a power steering device to assist in the steering by the steering wheel. The front wheels 104F are the wheels responsible for steering, such that changing their angle of turn (also referred to as “steering angle”) can cause a change in the traveling direction of the work vehicle 100. The steering angle of the front wheels 104F can be changed by manipulating the steering wheel. The power steering device includes a hydraulic device or an electric motor to supply an assisting force to change the steering angle of the front wheels 104F. When automatic steering is performed, under the control of a controller disposed in the work vehicle 100, the steering angle may be automatically adjusted by the power of the hydraulic device or the electric motor.
A linkage device 108 is provided at the rear of the vehicle body 101. The linkage device 108 includes, e.g., a three-point linkage (also referred to as a “three-point link” or a “three-point hitch”), a PTO (Power Take Off) shaft, a universal joint, and a communication cable. The linkage device 108 allows the implement 300 to be attached to, or detached from, the work vehicle 100. The linkage device 108 is able to raise or lower the three-point link with a hydraulic device, for example, thus changing the position and/or attitude of the implement 300. Moreover, motive power can be sent from the work vehicle 100 to the implement 300 via the universal joint. While towing the implement 300, the work vehicle 100 allows the implement 300 to perform a predetermined task. The linkage device may be provided frontward of the vehicle body 101. In that case, the implement may be connected frontward of the work vehicle 100.
Although the implement 300 shown in
The work vehicle 100 shown in
In addition to the positioning device 110, the cameras 120, the obstacle sensors 130, the LiDAR sensor 140 and the operational terminal 200, the work vehicle 100 in the example of
The GNSS receiver 111 in the positioning device 110 receives satellite signals transmitted from a plurality of GNSS satellites and generates GNSS data based on the satellite signals. The GNSS data may be generated in a predetermined format such as, for example, the NMEA-0183 format. The GNSS data may include, for example, the identification number, the angle of elevation, the angle of direction, and a value representing the reception strength of each of the satellites from which the satellite signals are received. The reception strength may be expressed by a value of, for example, the carrier-to-noise power density ratio (C/N0) or the like. The GNSS data may also include positional information on the work vehicle 100 as calculated based on the plurality of received satellite signals, and also include information indicating the reliability of the positional information. The positional information may be expressed by, for example, the latitude, the longitude, the altitude above the mean sea level, and the like. The reliability of the positional information may be expressed by, for example, a DOP value, which indicates the positional arrangement of the satellites.
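The per-satellite fields named above (identification number/PRN, angle of elevation, angle of direction/azimuth, and C/N0) appear in the NMEA-0183 GSV sentence, and extracting them can be sketched as below. Checksum verification and partial final sentence groups are omitted for brevity.

```python
def parse_gsv(sentence: str):
    """Extract per-satellite fields from an NMEA-0183 GSV sentence.

    Returns a list of dicts with PRN, elevation (deg), azimuth (deg),
    and C/N0 (dB-Hz, None when the satellite is not being tracked).
    """
    body = sentence.split("*")[0].lstrip("$")  # drop "$" prefix and checksum
    fields = body.split(",")
    # fields[0:4] are the header: sentence ID, total messages, message
    # number, satellites in view; satellite groups of 4 fields follow.
    sats = []
    for i in range(4, len(fields) - 3, 4):
        prn, elev, azim, snr = fields[i:i + 4]
        sats.append({
            "prn": int(prn),
            "elevation_deg": int(elev),
            "azimuth_deg": int(azim),
            "cn0_dbhz": int(snr) if snr else None,
        })
    return sats
```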
The positioning device 110 shown in
Note that the positioning method is not limited to being performed by use of an RTK-GNSS; any arbitrary positioning method (e.g., an interferometric positioning method or a relative positioning method) that provides positional information with the necessary accuracy can be used. For example, positioning may be performed by utilizing a VRS (Virtual Reference Station) or a DGPS (Differential Global Positioning System). In the case where positional information with the necessary accuracy can be obtained without the use of the correction signal transmitted from the reference station 60, positional information may be generated without using the correction signal. In that case, the positioning device 110 does not need to include the RTK receiver 112.
The positioning device 110 in the present example embodiment further includes the IMU 115. The IMU 115 includes a 3-axis accelerometer and a 3-axis gyroscope. The IMU 115 may include a direction sensor such as a 3-axis geomagnetic sensor. The IMU 115 functions as a motion sensor which can output signals representing parameters such as acceleration, velocity, displacement, and attitude of the work vehicle 100. Based not only on the satellite signals and the correction signal but also on a signal that is output from the IMU 115, the processing circuit 116 can estimate the position and orientation of the work vehicle 100 with a higher accuracy. The signal that is output from the IMU 115 may be used for the correction or complementation of the position that is calculated based on the satellite signals and the correction signal. The IMU 115 outputs a signal more frequently than the GNSS receiver 111. Utilizing this signal that is output highly frequently, the processing circuit 116 allows the position and orientation of the work vehicle 100 to be measured more frequently (e.g., about 10 Hz or above). Instead of the IMU 115, a 3-axis accelerometer and a 3-axis gyroscope may be separately provided. The IMU 115 may be provided as a separate device from the positioning device 110.
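The complementation of low-rate GNSS fixes with high-rate motion data can be sketched as simple dead reckoning between fixes, where velocity (as IMU integration would provide) propagates the position and each new fix resets the propagated state. This is only an illustrative scheme; a production fusion would typically use a Kalman filter rather than a hard reset.

```python
def fuse(gnss_fixes, velocity_samples, dt):
    """Fill in positions between GNSS fixes by propagating velocity.

    gnss_fixes: {step_index: (x, y)} positions from the GNSS receiver
    velocity_samples: [(vx, vy), ...] one high-rate sample per step
    dt: step interval in seconds
    """
    positions = []
    x, y = gnss_fixes[0]
    for i, (vx, vy) in enumerate(velocity_samples):
        if i in gnss_fixes:
            x, y = gnss_fixes[i]          # a fresh fix overrides dead reckoning
        positions.append((x, y))
        x, y = x + vx * dt, y + vy * dt   # propagate to the next step
    return positions
```

Between the fixes at steps 0 and 4 in the test below, the position advances at the velocity rate, which is how the IMU's higher output rate yields position estimates at, e.g., 10 Hz or above.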
In the example of
The calculation of the position is not limited to being performed by the positioning device 110, and may be performed by any other device. For example, the controller 180 or an external computer may acquire output data from each of the receivers and each of the sensors as is required for positioning, and estimate the position of the work vehicle 100 based on such data.
The cameras 120 are imagers that image the surrounding environment of the work vehicle 100. Each of the cameras 120 includes an image sensor such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor), for example. In addition, each camera 120 may include an optical system including one or more lenses and a signal processing circuit. During travel of the work vehicle 100, the cameras 120 image the surrounding environment of the work vehicle 100, and generate image data (e.g., motion picture data). The cameras 120 are able to capture motion pictures at a frame rate of 3 frames/second (fps: frames per second) or greater, for example. The images generated by the cameras 120 may be used when a remote supervisor checks the surrounding environment of the work vehicle 100 with the terminal 400, for example. The images generated by the cameras 120 may also be used for the purpose of positioning and/or detection of obstacles. As shown in
The obstacle sensors 130 detect objects around the work vehicle 100. Each of the obstacle sensors 130 may include a laser scanner or an ultrasonic sonar, for example. When an object exists at a position within a predetermined distance from the obstacle sensor 130, the obstacle sensor 130 outputs a signal indicating the presence of the obstacle. The plurality of obstacle sensors 130 may be provided at different positions on the work vehicle 100. For example, a plurality of laser scanners and a plurality of ultrasonic sonars may be disposed at different positions on the work vehicle 100. Providing such a great number of obstacle sensors 130 can reduce blind spots in monitoring obstacles around the work vehicle 100.
The steering wheel sensor 152 measures the angle of rotation of the steering wheel of the work vehicle 100. The angle-of-turn sensor 154 measures the angle of turn of the front wheels 104F, which are the wheels responsible for steering. Measurement values by the steering wheel sensor 152 and the angle-of-turn sensor 154 are used for steering control by the controller 180.
The wheel axis sensor 156 measures the rotational speed, i.e., the number of revolutions per unit time, of a wheel axis that is connected to a tire 104. The wheel axis sensor 156 may be a sensor including a magnetoresistive element (MR), a Hall generator, or an electromagnetic pickup, for example. The wheel axis sensor 156 outputs a numerical value indicating the number of revolutions per minute (unit: rpm) of the wheel axis, for example. The wheel axis sensor 156 is used to measure the speed of the work vehicle 100.
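The conversion from the wheel-axis rotational speed reading to vehicle speed is a short worked calculation, assuming no wheel slip. The tire radius value is an illustrative assumption, not a parameter of the work vehicle 100.

```python
import math


def vehicle_speed_kmh(axle_rpm: float, tire_radius_m: float = 0.6) -> float:
    """Ground speed in km/h from axle revolutions per minute (no wheel slip)."""
    circumference_m = 2.0 * math.pi * tire_radius_m
    # rpm * circumference gives metres per minute; scale to km per hour.
    return axle_rpm * circumference_m * 60.0 / 1000.0
```

For example, 100 rpm on a 0.5 m radius tire corresponds to roughly 18.85 km/h.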
The drive device 240 includes various types of devices required to cause the work vehicle 100 to travel and to drive the implement 300; for example, the prime mover 102, the transmission 103, the steering device 106, the linkage device 108 and the like described above. The prime mover 102 may include an internal combustion engine such as, for example, a diesel engine. The drive device 240 may include an electric motor for traction instead of, or in addition to, the internal combustion engine.
The storage 170 includes one or more storage mediums such as a flash memory or a magnetic disc. The storage 170 stores various data that is generated by the positioning device 110, the cameras 120, the obstacle sensors 130, the sensors 150, and the controller 180. The data that is stored by the storage 170 may include map data on the environment that is traveled by the work vehicle 100 (may also be referred to as an “environment map”), and data on a target path for self-driving. The environment map and the target path may be generated by the controller 180 itself or generated by the processor 560 in the management computer 500. The storage 170 also stores a computer program(s) to cause each of the ECUs in the controller 180 to perform various operations (to be described later). Such a computer program(s) may be provided to the work vehicle 100 via a storage medium (e.g., a semiconductor memory, an optical disc, etc.) or through telecommunication lines (e.g., the Internet). Such a computer program(s) may be marketed as commercial software.
The controller 180 includes the plurality of ECUs. The plurality of ECUs may include, for example, the ECU 181 for speed control, the ECU 182 for steering control, the ECU 183 for implement control, the ECU 184 for self-driving control, and the ECU 185 for communication control. The ECU 181 controls the prime mover 102, the transmission 103 and brakes included in the drive device 240, thus controlling the speed of the work vehicle 100. The ECU 182 controls the hydraulic device or the electric motor included in the steering device 106 based on a measurement value of the steering wheel sensor 152, thus controlling the steering of the work vehicle 100. In order to cause the implement 300 to perform a desired operation, the ECU 183 controls the operation of the three-point link, the PTO shaft and the like that are included in the linkage device 108. Also, the ECU 183 generates a signal to control the operation of the implement 300, and transmits this signal from the communication device 190 to the implement 300. Based on signals which are output from the positioning device 110, the steering wheel sensor 152, the angle-of-turn sensor 154 and the wheel axis sensor 156, the ECU 184 performs computation and control for achieving self-driving. During self-driving, the ECU 184 sends the ECU 181 a command value to change the speed, and sends the ECU 182 a command value to change the steering angle. Based on the command value to change the speed, the ECU 181 controls the prime mover 102, the transmission 103, or the brakes to change the speed of the work vehicle 100. Based on the command value to change the steering angle, the ECU 182 controls the steering device 106 to change the steering angle. The ECU 185 controls communication with another device performed by the communication device 190. 
During travel of the work vehicle 100, the ECU 185 controls the communication device 190 to transmit, to the management computer 500, the GNSS data output from the GNSS receiver 111 and the image data output from the cameras 120. The ECU 185 may transmit only a portion of the GNSS data, instead of the entirety thereof, to the management computer 500. Among the GNSS data, for example, only information indicating the reception strength of each satellite signal or only information indicating the reliability of positioning may be transmitted to the management computer 500. The data to be transmitted to the management computer 500 may include information indicating the estimated position of the work vehicle 100 as calculated by the processing circuit 116.
Through the action of these ECUs, the controller 180 realizes self-driving and communication with another device. During self-driving, the controller 180 controls the drive device 240 based on the position of the work vehicle 100 as measured or estimated by the positioning device 110 and the target path stored in the storage 170. As a result, the controller 180 can cause the work vehicle 100 to travel along the target path.
The plurality of ECUs included in the controller 180 may communicate with one another according to a vehicle bus standard such as, for example, a CAN (Controller Area Network). Instead of the CAN, faster communication methods may be used, e.g., Automotive Ethernet (registered trademark). Although the ECUs 181 to 185 are illustrated as individual corresponding blocks in
The communication device 190 is a device including an antenna and a circuit to communicate with the implement 300, the management computer 500 and the terminal 400. The communication device 190 includes circuitry to perform exchanges of signals complying with an ISOBUS standard such as ISOBUS-TIM, for example, between itself and the communication device 390 of the implement 300. This allows the implement 300 to perform a desired operation, or allows information to be acquired from the implement 300. The communication device 190 may further include an antenna and a communication circuit to exchange signals via the network 80 between the communication device 490 of the terminal 400 and the communication device 590 of the management computer 500. The network 80 may include a 3G, 4G, 5G, or any other cellular mobile communications network and the Internet, for example. The communication device 190 may have the function of communicating with a mobile terminal that is used by a supervisor who is situated near the work vehicle 100. With such a mobile terminal, communication may be performed based on any arbitrary wireless communication standard, e.g., Wi-Fi (registered trademark), 3G, 4G, 5G or any other cellular mobile communication standard, or Bluetooth (registered trademark).
The buzzer 220 is an audio output device to present an alarm sound to alert the user of an abnormality. For example, the buzzer 220 may present an alarm sound when an obstacle is detected during self-driving. The buzzer 220 is controlled by the controller 180.
The operational terminal 200 is a terminal for the user to perform a manipulation related to the travel of the work vehicle 100 and the operation of the implement 300, and may also be referred to as a virtual terminal (VT). The operational terminal 200 may include a display such as a touch screen panel, and/or one or more buttons. The display may be a display such as a liquid crystal display or an organic light-emitting diode (OLED) display, for example. By manipulating the operational terminal 200, the user can perform various manipulations, such as, for example, switching ON/OFF the self-driving mode, setting a target path, recording or editing an environment map, and switching ON/OFF the implement 300. At least a portion of these manipulations may also be realized by manipulating the operation switches 210. The operational terminal 200 may be configured so as to be detachable from the work vehicle 100. A user who is at a remote place from the work vehicle 100 may manipulate the detached operational terminal 200 to control the operation of the work vehicle 100. Instead of the operational terminal 200, the user may manipulate a computer on which necessary application software is installed, for example, the terminal 400, to control the operation of the work vehicle 100.
The drive device 340 in the implement 300 performs a necessary operation for the implement 300 to perform a predetermined task. The drive device 340 includes a device adapted to the intended use of the implement 300, e.g., a hydraulic device, an electric motor, or a pump. The controller 380 controls the operation of the drive device 340. In response to a signal that is transmitted from the work vehicle 100 via the communication device 390, the controller 380 causes the drive device 340 to perform various operations. Moreover, a signal that is in accordance with the state of the implement 300 may be transmitted from the communication device 390 to the work vehicle 100.
The input device 420 in the terminal 400 is a device that accepts input operations from the user. The input device 420 may include a mouse, a keyboard, or one or more buttons or switches, for example. The display 430 may be a display such as a liquid crystal display or an OLED display, for example. The input device 420 and the display 430 may be implemented as a touch screen panel. The storage 450 includes one or more storage mediums, such as a semiconductor storage medium and a magnetic storage medium. The storage 450 stores a computer program(s) to be executed by the processor 460 and various data that is generated by the processor 460. The processor 460 operates by executing the computer program(s) stored in the storage 450. In response to the manipulation made by the user via the input device 420, the processor 460 causes the display 430 to display an image based on visualized data generated by the management computer 500.
The storage 550 in the management computer 500 includes one or more storage mediums such as a semiconductor storage medium and a magnetic storage medium. The storage 550 stores a computer program(s) to be executed by the processor 560 and various data generated by the processor 560. The storage 550 stores information included in GNSS data output from the GNSS receiver 111 and image data output from the cameras 120 in association with each other. For example, the storage 550 may store a database that associates information indicating the reception levels of the satellite signals included in the GNSS data, the image data, and the time/date of acquisition of the information and the image data. The database may further include information on the identification number of the work vehicle 100.
The processor 560 operates by executing the computer program(s) stored in the storage 550. The processor 560 can refer to the database stored in the storage 550 and thus generate visualized data representing the surrounding environment of the work vehicle 100.
Now, an example operation of the work vehicle 100 will be described.
Now, an example control by the controller 180 during self-driving will be described.
In the example shown in
Hereinafter, with reference to
As shown in
As shown in
As shown in
As shown in
For the steering control and speed control of the work vehicle 100, control techniques such as PID control or MPC (Model Predictive Control) may be applied. Applying these control techniques makes the control of bringing the work vehicle 100 closer to the target path P smoother.
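Of the two techniques, the PID case can be sketched as below: the cross-track error from the target path drives a steering-angle command. The gains and the angle limit are illustrative assumptions for the sketch, not tuned values for any particular vehicle.

```python
class PIDSteering:
    """PID controller mapping cross-track error (m) to a steering angle (deg)."""

    def __init__(self, kp=1.2, ki=0.05, kd=0.3, max_angle_deg=35.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.max_angle = max_angle_deg
        self.integral = 0.0
        self.prev_error = None

    def command(self, cross_track_error_m: float, dt: float) -> float:
        """Return a steering-angle command, clamped to the mechanical limit."""
        self.integral += cross_track_error_m * dt
        derivative = 0.0
        if self.prev_error is not None:
            derivative = (cross_track_error_m - self.prev_error) / dt
        self.prev_error = cross_track_error_m
        u = (self.kp * cross_track_error_m
             + self.ki * self.integral
             + self.kd * derivative)
        return max(-self.max_angle, min(self.max_angle, u))
```

The clamp corresponds to the physical limit of the front wheels' angle of turn; the integral and derivative terms are what smooth the approach to the target path compared with proportional-only control.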
Note that, when an obstacle is detected by one or more obstacle sensors 130 during travel, the controller 180 stops the work vehicle 100. Alternatively, when an obstacle is detected, the controller 180 may control the drive device 240 so as to avoid the obstacle. Based on the data output from the LiDAR sensor 140, the controller 180 is able to detect an object located at a relatively distant position from the work vehicle 100 (e.g., another vehicle, a pedestrian, etc.). By performing the speed control and steering control so as to avoid the detected object, the controller 180 can achieve self-traveling on public roads.
Now, an example of a process of recording travel log data and generating visualized data will be described.
In the present example embodiment, the work vehicle 100 can automatically travel in an unmanned manner inside and outside the field. In the storage 170, information on the environment map including the field and public roads outside the field and on the target path is previously recorded. The environment map and the target path are generated by, for example, the controller 180 or the processor 560 of the management computer 500. In the case of traveling on a public road, the work vehicle 100 travels along the target path while sensing the surroundings thereof by use of the sensors such as the cameras 120 and the LiDAR sensor 140, with the implement 300 being raised.
During self-traveling of the work vehicle 100, the ECU 185 in the controller 180 transmits data output from the positioning device 110 and image data output from the cameras 120 to the management computer 500 as travel log data. The processor 560 of the management computer 500 records the received travel log data in the storage 550.
Based on the travel log data, the processor 560 generates image data representing the surrounding environment of the work vehicle 100 as visualized data. The processor 560 generates the visualized data based on a predetermined condition. For example, the processor 560 may generate the visualized data when determining that reception interference has occurred. For example, the processor 560 can determine whether reception interference of a satellite signal is occurring or not based on the reception strength 83 or the positioning reliability 84 included in the travel log data shown in
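The determination step can be sketched as a threshold test on the quantities named above: reception interference is deemed to occur when even the strongest C/N0 falls below a floor, or when the DOP value indicating poor satellite geometry exceeds a ceiling. The threshold values are illustrative assumptions.

```python
def interference_occurring(cn0_values_dbhz, dop,
                           cn0_floor=30.0, dop_ceiling=4.0) -> bool:
    """Decide whether reception interference of a satellite signal is occurring.

    cn0_values_dbhz: per-satellite reception strengths (None when untracked)
    dop: dilution-of-precision value reflecting satellite geometry
    """
    tracked = [v for v in cn0_values_dbhz if v is not None]
    if not tracked or max(tracked) < cn0_floor:
        return True               # all signals weak, or none tracked at all
    return dop > dop_ceiling      # signals present but geometry is poor
```

When this returns True for a travel-log record, the processor 560 would generate the corresponding visualized data.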
An obstacle hindering the reception of a satellite signal may exist in the environment in which the work vehicle 100 travels. Particularly when the work vehicle 100 travels on the road 76, reception of the satellite signal is often hindered by such an obstacle. The obstacle may be, for example, a tree along the road, a large vehicle, a high-rise building, or the like. Some of these obstacles may not have been present at the time when the target path was generated. For example, a plant such as a tree grows over time. Such a plant, even if it did not cause reception interference at the time the target path was generated, may come to cause reception interference as it grows.
In such a case, the processor 560 of the management computer 500 detects that reception interference is occurring based on information, included in the log data acquired from the work vehicle 100, indicating the receiving states of the satellite signals. For example, the processor 560 can detect that reception interference is occurring based on the value of the reception strength of each of the satellite signals.
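The threshold-based detection described above can be illustrated with a minimal sketch. The field names ("reception_strength"), the C/N0 threshold value, and the minimum satellite count below are illustrative assumptions and are not specified by the disclosure.

```python
# Hypothetical sketch: detecting reception interference from a travel
# log entry by counting satellite signals above a strength threshold.
# The 30 dB-Hz threshold and the 4-satellite minimum are assumptions.
INTERFERENCE_THRESHOLD_DBHZ = 30.0

def interference_detected(log_entry: dict, min_healthy_satellites: int = 4) -> bool:
    """Return True if too few satellite signals exceed the strength threshold."""
    strengths = log_entry["reception_strength"]  # e.g. {sat_id: C/N0 in dB-Hz}
    healthy = [s for s in strengths.values() if s >= INTERFERENCE_THRESHOLD_DBHZ]
    return len(healthy) < min_healthy_satellites
```

A real implementation might instead use the positioning reliability reported by the positioning device, or combine both indicators.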
In the case where reception interference is occurring, the processor 560 generates visualized data including information indicating the occurrence of the reception interference. For example, the processor 560 can generate, as visualized data, a message indicating that the reception strength is low and image data including information on the positions of the satellites at that moment. The information on the positions of the satellites is generated based on satellite position data acquired from the streaming server 600. The satellite position data includes information on an orbit of each of the satellites. Based on the estimated position of the work vehicle 100 and the orbit of each satellite, the processor 560 can calculate the direction (e.g., the angle of elevation and the angle of direction) of each satellite as seen from the estimated position of the work vehicle 100. Based on the calculated direction of each satellite, the processor 560 can generate visualized data representing an image including a mark that shows in which direction each satellite is located. The processor 560 transmits the visualized data from the communication device 590 to the terminal 400 in response to, for example, a request from the terminal 400. The display 430 of the terminal 400 displays an image based on the received visualized data.
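The calculation of each satellite's direction (angle of elevation and angle of direction, i.e. azimuth) from the vehicle's estimated position and the satellite's orbit can be sketched as follows. This sketch assumes both positions are already available as Earth-centered Earth-fixed (ECEF) coordinates and uses a spherical-Earth approximation for the local frame; neither assumption comes from the disclosure, which does not specify the computation.

```python
import math

def ecef_to_elevation_azimuth(rx, sat):
    """Elevation and azimuth (degrees) of a satellite as seen from a
    receiver, both given as ECEF (x, y, z) coordinates in meters."""
    x, y, z = rx
    # Geocentric longitude/latitude of the receiver (spherical-Earth
    # approximation; a production implementation would use geodetic latitude).
    lon = math.atan2(y, x)
    lat = math.atan2(z, math.hypot(x, y))
    # Line-of-sight vector from receiver to satellite, rotated into the
    # local East-North-Up (ENU) frame at the receiver.
    dx, dy, dz = (sat[i] - rx[i] for i in range(3))
    east = -math.sin(lon) * dx + math.cos(lon) * dy
    north = (-math.sin(lat) * math.cos(lon) * dx
             - math.sin(lat) * math.sin(lon) * dy
             + math.cos(lat) * dz)
    up = (math.cos(lat) * math.cos(lon) * dx
          + math.cos(lat) * math.sin(lon) * dy
          + math.sin(lat) * dz)
    elevation = math.degrees(math.atan2(up, math.hypot(east, north)))
    azimuth = math.degrees(math.atan2(east, north)) % 360.0
    return elevation, azimuth
```

For example, a satellite directly above a receiver on the equator yields an elevation of 90 degrees; a satellite due north on the horizon yields an azimuth of 0 degrees. The resulting angles could then be used to place the satellite marks in the visualized image.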
The image shown in
The management computer 500 may transmit, to the terminal 400, first visualized data representing an image or the like of a site (position) where the value of the reception strength of the satellite signal is decreased to a predetermined value or lower and second visualized data representing an image or the like of the site captured in the past. The display 430 of the terminal 400 may display the image based on the first visualized data and the image based on the second visualized data side by side.
The above-described operation allows the user to look at the image based on the visualized data and check whether or not reception interference is occurring, as well as the surrounding environment of the work vehicle 100 and the positions of the satellites at that moment.
In the example shown in
As in the example shown in
The processor 560 may store, in the storage 550, log data for only a portion of the period during which the work vehicle 100 is automatically traveling, the portion including a period when reception interference is occurring. In this case, the processor 560 generates visualized data only for that portion of the period. Such an operation can reduce or minimize the amount of data to be recorded in the storage 550.
The processor 560 may be configured or programmed to output an alert signal when recognizing an object that causes reception interference in the image captured by the camera 120. In the example shown in
In the above-described example embodiment, image data generated by the cameras 120 is used as sensing data. Instead of, or in addition to, the image data, point cloud data output from the LiDAR sensor 140 may be used as the sensing data. The point cloud data represents a three-dimensional distribution of measurement points in the surrounding environment of the work vehicle 100. Therefore, the processor 560 of the management computer 500 can generate visualized data representing the distribution of objects existing around the work vehicle 100 based on the point cloud data.
In the above-described example embodiment, the terminal 400 acquires visualized data and displays the acquired visualized data. The management system according to an example embodiment of the present disclosure is not limited to having such a configuration. Alternatively, for example, the communication device 190 of the work vehicle 100 may acquire the visualized data and the operational terminal 200 may display the acquired visualized data.
At least a portion of the operations executed by the management computer 500 according to the above-described example embodiment may be executed by the work vehicle 100 or the terminal 400. For example, the controller 180 in the work vehicle 100 or the processor 460 in the terminal 400 may generate visualized data based on log data. In this case, the work vehicle 100 or the terminal 400 acts as a management system according to an example embodiment of the present disclosure.
Although the work vehicle 100 according to each of the above-described example embodiments is a tractor, the techniques according to each example embodiment are also applicable to vehicles other than tractors, as well as to agricultural machines other than vehicles. For example, the techniques according to each example embodiment may also be applied to harvesters, rice transplanters, vehicles for crop management, vegetable transplanters, mowers, mobile robots for agriculture, or other agricultural machines.
A device for performing self-driving control, transmission of log data, or generation of visualized data according to each of the above example embodiments can be mounted, as an add-on, on an agricultural machine lacking such functions. Such a device may be manufactured and marketed independently from the agricultural machine. A computer program for use in such a device may also be manufactured and marketed independently from the agricultural machine. The computer program may be provided in a form stored in a non-transitory computer-readable storage medium, for example. The computer program may also be provided through downloading via telecommunication lines (e.g., the Internet).
The techniques according to example embodiments of the present disclosure are applicable to management systems for agricultural machines, such as tractors, harvesters, rice transplanters, vehicles for crop management, vegetable transplanters, mowers, seeders, spreaders, or agricultural robots, for example.
While example embodiments of the present invention have been described above, it is to be understood that variations and modifications will be apparent to those skilled in the art without departing from the scope and spirit of the present invention. The scope of the present invention, therefore, is to be determined solely by the following claims.
Number | Date | Country | Kind
---|---|---|---
2021-107444 | Jun 2021 | JP | national
This application claims the benefit of priority to Japanese Patent Application No. 2021-107444 filed on Jun. 29, 2021 and is a Continuation Application of PCT Application No. PCT/JP2022/013989 filed on Mar. 24, 2022. The entire contents of each application are hereby incorporated herein by reference.
Relation | Number | Date | Country
---|---|---|---
Parent | PCT/JP2022/013989 | Mar 2022 | US
Child | 18394074 | | US