The present invention relates to systems and methods which can be used to register routes that can be used by a plurality of vehicles to perform a variety of tasks. More specifically, the present invention relates to systems and methods which can be used to register routes for autonomous vehicles.
Conventional systems and methods for controlling autonomous vehicles typically require extensive setup and calibration time. Further, the routing of autonomous vehicles in conventional systems often must be performed using a cumbersome programming process in which a route can only be used by the specific autonomous vehicle with which the route was registered. Accordingly, conventional systems and methods for controlling autonomous vehicles are inefficient and difficult to operate.
Preferred embodiments of the present invention provide systems and methods to register routes that can be used by autonomous vehicles to perform a variety of tasks.
A method according to a preferred embodiment of the present invention includes registering a work vehicle with a work vehicle system, recording a new route using the work vehicle, saving the new route as a registered route in the work vehicle system, selecting a particular work vehicle to follow the registered route, the particular work vehicle being the work vehicle used to record the new route or another work vehicle different from the work vehicle used to record the new route, selecting a particular task for the particular work vehicle to perform when the particular work vehicle follows the registered route, and displaying results related to the particular task performed by the particular work vehicle.
In a preferred embodiment of the present invention, the method further includes displaying status information of the work vehicle before the work vehicle is registered with the work vehicle system.
In a preferred embodiment of the present invention, the method further includes providing the work vehicle access to the work vehicle system when the work vehicle has been registered in the work vehicle system, receiving status information from the work vehicle after the work vehicle has been provided access to the work vehicle system, and displaying the status information of the work vehicle.
In a preferred embodiment of the present invention, the particular work vehicle is the another work vehicle different from the work vehicle used to record the new route.
In a preferred embodiment of the present invention, the method further includes displaying status information of the particular work vehicle before the particular work vehicle is selected to follow the registered route.
In a preferred embodiment of the present invention, the displaying the results related to the particular task performed by the particular work vehicle includes displaying a screen that includes an image of an agricultural item before the particular task was performed by the particular work vehicle and an image of the agricultural item after the particular task was performed by the particular work vehicle.
In a preferred embodiment of the present invention, the screen does not include information regarding the registered route.
In a preferred embodiment of the present invention, the selecting the particular work vehicle and the selecting the particular task are performed using a user interface, and the user interface displays a screen that allows a user to select the particular work vehicle from among a plurality of work vehicles and select the particular task from among a plurality of tasks.
In a preferred embodiment of the present invention, the selecting the particular task includes selecting the particular task from a list of tasks that the particular work vehicle is able to perform.
In a preferred embodiment of the present invention, the registered route includes one or more task points at which the particular work vehicle is to perform the particular task.
In a preferred embodiment of the present invention, the method further includes autonomously controlling the particular work vehicle to follow the registered route and perform the particular task at each of the one or more task points included in the registered route.
In a preferred embodiment of the present invention, the method further includes capturing image data of an agricultural item when the particular work vehicle is positioned at the one or more task points, and the results include the image data.
In a preferred embodiment of the present invention, the recording the new route includes receiving an instruction to add the one or more task points to the new route, and when the instruction to add the one or more task points to the new route is received, a position of the work vehicle is saved as a location of the one or more task points of the new route.
In a preferred embodiment of the present invention, the method further includes displaying an agricultural field map including a position of the work vehicle, an area surrounding the work vehicle, and the location of the one or more task points when the instruction to add the one or more task points to the new route is received.
In a preferred embodiment of the present invention, the recording the new route using the work vehicle includes receiving an instruction to start recording the new route, controlling the work vehicle to record the new route, and receiving an instruction to stop recording the new route.
In a preferred embodiment of the present invention, the method further includes displaying an agricultural field map including a position of the work vehicle, an area surrounding the work vehicle, and a location of a start point of the new route when the instruction to start recording the new route is received, and when the instruction to start recording the new route is received, a position of the work vehicle is saved as the location of the start point of the new route.
In a preferred embodiment of the present invention, the controlling the work vehicle to record the new route includes manually, remotely, or autonomously controlling the work vehicle.
In a preferred embodiment of the present invention, the method further includes displaying at least one of an agricultural field map or a live stream when the new route is being recorded. The agricultural field map includes a position of the work vehicle and an area surrounding the work vehicle, and the live stream is obtained using one or more cameras attached to the work vehicle.
A work vehicle system according to a preferred embodiment of the present invention includes a user interface including an input to receive one or more inputs from a user, a display, and a processor operatively connected to the input and the display. The processor is configured or programmed to register a work vehicle with the work vehicle system; record a new route using the work vehicle; save the new route as a registered route in the work vehicle system; select, based on the one or more inputs received by the input, a particular work vehicle to follow the registered route, the particular work vehicle being the work vehicle used to record the new route or another work vehicle different from the work vehicle used to record the new route; select, based on the one or more inputs received by the input, a particular task for the particular work vehicle to perform when the particular work vehicle follows the registered route; and control the display to display results related to the particular task performed by the particular work vehicle.
The above and other features, elements, steps, configurations, characteristics, and advantages of the present invention will become more apparent from the following detailed description of preferred embodiments of the present invention with reference to the attached drawings.
The work vehicle 100 has a self-driving function. In other words, the work vehicle 100 can travel by the action of a controller, rather than manually. As discussed in more detail below, a controller according to the present preferred embodiment is provided inside the work vehicle 100, and is able to control the speed, steering and/or other functions or operations of the work vehicle 100. A portion or an entirety of the controller may reside outside the work vehicle. For example, control signals, commands, data, etc., may be communicated between the work vehicle 100 and a controller residing outside the work vehicle 100.
The work vehicle 100 that performs self-driving may move autonomously while sensing the surrounding environment, without any person being involved in the controlling of the movement or other operations of the work vehicle 100, and can perform autonomous movement to travel within a field or outside a field (e.g., on roads) in an unmanned manner. During autonomous movement, operations of detecting and avoiding obstacles may be performed. As discussed in more detail below, the work vehicle 100 that performs self-driving can also have the function of moving partly based on the user's instructions. For example, the work vehicle 100 can operate not only in a self-driving mode but also in a manual driving mode, where the work vehicle moves through manual operations of the user/driver.
The work vehicle 100 includes a positioning device 110 such as a GNSS receiver. Based on the position of the work vehicle 100 as identified by the positioning device 110 and a target path previously stored in a storage device, the controller causes the work vehicle 100 to automatically travel. In addition to controlling the travel of the work vehicle 100, the controller can also control the operation of the implement. As a result, while automatically traveling, the work vehicle 100 can perform a task or work using the implement.
The terminal 400 may be a mobile apparatus such as a smartphone, a tablet computer, or a remote control, or a stationary computer such as a desktop personal computer (PC). The terminal 400 can be used by a user 10 who is in a field in which the work vehicle 100 performs agricultural work, or at a location remote from the field in which the work vehicle 100 performs agricultural work. In response to a manipulation by the user 10, the terminal 400 can transmit command signals to the work vehicle 100.
In a preferred embodiment of the present invention, the work vehicle 100 includes a processing unit (also referred to as a “processor” or “path generating device”) to generate a target path along which the work vehicle 100 moves. The path generating device generates the target path P along which the work vehicle 100 travels when performing tasks within the field. Based on the information entered by the user and map information stored in the storage device, the path generating device generates the target path P. The controller controls a drive device (e.g., a steering device, a transmission, and a power unit) of the work vehicle 100 along the generated target path P. As a result, the work vehicle 100 automatically moves along the target path P.
Hereinafter, more specific examples of the configuration and operation of a system according to the present preferred embodiment will be described.
As shown in
The work vehicle 100 shown in
The work vehicle 100 further includes the positioning device 110. The positioning device 110 can include a GNSS receiver. The GNSS receiver includes an antenna to receive a signal(s) from a GNSS satellite(s) and a processing circuit to determine the position of the work vehicle 100 based on the signal(s) received by the antenna. The positioning device 110 receives a GNSS signal(s) transmitted from a GNSS satellite(s), and performs positioning on the basis of the GNSS signal(s). GNSS is a general term for satellite positioning systems, such as GPS (Global Positioning System), QZSS (Quasi-Zenith Satellite System, e.g., MICHIBIKI), GLONASS, Galileo, BeiDou, and the like. Although the positioning device 110 in the present preferred embodiment is disposed above the cabin 105, it may be disposed at any other position.
Instead of or in addition to the GNSS receiver, the positioning device 110 may include any other type of device, such as a LiDAR sensor 135. Additionally, the positioning device 110 may utilize the data acquired by the cameras 120 for positioning. When objects serving as characteristic points exist in the environment that is traveled by the work vehicle 100, the position of the work vehicle 100 can be estimated with a high accuracy based on data that is acquired with the LiDAR sensor 135 or cameras 120 and an environment map that is previously recorded in a storage device. The LiDAR sensor 135 or cameras 120 may be used together with the GNSS receiver. By correcting or complementing position data based on the GNSS signal(s) using the data acquired by the LiDAR sensor 135 or cameras 120, it becomes possible to identify the position of the work vehicle 100 with a higher accuracy. Furthermore, the positioning device 110 may complement the position data by using a signal from an inertial measurement unit (IMU). The IMU can measure tilts and minute motions of the work vehicle 100. By complementing the position data based on the GNSS signal using the data acquired by the IMU, the positioning performance can be improved.
In a preferred embodiment of the present invention, the work vehicle 100 further includes a plurality of obstacle sensors 130. In the example shown in
The positioning device 110, the cameras 120, the obstacle sensors 130, and the LiDAR sensor 135 may be disposed at other positions on the work vehicle 100, and the work vehicle 100 can include any combination of the positioning device 110, the cameras 120, the obstacle sensors 130, and the LiDAR sensor 135.
In a preferred embodiment of the present invention, a solar panel 145 may be provided at the top or any other suitable location of the work vehicle 100 to generate electrical energy to be stored in a battery of the work vehicle 100. The solar-powered electrical energy can be used to drive various electrical systems and components of the work vehicle 100, including an electric motor, if one is included.
The prime mover 102 may be a diesel engine, for example. Instead of a diesel engine, an electric motor may be used. The transmission 103 can change the propulsion and the moving speed of the work vehicle 100 through a speed changing mechanism. The transmission 103 can also switch between forward travel and backward travel of the work vehicle 100.
The steering device 106 includes a steering wheel, a steering shaft connected to the steering wheel, and a power steering device to assist in the steering by the steering wheel. The front wheels 104F are the wheels responsible for steering, such that changing their angle of turn (also referred to as “steering angle”) can cause a change in the traveling direction of the work vehicle 100. The steering angle of the front wheels 104F can be changed by manipulating the steering wheel. The power steering device includes a hydraulic device or an electric motor to supply an assisting force for changing the steering angle of the front wheels 104F. When automatic steering is performed, under the control of a controller disposed in the work vehicle 100, the steering angle may be automatically adjusted by the power of the hydraulic device or electric motor.
A linkage device 108 is provided at the rear of the vehicle body 101. The linkage device 108 may include, e.g., a three-point linkage (also referred to as a “three-point link” or a “three-point hitch”), a PTO (Power Take Off) shaft, a universal joint, and a communication cable. The linkage device 108 allows the implement 300 to be attached to or detached from the work vehicle 100. The linkage device 108 is able to raise or lower the three-point link with a hydraulic device, for example, thus changing the position or attitude of the implement 300. Moreover, motive power can be sent from the work vehicle 100 to the implement 300 via the universal joint. While towing the implement 300, the work vehicle 100 allows the implement 300 to perform a predetermined task. The linkage device may be provided frontward of the vehicle body 101. In that case, the implement may be connected frontward of the work vehicle 100.
Although the implement 300 shown in
The work vehicle 100 shown in
In addition to the positioning device 110, the cameras 120, the obstacle sensors 130, the LiDAR sensor 135, and the operational terminal 200, the work vehicle 100 in the example of
The positioning device 110 shown in
Note that the positioning method is not limited to an RTK-GNSS, and any arbitrary positioning method (e.g., an interferometric positioning method or a relative positioning method) that provides positional information with the necessary accuracy can be used. For example, positioning may be performed by utilizing a VRS (Virtual Reference Station) or a DGPS (Differential Global Positioning System). In the case where positional information with the necessary accuracy can be obtained without the use of the correction signal transmitted from the reference station 60, positional information may be generated without using the correction signal. In that case, the positioning device 110 may lack the RTK receiver 112.
The positioning device 110 in the present preferred embodiment further includes an IMU 115. The IMU 115 includes a 3-axis accelerometer and a 3-axis gyroscope. The IMU 115 may include a direction sensor such as a 3-axis geomagnetic sensor. The IMU 115 functions as a motion sensor which can output signals representing parameters such as acceleration, velocity, displacement, and attitude of the work vehicle 100. Based not only on the GNSS signals and the correction signal but also on a signal that is output from the IMU 115, the positioning device 110 can estimate the position and orientation of the work vehicle 100 with a higher accuracy. The signal that is output from the IMU 115 may be used for the correction or complementation of the position that is calculated based on the GNSS signals and the correction signal. The IMU 115 outputs signals more frequently than GNSS signals are received. Utilizing these higher-frequency signals allows the position and orientation of the work vehicle 100 to be measured more frequently (e.g., at about 10 Hz or above). Instead of the IMU 115, a 3-axis accelerometer and a 3-axis gyroscope may be separately provided. The IMU 115 may be provided as a separate device from the positioning device 110.
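For illustration only, the following Python sketch shows one simple way in which higher-rate IMU data could complement lower-rate GNSS fixes as described above. The one-dimensional state, the class name, and the blending weight are hypothetical assumptions made for brevity, not details taken from this specification.

```python
class ComplementaryPositionEstimator:
    """Dead-reckons with IMU data between GNSS fixes, then blends each
    GNSS fix back in to bound the accumulated drift (1-D for brevity)."""

    def __init__(self, blend: float = 0.9):
        self.position = 0.0   # estimated position (m)
        self.velocity = 0.0   # estimated velocity (m/s)
        self.blend = blend    # weight given to each GNSS fix

    def on_imu(self, accel: float, dt: float) -> None:
        # IMU updates arrive frequently (e.g., at about 10 Hz or above).
        self.velocity += accel * dt
        self.position += self.velocity * dt

    def on_gnss(self, gnss_position: float) -> None:
        # GNSS fixes arrive less frequently; correct the drifted estimate.
        self.position = (self.blend * gnss_position
                         + (1.0 - self.blend) * self.position)
```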
In addition to or instead of the GNSS receiver 111, the RTK receiver 112, and the IMU 115, the positioning device 110 may include other kinds of sensors, e.g., LiDAR sensors or image sensors. Depending on the environment that is traveled by the work vehicle 100, it is possible to estimate the position and orientation of the work vehicle 100 with a high accuracy based on data from such sensors.
In the example shown in
In addition, each camera 120 is an imager that images the surrounding environment of the work vehicle 100, and includes image sensors, an optical system including one or more lenses, and a signal processing circuit. During travel of the work vehicle 100, the cameras 120 can image the surrounding environment of the work vehicle 100, and generate image data (e.g., motion pictures). The images generated by the cameras 120 may be used when a remote supervisor checks the surrounding environment of the work vehicle 100 with the terminal 400, for example. The images generated by the cameras 120 may also be used for the purpose of positioning or obstacle detection. As shown in
The obstacle sensors 130 detect objects around the work vehicle 100. Each obstacle sensor 130 may include a laser scanner or an ultrasonic sonar, for example. When an object exists at a position closer to the obstacle sensor 130 than a predetermined distance, the obstacle sensor 130 outputs a signal indicating the presence of an obstacle. A plurality of obstacle sensors 130 may be provided at different positions of the work vehicle 100. For example, a plurality of laser scanners and a plurality of ultrasonic sonars may be disposed at different positions of the work vehicle 100. Providing a multitude of obstacle sensors 130 can reduce blind spots in monitoring obstacles around the work vehicle 100.
The drive device 140 includes various devices that are needed for the traveling of the work vehicle 100 and the driving of the implement 300, e.g., the aforementioned prime mover 102, transmission 103, steering device 106, and linkage device 108. The prime mover 102 may include an internal combustion engine such as a diesel engine. Instead of an internal combustion engine or in addition to an internal combustion engine, the drive device 140 may include one or more electric motors that are dedicated to traction and steering purposes.
The steering wheel sensor 152 measures the angle of rotation of the steering wheel of the work vehicle 100. The angle-of-turn sensor 154 measures the angle of turn of the front wheels 104F, which are the wheels responsible for steering. Measurement values by the steering wheel sensor 152 and the angle-of-turn sensor 154 are used for steering control by the controller 180.
The wheel axis sensor 156 measures the rotational speed, i.e., the number of revolutions per unit time, of a wheel axis that is connected to a tire 104. The wheel axis sensor 156 may include a sensor utilizing a magnetoresistive element (MR), a Hall generator, or an electromagnetic pickup, for example. The wheel axis sensor 156 may output a numerical value indicating the number of revolutions per minute (unit: rpm) of the wheel axis, for example. The wheel axis sensor 156 is used to measure the speed of the work vehicle 100.
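As a non-limiting illustration of the relationship between the measured number of revolutions and the vehicle speed, the following Python sketch converts a wheel-axis rotational speed in rpm to a ground speed. The function name and the example tire radius are assumptions made for this sketch, and the computation assumes no wheel slip.

```python
import math

def vehicle_speed_mps(wheel_rpm: float, tire_radius_m: float) -> float:
    """Convert a wheel-axis rotational speed (rpm), such as the value
    output by the wheel axis sensor 156, into a ground speed in m/s:
    one revolution advances the vehicle by one tire circumference."""
    return wheel_rpm * 2.0 * math.pi * tire_radius_m / 60.0

# Example: 30 rpm on a tire with a 0.6 m radius gives about 1.88 m/s.
print(vehicle_speed_mps(30.0, 0.6))
```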
The storage device 170 includes one or more storage media such as a flash memory or a magnetic disc. The storage device 170 stores various data generated by the positioning device 110, the cameras 120, the obstacle sensors 130, the sensors 150, and the controller 180. The data that is stored by the storage device 170 may include map data of the environment that is traveled by the work vehicle 100 and data of a target path for use during self-driving. The storage device 170 also stores a computer program(s) to cause the ECUs in the controller 180 to perform various operations (to be described later). Such a computer program(s) may be provided for the work vehicle 100 via a storage medium (e.g., a semiconductor memory or an optical disc) or through telecommunication lines (e.g., the Internet). Such a computer program(s) may be marketed as commercial software.
The controller 180 includes a plurality of ECUs. The plurality of ECUs may include, for example, an ECU 181 for speed control, an ECU 182 for steering control, an ECU 183 for implement control, an ECU 184 for self-driving control, and an ECU 185 for target path generation. The ECU 181 controls the prime mover 102, the transmission 103, and the brakes included in the drive device 140, thus controlling the speed of the work vehicle 100. The ECU 182 controls the hydraulic device or electric motor included in the steering device 106 based on a measurement value of the steering wheel sensor 152, thus controlling the steering of the work vehicle 100. In order to cause the implement 300 to perform a desired operation, the ECU 183 controls the operation of the three-point link, the PTO shaft, etc., that are included in the linkage device 108. Also, the ECU 183 generates a signal to control the operation of the implement 300, and transmits this signal from the communicator 190 to the implement 300. Based on signals which are output from the positioning device 110, the steering wheel sensor 152, the angle-of-turn sensor 154, and the wheel axis sensor 156, the ECU 184 performs computation and control for achieving self-driving. During self-driving, the ECU 184 sends the ECU 181 a command to change the speed, and sends the ECU 182 a command to change the steering angle. In response to the command to change the speed, the ECU 181 controls the prime mover 102, the transmission 103, or the brakes to change the speed of the work vehicle 100. In response to the command to change the steering angle, the ECU 182 controls the steering device 106 to change the steering angle. The ECU 185, which functions as a processing unit (i.e., the path generating device), generates a target path for the work vehicle 100, and records the target path thus generated to the storage device 170. The ECU 184 sends necessary commands to the ECUs 181 and 182 so that the work vehicle 100 moves along the target path generated by the ECU 185. In a preferred embodiment, the controller 180 includes a sensor monitor 186 that monitors data measured by the cameras 120, the obstacle sensors 130, and the LiDAR sensor 135.
Through the action of these ECUs, the controller 180 is able to perform self-driving. During self-driving, the controller 180 can control the drive device 140 based on the position of the work vehicle 100 as measured or estimated by the positioning device 110 and the target path stored in the storage device 170. As a result, the controller 180 causes the work vehicle 100 to travel along the target path.
The plurality of ECUs included in the controller 180 may communicate with one another according to a vehicle bus standard such as CAN (Controller Area Network). Instead of CAN, faster communication methods may be used, e.g., Automotive Ethernet (registered trademark). Although the ECUs 181 to 185 are illustrated as individual corresponding blocks in
The communicator 190 is a circuit that performs communications with the communicator 390 of the implement 300. The communicator 190 includes circuitry to perform exchanges of signals complying with an ISOBUS standard such as ISOBUS-TIM, for example, between itself and the communicator 390 of the implement 300. This causes the implement 300 to perform a desired operation, or allows information to be acquired from the implement 300. The communicator 190 may further include a communication circuit and an antenna to exchange signals complying with any arbitrary wireless communication standard (e.g., Wi-Fi (registered trademark), 3G, 4G, 5G, or other cellular mobile communication, or Bluetooth (registered trademark)) between itself and the communicator 490 of the terminal 400. Moreover, the communicator 190 can communicate with an external computer via a wired or wireless network. The external computer may be a server computer which centralizes management of information concerning fields by using a cloud, and assists in agriculture by utilizing the data on the cloud, for example. Such an external computer may be configured to perform a part of the functionality of the work vehicle 100. For example, the path generation function of the ECU 185 may be performed by an external computer. In that case, the external computer functions as a “processor” or “processing unit”.
The operational terminal 200 is a terminal for the user to perform a manipulation related to the traveling of the work vehicle 100 and the operation of the implement 300, and may also be referred to as a virtual terminal (VT). The operational terminal 200 may include a display device such as a touch screen panel, and/or one or more buttons. The display device may be a display such as a liquid crystal or an organic light-emitting diode (OLED), for example. By manipulating the operational terminal 200, the user can perform various manipulations, such as switching ON/OFF the self-driving mode, setting a target path, recording or editing a map, and switching ON/OFF the implement 300. At least some of these manipulations can also be realized by manipulating the operation switches 210. The operational terminal 200 may be configured to be detachable from the work vehicle 100. A user who is remote from the work vehicle 100 may manipulate the detached operational terminal 200 to control the operation of the work vehicle 100. Instead of the operational terminal 200, the user may manipulate a smartphone, a tablet computer, a personal computer (PC), or other apparatuses on which necessary application software is installed, to control the operation of the work vehicle 100. The terminal 400 may cover the functionality of the operational terminal 200.
The drive device 340 in the implement 300 performs a necessary operation for the implement 300 to perform a predetermined task. The drive device 340 includes devices adapted to the intended use of the implement 300, e.g., a pump, a hydraulic device, or an electric motor. The controller 380 controls the operation of the drive device 340. In response to a signal that is transmitted from the work vehicle 100 via the communicator 390, the controller 380 causes the drive device 340 to perform various operations. Moreover, a signal that is in accordance with the state of the implement 300 may be transmitted from the communicator 390 to the work vehicle 100.
The terminal 400 may be a mobile apparatus such as a smartphone, a tablet computer, or a remote control, for example. Based on signals transmitted from the multiple GNSS satellites, the GNSS receiver 410 in the terminal 400 can output data including information of the position of the terminal 400. The GNSS receiver 410 may output data of an NMEA format, for example. The input device 420 is a device that accepts input operations from the user, and may include one or more buttons or switches. The display device 430 may be a display such as a liquid crystal or an OLED, for example. The input device 420 and the display device 430 may be implemented as a touch screen panel. The storage device 450 may include a semiconductor storage medium such as a flash memory or a magnetic disc, for example. The storage device 450 stores a computer program(s) to be executed by the processor 460 and various data that is generated by the processor 460. By executing the computer program(s) stored in the storage device 450, the processor 460 performs the operations discussed in more detail below.
In another preferred embodiment of the present invention, instead of the processing unit 500, the terminal 400 may generate the target path. In that case, the terminal 400 acquires positional information of the work vehicle 100 from the work vehicle 100 or the processing unit 500. Based on the positional information of the work vehicle 100, the processor 460 of the terminal 400 generates the target path. The terminal 400 can transmit a signal including the information of the target path to the work vehicle 100. Through such an operation, effects similar to those of each of the aforementioned preferred embodiments can be obtained.
In each of the above preferred embodiments, instead of the terminal 400, a monitoring terminal 600 for monitoring the work vehicle 100 may perform the operation of controlling the work vehicle 100. Such a monitoring terminal 600 may be provided at the home or the office of a user who monitors the work vehicle 100, for example.
In a preferred embodiment of the present invention, one or more of the terminal 400, the operational terminal 200, or the monitoring terminal 600 can be used to control one or more of the work vehicle(s) 100. More specifically, one or more of the terminal 400, the operational terminal 200, or the monitoring terminal 600 can include a user interface that can be used to register a new work vehicle 100 with the work vehicle system, register a new route with the work vehicle system, schedule a job with the work vehicle system, and view results related to a job. In the example discussed in detail below, a user interface of the terminal 400 (e.g., a user interface in which the input device 420 and the display device 430 are implemented as a touch screen panel) is used to register a new work vehicle 100 with the work vehicle system, register a new route with the work vehicle system, schedule a job with the work vehicle system, and view results related to the job. However, a user interface of the operational terminal 200 and a user interface of the monitoring terminal 600 can provide the same or similar functionality.
In step S11-1, an input to add a new work vehicle to the work vehicle system is received using the user interface of the terminal 400. For example, the user interface allows a user to input a command to add a new work vehicle to the work vehicle system. More specifically, if a user presses the work vehicles pull down menu 420-4 shown in
In a preferred embodiment of the present invention, step S11-1 can include a new work vehicle verification process in which the work vehicle system identifies and confirms a particular work vehicle that corresponds to the new work vehicle to be registered with the work vehicle system. For example, the new work vehicle verification process can include receiving an identification number (e.g., an identifier), entered by a user, that is associated with the particular work vehicle that corresponds to the new work vehicle to be registered with the work vehicle system, to confirm that the correct work vehicle is being registered with the work vehicle system. For example, as shown in
In another preferred embodiment, the new work vehicle verification process can include the user using the terminal 400 to scan a machine-readable code such as a two-dimensional barcode (e.g., an identifier) associated with the particular work vehicle (e.g., located on the particular work vehicle) that corresponds to the new work vehicle to be registered with the work vehicle system to confirm that the correct work vehicle is being registered with the work vehicle system.
In step S11-2, a selection of a type of work vehicle that corresponds to the new work vehicle is received using the user interface of the terminal 400. The user interface of the terminal 400 shown in
In step S11-4, a name for the new work vehicle is received using the user interface of the terminal 400, and in step S11-5, a description, information, or comment regarding the new work vehicle is received using the user interface of the terminal 400. For example, the user interface of the terminal 400 shown in
In step S11-6, when the confirmation to register the new work vehicle is received, the new work vehicle is saved as a registered work vehicle in the work vehicle system. For example, the new work vehicle can be added to a list of registered work vehicles that can be saved in the storage device 450 of the terminal 400 and/or the storage device 570 of the processing unit 500, for example.
When the confirmation to register the new work vehicle is received and the new work vehicle is saved as a registered work vehicle, system access is provided to the new work vehicle in step S11-7. For instance, in the example shown in
In a preferred embodiment, step S11-7 in which system access is provided to the new work vehicle corresponds to step S34-1 in
When system access is provided to the new work vehicle 100 in step S11-7, the new work vehicle 100 sends status information of the new work vehicle 100 to the system in step S11-8. For example, in step S11-8, the processing unit 500 and/or the terminal 400 receive status information from the new work vehicle 100. The status information of the new work vehicle 100 can include information such as the location of the new work vehicle 100 and the availability of the new work vehicle 100 (e.g., whether the new work vehicle 100 is available to complete a job or busy executing a job).
In a preferred embodiment, step S11-8 in which the new work vehicle 100 sends status information of the new work vehicle 100 to the work vehicle system corresponds to step S34-2 in
In a preferred embodiment of the present invention, the user interface can display the status information of each of the work vehicles 100 that have been registered with the work vehicle system. For example, the user interface of the terminal 400 can display the status of each of the work vehicles 100 that have been registered with the system in accordance with the steps discussed above with respect to
In a preferred embodiment of the present invention, the status information windows 424 (e.g., the status information window 424a for the registered work vehicle 1, the status information window 424b for the registered work vehicle 2, and the status information window 424c for the registered work vehicle 3) can each function as buttons of the user interface of the terminal 400. When one of the status information windows 424 is pressed, additional information regarding the respective registered work vehicle is displayed on the user interface. For example, if the status information window 424a of the registered work vehicle 1 shown in
In step S14-1, an input to add a new route to the work vehicle system is received using the user interface of the terminal 400. For example, the user interface allows a user to input a command to add a new route to the work vehicle system. More specifically, if a user presses the routes pull down menu 420-6 shown in
In step S14-2, a name and a description, information, or comment is received for the new route using the user interface. For example, the user interface of the terminal 400 shown in
In step S14-3, a selection of a work vehicle with which the new route will be recorded is received using the user interface. For example, the user interface of the terminal 400 shown in
In step S14-3, when the selection of the work vehicle 100 with which the new route will be recorded is received, an agricultural field map MP1 is generated and displayed on the user interface of the terminal 400, as shown in
In step S14-4, an instruction to start recording the new route is received using the user interface. For example, when the work vehicle 100 is positioned at a location that corresponds to a desired start point of the new route, the user interface of the terminal 400 allows a user to input an instruction to start recording the new route in step S14-4. For example, the user interface shown in
In a preferred embodiment, step S14-4 in which the instruction to start recording the new route is received using the user interface corresponds to step S34-3 in
Once the user inputs an instruction to start recording the new route in step S14-4 (e.g., by pressing the “START” button 420-42), the work vehicle 100 can be controlled/moved in order to record the new route. For example, the work vehicle 100 can be controlled/moved by the user manually driving the work vehicle 100, by the user remotely driving the work vehicle 100, or by the work vehicle 100 being autonomously controlled. As mentioned above, when the position of the work vehicle 100 is controlled/moved, the agricultural field map MP1 is updated to show a current position of the work vehicle 100.
When the work vehicle 100 is being controlled/moved in order to record the new route, a position of the work vehicle (e.g., as determined using the positioning device 110) is periodically recorded/saved (e.g., in the storage device 450 and/or the storage device 570). For instance, a position of the work vehicle 100 can be periodically recorded/saved at a predetermined time interval (e.g., approximately every 2 seconds) or a predetermined distance interval (e.g., approximately every 0.5 meters traveled by the work vehicle 100). For example, as shown in
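For illustration only, the following Python sketch shows one way such interval-based recording could be decided. The dictionary keys, the local x/y coordinates (in meters), and the default thresholds (the approximately 2 second and 0.5 meter intervals mentioned above) are illustrative assumptions rather than details prescribed by this description.

```python
import math

def should_record(prev: dict, curr: dict,
                  min_interval_s: float = 2.0,
                  min_distance_m: float = 0.5) -> bool:
    """Return True when the current position of the work vehicle should
    be appended to the route being recorded, i.e., when either the
    predetermined time interval or the predetermined distance interval
    since the last saved position has been reached."""
    elapsed = curr["t"] - prev["t"]
    moved = math.hypot(curr["x"] - prev["x"], curr["y"] - prev["y"])
    return elapsed >= min_interval_s or moved >= min_distance_m
```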
When the user inputs an instruction to start recording the new route in step S14-4 (e.g., by pressing the “START” button 420-42), the user interface of the terminal 400 proceeds to the display screen shown in
In a preferred embodiment of the present invention, an entry point EN can correspond to a position of the work vehicle 100 where the work vehicle 100 enters a designated area DA, such as a trellis area of a vineyard or a field of crops that includes one or more of the agricultural items. In the example shown in
The user interface of the terminal 400 shown in
The user interface of the terminal 400 shown in
In a preferred embodiment of the present invention, a task point TP can correspond to a position of the work vehicle 100/a geographical location where the work vehicle 100 can perform a task. For example, the task point TP can correspond to a position of the work vehicle 100/a geographical location where the work vehicle 100 is positioned adjacent to an agricultural item AI to perform a task with respect to the agricultural item AI. In the example shown in
In a preferred embodiment, the user interface of the terminal 400 shown in
In a preferred embodiment of the present invention, the user interface of the terminal 400 shown in
When the work vehicle 100 is positioned at a location that corresponds to a desired end point of the new route, the user interface of the terminal 400 allows a user to input an instruction to stop recording the new route (step S14-9). For example, the user interface of the terminal 400 shown in
In a preferred embodiment, step S14-9 in which an instruction to stop recording the new route is received using the user interface corresponds to step S34-5 in
When the user inputs the instruction to stop recording the new route in step S14-9, the new route is recorded/saved (e.g., in the storage device 450 and/or the storage device 570) as a registered route RR in step S14-10. In a preferred embodiment, the registered route RR includes a plurality of waypoints including the start point SP, the navigation points NP and/or the entry points EN and the exit points EX, the task points TP, and the end point EP. The plurality of waypoints can be represented in the form of a list along with their corresponding positions (e.g., GPS coordinates of each of the plurality of waypoints).
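For illustration only, the following Python sketch shows one possible data structure for such a registered route as a list of typed waypoints with their corresponding positions. The class and field names are hypothetical and chosen for readability, not taken from this description.

```python
from dataclasses import dataclass, field
from enum import Enum

class WaypointType(Enum):
    START = "start point"            # SP
    NAVIGATION = "navigation point"  # NP
    ENTRY = "entry point"            # EN
    EXIT = "exit point"              # EX
    TASK = "task point"              # TP
    END = "end point"                # EP

@dataclass
class Waypoint:
    kind: WaypointType
    latitude: float   # e.g., GPS coordinates of the waypoint
    longitude: float

@dataclass
class RegisteredRoute:
    name: str
    description: str
    waypoints: list[Waypoint] = field(default_factory=list)
```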
In a preferred embodiment of the present invention, the user interface of the terminal 400 can display the registered routes that have been recorded/saved. For example, the user interface can display information regarding each of the registered routes that have been recorded/saved in accordance with the step discussed above with respect to
In a preferred embodiment of the present invention, the information windows 426 (the information window 426a of the registered route 1, the information window 426b of the registered route 2, and the information window 426c of the registered route 3) can function as buttons of the user interface of the terminal 400. For example, when one of the information windows 426 is pressed, a display screen of additional information regarding the respective registered route is displayed on the user interface of the terminal 400. For example, when the information window 426a shown in
In a preferred embodiment of the present invention, the user interface of the terminal 400 allows a user to modify/edit a registered route, delete a registered route, and rename a registered route. For example, the user interface shown in
In a preferred embodiment of the present invention, when the user presses the “Edit Route” button 420-82, the user interface proceeds to a display screen shown in
In the example shown in
In a preferred embodiment, the user interface shown in
In a preferred embodiment of the present invention, after the registered route RR has been edited/modified, the user interface allows the user to save the modified registered route. For example, after the registered route RR has been edited/modified, the user is able to press the “Save Route” button 420-92 to save the modified registered route. The modified registered route can be saved with the previous name for the registered route or saved with a new name for the registered route.
In step S19-1, an input to add a new job to the work vehicle system is received using the user interface of the terminal 400. For example, the user interface of the terminal 400 allows a user to input a command to add a new job to the work vehicle system. More specifically, if a user presses the jobs pull down menu 420-8 shown in
In step S19-2, an input to select a registered route for the new job is received by the user interface of the terminal 400. For example, the user interface of the terminal 400 allows a user to select a registered route that the work vehicle will follow when the work vehicle executes the new job. More specifically, the user interface of the terminal 400 shown in
In a preferred embodiment of the present invention, when the registered route for the new job has been selected in step S19-2, the user interface of the terminal 400 displays a preview of the registered route RR. For example, in
In step S19-3, an input to select a registered work vehicle for the new job is received using the user interface of the terminal 400. For example, the user interface of the terminal 400 allows a user to select a registered work vehicle to execute the new job. More specifically, the user interface of the terminal 400 shown in
In a preferred embodiment of the present invention, the registered work vehicle selected in step S19-3 to execute the new job can be a registered work vehicle different from the registered work vehicle used to register the route selected in step S19-2. For example, even if a first registered work vehicle (e.g., registered work vehicle 1) was used to register a registered route (e.g., registered route 1), a second registered work vehicle (e.g., registered work vehicle 2) can be selected in step S19-3 to execute the new job which includes following the registered route (e.g., registered route 1).
In a preferred embodiment of the present invention, step S34-6 in
In step S19-4, an input to select a task for the new job is received using the user interface of the terminal 400. For example, the user interface of the terminal 400 allows a user to select a task (e.g., an agricultural task) to be performed when the new job is executed. The task selected in step S19-4 will be performed at the one or more task points (e.g., task point TP1 and task point TP2) included in the selected registered route when the new job is executed.
In the example shown in
In the example shown in
In the example shown in
In a preferred embodiment, if the “None” task button 420-60c is selected in step S19-4, no task will be performed at the one or more task points when the new job is executed by the work vehicle 100. The ability to select the “None” task button 420-60c in step S19-4 is beneficial because it allows the user to select, in step S19-2, a registered route that includes task points associated with agricultural items (e.g., the first task point TP1 associated with agricultural item AI1 and the second task point TP2 associated with agricultural item AI3) without a task being performed at the one or more task points when the new job is executed by the work vehicle 100. If there were no ability to select the “None” task button 420-60c in step S19-4, the user would need to modify the registered route to change the one or more task points to navigation points whenever no task is to be performed when the new job is executed. However, modifying the registered route in this manner can be time-consuming and inconvenient, and would require the user to modify the registered route again to include the one or more task points if the user wanted to have a task performed when a future new job is executed.
In step S19-5, an input to set a schedule for the new job is received using the user interface of the terminal 400. For example, the user interface of the terminal 400 allows a user to set a date, time, and frequency for the new job. In the example shown in
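For illustration only, the following Python sketch shows one possible record combining the selections from steps S19-2 through S19-5 into a job schedule. The class name, field names, and example values are hypothetical.

```python
from dataclasses import dataclass
import datetime

@dataclass
class JobSchedule:
    route: str     # registered route selected in step S19-2
    vehicle: str   # registered work vehicle selected in step S19-3
    task: str      # task selected in step S19-4 (e.g., "pruning" or "None")
    start: datetime.datetime  # date and time of the first execution
    frequency: str # e.g., "once", "daily", or "weekly"

# Example: prune along registered route 1 with registered work vehicle 2,
# starting on an arbitrary date and repeating weekly.
job = JobSchedule("registered route 1", "registered work vehicle 2",
                  "pruning", datetime.datetime(2024, 6, 1, 8, 0), "weekly")
```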
In a preferred embodiment, the job is executed according to the schedule set in step S19-5. The job is executed by autonomously controlling the registered work vehicle 100 selected in step S19-3 to follow the registered route RR selected in step S19-2, and perform the task selected in step S19-4 at each of the task points TP included in the registered route RR.
In a preferred embodiment, when a job is executed according to the schedule set in step S19-5, instructions to perform the job, including the registered route RR selected in step S19-2 and the task selected in step S19-4, are sent from the work vehicle system (e.g., the processing unit 500) to the registered work vehicle selected in step S19-3. In the example shown in
In a preferred embodiment of the present invention, when the registered work vehicle 100 executes a job, the registered work vehicle 100 is autonomously controlled to follow a target path P that is generated based on the selected registered route RR. For example, in a preferred embodiment, the controller 180 is configured or programmed to generate a target path P based on the plurality of waypoints of the registered route (e.g., the start point SP, the navigation points NP and/or the entry points EN and the exit points EX, the task points TP, and the end point EP). An example of a target path P is shown in
In a preferred embodiment of the present invention, the controller 180 can be configured or programmed to function as a global planner and a local planner to generate the target path P. For example, the global planner can generate an initial target path based on the waypoints including the start point SP, the entry points EN and the exit points EX, the task points TP, and the end point EP. An example of the global planner includes a Dijkstra global planner, known to one of ordinary skill in the art. The local planner will receive the initial target path generated by the global planner, and if an obstacle is on the initial target path, for example, if an obstacle is detected by one or more of the cameras 120, the obstacle sensors 130, or the LiDAR sensor 135 as the work vehicle 100 travels, then the local planner will change/update the initial target path so that the work vehicle 100 avoids the obstacle. For example, the local planner is able to use Time Elastic Bands (TEB), known to one of ordinary skill in the art, to create a sequence of intermediate work vehicle poses (x-coordinate, y-coordinate, and heading θ) to modify the initial target path generated by the global planner.
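For illustration only, the following Python sketch shows a minimal Dijkstra search of the kind a global planner could use over a graph of waypoints. The graph representation (a mapping from each waypoint to its neighbors and edge costs) and the example waypoint names are assumptions made for this sketch, not a definitive implementation.

```python
import heapq

def dijkstra(graph: dict, start: str, goal: str):
    """Return the lowest-cost sequence of waypoints from start to goal,
    or None if the goal is unreachable. `graph` maps each waypoint to a
    dict of {neighboring waypoint: edge cost}."""
    frontier = [(0.0, start, [start])]  # (accumulated cost, node, path)
    visited = set()
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, edge_cost in graph.get(node, {}).items():
            if neighbor not in visited:
                heapq.heappush(frontier,
                               (cost + edge_cost, neighbor, path + [neighbor]))
    return None

# Example: a tiny waypoint graph from a start point to an end point.
g = {"SP": {"EN": 1.0}, "EN": {"TP1": 2.0}, "TP1": {"EX": 2.0}, "EX": {"EP": 1.0}}
print(dijkstra(g, "SP", "EP"))  # ['SP', 'EN', 'TP1', 'EX', 'EP']
```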
However, in another preferred embodiment of the present invention, the controller 180 may not be configured or programmed to function as a global planner and a local planner to generate the target path P. For example, the controller 180 can be configured or programmed to generate the target path P based on each of the start point SP, the navigation points NP, the entry points EN and the exit points EX, the task points TP, and the end point EP.
On the other hand, if in step S22-2, it is determined that the controller 180 of the registered work vehicle 100 is not configured or programmed to function as a local planner (No in step S22-2), then the controller 180 generates the target path P based on waypoints of the registered route RR including the navigation points NP (step S22-4). For example, the controller 180 can be configured or programmed to generate the target path P based on the start point SP, the navigation points NP, the entry points EN and the exit points EX, the task points TP, and the end point EP. In this case, the controller 180 is configured or programmed to generate the target path P between each of the entry points EN and the exit points EX using the navigation points NP.
In the example shown in
An example of steering control by the controller 180 will be described more specifically below with reference to
As shown in
As shown in
As shown in
As shown in
For the steering control and speed control of the work vehicle 100, control techniques such as PID control or MPC (Model Predictive Control) may be applied. Applying these control techniques will ensure smoothness of the control of bringing the work vehicle 100 closer to the target path P. Additionally, when an obstacle is detected by one or more obstacle sensors 130 during travel, the controller 180 can halt the work vehicle 100. Alternatively, when an obstacle is detected, the controller 180 may control the drive device 140 so as to avoid the obstacle.
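For illustration only, the following Python sketch shows a minimal PID-style steering law consistent with the control described above, in which the error term combines the lateral deviation from the target path P with the heading deviation. The gains, the combined error term, the sign convention, and the class name are illustrative assumptions, not a definitive implementation.

```python
class PIDSteeringController:
    """Minimal PID sketch: computes a steering-angle command that drives
    both the lateral deviation from the target path P and the heading
    deviation toward zero."""

    def __init__(self, kp=0.5, ki=0.01, kd=0.1, k_heading=1.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.k_heading = k_heading  # weight of the heading deviation
        self.integral = 0.0
        self.prev_error = 0.0

    def steering_command(self, lateral_dev: float, heading_dev: float,
                         dt: float) -> float:
        # Combine the two deviations into a single error term (dt > 0).
        error = lateral_dev + self.k_heading * heading_dev
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        # Sign convention: a positive deviation is corrected by steering
        # in the negative direction.
        return -(self.kp * error + self.ki * self.integral
                 + self.kd * derivative)
```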
In a preferred embodiment of the present invention, when the job is being executed by the work vehicle 100, the task selected in step S19-4 is performed at each of the task points TP included in the registered route. For example, when the controller 180 controls the work vehicle 100 to follow the target path P, the controller 180 can be configured or programmed to control the work vehicle 100 to stop at each task point TP for a predetermined period of time during which the work vehicle 100 performs the task at the respective task point TP. For example, the controller 180 can be configured or programmed to control the work vehicle 100 to stop at the first task point TP1 for a predetermined period of time during which the work vehicle 100 performs the task with respect to the agricultural item AI1, and stop at the second task point TP2 for a predetermined period of time during which the work vehicle 100 performs the task with respect to the agricultural item AI3.
In a preferred embodiment of the present invention, as discussed above, if the “None” task button 420-60c is selected in step S19-4, no task will be performed at the one or more task points when the new job is executed by the work vehicle 100. For example, the work vehicle can be controlled to travel/pass through each of the one or more task points without stopping at the one or more task points. Alternatively, the work vehicle can be controlled to stop at the one or more task points and not perform any task when stopped at the one or more task points.
In a preferred embodiment of the present invention, when the work vehicle 100 is positioned at a task point TP, the cameras 120 can be used to capture image data of an agricultural item before, during, and after the work vehicle 100 performs the task with respect to the agricultural item. For example, the controller 180 can be configured or programmed to control the one or more cameras 120 to capture image data of the first agricultural item AI1 before, during, and after the work vehicle 100 performs the task with respect to the first agricultural item AI1. For instance, if the task selected in step S19-4 is pruning, the one or more cameras 120 can be used to capture first image data of the first agricultural item AI1 before the work vehicle 100 performs pruning on the first agricultural item AI1 and second image data of the first agricultural item AI1 after the work vehicle 100 has pruned the first agricultural item AI1.
In a preferred embodiment of the present invention, the one or more cameras 120 can include a stereo camera and/or a depth camera, which can be used to capture data such as three-dimensional data and/or point cloud data of the agricultural item before, during, and after the work vehicle 100 performs the task with respect to the agricultural item. The LiDAR sensor 135 can also be used to capture three-dimensional data of the agricultural item before, during, and after the work vehicle 100 performs the task with respect to the agricultural item. For example, the controller 180 can be configured or programmed to control the one or more cameras 120 and/or the LiDAR sensor 135 to capture three-dimensional data of the first agricultural item AI1 before, during, and after the work vehicle 100 performs the task with respect to the first agricultural item AI1.
In a preferred embodiment, the image data of an agricultural item (e.g., the first image data 422-1 and the second image data 422-2) captured when the work vehicle 100 is positioned at a task point TP can be included in a task point result that corresponds to the task point TP and can be recorded/saved in the system (e.g., in the storage device 450 of the terminal 400 and/or the storage device 570 of the processing unit 500). For example, a first task point result that corresponds to the first task point TP1 and a second task point result that corresponds to the second task point TP2 can be recorded/saved. Each of the task point results can also include information such as the type of task that was performed at the task point (e.g., a pruning task or an imaging task) and the registered route in which the task point is included (the registered route that the work vehicle 100 was following when the image data of an agricultural item was captured at the task point).
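For illustration only, the following Python sketch shows one possible record for such a task point result. The class and field names are hypothetical, and the image data is represented simply as file paths.

```python
from dataclasses import dataclass

@dataclass
class TaskPointResult:
    task_point: str         # e.g., "TP1"
    route_name: str         # registered route that includes the task point
    task_type: str          # e.g., "pruning" or "imaging"
    agricultural_item: str  # identifier of the item the task was performed on
    image_before: str       # path to image data captured before the task
    image_after: str        # path to image data captured after the task
```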
In a preferred embodiment of the present invention, when the job is being executed by the work vehicle 100, the user interface of the terminal 400 can display the job being executed (step S19-6). For example, the user interface of the terminal 400 can display the target path P, an actual path AP that was taken by the work vehicle 100, and the current position of the work vehicle 100. In an example shown in
In a preferred embodiment of the present invention, when the job is being executed by the work vehicle 100, the user interface of the terminal 400 can display a live stream window LS, as shown in
In step S26-1, an input to view results related to a job is received using the user interface of the terminal 400. For example, the user interface of the terminal 400 allows a user to input a command to view results related to a job. More specifically, if a user presses the results menu button 420-10 shown in
In a preferred embodiment of the present invention, in step S26-4, the user interface of the terminal 400 can also display the registered route RR selected in step S26-3 including the task points TP included in the registered route RR. For example, as shown in
In a preferred embodiment of the present invention, the list of task point results can also include a locate button 420-78 and a view button 420-80. The locate button 420-78 and the view button 420-80 can be used to select a particular task point result to receive details/information regarding the task point result. When the locate button 420-78 is pressed (e.g., an example of step S26-5), the location of the task point that corresponds to the task point result is highlighted on the registered route RR displayed on the user interface of the terminal 400 (e.g., an example of step S26-6). For example, the location of the task point TP can be highlighted on the registered route RR displayed on the user interface of the terminal 400 by displaying the task point TP with a color or symbol different from the other task points TP included in the registered route RR displayed on the user interface of the terminal 400. For instance, in
In a preferred embodiment of the present invention, when the view button 420-80 is pressed (e.g., an example of step S26-5), the user interface of the terminal 400 can display information regarding the task point result. For example, when the view button 420-80 is pressed, the user interface of the terminal 400 can display details/information regarding the task point result such as image data of the agricultural item before the work vehicle 100 performed the task on the agricultural item and image data of the agricultural item after the work vehicle 100 performed the task on the agricultural item (e.g., an example of step S26-6). For example, if the view button 420-80a is pressed, the user interface of the terminal 400 can proceed to the display screen shown in
In a preferred embodiment, the first image data 422-1 of the first agricultural item AI1 captured before the work vehicle 100 performed pruning on the first agricultural item AI1 and the second image data 422-2 of the first agricultural item AI1 captured after the work vehicle 100 has pruned the first agricultural item AI1 can be associated with the agricultural item identifier that identifies the agricultural item on which the task was performed. In a preferred embodiment, the agricultural item identifier can be displayed when the first image data 422-1 and the second image data 422-2 are displayed, as shown in
In a preferred embodiment of the present invention, in addition to displaying the first image data 422-1 and the second image data 422-2 separately, the second image data 422-2 can be displayed superimposed on the first image data 422-1, as shown in
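For illustration only, the following Python sketch (using the Pillow imaging library) shows one straightforward way the second image data could be superimposed on the first image data for comparison. The function name, the file-path interface, and the default opacity are assumptions made for this sketch.

```python
from PIL import Image

def superimpose(before_path: str, after_path: str,
                alpha: float = 0.5) -> Image.Image:
    """Blend the 'after' image over the 'before' image so the two can be
    compared in place; alpha controls the opacity of the overlay."""
    before = Image.open(before_path).convert("RGBA")
    after = Image.open(after_path).convert("RGBA").resize(before.size)
    return Image.blend(before, after, alpha)

# Example: overlay = superimpose("AI1_before.png", "AI1_after.png")
```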
It should be understood that the foregoing description is only illustrative of the present invention. Various alternatives and modifications can be devised by those skilled in the art without departing from the present invention. Accordingly, the present invention is intended to embrace all such alternatives, modifications, and variances that fall within the scope of the appended claims.