The present disclosure relates to the field of agriculture and to autonomous driving of agricultural machinery, and in particular to an automatic driving system for grain processing, an automatic driving method, and a path planning method.
Agricultural machinery refers to the various machines used in crop production and animal husbandry, as well as in the initial processing of agricultural and animal products. There are many types of agricultural machinery, such as seeding equipment, plowing equipment, harrowing equipment, rotary tillers, plant protection equipment, harvesting equipment, and so on. During operation, the agricultural machinery needs to coordinate the walking system and the operation system of the mechanical equipment. When the agricultural machinery is moving in the farmland, its route needs to be adjusted according to the conditions of the farmland.
When agricultural machinery operates in farmland, the conditions of the farmland and the growth of its crops must be judged in real time, and the machinery and its operation system must be operated according to those conditions. In the prior art, the agricultural machinery and the operation system are operated by human drivers and operators. The agricultural machinery needs to take into account factors such as the worked area of the farmland, the unworked area of the farmland, and the boundaries, among many others. During operation, the driving of the vehicle and the operation parameters need to be adjusted in real time according to the condition of the crop. In more complicated operating environments, in which driving conditions must also be considered, the agricultural equipment of the existing technology still requires the operator to adjust the operation of the machinery based on real-time information about the farmland crop. With manual operation, the probability of errors in judgment may be high, which leads to an increase in failures of the mechanical equipment during operation.
Existing agricultural machinery relies on human drivers to operate the equipment. Based on the RTK satellite positioning method, high-precision satellite positioning information can be obtained. However, for self-driving agricultural machinery, especially harvester equipment, determining the harvestable area, the unharvested area of the current farmland crop, the boundary area of the farmland, and other information may be more important than the position of the equipment itself for accurately performing or adjusting the operation. The agricultural machinery of the existing technology cannot verify the accuracy of the farmland areas that have been provided, and the machinery usually follows a set path during operation. Once a deviation from the set route occurs, it is difficult to adjust and correct in a timely manner. Therefore, if the set path is not accurate, the operation of the existing agricultural machinery may be inefficient and can even result in errors causing serious mechanical failure. In addition, the RTK satellite positioning method places high performance requirements on the agricultural equipment, and the required manufacturing and maintenance costs are high, so the self-driving positioning of the existing technology may not be applicable to current agricultural machinery.
The present disclosure provides an automatic driving system for grain processing, and an automatic driving method and a path planning method, wherein the automatic driving system identifies the farmland areas based on at least one visual image.
The present disclosure further provides an automatic driving system for grain processing, and an automatic driving method and a path planning method, wherein the automatic driving system controls the driving path by identifying the unworked area, the worked area, and the farmland boundary area in the visual image.
The present disclosure further provides an automatic driving system for grain processing, and an automatic driving method and a path planning method, wherein a path planning system of the automatic driving system automatically plans the path based on positioning information of the current vehicle, information identified by the image processing system, and information of the navigation system.
The present disclosure further provides an automatic driving system for grain processing, and an automatic driving method and a path planning method, wherein the automatic driving system plans the driving path and operation route of the automatic driving system based on the identified areas from the visual image.
The present disclosure further provides an automatic driving system for grain processing, and an automatic driving method and a path planning method, wherein the automatic driving system acquires images by the image acquisition device, identifies the areas in the visual image, and updates or adjusts the operation route of the automatic driving system in real time according to the changed areas, so as to improve the operation efficiency of the automatic driving system.
The present disclosure further provides an automatic driving system for grain processing, and an automatic driving method and a path planning method, wherein the image processing system of the automatic driving system, based on the acquired visual image information, uses image segmentation technology to identify the unworked area, the worked area, the farmland boundary area, and the boundaries between adjacent areas in the image.
The present disclosure further provides an automatic driving system for grain processing, and an automatic driving method and a path planning method, wherein the image processing system of the automatic driving system, based on the acquired visual image information, uses image segmentation technology to identify crop information, such as species, height, and particle plumpness of the crop in the image, so as to enable the operation system of the automatic driving system to adjust the operation parameters based on the information of the crops.
The present disclosure further provides an automatic driving system for grain processing, and an automatic driving method and a path planning method, wherein the image processing system of the automatic driving system identifies the area boundary in the image based on the acquired image information, so that the path planning system can plan the driving path of the vehicle based on the identified area boundary.
The present disclosure further provides an automatic driving system for grain processing, and an automatic driving method and a path planning method, wherein the automatic driving system does not require high precision satellite positioning, therefore reducing difficulty in manufacturing the automatic driving equipment, and also reducing the maintenance cost of the automatic driving equipment.
The present disclosure further provides an automatic driving system for grain processing, and an automatic driving method and a path planning method, wherein the automatic driving system carries out path planning based on the area division information output by the image processing system to realize automatic driving and automatic driving operation.
Features of the present disclosure are described in the following detailed description, and can be realized by the combination of means and devices specially pointed out in the attached claims.
According to an aspect of the present disclosure, the present disclosure provides a path planning method for an automatic driving system for grain processing, the path planning method includes:
step (a): acquiring at least one image of farmland surrounding a grain processing host;
step (b): identifying and dividing areas of the farmland and boundaries of the farmland corresponding to the image; and
step (c): planning at least one driving planning path based on the identified areas of the farmland and the boundaries of the farmland.
According to an embodiment of the present disclosure, the path planning method further includes: step (a.0): setting an operation area of the farmland and at least one operation boundary corresponding to the operation area.
According to an embodiment of the present disclosure, the step (a.0) further includes: identifying the operation area and the operation boundary of the farmland by an image processing system.
According to an embodiment of the present disclosure, the step (a) further includes: based on position of the automatic driving system for grain processing, photographing the image surrounding the automatic driving system for grain processing in real time.
According to an embodiment of the present disclosure, the step (b) further includes: segmenting the image by using an image segmentation method, and identifying and dividing the areas of the image.
According to an embodiment of the present disclosure, the step (b) further includes: segmenting, by an image processing system, the image using an image segmentation method, and dividing the areas of the image into an unworked area, a worked area, and a farmland boundary area.
According to an embodiment of the present disclosure, the step (b) further includes:
step (b.1): segmenting the image into a plurality of pixel regions and normalizing pixel values of the pixel regions into an array;
step (b.2): extracting the features of the pixel regions corresponding to each array; and
step (b.3): outputting a classification label of the image based on the features of the pixel regions.
According to an embodiment of the present disclosure, the step (c) further includes: step (c.1): determining the driving planning path based on positioning information of the grain processing host, area planning information of the image identified by the image processing system, and navigation information of a navigation system.
According to an embodiment of the present disclosure, the step (c) further includes: step (c.2): adjusting driving direction of the grain processing host based on crop information identified by the image processing system to generate a vehicle driving path.
According to another aspect of the present disclosure, the present disclosure provides an automatic driving method for an automatic driving system for grain processing, the automatic driving method includes:
step (I): acquiring at least one image and identifying areas of farmland and boundaries of the farmland in the image;
step (II): planning at least one driving planning path based on the areas of the farmland and the boundaries of the farmland; and
step (III): controlling a grain processing host to move automatically according to the driving planning path.
According to an embodiment of the present disclosure, the step (II) further includes:
acquiring positioning information of the grain processing host; and
planning the driving planning path based on the positioning information, the identified areas of the farmland and the boundaries of the farmland, and navigation information of a navigation system.
According to an embodiment of the present disclosure, the step (I) further includes: identifying crop information in the farmland from the image, wherein the crop information includes information of crop type, crop height, and particle plumpness.
According to an embodiment of the present disclosure, the automatic driving method further includes: step (IV): based on the crop information of the image, adjusting operation parameters of an operation system of the grain processing host.
According to another aspect of the present disclosure, the present disclosure provides an automatic driving system for grain processing, the automatic driving system includes:
a grain processing host;
an image processing system, wherein the image processing system is set on the grain processing host to acquire at least one image of farmland surrounding the grain processing host, and the image processing system identifies the areas of the image based on an image segmentation recognition method; and
a path planning system, wherein the path planning system plans at least one driving planning path based on the areas identified by the image processing system, and the grain processing host controls automatic driving according to the driving planning path planned by the path planning system.
According to an embodiment of the present disclosure, the automatic driving system further includes an image acquisition device, the image acquisition device is set on the grain processing host, and the image acquisition device captures images in front of the grain processing host.
According to an embodiment of the present disclosure, the image processing system identifies at least one worked area, at least one unworked area, and at least one farmland boundary area from the image.
According to an embodiment of the present disclosure, the automatic driving system for grain processing further includes:
an image segmentation module, the image segmentation module segments the image into a plurality of pixel regions, and each pixel region includes at least one pixel unit;
a feature module, wherein the feature module extracts features of each pixel region based on the pixel unit of the pixel region; and
a region division module, wherein the region division module identifies and divides the areas of the image according to the features of the pixel region.
According to an embodiment of the present disclosure, the automatic driving system for grain processing further includes a positioning device and a navigation system, the positioning device and the navigation system are arranged on the grain processing host, wherein the positioning device acquires position information of the grain processing host, and the navigation system provides navigation information for the grain processing host.
According to an embodiment of the present disclosure, the automatic driving system for grain processing further includes:
an operation area setting module, wherein the operation area setting module sets the farmland boundary area to acquire the unworked area of the farmland and an operation boundary of the farmland; and
a driving path planning module, wherein the driving path planning module acquires at least one driving planning path based on the positioning information of the grain processing host, the area planning information of the image identified by the image processing system, and the navigation information of the navigation system.
Further objects and advantages of the present invention are disclosed by the following description and drawings.
Implementations of the present disclosure will now be described, by way of embodiments, with reference to the attached figures.
The following description is used to disclose the present invention so that the technical personnel in the art can realize the present invention. The preferred embodiments described below are for example only, and technical personnel in the field can think of other obvious variant embodiments. The basic principles of the present invention as defined in the following description may be applied to other embodiments, developed embodiments, improvement schemes, equivalent schemes, and other technical schemes that do not deviate from the spirit and scope of the present invention.
The technical personnel in the art shall understand that, in the disclosure of the present invention, terms such as “portrait direction”, “horizontal direction”, “up”, “down”, “front”, “back”, “left”, “right”, “vertical”, “horizontal”, “top”, “bottom”, “inside”, “outer” and other terms indicating an orientation or positional relationship are based on the orientation or positional relationship shown in the drawings, and are intended to facilitate and simplify the description of the present invention, rather than to indicate or imply that the device or component must have a specific orientation or be constructed and operated in a specific orientation; therefore, the above terms are not to be understood as limitations on the present invention.
Understandably, the term “one” should be understood as “at least one” or “one or more”, i.e. in one embodiment, the quantity of one component may be one, while in another embodiment the quantity of components may be multiple, and the term “one” cannot be understood as a limit on the quantity.
Referring to
The automatic driving system for grain processing divides the farmland into area types according to the types and boundaries of each divided area, and the area types divided by the automatic driving system include at least one unworked area 100, at least one worked area 200, and at least one farmland boundary area 300. The automatic driving system for grain processing determines a route of movement of the vehicle by a navigation system according to the divided area types, so as to achieve automatic driving.
It is worth mentioning that a self-driving vehicle, in a self-driving mode, needs to obtain accurate vehicle positioning information, especially high-precision satellite positioning information, in order to identify the route, and the self-driving vehicle needs to continuously update information about obstacles, other vehicles, pedestrians, and so on, in order to achieve the self-driving function at high speed. The images obtained by the automatic driving system of the invention are image data corresponding to the crop grain in a farmland; the images are of the scene surrounding the vehicle and are obtained based on the current position of the vehicle. The automatic driving system does not require high-precision satellite positioning information, but merely ordinary meter-level satellite positioning accuracy (GPS positioning or Beidou positioning, etc.). In addition, the images obtained and processed by the automatic driving system are different from the images obtained by the self-driving vehicle. Therefore, the path planning and driving mode determined by the automatic driving system are not the same as those determined by the self-driving vehicle. Understandably, the identification mode of the present invention is different from the identification mode of the self-driving vehicle.
Referring to
Preferably, the image acquisition device 20 is set on the grain processing host 10. In one embodiment, the image acquisition device 20 takes still or moving pictures around the grain processing host 10. More preferably, the image acquisition device 20 is set at a front part of the grain processing host 10. In one embodiment, the image acquisition device 20 can obtain images in front of the grain processing host 10 in real time. The grain processing host 10 identifies the areas of the farmland from the images obtained by the image acquisition device 20, and sets a driving route according to the divided areas of the farmland. In one embodiment, the content of the image obtained by the image acquisition device 20 is within the field of view of the grain processing host 10. In other words, the image acquisition device 20 obtains the image within the field of view of the grain processing host 10, and the travel direction of the grain processing host 10 is adjusted according to the position at which the image acquisition device 20 is installed on the grain processing host 10.
In one embodiment, the image acquisition device 20 acquires images of the scene in the travel direction of the grain processing host 10 to obtain the image. In one embodiment, the image may be a two-dimensional flat image or a three-dimensional image. Understandably, the type of the image taken by the image acquisition device 20 is here only as an example, not as a limitation.
In one embodiment, the grain processing host 10 is capable of processing crop grain during the driving process; the processing of crop grain includes harvesting, farming, ploughing, seeding, plant protection operations, etc. For example, in a first embodiment of the present invention, the grain processing host 10 is a harvester equipment, and the grain processing host 10 is controlled to drive to the unworked area 100 of the farmland for a harvesting operation, in order to harvest the crop within the unworked area 100 of the farmland. The crop can be rice, wheat, corn, and so on. The grain processing host 10 executes automatic driving in the farmland according to the areas divided from the image obtained by the image acquisition device 20, and conducts self-driving in the field without a driver. Understandably, the type of the grain processing host 10 is here only as an example, not as a limitation.
As shown in
As shown in
It should be noted that the image processing system 30 uses the image segmentation recognition method to identify the areas and boundaries from the image, where the areas represent the areas of the farmland in front of the grain processing host 10 and the boundaries represent the boundaries of the farmland in front of the grain processing host 10. Based on the areas and boundaries identified by the image processing system 30 using the image segmentation recognition technology, the grain processing host 10 is controlled to move and operate in the unworked area of the farmland. For example, the image acquisition device 20 set at the front end of the harvester device acquires an image of the farmland in front of the harvester device, wherein the image captured by the image acquisition device 20 is segmented and identified by the image processing system 30 to identify the unworked area 100, the worked area 200, and the farmland boundary area 300. The grain processing host 10, that is, the host of the harvester device, plans the vehicle driving path and the harvesting operation based on the areas and boundaries identified by the image processing system 30.
It should be noted that the image processing system 30 uses the image segmentation and recognition technology to identify the type of the crop, the height of the crop, the ripeness, and other information from the image captured by the image acquisition device 20. The image processing system 30 can determine whether the crop has been harvested based on the type and the height of the crop identified in the image. The particle plumpness identified in the image can be used to adjust the operation parameters. In other words, the image processing system 30 can identify the area types of the farmland and the boundaries of the farmland according to the image captured by the image acquisition device 20, and identify the type, the height, the particle plumpness, the maturity, and other information of the crop in the farmland.
In one embodiment, the image processing system 30 segments the image acquired by the image acquisition device 20 and identifies the areas and boundaries of the image based on one of a threshold-based segmentation method, a region-based segmentation method, an edge-based segmentation method, and a segmentation method based on a specific theory. In one embodiment, the image processing system 30 uses a deep learning algorithm to segment and identify the image and perform area division and boundary delimitation on the image. In other words, the image processing system 30 uses the deep learning algorithm to identify the areas of the farmland and the boundaries of the farmland from the image, and the grain processing host drives and operates according to the identified areas and boundaries of the farmland. More preferably, the image processing system 30 uses the image segmentation and identification technology of a convolutional neural network algorithm as the deep learning algorithm to identify the unworked area 100, the worked area 200, and the farmland boundary area 300 from the image.
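As a hedged illustration of the threshold-based option mentioned above, the following sketch classifies pixels of a farmland image by a simple vegetation-index threshold; the function name, the excess-green index, and the threshold value are assumptions for illustration and are not specified by the disclosure.

```python
# Illustrative sketch only: a threshold-based split of a farmland image into
# candidate "unworked" (standing crop) and "worked" (cut stubble/soil) pixels
# using an excess-green vegetation index. Names and thresholds are hypothetical.
import numpy as np

def threshold_segment(image_rgb: np.ndarray, exg_threshold: float = 0.05) -> np.ndarray:
    """Return a per-pixel label map: 1 = likely unworked (green crop), 0 = likely worked."""
    rgb = image_rgb.astype(np.float32) / 255.0          # normalize pixel values to [0, 1]
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    exg = 2.0 * g - r - b                                # excess-green index highlights vegetation
    return (exg > exg_threshold).astype(np.uint8)        # simple global threshold

# Example usage with a random stand-in frame (a real system would use a camera image).
labels = threshold_segment(np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8))
```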
It is worth mentioning that the processing algorithm used by the image processing system 30 is only an example here, not a limitation. Therefore, the image processing system 30 can also use other algorithms to segment and identify the obtained image, so as to identify the area of the farmland and the boundary of the farmland from the image.
Referring to
The image processing system 30 extracts the image features corresponding to each pixel region 301 based on the array corresponding to that pixel region 301. In one embodiment, when the image processing system 30 uses the convolutional neural network algorithm, such as a two-dimensional convolutional neural network, an input layer of the convolutional neural network corresponds to the two-dimensional array or three-dimensional array of the pixel region 301. A hidden layer of the convolutional neural network extracts features from the array of the input layer, and selects and filters the features after extraction. The convolutional neural network outputs a classification label of the pixel region 301 based on the features corresponding to the array, and the classification label corresponds to the unworked area 100, the worked area 200, or the farmland boundary area 300.
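The following is a minimal sketch, assuming a PyTorch implementation, of how a convolutional network of this kind could map a normalized pixel-region array to one of the three classification labels; the layer sizes, class ordering, and library choice are illustrative assumptions rather than the network disclosed here.

```python
# Minimal sketch: a small convolutional classifier from a normalized pixel-region
# array to one of three labels (unworked / worked / farmland boundary).
import torch
import torch.nn as nn

class RegionClassifier(nn.Module):
    def __init__(self, num_classes: int = 3):
        super().__init__()
        self.features = nn.Sequential(                       # hidden layers: feature extraction
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),                          # pool to a fixed-size feature vector
        )
        self.classifier = nn.Linear(32, num_classes)          # output layer: classification label

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        f = self.features(x).flatten(1)
        return self.classifier(f)                             # logits for the three region types

model = RegionClassifier()
region = torch.rand(1, 3, 32, 32)                             # stand-in for a normalized pixel region
label = model(region).argmax(dim=1)                           # e.g. 0 = unworked, 1 = worked, 2 = boundary
```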
Referring to
Referring to
In one embodiment, the image processing system 30 further includes a storage device 34 and at least one processor 35. The at least one processor 35 is used to execute a plurality of modules (e.g., the image segmentation module 31, the feature module 32, and the area division module 33 shown in
The image segmentation module 31 acquires the image captured by the image acquisition device 20, and generates a number of pixel regions 301 by segmenting and processing the image. In one embodiment, each pixel region 301 includes at least one pixel unit. The feature module 32 uses the deep learning algorithm to extract the features of the pixel regions 301, and selects and filters the features. The area division module 33 divides the image based on the features of the pixel regions 301 extracted by the feature module 32 to generate the classification labels corresponding to the unworked area 100, the worked area 200, and the farmland boundary area 300.
In one embodiment, the image segmentation module 31 divides the image into a number of pixel regions 301, and each of the pixel regions 301 has the same size, shape, and range. It should be noted that the image segmentation module 31 can also segment the image according to a threshold of the image pixels; in that case, the size, shape, and range of the pixel regions 301 segmented by the image segmentation module 31 can differ. In one embodiment, the pixel region 301 divided by the image segmentation module 31 is a single pixel unit when the feature module 32 of the image processing system 30 adopts the convolutional neural network algorithm to segment the image.
In one embodiment, the feature module 32 includes a pixel processing module 321, a feature extraction module 322, and a feature output module 323. The pixel processing module 321 processes the array of the pixel units in the pixel region 301. In one embodiment, the pixel processing module 321 normalizes the pixel region 301 into an array suitable for processing. The feature extraction module 322 takes as input the array of the pixel region 301 processed by the pixel processing module 321, extracts the features corresponding to the array, and selects and filters the features, so as to retain useful data and eliminate interference data and make the features more accurate. The feature output module 323 outputs the features extracted by the feature extraction module 322, and the area division module 33 generates the classification label of the corresponding area in combination with the features output by the feature output module 323.
The area division module 33 divides the areas of the image and sets the area boundaries based on the features of the pixel regions 301 extracted by the feature module 32. Correspondingly, the area division module 33 further includes an area division module 331 and a boundary division module 332. The area division module 331 divides the different areas according to the features of the pixel regions 301, and the boundary division module 332 divides the boundary range of each area, so as to determine the range of the area.
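As a hedged sketch of the area and boundary division step, assuming the classification labels are arranged in a grid, the following code builds one mask per area type and marks boundary cells where neighboring labels differ; the function name and grid representation are assumptions for illustration, not the disclosed modules.

```python
# Illustrative sketch: derive area masks and boundary cells from a grid of
# per-region classification labels.
import numpy as np

def divide_areas_and_boundaries(label_grid: np.ndarray):
    areas = {value: (label_grid == value) for value in np.unique(label_grid)}  # one mask per area type
    boundary = np.zeros_like(label_grid, dtype=bool)
    boundary[:, :-1] |= label_grid[:, :-1] != label_grid[:, 1:]   # horizontal label changes
    boundary[:-1, :] |= label_grid[:-1, :] != label_grid[1:, :]   # vertical label changes
    return areas, boundary

# Example: 0 = worked, 1 = unworked; boundary marks the dividing line between them.
grid = np.array([[0, 0, 1, 1],
                 [0, 0, 1, 1]])
areas, boundary = divide_areas_and_boundaries(grid)
```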
During the movement of the grain processing host 10 of the automatic driving system for grain processing, the image acquisition device 20 acquires the images in the front view of the grain processing host 10 in real time. Accordingly, the image processing system 30 acquires the image captured by the image acquisition device 20 in real time, and uses the image segmentation and identification technology to identify the divided area and the area boundary range corresponding to the farmland in the image. When the divided area and the area boundary range identified by the image processing system 30 are not consistent with the previous area boundary range, the identified area and area boundary range corresponding to the image are adjusted.
Referring to
Referring to
It should be noted that the divided area and the boundary range of the farmland obtained by the image processing system 30 processing the image are updated to the navigation system 50 in real time to update the navigation information of the navigation system 50. In one embodiment, the navigation system 50 can be an inertial integrated navigation system. It is understood that the types of the navigation system 50 are only of an exemplary nature here, not a limitation, and therefore the navigation system 50 can also be other types of navigation devices.
Correspondingly, the grain processing host 10 of the automatic driving system for grain processing includes a vehicle 11, an operation system 12 arranged on the vehicle 11, and a driving control system 13. In one embodiment, the operation system 12 is driven by the vehicle 11, and executes a grain processing operation, such as a harvesting operation. The driving control system 13 controls the driving of the vehicle 11 and the operation of the operation system 12. It is worth mentioning that the driving control system 13 has a self-driving mode and an operation driving mode. When the automatic driving system for grain processing is in the self-driving mode, the driving control system 13 automatically controls the operation of the vehicle 11 and the operation system 12. Accordingly, when the automatic driving system for grain processing is in the operation driving mode, the driving control system 13 allows a driver to operate the vehicle 11 and control the operation of the operation system 12 manually.
In one embodiment, the automatic driving system for grain processing is a harvester device, wherein the operation system 12 can be a harvesting operation device. The driving control system 13 controls the driving of the vehicle 11 and the operation of the operation system 12. In other words, the driving control system 13 controls adjustment of the operation parameters of the operation system 12 during the driving of the vehicle 11. The driving control system 13 acquires information of the image processing system 30, such as the type of the crop, the height of the crop, the particle plumpness, the diameter of the crop stalks, etc., and adjusts the operation parameters of the operation system 12 based on the acquired information, such as operation speed, width, and height of the operation system 12, and parameters of cutting or threshing.
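To make the parameter-adjustment idea concrete, the following sketch maps identified crop information to example operation parameters; the field names, thresholds, and parameter values are hypothetical illustrations and are not prescribed by the disclosure.

```python
# Hedged sketch: mapping identified crop information to operation parameters.
from dataclasses import dataclass

@dataclass
class CropInfo:
    crop_type: str        # e.g. "wheat", "rice"
    height_m: float       # identified crop height
    plumpness: float      # particle plumpness score in [0, 1]

def adjust_operation_parameters(info: CropInfo) -> dict:
    params = {"ground_speed_kmh": 5.0, "header_height_m": 0.15, "threshing_speed_rpm": 900}
    if info.height_m > 1.2:
        params["header_height_m"] = 0.30          # raise the header for taller crop
    if info.plumpness < 0.5:
        params["ground_speed_kmh"] = 3.5          # slow down to reduce grain loss
    if info.crop_type == "rice":
        params["threshing_speed_rpm"] = 750       # gentler threshing for rice (illustrative value)
    return params

print(adjust_operation_parameters(CropInfo("wheat", 1.4, 0.8)))
```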
As shown in
Referring to
The path planning system 60 plans at least one driving path based on the outermost edge of the operation boundary 602 of the operation area 601. When the width of the operation area 601 is larger than an operation width of the operation system 12, the path planning system 60 plans a circular-shaped driving route or an S-shaped driving route. It should be noted that the driving routes planned by the path planning system 60 are only examples, not limitations. Therefore, other driving routes can also be applied.
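The sketch below illustrates one way an S-shaped coverage route over a rectangular operation area could be generated as a list of waypoints; modeling the operation area as a rectangle and the swath-spacing rule are simplifying assumptions, not the disclosed planning algorithm.

```python
# Illustrative sketch: an "S"-shaped (boustrophedon) coverage route over a
# rectangular operation area, with swaths spaced one operation width apart.
def s_shaped_path(x_min, x_max, y_min, y_max, operation_width):
    """Return (x, y) waypoints covering the area in alternating parallel swaths."""
    waypoints, x, leftward = [], x_min + operation_width / 2.0, False
    while x <= x_max - operation_width / 2.0 + 1e-9:
        ys = (y_max, y_min) if leftward else (y_min, y_max)   # alternate sweep direction
        waypoints.append((x, ys[0]))
        waypoints.append((x, ys[1]))
        x += operation_width                                   # shift by one swath
        leftward = not leftward
    return waypoints

# Example: a 20 m x 50 m unworked area and a 4 m operation width.
print(s_shaped_path(0.0, 20.0, 0.0, 50.0, 4.0))
```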
In one embodiment, when the vehicle 11 is travelling to the far-end boundary of the operation area 601, the path planning system 60 replans at least one driving path based on the range of the current unworked area 100. Namely, when the vehicle 11 is driving to the far-end boundary of the operation area 601, the operation area 601 and the operation boundary 602 are updated by the path planning system 60 for the vehicle 11, and a new driving path is planned according to the updated operation area 601.
It should be noted that the driving control system 13 controls the vehicle 11 to drive according to the driving path planned by the path planning system 60. In one embodiment, the driving control system 13 controls the operation system 12 to harvest the outermost crop of the operation area 601. Namely, the driving control system 13 controls the operation system 12 to harvest the crop in the unworked area 100 based on the operation boundary 602.
Referring to
In one embodiment, the path planning system 60 further includes a storage device 64 and at least one processor 65. The at least one processor 65 is used to execute a plurality of modules (e.g., the area setting module 61, the path planning module 62, and the path adjustment module 63 shown in
The area setting module 61 identifies the operation area 601 and the operation boundary 602 of the farmland based on the farmland boundary area identified in the image by the image processing system 30, or based on a setting that the grain processing host is to operate within the operation area 601 and the operation boundary 602 of the farmland. As the operation of the grain processing host 10 causes the unworked area 100 and the worked area 200 to change, the area setting module 61 updates the range of the operation area 601 and the boundary of the operation boundary 602 in real time according to the new unworked area 100 and the new worked area 200.
The path planning module 62 obtains at least one driving planning path 603 based on the positioning information of the grain processing host 10, the area planning information of the image identified by the image processing system 30, and the navigation information of the navigation system 50. In one embodiment, the driving control system 13 controls the vehicle 11 to drive according to the driving planning path 603. The path adjustment module 63 adjusts the driving direction of the grain processing host 10 based on the crop information of the image identified by the image processing system 30 to generate a vehicle driving path 604, wherein the vehicle driving path 604 is basically coincident with or parallel to the driving planning path 603. When the image processing system 30 identifies that an adjustment of the harvesting range is required, the vehicle driving path 604 generated by the path adjustment module 63 deviates from the driving planning path 603.
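A minimal sketch of the relationship between the driving planning path 603 and the vehicle driving path 604, assuming both are waypoint lists: with no adjustment the two coincide, and a lateral offset is applied only when a harvesting-range adjustment is flagged. The offset mechanism is an illustrative assumption.

```python
# Hedged sketch: the vehicle driving path starts as the planned path and is shifted
# sideways only when the image processing result requests a harvesting-range adjustment.
def vehicle_driving_path(planned_path, lateral_offset_m=0.0):
    """planned_path: list of (x, y) waypoints; a zero offset keeps the paths coincident."""
    return [(x + lateral_offset_m, y) for (x, y) in planned_path]

planned = [(2.0, 0.0), (2.0, 50.0), (6.0, 50.0), (6.0, 0.0)]
adjusted = vehicle_driving_path(planned, lateral_offset_m=0.5)   # deviate by 0.5 m when flagged
```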
According to another embodiment of the invention, the invention further provides a path planning method used in the automatic driving system for grain processing, wherein the path planning method includes steps as below.
(a) Acquiring at least one image of the farmland around the grain processing host 10.
(b) Identifying and dividing the area of the farmland and the boundary of the farmland corresponding to the image.
(c) Planning at least one driving planning path 603 based on the identified areas and boundaries of the farmland.
The path planning method further includes the following step: (a.0) setting the operation area 601 of the farmland and at least one operation boundary 602 corresponding to the operation area 601. It should be noted that the operation area 601 of the farmland includes the worked area 200, the unworked area 100, and the farmland boundary area 300. The farmland boundary area 300 corresponding to the outside of the farmland coincides with the operation boundary 602.
The step (a.0) of the path planning method further includes the following step: identifying, by the image processing system 30, the operation area 601 and the operation boundary 602 of the farmland based on the farmland areas and boundaries identified from the image.
The step (b) of the path planning method further includes the following steps: using the image segmentation technique to segment the image, and identifying and dividing the regions of the image.
In step (a) of the above path planning method, the image information around the grain processing host 10 is captured in real time based on the position and the driving direction of the grain processing host 10. In step (b) of the above path planning method, the image processing system uses the image segmentation technology to segment the image information, and identifies and divides the regions of the image into the unworked area 100, the worked area 200, and the farmland boundary area 300. Step (b) of the path planning method further includes the following steps.
(b.1) Dividing the image into a number of the pixel regions 301, and normalizing the pixel values of the pixel regions 301 into an array.
(b.2) Extracting the features of the pixel region 301 corresponding to each array.
(b.3) Outputting a classification label of the image based on the features of the pixel region 301.
In step (b.3) of the path planning method, the classification label corresponds to the unworked area 100, the worked area 200, and the farmland boundary area 300.
In step (b) of the above path planning method, the image processing system 30 uses the convolutional neural network algorithm of deep learning to segment the image and identify the areas of the image.
The step (c) of the path planning method further includes step (c.1): determining the driving planning path 603 based on the positioning information of the grain processing host 10, the area planning information of the image identified by the image processing system 30, and the navigation information of the navigation system 50.
The step (c) of the path planning method further includes step (c.2): adjusting the driving direction of the grain processing host 10 based on the crop information of the image identified by the image processing system 30 to form the vehicle driving path 604.
The path planning method further includes step (d): comparing whether the divided areas and the area boundary range identified by the image processing system 30 are consistent with the previous area boundary range; adjusting the divided areas and the area boundary range corresponding to the image when the divided areas and the area boundary range identified by the image processing system 30 are not consistent with the previous area boundary range, and keeping the divided areas and the area boundary range unchanged when they are consistent with the previous area boundary range.
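The following sketch illustrates the consistency check of step (d), assuming the divided areas are represented as boolean masks and compared by intersection-over-union; the IoU measure and the 0.95 threshold are assumptions for illustration only.

```python
# Minimal sketch: keep the previous area boundary range when the newly identified
# area is consistent with it, otherwise adjust to the newly identified range.
import numpy as np

def update_area(previous_mask: np.ndarray, identified_mask: np.ndarray, iou_threshold: float = 0.95):
    intersection = np.logical_and(previous_mask, identified_mask).sum()
    union = np.logical_or(previous_mask, identified_mask).sum()
    iou = intersection / union if union else 1.0
    if iou >= iou_threshold:
        return previous_mask, False          # consistent: keep the area boundary range unchanged
    return identified_mask, True             # inconsistent: adjust to the newly identified range

prev = np.zeros((10, 10), dtype=bool); prev[:, :5] = True
new = np.zeros((10, 10), dtype=bool); new[:, :4] = True      # harvest progressed by one column
mask, adjusted = update_area(prev, new)
```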
According to another aspect of the invention, the disclosure further provides an automatic driving method applied in the automatic driving system for grain processing, and the automatic driving method includes the following steps.
(I) Acquiring at least one image and identifying the areas and the area boundaries of the farmland in the image.
(II) Planning at least one driving planning path 603 based on the areas and the area boundaries of the farmland.
(III) Controlling the grain processing host 10 to move automatically according to the driving planning path 603.
The step (I) includes the method for automatically identifying the farmland area and the farmland boundary provided by the invention. The driving control system 13 controls the driving and operation of the grain processing host 10 based on the farmland area and the farmland boundary identified by the image processing system 30.
The step (I) of the automatic driving method further includes: identifying the corresponding crop information in the farmland in the image, wherein the crop information in the farmland includes the information of the crop type, the height of the crop, and the particle plumpness.
The step (II) of the automatic driving method further includes the following steps.
Obtaining the positioning information of the grain processing host 10.
Based on the positioning information, identification information of the image, and the navigation information of the navigation system, planning the driving planning path 603.
Accordingly, in the step (III), the driving control system 13 controls the driving of the vehicle 11 of the grain processing host 10 according to the positioning of the grain processing host 10, the area planning information of the farmland obtained by the image processing system 30, and the navigation information.
The automatic driving method further includes step (IV): adjusting the operation parameters of the operation system 12 of the grain processing host 10 based on the crop information identified from the image.
Those skilled in the art should understand that the above description and the embodiments of the present disclosure shown in the drawings are only examples and do not limit the present disclosure. The purpose of the present disclosure has been completely and effectively achieved. The function and structure principle of the present disclosure have been shown and explained in the embodiments. Without departing from the principle, the implementation of the present disclosure may have any deformation or modification.
Foreign Application Priority Data: Application No. 201910007342.9, filed Jan 2019, CN, national.
Related Application Data: Parent application PCT/CN2019/107534, filed Sep 2019, US; child application 17366404, US.