This patent application makes reference to, claims the benefit of, and claims priority to Indian Patent Application No. 202241075496 filed on Dec. 26, 2022, which is incorporated herein by reference in its entirety, and for which priority is hereby claimed under the Paris Convention and 35 U.S.C. § 119 and all other applicable law.
The present disclosure relates generally to the field of agricultural machines and systems; and more specifically, to a system (i.e., an electro-mechanical agricultural equipment) mounted in a vehicle for performing an agricultural application and a method of operation of the system.
With the rapid advancement of agricultural machines, implements, special-purpose vehicles, and vehicle-mounted apparatus, productivity in agricultural operations has increased. However, existing vehicle-based systems are very complex in nature, where a particular system or machinery works only with equipment from the same manufacturer. In other words, one system of one manufacturer is not compatible with another system of another manufacturer. This binds a farmer to the costly machinery and agricultural implements of one specific manufacturer. For example, it is sometimes simply not possible, or very technically challenging, to use a conventional camera system of one manufacturer with another system of another manufacturer, as crosstalk among different electronics and mechatronics systems is generally restricted or severely limited.
There are many other technical problems with conventional systems and methods in terms of how to identify crop plants in the uneven land area of an agricultural field, given the uncertainty in surrounding environmental conditions when the images are captured in motion, i.e., when the vehicle carrying the camera system is in motion. In one example, camera-based systems are known to aid in different operations in an agricultural field. However, the uneven land area of the agricultural field, combined with uncertainty in the surrounding environmental conditions while capturing images of the agricultural field (e.g., variation in sunlight due to clouds, rain, or the shadow of a large object such as a tree while capturing an image, the change in the position of the sun throughout the day, light intensity, the time of day when farming is done, etc.), is found to severely and adversely impact the existing image acquisition systems that are used in agricultural machines or implements. The existing systems either fail or their accuracy is severely impacted in such conditions. This causes the conventional machines, systems, and methods to misbehave or to err in differentiating between two green-looking objects (e.g., crop plants and weeds). In another example, there is a problem of over-engineering, i.e., too many sensor units, too much processing, and very complex machines. In such a situation, the chances of errors are high due to multiple failure points, and at the same time such over-engineering makes the machines very costly, power-intensive, and processing-intensive, which is not suited for many sub-urban, urban, or rural farming conditions and needs. For instance, some existing systems use chlorophyll sensors or detectors to supplement or corroborate the visible-spectrum image sensors. However, it is still observed that such conventional camera systems often fail to identify crop plants in the uneven land area of the agricultural field, given the uncertainty in surrounding environmental conditions when the images are captured in motion. In a further example, conventional systems or agricultural special-purpose vehicles require row identification, where row-based processing forms an indispensable component of conventional systems. Conventional systems fail when proper rows are not demarcated in the agricultural field. In yet another example, conventional location determination techniques and systems, such as the Global Positioning System (GPS), are integrated with an agricultural vehicle for location determination. However, it is well known that civilian use of GPS has an error range of 1-10 meters, and sometimes more depending on signal reception issues in a particular area. Such errors cause intermittent issues in determining a location of a crop plant or weeds whenever there is a GPS signal fluctuation.
Further limitations and disadvantages of conventional and traditional approaches will become apparent to one of skill in the art through comparison of such systems with some aspects of the present disclosure as set forth in the remainder of the present application with reference to the drawings.
The present disclosure provides a system (i.e., an electro-mechanical agricultural equipment) mounted in a vehicle for performing an agricultural application and a method of operation of the system. The present disclosure provides a solution to the existing problem of frequent misidentification of crop plants in the presence of weeds in an uneven land area of the agricultural field, given the uncertainty in surrounding environmental conditions when the images are captured in motion, i.e., when the vehicle carrying the camera system is in motion. For example, the existing systems either fail or their accuracy is severely impacted when images are captured in changing surrounding environmental conditions, causing erroneous processing and operations of the conventional machines and agricultural implements that are aided by camera systems. An aim of the present disclosure is to provide a solution that at least partially overcomes the problems encountered in the prior art and provides an improved system that can be mounted in a vehicle for performing controlled and perceptive operations for an agricultural application with increased reliability in practice. The disclosed system is technically advanced in terms of detecting crop plants even when there are temporary or elastic changes in the morphology of the crop plants, for example, drooping leaves, an elastic change in physical characteristics caused by the movement of air across the crop plants, or a change in the shape of leaves due to other external conditions. The disclosed system is able to accurately identify such crop plants with an accuracy in the range of 98-100% in the uneven land area of the agricultural field, even if there is a change in the surrounding environmental conditions while the images are captured in motion. There is further provided an improved method of operation of the system, which improves the perceptive ability of the system, which in turn improves the operations of the system.
These and other advantages, aspects, and novel features of the present disclosure, as well as details of an illustrated embodiment thereof, will be more fully understood from the following description and drawings.
The summary above, as well as the following detailed description of illustrative embodiments, is better understood when read in conjunction with the appended drawings. Wherever possible, like elements have been indicated by identical numbers.
Embodiments of the present disclosure will now be described, by way of example only, with reference to the following diagrams wherein:
Certain embodiments of the disclosure may be found in a system mounted in a vehicle for performing an agricultural application and a method of operation of the system. In one aspect, the present disclosure provides the system mounted in the vehicle, where the system comprises a boom arrangement that includes a predefined number of electronically controllable sprayer nozzles, and a plurality of image-capture devices configured to capture a plurality of field-of-views (FOVs) of a plurality of defined areas of the agricultural field. The system includes one or more hardware processors configured to obtain a sequence of images corresponding to the plurality of FOVs from the plurality of image-capture devices and distinguish crop plants from weeds using a trained artificial intelligence (AI) model when the vehicle is in motion on the agricultural field. Moreover, the distinguishing of the crop plants from weeds using the trained AI model includes detecting a first set of crop plants with drooping leaves, detecting a second set of crop plants manifesting an elastic change in physical characteristics of the second set of crop plants, and detecting a third set of remaining crop plants different from the first set of crop plants and the second set of crop plants. Furthermore, the one or more hardware processors are configured to cause a specific set of electronically controllable sprayer nozzles from amongst the predefined number of electronically controllable sprayer nozzles to operate based on the distinguishing of the crop plants from the weeds. Moreover, the distinguished crop plants comprise the first set of crop plants with drooping leaves, the second set of crop plants manifesting the elastic change in physical characteristics, and the third set of remaining crop plants.
The system in the present disclosure is technically advanced in terms of improved and reliable detection of crop plants and accurate distinguishing of the crop plants from weeds. For instance, even if some plants have drooping leaves or there are temporary or elastic changes in the morphology of the crop plants, for example, an elastic change in physical characteristics caused by the movement of air across the crop plants, or a change in the shape of leaves due to other external conditions, the system is able to detect the crop plants without fail and with increased accuracy when the vehicle is in motion. This accuracy further cascades to the operation of the electronically controllable sprayer nozzles when a crop plant to be sprayed is beneath a given nozzle. Furthermore, the disclosed system is technically advanced over conventional systems in terms of improved detection of the crop plants by processing the images even if there are sudden jerks when the vehicle moves in the uneven land area of the agricultural field. The distinguishing of the crop plants from the weeds using the trained AI model is used to accurately identify the crop plants with an accuracy in the range of 98-100% in the uneven agricultural land, even if there is a change in the surrounding environmental conditions, such as a change in the shape of leaves due to wind, drooping of leaves, occlusion by weeds, high weed density around the crop plants, and the like. This significantly improves the subsequent operations of the system and imparts a perceptive ability to adapt to uneven agricultural land and handle real-time changes in the surrounding environmental conditions.
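For illustration only (and not as the actual implementation of the disclosure), the following minimal Python sketch shows the capture-distinguish-spray loop described above; the camera, detector, and nozzle-controller interfaces (e.g., `detector.run`, `nozzles.open_for_position`) are hypothetical placeholders.

```python
# Minimal sketch of the capture -> distinguish -> spray loop (hypothetical names).
from dataclasses import dataclass
from typing import List

@dataclass
class Detection:
    label: str         # "crop" or "weed"
    confidence: float  # detection confidence in [0, 1]
    x_mm: float        # ground position relative to the boom reference

def spray_cycle(cameras, detector, nozzles, confidence_threshold=0.9):
    """One cycle: obtain images, distinguish crops from weeds, actuate nozzles."""
    for image in (cam.capture() for cam in cameras):
        detections: List[Detection] = detector.run(image)
        for det in detections:
            # Whether crops or weeds are targeted depends on the application
            # scenario; here crop plants above the threshold trigger spraying.
            if det.label == "crop" and det.confidence >= confidence_threshold:
                nozzles.open_for_position(det.x_mm)
```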
In an implementation form, the pre-trained AI model is trained in a training phase by extracting a drooping leaf feature from a training dataset stored in a training database, extracting one or more leaf movement features from the training dataset, extracting one or more stem bending features from the training dataset, and extracting a plurality of features of a crop plant at different growth stages and at different times-of-day with and without surrounding weeds. Moreover, the plurality of features are indicative of physical characteristics of the crop plant at the different growth stages at the different times-of-day.
In the training phase, various advanced parameters, such as drooping of the leaves, movement of the leaves, bending of the stem, growth stage, weed density, and the like, are taken into account. This improves the capability of the system itself, which becomes perceptive and is able to accurately distinguish the crop plants from the weeds with an accuracy in the range of 98-100%, even if there is a change in the surrounding environmental conditions, such as a change in the shape of leaves due to wind, drooping of leaves, occlusion by weeds, high weed density around the crop plants, and the like, in practice.
In a further implementation form, the one or more hardware processors are configured to receive geospatial location correction data from an external device placed at a fixed location in the agricultural field and geospatial location coordinates associated with the boom arrangement mounted on the vehicle.
The use of the geospatial location correction data (e.g., real-time kinematic positioning (RTK) correction data) from the external device (e.g., an RTK base station), which is applied to the geospatial location coordinates obtained by a geospatial sensor provided in the boom arrangement, significantly improves the positional accuracy of the boom arrangement, i.e., provides centimetre (cm)-level accuracy of the position of the boom arrangement when the vehicle is in motion.
In a further implementation form, the one or more hardware processors are configured to execute mapping of pixel data of the weeds or the crop plants in an image to distance information from a reference position of the boom arrangement when the vehicle is in motion. Moreover, the specific set of electronically controllable sprayer nozzles are operated further based on the executed mapping of pixel data.
The mapping of pixel data of the weeds or the crop plants to distance information from the reference position of the boom arrangement when the vehicle is in motion increases the accuracy of the system in detecting the crop plants and the weeds in the agricultural field. This accuracy further cascades to the operation of the electronically controllable sprayer nozzles when a crop plant to be sprayed is beneath a given nozzle.
In a further implementation form, the one or more hardware processors are configured to define a confidence threshold indicative of a detection sensitivity of the crop plant in the AI model and automatically include or exclude a category of plants to be considered for operation by the one or more hardware processors based on the defined confidence threshold.
Advantageously, the use of the defined confidence threshold further improves the perceptive capability of the system such that the detection of the crop plant is achieved with improved accuracy and precision well before the actual action required on the crop plants, for example, perceptive spot spraying of a chemical, irrespective of any change in the surrounding environmental conditions while capturing images.
In a further implementation form, the one or more hardware processors are configured to update the defined confidence threshold in response to a change in a quality parameter of the captured plurality of FOVs of the plurality of defined areas of the agricultural field.
In such an implementation, an increase or a decrease in the defined confidence threshold dynamically changes the detection sensitivity of the crop plant, increasing the perceptive capability of the system and making the system fail-safe. For instance, sudden changes in surrounding conditions, such as leaf folding, drooping leaves, partial occlusion of crop plants by surrounding weeds, or high or varying weed density around the crop plants, can be handled.
In a further implementation form, the one or more hardware processors are further configured to determine an upcoming time slot to spray a chemical based on the distinguishing of the crop plants from the weeds.
The use of the defined confidence threshold significantly improves the perceptive capability of the system such that the spraying of chemicals is achieved with improved accuracy and precision for the correct time slots, at the correct intended areas or spots, only when required, with the correct amount of spray and the correct selection of a type of chemical, irrespective of any change in the surrounding environmental conditions while capturing images of crop plants in the agricultural field.
In a further implementation form, the one or more hardware processors are further configured to communicate control signals to operate a plurality of different sets of electronically controlled sprayer nozzles at different time instants during a spray session based on the distinguishing of the crop plants from the weeds. As a result of the improved perceptive ability of the system, the system is able to operate and control the different sets of electronically controlled sprayer nozzles to release a chemical at different time instants during the spray session. Moreover, such operation enables the system to behave differently for different types of crop plants and weeds, which overall increases the productivity of the system in the agricultural field.
In another aspect, the present disclosure provides a method of operation of the system. The method comprises obtaining, by one or more hardware processors, a sequence of images corresponding to a plurality of field-of-views (FOVs) of a plurality of defined areas of the agricultural field from a plurality of image-capture devices mounted in a boom arrangement of a vehicle, and distinguishing, by the one or more hardware processors, crop plants from weeds using a trained artificial intelligence (AI) model when the vehicle is in motion on the agricultural field. Moreover, the distinguishing of the crop plants from weeds using the trained AI model comprises detecting a first set of crop plants with drooping leaves, detecting a second set of crop plants manifesting an elastic change in physical characteristics of the second set of crop plants, and detecting a third set of remaining crop plants different from the first set of crop plants and the second set of crop plants. Furthermore, the method comprises causing, by the one or more hardware processors, a specific set of electronically controllable sprayer nozzles from amongst the predefined number of electronically controllable sprayer nozzles to operate based on the distinguishing of the crop plants from the weeds, wherein the distinguished crop plants comprise the first set of crop plants with drooping leaves, the second set of crop plants manifesting the elastic change in physical characteristics, and the third set of remaining crop plants.
The method achieves all the advantages and technical effects of the system of the present disclosure. Furthermore, the method is technically advanced in terms of its perceptive ability and is intelligent enough to adapt to uneven agricultural land, is astutely perceptive to real-time changes in the surrounding environmental conditions, and is not dependent on any row identification. For example, conventional camera-assisted or camera-aided machines can function in real-world conditions only if crops are planted in proper rows and columns in an agricultural field. Unlike the conventional systems, the disclosed method of the present disclosure does not need any prior plantation format to be followed.
It is to be appreciated that all the aforementioned implementation forms can be combined. All steps which are performed by the various entities described in the present application as well as the functionalities described to be performed by the various entities are intended to mean that the respective entity is adapted to or configured to perform the respective steps and functionalities. Even if, in the following description of specific embodiments, a specific functionality or step to be performed by external entities is not reflected in the description of a specific detailed element of that entity which performs that specific step or functionality, it should be clear for a skilled person that these methods and functionalities can be implemented in respective software or hardware elements, or any kind of combination thereof. It will be appreciated that features of the present disclosure are susceptible to being combined in various combinations without departing from the scope of the present disclosure as defined by the appended claims.
Additional aspects, advantages, features and objects of the present disclosure would be made apparent from the drawings and the detailed description of the illustrative implementations construed in conjunction with the appended claims that follow.
The following detailed description illustrates embodiments of the present disclosure and ways in which they can be implemented. Although some modes of carrying out the present disclosure have been disclosed, those skilled in the art would recognize that other embodiments for carrying out or practicing the present disclosure are also possible.
The system 102 is mounted in the vehicle 104 for an agricultural application. The system 102 includes the boom arrangement 114 that includes the predefined number of electronically controllable sprayer nozzles 116 and the plurality of image-capture devices 118 configured to capture a plurality of field-of-views (FOVs) of a plurality of defined areas of the agricultural field 106. The system 102 further includes one or more hardware processors (shown in
The boom arrangement 114 is removably mounted on the vehicle 104. The boom arrangement 114 includes one or more elongated booms that are interconnected through a single frame. The boom arrangement 114 comprises the predefined number of electronically controllable sprayer nozzles 116 and the plurality of image-capture devices 118. The predefined number of electronically controllable sprayer nozzles 116 are configured to spray a chemical on either a plurality of crop plants or weeds perceptively in a controlled manner, depending on an application scenario.
Each of the plurality of image-capture devices 118 may include suitable logic, circuitry, and/or interfaces that is configured to capture a plurality of field-of-views (FOVs) of a plurality of defined areas of the agricultural field 106 (of
In an implementation, the one or more hardware processors 202 may include one or more graphics processing units (GPUs) and a central processing unit (CPU). Examples of each of the one or more hardware processors 202 may include, but are not limited to, an integrated circuit, a co-processor, a microprocessor, a microcontroller, a complex instruction set computing (CISC) processor, an application-specific integrated circuit (ASIC) processor, a reduced instruction set (RISC) processor, a very long instruction word (VLIW) processor, a central processing unit (CPU), a state machine, a data processing unit, and other processors or circuits. Moreover, the one or more hardware processors 202 may refer to one or more individual processors, graphics processing devices, or a processing unit that is part of a machine.
The memory 204 may include suitable logic, circuitry, and/or interfaces that is configured to store machine code and/or instructions executable by the one or more hardware processors 202. Examples of implementation of the memory 204 may include, but are not limited to, an Electrically Erasable Programmable Read-Only Memory (EEPROM), Random Access Memory (RAM), Read Only Memory (ROM), Hard Disk Drive (HDD), Flash memory, a Secure Digital (SD) card, Solid-State Drive (SSD), a computer readable storage medium, and/or CPU cache memory. The memory 204 may store an operating system, such as a robot operating system (ROS) and/or a computer program product to operate the system 102. A computer readable storage medium for providing a non-transient memory may include, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing.
The AI model 210A is trained to obtain a trained AI model 210B. The training of the AI model 210A is described in detail, for example, in
In accordance with an embodiment, there is a training phase for the AI model 210A to obtain the trained AI model 210B used in an operational phase of the system 102. The training phase of the AI model 210A is described in detail in
Now referring to
In accordance with an embodiment, the training dataset 240 comprises colour images of crop plants captured in actual real-world conditions in the agricultural field 106 while the vehicle 104 is in motion. Furthermore, the training dataset 240 further comprises images that are captured at different times of day (e.g., early morning, evening, or night), at different growth stages (e.g., two-day cotton plant, three-day cotton plant), in different health states (e.g., diseased and non-diseased), and under different surrounding environmental conditions (e.g., with variation in sunlight due to clouds, rain, or the shadow of a large object such as a tree while capturing an image, the change in the position of the sun throughout the day, different light intensity when farming is done, etc.). In an implementation, such images are also captured when there is an external influence on the morphology of the crop plant, such as a change in the shape of leaves and stem when wind is blowing at different levels of speed, or when leaves are drooping due to excess watering or a lack of water. Thus, an extended set of training data (i.e., the training dataset 240) takes into account the different surrounding environmental conditions, including variation in sunlight, the growth stages, health state, external influence on the morphology of the crop plant, and drooping leaves, for the training of the AI model 210A.
In accordance with an embodiment, the one or more hardware processors 202 are configured to extract a plurality of different features from the training dataset 240 using the AI model 210A. In an implementation, the AI model 210A selected may be a deep neural network model, such as a convolutional neural network (CNN) model. The one or more hardware processors 202 are configured to extract a feature 242A indicative of a type of the crop plant 240A, such as a cotton plant, a chilli plant, and the like. The one or more hardware processors 202 are further configured to extract a drooping leaf feature 242B from the training dataset 240 (i.e., the training dataset of images) stored in a training database 246. The drooping leaf feature is indicative of one or more structural patterns visible in images of crop plants with drooping leaves. In an example, the one or more structural patterns may be a leaf shape, an orientation with respect to the stem, a texture, a colour, etc. Thus, the drooping leaf feature may include one or more sub-features of the one or more structural patterns. At the time of drooping, the stomata of the leaves begin to close, which obstructs the photosynthesis process and affects the health of the crop plant. The drooping of leaves occurs due to certain natural and artificial conditions. The natural conditions include abiotic stresses such as extremely high or low temperatures, drought, flood, salts, strong wind, insufficient sunlight, etc. The artificial conditions include overwatering (supply of more water than required), underwatering (supply of less water than required), supply of contaminated water, etc. Because of drooping, the texture and pattern of the leaves get disturbed and may closely resemble those of weeds. Thus, from images received through the image-capture devices 118, the AI model 210A is trained to classify the images in which a drooping condition of the leaves has occurred.
In accordance with an embodiment, the one or more hardware processors 202 are further configured to extract one or more leaf movement features 242C from the training dataset. In certain scenarios, for example, due to the movement of air, the natural structure of the leaves and the stem becomes temporarily disturbed. For example, depending on the intensity or force of the air blowing on the crop plants, the structure of the crop plants gets disturbed due to loss of turgidity of the leaves. Moreover, there is a possibility of overlapping of leaves due to the impact of aerodynamic force. Due to overlapping, the leaves get constricted and the gap between the leaves gets reduced. The extent of leaf movement due to wind also varies with respect to the time of day and the season. For example, the movement of the leaves in the winds of the rainy season is more rapid than that of the summer season at the same time of day. Thus, the images stored in the training dataset 240 fed to the AI model 210A during training include images captured and labelled at a specified range of wind speed during a specified time of day (which depends on the region and geography of the location at which the crop plants are situated, i.e., the maximum wind speed of the region, etc.). For example, one image may be taken at 12 pm when the wind speed is 5 m/s (which may be a typical maximum wind speed in a given geographical region). Thus, the plurality of features extracted are interrelated and synergistic in nature, and work in sync to enhance the training effectiveness of the AI model 210A. The one or more hardware processors 202 are further configured to extract one or more stem bending features 242D from the training dataset 240. Like the leaves, depending on the type of plant and its inherent strength, the stem may also bend temporarily when there is a strong movement of air (e.g., a strong wind). The AI model 210A further takes the stem bending into account in its training. In an example, the angle of bending of the stem depends on the speed of the wind. Images of crop plants in which the stem is bent at different angles due to the wind speed are captured and stored in the training dataset 240. The advantage of extracting the stem bending feature of the crop plants is that the crop plants which are bent due to wind are not perceived as weeds by the AI model 210A and are not excluded from receiving a supply of sprayed chemical in the operational or execution phase.
In accordance with an embodiment, the one or more hardware processors 202 are further configured to extract a plurality of other features 244 of a crop plant at different growth stages and at different times-of-day with and without surrounding weeds, where the plurality of other features are indicative of physical characteristics of the crop plant at the different growth stages at the different times-of-day. In an implementation, the plurality of other features 244 comprises an age and growth state of the crop plant, a time of capture of the image, a weed pattern, a canopy pattern, a crown pattern, and the like. In an example, a sub-feature of an overall appearance of the type of crop plant may be indicative of an age and growth state of the crop plant (e.g., a fifteen-day cotton plant, a 30-day cotton plant, and the like). In an example, the age and growth state of the crop plant may be useful to decide which chemical is to be selected for spraying and what the ideal amount shall be during the execution or operational phase of the system 102. In another example, the age of the crop plant may be useful to identify whether the crop plant is eligible for tolerating a certain chemical spray or not; for example, a fifteen-day cotton plant may die due to the spray of a fungicide in the execution phase once the AI model 210A is trained. Thus, the fungicide may not be sprayed over the fifteen-day cotton plant. Further, during the training, the AI model 210A extracts one or more features of a crop plant at different growth stages and at different times of day with or without surrounding weeds from the training dataset 240. The growth stages of crop plants typically include the meristematic, elongation, and maturation phases. During the meristematic phase, the tips of the roots and shoots of the crop plant exhibit continuous growth. In the elongation phase, the cells of the plant expand, and in the maturation phase, a full-grown plant with roots is formed. The reason for considering the stages of growth of the crop plant is to enable the AI model 210A to detect the crop plants which are in their intermediate stage of growth while spraying the chemicals. The stage of growth is important in two ways: firstly, to determine the quantity of chemical to be sprayed on the crop plant, i.e., a crop plant which is at an intermediate growth stage requires a different quantity of chemical than a full-grown plant; and secondly, to distinguish between the crop plant and the weed without an incorrect perception due to the similarity of physical characteristics between the short-grown crop plant and the weed while determining the time and quantity of chemical to be sprayed. For example, at the meristematic phase, the size of the crop plant is smaller and comparable to that of a weed. Therefore, the training dataset 240 is fed with images that include the details of the growth stages of the crop plants at different times of day. The features related to the growth stages of the crop plant are physical characteristics of the crop plant at the different growth stages at the different times-of-day. The physical characteristics include the size, shape, colour, texture, etc. of the leaves of the crop plant. These characteristics vary at different stages of growth. The advantage of considering the physical characteristics of the crop plant is to eliminate perception error by the trained AI model 210B while distinguishing between the crop plant and the weed.
In accordance with an embodiment, the one or more hardware processors 202 are configured to timestamp the captured images, the timestamp being indicative of the time of capture (i.e., a time of day). The colour displayed in the captured images may vary due to rain, shadow, sunlight, or other such environmental surroundings. Thus, the time of capture improves the perceptive ability in the execution phase to distinguish between the actual colour of the crop plant and any colour which is not natural and may likely be due to external environmental conditions, such as shadow, cloud, rain, time of day, etc. The one or more hardware processors 202 are further configured to extract the weed pattern from the training dataset 240 indicative of a weed density around a crop plant. In certain scenarios, due to high weed density, the leaves of the weeds occlude the leaves of the identified crop plant. Therefore, during the training of the AI model 210A, an area covered by the crop plant and the weeds surrounding it is also taken into account to prevent false detection of the crop plant, such as due to occlusion of the leaves of the crop plant by the weeds in the execution phase.
In accordance with an embodiment, the AI model 210A is trained using the plurality of different features extracted from the training dataset 240 to obtain a trained AI model 210B. It is to be understood that for different types of crops, such as cotton, chilli, and tomato, such features are different, and thus, based on the type of crop plant, different sets of parameters are loaded for training. In an implementation, different AI models may be trained for different types of crop plants. In another implementation, the same AI model 210A is trained on the crop-specific parameters extracted from the training dataset 240. The trained AI model 210B may then be used by the one or more hardware processors 202 to detect and track the crop plant with high accuracy and distinguish it from the weeds with improved reliability.
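Purely as an illustrative sketch of such a training phase (assuming a recent PyTorch/torchvision stack, and with all attribute names being hypothetical rather than the actual labels of the disclosure), a multi-label CNN could be trained on the extracted features as follows:

```python
# Illustrative multi-label training sketch for the AI model (hypothetical labels).
import torch
import torch.nn as nn
from torchvision import models

# Hypothetical per-image attributes mirroring the extracted features described
# above: drooping leaves, leaf movement, stem bending, and crop-vs-weed.
ATTRIBUTES = ["drooping_leaves", "leaf_movement", "stem_bending", "is_crop"]

model = models.resnet18(weights=None)                 # backbone CNN
model.fc = nn.Linear(model.fc.in_features, len(ATTRIBUTES))
criterion = nn.BCEWithLogitsLoss()                    # multi-label objective
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

def train_step(images, labels):
    """images: (B, 3, H, W) float tensor; labels: (B, len(ATTRIBUTES)) in {0, 1}."""
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```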
Now referring back to the
The system 102 further comprises the boom arrangement 114 that comprises the predefined number of electronically controllable sprayer nozzles 116. The predefined number of electronically controllable sprayer nozzles 116 are electronically controlled by use of solenoid valves which control the flow (e.g., on, off, pressure, and volume) of chemicals through the sprayer nozzles. In an implementation, the predefined number of electronically controllable sprayer nozzles 116 of the boom arrangement 114 may be divided into a first set, a second set, and a third set in order to spray chemicals on the left side, the right side, and the front side of the vehicle 104, respectively, when the vehicle 104 is moving across the agricultural field 106. Moreover, there may be a specific distance (e.g., 25 cm) between the plurality of image-capture devices 118 and the predefined number of electronically controllable sprayer nozzles 116 of the boom arrangement 114. The specific distance can be increased (e.g., up to 50 cm) by tilting each of the plurality of image-capture devices 118. The calibration of the specific distance between the plurality of image-capture devices 118 and the predefined number of electronically controllable sprayer nozzles 116 of the boom arrangement 114 provides a certain amount of time for image processing and for switching on the sprayer nozzles. The predefined number of electronically controllable sprayer nozzles 116 may be placed below the plurality of image-capture devices 118 in order to reduce delay, such that less time is consumed in spraying the chemicals. In conventional agricultural systems, it is required to tilt a boom, rotate the boom, or retract or fold up a part of the boom when in operation. In contrast to the conventional agricultural systems, there is no such requirement in the boom arrangement 114 of the system 102. The predefined number of electronically controllable sprayer nozzles 116 further includes a plurality of spray valves 206 and a plurality of spray controllers 208 (e.g., solenoids). Moreover, each spray valve from the plurality of spray valves 206 is attached to a corresponding sprayer nozzle of the predefined number of electronically controllable sprayer nozzles 116. Further, the one or more hardware processors 202 are configured to send an instruction (e.g., an electrical signal) at a first time instant to at least one spray controller (e.g., a solenoid) from the plurality of spray controllers 208 to activate or deactivate a specific set of spray valves associated with the identified sprayer nozzles.
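As a minimal sketch of how the spray controllers 208 might energize the solenoid-driven spray valves 206 (the GPIO interface and pin mapping below are hypothetical placeholders, not part of the disclosure):

```python
# Illustrative actuation of a specific set of spray valves via solenoids.
import time

class SprayController:
    def __init__(self, gpio, valve_pins):
        self.gpio = gpio
        self.valve_pins = valve_pins   # nozzle index -> output pin (hypothetical)

    def activate(self, nozzle_ids, duration_s):
        """Energize the solenoids of the selected valves for duration_s seconds."""
        for n in nozzle_ids:
            self.gpio.write(self.valve_pins[n], 1)   # energize -> valve open
        time.sleep(duration_s)
        for n in nozzle_ids:
            self.gpio.write(self.valve_pins[n], 0)   # de-energize -> valve closed
```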
The system 102 further comprises the one or more hardware processors 202 configured to obtain a sequence of images corresponding to the plurality of FOVs from the plurality of image-capture devices 118. In an implementation, the plurality of images captured by the plurality of image-capture devices 118 may include one or more images of the agricultural field 106 captured in different environmental conditions; for example, a few images may be captured in daylight, a few in the evening, and a few at night. Moreover, the plurality of images also includes one or more images captured in a cloudy or rainy environment. In an implementation, the plurality of images captured by the plurality of image-capture devices 118 are stored in the memory 204. In an example, the plurality of images are further processed by a crop detector 212 and a crop tracker 214. The crop detector 212 is configured to detect a crop plant using the trained AI model 210B, which further leads to more accurate differentiation between crop plants and weeds in different environmental conditions and enables the boom arrangement 114 of the system 102 to perform efficient and effective chemical spraying in the agricultural field 106. Moreover, the crop tracker 214, using the trained AI model 210B, is also configured to track the location of each crop plant from the captured plurality of images. In an example, the crop detector 212 and the crop tracker 214 can be implemented in hardware circuitry. In another example, the crop detector 212 and the crop tracker 214 may be implemented as functions or logic stored in the memory 204.
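For illustration, a crop tracker of this kind could associate detections across frames using the vehicle's odometry; the following Python sketch makes that idea concrete, with the 30 mm association gate and all names being hypothetical example choices rather than the disclosure's implementation:

```python
# Illustrative odometry-aided crop tracking across successive frames.
class CropTracker:
    def __init__(self, gate_mm=30.0):
        self.tracks = {}        # track id -> last known ground position (mm)
        self.next_id = 0
        self.gate_mm = gate_mm  # max distance to match a detection to a track

    def update(self, detections_mm, odometry_dx_mm):
        """detections_mm: ground positions seen in this frame;
        odometry_dx_mm: distance travelled since the previous frame."""
        # Predict where previously seen plants should appear after the motion.
        for tid in self.tracks:
            self.tracks[tid] -= odometry_dx_mm
        for x in detections_mm:
            nearest = min(self.tracks, key=lambda t: abs(self.tracks[t] - x),
                          default=None)
            if nearest is not None and abs(self.tracks[nearest] - x) < self.gate_mm:
                self.tracks[nearest] = x          # same plant observed again
            else:
                self.tracks[self.next_id] = x     # a new plant entered the FOV
                self.next_id += 1
```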
With reference to
The one or more hardware processors 202 are further configured to cause a specific set of electronically controllable sprayer nozzles from amongst the predefined number of electronically controllable sprayer nozzles 116 to operate based on the distinguishing of the crop plants from the weeds, where the distinguished crop plants comprise the first set of crop plants 246A with drooping leaves, the second set of crop plants 246B manifesting the elastic change in physical characteristics, and the third set of remaining crop plants 246C. The specific set of electronically controllable sprayer nozzles is selected based on the different sets (first, second, and third) of the crop plants over which the chemical is to be sprayed. In an implementation, the specific set of electronically controllable sprayer nozzles from amongst the predefined number of electronically controllable sprayer nozzles 116 can be operated either automatically by virtue of the one or more hardware processors 202 or manually, depending on the requirement. The operation of the predefined number of electronically controllable sprayer nozzles 116 depends on an accurate and improved distinguishing between the crop plants and the weeds, which takes into account the first set of crop plants with drooping leaves, the second set of crop plants manifesting the elastic change in physical characteristics, and the third set of remaining crop plants.
In an implementation, the memory 204 further includes a STM coordinator 216, a state estimator (SE) 218, and a real-time kinematics (RTK) module 220. In an example, each of the STM coordinator 216, the SE 218, and the RTK module 220 can be implemented in hardware circuitry or logic. The STM coordinator 216 is configured to coordinate between the crop detector 212, the crop tracker 214, and the AI model 210 to process the captured plurality of images. Moreover, the SE 218 works in coordination with the RTK module 220, which is configured to process positioning details of the crop plants and weeds from the captured images with improved accuracy. In an example, the SE 218 is configured to receive data related to the position of the crop plants and the weeds from the RTK module 220. In addition, the SE 218 is configured to receive freewheel odometry values from the vehicle 104 and provide a fused odometry output that is published in the memory 204 and used by the crop tracker 214 to track positions of the crop plants and weeds.
The one or more hardware processors 202 are further configured to receive geospatial location correction data from the external device 108 placed at a fixed location in the agricultural field 106 and geospatial location coordinates associated with the boom arrangement 114 mounted on the vehicle 104. In an example, the geospatial location coordinates associated with the boom arrangement 114 are obtained based on a geospatial sensor 222 arranged in the boom arrangement, for example, on a printed circuit board (PCB) where the one or more hardware processors 202 are disposed. In an implementation, the external device 108 may also be referred to as a real-time kinematics global positioning system (RTK GPS) module. The external device 108 is configured to provide the geospatial location correction data, that is, the exact location of the vehicle 104 together with error correction data in the agricultural field 106 when the vehicle 104 is moving at a specific range of speed across the agricultural field 106. Moreover, the external device 108 provides the geospatial location coordinates of the boom arrangement 114 that is mounted on the vehicle 104. In conventional agricultural systems, a GPS module is located inside a vehicle, which provides the location data of the vehicle. It is observed during experimentation that, by virtue of locating the GPS module inside the vehicle, there is an error in the location accuracy of the vehicle. In contrast to the conventional agricultural systems, the external device 108 provides not only the exact location but also the error correction data. Additionally, the external device 108 provides the geospatial location coordinates of the boom arrangement 114 that mounts the plurality of image-capture devices 118, the predefined number of electronically controllable sprayer nozzles 116, and the one or more hardware processors 202, so that there is no delay in the processing of data and a high location accuracy (e.g., accuracy in centimetres (cm)) can be achieved.
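To make the idea of the correction data concrete, the following simplified Python sketch applies a base-station correction to the rover (boom-mounted) coordinates; a real RTK solution resolves carrier-phase ambiguities rather than applying a simple offset, so this is only a conceptual approximation:

```python
# Simplified differential-correction sketch (conceptual stand-in for RTK).
def apply_correction(rover_lat, rover_lon, base_known, base_measured):
    """base_known: surveyed (lat, lon) of the fixed base station;
    base_measured: the base station's current GPS reading.
    The same instantaneous error is assumed to affect the nearby rover."""
    dlat = base_known[0] - base_measured[0]
    dlon = base_known[1] - base_measured[1]
    return rover_lat + dlat, rover_lon + dlon
```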
In an implementation, the external device 108 is set up on a tripod. Moreover, the external device 108 includes a solar panel 226, a solar charger 228, a battery 230, a DC-to-DC converter 232, a remote control (RC) module 234, a microcontroller 236, and a RTK module 238. The solar panel 226 is configured to be removably and electrically coupled to the external device 108. The solar panel 226 is further configured to capture solar energy and convert it into electrical energy, which is further stored in the battery 230 that is electrically coupled to the solar panel 226. Thereafter, the DC-to-DC converter 232 is configured to convert an output of the battery 230 from one voltage level to another, such as to provide a desired voltage to the RC module 234. In an example, the RC module 234 is configured to work at a specified frequency, for example, 2.4 gigahertz (GHz), or at another frequency value without limiting the scope of the disclosure. In addition, the microcontroller 236 is communicatively coupled with the RC module 234 as well as with the RTK module 238, for example, through a universal asynchronous receiver-transmitter (UART). The microcontroller 236 is configured to control the RC module 234 and the RTK module 238, such as to ensure that the system is within a desired range from the external device 108. For example, the RC module 234 and the RTK module 238 are configured to receive signals from an antenna 224 of the system 102.
The one or more hardware processors 202 are further configured to execute mapping of pixel data of weeds or a crop plant in an image to distance information from a reference position of the boom arrangement 114 when the vehicle 104 is in motion. In contrast to conventional agricultural systems, the one or more hardware processors 202 of the system 102 are configured to map pixel-level data of the weeds or the crop plant in the image to distance information to achieve high accuracy. The distance information signifies the information about the location of the weeds and the crop plant relative to the reference position of the boom arrangement 114 when the vehicle 104 is in motion, that is, how far and in which direction the weeds and the crop plant are located in the agricultural field 106 from the reference position of the boom arrangement 114. Each pixel of the image is mapped to the distance information in millimetres (mm); for example, 1 pixel is mapped to 3 mm on the real ground, i.e., a pixel-per-mm mapping is performed. The mapping of the image depends on a certain threshold value; if the threshold value is different, the mapping of the image will be different. In an implementation, a sub-pixel (or a virtual pixel) of each pixel of the image can be considered to achieve more accuracy.
In an example, in order to execute the mapping, each of the plurality of image-capture devices 118, positioned above the predefined number of electronically controllable sprayer nozzles 116, captures an image frame of the ground (i.e., the agricultural field) with crop plants and weeds. From this image frame, the one or more hardware processors 202 are configured to map each crop plant/weed to a coordinate, where the location correction data (RTK GPS) provides the geolocation of the image frame. Using the image frame and the geolocation, a precise geolocation of each crop plant and/or weed can be determined up to a precision of +/-2.5 cm. Given that the predefined number of electronically controllable sprayer nozzles 116 may be a distance "X" away from at least one of the plurality of image-capture devices 118, the system 102 waits until one or more electronically controllable sprayer nozzles reach the geolocation or geocoordinates of the crop plant to initiate spraying of a defined chemical. The boom orientation sensor data, boom height data, and camera orientation data are fused with the image frame and the geolocation of the image frame to derive an accurate coordinate of each crop plant and weed to precisely and perceptively spray on each crop plant (or weed, if so desired).
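As an illustrative sketch of this mapping and the wait-until-arrival behaviour (the 3 mm-per-pixel scale and the 250 mm camera-to-nozzle offset are hypothetical example values consistent with the figures quoted above):

```python
# Illustrative pixel-to-ground mapping and spray-delay computation.
MM_PER_PIXEL = 3.0           # e.g., 1 pixel maps to 3 mm on the ground
CAMERA_TO_NOZZLE_MM = 250.0  # example distance "X" between camera and nozzles

def pixel_to_ground_mm(pixel_x, image_center_x):
    """Ground offset of a detection from the boom reference, in mm."""
    return (pixel_x - image_center_x) * MM_PER_PIXEL

def spray_delay_s(vehicle_speed_mm_s):
    """Time until a plant seen under the camera passes beneath the nozzles."""
    return CAMERA_TO_NOZZLE_MM / vehicle_speed_mm_s
```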
In an implementation, the one or more hardware processors 202 are further configured to cause the specific set of electronically controllable sprayer nozzles from amongst the predefined number of electronically controllable sprayer nozzles 116 to operate further based on a defined confidence threshold and the executed mapping of pixel data, where the defined confidence threshold is indicative of a detection sensitivity of the crop plant. In an implementation, the specific set of electronically controllable sprayer nozzles from amongst the predefined number of electronically controllable sprayer nozzles 116 can be operated either automatically by virtue of the one or more hardware processors 202 or manually, depending on requirement. The operation of the predefined number of electronically controllable sprayer nozzles 116 depends on the defined confidence threshold and the executed mapping of pixel data. The defined confidence threshold is the threshold value of the AI model 210A. The defined confidence threshold is adaptive in real time or can be set manually by use of a user interface (UI) of the custom application 112 via the display device 110 (of
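A minimal sketch of such confidence-threshold gating, with a plausible (hypothetical) update policy for when the image quality changes, is as follows:

```python
# Illustrative confidence-threshold gating and adaptation (hypothetical policy).
def select_actionable(detections, threshold):
    """Keep only detections whose confidence meets the detection-sensitivity threshold."""
    return [d for d in detections if d["confidence"] >= threshold]

def update_threshold(threshold, image_quality):
    """One plausible policy: raise the bar on good frames to avoid false
    positives, lower it slightly on poor frames so crops are not missed."""
    step = 0.05 if image_quality > 0.8 else -0.05
    return max(0.5, min(0.95, threshold + step))
```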
In accordance with an embodiment, the one or more hardware processors 202 are further configured to determine a height of a tallest crop plant from among a plurality of crop plants from a ground plane in the agricultural field 106 and set a boom height from the ground plane based on the determined height of the tallest crop plant. In an example, the system 102 further includes an ultrasonic sensor that is used together with the plurality of image-capture devices 118 to determine the height of the crop plant from the ground level. The height of the tallest crop plant from among the plurality of crop plants is determined from the ground plane in the agricultural field 106. The reason for determining the height of the tallest crop plant from among the plurality of crop plants is to include each and every crop plant with a height lying in the range from the smallest to the tallest crop plant. Furthermore, the one or more hardware processors 202 are configured to set the boom height of the boom arrangement 114 from the ground plane based on the determined height of the tallest crop plant.
In accordance with an embodiment, the one or more hardware processors 202 are further configured to determine an upcoming time slot to spray a chemical based on the executed mapping of the pixel data, the defined confidence threshold, and the set boom height. In an implementation, the upcoming time slot may be referred to as a time period (or a time window) which is required to spray the chemical either on the crop plant or on weeds based on the executed mapping of the pixel data, the defined confidence threshold, and the set boom height. For example, 500 to 800 milliseconds (msec) may be required to spray the chemical on the crop plant or on the weeds. The time period of 500 to 800 msec is referred to as the upcoming time slot. By use of the executed mapping of the pixel data, the defined confidence threshold, and the set boom height, the chemical is sprayed either on the crop plant or on weeds in a controlled amount as well. In an implementation, the chemical may be sprayed on the crop plant in order to either protect the crop plant from disease or to promote the growth of the crop plant. In another implementation, the chemical may be sprayed on the weeds for weed management.
In accordance with an embodiment, the determining of the upcoming time slot to spray the chemical is further based on a size of the crop plant occupied in a two-dimensional space in the x and y coordinate directions. The determination of the upcoming time slot (or the time period) to spray the chemical on the crop plant is based on the size of the crop plant in the two-dimensional space in the x and y coordinate directions. In an implementation, the x coordinate direction indicates the direction of motion of the vehicle 104 and the y coordinate direction indicates the height of the crop plant.
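For illustration, the time slot can be derived from the plant's extent along the direction of travel (the x coordinate direction) and the vehicle speed; the margin value below is a hypothetical example:

```python
# Illustrative computation of the upcoming spray time slot.
def spray_time_slot_s(plant_extent_x_mm, vehicle_speed_mm_s, margin_mm=50.0):
    """Duration the nozzle must stay open so the whole plant passes under it."""
    return (plant_extent_x_mm + 2 * margin_mm) / vehicle_speed_mm_s

# e.g., a 300 mm-wide plant at 500 mm/s gives (300 + 100) / 500 = 0.8 s,
# consistent with the 500-800 msec slot mentioned above.
```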
In accordance with an embodiment, the one or more hardware processors 202 are further configured to determine one or more regions in the agricultural field 106 where a chemical is to be sprayed based on the executed mapping of pixel data and the defined confidence threshold. Currently, the operation of conventional agricultural systems is based on proper demarcation of an agricultural field. In other words, row identification and row-based processing form an indispensable component of the conventional agricultural systems. Therefore, the conventional agricultural systems fail when used in an agricultural field where there is no proper demarcation of rows, as in India and many other countries. In contrast to the conventional agricultural systems, the system 102 is applicable to both row-based and non-row-based agricultural fields. The one or more hardware processors 202 of the system 102 are configured to determine the one or more regions of the agricultural field 106 where the chemical is to be intelligently sprayed based on the executed mapping of pixel data and the defined confidence threshold.
In accordance with an embodiment, the specific set of electronically controllable sprayer nozzles from amongst the predefined number of electronically controllable sprayer nozzles 116 is caused to operate specifically at the determined one or more regions in the agricultural field 106 for a first time slot that corresponds to the determined upcoming time slot. After determination of the one or more regions (i.e., either row-based or non-row-based) in the agricultural field 106 where there is a requirement to spray the chemical, the specific set of electronically controllable sprayer nozzles from amongst the predefined number of electronically controllable sprayer nozzles 116 is caused to operate for the first time slot that corresponds to the determined upcoming time slot (i.e., the time period). The specific set of electronically controllable sprayer nozzles may include either the first set, the second set, or the third set in order to spray the chemicals on the left side, the right side, or the front side of the vehicle 104, respectively, when the vehicle 104 is moving across the agricultural field 106. The operation of the specific set of electronically controllable sprayer nozzles from amongst the predefined number of electronically controllable sprayer nozzles 116 is described in further detail, for example, in
In accordance with an embodiment, the one or more hardware processors are further configured to control an amount of spray of a chemical for the first time slot from each of the specific set of electronically controllable sprayer nozzles by regulating an extent of opening of a valve associated with each of the specific set of electronically controllable sprayer nozzles. Since each of the specific set of electronically controllable sprayer nozzles is electronically controlled by use of the valve (e.g., a solenoid valve), the amount of spray of the chemical can be controlled for the first time slot by regulating the extent of opening of the valve.
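Pulse-width modulation (PWM) is one common way to realize an "extent of opening" with an on/off solenoid valve; the disclosure does not mandate PWM, and the period and cycle values in the sketch below are hypothetical:

```python
# Illustrative flow regulation by pulse-width modulating a solenoid spray valve.
import time

def modulate_flow(valve, flow_fraction, period_s=0.1, cycles=10):
    """Open the valve for flow_fraction of each period to approximate partial flow."""
    duty = max(0.0, min(1.0, flow_fraction))
    for _ in range(cycles):
        valve.open()                         # energize solenoid
        time.sleep(duty * period_s)
        valve.close()                        # de-energize solenoid
        time.sleep((1.0 - duty) * period_s)
```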
In accordance with an embodiment, the one or more hardware processors 202 are further configured to communicate control signals to operate a plurality of different sets of electronically controlled sprayer nozzles at different time instants during a spray session. In order to regulate the operation of the predefined number of electronically controllable sprayer nozzles 116, the one or more hardware processors 202 are configured to communicate the control signals (e.g., clock signals) to operate the plurality of different sets of electronically controlled sprayer nozzles at different time instants during the spray session.
In accordance with an embodiment, the one or more hardware processors 202 are further configured to receive a user input, via the custom application 112 rendered on the display device 110, wherein the user input corresponds to a user-directed disablement, or an enablement, of one or more electronically controllable nozzles to override an automatic activation and deactivation of the one or more electronically controllable nozzles during a spray session. In an implementation, when a user moves the vehicle 104 across the agricultural field 106, the user may provide the user input through the custom application 112 rendered on the display device 110. The display device 110 may be in the form of either a tablet or a smartphone installed on one side of the vehicle 104. The user provides the user input either for deactivating or activating the one or more electronically controllable nozzles to stop or operate, respectively, the one or more electronically controllable nozzles during the spray session. An implementation scenario of the user-directed disablement, or the enablement, of one or more electronically controllable nozzles to override the automatic activation and deactivation of the one or more electronically controllable nozzles during the spray session is described in detail, for example, in
Thus, the system 102 enables an intelligent spraying of the chemicals in the agricultural field 106 in a controlled manner. The use of the AI model 210A enables the plurality of image-capture devices 118 to capture high-quality images of the agricultural field 106 despite variation in the environmental parameters (i.e., variation in sunlight due to clouds, rain, or the shadow of a large object). Moreover, the AI model 210A enables the system 102 to clearly differentiate between two green-looking objects (e.g., crop plants and weeds) and results in a controlled spraying of chemicals on the agricultural field 106. Additionally, the geospatial location correction data received from the external device 108 enables the system 102 to have an exact location of the vehicle 104 with error correction data even when the vehicle 104 is moving at a specific range of speed across the agricultural field 106. The geospatial location coordinates of the boom arrangement 114 provided by the external device 108 enable the system 102 to have a high location accuracy of the vehicle 104. Moreover, mapping each image at the pixel level (or at the sub-pixel level) to the distance information enables the system 102 to have a more accurate location of the crop plants and weeds in the agricultural field 106 relative to the boom arrangement 114, so that efficient spraying of chemicals can be achieved. Furthermore, using the specific set of electronically controllable sprayer nozzles from amongst the predefined number of electronically controllable sprayer nozzles 116, depending on the application scenario, increases the efficiency and practical utility of the system 102.
In accordance with an embodiment, the specific set of electronically controllable sprayer nozzles are operated further based on a predefined operating zone (indicated by the UI element 316) of the vehicle 104, where the predefined operating zone (indicated by the UI element 316) defines a range of speed of the vehicle 104 in which an accuracy of the detection sensitivity of the crop plant is greater than a threshold. The predefined operating zone of the vehicle 104 means that when the vehicle 104 is moved through the agricultural field 106 within a specific range of speed, for example, from 40 to 70 centimeters per second (cm/s), the accuracy of the detection sensitivity of the crop plant is greater than the threshold. Alternatively stated, within the predefined operating zone of the vehicle 104, the crop plant can be detected, tracked, identified with a crop type, and distinguished from weeds and any other green-looking objects with improved accuracy.
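As a non-limiting illustration, the following Python sketch checks whether a vehicle speed lies inside the predefined operating zone, using the 40-70 cm/s example above; the function name and reporting behaviour are hypothetical.

```python
# Hypothetical check that the vehicle speed lies inside the predefined
# operating zone (40-70 cm/s in the example above), where detection
# accuracy exceeds the threshold.
def in_operating_zone(speed_cm_s, low=40.0, high=70.0):
    return low <= speed_cm_s <= high

for v in (35.0, 55.0, 75.0):
    status = "OK" if in_operating_zone(v) else "outside zone"
    print(f"{v} cm/s: {status}")
```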
In an implementation, a custom application 112 is pre-installed in the display device 110. The custom application 112 has multiple user interfaces (UIs), of which the UI 112A is one. The custom application 112 is designed and configured to directly establish communication with a Robot Operating System (ROS) layer of the system 102 to perform any specified operations of the system 102.
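For illustration purposes only, and assuming a ROS 1 (rospy) environment, the following Python sketch shows how a custom application could publish a command onto a topic consumed by a ROS layer; the topic name "/sprayer/cmd", the node name, and the message format are hypothetical assumptions, not the system's actual interface.

```python
import rospy
from std_msgs.msg import String

# Minimal sketch, assuming a ROS 1 (rospy) environment: the custom
# application publishes a command onto a topic consumed by the system's
# ROS layer. The topic name "/sprayer/cmd" is an assumption.
def send_command(cmd):
    pub = rospy.Publisher("/sprayer/cmd", String, queue_size=10)
    rospy.init_node("custom_application", anonymous=True)
    rospy.sleep(0.5)               # allow the publisher to register
    pub.publish(String(data=cmd))

if __name__ == "__main__":
    send_command("start_spray_session")
```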
The UI element 302 indicates a driver role and corresponding functions made available to a user operating the vehicle 104 as per the defined driver role. The UI element 304 indicates a connection status of the system 102. The UI element 306 indicates a spray mode selected as a perceptive spot spraying mode. The UI element 308 indicates a predetermined boom height range that is optimal for a tallest plant height determined by the system 102, as well as a current boom height from the ground plane. The boom height range is determined for a given plant height based on experimentation, where an optimal result was achieved previously and saved in a database for later use. The UI element 310 indicates a type of crop plant (such as a cotton plant in this case) that is the current object-of-interest, to be acted on or sprayed with a specified chemical. The UI element 312 indicates whether a geospatial sensor signal quality (e.g., GPS signal quality) received from the external device 108 is good or not. The UI element 314 indicates a battery status of the system 102 used to power the components of the system 102. The UI element 318 indicates a current device activity status, i.e., whether the system 102 is in operation or idle. The UI element 320 indicates a pause or resume function in terms of operation of the system 102. The UI element 322 provides a control to visualize or update various operations and their corresponding settings or parameters. The UI element 324 is a sprayer control that provides an option to test and manually enable or disable some selected electronically controllable sprayer nozzles of the predefined number of electronically controllable sprayer nozzles 116. Such manual selection is sometimes needed to avoid double spraying of chemicals or under some unforeseen scenarios. An example of such a circumstance is explained in
In an implementation, the defined confidence threshold 410A is set in real-time or near real-time in the AI model 210A of the system 102. Alternatively, the defined confidence threshold 410A is pre-set via the UI 112B rendered on the display device 110 communicatively coupled to the one or more hardware processors 202. In yet another implementation, the defined confidence threshold 410A is adaptive and may automatically be changed depending on a surrounding environmental condition, a crop type, and/or a captured image input from the plurality of image-capture devices 118. Examples of the surrounding environmental conditions while capturing images of the agricultural field may include, but are not limited to, a variation in sunlight due to clouds, rain, or a shadow of a large object (such as a tree) in an image, a change in the position of the sun throughout the day, a change in light intensity, a time of day when farming is done, and an extent of resistance from mud in the agricultural field 106.
In the exemplary scenario 400, the UI element 402 is a detection control that controls the detection sensitivity of the crop plant by calibrating the defined confidence threshold 410A, as indicated by the UI element 410. The defined confidence threshold 410A is automatically (or optionally manually) increased or decreased, depending on the requirement. If the defined confidence threshold 410A increases, the detection sensitivity of the crop plant increases. The confidence threshold value may range from 0 to 1. An increase or decrease of the defined confidence threshold 410A correspondingly increases or decreases the perceptiveness of the system 102. For example, at a first defined confidence threshold, say 0.X1, the one or more hardware processors 202 are configured to distinguish between green-looking objects, such as crop plants and weeds. At a second defined confidence threshold, say 0.X2, the one or more hardware processors 202 are configured to further distinguish between a type of crop plant and a type of weed. At a third defined confidence threshold, say 0.X3, the one or more hardware processors 202 are configured to further distinguish between a diseased or a non-diseased crop plant and to further distinguish weeds from such diseased or non-diseased crop plants. At a fourth defined confidence threshold, say 0.X4, the one or more hardware processors 202 are configured to further increase the crop detection sensitivity such that a discoloured or non-discoloured plant, a growth state of the crop plant, a lack of nutrients, etc., can be further sensed and additionally distinguished from weeds. Such detection sensitivity is very advantageous and provides a technical effect of increased perceptiveness of the system 102, resulting in improved performance of the system 102, such as reduced wastage of the chemical used for spraying. At a fifth defined confidence threshold, say 0.X5, the one or more hardware processors 202 are configured to distinguish various crop plants from weeds using the trained AI model 210B by detecting the first set of crop plants 246A with drooping leaves, detecting the second set of crop plants 246B manifesting the elastic change in physical characteristics of the second set of crop plants 246B, and also detecting the third set of remaining crop plants 246C different from the first set of crop plants 246A and the second set of crop plants 246B. Alternatively stated, the use of the defined confidence threshold 410A significantly improves the perceptive capability of the system 102 such that the spraying of chemicals is achieved with improved accuracy and precision for the correct time slots, at the correct intended areas or spots, only when required, with the correct amount of spray and the correct selection of a type of chemical, irrespective of any change in the surrounding environmental conditions while capturing images of the agricultural field. For example, an increase or a decrease in the defined confidence threshold 410A dynamically changes the detection sensitivity of the crop plant, increasing the perceptive capability of the system 102 and making the system fail-safe.
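By way of illustration only, the following Python sketch maps a calibrated confidence threshold to the capability tiers described above; the numeric cut-offs stand in for the unspecified values 0.X1 to 0.X5 and are hypothetical.

```python
# Hypothetical mapping from the calibrated confidence threshold to the
# capability tiers described above; the numeric cut-offs stand in for the
# unspecified values 0.X1..0.X5 and are assumptions.
TIERS = [
    (0.2, "distinguish crop plants from weeds"),
    (0.4, "distinguish crop type from weed type"),
    (0.6, "distinguish diseased from non-diseased crop plants"),
    (0.8, "sense discolouration, growth state, nutrient deficiency"),
    (0.9, "detect drooping-leaf, elastic-change, and remaining crop sets"),
]

def capabilities(threshold):
    """Return every capability enabled at the given confidence threshold."""
    return [name for cutoff, name in TIERS if threshold >= cutoff]

print(capabilities(0.65))  # first three tiers enabled
```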
In an example, two different chemicals can be loaded in two different chemical storage chambers in the vehicle 104. A specific chemical type is used only when a discoloured crop plant is detected by a specific nozzle, while some nozzles may use another chemical to spray on normal/healthy crop plants, and the remaining nozzles may be deactivated to stop spraying on weeds or unwanted regions. Thus, different applications are made possible by calibration of the defined confidence threshold 410A.
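Purely as an illustration of this example, the following Python sketch routes each detection class to one of the two chemical storage chambers or deactivates the nozzle; the class names and chamber identifiers are hypothetical.

```python
# Hypothetical routing of detection classes to the two chemical storage
# chambers; class names and chamber identifiers are assumptions.
ROUTING = {
    "discoloured_crop": "chamber_A",  # chemical for discoloured plants
    "healthy_crop": "chamber_B",      # chemical for normal/healthy plants
    "weed": None,                     # None: nozzle deactivated, no spray
}

def select_chamber(detection_class):
    return ROUTING.get(detection_class)

print(select_chamber("healthy_crop"))  # chamber_B
print(select_chamber("weed"))          # None -> nozzle deactivated
```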
In accordance with an embodiment, the one or more hardware processors 202 are configured to update the defined confidence threshold in response to a change in a quality parameter of the captured plurality of FOVs of the plurality of defined areas of the agricultural field 106. For example, when there is a change in the quality parameter of the captured plurality of FOVs, say because some images are captured in a sunny environment, a few images are captured in a cloudy environment, and a few other images are captured in a rainy environment or under a shadow, then, according to the change in the quality parameter, the defined confidence threshold 410A is dynamically updated to maintain the spray accuracy greater than a threshold, for example, greater than 95-99.99%.
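As a non-limiting illustration, the following Python sketch adapts a confidence threshold using mean frame brightness as a simple stand-in for the quality parameter; the brightness bands, step size, and direction of adjustment are hypothetical assumptions.

```python
# Hypothetical adaptation rule: nudge the confidence threshold when a
# simple image-quality proxy (mean frame brightness) drifts out of a
# normal band; bands, step size, and direction are all assumptions.
def adapt_threshold(threshold, mean_brightness, step=0.05):
    if mean_brightness < 60 or mean_brightness > 200:   # dark or overexposed
        return max(0.10, threshold - step)              # relax the threshold
    return min(0.99, threshold + step)                  # good light: stricter

print(round(adapt_threshold(0.60, mean_brightness=45), 2))  # 0.55
```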
In an example, the one or more hardware processors 202 are configured to determine precision and recall values for different confidence threshold values ranging from 0.1 to 0.99. The confidence threshold may be selected by identifying an optimal point in the dataset of precision and recall values that meets the required high recall while maintaining sufficiently high precision associated with the detection sensitivity of the AI model. When the precision value is highest, the recall value may be lowest; thus, a given confidence threshold value reflects the right trade-off between precision and recall.
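By way of illustration only, the following Python sketch selects a confidence threshold from a precision/recall sweep by keeping the highest recall among thresholds whose precision stays above a floor; the sweep data below is fabricated purely for illustration.

```python
# Hypothetical threshold selection from a precision/recall sweep: keep
# the highest recall among thresholds whose precision stays above a floor.
def pick_threshold(sweep, min_precision=0.95):
    """sweep: list of (threshold, precision, recall) tuples."""
    ok = [row for row in sweep if row[1] >= min_precision]
    return max(ok, key=lambda row: row[2])[0] if ok else None

# Fabricated sweep data for illustration only.
sweep = [(0.1, 0.80, 0.99), (0.3, 0.90, 0.97), (0.5, 0.95, 0.94),
         (0.7, 0.98, 0.88), (0.9, 0.99, 0.70)]
print(pick_threshold(sweep))  # 0.5: best recall with precision >= 0.95
```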
In an implementation, the UI element 404 is a sprayer units' control where a front buffer 408A and a rear buffer 408B associated with each image-capture device, indicated by the UI elements 406A, 406B, and 406C, of the plurality of image-capture devices 118, may be set. Such setting may occur automatically by the one or more hardware processors 202 or may be done based on a user input. The one or more hardware processors 202 are further configured to determine one or more regions in the agricultural field 106 in which to spray a chemical based on the executed mapping of pixel data, the defined confidence threshold 410A, and the front buffer 408A and the rear buffer 408B associated with each image-capture device of the plurality of image-capture devices 118. For example, if a region is determined as 15 cm in length and 15 cm in breadth, then increasing the front buffer 408A to 5 cm may extend the spray region ahead of the crop plant by 5 cm, to a length of 20 cm. Similarly, increasing the rear buffer 408B, say by 3 cm, may dynamically extend the spray area by 3 cm behind the rear end of the crop plant in the direction of movement of the vehicle 104.
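Purely as an illustration, the following Python sketch reproduces the buffer arithmetic of the example above; all lengths are in centimeters.

```python
# Hypothetical spray-region computation: extend the detected plant length
# by the front and rear buffers along the direction of travel (cm).
def spray_length_cm(plant_length_cm, front_buffer_cm=0.0, rear_buffer_cm=0.0):
    return plant_length_cm + front_buffer_cm + rear_buffer_cm

# 15 cm plant, 5 cm front buffer -> 20 cm; adding a 3 cm rear buffer -> 23 cm.
print(spray_length_cm(15, front_buffer_cm=5))                    # 20
print(spray_length_cm(15, front_buffer_cm=5, rear_buffer_cm=3))  # 23
```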
In the exemplary scenario 500A, the UI element 502 indicates a position of the boom arrangement 114. The UI element 502 is used to control the predefined number of electronically controllable sprayer nozzles 116. The predefined number of electronically controllable sprayer nozzles 116 are divided into three units (represented by the UI element 504), for example, a left unit, a right unit, and a centre unit. There is further shown a selection of the left unit (represented by a thick box). Moreover, the UI element 506 indicates that the left unit includes a total of eight electronically controllable sprayer nozzles, of which the first three sprayer nozzles are deactivated manually by use of the UI element 506. In another implementation scenario, the first three sprayer nozzles can be automatically deactivated by use of the trained AI model 210B. The deactivation of the first three sprayer nozzles is performed in order to perform the controlled and perceptive chemical spraying on the agricultural field 106, for example, so as not to spray the crop plants again when the vehicle 104 moves in the opposite direction to cover another set of crop plants, as shown, for example, in
With reference to
At 602, the method 600 comprises obtaining a training dataset of crop images. The training dataset may be obtained from a training database that stores images previously captured by the plurality of image-capture devices 118.
At 604, the method 600 comprises training an AI model 210A to obtain a trained AI model 210B in a training phase based on the training dataset. The operation 604 comprises multiple sub-operations, such as operations 604A to 604D. At 604A, the method 600 comprises extracting a drooping leaf feature from the training dataset stored in the training database. At 604B, the method 600 comprises extracting one or more leaf movement features from the training dataset. At 604C, the method 600 comprises extracting one or more stem bending features from the training dataset. At 604D, the method 600 comprises extracting a plurality of features of a crop plant at different growth stages and at different times-of-day, with and without surrounding weeds, wherein the plurality of features are indicative of physical characteristics of the crop plant at the different growth stages at the different times-of-day.
At 606, the method 600 comprises receiving, by the one or more hardware processors 202, geospatial location correction data from an external device 108 placed at a fixed location in the agricultural field 106 and geospatial location coordinates associated with the boom arrangement 114 mounted on the vehicle 104.
At 608, the method 600 comprises executing, by the one or more hardware processors 202, mapping of pixel data of the weeds or the crop plants in an image to distance information from a reference position of the boom arrangement 114 when the vehicle 104 is in motion, wherein the specific set of electronically controllable sprayer nozzles are operated further based on the executed mapping of pixel data.
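For illustration purposes only, the following Python sketch shows one simple way pixel data could be mapped to distance information under a flat-ground, downward-facing-camera assumption; the fixed ground sampling distance and the offsets are hypothetical, not the claimed mapping.

```python
# Hypothetical pixel-to-distance mapping under a flat-ground,
# downward-facing-camera assumption: a fixed ground sampling distance
# (cm per pixel) converts a pixel offset from the image centre into a
# physical offset from the boom reference position.
def pixel_to_distance_cm(px_x, px_y, image_w, image_h,
                         gsd_cm_per_px, boom_offset_cm=(0.0, 0.0)):
    dx = (px_x - image_w / 2) * gsd_cm_per_px + boom_offset_cm[0]
    dy = (px_y - image_h / 2) * gsd_cm_per_px + boom_offset_cm[1]
    return dx, dy

# A detection 100 px right of image centre at 0.1 cm/px lies ~10 cm from
# the boom reference along the x axis.
print(pixel_to_distance_cm(740, 360, 1280, 720, gsd_cm_per_px=0.1))  # (10.0, 0.0)
```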
At 610, the method 600 comprises obtaining, by the one or more hardware processors 202, a plurality of images corresponding to a plurality of field-of-views (FOVs) of a plurality of defined areas of the agricultural field 106 from the plurality of image-capture devices 118 mounted in the boom arrangement 114 of the vehicle 104.
At 612, the method 600 comprises detecting, by the one or more hardware processors 202, a confidence threshold indicative of a detection sensitivity of the crop plant in the trained AI model 210B.
At 614, the method 600 comprises automatically including or excluding, by the one or more hardware processors 202, a category of plants to be considered for operation by the one or more hardware processors 202 based on the detected confidence threshold.
At 616, the method 600 further comprises distinguishing, by the one or more hardware processors 202, crop plants from weeds using the trained AI model 210B when the vehicle 104 is in motion on the agricultural field 106. Moreover, the distinguishing of the crop plants from weeds using the trained AI model 210B comprises sub-operations, such as operations 616a to 616c. At 616a, the distinguishing of the crop plants from weeds using the trained AI model 210B comprises detecting the first set of crop plants 246A with drooping leaves. At 616b, the distinguishing of the crop plants from weeds using the trained AI model 210B further comprises detecting the second set of crop plants 246B manifesting an elastic change in physical characteristics of the second set of crop plants 246B. At 616c, the distinguishing of the crop plants from weeds using the trained AI model 210B further comprises detecting the third set of remaining crop plants 246C different from the first set of crop plants 246A and the second set of crop plants 246B.
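By way of illustration only, the following Python sketch partitions per-detection results into the three crop sets and weeds, as described in operations 616a to 616c; the detection dictionary keys are hypothetical.

```python
# Hypothetical partition of per-detection results into the three crop
# sets (246A, 246B, 246C) and weeds; dictionary keys are assumptions.
def partition_detections(detections):
    sets = {"246A": [], "246B": [], "246C": [], "weed": []}
    for d in detections:
        if d["label"] == "weed":
            sets["weed"].append(d)
        elif d.get("drooping_leaves"):
            sets["246A"].append(d)   # crop plants with drooping leaves
        elif d.get("elastic_change"):
            sets["246B"].append(d)   # elastic change in physical traits
        else:
            sets["246C"].append(d)   # remaining crop plants
    return sets

demo = [{"label": "crop", "drooping_leaves": True},
        {"label": "crop", "elastic_change": True},
        {"label": "crop"}, {"label": "weed"}]
print({k: len(v) for k, v in partition_detections(demo).items()})
```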
At 618, the method 600 further comprises updating, by the one or more hardware processors 202, the defined confidence threshold in response to a change in a quality parameter of the captured plurality of FOVs of the plurality of defined areas of the agricultural field 106.
At 620, the method 600 further comprises determining, by the one or more hardware processors 202, an upcoming time slot to spray a chemical based on the distinguishing of the crop plants from the weeds. The determining of the upcoming time slot to spray the chemical may be further based on a size of the crop plant occupied in a two-dimensional space in the x and y coordinate directions.
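Purely as an illustration, the following Python sketch computes such a time slot from the plant's distance ahead of the nozzle line, the plant's extent along the travel axis, and the vehicle speed; all numeric values are hypothetical.

```python
# Hypothetical time-slot computation: the spray window opens when the
# nozzle line reaches the plant and stays open over the plant's extent
# along the travel axis.
def spray_time_slot(distance_cm, plant_length_cm, speed_cm_s):
    t_start = distance_cm / speed_cm_s
    t_end = t_start + plant_length_cm / speed_cm_s
    return t_start, t_end

# Plant 120 cm ahead, 15 cm long, vehicle at 60 cm/s -> spray 2.0 s to 2.25 s.
print(spray_time_slot(120, 15, 60))  # (2.0, 2.25)
```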
At 622, the method 600 further comprises determining, by the one or more hardware processors 202, one or more regions in the agricultural field 106 in which to spray a chemical based on the distinguishing of the crop plants from the weeds. Thereafter, the one or more hardware processors 202 may communicate control signals to operate a plurality of different sets of electronically controlled sprayer nozzles at different time instants during a spray session based on the distinguishing of the crop plants from the weeds.
At 624, the method 600 further comprises causing, by the one or more hardware processors 202, a specific set of electronically controllable sprayer nozzles from amongst the predefined number of electronically controllable sprayer nozzles 116 to operate based on the distinguishing of the crop plants from the weeds, wherein the distinguished crop plants comprise the first set of crop plants 246A with drooping leaves, the second set of crop plants 246B manifesting the elastic change in physical characteristics, and the third set of remaining crop plants 246C.
The method 600 is used for detecting crop plants such that even if there are some temporary or elastic changes in the morphology of the crop plants, for example, drooping leaves, an elastic change in physical characteristics caused by movement of air across the crop plants, or a change in the shape of leaves due to some external conditions, the detection of such crop plants is not missed. The distinguishing of the crop plants from the weeds using the trained AI model 210B is used to accurately identify the crop plants with 98-100% accuracy in the uneven agricultural land, even if there is any change in the surrounding environmental conditions, such as a change in the shape of leaves due to wind, drooping of leaves, occlusion by weeds, high weed density around the crop plants, and the like. This improves the perceptive ability of the method 600 to adapt to uneven agricultural land and to handle real-time changes in the surrounding environmental conditions.
The operations 602 to 624 are only illustrative and other alternatives can also be provided where one or more steps are added, one or more steps are removed, or one or more steps are provided in a different sequence without departing from the scope of the claims herein. Various embodiments and variants disclosed with the aforementioned system (such as the system 102) apply mutatis mutandis to the aforementioned method 600.
Modifications to embodiments of the present disclosure described in the foregoing are possible without departing from the scope of the present disclosure as defined by the accompanying claims. Expressions such as “including”, “comprising”, “incorporating”, “have”, “is” used to describe and claim the present disclosure are intended to be construed in a non-exclusive manner, namely allowing for items, components or elements not explicitly described also to be present. Reference to the singular is also to be construed to relate to the plural. The word “exemplary” is used herein to mean “serving as an example, instance or illustration”. Any embodiment described as “exemplary” is not necessarily to be construed as preferred or advantageous over other embodiments and/or to exclude the incorporation of features from other embodiments. The word “optionally” is used herein to mean “is provided in some embodiments and not provided in other embodiments”. It is appreciated that certain features of the present disclosure, which are, for clarity, described in the context of separate embodiments, may also be provided in combination in a single embodiment. Conversely, various features of the present disclosure, which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable combination or as suitable in any other described embodiment of the disclosure.
Number | Date | Country | Kind |
---|---|---|---
202241075496 | Dec 2022 | IN | national |