AERIAL SENSOR AND MANIPULATION PLATFORM FOR FARMING AND METHOD OF USING SAME

Information

  • Patent Application
  • 20240033940
  • Publication Number
    20240033940
  • Date Filed
    July 27, 2022
  • Date Published
    February 01, 2024
Abstract
A robotic sensor and manipulation platform for farming is disclosed, having a robotic base and one or more exchangeable robotic sensing and manipulation tips deployable from the robotic base to commanded positions in a plant growth area. The robotic sensing and manipulation tips have a plurality of sensors adapted to detect and monitor plant health and growth conditions, and a computer-based control system configured to analyze sensor data and provide analyzed results to the farmer or producer.
Description
TECHNICAL FIELD

This disclosure generally relates to robotic farming, and particularly to autonomous, computer-controlled robotic aerial sensor and manipulation platforms having deployable sensing and manipulation tips, for farming and, alternately, for indoor cultivation and greenhouse environments. The disclosure includes cable robots, i.e., a cable-positioned robotic device suspended from a set of cables and respective support structures, and more particularly robots in the field of agriculture having one or, preferably, a plurality of interchangeable sensing and/or manipulating devices.


BACKGROUND OF THE INVENTION

There are various reasons to tend plants on an individual level, or at least on the level of only a few plants. For example, the use of fertilizer or pesticides may be reduced to the minimum necessary level for each plant; expensive or delicate plants may be grown more successfully; for toxic plants it might be necessary to track growth on an individual level to fulfill legal requirements; and in scientific environments results for studies may be obtained faster by individually tracking and adjusting growth parameters for each plant. Vertical farming, with densely packed plants on several levels in buildings in metropolitan areas, may benefit from close monitoring of individual plants or small groups of plants.


Tracking of the growth of plants may be done manually. However, in a commercial environment with many plants, an automated tracking system is more efficient. For toxic or allergenic plants, automated tracking may have labor safety advantages. In an environment using artificial intelligence, e.g., deep learning, automated individual tracking of the growth of plants may enable a necessary feedback loop.


Unmanned aerial vehicles, as well as robots moving on the ground, have been proposed as means for individual tracking of the growth of plants. Drones, however, have a limited operating time and pose a higher risk of damaging the plants when accidentally crashing into the crop. Robots moving on the ground need pathways between the plants that can interfere with human ground tasks.


Cultivation is done in a variety of ways today. Outdoor farming takes advantage of a variety of natural resources such as sunlight, soil and others. Greenhouse cultivation protects the cultivar by isolating it from certain environmental factors and by controlling certain aspects of the environment. Indoor cultivation typically uses very little to no external resources and almost all aspects of the cultivation environment are fully controlled.


In scientific environments cable suspended camera systems similar to those known in sports stadiums have been proposed. For example, the article “NU-Spidercam: A large-scale, cable-driven, integrated sensing and robotic system for advanced phenotyping, remote sensing, and agronomic research” in Computers and Electronics in Agriculture, Volume 160, May 2019, Pages 71-81 describes such a system.


U.S. Pat. No. 10,369,693 B1 describes systems, methods, devices, and techniques for controlling and operating cable-suspended robotic systems which may be used also for seeding, fertilizing, irrigation, crop inspection, livestock feeding or other agricultural operations in a pasture, orchard, or field.


CN111425733A describes a wire driven parallel unmanned agricultural robot and a control method thereof. The unmanned agricultural robot comprises a mobile platform, a pillar system, a winding system, at least four wires, an ultrasonic module and a control system.


SUMMARY OF THE INVENTION

An object of the invention is to provide a robotic sensor and manipulation platform for farming having a robotic base and a robotic sensing and manipulation tip deployable from the robotic base to commanded positions and elevations in a plant growth area. The robotic sensing and manipulation tip has a plurality of sensors adapted to detect and monitor various aspects of plant health and growth conditions, and a computer-based control system is configured to analyze the gathered sensor data and provide analyzed results to the farmer or producer.


Monitoring crop growth in a crop field, indoor cultivation facility or greenhouse is a critical task which can productively be offloaded from direct human labor. New sensors and technologies applied in the robotic aerial sensor and manipulation platform presented herein allow farmers to obtain a much higher level of data and computer analysis about their crops than they have had in the past. Beyond that, sensors like multi- or hyper-spectral cameras can generate insights which the human eye cannot identify. The robotic aerial sensor and manipulation platform includes an aerially maneuvered robotic platform base, a sensing and analysis tip and analysis software, which can be configured to operate autonomously in the plant growth area. The farmer or producer can view the collected crop data and analysis in real time.


The aerial sensor and manipulation platform is adapted to provide more detailed autonomous monitoring of crop growth, particularly because the deployable sensing and manipulation tip is configured to get close to the plants or crops, moving into spaces too small for a human to work in, to reach problem areas, and to detect plant health problems and resolve issues without causing any damage to the plants. The plant spray device tip of the aerial sensor and manipulation platform can be applied for tasks such as the autonomous targeted application of water, fertilizers, and pesticides onto targeted plants, particularly in response to sensor readings and programmatic analysis of sensor data by the computer-based control system.


Irrigating and fertilizing crops and plants has traditionally used a lot of water and is therefore inefficient. Disclosed herein is an autonomous, computer-controlled robotic aerial sensor and manipulation system having one or, preferably, a plurality of deployable sensing and manipulation tips providing robotically directed precision irrigation and fertilizer application, among other things, which can reduce wasted water by targeting only specific plants in need. Additionally, the sensing and manipulation tip of the aerial robotic sensor and manipulation platform is adapted to autonomously navigate between rows of crops or plants and apply sprays and irrigation directly to the base or targeted leaves of each plant.


The robotic sensor and manipulation platform has the advantage of being able to access plant and growth areas where humans and larger equipment cannot, or would cause significant damage. For example, corn growers face the problem that the plants grow too quickly to reliably fertilize them. The present invention solves this and other problems, as it easily moves between rows of plants and between individual neighboring plants and can target nitrogen fertilizer directly at the base of the targeted plants, take sensor readings detecting soil moisture and plant health, apply water, remove damaged growth, aggregate data to determine plant health, and detect and resolve insect infestation issues, among other things.


Additionally, spraying pesticides and weed killers onto large plant growth areas is not only wasteful but also may be severely harmful to the environment. The robotic sensor and manipulation platform disclosed herein provides a much more efficient method of micro-spraying and can significantly reduce the amount of pesticides and/or herbicide used in plant production and farming. The robotic sensor and manipulation platform makes use of computer vision and feature analysis technology to detect pests and weeds, identify them, and then apply a targeted micro-spray of the respective pest management medium onto the identified pests or weeds.


The robotic sensor and manipulation platform provides a variety of interchangeable sensor manipulation tips, including tips with a pruning or cutting device. Pruning is a time-consuming and complex job for the farmer or operator. The computer-based control system and its algorithms collect and analyze sensor data, like airflow in the canopy, CO2 content, and/or other parameters, to determine plant health and condition and can decide which plant growth to prune, which to keep and which to remove.


Disclosed herein is a robotic sensor and manipulation platform which is configured to connect directly or indirectly onto an aerial support and positioning system and be maneuvered along control system directed paths over or within the plant canopy. The robotic sensor and manipulation platform generally includes a robotic base connected to and moved to commanded positions by the aerial support and positioning system. The sensing and manipulation tip is deployable from the robotic base via a tip suspension cable or telescoping collapsible pipe sections under control of a computer-implemented control system. At least one sensing and manipulation tip is provided with one or more sensors selected from the set: at least one RGB, IR, multi- or hyper-spectral camera taking images; at least one distance sensor detecting distance to nearby plants and objects; at least one temperature sensor detecting ambient air temperature; at least one air quality sensor; at least one airflow sensor; at least one light intensity and/or light spectrum sensor; at least one tip orientation detection means detecting rotational orientation of the sensing and manipulation tip relative to the robotic base; at least one humidity sensor; at least one CO2 sensor; at least one plant fluids sensor which collects and analyzes fluids by puncturing the plant; an RFID or similar tag reader; and/or at least one fluorescence sensor, fluorescence filter or filter cube. The robotic sensor and manipulation platform includes a tip positioning mechanism having a motor drive responsive to positioning commands from the control system; the tip positioning mechanism is arranged on the robotic base and supports, positions, and connects the sensing and manipulation tip to the robotic base. The tip positioning mechanism is operable to move the sensing and manipulation tip to commanded positions above or in the plant canopy.


In aspects of the inventive disclosure, the robotic sensor and manipulation platform further includes one or more manipulation attachments configured to detachably connect to the sensing and manipulation tip of the manipulation platform. The one or more manipulation attachments may include at least one of: a cutting device operable by the control system to trim, prune or cut plant material from plants in a geometric plant growth area, a handling device operable by the control system to hold, grasp or stabilize certain areas of a plant while deriving further measurements, a spray device operable by the control system and having one or more directional spray nozzles, at least one needle device operable by the control system to derive plant measurements beneath an outer plant surface, such as plant sap measurements.


The robotic sensor and manipulation platform may include an extensible arm controlled by the control system, the extensible arm having either folding arm sections or telescoping arm sections, such that individual manipulation attachments may be selectively and detachably connected to the extensible arm under the control of the control system. The one or more manipulation attachments are preferably provided with at least one of the one or more sensors discussed earlier above.


Preferably the sensing and manipulation tip is configured as a bicone without edges, so as to smoothly slide into plant growth without entangling or damaging plants.


In some aspects of the inventive disclosure, the spray device has at least one of the one or more directional spray nozzles having a spraying direction controlled by the control system to target areas of plants or soil within the plant growth area.


Preferably, the one or more directional spray nozzles are actuated on/off and their spray direction is controlled, preferably individually, by the control system.


In another aspect of the inventive disclosure, the at least one sensing and manipulation tip is a plurality of different sensing and manipulation tips configured for different functions, further including at least one sensing and manipulation tip selected from the set of: a cutting device operable by the control system to trim, prune or cut plant material from plants in a geometric plant growth area; a handling device operable by the control system to hold, grasp or stabilize certain areas of a plant while deriving further measurements; a spray device operable by the control system and having one or more directional spray nozzles; at least one needle device to derive plant measurements beneath an outer plant surface, wherein the at least one needle device includes a plant sap measuring device; and/or an extensible arm controlled by the control system, the extensible arm having either folding arm sections or telescoping arm sections. Preferably the plurality of sensing and manipulation tips are individually selectively attached and detachably connected onto the robotic sensor and manipulation platform under control of the control system.


In preferred aspects of the inventive disclosure, at least one of the one or more directional spray nozzles of the spray device has a spraying direction controlled by the control system. Even more preferred is having the one or more directional spray nozzles individually actuated and controlled by the control system, preferably with individually controlled spray on, spray off, and/or spray direction.


In various aspects of the inventive disclosure, the robotic sensor and manipulation platform includes a mechanical self-cleaning mechanism configured to wipe clean the tip positioning mechanism, such as the tip suspension cables or telescoping pipe sections, while the sensing and manipulation tip is driven upwards towards the robotic base under control of the control system.


In some aspects of the inventive disclosure, the robotic sensor and manipulation platform is provided with a force detection sensor or a visual sensor in communication with the control system and directly detecting or indirectly inferring forces applied on the tip positioning mechanism or the robotic sensor and manipulation platform, so as to detect encountered obstacles or entanglements of the sensing and manipulation tip in the plants.


In preferred aspects of the inventive disclosure, the sensing and manipulation tip is or includes at least one bicone sensing and manipulation tip having an arcuate, or semi-circular, viewing/sensing slot or window provided in an outer wall of the bicone sensing and manipulation tip. The bicone sensing and manipulation tip has a motor driven rotating disc rotatably mounted in an interior of the bicone sensing and manipulation tip. The rotating disc is operatively coupled to and controlled by the control system to rotate about an axis of rotation to positions commanded by the control system. The rotating disc is arranged in a plane and has an outer circumference substantially aligned with, or aligned adjacent to, the viewing/sensing slot or window of the bicone sensing and manipulation tip. One or more cameras are arranged on and rotated in unison with the rotating disc to position the camera(s), under control of the control system, at desired viewpoint positions along an arcuate length of the viewing/sensing slot. At any point in time, the rotating disc with the camera(s) can be rotated by the control system to expose the lens of the camera and record images at control system commanded viewpoints of interest in locations about or within the plant canopy, such as for detecting insect infestations, disease, or injury areas in plant growth, as well as to determine areas of interest for gathering sensor measurements. The camera images may also be processed to map the plant growth area and plant canopy for determining access paths through or about the plant growth area, to evaluate distances to plants or obstacles, or to infer cable tension or slack in the positioning system.


Advantageously, the arcuate, or semi-circular, viewing/sensing slot or window is preferably arranged substantially in a lower cone portion of the bicone sensing and manipulation tip, such that an upper cone portion of the bicone sensing and manipulation tip has a protected upper region which preferably is substantially enclosed and into which the viewing/sensing slot or window does not extend. At any time, the control system can rotate the rotating disc to move the camera(s) into the protected upper region such that the camera(s) are positioned away from the viewing/sensing slot or window, and thereby protected from dirt and scratches by the bicone housing while deployed in or robotically moving about the plant canopy.


In preferred aspects, the bicone sensing and manipulation tip is rotatably coupled and affixed to the tip positioning mechanism by a motor driven rotatable pan joint, rotated to commanded positions under the control of the control system. The pan joint is operatively coupled to the control system and controlled thereby to rotate the bicone sensing and manipulation tip about an axis of the tip suspension cable or tubular pipe sections of the tip positioning mechanism to enable a controlled, full 360-degree field of view from the at least one camera about the axis of the tip suspension cable or tubular telescoping pipe sections. The rotating disc may further include at least one of the one or more sensors discussed herein affixed onto and rotated in unison with the rotating disc.


Also disclosed herein is an aerial robotic sensor and manipulation system having the robotic sensor and manipulation platform, bicone sensing and manipulation tip(s) and other features discussed earlier above in this Summary section. A control system (or computer-based control system) is provided having one or more processors executing instructions stored on a non-volatile data store, wherein the instructions, when executed by the one or more processors, are configured to autonomously operate the aerial robotic sensor and manipulation system, preferably independent of human oversight or actions. The aerial robotic sensor and manipulation system includes an aerial support and positioning system embodied, in one form, as a plurality of aerial platform positioning cables connected to and driven by cable spooling devices, connected to and supporting the robotic sensor and manipulation platform over or within a plant growth area. The aerial support and positioning system has motor driven cable spooling devices responsive to commands from the control system to controllably deploy or retract lengths of the aerial platform positioning cables to move the robotic sensor and manipulation platform in X and/or Y and/or Z directions above or within the plant growth area. A plurality of cable support points are provided, for example on posts or walls, each fixed onto an elevated support structure at a fixed position, preferably above the top of the plant canopy. The cable support points generally delimit, in 2D X-Y, an outer boundary of a geometric plant growth area, at least the geometric plant growth area accessible to the robotic sensor and manipulation platform.


The aerial support and positioning system may alternately be realized as a gantry aerial X-Y support and positioning device supporting and positioning the robotic sensor and manipulation platform above the plant growth area and having at least one drive motor responsive to commands from the control system to move the robotic sensor and manipulation platform in X and/or Y and/or Z directions over the plant growth area to commanded positions.


Preferably at least one aerial platform positioning cable of the plurality of aerial platform positioning cables supporting and positioning the robotic sensor and manipulation platform has an outer sheath which carries and encloses therein one or more electric power conductors, one or more network or data communication cables, and optionally one or more fluid supply tubes, all protectively enclosed within an interior of the at least one aerial platform positioning cable, so as to be wound and unwound from the cable spooling device with the platform positioning cable. In this way the enclosed cables and tubes are supported within the aerial platform positioning cable(s) and are prevented from entanglement in the surrounding environment. The cable supporting the robotic sensing and manipulation tip(s) from the robotic sensor and manipulation platform may be similarly configured.


The aerial robotic sensor and manipulation system may further include a resting platform arranged within the outer boundary of the geometric plant growth area and positioned above the plant canopy, for example on raised platforms supported by posts or walls, or other elevated structures. The resting platform may hold and provide one or more manipulation attachments or sensor tips which are configured to autonomously connect to and detachably disconnect from the robotic sensor and manipulation platform. Preferably the control system controls the detachable connection and the disconnection of the one or more manipulation attachments, for retrieval from the resting platform and return to the resting platform. The resting platform can also be used to swap the aerial platform entirely. Furthermore, it can be used for charging the aerial platform and/or for data transmission if the aerial platform is operated wirelessly.


In another aspect, the aerial robotic sensor and manipulation system includes at least one force detection sensor or a visual sensor in communication with the control system and detecting forces applied on the tip positioning mechanism or the robotic sensor and manipulation platform for detecting encountered obstacles or entanglements of the sensing and manipulation tip. The at least one distance sensor may be a LiDAR or time-of-flight sensor detecting distance to nearby plants and objects.


Finally, other aspects of the invention are directed to methods used by the aerial sensor and manipulation platform for detecting problematic microclimates and for scheduling the platform to periodically return to desired locations in an agricultural field. Still other methods are provided for detecting bugs, pests and insects using one or more cameras, sensors and/or other imagers of the aerial sensor and manipulation platform as described herein.





BRIEF DESCRIPTION OF THE DRAWINGS

The accompanying Figures, where like reference numerals refer to identical or functionally similar elements throughout the separate views and which together with the detailed description below are incorporated in and form part of the specification, serve to further illustrate various embodiments and to explain various principles and advantages all in accordance with the present invention.


Features of the present invention, which are believed to be novel, are set forth in the drawings and more particularly in the appended claims. The invention, together with the further objects and advantages thereof, may be best understood with reference to the following description, taken in conjunction with the accompanying drawings. The drawings show a form of the invention that is presently preferred; however, the invention is not limited to the precise arrangement shown in the drawings.



FIG. 1 depicts a schematic view of an aerial robotic sensor and manipulation system installed over and managing plant growth and health of a plant growth area, such as a portion of an agricultural field, or arranged within an interior of a plant growth structure, for example a greenhouse, consistent with the present inventive disclosure;



FIG. 1A illustrates, for better understanding, a preferred outer contour of the sensing and manipulation tip having a substantially smooth bicone shaped body without edges, shaped to smoothly pass through the plant growth or a trellis without entangling into or damaging the plants. For understanding, the sensing and manipulation tip is depicted under the plant canopy, managing the growth environment and health of grape vines in a vineyard, consistent with the present inventive disclosure;



FIG. 2 depicts an enlarged view of the robotic sensor and manipulation platform of FIG. 1, depicting the sensing and manipulation tip deployed from the robotic base and supportively connected to one or more aerial platform positioning cables over a managed plant growth area, consistent with the present inventive disclosure;



FIG. 3 depicts a preferred aspect of the inventions in which at least one of the aerial platform positioning cables encloses electric power conductors, sensor signal lines, data lines and/or network cables, and a fluid supply line, all enclosed in the interior of the aerial platform positioning cable, consistent with the present inventive disclosure;



FIG. 4 depicts a spray tip variant of the sensing and manipulation tip of FIG. 2, including a plurality of spray nozzles for spraying treatments onto or irrigating plants in the growth area, consistent with the present inventive disclosure;



FIG. 5 is a schematic illustration of the robotic base and a sensor manipulation tip deployably and supportively connected to the robotic base by the tip suspension cable or tubular pipe sections;



FIG. 6, FIG. 7A and FIG. 7B provide schematic illustrations of a preferred aspect of the invention in which the sensing and manipulation tip includes a rotating disc having one or more sensors and generally aligned with a viewing/sensing slot or window of the sensing and manipulation tip; and



FIG. 8 is a schematic illustration in which the aerial support and positioning system includes a gantry aerial support and positioning device aerially supporting and positioning the robotic sensor and manipulation platform, for example, above a plant growth area.



FIG. 9 is a flow chart diagram illustrating processes used by the aerial sensor and manipulation platform for detecting problematic microclimates and periodically returning to desired locations in the plant growth area.



FIG. 10 is a flow chart diagram illustrating processes for detecting bugs, pests, and insects using one or more cameras, sensors and/or other imagers as used in the aerial sensor and manipulation platform described herein.



FIG. 11A is a flow chart diagram illustrating generation of a predictive actuation model and a spatio-temporal model.



FIG. 11B illustrates geometric patterns used in the predictive actuation and spatio-temporal models.



FIG. 12 is a flow chart diagram illustrating a method of automated height adjustment.



FIG. 13 is a flow chart diagram illustrating a method for locating static support sensors.



FIG. 14 is a block diagram illustrating the tracking of individual or batches of plants with passive or active markers using a ceiling-mounted mobile platform.



FIG. 15 is a block diagram illustrating a system and method of determining a Normalized Difference Vegetation Index (NDVI) calibration.



FIG. 16 is a flow chart diagram illustrating a method of reactive environmental control.



FIG. 17A is a flow chart diagram illustrating a method of reactive pest control.



FIG. 17B is a block diagram illustrating the system used for reactive pest control.



FIG. 18 is a block diagram illustrating a system for plant identification, continuous plant counting and identification of unused plant space.



FIG. 19 is a flow chart diagram illustrating the teleoperation of an autonomous environment monitoring system.



FIG. 20 is a block diagram illustrating a system for charging remote plant sensors using a mobile platform.





Skilled artisans will appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions of some of the elements in the figures may be exaggerated relative to other elements to help to improve understanding of embodiments of the present invention.


DETAILED DESCRIPTION

Before describing in detail embodiments that are in accordance with the present invention, it should be observed that the embodiments reside primarily in combinations of method steps and apparatus components related to an autonomous computer controlled robotic aerial sensor and manipulation platform having exchangeable deployable sensing tips for farming. Accordingly, the apparatus components have been represented where appropriate by conventional symbols in the drawings, showing only those specific details that are pertinent to understanding the embodiments of the present invention so as not to obscure the disclosure with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein.


In this document, relational terms such as first and second, top and bottom, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms “comprises,” “comprising,” or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. An element preceded by “comprises . . . a” does not, without more constraints, preclude the existence of additional identical elements in the process, method, article, or apparatus that comprises the element.


As used herein, the term “actuator” means an environmental control system that includes, but is not limited to, a climate humidifier, dehumidifier, AC power control, lighting control, CO2 detection, irrigation control, or airflow/fan control.


A performance growing model consists of a list of set points and respective tolerances for all the relevant growing conditions in a cultivation environment over time, like amount of PAR (PPFD), air temperature, leaf temperature, relative humidity, airflow, CO2 concentration, leaf vapor pressure deficit (VPD), soil nutrients, and soil moisture. The model also outlines correlations between these factors and how they are affected by environmental control systems. A performance growing model can be seen as a cultivation recipe for growers to optimize for specific constraints like yield, water consumption or any other business and crop relevant factor. Such models can be computed based on the measurements recorded by the system described in this document.
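

For illustration only, the following Python sketch shows one way such a performance growing model might be represented in software. The class and field names (SetPoint, PerformanceGrowingModel) are hypothetical assumptions, not structures prescribed by this disclosure.

```python
# A minimal sketch of a performance growing model as a data structure;
# all names and the schema are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class SetPoint:
    """Target value and tolerance for one growing condition at one time."""
    parameter: str        # e.g. "air_temp_C", "ppfd_umol_m2_s", "co2_ppm"
    target: float         # desired value
    tolerance: float      # acceptable deviation from target
    day_of_cycle: int     # day in the crop's growth cycle

@dataclass
class PerformanceGrowingModel:
    """Cultivation recipe: set points over time plus factor correlations."""
    crop: str
    set_points: list[SetPoint] = field(default_factory=list)
    # correlation between two factors, e.g. ("air_temp_C", "vpd_kpa"): 0.8
    correlations: dict[tuple[str, str], float] = field(default_factory=dict)

    def in_tolerance(self, parameter: str, day: int, measured: float) -> bool:
        """Check a recorded measurement against the recipe for that day."""
        for sp in self.set_points:
            if sp.parameter == parameter and sp.day_of_cycle == day:
                return abs(measured - sp.target) <= sp.tolerance
        return True  # no set point defined for that parameter/day
```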


Characteristics of environmental control systems are properties like the time it takes to start and ramp up operation of such a device, and how the actuation of the device influences its surroundings over time, e.g. how fast humid air is dispersed by a humidifier, and thus how the environment is influenced by the operation of the device. These characteristics are often modeled as system responses, e.g. when agitated by a step function, and by finite element analyses of the growing environment.
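

As a hedged illustration, such a step response is commonly approximated by a first-order lag with a dead time. The gain, time constant, and dead time in the sketch below are assumed example values, not figures from this disclosure.

```python
# A minimal sketch of an actuator characteristic modeled as a first-order
# step response; all numeric values are illustrative assumptions.
import math

def step_response(t: float, gain: float, tau: float, dead_time: float) -> float:
    """Change in a measured quantity (e.g. relative humidity in %) at time t
    seconds after an actuator such as a humidifier is switched on."""
    if t < dead_time:          # ramp-up delay before the device takes effect
        return 0.0
    return gain * (1.0 - math.exp(-(t - dead_time) / tau))

# Example: a humidifier that raises local RH by 8 % with a 90 s time
# constant after a 20 s start-up delay.
for t in (0, 30, 120, 300, 600):
    print(t, round(step_response(t, gain=8.0, tau=90.0, dead_time=20.0), 2))
```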



FIGS. 1 and 2 depict a schematic view of an aerial robotic sensor and manipulation system 52, sometimes called a “mobile platform,” installed over and operable to monitor and manage the plant growth environment and plant health of a plant growth area 40 (FIG. 1). The illustration of FIG. 1 is to be understood as representing either an outdoor growing area, such as a portion of an agricultural crop-production field, or an area arranged within an interior of a plant growth structure, for example a greenhouse.



FIG. 2 provides an enlarged view of the robotic base 14 of FIG. 1 suspended in the air by four aerial platform positioning cables 12, or by other mounting structure such as rails, and provided with a bicone-shaped sensing and manipulation tip 16 deployed below the robotic sensor and manipulation platform 10, suspended on a tip suspension cable 56 or telescoping pipe sections from the robotic base 14. The aerial robotic sensor and manipulation system 52 is shown in FIGS. 1 and 2 arranged in service over the plant growth area 40.


A plurality of cable support points 38 are each securely fixed onto an elevated support structure 54 positioned about the four corners of the plant growth area 40 and positioned above the plant canopy of the plant growth area 40. The plurality of cable support points 38 are arranged outwardly away from the outer boundary 42 of the plant growth area 40 at a sufficient distance such that the robotic sensor and manipulation platform 10 can reach all portions of the plant growth area 40. The cable support points 38 are each provided with cable spooling devices 36, in this illustration shown arranged on the elevated support structure 54. In FIG. 1, the elevated support structure is shown as a vertical pole, column, or building wall arranged about the corners of the plant growth area 40. Advantageously, the cable spooling devices 36 are responsive to positioning commands from a control system 32 to effect a controlled spooling or despooling of lengths of the platform positioning cable 12 from the cable spooling devices 36 so as to move and reposition the robotic base 14 along a desired path to a desired position over the plant growth area 40.


The cable support points 38 are arranged at or outwardly from the 2D X-Y outer boundary 42 of the geometric plant growth area 40.


The cable spooling devices 36 preferably include an encoder in communication with the control system 32; the encoders report changes in the deployed cable lengths such that the control system can coordinate the spooling and despooling of the four cable spooling devices to achieve a desired travel path, robotic base elevation and cable tensioning of the aerial platform positioning cables.


In FIGS. 1 and 2, each platform positioning cable 12 has an end attached onto the robotic base (one at each corner) and tensioned by the cable spooling devices 36 such that the robotic base 14 is supported at a commanded elevation above the plant canopy 22 by commanded spooling and de-spooling movements of the cable drums of the cable spooling devices 36.


As can be readily appreciated, the spooling and despooling movements of each of the cable spooling devices 36 are necessarily coordinated by the control system 32 to successfully move the robotic base 14 along the desired path above the plant canopy 22 to the commanded position and elevation.
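

For illustration, a minimal Python sketch of this coordination follows: the required deployed length of each positioning cable 12 is simply the straight-line distance from its fixed support point 38 to the commanded position of the robotic base 14. The support-point coordinates and waypoints are assumed example values, not dimensions from this disclosure.

```python
# A minimal sketch of the coordinated spooling computation for a
# four-cable suspended platform; coordinates are illustrative assumptions.
import math

# Fixed cable support points 38 at the four corners, in meters (x, y, z).
SUPPORT_POINTS = [(0.0, 0.0, 5.0), (30.0, 0.0, 5.0),
                  (30.0, 20.0, 5.0), (0.0, 20.0, 5.0)]

def cable_lengths(platform_xyz: tuple[float, float, float]) -> list[float]:
    """Required deployed length of each positioning cable 12 for the
    robotic base 14 to sit at the commanded position."""
    return [math.dist(platform_xyz, sp) for sp in SUPPORT_POINTS]

# Move along a path: the control system commands each spooling device 36
# to the new length; encoder feedback closes the loop on deployed length.
for waypoint in [(5.0, 5.0, 4.0), (15.0, 10.0, 4.0), (25.0, 15.0, 3.5)]:
    print(waypoint, [round(L, 2) for L in cable_lengths(waypoint)])
```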


A tip positioning mechanism 20 is arranged in the robotic base and is responsive to commands from the control system to deploy the sensing and manipulation tip 16 at a control system commanded position below the robotic base 14.


The tip positioning mechanism 20 may be embodied as a cable supportively connecting the robotic base 14 to the sensing and manipulation tip 16, or alternately may be embodied as a plurality of tubular telescoping pipe sections 50 that extend from or collapse into each other, the tubular pipe sections retractable into each other so as to adjust an overall length of the tubular telescoping pipe sections to deploy the sensing and manipulation tip 16 at a control system commanded position below the robotic base 14.


As best seen in FIG. 1A, the sensing and manipulation tip 16 preferably has a smooth bicone shaped body or may have a drop shaped body. In general, the drop shaped body is similar to a bicone but has a lower half or portion of the body shaped as the bottom half of a sphere, forming a shape that is somewhat similar to a water drop, the body having an outer surface without edges shaped to smoothly pass through plant growth or a trellis without becoming entangled or damaging the plants. As shown in FIG. 1A, the sensing and manipulation tip 16 is deployed below the robotic base 14 at an elevation controlled by the control system 32 and supported on the retractable, extendable tip suspension cable 56.


As seen in FIG. 2, the platform base 10 preferably has outer surfaces which are smoothly rounded, preferably avoiding sharp edges. The platform positioning cables 12 are fixedly connected to respective corners of the platform base 10 and are tensioned by the cable spooling devices 36 to support the platform base 10 at a desired elevation and to move the platform base 10 along a commanded path to a commanded position above the plant growth area 40.



FIG. 3 schematically depicts a cross-section of a preferred configuration of the aerial platform positioning cable 12 in which sensor signal lines, data lines and/or network cables 60 and at least one fluid supply line 58 are enclosed in the interior of the aerial platform positioning cable 12. In this way, the fluid supply lines, signal lines etc. are embedded in the interior of the cable and are not left dangling in the air to become entangled in and possibly damage the growing plants of the plant growth area 40.



FIG. 4 schematically depicts a spray device tip 26 as an advantageous variant of the sensing and manipulation tip 16 of FIG. 2, in this case a spray device tip 26 having a plurality of spray nozzles configured for spraying treatments onto or irrigating plants in the growth area. In some aspects of the invention, the spray nozzles are individually controlled on/off, or optionally throttled, by the control system to produce and target a controlled spray pattern into a desired location and in a desired direction, for example onto the underside of a plant leaf or at the plant base or plant roots. The spray nozzles are in fluid communication with the one or more fluid spray lines 58 to deliver insecticides, nutrients, water and/or fertilizers, to name just a few examples. FIG. 4 further illustrates that the sensing and manipulation tips may optionally be configured to have other smooth outer shapes without edges, forming a modified bicone or drop shaped body. In FIG. 4, the drop shaped body has a smooth, substantially hemispherical or parabolic bottom section, having a partially elliptical or parabolic cross-section, provided on the bottom of the spray device tip 26.
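

A minimal sketch of individually actuated, direction-controlled nozzles follows; the Nozzle interface and angle conventions are hypothetical assumptions, since the disclosure does not specify the valve or gimbal hardware interfaces.

```python
# A minimal sketch of per-nozzle spray control; names are illustrative.
from dataclasses import dataclass

@dataclass
class Nozzle:
    nozzle_id: int
    azimuth_deg: float = 0.0   # commanded spray direction
    elevation_deg: float = 0.0
    open: bool = False
    throttle: float = 0.0      # 0.0 (off) .. 1.0 (full flow)

def target_spray(nozzle: Nozzle, azimuth_deg: float, elevation_deg: float,
                 throttle: float) -> None:
    """Aim one nozzle at a target (e.g. the underside of a leaf or the
    plant base) and open it at a commanded flow rate."""
    nozzle.azimuth_deg = azimuth_deg
    nozzle.elevation_deg = elevation_deg
    nozzle.throttle = max(0.0, min(1.0, throttle))
    nozzle.open = nozzle.throttle > 0.0

# Example: micro-spray only nozzle 2, aimed downward at the plant base,
# at 30% flow, leaving all other nozzles closed.
nozzles = [Nozzle(i) for i in range(4)]
target_spray(nozzles[2], azimuth_deg=45.0, elevation_deg=-80.0, throttle=0.3)
```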



FIG. 5 is a schematic illustration of the robotic base 14 and a sensor manipulation tip 16 deployably connected to the robotic base 14 by the tip suspension cable or tubular pipe sections 56. The base portion of the sensor manipulation tip 16 may include a light detection and ranging (LiDAR) distance sensor 48 in communication with the control system 32, the control system 32 comprising a robotic platform resident control system 32A in communication with and cooperatively interacting with a “compute box” having or including a computer control system 32B. The computer control system 32B is preferably in communication with internet cloud services performing further data analysis, reporting and data storage and communication with farmers and/or indoor cultivation and greenhouse operators.



FIGS. 6, 7A and 7B are schematic illustrations of a preferred aspect of the invention in which the sensing and manipulation tip 16 includes a rotating disc 94 having one or more sensors and generally aligned with a viewing/sensing slot or window 100 extending through a wall of the housing of the bicone sensing and manipulation tip 16.


As shown in FIGS. 6, 7A and 7B, in a preferred aspect of the invention, the sensing and manipulation tip includes a rotating disc 94 rotatably mounted in an interior of the sensing and manipulation tip 16. The rotating disc 94 is operatively coupled to the computer-based control system 32 and controlled to rotate about an axis of rotation 98 to positions commanded by the computer-based control system 32. Generally aligned with the rotating disc 94 is an arcuate, preferably semi-circular, viewing/sensing slot or window 100 provided in the sensing and manipulation tip 16.


One or more cameras 72 for capturing images are arranged on and are rotated in unison with the rotating disc 94 to position the camera(s) 72 and lenses 104 at desired viewpoint positions along a length of the viewing/sensing slot. At any point in time, the rotating disc 94 with the camera(s) 72 can be rotated to expose the lenses 104 and record images from the respective viewpoints of interest above, about or within the plant canopy 22.


Advantageously, at any point in time, for example when the sensing and manipulation tip 16 is lowered into the plant canopy 22, the computer-based control system 32 may rotate the rotating disc 94 to move the camera(s) 72 into the protected upper region 102 such that the camera(s) are positioned away from the viewing/sensing slot or window 100. In this way, the camera(s) 72 can be protected from dirt and scratches while deployed in or robotically moving about the plant canopy 22.
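

For illustration, the disc-positioning logic just described might look like the following sketch. The slot arc and park angles are assumed example values; the actual geometry of slot 100 and protected region 102 is given by the figures, not by these numbers.

```python
# A minimal sketch of rotating-disc camera positioning: expose the lens
# along the viewing/sensing slot 100 when imaging, park it in the
# protected upper region 102 during transit. Angles are assumptions.
SLOT_MIN_DEG, SLOT_MAX_DEG = 0.0, 180.0    # arcuate slot in the lower cone
PARK_DEG = 270.0                            # inside the protected upper region

def disc_angle_for(action: str, viewpoint_deg: float = 0.0) -> float:
    """Return the commanded disc rotation for an imaging or transit action."""
    if action == "image":
        # clamp the requested viewpoint onto the slot's arc
        return min(max(viewpoint_deg, SLOT_MIN_DEG), SLOT_MAX_DEG)
    # any transit through the canopy: shield the lens behind the housing
    return PARK_DEG

print(disc_angle_for("image", 135.0))  # expose lens at the slot
print(disc_angle_for("descend"))       # park camera in protected region
```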


As discussed earlier, the bicone sensing and manipulation tip 16 may be rotatably coupled to the tip suspension cable or tubular telescoping pipe sections 56 by a pan joint 92. The pan joint 92 is operatively coupled to the computer-based control system 32 and controlled to rotate the sensing and manipulation tip 16 about an axis of the tip suspension cable or tubular telescoping pipe sections 56. In this way the sensing and manipulation tip 16 can be rotated to enable a full 360-degree field of view about the axis of the tip suspension cable or tubular telescoping pipe sections 56.


Advantageously, the rotating disc 94 may further have arranged thereon any one of or a variety of the sensors 96 (shown schematically) discussed herein or below, for example: the distance sensor(s) or LiDAR sensor(s) 48, air flow sensor(s) 78, air quality sensors 80, light intensity and/or light spectrum sensor(s) 82, humidity sensor 106, CO2 sensor(s) 76, fluorescence sensor 90, for example, or other sensors as would be known to those of skill in the art. The sensors 96 may be arranged at any variety of positions on the rotating disc 94.


As discussed previously, it is important to note that any one of or multiple of the sensors may alternately or additionally be arranged within or on the housing of the sensing or manipulation tip 16 rather than being arranged on the rotating disc 94.



FIG. 8 is a schematic illustration in which the aerial support and positioning system of the robotic sensor and manipulation platform 10 is embodied as a gantry aerial support and positioning device 110, aerially supporting and positioning the robotic sensor and manipulation platform 10 above a plant growth area. The gantry aerial support and positioning device 110 has a plurality of longitudinal rails 112 on which a bridge member 116 is configured to be moved to control system commanded positions along the longitudinal rails 112 in the longitudinal direction 118. The longitudinal rails 112 and/or the bridge member 116 are provided with at least one motor drive responsive to commands from the computer-based control system to move and position the bridge member 116 in the longitudinal direction 118 on the longitudinal rails 112. The robotic base 14 of the robotic sensor and manipulation platform 10 is connected to and supported on the bridge member 116. The bridge member 116 includes a motor drive responsive to commands from the computer-based control system to move/reposition the robotic base 14 in the transverse direction 120 to commanded positions along the bridge member 116. As discussed earlier, the tip positioning mechanism 20 of the robotic base 14 is responsive to commands from the computer-based control system to move the sensing and manipulation tip 16 in the vertical or Z direction 122 into commanded positions above or within the plant growth area.


The LiDAR sensor 48 is a scanner utilizing pulsed light energy emitted from a rapidly firing laser. The light travels to the ground, plant leaf, or other obstacles and is reflected off objects such as branches, leaves, etc. The reflected light energy then returns to the LiDAR sensor, where it is detected and processed by the computer-based control system 32 to determine distances from the sensor manipulation tip 16 to neighboring objects or obstacles. The LiDAR sensor or scanner can determine the distance between itself and an object by monitoring how long it takes a pulse of light to bounce back. The concept is similar to radar, except that it uses infrared light rather than radio waves. While radar is designed to be used across greater distances, LiDAR generally works over shorter distances, due to the way light is absorbed by objects in its path. By sending, for example, hundreds of thousands of light pulses every second, the LiDAR sensor or scanner can advantageously determine distances and object sizes with relative accuracy over the relatively small distances in a plant growth area.
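

The underlying time-of-flight computation is straightforward, as the short sketch below shows; the pulse timing value is an illustrative example.

```python
# A minimal sketch of the time-of-flight distance computation applied to
# LiDAR returns; the round-trip time value is illustrative.
SPEED_OF_LIGHT_M_S = 299_792_458.0

def tof_distance_m(round_trip_s: float) -> float:
    """Distance to the reflecting object: the pulse travels out and back,
    so the one-way distance is half the round-trip path."""
    return SPEED_OF_LIGHT_M_S * round_trip_s / 2.0

# Example: a return after 13.3 nanoseconds is roughly 2 m away,
# a plausible leaf-to-sensor distance inside a plant growth area.
print(round(tof_distance_m(13.3e-9), 2))  # ~1.99
```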


As an alternative to, or in addition to, a LiDAR distance sensor, a Time-of-Flight single- or multi-zone ranging sensor might be used.


The sensor manipulation tip 16 preferably includes one or more temperature sensors 66, particularly for sensing air temperatures and temperature variations within the geometric plant growth area 40, detecting a 2D or 3D profile of how temperature changes across the plant growth area 40, such that the control system 32 can adjust temperatures of air cooling or air heating units above or about the plant growth area 40. For example, standard industry type infrared arrays might be used as temperature sensors, allowing for a measurement of not just the environmental temperature but the temperature of a plant and even the temperature distribution on a plant.


The sensor manipulation tip 16 preferably includes one or more cameras taking images; a camera may also serve as a distance sensor, for example by measuring changes in the focal length of the image, or the camera may be embodied to take stereo images from which distance can be calculated by triangulation methods. As enabling, non-limiting examples, one or more cameras such as an Arducam™ 12 MP or Luxonis OAK-1-PCBA™ might be included in the sensor manipulation tip 16. The camera might be integrated, e.g. on a PCB, with chips performing AI modules directly on-board. The cameras might be equipped with autofocus systems for distance measurements.


The robotic base 14 and/or robotic sensor and manipulation tip 16 preferably includes one or more multi- or hyperspectral sensors 74. Multi- and hyperspectral sensors are devices which record images using a wide portion of the electromagnetic spectrum. These sensors capture an image in a number of slices or spectral bands, each representing a portion of the spectrum. These spectral bands may then be combined to form a three-dimensional composite image. The resultant images or hyperspectral cubes provide data for a definitive, deep layer analysis of the plant materials or minerals which make up the scanned area. Hyperspectral imaging is known to be a valuable diagnostic tool in agricultural crop monitoring applications and mineralogy fields. Hyperspectral sensors may be applied with the control system 32 to create images and predictive reports which may assist in the early detection of plant disease outbreaks and overall plant health. Hyperspectral sensors can also be applied with the control system 32 to measure and determine nutrient levels in standing crops and water levels in the surrounding soil.
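

As one concrete example of combining spectral bands, the sketch below computes the Normalized Difference Vegetation Index (NDVI, also referenced in connection with FIG. 15) per pixel from red and near-infrared reflectance; the array values are illustrative, not measured data.

```python
# A minimal sketch of a spectral-band combination: per-pixel NDVI from
# the red and near-infrared bands of a multispectral image.
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """NDVI = (NIR - Red) / (NIR + Red); healthy vegetation reflects
    strongly in NIR and absorbs red, so values approach +1."""
    denom = nir + red
    safe = np.where(denom == 0, 1.0, denom)   # avoid division by zero
    return np.where(denom == 0, 0.0, (nir - red) / safe)

nir = np.array([[0.60, 0.55], [0.20, 0.50]])   # near-infrared reflectance
red = np.array([[0.10, 0.12], [0.18, 0.40]])   # red reflectance
print(ndvi(nir, red))  # low values flag stressed or sparse vegetation
```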


The robotic base 14 and/or the robotic sensor and manipulation tip 16 may include one or more CO2 sensors, detecting carbon dioxide levels in the ambient air in the plant growth area 40. For example, CO2 sensors such as Sensirion's SCD4x™, or combined sensors for CO2, temperature and/or humidity like Sensirion's SCD30™, might be used.


The robotic base 14 and/or the robotic sensor and manipulation tip 16 may include one or more airflow sensors 78, detecting air flow speed and/or direction in the plant growth area 40. For example, hot wire anemometers might be used, particularly in indoor environments, and spinning cup anemometers might be used, particularly in outdoor environments.


The robotic base 14 and/or the robotic sensor and manipulation tip 16 may include one or more air quality sensors 80, for example: particulate sensors (PM 2.5, PM 5), TVOC (total volatile organic compound) sensors, humidity sensors, ozone sensors, and CO2 sensors (as above), as well as other air quality sensors as would be known to those skilled in the art. An example of such a sensor is the Bosch™ BME 680, which can measure humidity, barometric pressure, and temperature, and additionally contains a MOX sensor. The heated metal oxide changes resistance based on the volatile organic compounds (VOCs) in the air, so it can be used to detect gases such as ethanol and carbon monoxide and to perform air quality measurements.


The robotic base 14 and/or the robotic sensor and manipulation tip 16 may include light intensity and light spectrum sensors 82. Such sensors might be highly specialized (Extended) Photosynthetically Active Radiation sensors or rather standard sensors, e.g. the Adafruit™ AS7341 light sensor or any other multi-channel spectral sensor. For some applications, one or more sensors capturing the full spectrum, combining visual light, near infrared and mid infrared, may be advantageous.


The robotic base 14 and/or the robotic sensor and manipulation tip 16 may include a pan tilt camera unit 84, preferably rotatable by 360 degrees freely or by 180 degrees in both directions.


The robotic base 14 preferably includes an air pressurization mechanism 86, for example an air compressor device. The air pressurization mechanism 86 is responsive to commands from the control system 32 to pressurize, on command, an interior channel in the tip suspension cable 56 so as to stiffen the cable against flexing, positionally stabilizing the robotic sensor and manipulation tip 16 against swinging or deflection relative to the robotic base 14. This can be especially important when spraying or pruning plants.


The pan/tilt camera unit 84 and other cameras of the robotic base 14 and/or the robotic sensor and manipulation tip 16 are operable by the control system 32 as another means to detect undesirable swinging or movement of the robotic sensor and manipulation tip 16 and to initiate the air pressurization mechanism 86 in response.
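

One plausible trigger for the pressurization, sketched below, monitors lateral acceleration of the tip (from the motion sensor 108 or inferred from camera data); the sampling scheme and threshold are assumptions for illustration only.

```python
# A minimal sketch of sway detection triggering the air pressurization
# mechanism 86 to stiffen the tip suspension cable 56; the threshold
# value is an illustrative assumption.
def should_pressurize(lateral_accel_samples_m_s2: list[float],
                      threshold_m_s2: float = 0.5) -> bool:
    """Pressurize when the tip's lateral acceleration oscillates beyond a
    tolerance, e.g. before precision spraying or pruning."""
    peak = max(abs(a) for a in lateral_accel_samples_m_s2)
    return peak > threshold_m_s2

samples = [0.1, -0.3, 0.7, -0.6, 0.2]   # recent accelerometer readings
if should_pressurize(samples):
    print("command: pressurize tip suspension cable to damp swinging")
```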


The robotic base 14 and/or the robotic sensor and manipulation tip 16 may preferably include at least one tip orientation detection means 88, detecting rotational orientation of the robotic sensor and manipulation tip 16 relative to the robotic base 14. The rotational orientation of the robotic sensor and manipulation tip 16 relative to the robotic base 14 may also be detected by the pan tilt camera 84 of the robotic base 14.


The robotic base 14 and/or the robotic sensor and manipulation tip 16 may preferably include at least one motion sensor 108, e.g., a combined accelerometer, an accurate closed-loop triaxial gyroscope, and a triaxial geomagnetic sensor, as known e.g. from smart phones and the like.


The robotic sensor and manipulation tip 16 preferably may include at least one fluorescence sensor 90 operative to study chlorophyll and to measure dissolved oxygen concentrations. The at least one fluorescence sensor 90 detects chlorophyll fluorescence (CF) data and communicates with the control system 32 to provide a vital understanding of plant health and crop photosynthesis. In some embodiments, the at least one fluorescence sensor 90 collects image data at high resolution across the chlorophyll fluorescence emission spectrum, preferably from 670 to 780 nm, to allow both the ‘Oxygen-A’ and ‘Oxygen-B’ bands to be measured for more accurate insight into plant photosynthetic processes. The at least one fluorescence sensor 90 is preferably rotatable by up to 360 degrees about the robotic sensor and manipulation tip 16.


The tip positioning mechanism 20 of the robotic base 14 may include a force detection sensor 30 in communication with the control system 32 and detecting forces applied on the tip positioning mechanism 20, tip suspension cable 56 or the robotic sensor and manipulation tip 16 for detecting encountered obstacles or entanglements of the robotic sensing and manipulation tip 16. In some embodiments, the force detection sensor 30 may be a motor current sensor, detecting variations or increases in the motor current draw of the tip positioning mechanism 20 that indicate entanglement.
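

One plausible realization of the motor-current variant is sketched below: a sustained rise above a moving baseline of recent current draws suggests a snag. The window size and alarm ratio are assumed example values, not parameters from this disclosure.

```python
# A minimal sketch of entanglement detection from motor current draw;
# window and ratio are illustrative assumptions.
from collections import deque

class EntanglementDetector:
    def __init__(self, window: int = 5, ratio: float = 1.5):
        self.baseline = deque(maxlen=window)  # recent "normal" current draws
        self.ratio = ratio                    # alarm at 150% of baseline

    def update(self, current_a: float) -> bool:
        """Feed one motor-current sample; return True on suspected snag."""
        if len(self.baseline) == self.baseline.maxlen:
            avg = sum(self.baseline) / len(self.baseline)
            if current_a > self.ratio * avg:
                return True                   # keep spike out of baseline
        self.baseline.append(current_a)
        return False

detector = EntanglementDetector()
for sample in [0.80, 0.82, 0.79, 0.81, 0.80, 1.90]:   # last sample: snag
    if detector.update(sample):
        print("entanglement suspected: halt tip and retract")
```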


Those skilled in the art will recognize that all sensors described herein are in communication with the computer-based control system 32, providing sensor data to the control system 32 for plant health analysis, 3D model generation of the plant growth area, generation of a 3D topology of the plant growth area, and to enable the autonomous, automated operation of the robotic base 14 and the robotic sensor and manipulation tip 16, as well as the reporting functions of the compute box/computer control system 32B and cloud provided services.



FIG. 9 is a flow chart diagram illustrating various processes used by the aerial sensor and manipulation platform for detecting problematic microclimates, enabling the platform to be scheduled to periodically return to desired locations above the plant growth area. The method 200 includes, but is not limited to, the step of detecting 201 one or more microclimates in the plant growth area over some predetermined time period. Those skilled in the art will recognize that a microclimate is the climate of a very small or restricted area, especially when it differs from the climate of the surrounding area. Detection is accomplished by first coarsely sampling 203 the plant canopy space and compiling and/or collecting 205 these measurements at all measured locations.


Critical areas whose climate exceeds various predetermined standards, such as temperature, humidity and light intensity, or where the rate of change exceeds predetermined standards, are identified 207. For example, a parameter in a plant growth area may be measured several times at predetermined time intervals, e.g., one day, and zones of the plant growth area with high volatility of that parameter may thereby be determined. Once identified, critical areas are further measured 209 by taking additional subsamples so that more information and more specific or “denser” geographic areas can be identified. The subsamples are compiled, and the resulting data is used to compute 211 a heat map of the local environment. The map can then be used by the aerial sensor and manipulation platform, enabling it to return 213 to the areas having larger or faster variations, e.g., in temperature, humidity or light intensity, at a more frequent interval. After each visit, the heat map can be updated with new data. Hence, the aerial sensor and manipulation platform can be scheduled to visit these microclimate locations for additional visits, providing new or additional applications of water, fertilizers, and/or pesticides.
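

For illustration, the volatility-based revisit scheduling might be sketched as follows; the grid size, volatility limit, and revisit intervals are assumed example values, not figures from this disclosure.

```python
# A minimal sketch of the FIG. 9 logic: coarse-sample the growth area,
# flag cells whose variability exceeds a standard, and schedule more
# frequent revisits there. All numeric values are illustrative.
import numpy as np

def revisit_intervals_h(readings: np.ndarray,
                        volatility_limit: float,
                        normal_h: float = 24.0,
                        critical_h: float = 4.0) -> np.ndarray:
    """readings: samples x rows x cols array of one parameter (e.g. temp C)
    measured at predetermined intervals. Cells whose variability across
    visits exceeds the limit get the shorter revisit interval."""
    volatility = readings.std(axis=0)            # per-cell variability
    return np.where(volatility > volatility_limit, critical_h, normal_h)

# Three daily coarse-sampling passes over a 2x3 grid of canopy locations.
temps = np.array([[[22.0, 22.5, 27.0], [22.1, 22.4, 21.9]],
                  [[22.2, 22.6, 31.5], [22.0, 22.5, 22.1]],
                  [[21.9, 22.4, 24.0], [22.2, 22.6, 22.0]]])
print(revisit_intervals_h(temps, volatility_limit=1.0))
# the volatile upper-right cell gets a 4 h revisit interval
```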



FIG. 10 is a flow chart diagram illustrating a process for detecting bugs, pests and insects using one or more cameras, sensors and/or other imagers used in the aerial sensor and manipulation platform as described herein. Pests are initially labeled by expert labelers using techniques like active tracking, where a label is tracked throughout the history of images. This quickly results in a large number of labels, which are used to train a model of each anomaly and pest using machine learning. Unlabeled images can also be used to improve performance using self-supervised learning techniques. Being able to record images in close vicinity of the plants allows the system to take high resolution images from different viewpoints and thus see many crucial features of the anomaly, like leaf curling, color changes, interveinal yellowing, etc., and physiological details of pests, like shape, limbs, antennas, etc., which enable and improve detection performance. The pest detection method 300 includes the steps of checking 302 for pests in the plant canopy. The various locations to be checked are compiled 303, and a path to each location is computed 305. If the target location is above the canopy, then the sensor is moved 323 to that location and the camera is pointed in the desired direction 325. If pests are present, then their quantity and type can be identified and reported 327 for further action.


In situations where the target location is not above the canopy 307, a new location above the target location is computed 309. The aerial sensor and manipulation platform is moved to that new target location and sensors are used to detect 313 any impediments or obstacles. If the location is not accessible 315, then the sensor measurements are evaluated 317 to determine if there is any viable free space. If there are no alternatives, then the process starts again, where the path to the next location is computed 305. If an alternative is available, then the sensor is moved to that location 311 and the process continues.


When the location is determined to be accessible 315, the sensor platform can be moved and/or lowered into position 321. Thereafter, a camera, sensor or other imaging device is pointed in a requested direction and an image is captured 325. A determination is then made as to whether pests are present 327. If no pests are present, then the next location is computed and the process continues. However, if pests are present, then the presence, quantity and type of the pests can be reported for applications of pesticides or other further action.
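
A non-limiting sketch of the navigation logic of method 300 follows; the canopy height, the helper functions, and the stubbed obstacle and classification callbacks are illustrative assumptions introduced for this example only.

```python
CANOPY_TOP = 2.0  # assumed canopy height in metres (illustrative)

def point_above(target):
    """Compute a fallback viewpoint above a below-canopy target (step 309)."""
    x, y, _ = target
    return (x, y, CANOPY_TOP + 0.5)

def check_locations(locations, obstructed, classify):
    """locations: list of (x, y, z) targets (step 303);
    obstructed(p) -> bool models steps 313/315;
    classify(p) -> list of (pest_type, count) models steps 325/327."""
    reports = []
    for target in locations:
        if target[2] >= CANOPY_TOP:          # target already above canopy
            viewpoint = target               # step 323
        else:                                # step 307
            viewpoint = point_above(target)  # step 309
            if obstructed(viewpoint):        # no viable free space (step 317)
                continue                     # move on to the next location
            # otherwise lower the tip toward the target (step 321)
        pests = classify(viewpoint)          # capture and evaluate image
        if pests:
            reports.append((target, pests))  # report for further action
    return reports

# Toy usage with stubbed sensing:
locs = [(1.0, 1.0, 2.5), (2.0, 2.0, 1.0)]
print(check_locations(locs, obstructed=lambda p: False,
                      classify=lambda p: [("aphid", 3)]))
```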


In yet another embodiment of the invention, solutions are provided for determining and correcting how actuators impact the growing environment. For that purpose, the mobile platform measures the full three-dimensional (3D) space, even below the canopy, using the lowering mechanism. These problems are not solved by prior art systems. By taking measurements at different locations, the system can identify how each actuator impacts the growing environment, a process known as "system identification". In an initial system identification phase, the system has full control over the actuators and does not need to ensure ideal growing conditions. It moves to uniformly distributed waypoints and enables each actuator and each combination of actuators for a period of time which allows the impacted measurements to converge. These step responses allow the system to determine and/or provide a predictive model for how each actuator, and any respective combination of actuators, impacts the environment over time. For example, it can be understood how the fans blow cool air from the air conditioners (AC) or dry air from the dehumidifiers through the growing environment, and how activating the ACs increases the relative humidity nearby, which can lead to condensation of moisture on the leaves, which in turn can lead to pest infections like powdery mildew. These spatio-temporal correlations are modeled using Dynamic Linear Models (DLM) with spatial covariates or other techniques used for Bayesian time series modeling. The control algorithm is then implemented as a partially observable Markov decision process (POMDP) to accommodate noise, e.g., using reinforcement learning as a prominent solution technique.
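
By way of illustration only, the sketch below fits a first-order step-response model to the measurements recorded after an actuator is switched on; the first-order form and the fitting shortcut are simplifying assumptions, whereas the disclosure contemplates richer spatio-temporal models such as DLMs.

```python
import numpy as np

def fit_first_order_step(t, y):
    """Estimate steady-state gain and time constant tau of a step
    response m(t) = gain * (1 - exp(-t / tau)), sampled at times t
    (seconds) with converged measurements y."""
    gain = y[-1]                                # measurements have stabilized
    # tau is roughly the time to reach ~63.2% of the steady-state value
    idx = int(np.searchsorted(y, 0.632 * gain))
    tau = t[min(idx, len(t) - 1)]
    return gain, tau

# Synthetic humidity response to enabling a humidifier:
t = np.linspace(0, 60, 61)
y = 3.0 * (1 - np.exp(-t / 12.0))
print(fit_first_order_step(t, y))  # approximately (3.0, 12.0)
```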


These predictive models, combined with the understanding of microclimates and of the impact of each actuator, i.e., environmental control system, over time, can then be used to compute the optimal control output for each of these actuators to create the ideal growing conditions for a crop at each moment in the crop's growth cycle. Optimizing the growing conditions results in increased yields and reduced operational costs.


Later, when collecting the measurements during regular operation, the actuators need to be controlled in a way which provides the best growing conditions. The actuators are controlled accordingly while the system is moving from waypoint to waypoint, covering the full space. Hence, the collected data is a mix of the location-specific tendencies and the general climate across the room recorded at a specific time. When recording the measurements, it is crucial to measure along more than one path to improve the separation of spatial and temporal influences. By moving in different trajectories (see below), the system can separate the spatio-temporal correlations, i.e., it can distinguish measurement fluctuations caused by measurements taken at different times from genuine spatial microclimates. This predictive model can then be used to optimize the environmental control, since the microclimates in the environment are known and the dynamic changes, i.e., fluctuations, can be minimized.


With regard to both FIG. 11A and FIG. 11B, the process 400 includes steps for determining both a system identification and a spatio-temporal predictive model. The system identification process begins with each of the environmental actuators, such as light, fan, humidifier, dehumidifier and air conditioner, turned off and/or disabled 401, as well as various combinations thereof 403. The waypoints are computed based upon two orthogonal geometric or "snake" motion patterns with flipped major and minor axes 405. Those skilled in the art will recognize that switching the pattern allows for an improved separation of any spatio-temporal correlations. The process then moves to the next waypoint while measuring distance to the canopy, using lidar, time-of-flight, camera-based structure-from-motion (SfM), or other distance sensors, where height is adjusted to maintain a constant distance 407. Environmental parameters above and below the canopy are measured by lowering the measurement tip. Using the same sensor set as above and/or the NDVI sensor, the system identifies an ideal location to drop the measurement tip in the vicinity of the waypoint 409. The actuators are then turned on 411 until the system response has been fully captured, i.e., until the measurements stabilize 413. The actuators are then turned off 415 and a determination is made whether the canopy is too close to the system 417. If the canopy is too close to the sensor, such that the system risks touching and/or damaging it, then all of the remaining waypoints along the major axis are skipped 419. A next waypoint is picked on the minor axis 407, effectively avoiding any obstacles. The process then continues moving to the next waypoint 409. If, however, the canopy is not too close to the system 417, then the process continues to the next waypoint 407. Thereafter, the system identification is calculated based upon the actuators' step response functions. The predictive model is generated using techniques used in computational fluid dynamics (CFD), like finite element methods, and more computationally efficient neural networks can be trained using the resulting models.


With regard to the process for determining a spatio-temporal correlation, after the actuators are disabled 401, each actuator is switched on and off 425 at a constant frequency, which has to be slower than half the measurement frequency (per the Nyquist criterion), ideally 10× slower, while moving through space in the snake motion patterns outlined in FIG. 11B. Thereafter, a spatio-temporal model is calculated for each actuator 427 using, e.g., Dynamic Linear Models (DLM) with spatial covariates or other techniques used for Bayesian time series modeling. The process can be repeated to improve the modeling results.



FIG. 11B illustrates the orthogonal Snake A 427 and Snake B 429 motion patterns used in the process shown in FIG. 11A. As noted herein, the two patterns are orthogonally reversed or "flipped" about the major and minor axes. Switching between these patterns in the system identification process improves the separation of spatio-temporal correlations and, thus, achieves a better result.
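
A minimal, non-limiting sketch of generating the two serpentine patterns follows; the grid dimensions and the (row, col) ordering are assumptions made for illustration.

```python
def snake_waypoints(n_major, n_minor, flipped=False):
    """Yield (row, col) grid waypoints in serpentine order.
    flipped=False -> Snake A (rows form the major axis);
    flipped=True  -> Snake B (same pattern with axes swapped)."""
    points = []
    for i in range(n_major):
        # reverse direction on every other pass of the major axis
        cols = range(n_minor) if i % 2 == 0 else range(n_minor - 1, -1, -1)
        for j in cols:
            points.append((j, i) if flipped else (i, j))
    return points

print(snake_waypoints(2, 3))                 # Snake A over a 2x3 grid
print(snake_waypoints(3, 2, flipped=True))   # Snake B with flipped axes
```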



FIG. 12 is a flow chart diagram illustrating a method of automated height adjustment for the platform to optimize the cultivation environment. An agricultural farmer or "grower" often runs different cultivars or strains in the same cultivation environment. Each of these cultivars will grow at a different speed and to a different height. In yet another embodiment of the invention, the mobile farming platform as described herein can use a distance sensor, e.g., a time-of-flight (TOF) sensor and/or a camera-based structure-from-motion or stereo algorithm, to create key performance indicators (KPIs) such as plant height, average growth speed, etc. The distance measurements are also used to compute the optimal height at which the platform should run above each plant to ensure safety for the plants and the best measurement results. This approach allows running the platform at different heights for each location in the cultivation environment during a single measurement run.


As seen in FIG. 12, the automated height adjustment process 500 begins by measuring the height of each plant via a TOF sensor 501. A plant height map is created for each cultivation environment 503. The plant height map is then used to determine an optimal height of the platform for each point in the workspace 505. The platform is then moved at the optimal heights within the cultivation environment 507, after which the process begins again 501. The plant height map is also used to create cultivation key performance indicators (KPIs) such as plant height, biomass, growing speed and various averages used by the grower to optimize plant yield 509.
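
The following non-limiting sketch shows one way the height map of steps 503-505 could be turned into per-cell platform heights; the 3x3 neighborhood rule and the clearance value are illustrative assumptions.

```python
import numpy as np

def optimal_platform_height(height_map, clearance=0.5):
    """height_map: 2D array of measured plant heights (m) per grid
    cell (steps 501/503). Returns a per-cell platform height that
    keeps a safety clearance above the tallest nearby plant (505)."""
    # Local 3x3 maximum so the platform also clears neighbouring
    # tall plants, not just the plant directly below.
    padded = np.pad(height_map, 1, mode="edge")
    h, w = height_map.shape
    local_max = np.maximum.reduce([
        padded[i:i + h, j:j + w] for i in range(3) for j in range(3)
    ])
    return local_max + clearance

heights = np.array([[0.4, 0.6],
                    [1.1, 0.5]])
print(optimal_platform_height(heights))  # every cell clears the 1.1 m plant
```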


Further, once the system is installed, the dimensions of the workspace need to be known to enable position-controlled operation of the mobile platform in the grow environment. Also, the rope length might creep over time, so a recalibration might be required. A force-based control system is implemented which moves the platform from one winch to another. The forces are measured using dedicated force sensors, like strain gauges, or by simply measuring the motor current. The algorithm iterates over each winch multiple times and sets that winch into pull-mode, applying a higher force than the other winches, which are in release-mode, keeping a minimum tension to prevent the platform from lowering into the canopy. For such a control concept, the workspace dimensions are not required. The length of the rope is measured using the winch encoders. Based on these measurements, the workspace dimensions are computed. The system stops navigating once an adequate consistency (small measurement residuals) has been achieved. As more measurements are added to the dataset, outliers are suppressed, e.g., using M-estimators.
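
As a non-limiting illustration of the outlier suppression mentioned above, the sketch below combines repeated winch-to-winch rope-length readings using Huber-style iteratively reweighted averaging; the threshold k and the iteration count are assumptions for this example.

```python
import numpy as np

def huber_mean(measurements, k=0.05, iters=20):
    """Robust estimate of a rope length (metres): readings further
    than k from the current estimate are down-weighted, so a few bad
    encoder readings do not corrupt the computed workspace dimension."""
    m = np.asarray(measurements, dtype=float)
    x = float(np.median(m))                  # robust starting point
    for _ in range(iters):
        r = np.abs(m - x)
        w = np.where(r <= k, 1.0, k / np.maximum(r, 1e-12))
        x = float(np.sum(w * m) / np.sum(w))
    return x

# One gross outlier among consistent encoder readings:
print(huber_mean([10.02, 10.01, 10.03, 12.50, 10.02]))  # ~10.02
```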


The force sensing mechanism can also be used to detect whether the system is stuck, jammed or otherwise inoperable, or whether a human would like to stop the system by gently pulling one of the ropes.



FIG. 13 is a flow chart diagram illustrating a method for locating static support sensors. In another embodiment of the invention, the grower can deploy sensors in the field to measure relevant environmental factors such as soil moisture, soil electrical conductivity (EC), soil temperature, irrigation temperature, irrigation EC, run-off EC, etc. The exact location of these sensors is necessary to allow growers to interpret their measurements. Manually localizing them on a map is tedious and error prone. Moreover, these sensors can be moved with the plants, so that their locations must be continually updated. In response to this issue, the mobile platform as described herein can move across the full three-dimensional (3D) space. By measuring the received signal strength indicator (RSSI) and the time of flight (TOF) of the wireless communication from a grid of locations, the methods described herein calculate the coarse location of a sensor. Thereafter, once in the vicinity of the sensor, an LED on the sensor is turned on and off or "toggled" to enable image-based localization from two or more images. To improve performance, this process can be performed in darkness to simplify the LED detection.


As seen in FIG. 13, the static support location process 600 begins by computing waypoints based on an orthogonal motion pattern 601. The device is interrogated or "pinged" from each waypoint, and its received signal strength indicator (RSSI) is recorded along with the time for the ping to be returned to the source, often referred to as time-of-flight (TOF). A coarse location is then computed based on the measured RSSI and TOF using a linear system of trilateration equations. Those skilled in the art will recognize that the term "trilateration" means the measurement of the lengths of the three sides of a series of touching or overlapping triangles on the earth's surface for the determination of the relative position of points by geometrical means, such as in geodesy, map making, and surveying. To further improve location accuracy, the platform can be moved to a location where a light source such as an LED can be activated 607. Thereafter, images can be recorded at two or more nearby locations 609 and an accurate location can be computed by triangulating the image coordinates of the identified LEDs.
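
By way of illustration only, the following sketch shows one standard way TOF range estimates from known platform positions can be linearized and solved by least squares; the synthetic positions, the noise-free ranges, and the pure-TOF simplification (the real system also mixes in RSSI) are assumptions.

```python
import numpy as np

def trilaterate(anchors, ranges):
    """anchors: (n, 3) known platform positions; ranges: (n,) measured
    distances to the static sensor. Subtracting the first range
    equation from the others yields the linear system
    2*(p_i - p_0) . x = r_0^2 - r_i^2 + |p_i|^2 - |p_0|^2,
    solved by least squares. Needs n >= 4 non-coplanar points in 3D."""
    p0, r0 = anchors[0], ranges[0]
    A = 2.0 * (anchors[1:] - p0)
    b = (r0**2 - ranges[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - np.sum(p0**2))
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    return sol

anchors = np.array([[0., 0., 3.], [4., 0., 3.], [0., 4., 3.], [4., 4., 2.5]])
sensor = np.array([1.5, 2.0, 0.0])
ranges = np.linalg.norm(anchors - sensor, axis=1)  # ideal TOF ranges
print(trilaterate(anchors, ranges))                # ~[1.5, 2.0, 0.0]
```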



FIG. 14 is a block diagram illustrating the tracking of both individual plants and batches of plants, with passive or active markers, using a ceiling-mounted mobile platform. In still another embodiment of the invention, a method of tracking plants is provided. Plants are often moved around in a growing facility throughout their lifetime, and in order to provide a history of their growing conditions, these plants need to be tracked. As described herein, a proposed solution provides a method where each plant or batch of plants is equipped with passive markers (e.g., QR codes, passive RFID tags) or active markers (e.g., active RFID tags). The mobile platform will detect these markers and will track them over time. Since the markers are spatially referenced, the growing conditions for each plant will also be known, despite the plants potentially being moved around throughout their lifetime.


As seen in FIG. 14, the plant tracking system and method includes a plurality of plants 701, where each plant includes a QR code or RFID tag 703. The mobile platform 705 includes a camera or RFID sensor enabling each of the plants to be tracked while moved throughout a growing environment. These techniques can be further extended by using a drop-down sensor 707 to read a plant identifier (barcode, RFID, etc.) that might be hidden below the plant canopy. This information is further used to automatically create a map of each plant in the cultivation environment with its exact location. After harvest, this mechanism also allows one to fully understand what exact microclimates a plant was exposed to during its lifetime and to correlate this with the harvest result of each individual plant, so as to make further recommendations for optimal control and plant yield strategies.
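
A non-limiting sketch of such a spatially referenced tracking map follows; the data structures and the PlantTracker interface are assumptions introduced only for this example.

```python
from collections import defaultdict

class PlantTracker:
    """Accumulates marker sightings so each plant's location history,
    and hence its growing conditions, can be reconstructed even after
    the plant has been moved."""
    def __init__(self):
        self.history = defaultdict(list)   # plant_id -> [(time, x, y)]

    def record_sighting(self, plant_id, t, x, y):
        self.history[plant_id].append((t, x, y))

    def location_at(self, plant_id, t):
        """Last known location of a plant at or before time t."""
        past = [s for s in self.history[plant_id] if s[0] <= t]
        return past[-1][1:] if past else None

tracker = PlantTracker()
tracker.record_sighting("plant-42", t=0, x=1.0, y=2.0)
tracker.record_sighting("plant-42", t=5, x=4.0, y=1.0)   # plant was moved
print(tracker.location_at("plant-42", t=3))  # (1.0, 2.0)
```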



FIG. 15 is a block diagram illustrating a system and method of determining a Normalized Difference Vegetation Index (NDVI) calibration. NDVI is a technology developed and used for outdoor agriculture, in plain sunlight. Using the conventional approach in indoor or greenhouse cultivation with supplemental light will inherently fail in view of the differing light spectrum. The present invention provides a solution where each channel in a red-green-blue (RGB) camera is calibrated to account for the respective light source. For this, the system records images over a white resting shield. The white resting shield has a nearly Lambertian surface to provide uniform light reflection. Thereafter, the camera is adjusted to ensure that the recorded values are centered in each channel; the values which typically need to be adjusted are the shutter speed and the white balance factors. FIG. 15 illustrates the NDVI calibration system 800 where the camera 803 includes a white resting shield 804. The white resting shield is positioned over one or more plants 801 so as to provide uniform light reflection.
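
The sketch below illustrates, under simplifying assumptions, a per-channel calibration against the shield followed by an NDVI computation; the two-channel (red/NIR) layout and the mid-range target level are assumptions for this example.

```python
import numpy as np

def channel_gains(white_image, target=0.5):
    """white_image: (h, w, c) floats in [0, 1] recorded over the
    near-Lambertian shield. Returns one gain per channel so the
    shield's mean value lands on the target (centered) level."""
    means = white_image.reshape(-1, white_image.shape[-1]).mean(axis=0)
    return target / means

def ndvi(red, nir):
    """Classic NDVI from calibrated red and near-infrared values."""
    return (nir - red) / np.maximum(nir + red, 1e-12)

white = np.full((4, 4, 2), [0.8, 0.4])     # channel 0 = red, 1 = NIR
g = channel_gains(white)                    # gains: [0.625, 1.25]
red_c, nir_c = np.array([0.24, 0.36]) * g   # calibrated plant reading
print(ndvi(red_c, nir_c))                   # ~0.5
```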



FIG. 16 is a flow chart diagram illustrating a method of reactive environmental control. In still another embodiment of the invention, measuring the full canopy takes time, and there are microclimates which change faster and need more attention to ensure consistent growing conditions. Instead of controlling the zone as one unitary environment and monitoring the space uniformly, the system of the present invention computes an uncertainty value based on previous measurements, where the uncertainty reflects the predicted dynamic rate of change. When determining the location for the next measurement based on this uncertainty value, the system prioritizes areas with high uncertainty over areas which appear steady and show low uncertainty. This approach can be extended further by considering not just the next location but the actual trajectory to be traversed. Thus, the uncertainty in space and the current location are taken into account, and a trajectory is computed by minimizing the uncertainty along that path.


As seen in FIG. 16, the method of reactive environmental control 900 includes the steps of taking measurements at uniformly sampled locations in space 901. Control parameters are computed based on a predictive model 903. A spatio-temporal model is applied to this data to separate spatial and temporal effects 905. Next, uncertainties and location-specific system dynamics are computed by calculating the standard deviation from the expected values over time 907, and a measurement location is computed which minimizes uncertainty at the next time of measurement 909. Thereafter, a measurement is made at the computed location 911 and a feedback loop is formed where the control parameters are again computed based on the predictive model 903.
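
The following non-limiting sketch illustrates steps 907-909 with a simple score that trades measured volatility against travel distance; the score and its weight are assumptions made for illustration.

```python
import numpy as np

def next_location(history, current, travel_weight=0.1):
    """history: dict (x, y) -> list of past readings per cell;
    current: (x, y) platform position. Returns the cell whose
    uncertainty (standard deviation, step 907) minus a travel penalty
    is highest, i.e., the most informative next measurement (909)."""
    def score(cell):
        sigma = np.std(history[cell])                        # uncertainty
        dist = np.hypot(cell[0] - current[0], cell[1] - current[1])
        return sigma - travel_weight * dist
    return max(history, key=score)

history = {(0, 0): [21.0, 21.1, 21.0],   # steady cell
           (3, 4): [20.0, 24.0, 18.5]}   # volatile microclimate
print(next_location(history, current=(0, 0)))  # (3, 4)
```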



FIG. 17A is a block diagram illustrating the system used for reactive pest control. FIG. 17B is a flow chart diagram illustrating a method of reactive pest control. In yet another embodiment of the invention, and as noted with regard to FIG. 10, growers often encounter pests which might appear at any location in a growing environment and at any time. Timely detection and intervention are crucial to avoid or minimize the spread of pests and, thus, crop damage. The present invention provides a system and method where the mobile platform can be used to monitor the plants for insect infestation. When pests are detected, the system notifies the grower and potentially reacts to the threat by spraying and/or releasing the necessary predators/parasites at the specific location. In use, the system keeps visiting the location and monitoring how the pest outbreak evolves. The system includes a winch that contains a storage platform for a dropbox. After notification, the growers deploy live benign predators of the pest at the storage platform and inform the system that the insects can be deployed. The platform picks up the dropbox, moves to the respective location, and activates a lever which releases the insects. Alternatively, the system also includes a winch with a connector for pesticide containers. After the growers have connected the container with the required liquid, the platform picks up the spray nozzle and deploys the pesticide where necessary. Limiting the deployment to the location of infestation is more cost efficient and environmentally friendly.


As seen in FIG. 17A, the system 1000 operates to detect plant infestation. A plant 1001 is shown with an insect infestation 1003. The mobile platform 1005, as described herein, includes a winch and cable connected to a dropbox 1007. The dropbox includes live insects 1009 that are predatory in nature to counter the insect infestation. The steps used in the system are shown in the flow chart of FIG. 17B. The reactive pest control process 1050 begins where measurements are taken to collect data at uniformly sampled locations in space 1051. This data is used to detect insects and pests 1053, and a notification is sent to the grower when pests are detected 1055. In response, predator parasites and/or chemicals are released at the infested plant locations 1057. The infested locations are subsequently revisited at periodic intervals 1059 and data is again collected 1053 until the infestation has been eliminated.



FIG. 18 is a block diagram illustrating a system for plant identification, continuous plant counting and identification of unused plant space. Growers must keep track of the number of plants in a cultivation environment while making maximum use of the cultivation space. Still another embodiment of the invention provides a system and method for the mobile platform to move in 3D space and algorithmically identify unused space in the cultivation environment. This is accomplished using imaging devices such as cameras (RGB, NDVI, and others) whose images provide a performance metric. Using cameras and certain image characteristics, such as green pixels, corner and shape detection, unused-area identification, the NDVI index, and others, unused planting space can be identified and crop yield can be estimated. The mobile platform as defined herein works to determine the overall number of plants in a cultivation environment and also constantly monitors the number of plant types in each cultivation environment. The system can also be used to determine plant properties, like crop count, stem branching, and others, using image-based machine learning techniques. FIG. 18 illustrates the plant identification system 1100 where the mobile platform 1101 can identify the total number of plants 1103, 1105, 1107 in a given space but also the amount of unused plant space 1107 that can help to raise future plant yield.
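
A minimal, non-limiting sketch of the green-pixel criterion follows; the per-pixel "green dominance" rule and its margin are assumptions, and the disclosure equally contemplates corner/shape detection and the NDVI index.

```python
import numpy as np

def unused_area_fraction(rgb, margin=0.05):
    """rgb: (h, w, 3) floats in [0, 1] from a top-down image. A pixel
    counts as canopy when its green channel exceeds both red and blue
    by the margin; the remaining fraction approximates unused space."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    canopy = (g > r + margin) & (g > b + margin)
    return 1.0 - float(canopy.mean())

img = np.zeros((2, 4, 3))
img[:, :2, 1] = 0.8               # left half: green canopy
img[:, 2:] = 0.5                  # right half: bare grey tray
print(unused_area_fraction(img))  # 0.5 of the area is unused
```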



FIG. 19 is a flow chart diagram illustrating the teleoperation of an autonomous environment monitoring system. Growers require the ability to inspect the plants at a remote facility. The mobile platform described herein can move in 3D space and offers a mode where the farmer can specify the next measurement location and what sensor measurements should be taken, e.g., images. Based on the transferred data, the user can determine the next measurement location and, thus, remotely control the platform. The interface can also support direct motion control, where the platform keeps a safe distance from the canopy by measuring the distance to the plants. This same remote mechanism can also be used to trigger a spraying operation via the platform at a particular user-defined location. In use, the user moves the platform to a particular location and then triggers the platform to spray. Thereafter, the platform is moved to a different point while spraying is active and then stops the spraying operation. FIG. 19 shows the steps used in this teleoperation process 1200 where the latest measurement data is shown 1201 and the user specifies the next location and sensors to directly measure, or directly controls the mobile platform's motion, using a user interface (UI).



FIG. 20 is a block diagram illustrating a system for charging remote plant sensors using a mobile platform. Growers often deploy wireless sensors in the field to measure relevant growing factors such as soil moisture, soil EC, soil temperature, irrigation temperature, irrigation EC, run-off EC, etc. These sensors are battery powered and need to be charged occasionally. Another embodiment of the present invention uses the mobile platform to recharge the devices. The platform can move across the full 3D space, so as to move a charging tip to the sensor so it can be inductively charged. FIG. 20 shows the charging system 1300 where a plant 1301 includes a sensor 1305. The mobile platform 1307 can be moved into position over or near the plant 1301 where a charging tip can be moved in close proximity to the sensor 1305 so that it can be inductively charged.


Finally, in order to train reliable machine learning (ML) models it is crucial that a diverse and meaningful dataset be collected. Uniformly sampling data will not result in such a dataset, but will instead contain a lot of redundant data of little value for machine learning. Instead, the present invention provides for returning to confirmed detections so as to collect more data on interesting and relevant features. This might be referred to as "active" data harvesting. These measurements will be performed from different viewpoints, at different distances, and under different lighting conditions to increase variability. The system can also calculate the importance of an image for pushing the performance of the respective ML model on the platform (on the edge), by assessing its location in the embedding space and how close it is to classification boundaries, and only upload data which will improve the neural network's performance.
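
A non-limiting sketch of such an on-edge selection rule follows, using the gap between the two most likely classes as a proxy for proximity to a classification boundary; the margin threshold is an assumption made for this example.

```python
import numpy as np

def should_upload(class_probs, margin_threshold=0.15):
    """class_probs: per-class probabilities for one image. A small
    gap between the two most likely classes means the image sits near
    a classification boundary and is likely to improve the model."""
    top2 = np.sort(class_probs)[-2:]
    return bool((top2[1] - top2[0]) < margin_threshold)

print(should_upload(np.array([0.48, 0.42, 0.10])))  # True: ambiguous image
print(should_upload(np.array([0.95, 0.03, 0.02])))  # False: confident, redundant
```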


Those skilled in the art will recognize that there are hundreds of thousands of plant consultants in the world. Each consultant has his or her own clients and will typically travel to each cultivation facility to assess performance metrics and to offer suggestions for performance improvement. The invention as described herein includes the capability to use the expertise of various experts and consultants in order to match a consultant to a cultivator when specific working knowledge or expertise in plant growth is required. In use, the expertise and data of each consultant are made available to a grower or cultivator, allowing them to browse through a network of consultants and experts and to request a quote for consulting services. The consultant can then ask for limited access to the cultivation data via the data platform as described herein to tailor the quote. After the consultant and cultivator agree to engage, the consultant is given access to the software platform to derive a full understanding of the status quo and to make suggestions for performance improvement. This includes the capability to remotely drive the sensor platform to certain locations in the cultivation space. These engagements can be hourly or project based (e.g., an expert is asked to assess a single image of a leaf that the cultivator has flagged as suspicious, or provides 15 hours of consulting services priced at a certain rate per hour). The invention as described herein provides an improved business method with the unique capability to access a comprehensive, canopy-wide assessment of the environmental conditions, including RGB images, temperature, humidity, lighting conditions, pest pressure, etc., over the full lifetime of the plants to enhance the overall business prospects of growers.


Thus, aspects of the present invention are directed to a robotic sensor and manipulation platform, and methods of use, that are configured to connect directly or indirectly to an aerial support and positioning system. The platform includes a robotic base connected to and moved to commanded positions by the aerial support and positioning system, and at least one sensing and manipulation tip deployable from the robotic base that includes one or more sensors for imaging and detecting climatic data and parameters. A motor-driven tip positioning mechanism is responsive to positioning commands from a control system. The tip positioning mechanism is arranged on the robotic base, connects the sensing and manipulation tip to the robotic base, and is operable to move the sensing and manipulation tip to desired positions above or in the plant canopy.


In the foregoing specification, specific embodiments of the present invention have been described. However, one of ordinary skill in the art will appreciate that various modifications and changes can be made without departing from the scope of the present invention as set forth in the claims below. Accordingly, the specification and figures are to be regarded in an illustrative rather than a restrictive sense, and all such modifications are intended to be included within the scope of the present invention. The benefits, advantages, solutions to problems, and any element(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as critical, required, or essential features or elements of any or all of the claims. The invention is defined solely by the appended claims, including any amendments made during the pendency of this application and all equivalents of those claims as issued.

Claims
  • 1. A robotic sensor and manipulation platform configured to connect directly or indirectly to an aerial support and positioning system, the robotic sensor and manipulation platform comprising: a robotic base connected to and moved to command positions by the aerial support and positioning system;at least one sensing and manipulation tip deployable from the robotic base where the tip positioning mechanism includes a motor drive responsive to positioning commands from a control system, the tip positioning mechanism arranged on the robotic base and connecting the sensing and manipulation tip to the robotic base and operable to move the sensing and manipulation tip to commanded positions above or in the plant canopy; and
  • 2. A robotic sensor and manipulation system as in claim 1, wherein the performance growing model is based on both a predictive actuation model and a spatio-temporal environment model.
  • 3. A robotic sensor and manipulation system as in claim 2, wherein the spatio-temporal environment model is modeled using Dynamic Linear Models (DLM) with spatial covariates.
  • 4. A robotic sensor and manipulation system as in claim 1, wherein the performance growing model computes waypoints based on a plurality of orthogonal motion patterns with flipped major and minor axes.
  • 5. A robotic sensor and manipulation system as in claim 1, wherein the performance growing model determines microclimates for minimizing climate fluctuation and optimizing environmental control.
  • 6. A robotic sensor and manipulation system as in claim 1, wherein the robotic sensor and manipulation platform moves in three-dimensional (3D) space and algorithmically identifies unused space in the cultivation environment or estimates crop yield.
  • 7. A robotic sensor and manipulation system as in claim 1, further comprising: at least one RGB camera that uses image characteristics to identify unused plant area to estimate crop yield.
  • 8. A robotic sensor and manipulation system as in claim 7, where the image characteristics used for unused planting space calculation include at least one of green pixels, corner and shape detection, unused area identification, or an NDVI index.
  • 9. A robotic sensor and manipulation system as in claim 1, further comprising: an active pest management system that operates to spray pesticides at predetermined locations.
  • 10. A robotic sensor and manipulation system as in claim 1, further comprising: an active pest management system that operates to release benign insects at predetermined locations.
  • 11. A robotic sensor and manipulation platform configured to connect directly or indirectly to an aerial support and positioning system, the robotic sensor and manipulation platform comprising: a robotic base connected to and moved to command positions by the aerial support and positioning system;at least one sensing and manipulation tip deployable from the robotic base where the tip positioning mechanism includes at least one sensor; and
  • 12. A robotic sensor and manipulation system as in claim 11, wherein the performance growing model is based on both a predictive environment control model and a spatio-temporal environment model.
  • 13. A robotic sensor and manipulation system as in claim 12, wherein the spatio-temporal environment model is modeled using Dynamic Linear Models (DLM) with spatial covariates.
  • 14. A robotic sensor and manipulation system as in claim 11, wherein the performance growing model determines microclimates for minimizing climate fluctuation and optimizing environmental control.
  • 15. A robotic sensor and manipulation system as in claim 11, wherein the robotic sensor and manipulation platform moves in three-dimensional (3D) space and algorithmically counts plants for identifying unused planting space in the cultivation environment or estimating crop yield.
  • 16. A robotic sensor and manipulation system as in claim 11, further comprising: an active pest management system that determines locations of plant infestation and operates to spray pesticides at the infested location.
  • 17. A robotic sensor and manipulation system as in claim 11, further comprising: an active pest management system that determines locations of plant infestation and operates to release benign insects at the infested location.
  • 18. A robotic sensor and manipulation system as in claim 11, further comprising: at least one RGB camera that uses image characteristics to identify unused plant area or to estimate crop yield.
  • 19. A robotic sensor and manipulation system as in claim 18, where the image characteristics include at least one of green pixels, corner and shape detection or an NDVI index.
  • 20. A robotic sensor and manipulation platform for determining how environmental control systems impact a growing environment comprising: a robotic base connected to and moved to command positions by the aerial support and positioning system;at least one sensing and manipulation tip deployable from the robotic base where the tip positioning mechanism includes at least one sensor; and
  • 21. A robotic sensor and manipulation system as in claim 20, wherein the performance growing model is based on both a predictive environment control model and a spatio-temporal environment model.
  • 22. A robotic sensor and manipulation system as in claim 21, wherein the spatio-temporal environment model is modeled using Dynamic Linear Models (DLM) with spatial covariates.
  • 23. A robotic sensor and manipulation system as in claim 18, wherein the performance growing model is computed based on measurements recorded at waypoints traversing a plurality of orthogonal motion patterns with flipped major and minor axes.
  • 24. A robotic sensor and manipulation system as in claim 20, wherein the robotic sensor and manipulation platform moves in three-dimensional (3D) space and algorithmically identifies unused space in the cultivation environment or estimates crop yield.
  • 25. A robotic sensor and manipulation system as in claim 20, further comprising: an active pest management system that operates to spray pesticides at predetermined locations.
  • 26. A robotic sensor and manipulation system as in claim 20, further comprising: an active pest management system that operates to release benign insects at predetermined locations.