The present invention relates to an insect elimination system for eliminating a flying insect in an airspace and to a use of such an insect elimination system.
Insect elimination may be carried out for a variety of reasons. For example, insects, in particular insect bites or stings, may cause a transfer of diseases such as malaria and the Zika virus. As another example, an insect in a room, such as a bedroom, may result in sleep deprivation or negatively affect the quality of sleep. Also, insects may contribute to damage to plants, for example by the insects themselves eating leaves of the plant, by larvae of the insects eating leaves of the plant, or by a transfer of diseases that harm the plant.
In order to carry out insect elimination, many solutions have been devised. For example, insecticides have been developed in order to repel insects. As another example, for indoor use, insect screens and mosquito nets may keep insects out of a certain area.
Furthermore, natural solutions may be deployed, for example certain plant varieties may be planted which keep insects away by dispersing certain substances.
The invention aims to provide an alternative means of insect repelling.
According to an aspect of the invention, there is provided an insect elimination system for eliminating a flying insect in an airspace, comprising:
Thus according to the invention, an unmanned aerial vehicle, UAV, also called a drone, is applied. The UAV can be guided through the airspace of interest and comprises at least one propeller for propelling the UAV through the airspace. The propeller may for example be a single propeller which rotates about a vertical axis in a horizontal plane. As another example, plural propellers may be provided, such as 2, 3 or 4 propellers.
The system further comprises a camera, such as a video camera, which images at least a part of the airspace and provides image data, such as video data, representative of at least a part of the airspace. A single camera device may be employed. Alternatively, plural camera devices may be employed, for example each surveying a different part of the airspace, or a combination of a stationary camera device and a camera device that is attached to the UAV. The camera may for example be a CMOS camera or employ any other camera technology. The image data may comprise video data, a time sequence of pictures, or any other image data. The image data, as output by the camera, is provided via an image data connection to a controller. The controller (which may also be identified as a control device: the terms controller and control device are interchangeable in the present document) may for example be formed by any control logic or data processing device, such as a microprocessor circuit, a microcontroller circuit, etc. The image data connection may be formed by any analogue or digital data connection and may be e.g. a wired connection or a wireless connection, depending on the locations of the camera and the controller. For example, a stationary camera and a stationary controller may communicate via a wired connection, while in case the camera is stationary and the controller is provided at the UAV, a wireless connection may be employed. The controller monitors the image data for the occurrence of an image of an insect. Thereto, the controller may employ image processing to recognize an image of an insect. Alternatively, other criteria may be applied, such as a speed of movement of an item through the airspace, as will be explained in more detail below.
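The monitoring of the image data for an image of an insect may be sketched as follows. This is a minimal illustrative example, not the claimed implementation: it assumes greyscale frames are available as numpy arrays, and the function name, thresholds, and area bounds are hypothetical placeholders for the size and movement criteria described above.

```python
import numpy as np

def detect_insect(prev_frame, frame, min_area=3, max_area=200, threshold=30):
    """Flag a small moving object by frame differencing (illustrative criteria).

    prev_frame, frame: 2-D greyscale arrays of equal shape.
    Returns the (row, col) centroid of the moving region, or None.
    """
    diff = np.abs(frame.astype(int) - prev_frame.astype(int))
    mask = diff > threshold
    area = int(mask.sum())
    # A small moving blob is taken as a candidate insect; large changes
    # (lighting shifts, big objects) are rejected by the area bounds.
    if not (min_area <= area <= max_area):
        return None
    rows, cols = np.nonzero(mask)
    return (float(rows.mean()), float(cols.mean()))
```

In practice, the controller would apply such a check to each new frame of the image data and proceed to guide the UAV once a candidate insect is found.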
Having detected an image of an insect, the controller proceeds by guiding the UAV. Thereto, the controller is connected to the UAV by means of any suitable control data connection, such as a wireless data connection. The controller guides the UAV to hit the insect by the propeller. As the propeller of the UAV moves at a high speed, the insect is hit by one or more of the propeller blades with great impact. Thereby, the insect is eliminated. Throughout this document, the terms insect elimination and insect repelling may be interchanged. Furthermore, due to the centrifugal force caused by the rotation of the propeller, a releasing of remains of the insect from the propeller is facilitated in order to keep the propeller clean.
The UAV may navigate through the airspace to hit the insect from any angle. For example, the UAV may navigate to the insect sideways. In an embodiment, use is made of an air stream in the airspace generated by the propeller. The airstream sucks air towards the propeller, generally at a top side of the UAV. Navigating the UAV to a position where the insect is subjected to the suction of the propeller will facilitate elimination: the suction effect provides that less accurate navigation of the UAV may suffice, as the air volume in the airspace where a notable amount of suction occurs may be larger than the volume in the airspace through which the propeller itself rotates. Thus, the controller may navigate the UAV to a position in the airspace below the insect, whereby the insect is subjected to the suction of air by the propeller, thus providing that the insect is sucked towards the propeller in order to be hit.
In an embodiment, the controller is further configured to
Thus, navigation may be facilitated in that the controller first determines a position below the insect, for example a position at a predetermined distance below the insect in order to get the insect to be subjected to the suction by the propeller, or a position in the horizontal plane in which the UAV is present at that moment in time, so as to enable moving to the position below the insect by means of a horizontal movement of the UAV. Having determined that position, the controller guides the UAV to the position below the insect (the elimination start position) and then guides the UAV upwardly towards the insect. In order to take account of the movements of the insect in the airspace, updated image data may be employed to iteratively correct the position of the UAV so that it remains below the insect. By the upward movement of the UAV, an upward thrust is generated by the propeller, which increases the suction of air towards the propeller and increases the rotational speed of the propeller, thereby enabling the insect to be eliminated more effectively.
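The two-phase guidance described above (move to the elimination start position below the insect, then ascend while iteratively re-centring) may be sketched as follows. The UAV interface, the callback, and all numeric parameters are hypothetical; this is an illustrative control loop, not the claimed controller.

```python
def guide_to_elimination(uav, get_insect_position, standoff=0.5, step=0.05, tol=0.02):
    """Guide the UAV below the insect, then ascend while correcting laterally.

    uav: hypothetical object with a .position (x, y, z) attribute and a
    .move_to(x, y, z) method (units: metres).
    get_insect_position: callable returning the insect's current (x, y, z),
    re-evaluated each iteration from updated image data.
    """
    ix, iy, iz = get_insect_position()
    # Phase 1: move to the elimination start position, a predetermined
    # distance (standoff) below the insect.
    uav.move_to(ix, iy, iz - standoff)
    # Phase 2: ascend towards the insect, iteratively re-centring below it
    # to account for the insect's movements through the airspace.
    while True:
        ix, iy, iz = get_insect_position()
        x, y, z = uav.position
        if iz - z <= tol:           # the propeller has reached the insect
            break
        uav.move_to(ix, iy, min(z + step, iz))
```

The small upward step per iteration stands in for the upward thrust phase; a real controller would additionally bound the loop and handle loss of tracking.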
In an embodiment, the camera is attached (e.g. connected, mounted, integrally mounted, etc.) to the UAV. Thereby, easy navigation may be provided: the closer the UAV gets to the insect, the closer the insect gets to the camera, and the more accurately the position of the insect may be derived from the image data, for example making use of the size of the insect in the image and the position of the insect in the image. Hence, navigation by the controller may be facilitated, allowing a controller with relatively low processing power to be employed. In case the camera is attached to the UAV, the controller may likewise be attached to or provided in the UAV, thus providing an integral solution.
In an embodiment, a field of view of the camera is vertically oriented upwards, wherein the controller is further configured to derive an elimination start position from a horizontal path calculated on the basis of the location of the image of the insect in the image data and a centre of the image data, and wherein the controller is configured to guide the UAV to navigate to the elimination start position by controlling the UAV to move according to the horizontal path. Thereby, the controller navigates the UAV to a position below the insect by guiding the UAV to a position where the insect gets to the centre of the image. Tracking of the movements of the insect may also be facilitated by iteratively determining a position of the insect in the image relative to the centre of the image, and guiding the UAV so that the insect moves towards the centre of the image.
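The derivation of the horizontal path from the offset between the insect's image and the image centre may be sketched as follows. The function name and the pixel-to-metre gain are hypothetical; the sketch only illustrates the centring principle for an upward-facing camera.

```python
def horizontal_correction(insect_px, image_shape, gain=0.01):
    """Map the insect's pixel offset from the image centre to a horizontal move.

    insect_px: (row, col) of the insect in the upward-facing camera image.
    image_shape: (height, width) of the image in pixels.
    gain: assumed scale from pixels to metres of horizontal motion.
    Returns (dx, dy): a horizontal step that moves the UAV so that the
    insect drifts towards the centre of the image.
    """
    cy, cx = image_shape[0] / 2.0, image_shape[1] / 2.0
    # The UAV moves in the direction of the insect's offset, so that on the
    # next frame the insect appears closer to the image centre.
    return ((insect_px[1] - cx) * gain, (insect_px[0] - cy) * gain)
```

Applying this correction on each new frame implements the iterative tracking described above: the correction shrinks to zero as the insect reaches the image centre, i.e. as the UAV comes to lie directly below it.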
Alternatively to the camera at the UAV, in another embodiment, the camera is positioned at a stationary location. The camera may survey the airspace in its entirety (or a part thereof), whereby both an insect as well as the UAV may be imaged by the camera. The controller may likewise be provided at the stationary location or may be provided in or attached to the UAV.
In an embodiment, the controller is further configured to:
In an embodiment, the controller is further configured to
For example, the distance of the UAV to the camera may be derived from the size of the UAV in the image data. Thereto, the controller may determine the size of the image of the UAV from the image data, and compare the size of the image of the UAV to a reference (such as a known size of the UAV, a look-up table listing size versus distance, a formula expressing distance as a function of size, etc.), so as to determine the distance of the UAV from the camera. The position of the UAV in the image may be employed by the controller to estimate the angle at which the UAV is viewed in respect of an optical axis of the camera. For example, in case the optical axis of the camera is oriented at an angle of 45 degrees upwardly, the position of the image of the UAV in the camera image in respect of the centre of the image as captured by the camera will indicate an angle in respect of the optical axis of the camera.
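One possible form of the size-to-distance formula and the position-to-angle estimate is the standard pinhole-camera model, sketched below. The focal length in pixels is an assumed calibration value, and both function names are illustrative.

```python
import math

def distance_from_image_size(image_size_px, real_size_m, focal_px):
    """Pinhole-camera estimate: distance = focal_length * real_size / image_size.

    image_size_px: apparent size of the UAV in the image, in pixels.
    real_size_m: known physical size of the UAV (the reference), in metres.
    focal_px: camera focal length expressed in pixels (assumed calibration).
    """
    return focal_px * real_size_m / image_size_px

def angle_from_centre(px_offset, focal_px):
    """Viewing angle relative to the optical axis, in degrees, derived from
    the pixel offset of the UAV's image with respect to the image centre."""
    return math.degrees(math.atan2(px_offset, focal_px))
```

For instance, a UAV of known size 0.2 m that appears 100 pixels wide through a lens with a 1000-pixel focal length would be estimated at 2 m from the camera; an offset equal to the focal length corresponds to a 45-degree viewing angle.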
In an embodiment, the UAV comprises recognition markers, such as LEDs, wherein the controller is configured to determine the image size of the UAV from the distance between the recognition markers, thereby allowing the UAV to be reliably detected in the image as well as providing a fast estimation of the distance of the UAV from the camera from the distance between the recognition markers.
In an embodiment, the controller is further configured to
As an alternative to the position determination from image size and image position, the camera may comprise a stereoscopic camera, and the image data may comprise stereoscopic image data, whereby the controller may be configured to derive the position of the UAV and the insect in the airspace from the stereoscopic image data.
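In the stereoscopic alternative, depth follows from the disparity between the left and right images via the standard triangulation relation, sketched below under assumed calibration values (function name and parameters are illustrative).

```python
def depth_from_disparity(disparity_px, baseline_m, focal_px):
    """Stereoscopic depth estimate: z = focal_length * baseline / disparity.

    disparity_px: horizontal pixel shift of the same object between the
    left and right images of the stereoscopic camera.
    baseline_m: distance between the two camera centres (assumed known).
    focal_px: focal length expressed in pixels (assumed calibration).
    """
    return focal_px * baseline_m / disparity_px
```

For example, with a 0.1 m baseline and a 1000-pixel focal length, a disparity of 50 pixels places the object at 2 m; applying this to both the insect and the UAV yields their positions in the airspace without relying on a known object size.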
In an embodiment, the stationary location comprises a charging station configured to charge the UAV. Thus, the UAV can be placed at a standby position at the charging station, where it is charged and from where the stationary camera observes the airspace. Charging may for example be wireless. The camera may also be employed to enable the controller to guide the UAV to the charging station based on the image data.
In an embodiment, the controller is configured to recognize the insect in the image data on the basis of at least one criterion from a size criterion, a shape criterion, a speed criterion and a colour criterion. By distinguishing the insect in the image data on the basis of one or more of size, shape, speed, and colour, a relatively easy and reliable insect detection may be provided. Also, a distinction can be made between different types of insects based on selection criteria such as colour, size, etc. Using at least one of the size criterion, colour criterion, shape criterion and speed criterion, different types of insects, such as a mosquito, a fly, a bee, a wasp, may be distinguished by the controller. Hence, the controller may operate the UAV to selectively act against certain types of insects, thus making the insect elimination system selective.
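A selective classification of insect types from such criteria may be sketched as follows. All thresholds and labels are illustrative placeholders, not calibrated values from the invention.

```python
def classify_insect(size_px, speed_px_s, mean_hue):
    """Toy classifier distinguishing insect types by size, speed and colour.

    size_px: apparent size of the insect in the image, in pixels.
    speed_px_s: apparent speed of the insect across the image, pixels/second.
    mean_hue: mean hue of the insect's pixels (0-179, OpenCV-style scale).
    Returns a label, or None when no criterion matches.
    """
    if size_px < 15 and speed_px_s > 40:
        return "mosquito"          # small and fast
    if size_px < 30 and speed_px_s > 20:
        return "fly"
    if 20 <= mean_hue <= 40 and size_px >= 30:
        return "wasp"              # larger, yellow-banded colouring
    return None
```

A controller using such a classifier could, for instance, dispatch the UAV only against mosquitoes while leaving bees undisturbed, which is the selectivity referred to above.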
In an embodiment, a sensitivity wavelength band of the camera extends into an infrared wavelength band, thus enabling the insect to be more reliably distinguished from any other object in the airspace based on the insect's emission or reflection of infrared radiation.
Furthermore, the infrared radiation is generally invisible to humans, which may enable a less obtrusive insect elimination system to be provided. The system may comprise an infrared light source, such as an infrared diode, positioned to irradiate into at least part of a field of view of the camera. When an insect is irradiated by the infrared radiation, an image of the insect may be captured by the camera at a high contrast. Hence, detection of the insect is made possible even at low ambient light conditions, such as at night.
According to a further aspect of the invention, there is provided a use of the insect elimination system according to the invention for insect repelling.
The insect elimination system may for example be employed to eliminate insects in agriculture, to eliminate insects in horticulture, such as in a greenhouse, to eliminate insects in a room, such as a bedroom in order to improve sleep quality, to reduce spreading of a disease via insects or for insecticide free insect repelling.
Further advantages, effects and features of the invention will follow from the below description and the appended drawing showing a non-limiting embodiment, wherein:
Throughout the figures, the same or similar items are denoted by the same reference numerals.
Having eliminated the insect, the image of the flying insect as obtained by the camera will have disappeared, causing the controller to detect that the observed part of the airspace is clear, and causing the controller to guide the UAV back to the standby/surveillance position.
An image of the upwardly oriented field of view of the camera is schematically depicted in
In either embodiment, the insect may be recognized using any suitable combination of size, speed, shape, and colour. Thus, an insect may be recognized by the controller as a small, moving object in the airspace exhibiting a correlation to a reference image of an insect and a colour in a colour band.
The insect elimination system may for example be employed to eliminate insects in agriculture, to eliminate insects in horticulture, such as in a greenhouse, to eliminate insects in a room, such as a bedroom, to reduce spreading of a disease via insects or for insecticide free insect repelling.
Number | Date | Country | Kind |
---|---|---|---|
2017984 | Dec 2016 | NL | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/NL2017/050834 | 12/12/2017 | WO | 00 |