The present disclosure relates to aerial analysis of a ground surface using a distance sensor of an unmanned aerial vehicle (UAV).
There has been considerable research activity towards utilizing unmanned vehicles, and in particular unmanned aerial vehicles (UAVs) such as drones, in various environments. In a certain environment, it may be necessary for a UAV to aerially investigate information regarding a ground surface. For example, if, during its flight, the UAV has lost its control signal, developed an engine problem, or run out of power, the UAV may attempt an emergency landing in an unknown environment where no previously designated landing zone exists, which attempt may be preceded by selecting a safe landing zone based on the ground surface information. Several studies have been directed towards employing an image sensor, such as a camera attached to an aerial vehicle, to aerially capture an image of a ground surface and classifying the captured image with Artificial Intelligence (AI) or Machine Learning (ML). It has also been proposed to use a distance sensor with a wide field of view (FoV) of, e.g., 360 degrees, for downward inspection of a ground surface.
Disclosed herein is aerial analysis of a ground surface using a distance sensor of a UAV.
In an example, a method for aerial analysis of a ground surface includes: controlling a distance sensor of an unmanned aerial vehicle (UAV) to be successively oriented in a plurality of sensing directions towards the ground surface; and analyzing the ground surface based on distance measurement data indicative of distances measured by the distance sensor in the plurality of sensing directions, the plurality of sensing directions corresponding to their respective points defining a planar trajectory, the planar trajectory including a plurality of loops winding an identical inner point.
This Summary is provided to introduce a few aspects in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to determine the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that offer any or all advantages discussed herein.
In accordance with the present disclosure, it is possible to aerially analyze a ground surface using a distance sensor included in a UAV.
In accordance with the present disclosure, it is possible to figure out an undulation of a ground surface based on distance measurement data provided from a distance sensor of a UAV.
In accordance with the present disclosure, when attempting an emergency landing in an unknown environment, a UAV may determine a safe landing zone in a rapid and accurate manner while refraining from or minimizing movement of the UAV.
Various terms used in the present disclosure are chosen from commonly used terms in consideration of their functions herein, and may be appreciated differently depending on the intention of a person skilled in the art, a precedent case, or an emerging new technology. In specific instances, some terms are ascribed their meanings as set forth in the detailed description. Accordingly, the terms used herein are to be defined consistently with their meanings in the context of the present disclosure, rather than simply by their names.
The terms “comprising,” “including,” “having,” etc. are used herein when specifying the presence of the elements listed thereafter, e.g., certain features, numbers, steps, operations, constituent elements, information, or a combination thereof. Unless otherwise indicated, these terms and variations thereof are not meant to exclude the presence or addition of other elements.
As used herein, the terms “first,” “second,” and so forth are meant to identify several similar elements. Unless otherwise specified, such terms are not intended to impose limitations, e.g., a particular order of these elements or of their use, but rather are used merely for referring to multiple elements separately. For instance, an element may be referred to in an example with the term “first” while the same element may be referred to in another example with a different ordinal number such as “second” or “third.” In such examples, these terms are not to limit the scope of the present disclosure. Also, the use of the term “and/or” in a list of multiple elements is inclusive of all possible combinations of the listed items, including any one or plurality of the items. Further, singular expressions include plural expressions unless expressly stated otherwise.
Certain examples of the present disclosure will now be described in detail with reference to the accompanying drawings. The present disclosure may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these examples are given in order to provide a better understanding of the scope of the present disclosure.
The example navigation and control system 100 of the accompanying figure may be included in the UAV 10 to support navigation and control thereof.
In the illustrated example, the navigation and control system 100 includes a sensing unit 110, an actuator 120, a storage unit 130, and a processing unit 150.
In the illustrated example, the sensing unit 110 includes a distance sensor 115. The distance sensor 115 measures a distance from the distance sensor 115 to a target object in a given sensing direction. For example, the distance sensor 115 may include a non-image sensor, such as a radar sensor, an ultrasonic sensor, a light detection and ranging (LiDAR) sensor, or the like, which may provide measurement data less susceptible to an environmental factor, e.g., smoke and fog, as compared to an image sensor.
Accordingly, the processing unit 150 may acquire, from the distance sensor 115, distance measurement data indicative of the distance measured in the sensing direction. The term “sensing direction” is intended to mean a main direction for measurement by an employed sensor, e.g., a direction of radiation of a detection signal from the distance sensor 115, a direction of reflection of the detection signal on a target object, or a combination thereof. For example, a heading direction of the distance sensor 115 may point from a rear end of the distance sensor 115 to a front end of the distance sensor 115, e.g., on which there are located a transmitter to radiate the detection signal and a receiver to receive the reflected signal, and may be given as a sensing direction.
In some example implementations, the distance sensor 115 may be mounted on the UAV 10 to have limited attitude angles, e.g., a combination of a yaw angle and a pitch angle. For example, the front end of the distance sensor 115 is capable of a circular motion with the rear end thereof fastened to a bottom surface of the UAV 10, and thus the heading direction of the distance sensor 115 may be in a downward direction, either vertically downward or slantingly downward, to a limited extent. In this example, an orientation of the distance sensor 115 may be allowed within a limited direction range.
In the illustrated example, the actuator 120 is operable to adjust an orientation of the distance sensor 115.
In some example implementations, the distance sensor 115 may be oriented in a certain direction by the actuator 120 in accordance with a control input provided from the processing unit 150 to the actuator 120. Such orientation of the distance sensor 115 may be for the distance sensor 115 to perform distance measurement at least in that direction. Then, the distance sensor 115 may be referred to as being oriented in a given sensing direction. In this way, the distance sensor 115 may be controlled to be successively oriented in several sensing directions, for example, such that its heading direction follows a certain trajectory over a time period.
The distance sensor 115 may detect, and measure a distance to, a target object only when the target object is present in a Field of View (FoV) of the distance sensor 115, i.e., the whole span within which target objects can be observed. As shown in the corresponding figure, the FoV of the distance sensor 115 may be defined by a limited horizontal angle of view and a limited vertical angle of view.
In some example implementations, the sensing unit 110 may further include an additional sensor to provide other measurement data. For example, such sensor may be an inertial sensor, e.g., a gyroscope, and thus the processing unit 150 may calculate a location, a velocity and/or an attitude of the UAV 10 using measurement data, e.g., angular velocity measurement data, that is output from this sensor.
In the illustrated example, the storage unit 130 stores data and instructions for use by other components of the navigation and control system 100, e.g., the processing unit 150.
In some example implementations, the storage unit 130 may store a map that is constructed through a ground surface analysis process as described below, such as a topographic map, a slope rate map, and a roughness map, which may be defined for a plurality of points in a given target environment, e.g., a plurality of coordinate points of a ground surface of an unknown environment. For example, the topographic map may be constructed to represent an undulation of a ground surface of the target environment, the slope rate map may be constructed to represent a local change rate feature of the undulation, and the roughness map may represent another local change rate feature of the undulation, all of which maps will be described in detail below.
In the illustrated example, the processing unit 150 controls the overall operation of the navigation and control system 100 and processes measurement data provided from the sensing unit 110.
In some example implementations, the processing unit 150 may perform operations for aerial analysis of a ground surface, which operations may involve controlling the distance sensor 115 of the UAV 10 to be successively oriented in several sensing directions and analyzing the ground surface based on distance measurement data indicative of distances respectively measured in the sensing directions.
An example process for such ground surface analysis is now described with reference to the accompanying figures.
First, the processing unit 150 may control the distance sensor 115 of the UAV 10 to be successively oriented in a plurality of sensing directions towards the ground surface, where these sensing directions correspond to their respective points defining a planar trajectory. For example, as shown in the corresponding figure, the plurality of sensing directions may point to respective points on a plane 300 below the UAV 10, and those points may define a planar trajectory 390 on the plane 300.
In some example implementations, the planar trajectory 390 on the plane 300 may include a plurality of loops winding an identical inner point, e.g., a point at which the vertically downward direction intersects the plane 300. Therefore, each of the plurality of loops may wind an inner one of the plurality of loops, or may be wound by an outer one of the plurality of loops, or both. Further, each loop may be an open-ended loop, e.g., a C-shaped loop, or may be a closed loop, e.g., an O-shaped loop. In a particular example, the planar trajectory 390 may be a spiral trajectory, as illustrated.
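By way of a non-limiting illustration, such a trajectory of loops winding a common inner point may be sketched as an Archimedean spiral sampled on the plane 300. The function and parameter names below are illustrative assumptions, not part of the disclosure:

```python
import math

def spiral_trajectory_points(num_points, turns, max_radius):
    """Generate points of an Archimedean spiral winding a common inner
    point (the origin), from the center outward.

    Hypothetical helper: the disclosure only requires that the planar
    trajectory's points wind an identical inner point; a spiral is one
    particular example of such a trajectory.
    """
    points = []
    for k in range(num_points):
        t = k / (num_points - 1) if num_points > 1 else 0.0
        r = max_radius * t                 # radius grows linearly outward
        theta = 2.0 * math.pi * turns * t  # angle sweeps `turns` full loops
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points
```

Each returned point may then be taken as the target of one sensing direction, so that the heading of the distance sensor 115 follows the trajectory across the successive orientations.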
With the distance sensor 115 having a limited FoV as described above, successively orienting the distance sensor 115 in the plurality of sensing directions along such a planar trajectory may allow the distance sensor 115 to cover a wide portion of the ground surface while refraining from or minimizing movement of the UAV 10.
In some example implementations, in order for the distance sensor 115 to be successively oriented in the plurality of sensing directions, the processing unit 150 may determine a subsequent sensing direction based on a region into which, in a current sensing direction, the FoV of the distance sensor 115 is projected on a reference plane, e.g., the plane 300.
For example, the plurality of sensing directions in which the distance sensor 115 is successively oriented may be determined such that a region into which, in a subsequent one of these sensing directions, the FoV of the distance sensor 115 is projected on the reference plane has a larger area than that of a region into which, in a current one of these sensing directions, the FoV of the distance sensor 115 is projected on the reference plane. In the illustrated example, a region 320 into which the FoV is projected in a sensing direction pointing to a point 302 may have a larger area than that of a region 310 into which the FoV is projected in a preceding sensing direction pointing to a point 301.
Additionally or alternatively, the plurality of sensing directions in which the distance sensor 115 is successively oriented may be determined such that the region into which, in the current one of these sensing directions, the FoV of the distance sensor 115 is projected on the reference plane partially overlaps the region into which, in the subsequent one of these sensing directions, the FoV of the distance sensor 115 is projected on the reference plane, e.g., such that the overlapping part accounts for not less than a lower limit percentage and not greater than an upper limit percentage of each of the projection regions. As illustrated, for example, the region 310 may partially overlap the region 320 within these percentage limits.
For the sake of convenience, it is assumed that the region 200 is a region into which the FoV of the distance sensor 115 is projected on a plane that is distant from the distance sensor 115 by a unit distance, so that h is, for example, one (1) meter long in the corresponding figure.
The four points P1, P2, P3 and P4 on the boundary of the region 200 may be respectively projected into four corresponding points on a boundary of the region 500. The area of the region 500 may be computed based on coordinates of such corresponding points. For example, the coordinates of each of the corresponding points on the boundary of the region 500 may be calculated using a planar projection shadow equation. According to this equation, letting the location of the distance sensor 115 be the point l=(lx, ly, lz), each boundary point p=(px, py, pz) of the region 200 may be projected, along the ray from l through p, onto the point at which that ray intersects the reference plane.
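By way of a non-limiting illustration, the projection of boundary points onto the reference plane and the resulting region area may be sketched as follows. This is only a sketch consistent with the description above, assuming the reference plane is z = 0 and using the shoelace formula for the polygon area; function names are illustrative assumptions:

```python
def project_to_ground(l, p):
    """Project point p through the sensor location l onto the plane
    z = 0, along the ray from l through p (a planar 'shadow'
    projection). Assumes l is strictly above the plane and p is
    strictly below l, so the ray does meet the plane."""
    lx, ly, lz = l
    px, py, pz = p
    t = lz / (lz - pz)  # ray parameter at which z reaches 0
    return (lx + t * (px - lx), ly + t * (py - ly))

def polygon_area(vertices):
    """Shoelace formula for the area of a simple planar polygon whose
    vertices are given in order around the boundary."""
    n = len(vertices)
    s = 0.0
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0
```

For a sensor hovering at altitude 10 with FoV corner points at a unit distance below it, the projected region is a similar rectangle scaled by the ratio of the altitudes, and its area follows directly from the projected corner coordinates.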
In some example implementations, the processing unit 150 may estimate, based on an area of each of a plurality of regions into which, in a plurality of respective sensing directions, the FoV of the distance sensor 115 is projected on the reference plane, a possible area of a to-be-finally-analyzed portion of the ground surface, i.e., a union of all of the projection regions. For example, the processing unit 150 may estimate a maximum possible area and/or a minimum possible area of the union of these regions based on an area of each of a plurality of regions, including the regions 310, 320, 330, and 340, into which the FoV of the distance sensor 115 is projected on the plane 300 in a plurality of respective sensing directions, including sensing directions pointing to the points 301, 302, 303, and 304, and possibly based further on the above-described upper limit and lower limit percentages.
In some example implementations, if the estimated area is smaller than a threshold area value, the processing unit 150 may control the distance sensor 115 to be oriented in an additional sensing direction subsequent to the successive orientations in the above-described sensing directions. Accordingly, it is made possible to analyze a ground surface portion with a desired sufficiently wide area while gradually adding sensing directions in which the distance sensor 115 is to be oriented.
In some example implementations, if the estimated area is larger than or equal to the threshold area value, the processing unit 150 may finish collecting the distance measurement data indicative of distances measured by the distance sensor 115 in the respective sensing directions towards the ground surface and may analyze the ground surface based on the collected distance measurement data. Such analysis may include constructing a map of the ground surface as follows.
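By way of a non-limiting illustration, the estimation of the possible area of the union of the projection regions may be sketched as follows. The sketch assumes, as a hypothetical simplification, that only consecutive regions overlap and that each pairwise overlap covers between the lower and upper limit percentages of each of the two regions involved:

```python
def union_area_bounds(areas, lower_pct, upper_pct):
    """Bound the possible area of the union of successive FoV
    projection regions.

    Hypothetical simplification: only consecutive regions overlap, and
    each pairwise overlap accounts for at least lower_pct and at most
    upper_pct of each of the two regions.

    Returns (min_possible, max_possible)."""
    total = sum(areas)
    min_overlap = 0.0
    max_overlap = 0.0
    for a, b in zip(areas, areas[1:]):
        # overlap >= lower_pct of each region -> at least lower_pct * max(a, b)
        min_overlap += lower_pct * max(a, b)
        # overlap <= upper_pct of each region -> at most upper_pct * min(a, b)
        max_overlap += upper_pct * min(a, b)
    return total - max_overlap, total - min_overlap
```

The processing unit 150 may then compare, e.g., the maximum possible union area against the threshold area value to decide whether an additional sensing direction is needed.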
First, the processing unit 150 may receive, from the distance sensor 115, the distance measurement data indicative of the distances measured in the sensing directions towards the ground surface. Therefore, based on the sensing directions of the distance sensor 115, as well as the distance measurement data measured in those directions, the processing unit 150 may map, onto a plurality of coordinate points of the ground surface, their respective altitude values indicative of an undulation of the ground surface, for example, an altitude value Hi,j calculated for a coordinate point (i,j). As such, the processing unit 150 may construct a topographic map representing such mapping and then select, based on the altitude values, a particular one of the plurality of coordinate points as, e.g., a coordinate point corresponding to a flat place on the ground surface. By way of example, a topographic map constructed in this manner is shown in the corresponding figure.
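By way of a non-limiting illustration, the construction of a topographic map from sensing directions and measured distances may be sketched as follows. The sketch assumes unit-vector sensing directions in a ground-referenced frame and a simple grid binning; in practice the UAV's attitude and position estimates would also be fused, and all names are illustrative assumptions:

```python
import math

def build_topographic_map(sensor_pos, directions, distances, cell=1.0):
    """Map measured hit points onto a grid of altitude values H[(i, j)].

    `directions` are unit vectors of the sensing directions and
    `distances` the ranges measured along them. Each hit point is the
    sensor position plus distance times direction; its altitude (z) is
    binned into the grid cell containing its (x, y) coordinates."""
    H = {}
    sx, sy, sz = sensor_pos
    for (dx, dy, dz), r in zip(directions, distances):
        x, y, z = sx + r * dx, sy + r * dy, sz + r * dz  # hit point
        i, j = int(math.floor(x / cell)), int(math.floor(y / cell))
        # keep the highest observed altitude per cell (conservative)
        H[(i, j)] = max(H.get((i, j), z), z)
    return H
```

For example, a vertically downward measurement of 10 m from 10 m altitude yields an altitude of 0 at the cell below the sensor, while a slanted measurement along (0.6, 0, −0.8) of the same range hits a point 2 m high, 6 m away.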
Then, by using the topographic map, the processing unit 150 may map, onto the coordinate points represented in the topographic map, their respective slope rate values indicative of a local change rate feature of the undulation of the ground surface. As such, the processing unit 150 may construct a slope rate map representing such mapping and then select, based on the slope rate values, a particular one of the plurality of coordinate points as, e.g., the coordinate point corresponding to the flat place on the ground surface, as described above.
Specifically, for each of the coordinate points, the slope rate value may be calculated based on the altitude value mapped to that coordinate point and the altitude value mapped to each of a first limited number of, e.g., three, nearby coordinate points. For example, as represented by the equation below, the processing unit 150 may calculate an inclination SXi,j between a coordinate point (i,j) and a nearby coordinate point (i+1,j), an inclination SYi,j between the coordinate point (i,j) and a nearby coordinate point (i,j+1), and an inclination SDi,j between the coordinate point (i,j) and a nearby coordinate point (i+1,j+1), and then calculate, for the coordinate point (i,j), the slope rate value SlopeRatei,j to be a norm of the aforementioned three inclination values.
SXi,j=(Hi+1,j−Hi,j)/Δx, SYi,j=(Hi,j+1−Hi,j)/Δy, SDi,j=(Hi+1,j+1−Hi,j)/Δd, SlopeRatei,j=√(SXi,j²+SYi,j²+SDi,j²)  Eq. 3
where Δx, Δy, and Δd are a distance between the coordinate point (i,j) and the coordinate point (i+1,j), a distance between the coordinate point (i,j) and the coordinate point (i,j+1), and a distance between the coordinate point (i,j) and the coordinate point (i+1,j+1), respectively. By way of example, a slope rate map constructed in this manner is shown in the corresponding figure.
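By way of a non-limiting illustration, the slope rate computation described above, i.e., the norm of the three inclinations towards the nearby coordinate points (i+1,j), (i,j+1), and (i+1,j+1), may be sketched as follows (function and parameter names are illustrative assumptions):

```python
import math

def slope_rate(H, i, j, dx=1.0, dy=1.0):
    """Slope rate at coordinate point (i, j): the norm of the
    inclinations towards the three nearby points (i+1, j), (i, j+1),
    and (i+1, j+1), given a topographic map H[(i, j)] -> altitude."""
    dd = math.hypot(dx, dy)  # diagonal spacing (the distance Δd)
    sx = (H[(i + 1, j)] - H[(i, j)]) / dx
    sy = (H[(i, j + 1)] - H[(i, j)]) / dy
    sd = (H[(i + 1, j + 1)] - H[(i, j)]) / dd
    return math.sqrt(sx * sx + sy * sy + sd * sd)
```

On a flat patch the slope rate is zero; tilting the two points in the +x direction by one unit yields inclinations of 1, 0, and 1/√2, whose norm is √1.5.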
By using the topographic map instead of, or in addition to, the mapping of the slope rate values, the processing unit 150 may map, onto the coordinate points represented in the topographic map, their respective roughness values indicative of another local change rate feature of the undulation of the ground surface. As such, the processing unit 150 may construct a roughness map representing such mapping and then select, based on the roughness values, a particular one of the plurality of coordinate points as, e.g., the coordinate point corresponding to the flat place on the ground surface, as described above.
Specifically, for each of the coordinate points, the roughness value may be calculated based on the altitude value mapped to that coordinate point and the altitude value mapped to each of a second limited number, e.g., greater than the above-mentioned first number, of nearby coordinate points. For example, for a sample region defined with several, e.g., a number N×N of, adjacent coordinate points of the ground surface, the processing unit 150 may calculate a difference between the altitude value mapped to a coordinate point (i,j) in the sample region and the altitude value mapped to each of all other coordinate points (m,n) in the sample region, calculate a distance dm,n,i,j between those two coordinate points, and then calculate the roughness value Roughnessi,j for the coordinate point (i,j), as follows:
Roughnessi,j=Σm=1NΣn=1N√((Hm,n−Hi,j)²)/dm,n,i,j  Eq. 4
As such, this equation contemplates that as the distance dm,n,i,j becomes greater, the altitude difference will have less of an effect on the local surface roughness at the coordinate point (i,j). By way of example, a roughness map constructed in this manner is shown in the corresponding figure.
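By way of a non-limiting illustration, the roughness computation of Eq. 4 may be sketched as follows, summing over the other coordinate points of the sample region as described above (the point (i,j) itself is skipped, since its term would involve a zero distance; names are illustrative assumptions):

```python
import math

def roughness(H, i, j, region):
    """Roughness at coordinate point (i, j): distance-weighted sum of
    absolute altitude differences over all other points of the sample
    region, given a topographic map H[(i, j)] -> altitude."""
    total = 0.0
    for (m, n) in region:
        if (m, n) == (i, j):
            continue  # skip the zero-distance term for the point itself
        d = math.hypot(m - i, n - j)  # grid distance d_{m,n,i,j}
        total += abs(H[(m, n)] - H[(i, j)]) / d
    return total
```

A flat sample region yields a roughness of zero; raising the two adjacent neighbors of a corner point by one unit each contributes 1/1 twice, for a roughness of 2.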
As mentioned earlier, each of the slope rate and the roughness values defined for a coordinate point is indicative of a local change rate feature of the undulation of the ground surface. Further, the smaller these two values are, the more likely it is that the coordinate point corresponds to a flatter place of the ground surface. In addition, the slope rate may be obtained in a relatively simpler and less time-consuming fashion, while the roughness may indicate a degree of flatness of the ground surface in a relatively more accurate manner. In a particular example, the processing unit 150 may determine a flat place of the ground surface in an efficient manner using a proper combination of the scheme of calculation of slope rate values and the scheme of calculation of roughness values, as described below with respect to the example process 700.
In the following, an example process 700 for determining a landing zone of the UAV 10 is described with reference to the corresponding figure.
In an operation 710, a topographic map is constructed based on distance measurement data collected from the distance sensor 115 of the UAV 10. As mentioned earlier, the distance measurement data may be indicative of distances respectively measured by the distance sensor 115 in a plurality of sensing directions towards a ground surface. Accordingly, altitude values at a plurality of respective coordinate points of the ground surface may be calculated and the calculated altitude values may be included in the topographic map.
In an operation 720, a slope rate map is constructed based on the constructed topographic map. As mentioned earlier, a slope rate value at each of the coordinate points may be calculated based on the altitude value at that coordinate point and the altitude value at each of a limited number of coordinate points lying near that coordinate point, e.g., in accordance with Eq. 3, and the calculated slope rate value may be included in the slope rate map.
In an operation 730, a candidate coordinate point of the plurality of coordinate points of the ground surface and a corresponding sample region of the ground surface are identified based on the constructed slope rate map, where the candidate coordinate point and another limited number of coordinate points lying near the candidate coordinate point are all coordinate points in this sample region. For example, one of the plurality of coordinate points that satisfies a certain decision criterion, e.g., a coordinate point that has a slope rate value less than a threshold slope rate value, in other words, a coordinate point that has a slope rate value whose reciprocal is greater than a threshold value, may be identified as the candidate coordinate point. Subsequently, the sample region may be identified with the identified candidate coordinate point centered therein, which sample region may be regarded as a group of a number of, e.g., N×N, coordinate points.
In an operation 740, a roughness map is constructed based on the topographic map for the identified sample region. For example, a roughness value at each of the coordinate points in the sample region may be calculated based on the altitude value at that coordinate point and the altitude value at each of the other coordinate points in the sample region, e.g., in accordance with Eq. 4, and the calculated roughness value may be included in the roughness map.
As long as there exists another coordinate point that satisfies the aforementioned decision criterion, the operations 730 and 740 may be repeated. However, even with a candidate coordinate point identified, if its corresponding sample region cannot be defined entirely with coordinate points represented in the topographic map, the candidate coordinate point may be discarded and the roughness map of that sample region may not be constructed.
In an operation 750, a landing zone of the UAV 10 is determined based on the constructed roughness map. For example, for each identified sample region, an average of the roughness values mapped to their respective coordinate points in that region may be computed. Then, the landing zone of the UAV 10 may be selected to correspond to a sample region having a minimum average value, e.g., the region surrounded by a thick border in the topographic map shown in the corresponding figure.
As such, in accordance with the example process 700, it is possible to: select a number of coordinate points that are more than likely to correspond to a landing zone of the UAV 10 by using slope rate values that may be calculated in a simple manner (i.e., rapid but rough analysis of the ground surface); perform an evaluation of a sample region taken with each selected coordinate point centered therein by using roughness values that are required to be calculated in a relatively complex manner (i.e., more precise analysis of the selected region of the ground surface); and thereby finally determine the landing zone of the UAV 10. In this way, it is possible to efficiently select an optimal flat region on which the UAV 10 may safely land even in any environment, e.g., an unknown environment.
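By way of a non-limiting illustration, the coarse-then-fine selection of the example process 700 may be sketched end to end as follows. The sketch assumes a square N×N sample region centered on each candidate, unit grid spacing, and a slope-rate threshold as the decision criterion; all names and parameters are illustrative assumptions:

```python
import math

def select_landing_cell(H, slope_threshold, n=3):
    """Sketch of the example process 700: given a topographic map
    H[(i, j)] -> altitude (operation 710), screen candidate points by
    slope rate (720/730), evaluate an n x n sample region around each
    candidate by roughness (740), and return the center of the region
    with the minimum average roughness (750)."""
    def slope(i, j):
        sx = H[(i + 1, j)] - H[(i, j)]
        sy = H[(i, j + 1)] - H[(i, j)]
        sd = (H[(i + 1, j + 1)] - H[(i, j)]) / math.sqrt(2.0)
        return math.sqrt(sx * sx + sy * sy + sd * sd)

    def rough(i, j, region):
        return sum(abs(H[p] - H[(i, j)]) / math.hypot(p[0] - i, p[1] - j)
                   for p in region if p != (i, j))

    best = None
    half = n // 2
    for (i, j) in H:
        # slope needs the three nearby points; skip border cells
        if not all((i + di, j + dj) in H for di in (0, 1) for dj in (0, 1)):
            continue
        if slope(i, j) >= slope_threshold:
            continue  # rapid but rough screening by slope rate
        region = [(i + di, j + dj)
                  for di in range(-half, half + 1)
                  for dj in range(-half, half + 1)]
        # discard candidates whose sample region leaves the map
        if not all(p in H for p in region):
            continue
        # more precise evaluation: average roughness over the region
        avg = sum(rough(*p, region) for p in region) / (n * n)
        if best is None or avg < best[0]:
            best = (avg, (i, j))
    return None if best is None else best[1]
```

On a flat map every interior cell qualifies and the first one encountered wins; introducing a bump steers the selection towards the nearest fully flat sample region.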
The following are various examples pertaining to aerial analysis of a ground surface using a distance sensor of a UAV.
In Example 1, a method for aerial analysis of a ground surface includes: controlling a distance sensor of an unmanned aerial vehicle (UAV) to be successively oriented in a plurality of sensing directions towards the ground surface; and analyzing the ground surface based on distance measurement data indicative of distances respectively measured by the distance sensor in the plurality of sensing directions.
Example 2 includes the subject matter of Example 1, wherein the method is performed during hovering of the UAV above the ground surface.
Example 3 includes the subject matter of Example 1 or 2, wherein the plurality of sensing directions includes one or more slantingly downward directions.
Example 4 includes the subject matter of Example 3, wherein the plurality of sensing directions further includes one vertically downward direction.
Example 5 includes the subject matter of any of Examples 1 to 4, wherein the plurality of sensing directions corresponds to their respective points defining a planar trajectory, wherein the planar trajectory includes a plurality of loops winding an identical inner point.
Example 6 includes the subject matter of Example 5, wherein controlling the distance sensor includes controlling the distance sensor such that a heading direction of the distance sensor follows the planar trajectory across the successive orientations.
Example 7 includes the subject matter of Example 5 or 6, wherein the plurality of loops is a plurality of open-ended loops, is a plurality of closed loops, or is one or more open-ended loops and one or more closed loops.
Example 8 includes the subject matter of any of Examples 5 to 7, wherein the planar trajectory is a spiral trajectory.
Example 9 includes the subject matter of any of Examples 1 to 8, wherein controlling the distance sensor includes determining a subsequent one of the plurality of sensing directions based on a region into which, in a current one of the plurality of sensing directions, a field of view (FoV) of the distance sensor is projected on a reference plane.
Example 10 includes the subject matter of Example 9, wherein the determination is made such that the region overlaps another region into which, in the subsequent direction, the FoV is projected on the reference plane and/or such that the other region has a larger area than that of the region.
Example 11 includes the subject matter of Example 9 or 10, wherein the method further includes controlling the distance sensor to be oriented in an additional sensing direction subsequent to the successive orientations if a possible area of a union of a plurality of regions into which, in respective ones of the plurality of sensing directions, the FoV is projected on the reference plane is less than a threshold value, wherein the ground surface is analyzed based further on additional distance measurement data received from the distance sensor in the additional sensing direction.
Example 12 includes the subject matter of any of Examples 9 to 11, wherein the FoV is defined by a limited horizontal angle of view and a limited vertical angle of view.
Example 13 includes the subject matter of any of Examples 1 to 12, wherein the UAV is a rotorcraft drone.
Example 14 includes the subject matter of any of Examples 1 to 13, wherein the UAV is a vertical take-off and landing (VTOL) UAV.
Example 15 includes the subject matter of any of Examples 1 to 14, wherein analyzing the ground surface includes: calculating, based on the distance measurement data, altitude values at a plurality of respective coordinate points of the ground surface; and selecting, based on the altitude values, a particular one of the plurality of coordinate points.
Example 16 includes the subject matter of Example 15, wherein selecting the particular coordinate point includes: calculating, based on the altitude values, slope rate values at the respective coordinate points; and selecting, based on the slope rate values, the particular coordinate point.
Example 17 includes the subject matter of Example 16, wherein selecting, based on the slope rate values, the particular coordinate point includes: identifying, based on the slope rate values, a candidate coordinate point of the plurality of coordinate points and a corresponding sample region of the ground surface, the candidate coordinate point and another limited number of nearby ones of the plurality of coordinate points being all coordinate points in the corresponding sample region; calculating, based on the altitude values at respective ones of all of the coordinate points in the corresponding sample region, roughness values at the respective coordinate points in the corresponding sample region; and selecting, based on the roughness values, the particular coordinate point.
Example 18 includes the subject matter of any of Examples 15 to 17, wherein the particular coordinate point is selected as a coordinate point corresponding to a landing zone of the UAV.
In Example 19, there is provided a computer-readable storage medium having stored therein computer-executable instructions that when executed by a computer processor, cause the computer processor to perform the method recited in any of Examples 1 to 18.
In Example 20, a computing apparatus includes: a processor; and a memory encoded with a set of computer program instructions executable by the processor to perform the method recited in any of Examples 1 to 18.
In Example 21, an unmanned aerial vehicle (UAV) includes: a distance sensor; an actuator; and a processing unit to perform operations for aerial analysis of a ground surface, the operations including: causing the actuator to successively orient the distance sensor in a plurality of sensing directions towards the ground surface; and analyzing the ground surface based on distance measurement data indicative of distances respectively measured by the distance sensor in the plurality of sensing directions.
Example 22 includes the subject matter of Example 21, wherein the operations are performed during hovering of the UAV above the ground surface.
Example 23 includes the subject matter of Example 21 or 22, wherein the plurality of sensing directions includes one or more slantingly downward directions.
Example 24 includes the subject matter of Example 23, wherein the plurality of sensing directions further includes one vertically downward direction.
Example 25 includes the subject matter of any of Examples 21 to 24, wherein the plurality of sensing directions corresponds to their respective points defining a planar trajectory, wherein the planar trajectory includes a plurality of loops winding an identical inner point.
Example 26 includes the subject matter of Example 25, wherein causing the actuator to successively orient the distance sensor in the plurality of sensing directions includes causing the actuator to orient the distance sensor such that a heading direction of the distance sensor follows the planar trajectory across the successive orientations.
Example 27 includes the subject matter of Example 25 or 26, wherein the plurality of loops is a plurality of open-ended loops, is a plurality of closed loops, or is one or more open-ended loops and one or more closed loops.
Example 28 includes the subject matter of any of Examples 25 to 27, wherein the planar trajectory is a spiral trajectory.
Example 29 includes the subject matter of any of Examples 21 to 28, wherein causing the actuator to successively orient the distance sensor in the plurality of sensing directions includes determining a subsequent one of the plurality of sensing directions based on a region into which, in a current one of the plurality of sensing directions, a field of view (FoV) of the distance sensor is projected on a reference plane.
Example 30 includes the subject matter of Example 29, wherein the determination is made such that the region overlaps another region into which, in the subsequent direction, the FoV is projected on the reference plane and/or such that the other region has a larger area than that of the region.
Example 31 includes the subject matter of Example 29 or 30, wherein the processing unit is further to cause the actuator to orient the distance sensor in an additional sensing direction subsequent to the successive orientations if a possible area of a union of a plurality of regions into which, in respective ones of the plurality of sensing directions, the FoV is projected on the reference plane is less than a threshold value, wherein the ground surface is analyzed based further on additional distance measurement data received from the distance sensor in the additional sensing direction.
Example 32 includes the subject matter of any of Examples 29 to 31, wherein the FoV is defined by a limited horizontal angle of view and a limited vertical angle of view.
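As a simplified, one-dimensional illustration of choosing a subsequent sensing direction from the projected FoV region, the following Python sketch models the radial ground interval covered by a limited vertical angle of view and picks the next, more slanted tilt so that consecutive footprints overlap. This is a sketch under stated assumptions only; the names, the overlap fraction, and the one-dimensional simplification are hypothetical.

```python
import math

def footprint_interval(altitude, tilt, half_angle):
    """Radial ground interval covered by a vertical FoV of 2*half_angle
    when the sensor is tilted `tilt` radians from straight down."""
    near = altitude * math.tan(max(tilt - half_angle, 0.0))
    far = altitude * math.tan(tilt + half_angle)
    return near, far

def next_tilt(altitude, tilt, half_angle, overlap=0.2):
    """Pick the next (more slanted) tilt so the new footprint's near edge
    overlaps the current footprint by `overlap` of the current length."""
    near, far = footprint_interval(altitude, tilt, half_angle)
    target_near = far - overlap * (far - near)
    # Invert near = altitude * tan(next_tilt - half_angle).
    return math.atan(target_near / altitude) + half_angle
```

Because the tangent grows convexly with the tilt angle, each subsequent footprint is larger than the previous one, matching the behavior described in Example 30.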
Example 33 includes the subject matter of any of Examples 21 to 32, wherein the UAV is a rotorcraft drone.
Example 34 includes the subject matter of any of Examples 21 to 33, wherein the UAV is a vertical take-off and landing (VTOL) UAV.
Example 35 includes the subject matter of any of Examples 21 to 34, wherein analyzing the ground surface includes: calculating, based on the distance measurement data, altitude values at a plurality of respective coordinate points of the ground surface; and selecting, based on the altitude values, a particular one of the plurality of coordinate points.
Example 36 includes the subject matter of Example 35, wherein selecting the particular coordinate point includes: calculating, based on the altitude values, slope rate values at the respective coordinate points; and selecting, based on the slope rate values, the particular coordinate point.
Example 37 includes the subject matter of Example 36, wherein selecting, based on the slope rate values, the particular coordinate point includes: identifying, based on the slope rate values, a candidate coordinate point of the plurality of coordinate points and a corresponding sample region of the ground surface, the candidate coordinate point and a limited number of nearby ones of the plurality of coordinate points being all coordinate points in the corresponding sample region; calculating, based on the altitude values at respective ones of all of the coordinate points in the corresponding sample region, roughness values at the respective coordinate points in the corresponding sample region; and selecting, based on the roughness values, the particular coordinate point.
Example 38 includes the subject matter of any of Examples 35 to 37, wherein the particular coordinate point is selected as a coordinate point corresponding to a landing zone of the UAV.
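The analysis flow of Examples 35 to 38 may be illustrated, without limitation, by the following Python sketch: slope rates are derived from altitude differences to neighboring coordinate points, low-slope candidates are kept, a roughness value (altitude variance over a small sample region) is computed for each candidate, and the least-rough candidate is selected as the landing-zone coordinate point. All names, the 4-neighbour slope measure, the 3x3 sample region, and the thresholds are hypothetical choices for illustration.

```python
def select_landing_point(alt, spacing=1.0, slope_limit=0.3):
    """Pick a landing coordinate from a grid of altitude values."""
    rows, cols = len(alt), len(alt[0])

    def neighbours(i, j):
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < rows and 0 <= nj < cols:
                yield alt[ni][nj]

    # Candidate points: slope rate below the limit (cf. Example 36).
    candidates = []
    for i in range(rows):
        for j in range(cols):
            slope = max(abs(alt[i][j] - n) for n in neighbours(i, j)) / spacing
            if slope <= slope_limit:
                candidates.append((i, j))

    # Least-rough candidate in its sample region wins (cf. Example 37).
    best, best_rough = None, float('inf')
    for i, j in candidates:
        window = [alt[a][b]
                  for a in range(max(0, i - 1), min(rows, i + 2))
                  for b in range(max(0, j - 1), min(cols, j + 2))]
        mean = sum(window) / len(window)
        rough = sum((v - mean) ** 2 for v in window) / len(window)
        if rough < best_rough:
            best, best_rough = (i, j), rough
    return best
```

For instance, on a mostly flat altitude grid with one raised obstacle, the selected coordinate point falls in the flat region away from the obstacle.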
In a particular example, the apparatus, device, system, machine, or the like discussed herein may be, include, or be implemented in any suitable type of computing apparatus. The computing apparatus may include a processor and a computer-readable storage medium that is readable by the processor. The processor may execute one or more instructions stored in the computer-readable storage medium. The processor may also read other information stored in the computer-readable storage medium. In addition, the processor may store new information in the computer-readable storage medium and update certain information stored in the computer-readable storage medium. The processor may include, for example, a central processing unit (CPU), a digital signal processor (DSP), a graphics processing unit (GPU), a processor core, a microprocessor, a micro-controller, a field-programmable gate array (FPGA), an application specific integrated circuit (ASIC), other hardware and logic circuits, or any suitable combination thereof. The computer-readable storage medium is encoded with a variety of information, for example, a set of processor-executable instructions that are executable by the processor, and/or other information. For example, the computer-readable storage medium may have stored therein computer program instructions that, when executed by the processor, cause the computing apparatus, e.g., the processor, to perform some operations disclosed herein, and/or information, data, variables, constants, data structures, and the like that are used in such operations.
The computer-readable storage medium may include read-only memory (ROM), random-access memory (RAM), volatile memory, non-volatile memory, removable memory, non-removable memory, flash memory, solid-state memory, other types of memory devices, magnetic media such as a hard disk, a floppy disk, and a magnetic tape, optical recording media such as a CD-ROM and a DVD, magneto-optical media such as a floptical disk, other types of storage devices and storage media, or any suitable combination thereof.
In a particular example, the operations, techniques, processes, or certain aspects or portions thereof, described herein may be embodied in a computer program product. Such a computer program may be implemented in a certain type of, e.g., compiled or interpreted, programming language that is executable by a computer, such as assembly, machine language, procedural language, object-oriented language, and the like, and may be combined with hardware implementation. The computer program product may be distributed in the form of a computer-readable storage medium or in an online manner. For the online distribution, a portion or whole of the computer program product may be temporarily stored, or temporarily created, in a server, for example, in a computer-readable storage medium of the server.
The foregoing description has been presented to illustrate and describe some examples in detail. It should be understood by those skilled in the art that many modifications and variations are possible in light of the above teaching. In various examples, suitable results may be achieved if the above-described techniques are performed in a different order, and/or if some of the components of the above-described systems, architectures, devices, circuits, and the like are coupled or combined in a different manner, or substituted for or replaced by other components or equivalents thereof.
Therefore, the scope of the disclosure is not to be limited to the precise form disclosed, but rather defined by the following claims and equivalents thereof.
Number | Date | Country | Kind |
---|---|---|---|
10-2021-0047843 | Apr 2021 | KR | national |