The present disclosure relates generally to a navigation control system for stationary obstacle avoidance. More particularly, the present disclosure relates to a system and method for navigating an aerial robotic device in the presence of static, or stationary, obstacles within a bounded movement volume.
An unmanned, or uncrewed, aerial vehicle (UAV), commonly known as a drone, is an aircraft without a human pilot on board. UAVs are a component of an unmanned aircraft system (UAS), which includes the UAV itself, a ground-based controller, and a communication system for facilitating bi-directional communication between the UAV and the ground-based controller. UAVs may operate with various degrees of autonomy, either under remote control by a human operator or autonomously using onboard sensors and controllers.
Traditional wired aerial robotic devices require manual control of their movements by a trained operator using a joystick apparatus. However, such manual control is an overly labour-intensive process and requires significant motor skills on the part of the human operator.
In one aspect of the present disclosure, there is provided an aerial navigation system. The aerial navigation system comprises an aerial robotic device suspended from a vertical wire connected to a carrier device. The carrier device is connected to a plurality of anchor points mounted on a corresponding plurality of upright members at substantially the same height from the ground, through a set of horizontal wires in a bounded horizontal plane mutually subtended by the plurality of anchor points. The aerial robotic device is moveable within an Aerial Movement Volume defined between the ground, the plurality of upright members and the horizontal plane. The aerial navigation system further comprises a navigation control system for navigating the aerial robotic device in the Aerial Movement Volume. The navigation control system is configured to detect one or more stationary obstacles located in the Aerial Movement Volume, create a 3D map representing the Aerial Movement Volume together with one or more bounding boxes enclosing each stationary obstacle in the Aerial Movement Volume, compute an optimal route for the aerial robotic device from a start location to a destination location so as to avoid an intervening stationary obstacle, determine control parameters for a plurality of electric stepper motors driving the carrier device and the aerial robotic device based on the computed optimal route, and navigate the aerial robotic device in accordance with the computed optimal route to enable the aerial robotic device to reach the destination location while avoiding intervening stationary obstacles.
In another aspect of the present disclosure, there is provided a method for navigating the aerial robotic device. The method comprises detecting one or more stationary obstacles located in the Aerial Movement Volume, creating a 3D map representing the Aerial Movement Volume together with one or more bounding boxes enclosing each stationary obstacle in the Aerial Movement Volume, computing an optimal route for the aerial robotic device from a start location to a destination location so as to avoid an intervening stationary obstacle, determining control parameters for a plurality of electric stepper motors driving the carrier device and the aerial robotic device respectively, based on the computed optimal route for the aerial robotic device, and navigating the aerial robotic device in accordance with the computed optimal route to enable the aerial robotic device to reach the destination location while avoiding intervening stationary obstacles.
In yet another aspect of the present disclosure, embodiments disclosed herein are also directed to a non-transitory computer readable medium having stored thereon computer-executable instructions which, when executed by a processor, cause the processor to perform the steps of the method disclosed herein.
It will be appreciated that features of the present disclosure are susceptible to being combined in various combinations without departing from the scope of the present disclosure as defined by the appended claims.
The summary above, as well as the following detailed description of illustrative embodiments, is better understood when read in conjunction with the appended drawings. For the purpose of illustrating the present disclosure, exemplary constructions of the disclosure are shown in the drawings. However, the present disclosure is not limited to the specific methods and instrumentalities disclosed herein. Moreover, those skilled in the art will understand that the drawings are not to scale. Wherever possible, like elements have been indicated by identical numbers.
In the accompanying drawings, an underlined number is employed to represent an item over which the underlined number is positioned or an item to which the underlined number is adjacent. A non-underlined number relates to an item identified by a line linking the non-underlined number to the item. When a number is non-underlined and accompanied by an associated arrow, the non-underlined number is used to identify a general item at which the arrow is pointing.
The following detailed description illustrates embodiments of the present disclosure and ways in which they can be implemented. Although the best mode of carrying out the present disclosure has been disclosed, those skilled in the art would recognize that other embodiments for carrying out or practicing the present disclosure are also possible.
A carrier device 105 is coupled to the electric motors at corresponding ones of the anchor points 104 using a set of wires 102 (hereinafter individually referred to as ‘the horizontal wire’ and denoted using identical reference numeral ‘102’). That is, the rotor of each electric motor is coupled with a first end of a corresponding horizontal wire 102, which is arranged so that the rest of the corresponding horizontal wire 102 is at least partly wrapped around the rotor. Moreover, a second end of each horizontal wire 102 from the set of horizontal wires 102 is coupled with the carrier device 105. The carrier device 105 itself houses at least one electric motor (not shown), each of which includes a rotor (not shown). In an example, each of the electric motors associated with the carrier device 105 may be implemented by use of a direct current (DC) stepper motor. The rotor of the electric motor housed in the carrier device 105 is coupled with a first end of a wire 107 (hereinafter referred to as ‘the vertical wire’ and denoted using identical reference numeral ‘107’). The vertical wire 107 is arranged so that the rest of the vertical wire 107 is at least partly wrapped around the rotor of the carrier device 105. A robotic device 106 is suspended from a second end of the vertical wire 107. Thus, the set of horizontal wires 102, the ground G and the plurality of upright members 103 collectively define a volume within which the robotic device 106 resides. For clarity, this volume will be referred to henceforth as the Aerial Movement Volume 110.
The carrier device 105 is adapted to operably move within the bounded horizontal plane 112 defined between the elevated anchor points 104. This movement is achieved through the activation of the electric motors in the anchor points 104 to cause the horizontal wire 102 coupled to each electric motor to be further wound or unwound from the electric motor's rotor, thereby shortening or lengthening each such horizontal wire 102. The robotic device 106 is adapted to move vertically relative to the carrier device 105 through the activation of the electric motor(s) in the carrier device 105 to cause the vertical wire 107 coupled to each electric motor of the carrier device 105 to be further wound or unwound from the electric motor's rotor, thereby shortening or lengthening the vertical wire 107.
A depth-detecting sensor 116 is mounted on the carrier device 105 in a downwards facing orientation. In one embodiment, the depth-detecting sensor 116 may include an RGB-D sensor that combines RGB colour information with per-pixel depth information. In another embodiment, the depth-detecting sensor 116 may include a radar sensor. The skilled person will understand that the above-mentioned examples of depth-detecting sensors are provided for illustration purposes only. In particular, the skilled person will understand that the preferred embodiment is not limited to the use of these above-mentioned depth-detecting sensors. Instead, the preferred embodiment is operable with any sensor capable of detecting the distance between itself and another object detected within the range of the sensor.
When the Aerial Movement Volume 110 is defined by the relative arrangement of the horizontal wires 102, upright members 103 and the ground G, the location of the robotic device 106 within the Aerial Movement Volume 110 is defined by the following parameters:
The co-ordinates of the carrier device 105 in the horizontal plane 112 are determined by the lengths of the individual horizontal wires 102 coupling the carrier device 105 to each of the respective elevated anchor points 104. Similarly, the distance between the carrier device 105 and the robotic device 106 is determined by the unwound length of the vertical wire 107 connecting the robotic device 106 to the carrier device 105.
As described above, the carrier device 105 moves within the bounded horizontal plane 112 through the winding and unwinding of the horizontal wires 102 by the electric motors in the anchor points 104, and the robotic device 106 moves vertically relative to the carrier device 105 through the winding and unwinding of the vertical wire 107 by the electric motor(s) in the carrier device 105. In view of the above, for brevity, the electric motors in the anchor points 104 will be referred to henceforth as Horizontal Movement Motors. Similarly, the electric motor(s) in the carrier device 105 will be referred to henceforth as Vertical Movement Motor(s).
In an embodiment of the present disclosure, the aerial module 101 is controlled by a navigation control system 114 (hereinafter referred to as ‘the control unit’ and denoted using identical reference numeral ‘114’). The control unit 114 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, logic circuitries, and/or any devices that manipulate data based on one or more instructional codes. The control unit 114 may be implemented as a combination of hardware and software, for example, programmable instructions that are consistent with the implementation of one or more functionalities disclosed herein.
In an embodiment of the present disclosure, the control unit 114 may be configured to determine a location of the robotic device 106 within the Aerial Movement Volume 110. The control unit 114 is further configured to synchronise the operations of the Horizontal Movement Motors and the Vertical Movement Motors to permit the robotic device 106 to be moved from its current location to a target location within the Aerial Movement Volume 110, without the necessity of human intervention. In an embodiment of the present disclosure, the control unit 114 includes a navigation system configured to calculate a route between the current and target locations of the robotic device 106.
With combined reference to the accompanying drawings, the CDRS 201 is defined in the bounded horizontal plane 112, with vertices P1, P2 and P3 corresponding to the positions of the plurality of upright members 103.
Within the CDRS 201, the position of each of P1, P2 and P3 is denoted by (xP1, yP1), (xP2, yP2) and (xP3, yP3) respectively. The first vertex P1 is defined to be the origin of the CDRS 201, so that xP1=0 and yP1=0, and the second vertex P2 is defined to lie on the x-axis of the CDRS 201, so that yP2=0. The remaining co-ordinates of the second and third vertices P2 and P3 are computed based on the known distances {dP1P2, dP1P3, dP2P3} between the upright members 103. More specifically, the co-ordinates of the three vertices P1, P2 and P3 are as follows:

(xP1, yP1) = (0, 0)  (1)

(xP2, yP2) = (dP1P2, 0)  (2)

xP3 = (dP1P2² + dP1P3² − dP2P3²)/(2·dP1P2), yP3 = √(dP1P3² − xP3²)  (3)
Referring to the CDRS 201, the current location of the carrier device 105 is denoted by a start point A having co-ordinates (xA, yA).
From the above formulation for the CDRS 201, the lengths (l1A, l2A and l3A) of the line segments connecting the start point A of the carrier device 105 to the vertices P1, P2 and P3 can be expressed as follows:
l1A = √(xA² + yA²)  (4)

l2A = √((xP2 − xA)² + yA²)  (5)

l3A = √((xA − xP3)² + (yP3 − yA)²)  (6)
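By way of a non-limiting illustration, the following Python sketch computes the CDRS co-ordinates of the vertices P2 and P3 from the known distances between the upright members 103 and then evaluates equations (4) to (6) for a given carrier position. The distances and the carrier start point used in the example are hypothetical values chosen only for illustration.

import math

def cdrs_vertices(d12, d13, d23):
    """Place P1 at the CDRS origin, P2 on the x-axis, and solve for P3
    from the known distances between the upright members."""
    p1 = (0.0, 0.0)
    p2 = (d12, 0.0)
    x3 = (d12**2 + d13**2 - d23**2) / (2.0 * d12)
    y3 = math.sqrt(max(d13**2 - x3**2, 0.0))
    return p1, p2, (x3, y3)

def horizontal_wire_lengths(a, p1, p2, p3):
    """Equations (4)-(6): lengths of the horizontal wires from the
    carrier position A = (xA, yA) to the vertices P1, P2 and P3."""
    xa, ya = a
    l1 = math.hypot(xa, ya)
    l2 = math.hypot(p2[0] - xa, ya)
    l3 = math.hypot(xa - p3[0], p3[1] - ya)
    return l1, l2, l3

if __name__ == "__main__":
    # Hypothetical installation: upright members 10 m, 9 m and 8 m apart.
    p1, p2, p3 = cdrs_vertices(10.0, 9.0, 8.0)
    # Hypothetical carrier start point A in the horizontal plane 112.
    l1a, l2a, l3a = horizontal_wire_lengths((4.0, 3.0), p1, p2, p3)
    print(p3, l1a, l2a, l3a)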
By considering the elevation of the Horizontal Movement Motors, the CDRS 201 may be expanded into a 3D navigation reference system (3DNRS) 300, as explained below.
In an embodiment of the present disclosure, a first horizontal plane CDRS′ is the plane defined by the vertices P′1, P′2 and P′3, whereas a second horizontal plane CDRS is the plane defined by the vertices P1, P2 and P3. In other words, the second horizontal plane CDRS corresponds to the CDRS 201 described above, located at the elevation of the Horizontal Movement Motors, while the first horizontal plane CDRS′ is its projection onto the ground G.
Using this construction, defining P′1 as the origin of the 3DNRS, and knowing the distances {dP1P2, dP1P3, dP2P3} between the upright members 103 together with the elevation of each of the first, second and third Horizontal Movement Motors above the ground G, the co-ordinates of the vertices P′1, P′2 and P′3 of the 3DNRS are defined as follows:

(xP′1, yP′1, zP′1) = (0, 0, 0)  (7)

(xP′2, yP′2, zP′2) = (xP2, 0, 0)  (8)

(xP′3, yP′3, zP′3) = (xP3, yP3, 0)  (9)
The x and y co-ordinates of the vertices P′1, P′2 and P′3 of the 3DNRS are the same as those of the P1, P2 and P3 vertices. Indeed, the P′1, P′2 and P′3 vertices differ from the P1, P2 and P3 vertices in the z co-ordinate only (zP1=zP2=zP3=dP′1P1). Similarly, the distances between corresponding vertices in the first horizontal plane CDRS′ are the same as those in the second horizontal plane CDRS. For example, dP1P2=dP′1P′2.
The z co-ordinate of the start point A represents the elevation above the ground of the robotic device 106 at the start point A. Since the carrier device 105, from which the robotic device 106 is suspended, is adapted to move within the bounded horizontal plane 112 at an elevation h above the ground G, the length l4A of the line segment connecting the start point A to the second horizontal plane CDRS is given by
l4A = h − zA  (10)
This length l4A is equivalent to the length of the vertical wire 107 needed to bring the robotic device 106 to the elevation represented by the z co-ordinate of the start point A.
The target location of the robotic device 106 within the Aerial Movement Volume 110 corresponds to an end point B. In an analogous manner to the above derivation of the co-ordinates of the start point A, the co-ordinates of the end point B can also be defined in terms of the lengths of the horizontal wires l1B, l2B and l3B and the length of the vertical wire l4B that would be needed to respectively position the carrier device 105 and the robotic device 106 at the end point B corresponding to the target location of the robotic device 106.
In other words, using the above formulation for the CDRS, l1B, l2B and l3B are the lengths of the line segments connecting the end point B to the vertices P1, P2 and P3 of the CDRS. In a similar manner to the equations (4), (5) and (6) for the start point A, the lengths l1B, l2B and l3B are given by:
l1B = √(xB² + yB²)  (11)

l2B = √((xP2 − xB)² + yB²)  (12)

l3B = √((xB − xP3)² + (yP3 − yB)²)  (13)
Similarly, the length l4B is given by:
l4B = h − zB  (14)
A movement trajectory of the robotic device 106 from the current location to the target location in the Aerial Movement Volume 110 is denoted by the line segment connecting the start point A to the end point B.
In an embodiment of the present disclosure, the control unit 114 is configured to implement a navigation algorithm to compute parameters for each of the Horizontal Movement Motors and the Vertical Movement Motor(s) to cause movement of the carrier device 105 along the route, or trajectory, from the start point A to the end point B in the CDRS. For each Horizontal Movement Motor hm, the required change in length of the corresponding horizontal wire 102 is determined by the difference between lhmA and lhmB (the lengths of that wire at the start point A and the end point B respectively), from which the number of rotations nrothm and the residual rotation angle θhm of the motor are obtained, while the direction of rotation dirhm is given by
dirhm = sign(lhmA − lhmB)  (17)
Each Horizontal Movement Motor may be associated with a local computing device provided with a buffer. Using the above equations, the control unit 114 may calculate the control parameters (nrothm, dirhm and θhm) for each Horizontal Movement Motor and communicate the control parameters for a given Horizontal Movement Motor to the local computing device associated therewith. The local computing device stores the control parameters (nrothm, dirhm and θhm) in its buffer. For brevity, the control parameters (nrothm, dirhm and θhm) for a given Horizontal Movement Motor will be collectively referred to henceforth as a Horizontal Movement Control Parameter Tuple Thm=(nrothm, dirhm, θhm).
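By way of a non-limiting illustration, the following Python sketch computes one such tuple for a single motor. Only equation (17) is reproduced above; the conversion of the required change in wire length into a number of whole rotations and a residual angle, and the spool radius used for it, are assumptions made for the purposes of this example only.

import math

def movement_tuple(l_start, l_end, spool_radius):
    """Sketch of a Horizontal Movement Control Parameter Tuple
    T = (nrot, dir, theta) for one motor.

    Equation (17): dir = sign(l_start - l_end). The split of the required
    change in wire length into whole rotations (nrot) and a residual angle
    (theta, in degrees) assumes the wire winds onto a spool of the given
    radius; this conversion is an assumption, not reproduced in the text."""
    delta = l_start - l_end                      # positive: wire must shorten (wind in)
    direction = (delta > 0) - (delta < 0)        # sign(l_start - l_end)
    circumference = 2.0 * math.pi * spool_radius
    nrot, remainder = divmod(abs(delta), circumference)
    theta = 360.0 * remainder / circumference    # residual partial rotation
    return int(nrot), direction, theta

if __name__ == "__main__":
    # Hypothetical wire lengths at points A and B, with a 2 cm spool radius.
    print(movement_tuple(l_start=5.00, l_end=3.75, spool_radius=0.02))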
In an embodiment of the present disclosure, synchronisation of movements of all Horizontal Movement Motors is achieved through their connection via a real-time synchronisation interface 118 such as, for example, an EtherCAT microchip, to allow the carrier device 105 to be moved at a pre-defined speed ξ (e.g. ξ=0.1 m/s). The pre-defined speed and direction of travel computed by the control unit 114 for the robotic device 106 may take into account a balance, for instance, a trade-off between one or more imperatives including, but not limited to, reducing travel time within the constraints imposed by the physical limitations of the aerial module 101, and executing smooth starting and stopping of the robotic device 106 whilst ensuring safe movement of the robotic device 106 within the Aerial Movement Volume 110 of the aerial module 101.
For the sake of simplicity in this disclosure, the movement of the carrier device 105 within the horizontal plane 112 has been described first.
With further execution of the navigation algorithm, the system's movements are expanded from the horizontal plane 112 to the Aerial Movement Volume 110. Specifically, the robotic device 106 is lowered or raised from its current elevation (i.e. the elevation of the start point A) to the elevation of the end point B. This is achieved using the Vertical Movement Motor(s), which control the vertical wire 107 linking the carrier device 105 and the robotic device 106. The control parameters (nrotvm, dirvm and θvm) for the Vertical Movement Motor(s) are determined in an analogous manner, based on the required change in length of the vertical wire 107 between l4A and l4B.
For brevity, the control parameters (nrotvm, dirvm and θvm) for each Vertical Movement Motor will be collectively referred to henceforth as a Vertical Movement Control Parameter Tuple Tvm=(nrotvm, dirvm, θvm).
The same navigation time tnav is used for both the movement of the carrier device 105 and the movement of the robotic device 106, in order to synchronise the Horizontal Movement Motors and the Vertical Movement Motor(s) and thereby move the robotic device 106 along the A-B line segment.
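The following Python sketch illustrates this synchronisation. Since the corresponding equations are not reproduced above, it is assumed here, for the example only, that tnav is obtained from the distance travelled by the carrier device and the pre-defined speed ξ, and that each motor spreads its wire-length change evenly over tnav.

import math

def navigation_time(a_xy, b_xy, speed=0.1):
    """Assumed definition: time to traverse the A-B segment in the
    horizontal plane at the pre-defined speed (e.g. 0.1 m/s)."""
    return math.dist(a_xy, b_xy) / speed

def motor_wire_rate(length_change, t_nav):
    """Each motor winds or unwinds its wire at a constant rate so that the
    full length change completes in exactly t_nav seconds, keeping the
    Horizontal and Vertical Movement Motors in lock-step."""
    return length_change / t_nav   # metres of wire per second

if __name__ == "__main__":
    t_nav = navigation_time((4.0, 3.0), (7.0, 6.0))
    # Hypothetical wire-length changes for the three horizontal wires
    # and the vertical wire over the A-B segment.
    for dl in (1.25, -0.80, 0.40, -0.60):
        print(motor_wire_rate(dl, t_nav))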
The previous discussion has focused on movement within the Aerial Movement Volume 110 from a known start point A to a known end point B. The following discussion builds on this, to describe methods of determining successive points along a movement trajectory from a start point S to a destination point D wherein the movement trajectory enables avoidance of intervening obstacles between point S and point D. Having determined these successive points, the method described above is used to move the robotic device 106 between successive points along the movement trajectory until the robotic device 106 reaches the destination point D.
As shown, at step 402, the method 400 includes the performing of an Initialisation Phase. The Initialisation Phase 402 builds a map of obstacles in the Aerial Movement Volume 110. The Initialisation Phase 402 is executed at system installation time, and repeated periodically to address changes in the surveyed environment.
As shown, at step 502, the Initialisation Phase 402 of the method 400 includes the carrier device 105 performing reconnaissance of the Aerial Movement Volume 110 by executing a pre-defined movement schema in the horizontal plane 112, during which the depth-detecting sensor 116 captures images of the Aerial Movement Volume 110 below the carrier device 105.
As shown, in this example, the movement schema comprises a looped zig-zag patrol pattern 600. More specifically, the carrier device 105 traverses the horizontal plane 112 along the looped zig-zag patrol pattern 600.
The skilled person will understand that the looped zig-zag patrol pattern 600 is provided for illustration purposes only. In particular, the preferred embodiment is not limited to this movement schema; any movement schema that enables the depth-detecting sensor 116 to survey the Aerial Movement Volume 110 may equally be used.
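By way of a non-limiting illustration, the following Python sketch generates carrier waypoints for one such looped zig-zag patrol pattern. The plane extents and the lane spacing are hypothetical values chosen only for the example; the actual schema will depend on the installation and the field of view of the depth-detecting sensor 116.

def zigzag_waypoints(x_max, y_max, lane_spacing):
    """Generate carrier waypoints sweeping the horizontal plane in a
    zig-zag pattern, then return to the origin to close the loop."""
    waypoints = []
    y = 0.0
    forward = True
    while y <= y_max:
        xs = (0.0, x_max) if forward else (x_max, 0.0)
        waypoints.extend((x, y) for x in xs)
        forward = not forward
        y += lane_spacing
    waypoints.append((0.0, 0.0))   # close the loop
    return waypoints

if __name__ == "__main__":
    # Hypothetical 10 m x 8 m horizontal plane swept in 2 m lanes.
    for wp in zigzag_waypoints(10.0, 8.0, 2.0):
        print(wp)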
At step 506, the Initialisation Phase 402 of the method 400 further includes applying a stitching algorithm to the images captured by the depth-detecting sensor 116 to construct a panoramic view of the Aerial Movement Volume 110. From this, a map is created of objects detected in the Aerial Movement Volume 110, detailing the distance of each such detected object from the depth-detecting sensor 116 when the sensor is positioned directly above the detected object. For brevity, this map is referred to henceforth as a Depth Map of the Aerial Movement Volume 110.
Since the elevation of the horizontal plane 112 above the ground G is known, the distance of each detected object from the depth-detecting sensor 116 may be converted into an elevation of that object above the ground G.
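The stitching algorithm itself is not specified above. As a simplified, non-limiting sketch, the following Python code merely mosaics per-capture depth tiles into a single elevation map of the Aerial Movement Volume, using the known carrier position at each capture and keeping the larger elevation where tiles overlap. The tile layout, cell size and sensor footprint are assumptions made for the example.

import numpy as np

def build_elevation_map(tiles, plane_height, cell=0.1, extent=(10.0, 8.0)):
    """Assemble per-capture depth tiles into one elevation map.

    tiles: list of (x0, y0, depth_array) where (x0, y0) is the carrier
    position (metres) at capture time and depth_array holds the measured
    distance from the sensor straight down to whatever lies beneath it.
    Elevation above the ground is plane_height minus the measured depth.
    Overlapping tiles keep the larger (more conservative) elevation."""
    nx, ny = int(extent[0] / cell), int(extent[1] / cell)
    elevation = np.zeros((ny, nx))
    for x0, y0, depth in tiles:
        tile_elev = plane_height - depth            # depth -> elevation
        j0, i0 = int(x0 / cell), int(y0 / cell)
        i1 = min(i0 + tile_elev.shape[0], ny)
        j1 = min(j0 + tile_elev.shape[1], nx)
        patch = tile_elev[: i1 - i0, : j1 - j0]
        elevation[i0:i1, j0:j1] = np.maximum(elevation[i0:i1, j0:j1], patch)
    return elevation

if __name__ == "__main__":
    # Two hypothetical 20x20-cell depth tiles captured under a 4 m high plane.
    flat = np.full((20, 20), 4.0)                   # nothing below: depth = 4 m
    crate = flat.copy(); crate[5:15, 5:15] = 2.5    # a 1.5 m tall object
    tiles = [(0.0, 0.0, flat), (1.0, 0.0, crate)]
    print(build_elevation_map(tiles, plane_height=4.0).max())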
At step 508, the Initialisation Phase 402 of the method 400 further includes applying a segmentation algorithm to the Depth Map to detect zones therein with the same elevation.
At step 510, the Initialisation Phase 402 of the method 400 further includes applying a Hough transform to the borders of the detected zones to detect the corner points thereof.
An Elevated Zone is defined as a zone in the Depth Map whose elevation exceeds a pre-defined tolerance threshold. Thus, the detection of an Elevated Zone in the Depth Map indicates the presence of a potential obstacle to the movement of the robotic device 106 in the Aerial Movement Volume 110.
At step 512, the Initialisation Phase 402 of the method 400 further includes using corner points of one or more adjoining Elevated Zones to establish a 3D bounding box around a corresponding Elevated Zone or a group of adjoining Elevated Zones. For brevity and clarity, a 3D bounding box around an Elevated Zone or a group of adjoining Elevated Zones will be referred to henceforth as an Obstacle Bounding Box.
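As a non-limiting illustration of steps 508 to 512, the following Python sketch detects Elevated Zones in an elevation map and encloses each connected group of elevated cells in an axis-aligned 3D bounding box. It uses connected-component labelling and slice-based box extraction in place of the segmentation and Hough-transform corner detection named above, and the tolerance threshold and cell size are assumptions chosen for the example.

import numpy as np
from scipy import ndimage

def obstacle_bounding_boxes(elevation, cell=0.1, tolerance=0.2):
    """Detect Elevated Zones (cells whose elevation exceeds the tolerance
    threshold) and enclose each connected group of such cells in an
    axis-aligned 3D bounding box (x/y extents in metres, plus height)."""
    elevated = elevation > tolerance
    labels, count = ndimage.label(elevated)
    boxes = []
    for i, sl in enumerate(ndimage.find_objects(labels), start=1):
        height = float(elevation[sl][labels[sl] == i].max())
        y_sl, x_sl = sl
        boxes.append({
            "x_min": x_sl.start * cell, "x_max": x_sl.stop * cell,
            "y_min": y_sl.start * cell, "y_max": y_sl.stop * cell,
            "height": height,
        })
    return boxes   # this list plays the role of the SBBL in this sketch

if __name__ == "__main__":
    elev = np.zeros((80, 100))
    elev[10:30, 20:40] = 1.5       # hypothetical 1.5 m tall obstacle
    print(obstacle_bounding_boxes(elev))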
Let xiobs and yiobs be the x and y co-ordinates of a point in an ith Obstacle Bounding Box Bobs(i) projected into the horizontal plane 112, and let hi be the elevation of the ith Obstacle Bounding Box. The ith Obstacle Bounding Box may then be represented by an Obstacle Bounding Box Function Biobs(x, y), which takes the value hi at points (x, y) lying within the projection of the Obstacle Bounding Box into the horizontal plane 112, and the value −∞ elsewhere.
At step 514, the Initialisation Phase 402 of the method 400 further includes creating a Static Bounding Box List (SBBL) using the Obstacle Bounding Boxes established in step 512. A Static Bounding Box List (SBBL) is a list of all the Obstacle Bounding Boxes established in the Aerial Movement Volume 110. In other words, defining Nbox as the number of Obstacle Bounding Boxes established in the Aerial Movement Volume 110, the Static Bounding Box List (SBBL) is given by SBBL = {Bobs(i)}, i = 1, …, Nbox.
At step 516, the Initialisation Phase 402 of the method 400 further includes forming a Static Obstacle Map (SOM) by the union of the Obstacle Bounding Boxes identified in the SBBL. In other words, the Static Obstacle Map (SOM) is given by SOM = Bobs(1) ∪ Bobs(2) ∪ … ∪ Bobs(Nbox).
At step 404, the method 400 includes performing a Navigation Path Creation Phase. The purpose of the Navigation Path Creation Phase 404 is to determine an optimal route that allows the robotic device 106 to navigate between any two points in the Aerial Movement Volume 110 while avoiding obstacles detected during the Initialisation Phase 402 and represented in the Static Obstacle Map. The input to the Navigation Path Creation Phase 404 includes a start location S=(xS, yS, zS) of the robotic device 106 within the Aerial Movement Volume 110, the required destination location D=(xD, yD, zD), the Static Bounding Box List (SBBL) and the Static Obstacle Map (SOM) determined during the Initialisation Phase 402.
At step 802, the Navigation Path Creation Phase 404 of the method 400 includes establishing a Robot Bounding Box Brob using a convex approximation of the space occupied by the robotic device 106 and an added pre-defined safety tolerance volume around the robotic device 106. The Robot Bounding Box Brob is dimensioned to fully enclose the robotic device 106 and the safety tolerance volume. By centering the robotic device 106 in the Robot Bounding Box Brob, the x and y co-ordinates of its vertices can be defined as ±xrob and ±yrob. Thus, the Robot Bounding Box Brob may be represented as a Robot Bounding Box Function Brob(x, y), where Brob(x, y) = zrob for −xrob < x < xrob and −yrob < y < yrob, and Brob(x, y) = −∞ otherwise.
At step 804, the Navigation Path Creation Phase 404 of the method 400 further includes determining one or more Free Movement Regions. A Free Movement Region is a region in the Aerial Movement Volume 110 where the robotic device 106 can move without colliding with a detected obstacle. A Free Movement Region may be determined by computing the grayscale morphological dilation of an Obstacle Bounding Box Function Biobs(x, y) with the Robot Bounding Box Function Brob(x, y), as follows:

(Biobs ⊕ Brob)(x, y) = sup(xj, yj) [Biobs(xj, yj) + Brob(x − xj, y − yj)]  (22)
In equation (22), sup denotes the supremum and (xj, yj) spans the horizontal plane 112. Denoting by f(x, y) the pointwise maximum, over all Obstacle Bounding Boxes, of the dilated Obstacle Bounding Box Functions (Biobs ⊕ Brob)(x, y), the Free Movement Region Wf is defined as the set of points
Wf = {(xf, yf, zf) | zf > f(xf, yf)}  (23)
Similarly, Occupied Regions are defined by the points not satisfying equation (23). Since Obstacle Bounding Box Functions Biobs(x, y) and the Robot Bounding Box Function Brob(x, y) are both defined by cuboids, the resulting Occupied Regions are also a union of cuboids.
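As a non-limiting sketch of equations (22) and (23), the following Python code computes the dilated elevation function f(x, y) with SciPy's grayscale dilation and tests whether a candidate robot position lies in the Free Movement Region. The grid resolution and the robot box dimensions are assumptions chosen only for the example.

import numpy as np
from scipy import ndimage

def free_movement_function(obstacle_map, x_rob, y_rob, z_rob, cell=0.1):
    """Return f(x, y), the greyscale dilation of the obstacle elevation
    map by the Robot Bounding Box Function (equation (22)). The robot,
    centred at (x, y, z), is collision free when z > f(x, y)
    (equation (23))."""
    # Discrete Robot Bounding Box Function: constant z_rob over the box footprint.
    sx = 2 * int(round(x_rob / cell)) + 1
    sy = 2 * int(round(y_rob / cell)) + 1
    structure = np.full((sy, sx), z_rob)
    return ndimage.grey_dilation(obstacle_map, structure=structure)

if __name__ == "__main__":
    # Hypothetical 10 m x 8 m map with one 1.5 m tall obstacle,
    # and a robot box of +/-0.3 m x +/-0.3 m with z_rob = 0.4 m.
    obstacles = np.zeros((80, 100))
    obstacles[10:30, 20:40] = 1.5
    f = free_movement_function(obstacles, 0.3, 0.3, 0.4)
    z = 2.5
    print("free at (x=3 m, y=2 m, z=2.5 m)?", z > f[20, 30])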
Using this formulation, the problem solved during the Navigation Path Creation Phase 404 of the method 400 may be stated as: given the robotic device's current location S=(xS,yS,zS); a desired destination D=(xD,yD,zD); and Free Movement Regions Wf in the Aerial Movement Volume 110, find an optimal path from S to D, within Wf.
At step 806, the Navigation Path Creation Phase 404 of the method 400 further includes defining a Search Plane containing the start location S and destination location D. The Search Plane is defined by an equation of the form:
z = ax + by + c  (24)
Substituting the known co-ordinates of the start location S and the destination location D into equation (24) produces two equations in the unknowns a, b and c:
zS = axS + byS + c  (25)

zD = axD + byD + c  (26)
The Search Plane is defined by an additional constraint, namely that all points in the Search Plane lying on a same line orthogonal to the line S-D have a constant elevation. The angle α of this orthogonal line with respect to the x direction is given by:
cos(α) = (yD − yS)/√((xD − xS)² + (yD − yS)²)  (27)
Displacing point A by a unit step in the plane z = zA, in a direction perpendicular to the line S-D, leads to a point A′ with co-ordinates
xA′ = xA + cos α  (28)

yA′ = yA + sin α  (29)

zA′ = zA  (30)
This provides the third equation needed to find the coefficients defining the Search Plane:
zA = axA′ + byA′ + c  (31)
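The coefficients a, b and c may be obtained by solving equations (25), (26) and (31) as a 3×3 linear system, as in the following non-limiting Python sketch. For the example it is assumed that the displaced point is the start location S; the choice of perpendicular direction does not affect the resulting plane.

import math
import numpy as np

def search_plane(s, d):
    """Solve equations (25), (26) and (31) for the Search Plane
    z = a*x + b*y + c containing S and D. The third equation displaces a
    point by a unit step perpendicular to S-D at constant elevation; the
    start location S is used as that point in this sketch."""
    (xs, ys, zs), (xd, yd, zd) = s, d
    run = math.hypot(xd - xs, yd - ys)
    cos_a = (yd - ys) / run                      # equation (27)
    sin_a = -(xd - xs) / run                     # unit vector perpendicular to S-D
    xp, yp = xs + cos_a, ys + sin_a              # equations (28), (29)
    lhs = np.array([[xs, ys, 1.0],
                    [xd, yd, 1.0],
                    [xp, yp, 1.0]])
    rhs = np.array([zs, zd, zs])                 # elevation unchanged at the displaced point
    a, b, c = np.linalg.solve(lhs, rhs)
    return a, b, c

if __name__ == "__main__":
    # Hypothetical start and destination locations in the Aerial Movement Volume.
    print(search_plane((1.0, 1.0, 0.5), (8.0, 6.0, 2.0)))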
In step 808, the Navigation Path Creation Phase 404 of the method 400 further includes establishing a Search Plane Collision Map. The first step in establishing the Search Plane Collision Map comprises selecting, from the vertices of each Obstacle Bounding Box listed in the Static Bounding Box List, those that are located above the Search Plane. Thus, the selected vertices Vsi are given by:
Vsi = {(xki, yki, hi) | hi ≥ axki + byki + c}, k = 1, 2, 3, 4  (32)
Depending on the shape of an obstacle whose Obstacle Bounding Box is listed in the Static Bounding Box List, the Obstacle Bounding Box can have 0 to 4 of its upper vertices located above the Search Plane. If an Obstacle Bounding Box Bobs(i) has at least two such vertices, a 2D Collision Bounding Box CB(i) is defined for the Obstacle Bounding Box Bobs(i). The Collision Bounding Box CB(i) is defined by the minimum and maximum x and y co-ordinates, (xs_mini, ys_mini) and (xs_maxi, ys_maxi), of those vertices located above the Search Plane.
The Search Plane Collision Map is formed by the union of the Collision Bounding Boxes CB(i). An example of a Search Plane Collision Map 900 is shown in the accompanying drawings.
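As a non-limiting illustration of step 808, the following Python sketch builds the Collision Bounding Boxes from the upper vertices of each Obstacle Bounding Box that lie on or above the Search Plane, following equation (32). The dictionary layout of each box follows the earlier sketches and is an assumption, not the actual data structure of the disclosed system.

def collision_bounding_boxes(sbbl, a, b, c):
    """Keep, for every Obstacle Bounding Box, the upper vertices (x, y, h)
    lying on or above the Search Plane z = a*x + b*y + c; if at least two
    remain, record a 2D Collision Bounding Box spanning their minimum and
    maximum x and y."""
    collision_map = []
    for box in sbbl:
        h = box["height"]
        corners = [(box["x_min"], box["y_min"]), (box["x_min"], box["y_max"]),
                   (box["x_max"], box["y_min"]), (box["x_max"], box["y_max"])]
        above = [(x, y) for x, y in corners if h >= a * x + b * y + c]
        if len(above) >= 2:
            xs, ys = zip(*above)
            collision_map.append({"x_min": min(xs), "x_max": max(xs),
                                  "y_min": min(ys), "y_max": max(ys)})
    return collision_map   # union of the CB(i): the Search Plane Collision Map

if __name__ == "__main__":
    sbbl = [{"x_min": 2.0, "x_max": 4.0, "y_min": 1.0, "y_max": 3.0, "height": 1.5}]
    print(collision_bounding_boxes(sbbl, a=0.1, b=0.05, c=0.2))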
In step 1102, the Optimal Route Polyline step 810 of the Navigation Path Creation Phase 404 of the method 400 includes projecting the start and destination locations S and D into the Search Plane, to produce S* = (xS, yS, zmax) and D* = (xD, yD, zmax).
The route between S* and D* which avoids Collision Bounding Boxes CB(i) is formed from a 3D polyline. A 3D polyline is a connected sequence of straight-line segments created as a single object in a three dimensional space. The object is specified by the endpoints of each of its line segments; and the endpoint of each line segment is known as a vertex of the polyline. For the sake of brevity, the 3D polyline connecting S* and D* which avoids collision bounding boxes CB(i), may be referred to henceforth as the Candidate Route Polyline (CRP). In a Candidate Route Polyline (CRP) comprising k line segments, individual line segments may be denoted by CRPSeg(i) where i=1 to k; and the Candidate Route Polyline's vertices may be denoted by CRPVert(j) where j=1 to k+1. Thus, CRPVert(1) and CRPVert(2) may be the endpoints of CRPSeg(1); CRPVert(2) and CRPVert(3) may be the endpoints of CRPSeg(2), and so on, until CRPVert(k) and CRPVert(k+1) which will be the endpoints of CRPSeg(k). For brevity, CRPVert(1) and CRPVert(k+1) will be referred to henceforth as the First Terminal Vertex and the Second Terminal Vertex respectively. Similarly, the remaining vertices CRPVert(j) where j=2 to k will be referred to henceforth as the Non-Terminal Vertices.
Since the Candidate Route Polyline (CRP) connects S* and D*, the co-ordinates of the First Terminal Vertex will be S* and the co-ordinates of the Second Terminal Vertex will be D*. The co-ordinates of the First Terminal Vertex and the Second Terminal Vertex remain fixed during the Optimal Route Polyline step 810 of the Navigation Path Creation Phase 404 of the method 400. However, the co-ordinates of the Non-Terminal Vertices are varied during the Optimal Route Polyline step 810 to enable the Optimal Route Polyline to be determined.
In step 1104, the Optimal Route Polyline step 810 of the Navigation Path Creation Phase 404 of the method 400 further includes selecting initial positions of the Non-Terminal Vertices of a Candidate Route Polyline (CRP).
In step 1106, the Optimal Route Polyline step 810 of the Navigation Path Creation Phase 404 of the method 400 further includes calculating the length of the Candidate Route Polyline (CRP).
In step 1108, the Optimal Route Polyline step 810 of the Navigation Path Creation Phase 404 of the method 400 further includes using an optimisation algorithm to iteratively adjust the positions of the Non-Terminal Vertices, check that the resulting Candidate Route Polyline (CRP) avoids the Collision Bounding Boxes CB(i), and recalculate the length of the Candidate Route Polyline (CRP).
The step 1108 of using the optimisation algorithm is continued until no further reduction in the length of the Candidate Route Polyline (CRP) is achieved. In the present embodiment, the optimisation algorithm used in the step 1108 is the rapidly exploring random tree (RRT) algorithm. However, the skilled person will understand that this path planning algorithm is provided for illustration purposes only. In particular, the skilled person will understand that the preferred embodiment is not limited to the use of the rapidly exploring random tree (RRT) algorithm. Instead, the preferred embodiment is operable with any suitable path planning algorithm, such as the A* graph traversal algorithm, the D* incremental search algorithm and the Probabilistic Roadmap Planner. The result of the optimisation process is the Candidate Route Polyline (CRP) with the shortest length. Thus, this Candidate Route Polyline (CRP) is the Optimal Route Polyline (ORP).
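The following Python sketch illustrates the iterate-and-shorten idea of step 1108. It is not the RRT, A*, D* or Probabilistic Roadmap algorithm named above; instead it uses a simple random perturbation of the Non-Terminal Vertices with a sampled segment-versus-box collision test, and it works in the 2D footprint of the Search Plane for simplicity. All numeric values are hypothetical.

import random

def segment_hits_box(p, q, box, steps=50):
    """Crude collision test: sample points along segment p-q and check
    whether any falls inside the 2D Collision Bounding Box."""
    for i in range(steps + 1):
        t = i / steps
        x, y = p[0] + t * (q[0] - p[0]), p[1] + t * (q[1] - p[1])
        if box["x_min"] <= x <= box["x_max"] and box["y_min"] <= y <= box["y_max"]:
            return True
    return False

def polyline_length(pts):
    return sum(((a[0] - b[0])**2 + (a[1] - b[1])**2) ** 0.5
               for a, b in zip(pts, pts[1:]))

def shorten_route(crp, collision_map, iterations=2000, step=0.5, seed=1):
    """Randomly nudge the Non-Terminal Vertices, keeping a nudge only if
    the polyline stays collision free and becomes shorter. The terminal
    vertices (S*, D*) stay fixed."""
    rng = random.Random(seed)
    best = [list(p) for p in crp]
    for _ in range(iterations):
        cand = [p[:] for p in best]
        j = rng.randrange(1, len(cand) - 1)           # pick a Non-Terminal Vertex
        cand[j][0] += rng.uniform(-step, step)
        cand[j][1] += rng.uniform(-step, step)
        ok = not any(segment_hits_box(a, b, box)
                     for a, b in zip(cand, cand[1:]) for box in collision_map)
        if ok and polyline_length(cand) < polyline_length(best):
            best = cand
    return best

if __name__ == "__main__":
    cbs = [{"x_min": 3.0, "x_max": 5.0, "y_min": 2.0, "y_max": 6.0}]
    crp = [[1.0, 4.0], [4.0, 8.0], [8.0, 4.0]]        # S*, one detour vertex, D*
    print(shorten_route(crp, cbs))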
The Optimal Route Polyline (ORP) comprises k line segments wherein individual line segments may be denoted by ORPSeg(i) where i=1 to k. The Optimal Route Polyline's vertices may be denoted by ORPVert(j) where j=1 to k+1. Furthermore, ORPVert(1)=S* and ORPVert(k+1)=D*. Thus, ORPVert(1) and ORPVert(2) may be the endpoints of ORPSeg(1); ORPVert(2) and ORPVert(3) may be the endpoints of ORPSeg(2), and so on, until ORPVert(k) and ORPVert(k+1) which will be the endpoints of ORPSeg(k).
Returning to the method 400, at step 406, the method 400 includes performing a Control Parameter Creation Phase. During the Control Parameter Creation Phase 406, the control unit 114 computes, for each line segment ORPSeg(i) of the Optimal Route Polyline (ORP), the Horizontal Movement Control Parameter Tuples Thm(i) and the Vertical Movement Control Parameter Tuple(s) Tvm(i) in the manner described above, treating the endpoints of the line segment ORPSeg(i) as the start point A and the end point B respectively.
Therefore, the output from the Control Parameter Creation Phase 406 comprises at least four Controller Lists L1, L2, L3 and L4, wherein each of the first, second and third Controller Lists L1, L2 and L3 comprises the Horizontal Movement Control Parameter Tuples Thm(i), where i=1 to k, for the corresponding Horizontal Movement Motor. Similarly, the fourth Controller List L4 comprises the Vertical Movement Control Parameter Tuples Tvm(i), where i=1 to k. In other words, the number of Horizontal Movement Control Parameter Tuples or Vertical Movement Control Parameter Tuples in each Controller List equals the number of segments (k) in the Optimal Route Polyline (ORP). Thus, the output from the Control Parameter Creation Phase 406 is L1, L2, L3, L4 ∈ ℝ^(3×k). Each Controller List L1, L2, L3 and L4 is written to a memory buffer in the controller corresponding with the relevant Controller List.
At step 408, the method 400 includes performing a Device Navigation Phase. During the Device Navigation Phase 408, the controllers of the Horizontal Movement Motors and the Vertical Movement Motor(s) are operated in synchrony. Thus, the corresponding Horizontal Movement Control Parameter Tuples and Vertical Movement Control Parameter Tuple(s) from each Controller List (i.e. the i-th tuple from each of L1, L2, L3 and L4, where i=1 to k) are executed in parallel by each controller. By executing the Horizontal Movement Control Parameter Tuples and Vertical Movement Control Parameter Tuple(s) associated with a given line segment ORPSeg(i) of the Optimal Route Polyline (ORP), the controllers cause the robotic device 106 to execute a uniform linear movement along the line segment ORPSeg(i). At the end of the line segment ORPSeg(i), the controllers execute the next Horizontal Movement Control Parameter Tuples and Vertical Movement Control Parameter Tuple(s) from their respective Controller Lists to cause the robotic device 106 to move along the next line segment ORPSeg(i+1) of the Optimal Route Polyline (ORP). This process is continued until the robotic device 106 reaches the required destination D.
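The following Python sketch shows the Device Navigation Phase schematically. The controller interface (the dispatch callable) is entirely hypothetical; a real implementation would issue the four tuples for each segment concurrently over the real-time synchronisation interface 118, whereas this sketch simply dispatches them one after another for clarity.

from dataclasses import dataclass

@dataclass
class ControlTuple:
    nrot: int        # whole rotations
    direction: int   # +1 wind in, -1 pay out
    theta: float     # residual angle in degrees

def navigate(controller_lists, dispatch):
    """Execute the i-th control parameter tuple from every Controller List
    (one list per motor), segment by segment, so that the robotic device
    follows the Optimal Route Polyline. `dispatch` is a hypothetical
    callable that sends one tuple to one motor controller and returns when
    that motor reports completion."""
    k = len(controller_lists[0])                 # number of ORP segments
    for i in range(k):                           # one pass per ORPSeg(i)
        for motor_id, controller_list in enumerate(controller_lists):
            dispatch(motor_id, controller_list[i])

if __name__ == "__main__":
    # Hypothetical two-segment route: three horizontal motors plus one vertical.
    lists = [[ControlTuple(3, 1, 45.0), ControlTuple(1, -1, 90.0)] for _ in range(4)]
    navigate(lists, dispatch=lambda m, t: print(f"motor {m}: {t}"))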
It is hereby contemplated that functions consistent with the present disclosure can be embodied as one or more computer-executable software instructions or code that may be stored on a non-transitory computer readable medium. It should be noted that the control unit 114 of the present disclosure may also include one or more processors, micro-processors, controllers, micro-controllers, actuators and the like to individually, or collectively, control operation of the various electric motors in a manner consistent with the present disclosure. These processors, micro-processors, controllers, micro-controllers, actuators and the like may be readily embodied in the form of general purpose computers or application specific controllers that can be readily implemented for use in facilitating operation of the control unit 114 disclosed herein. These software instructions, when executed by a processor of the control unit 114, can cause the processor to perform the steps of the methods described herein.
In an embodiment of the present disclosure, equipped with this formulation, a closed loop control system (including for example, model-based predictive control mechanisms) may be implemented to adapt the movement parameters in real time to conform with curvilinear kinematics. Such adaptation would allow the robotic device 106 to autonomously implement 3D curvilinear trajectories including spiral, conchoid, helical and hemispherical flight paths. Furthermore, the above formulation supports adaptive control of velocity during different stages of the curvilinear trajectory, such that the robotic device 106 accelerates/decelerates to different velocities at different stages of the curvilinear trajectory. These features would enable the aerial module 101 to be implemented for use in enhanced autonomous reconnaissance and surveillance applications. Example use cases may include, but are not limited to, detailed sweep-in views of a surveyed scene, adaptive top down and side-ways views of stacked or tall items (for example, pallets in a warehouse facility), or items partially obscured by one or more obstacles, and tracking of subjects moving in a curvilinear path.
Modifications to embodiments of the present disclosure described in the foregoing are possible without departing from the scope of the present disclosure as defined by the accompanying claims. Expressions such as “including”, “comprising”, “incorporating”, “consisting of”, “have”, “is” used to describe and claim the present disclosure are intended to be construed in a non-exclusive manner, namely allowing for items, components or elements not explicitly described also to be present. Reference to the singular is also to be construed to relate to the plural.
This application claims priority to and the benefit of U.S. Provisional Application Ser. No. 63/034,155, filed Jun. 3, 2020, U.S. Provisional Application Ser. No. 63/034,165, filed Jun. 3, 2020, and U.S. Provisional Application Ser. No. 63/043,816, filed Jun. 25, 2020, the entire disclosures of which are hereby incorporated by reference.