This application claims the benefit of Singapore Patent Application number 10201802492Y filed 26 Mar. 2018, the entire contents of which are incorporated herein by reference for all purposes.
Various embodiments relate to aerial vehicles, methods of imaging a tunnel and methods of imaging a shaft.
Some enclosed infrastructures, such as train tunnels, sewage tunnels and other underground networks, require an infrastructure surveillance system that generates minimal disturbance to its surroundings while capturing as much data as possible. Many infrastructure surveillance systems today are ground based, making them vulnerable to debris and liquid on the floor of the infrastructure as they operate. Conventional robots may not be ideal as infrastructure surveillance systems. For example, unmanned ground vehicles (UGV) are unable to traverse sewage tunnels filled with silt, sewage or debris. Unmanned surface vessels (USV) can only work in tunnels that are partially filled with a liquid of highly diluted consistency. Pipeline and tunnel robots can only work in small to medium diameter pipes and may require complex hoisting and winching mechanisms for deployment and retrieval.
According to various embodiments, there may be provided an aerial vehicle including: an airframe including a central member defining a longitudinal axis; a gimbal coupled to the central member; a camera mounted on the gimbal to face a direction at least substantially orthogonal to the longitudinal axis; wherein the gimbal is rotatable about the longitudinal axis to spin the camera around the longitudinal axis; and a propulsion means configured to propel the aerial vehicle, wherein the propulsion means is offset from the camera along the longitudinal axis.
According to various embodiments, there may be provided a method of imaging a shaft, the method including: flying an aerial vehicle along a depthwise direction of the shaft; wherein the aerial vehicle includes an airframe defining a longitudinal axis, and a camera mounted on the airframe to face a direction at least substantially orthogonal to the longitudinal axis; wherein the longitudinal axis is at least substantially parallel to the depthwise direction when the aerial vehicle is in flight; rotating the aerial vehicle about the longitudinal axis while the aerial vehicle is in flight such that the camera revolves around the longitudinal axis to capture a spiral panoramic image; and reconstructing a virtual three-dimensional model of the shaft based on the spiral panoramic image.
According to various embodiments, there may be provided a method of locating an aerial vehicle in an enclosed space, the method including: storing geometrical information about the enclosed space in a memory; measuring a plurality of distances using a plurality of range sensors, each range sensor mounted at a respective position on an airframe of the aerial vehicle; wherein each measured distance is a distance between the respective position and a nearest surface of the enclosed space from the respective position; and determining a planar position of the aerial vehicle based on the measured distances and the geometrical information.
In the drawings, like reference characters generally refer to the same parts throughout the different views. The drawings are not necessarily to scale, emphasis instead generally being placed upon illustrating the principles of the invention. In the following description, various embodiments are described with reference to the following drawings, in which:
Embodiments described below in context of the aerial vehicles are analogously valid for the respective methods, and vice versa. Furthermore, it will be understood that the embodiments described below may be combined, for example, a part of one embodiment may be combined with a part of another embodiment.
It will be understood that any property described herein for a specific aerial vehicle may also hold for any aerial vehicle described herein. It will be understood that any property described herein for a specific method may also hold for any method described herein. Furthermore, it will be understood that for any aerial vehicle or method described herein, not necessarily all the components or steps described must be enclosed in the device or method, but only some (but not all) components or steps may be enclosed.
It should be understood that the terms “on”, “over”, “top”, “bottom”, “side”, “back”, “left”, “right”, “front”, “lateral”, “up”, “down” etc., when used in the following description, are used for convenience and to aid understanding of relative positions or directions, and are not intended to limit the orientation of any device, or structure or any part of any device or structure. In addition, the singular terms “a”, “an”, and “the” include plural references unless context clearly indicates otherwise. Similarly, the word “or” is intended to include “and” unless the context clearly indicates otherwise.
The term “coupled” (or “connected”) herein may be understood as electrically coupled or as mechanically coupled, for example attached or fixed, or just in contact without any fixation, and it will be understood that both direct coupling or indirect coupling (in other words: coupling without direct contact) may be provided.
In order that the invention may be readily understood and put into practical effect, various embodiments will now be described by way of examples and not limitations, and with reference to the figures.
According to various embodiments, an aerial vehicle may be provided for inspecting covered infrastructures. The aerial vehicle may be equipped with a revolving camera system. The revolving camera system may capture panoramic images of the immediate surroundings as the aerial vehicle moves forward. Unlike a conventional 360° camera with a wide-angled lens, the revolving camera system may capture images of entire lateral surfaces, and the images can be of high fidelity and minimal optical distortion, while only using a single camera. The images captured by the revolving camera system may be stitched together to reconstruct the infrastructure in a virtual space, for detection of damage such as cracks and deterioration. The aerial vehicle may be an unmanned aerial vehicle (UAV) that may autonomously carry out visual inspection of the covered infrastructures. The aerial vehicle may have a minimal set of simple navigation sensors, so that the weight, battery and computational power required by the navigation sensors are low. The navigation sensors may include a sparse array of time-of-flight (ToF) rangefinders that are arranged in an optimal fashion to localize in different environments, such as horizontal tunnels and vertical shafts. The aerial vehicle may be powered by lithium-ion batteries that have higher energy density than traditional lithium-polymer batteries to enhance the endurance of the aerial vehicle.
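As an illustration of how the revolving-camera output may be assembled, the following sketch pastes narrow image strips, each tagged with the gimbal angle and the vehicle's longitudinal position, onto an unrolled panorama of the tunnel wall. The function name, the strip-per-exposure model and the pixel scale are illustrative assumptions, not the exact stitching pipeline of the system described herein.

```python
import numpy as np

def stitch_spiral_strips(strips, angles_rad, positions_m,
                         circumference_m, px_per_m=100):
    """Paste narrow camera strips onto an unrolled (cylindrical) panorama.

    strips          -- list of HxWx3 uint8 arrays, one per exposure
    angles_rad      -- gimbal angle of each exposure about the tunnel axis
    positions_m     -- longitudinal position of the vehicle per exposure
    circumference_m -- unrolled circumference of the tunnel wall
    Rows of the panorama map to the angular coordinate, columns to the
    position along the tunnel axis.
    """
    height = int(circumference_m * px_per_m)
    x0 = min(positions_m)
    width = int((max(positions_m) - x0) * px_per_m) + max(
        s.shape[1] for s in strips)
    panorama = np.zeros((height, width, 3), dtype=np.uint8)
    for strip, theta, x in zip(strips, angles_rad, positions_m):
        row = int((theta % (2 * np.pi)) / (2 * np.pi) * height)
        col = int((x - x0) * px_per_m)
        h = min(strip.shape[0], height - row)   # clip at panorama edges
        w = min(strip.shape[1], width - col)
        panorama[row:row + h, col:col + w] = strip[:h, :w]
    return panorama
```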
One possible application of the aerial vehicles according to various embodiments may be to inspect underground sewerage infrastructure, like the Deep Tunnel Sewerage System (DTSS) in Singapore, which is large and extensive. The tunnels in the DTSS are protected with a specially-designed Corrosion Protection Lining, and periodic inspections are required. The environment inside these tunnels, which extend more than 30 m underground, is hazardous, and human access is difficult and dangerous. The aerial vehicle may be able to access the tunnels of about 3 to 6 m in diameter via the vertical direct access shafts of about 3 to 5 m in diameter, without requiring a winch and hoisting system like conventional pipeline robots. Being agile and versatile enough to traverse 3D space, the aerial vehicle may be able to enter the tunnels even in the presence of sewage, silt, debris and unknown obstacles in a fully operational sewerage system.
The aerial vehicle may include rotors 108 as the propulsion means. At least one rotor may be coupled to each respective arm 106. The aerial vehicle 100 may include a camera system 102 coupled to the central member 104. The camera system 102 may include a camera 110. The camera 110 may be fixed in position and orientation with respect to the camera system 102. The camera system 102 may be configured to rotate 150 about the X axis 120, for example using a rotatable gimbal of the camera system 102. As the camera system 102 rotates, the camera 110 may revolve around the X axis 120. The camera system 102 may be an internal camera system. The internal camera system may be housed within a cavity in the central member 104. The central member 104 may include an at least substantially transparent window aligned, for example longitudinally aligned, with the camera 110 so that the camera 110 may receive light from outside of the central member 104. Alternatively, the camera system 102 may be an external camera system, mounted outside of the central member 104. The external camera system may rotate around the central member 104. For example, the camera system 102 may include a ring-shaped gimbal that receives the central member 104 in a centre of the gimbal, for example, in a concentric manner.
According to various embodiments, the aerial vehicles 100 or 200 may be unmanned aerial vehicles (UAV), which may also be referred to herein as aerial robots. The aerial vehicles 100 or 200 may include a datalink configured to receive and transmit data between the aerial vehicle and a ground control station. The aerial vehicles may be designed and optimized to accommodate the revolving camera system and achieve high endurance. The aerial vehicles may feature a fine-tuned propulsion system which allows for maximum cooling of their sensors and payloads. The aerial vehicles may be manually piloted semi-autonomously using long range radio and video transmission. In cases where manual control is not necessary or not possible, the aerial vehicles may fly autonomously using pre-planned flight paths. The aerial vehicles may include a memory storing geometrical information about an enclosed space. Alternatively, the geometrical information about the enclosed space may be stored external to the aerial vehicle. A processor, either onboard the aerial vehicle or external to the aerial vehicle, may determine a planar position of the aerial vehicle in the enclosed space based on measurements from range sensors on the aerial vehicle and further based on the geometrical information.
According to various embodiments, the aerial vehicle 400 may fly in a spiral path around the depthwise axis of the shaft 440, instead of spinning about the Z axis 140. This may be especially applicable if the aerial vehicle 400 is a fixed wing aerial vehicle, like the aerial vehicle 200.
Optionally, the aerial vehicle 100 may include an optical flow sensor 514 to aid in obstacle avoidance and localization within the covered infrastructure. The aerial vehicle 100 may also include a front operator camera 522, for an operator of the aerial vehicle 100 to see where the aerial vehicle 100 is heading during flight. The aerial vehicle 100 may include a fine-tuned propulsion system which allows for maximum cooling of both the motors and the electronic speed controllers. The aerial vehicle 100 may be manually piloted semi-autonomously using long range radio and video transmission systems. In cases where manual control is not necessary or not possible, the aerial vehicle may make autonomous flights using pre-planned paths which may be stored in the controller 516.
In addition to the above-mentioned sensors and components, the UAV can also carry an array of environmental sensors that measure the immediate environmental conditions such as temperature and pressure. The array of environmental sensors may also include an integrated hazardous gas sensor 520 that detects or measures the concentration of specific gases. If a dangerous operating environment is detected, the UAV can be programmed to return to home in a low-power state.
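A sketch of such return-home safety logic could look like the following; the gas types, threshold values and vehicle methods are hypothetical placeholders, not parameters of the actual system.

```python
# Hypothetical alarm thresholds; real limits depend on the fitted gas
# sensor and the applicable safety regulations.
H2S_LIMIT_PPM = 10.0
CH4_LIMIT_PCT_LEL = 10.0

def react_to_environment(uav, h2s_ppm, ch4_pct_lel):
    """Abort the mission when a hazardous atmosphere is detected."""
    if h2s_ppm > H2S_LIMIT_PPM or ch4_pct_lel > CH4_LIMIT_PCT_LEL:
        uav.enter_low_power_state()  # assumed platform API
        uav.return_to_home()         # assumed platform API
```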
The main challenges of autonomously navigating tunnel environments using aerial robots are the problem of localization in pitch-black, GPS-denied environments, and the development of an energy-efficient aerial platform and sensing methodology to perform extended hours of inspection in long tunnels.
According to various embodiments, an aerial vehicle may employ a sparse sensing system for obstacle avoidance and localization in tunnels and shafts, to address the abovementioned challenges. The sparse sensing system may be lightweight and energy efficient so that the aerial vehicle may have a high payload capacity and may have a long endurance. The sparse sensing system may require prior knowledge of the tunnel geometry and may perform well in tunnel environments that are relatively featureless, especially so under poor illumination.
The sparse sensing system may include an array of ranging sensors (for example: ToF sensors) mounted on the aerial vehicle. Depending on the environment, there may be an optimal sensor configuration that enables localization with the lowest degree of error. The optimal configuration may be mathematically formulated as a spatial optimization problem with constraints such as preventing occlusion of the rotating camera's field of view and the feasibility of mechanical implementation on the aerial vehicle.
In the following, the design optimization of the sparse sensing system is described in detail.
The localization approach employed on the aerial vehicle may rely on knowledge of the geometry of the tunnel, and may be formulated using a parametric representation of the known geometry. The position of the robot may be estimated based on the sparse array of rangefinders. Analytically, there exist sub-optimal placements of the sensors in the geometrical blind spots of the environment. For instance, assuming the front of the robot is aligned with the longitudinal axis of the tunnel and all sensors are placed pointing directly in front of and behind the robot, pose estimation may not be possible when the robot is rotated in the yaw axis such that the sensors are not within range of the tunnel walls. A design optimization is formulated to search for an optimal spatial configuration of the sensors that results in low-error tracking of the robot pose in both tunnels and shafts. To tackle the large search space, a genetic algorithm (GA) is used to solve the optimization problem.
Notation
Frames are denoted with italic fonts, e.g. $A$, with the unit vectors $\{\hat{x}_A, \hat{y}_A, \hat{z}_A\}$ and origin $O_A$. The local frame, $L$, may be defined with $\hat{x}_L$ parallel to the longitudinal axis of the tunnel, $\hat{z}_L$ defined parallel to the gravity vector, and $\hat{y}_L$ such that $\hat{x}_L \times \hat{y}_L = \hat{z}_L$. $B$ may be a body-fixed frame with $\hat{x}_B$ pointing to the front of the robot, $\hat{z}_B$ down and $\hat{y}_B$ such that $\hat{x}_B \times \hat{y}_B = \hat{z}_B$. The origin $O_B$ is attached to the geometric centre of the robot. A rotation matrix $^L R_B$ transforms a point in frame $B$ into $L$. The full state of the craft is defined in the local frame as

$$r_L = [x \ \ y \ \ z \ \ \phi_x \ \ \phi_y \ \ \phi_z]_L^T \tag{1}$$
Environmental Assumptions
Tunnel environments may refer to structured environments with high reflection symmetry, a characteristic prevalent in man-made structures, e.g. canals, penstocks, sewerage tunnels and shafts. These structures are uni-axial with a known cross-sectional parametric representation, and visually degraded with poor illumination. In particular, the following description focuses on navigating a horizontal tunnel with a rectangular cross-section and vertically descending a cylindrical shaft with a circular cross-section, although it should be understood that the aerial vehicle may not be limited to navigating tunnels and shafts of such geometries. Due to the lack of salient geometric landmarks and the symmetry of the environment, there are inherent “blind spots” that prohibit reliable estimation of the position of the robot along the tunnel axis solely from the sparse array of rangefinders.
Perception Algorithm
Based on prior knowledge of the geometry of the environment, the perception algorithm translates the range inputs from the sparse sensing array into reduced-order pose estimates in the tunnel. It involves first finding a suitable cross-section of the tunnel that maximizes the number of robot states that can be estimated, and subsequently deriving a parametric representation for the chosen cross-section. For horizontal tunnels, the reflection symmetry of the tunnel about the local x-z plane makes it possible to estimate only 2 DOF: the lateral position offset, $y_L$, and the yaw, $\phi_L$, of the robot. For traversing vertical shafts, the axial symmetry about $\hat{z}_L$ allows for only 2 DOF pose estimates: the lateral position displacements, $x_L$ and $y_L$. In both cases, the position along the longitudinal axis of the tunnel, i.e. $x_L$ in the horizontal tunnels and $z_L$ in the vertical shaft, cannot be estimated from the sparse sensing array, and is left controlled by the operator. The remaining states are estimated with information from the on-board IMU and a downward-pointing rangefinder.

The sparse sensing array is populated with rangefinders that have physical range limitations; the minimum and maximum ranges are denoted by $r_{min}$ and $r_{max}$ respectively. These limits can either be retrieved from manufacturer technical specifications or determined empirically. The range measurement from a sensor is along the $\hat{x}_S$-axis, where $S$ is the sensor frame with its origin attached to the body of the sensor. The individual measurements are rotated to the body frame, $B$. The resulting point cloud of the range measurements from the sparse sensing array in $B$ is denoted as $p_B \in \mathbb{R}^{M \times 1}$, where $M$ is the number of sensors in the array. Using a linear least squares approach, the point cloud $p_B$ is used to estimate the parameters of the parametric representation of the tunnel. The parameter fitting procedure for tunnel environments is discussed first, followed by vertical shaft environments. Unless otherwise stated, the following calculations are performed in frame $B$.
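The rotation of individual readings into the body frame can be sketched as follows; the mounting description (a position vector and a sensor-to-body rotation per rangefinder) is an assumed representation, and the default range limits are placeholders.

```python
import numpy as np

def ranges_to_body_points(ranges, mounts, r_min=0.2, r_max=14.0):
    """Convert raw rangefinder readings into a body-frame point cloud.

    ranges -- iterable of scalar range readings, one per sensor
    mounts -- list of (d, R) tuples: d is the 3-vector position of the
              sensor on the airframe (frame B), R is the 3x3 rotation
              from the sensor frame S to B; each reading lies along x_S.
    Readings outside [r_min, r_max] are discarded as out of range.
    """
    points = []
    x_s = np.array([1.0, 0.0, 0.0])          # measurement axis in frame S
    for rho, (d, R) in zip(ranges, mounts):
        if r_min <= rho <= r_max:
            points.append(d + R @ (rho * x_s))
    return np.array(points)
```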
Pose Estimation in Tunnel Environments
Most man-made tunnels have local sections with right rectangular prism geometry, bounded by vertically parallel walls on two sides, a water body on the bottom, and a ceiling on the top. In addition, all angles are right angles. This justifies reducing the geometry to a 2D planar form, formed by intersecting the geometry with a local $x_L$-$y_L$ plane. Each wall can then be represented as a line in normal form,
$$\rho_0 = x \cos\alpha_0 + y \sin\alpha_0 \tag{2}$$

where $\rho_0$ is the perpendicular distance from $O_B$ to the line, $\alpha_0$ is the slope of the line in a bounded interval $(-\pi, \pi]$, given by the angle the line makes with $\hat{x}_B$, and $x$ and $y$ are coordinates of the feasible set that falls on the line. The feasible set contains the range measurements from the sparse sensing array, where $p_{i,x} = p_i \cdot \hat{x}_B$, $p_{i,y} = p_i \cdot \hat{y}_B$ and $p_i$ is the measurement of the $i$-th rangefinder projected onto the local $x_L$-$y_L$ plane.
The distance and slope parameters can be determined from a weighted linear regression formulation, in which $\vec{p}_{xl}$ and $\vec{p}_{yr}$ are column vectors of the x and y components of the points that fall on the left and right tunnel walls respectively, $\rho_{0l}$ and $\rho_{0r}$ are the perpendicular distances to the left and right walls respectively, $\alpha$ is the common slope of the line segments, and $W$ is a diagonal matrix with weights $w_{i,i} = e^{-(\lambda p_i)}$.

In this formulation, the tunnel walls are assumed to be parallel, which is highly accurate for canal-like environments that exhibit reflection symmetry about the local x-z plane. The partial pose of the robot in the tunnel is then recovered from the fitted parameter vector $f$; in particular, the estimated width of the tunnel is given by $w = \frac{1}{2}(f_2 + f_3)$.
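Since the explicit regression matrices are not reproduced above, the following sketch fits the two parallel walls with a shared slope using an equivalent slope-intercept parametrization, $y = sx + c$, rather than the $\rho$-$\alpha$ form of (2); the sign conventions and the radial weighting are assumptions.

```python
import numpy as np

def fit_parallel_walls(left_pts, right_pts, lam=0.1):
    """Fit two parallel lines y = s*x + c_l and y = s*x + c_r to the
    body-frame points assigned to the left and right tunnel walls,
    sharing a single slope s.  Returns yaw, lateral offset and width.

    A radial weight e^{-(lam*p)} down-weights far (noisier) returns.
    """
    rows, rhs, wts = [], [], []
    for (x, y) in left_pts:
        rows.append([x, 1.0, 0.0]); rhs.append(y)
        wts.append(np.exp(-lam * np.hypot(x, y)))
    for (x, y) in right_pts:
        rows.append([x, 0.0, 1.0]); rhs.append(y)
        wts.append(np.exp(-lam * np.hypot(x, y)))
    A, b, W = np.array(rows), np.array(rhs), np.diag(wts)
    f, *_ = np.linalg.lstsq(W @ A, W @ b, rcond=None)  # f = [s, c_l, c_r]
    s, c_l, c_r = f
    yaw = -np.arctan(s)                  # wall slope in frame B -> yaw
    cos_a = np.cos(np.arctan(s))
    y_off = -0.5 * (c_l + c_r) * cos_a   # offset from tunnel centreline
    width = abs(c_l - c_r) * cos_a       # perpendicular wall separation
    return yaw, y_off, width
```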
However, to construct the $A$ matrix in (4), there is a need to determine which rangefinders in the sensing array measured the left and the right tunnel walls, given by $\vec{p}_{xl}$ and $\vec{p}_{yr}$ respectively. Equation (2) can be rewritten as
$$\rho^{-1} = \rho_0^{-1} \cos(\theta - \alpha_0) \tag{8}$$

where $\rho$ is the expected sensor measurement for a sensor placed with orientation $\theta$, given the perpendicular distance, $\rho_0$, from the origin and the slope, $\alpha_0$.
Equation (2) does not accurately reflect the physical sensor model, as the rangefinders have a limited range and do not output negative ranges. A more accurate representation would be

$$\rho = \begin{cases} \rho_0 / \cos(\theta - \alpha_0), & r_{min} \le \rho_0 / \cos(\theta - \alpha_0) \le r_{max} \\ \text{no return}, & \text{otherwise} \end{cases} \tag{9}$$
An over-complete dictionary, $D$, may be defined with the $i$-th column, $d_i$, given by

$$d_i = \cos(\theta - \alpha_i) \tag{10}$$

where $\theta$ is a column vector of angles from $[-\pi, \pi]$ with discretization $\Delta\theta$, and $\alpha_i \in [-\pi, \pi]$ with discrete interval $\Delta\alpha$.
The set of points from the sparse sensing array that fall on the left or right tunnel surface can be found by solving the following optimization problem

$$\min_s \|W(p - Ds)\|_2^2 \quad \text{subject to} \quad \|s\|_0 \le N \tag{11}$$

where $W$ is a diagonal matrix with weights $w_{i,i} = e^{-(\lambda p_i)}$.
The minimization problem defined in (11) is a known NP-hard problem. However, such problems are well explored in the compressed sensing literature, and various approximation methods are widely used. The method of matching pursuit (MP) may be used to find an $N$-sparse approximation to the problem. The solution is denoted as $s^*$. Then, the points that fall on the same tunnel wall are given by the index set, $S$, of the non-zero entries of $Ds^*$, i.e.

$$S = \operatorname{supp}(Ds^*) = \{i \in \{1, \ldots, n\} : (Ds^*)_i \neq 0\} \tag{12}$$
Let $p_S$ denote the submatrix of $p$ containing only the elements with index in $S$; these points are fitted to one tunnel wall, and the remaining points are assigned to the opposite wall.
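A sketch of this wall-assignment step, under the stated formulation, is given below. The matching pursuit routine is generic; the rectification of the cosine atoms (zeroing entries where a sensor faces away from the candidate wall) is an assumption standing in for the limited-range model (9), and the inverse-range input follows (8).

```python
import numpy as np

def matching_pursuit(p, D, W, n_atoms):
    """Greedy N-sparse approximation of W p by columns of W D:
    min ||W (p - D s)||_2^2  subject to  ||s||_0 <= n_atoms."""
    Wp, WD = W @ p, W @ D
    norms = np.linalg.norm(WD, axis=0) + 1e-12
    residual = Wp.copy()
    s = np.zeros(D.shape[1])
    for _ in range(n_atoms):
        scores = np.abs(WD.T @ residual) / norms   # correlation per atom
        k = int(np.argmax(scores))
        coef = (WD[:, k] @ residual) / norms[k] ** 2
        s[k] += coef
        residual = residual - coef * WD[:, k]
    return s

def assign_wall_points(ranges, theta, alphas, lam=0.1, n_atoms=1):
    """Indices of the sensors whose returns are explained by one wall.

    ranges -- measured ranges per sensor, projected onto the x-y plane
    theta  -- mounting orientation of each sensor about z_B
    alphas -- discretized candidate wall slopes in (-pi, pi]
    """
    ranges, theta = np.asarray(ranges, float), np.asarray(theta)
    alphas = np.asarray(alphas)
    p = 1.0 / ranges                               # inverse ranges, per (8)
    D = np.maximum(np.cos(theta[:, None] - alphas[None, :]), 0.0)
    W = np.diag(np.exp(-(lam * ranges)))
    s = matching_pursuit(p, D, W, n_atoms)
    return np.flatnonzero(np.abs(D @ s) > 1e-9)    # S = supp(D s*), per (12)
```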
Pose Estimation in Vertical Shaft Environments
As vertical shafts are cylindrical, the problem reduces to a 2D form by intersecting the geometry with a local x-y plane, giving a circular cross-section parametrized as
$$r_0^2 = (x - x_0)^2 + (y - y_0)^2 \tag{13}$$

where $(x_0, y_0)$ is the centre of the circle in frame $B$, and $r_0$ is the radius of the circle.
Similarly, the parameters can be estimated by a least squares formulation to fit the sensor measurements to the parametric representation

$$A = [2p_x \ \ 2p_y \ \ 1], \qquad b = [p_x^2 + p_y^2] \tag{14}$$

$$f = [x_0 \ \ y_0 \ \ r_0^2 - x_0^2 - y_0^2]^T \tag{15}$$

Then, the local position of the robot within the vertical shaft is

$$\tilde{r} = [\tilde{x} \ \ \tilde{y}]^T = -[f_1 \ \ f_2]^T \tag{16}$$

The radius of the shaft can also be determined as

$$r_0 = \sqrt{f_3 + f_1^2 + f_2^2} \tag{17}$$
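A direct transcription of (14)-(17) into a least-squares circle fit might look like the following; the four-point example at the end is illustrative.

```python
import numpy as np

def fit_shaft_circle(points):
    """Least-squares circle fit following (14)-(17): solve A f = b with
    A = [2*px  2*py  1], b = px^2 + py^2, f = [x0, y0, r0^2 - x0^2 - y0^2].

    points -- Nx2 array of body-frame range returns projected onto the
              local x-y plane.
    Returns the robot position relative to the shaft centre, per (16),
    and the estimated shaft radius, per (17).
    """
    px, py = points[:, 0], points[:, 1]
    A = np.column_stack([2 * px, 2 * py, np.ones(len(px))])
    b = px ** 2 + py ** 2
    f, *_ = np.linalg.lstsq(A, b, rcond=None)
    x0, y0 = f[0], f[1]
    r0 = np.sqrt(f[2] + x0 ** 2 + y0 ** 2)
    return np.array([-x0, -y0]), r0

# Example: four returns from a 2 m radius shaft, robot 0.3 m off-centre.
pts = np.array([[1.7, 0.0], [-2.3, 0.0], [-0.3, 2.0], [-0.3, -2.0]])
pos, radius = fit_shaft_circle(pts)   # pos ~ [0.3, 0.0], radius ~ 2.0
```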
Design Optimization of Sparse Sensing Array
The design optimization solves for the placement of the sensors on the right-hand plane of the robot, exploiting the symmetry of the environment. The optimal sensor configuration is optimized over a range of tunnel parameters $\gamma$, i.e. $\gamma_i$ corresponds to $w$, $\alpha$ of a tunnel or $r$ of a shaft. For a sampled tunnel parameter $\gamma_i$, sensor readings from a set of robot poses $\eta$ within the tunnel are also simulated. The sensor noise is simulated as a random Gaussian distribution centred at the noise-free range measurement. To evaluate the performance of a sensor configuration, the root mean square (rms) error, $E$, is used. $E$ is defined as

$$E = \sqrt{\frac{1}{N_\gamma N_\eta} \sum_i \sum_j \|\tilde{r}_{i,j} - \tilde{r}^*_{i,j}\|_2^2} \tag{18}$$

where $\tilde{r}$ denotes the estimated position of the robot, from (6) and (16), calculated from the simulated sensors, and $\tilde{r}^*$ is the ground truth position of the robot. The subscripts $i,j$ denote that the pose was estimated from simulated sensor readings with tunnel parameter $\gamma_i$ and robot pose $\eta_j$. In this formulation, the errors across all the robot states are equally weighted, i.e. a 1 rad error in yaw estimation is equivalent to a 1 m displacement error in the tunnel.
Then, the optimal sensor placement can be obtained by minimizing the error function E, or
$$\min_{d,\Theta,\lambda} E \quad \text{where} \quad d \in \mathcal{D} = \{d \in \mathbb{R}^m : g(d) \le 0\} \tag{19}$$
where $g$ is a logarithmic barrier function that penalizes placement of the sensors near the constraint bounds. The design optimization also solves for the optimal $\lambda$, a parameter of the radial weighting function used in the weighted least squares and in the minimization problem in (11).
There are physical constraints on the robot that limit the feasible region for the placement of the sensors. In the case of aerial robots designed for visual inspection of the entire tunnel surface, the sensors cannot be placed within the field of view (FOV) of the camera, and the sensors need to be reasonably close to the mechanical components of the robot for ease of mounting and electrical wiring. These physical bounds can be described mathematically by a generic user-defined function. For simplicity, the infeasible region due to the camera FOV is defined as a rectangle given by $\|x\| < x_{min}$, and the region due to mechanical placement is defined as the area outside the bounding box with corners fixed on the centre of each propulsion system, i.e. $\|x\| > x_{max} \cup \|y\| > y_{max}$. The infeasible regions are also illustrated in the accompanying drawings.
The ToF sensors for the planar and front/rear configurations may overlap to reduce the total number of ToF sensors required. The ToF sensors may be complemented by optical flow sensors that are used for localization along the tunnel axis, and may also be complemented with a downward-pointing laser altimeter for measuring the altitude of the UAV within the covered infrastructure. The downward-pointing laser altimeter may or may not be part of the rear configuration.
Optimization Using Genetic Algorithm
Genetic algorithm (GA) is a nature-inspired evolutionary algorithm that uses mutation, crossover and selection to yield high-quality solutions to large optimization problems. In this case, the design optimization presented involves a large combinatorial search over a $(3 \times m)$-dimensional space for the optimal sensor configuration that minimizes the rms error $E$. GA tackles the large search space by discretizing the space into nodes, where each node is a possible location to mount a sensor. During the search, each candidate solution is coded into a gene, which contains the optimization variables $d$, $\Theta$ and $\lambda$. The rms error $E$ is directly used to evaluate the fitness of a particular gene. In each generation, a group of elites with the best fitness is guaranteed to survive to the next generation. The remaining genes are used to breed the next generation of candidate solutions, known as children, through crossover and mutation. The algorithm is allowed to evolve over many generations until the average change in the best fitness of the population stalls over a user-defined number of generations. The gene with the best fitness in the final generation is taken as the optimal solution to the optimization problem.
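A minimal real-coded GA of the kind described, with elitism, crossover, Gaussian mutation and a stall-based stopping rule, is sketched below; the population size, mutation rate and stopping parameters are illustrative, not the values used for the reported results.

```python
import numpy as np

rng = np.random.default_rng(0)

def genetic_minimize(fitness, n_vars, bounds, pop=60, elites=4,
                     mut_rate=0.1, stall_gens=25, max_gens=500):
    """Minimal real-coded genetic algorithm.

    fitness -- function mapping a gene (a vector of the d, Theta and
               lambda values) to the rms error E; lower is fitter.
    bounds  -- (n_vars, 2) array of per-variable lower/upper limits.
    """
    lo, hi = bounds[:, 0], bounds[:, 1]
    genes = rng.uniform(lo, hi, size=(pop, n_vars))
    best, stall = np.inf, 0
    for _ in range(max_gens):
        scores = np.array([fitness(g) for g in genes])
        order = np.argsort(scores)                  # ascending: best first
        genes, scores = genes[order], scores[order]
        if scores[0] < best - 1e-9:
            best, stall = scores[0], 0
        else:
            stall += 1
            if stall >= stall_gens:                 # best fitness stalled
                break
        children = [genes[i].copy() for i in range(elites)]   # elitism
        while len(children) < pop:
            a, b = genes[rng.integers(0, pop // 2, size=2)]   # parents
            mask = rng.random(n_vars) < 0.5                   # crossover
            child = np.where(mask, a, b)
            mutate = rng.random(n_vars) < mut_rate            # mutation
            child[mutate] += rng.normal(0, 0.1 * (hi - lo)[mutate])
            children.append(np.clip(child, lo, hi))
        genes = np.array(children)
    return genes[0]
```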
Numerical Results
A numerical simulation was conducted and the rms error $E$ was studied for the $m = 2$, 3 and 4 configurations, corresponding to layouts of $m_0 = 4$, 6 and 8 sensors on the robot. The optimal configurations from the GA and the parameters used for the numerical simulation are shown in Table I.
In all cases, the best and mean penalty values and the average distance between individuals converge, indicating that an optimal solution is found.
The optimal configurations output from the GA have good localization performance, with an rms error $E$ of at most 0.06 m (the $E$ for the M2 configuration); the rms error improves with increasing sensor count, with M3 and M4 having lower $E$. Increasing $m = 2$ to $m = 3$ results in a significant two-fold reduction of the rms error from 0.058 m to 0.03 m. A further increase to $m = 4$ reduces the rms error again, but the reduction is not as significant.
There may be an intuitive explanation for the GA result for M2. The M2 configuration has the sensors pointing to the immediate left and right of the robot. This ensures that the range measurements of the rangefinders remain within range throughout the simulated yaw range of ±60° at various poses within the tunnel. The GA results for M3 and M4 are more challenging to explain intuitively. The results from the GA are compared to sensor configurations that are placed heuristically. These configurations always result in a suboptimal $E$. Inspired by M2, M2′ has one sensor pointing 25° forward and the other pointing at 0°, to the immediate right. M2′ has a high $E$ of 1.189 m and a reduced yaw range of ±30°. M4′ has a configuration with sensors pointing at 25° intervals. The resultant $E$ is 0.804 m, with a reduced yaw range of ±40°, which is worse than M4. M6′ has an $E$ of 1.51 m and a reduced yaw range of ±40°. The placement of the sensors in the sparse sensing array through heuristics or trial and error is thus challenging. On average, the rms error of the GA configurations is about 36 times lower than that of the suboptimal configurations, and the GA is shown to be extremely powerful in producing an optimal sensor configuration with minimal $E$.
Experimental Results
A series of four experiments was performed to evaluate the performance of the optimal sensing configuration, with the proposed pose estimation, in autonomously navigating tunnel environments. In these experiments, when the robot was flight tested autonomously, the pilot only commanded the position of the robot along the longitudinal axis of the tunnel, i.e. along the $\hat{x}_L$-direction in horizontal tunnel environments and the $\hat{z}_L$-direction in vertical shaft environments. The remaining DOF were controlled by the on-board controller. The experiments in the horizontal tunnel are discussed first, followed by those performed in the vertical shaft.
Prototype Platform
The visual inspection system, also referred to herein as the camera system 102, may be independent of the aerial platform, supplied with its own battery and microcontroller. As such, all electrical wiring for power and digital signals runs through a central 25 mm diameter carbon-fibre rod that the visual inspection system rotates about. The carbon-fibre rod may be the central member 104. The arms 106 of the prototype 1300 may also be fabricated out of carbon-fibre composite. The rotors 108, or propulsion system, may include a DJI E1200 powertrain. The flight controller 516 may be a Pixhawk 2.1 flight controller with a built-in IMU, with an Intel Edison as companion computer. The range sensors 502 may include a lightweight array of six TeraRanger One rangefinders, weighing 10 g each. The sensor configuration is a hybrid of the M4 and M6 configurations from the GA results of the numerical simulation. The hybrid allows for redundancy of the sensing system. For example, if either of the front two rangefinders were to fail, the algorithm can fall back to an algorithm that uses only four rangefinders to continue the navigation mission. The prototype weighs 5 kg, inclusive of the 21000-mAh 6S 10C Li-Ion battery.
Table II shows the breakdown of the weights of the various subsystems. The sparse sensing array and additional companion computer contribute merely 2.8% of the total weight.
(a) includes the weights of the power electronics and wiring;
(b) includes the associated accessories, e.g. compass, power modules, etc.
Table III shows that, compared to similar UAVs documented in the academic literature, the sparse sensing system is on average 5 times lighter and consumes 12 times less power. The lean and low-power sensing system, as a result, enables the prototype to achieve 35 minutes of autonomous flight. The total weight of the proposed system is at most a quarter of that of conventional systems, with at least a ten-fold reduction in power consumption. These savings directly translate into improved flight endurance of the proposed platform.
(a) consists of six TeraRanger One rangefinders and a TeraRanger Hub for synchronisation;
(b) includes the Intel Edison and the essential breakout boards.
System Architecture
The proposed perception algorithm described in the earlier sections is implemented on an Intel Edison computer running a Debian-based Linux distribution. The Edison runs the Robot Operating System (ROS) middleware for low-latency data acquisition, inter-hardware communication and high-level processing tasks. It interfaces to the sparse planar sensing array, a downward-pointing altimeter and the IMU, and translates these sensory inputs into partial local position estimates within the tunnel. These estimates are fed into an EKF for full state estimation of the robot. A Pixhawk 2.1 autopilot running the PX4 flight stack outputs low-level commands for the control of the SWIRL platform, based on the pilot input and the state estimates of the robot.
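As an indication of how the data flow could be wired in ROS, the following node skeleton subscribes to the rangefinder topics and publishes a partial pose; all topic names, the sensor index layout and the crude geometric estimator are illustrative assumptions, not the actual SWIRL implementation.

```python
#!/usr/bin/env python
import math
import rospy
from sensor_msgs.msg import Range
from geometry_msgs.msg import PoseWithCovarianceStamped

N_SENSORS = 6
SENSOR_BASELINE = 0.4          # assumed spacing of the forward pair [m]
latest = [None] * N_SENSORS

def make_callback(i):
    def callback(msg):
        latest[i] = msg.range
    return callback

def estimate_partial_pose(r):
    """Crude stand-in for the perception algorithm: lateral offset from a
    left/right pair, yaw from two forward-angled sensors (assumed layout:
    0 = left, 1 = right, 2-3 = forward pair)."""
    y_offset = 0.5 * (r[0] - r[1])
    yaw = math.atan2(r[2] - r[3], SENSOR_BASELINE)
    return y_offset, yaw

def main():
    rospy.init_node('sparse_pose_estimator')
    for i in range(N_SENSORS):
        rospy.Subscriber('/teraranger_%d/range' % i, Range,
                         make_callback(i))
    pub = rospy.Publisher('partial_pose', PoseWithCovarianceStamped,
                          queue_size=1)
    rate = rospy.Rate(50)                     # 50 Hz estimation loop
    while not rospy.is_shutdown():
        if all(r is not None for r in latest):
            y_offset, yaw = estimate_partial_pose(latest)
            msg = PoseWithCovarianceStamped()
            msg.header.stamp = rospy.Time.now()
            msg.pose.pose.position.y = y_offset
            msg.pose.pose.orientation.z = math.sin(yaw / 2.0)
            msg.pose.pose.orientation.w = math.cos(yaw / 2.0)
            pub.publish(msg)                  # consumed by the EKF
        rate.sleep()

if __name__ == '__main__':
    main()
```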
Autonomous Flight in Tunnels
The first and second experiments were carried out in horizontal tunnels. In these experiments, the pilot manually controls the robot to take off, and then flies it to an initial position which is roughly aligned to the axis of the tunnel, i.e. $\tilde{r} = 0$. At this point, the pilot toggles to autonomous mode. In this mode, the pilot only retains manual control of the acceleration along the tunnel axis, $\hat{x}_L$.
Autonomous Flight in Shafts
The goal of the third and fourth experiments was to evaluate the autonomous flight performance in vertical shaft environments. Similar to the previous experiments, the pilot manually controls the robot to take off, and then flies it to an initial position roughly at the centre of the shaft, i.e. $\tilde{r} = 0$. At this point, the pilot toggles to autonomous mode. In this mode, the pilot only retains manual control of the acceleration along the shaft axis, $\hat{z}_L$.
Extended Flight in Shaft Environments
Lastly, to evaluate the endurance of the system, the prototype 1300 was tested in an indoor mock-up of the vertical shaft. In this experiment, the robot was commanded to hover at a predetermined height at the centre of the shaft. The prototype 1300 achieved a total flight time of 35 minutes and 41 seconds.
It will be appreciated to a person skilled in the art that the terminology used herein is for the purpose of describing various embodiments only and is not intended to be limiting of the present invention. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is understood that the specific order or hierarchy of blocks in the processes/flowcharts disclosed is an illustration of exemplary approaches. Based upon design preferences, it is understood that the specific order or hierarchy of blocks in the processes/flowcharts may be rearranged. Further, some blocks may be combined or omitted. The accompanying method claims present elements of the various blocks in a sample order, and are not meant to be limited to the specific order or hierarchy presented.
The previous description is provided to enable any person skilled in the art to practice the various aspects described herein. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects. Thus, the claims are not intended to be limited to the aspects shown herein, but are to be accorded the full scope consistent with the language of the claims, wherein reference to an element in the singular is not intended to mean “one and only one” unless specifically so stated, but rather “one or more.” The word “exemplary” is used herein to mean “serving as an example, instance, or illustration.” Any aspect described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other aspects. Unless specifically stated otherwise, the term “some” refers to one or more. Combinations such as “at least one of A, B, or C,” “one or more of A, B, or C,” “at least one of A, B, and C,” “one or more of A, B, and C,” and “A, B, C, or any combination thereof” include any combination of A, B, and/or C, and may include multiples of A, multiples of B, or multiples of C. Specifically, combinations such as “at least one of A, B, or C,” “one or more of A, B, or C,” “at least one of A, B, and C,” “one or more of A, B, and C,” and “A, B, C, or any combination thereof” may be A only, B only, C only, A and B, A and C, B and C, or A and B and C, where any such combinations may contain one or more member or members of A, B, or C. All structural and functional equivalents to the elements of the various aspects described throughout this disclosure that are known or later come to be known to those of ordinary skill in the art are expressly incorporated herein by reference and are intended to be encompassed by the claims. Moreover, nothing disclosed herein is intended to be dedicated to the public regardless of whether such disclosure is explicitly recited in the claims. The words “module,” “mechanism,” “element,” “device,” and the like may not be a substitute for the word “means.” As such, no claim element is to be construed as a means plus function unless the element is expressly recited using the phrase “means for.”
Number | Date | Country | Kind |
---|---|---|---|
10201802492Y | Mar 2018 | SG | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/SG2019/050167 | 3/26/2019 | WO | 00 |