This application claims the benefit of priority to Korean Patent Application No. 10-2023-0181268, filed in the Korean Intellectual Property Office on Dec. 13, 2023, the entire contents of which are incorporated herein by reference.
The present disclosure relates to an apparatus for recognizing a runway using an image and a method therefor.
Research on air mobility has been actively conducted for future traffic and transportation systems. Recently, research has focused on air mobility in an autonomous flight scheme that does not depend on a pilot. Air mobility may be divided into rotary-wing type aircraft and fixed-wing type aircraft.
The fixed-wing type air mobility requires a runway for takeoff and landing, and landing on the runway in the autonomous flight scheme requires a technology for determining the position of the runway during flight. A landing system of the air mobility commonly uses an instrument landing system (ILS) and a global positioning system (GPS).
However, because not every airport is equipped with the ILS, it is difficult to operate air mobility in the autonomous flight scheme at an airport without the ILS.
Furthermore, because a general landing system of the air mobility depends heavily on the GPS, the landing stability of the air mobility is inevitably vulnerable to GPS errors.
Thus, there is a need for a method for operating the air mobility in a wider range of places and for increasing the stability of the landing system of the air mobility.
The present disclosure relates to an apparatus for recognizing a runway using an image and a method therefor, and more particularly, relates to technologies for recognizing a runway using an image obtained by air mobility.
Some embodiments of the present disclosure can solve the above-mentioned problems occurring in the prior art while advantages achieved by the prior art are maintained intact.
An embodiment of the present disclosure can provide an apparatus for recognizing a runway using an image to allow air mobility to land in an autonomous flight scheme even in a place without an instrument landing system (ILS) and a method therefor.
An embodiment of the present disclosure can provide an apparatus for recognizing a runway using an image to more accurately recognize a runway depending on an altitude or a position of air mobility and a method therefor.
An embodiment of the present disclosure can provide an apparatus for recognizing a runway using an image to prevent a runway position information deviation from being generated in a process of changing an algorithm that recognizes the runway and a method therefor.
Technical problems to be solved by the present disclosure are not necessarily limited to the aforementioned problems, and other technical problems not mentioned herein can be solved by some embodiments, which can be understood from the following description by those skilled in the art to which the present disclosure pertains.
According to an embodiment of the present disclosure, an apparatus for recognizing a runway using an image may include a sensor device that obtains flight status information of air mobility and a processor that obtains information about a position of a runway in response to the flight status information obtained by the sensor device. The processor may obtain first position information about a runway position based on a first algorithm, in response to the flight status being an approach phase, may obtain second position information about the runway position based on a second algorithm different from the first algorithm, in response to the flight status being a ground roll phase, and may fuse the first position information with the second position information to obtain the information about the position of the runway, in response to the flight status corresponding to a phase change interval between the approach phase and the ground roll phase.
According to an embodiment, the processor may identify an altitude of the air mobility from the flight phase information and may determine that the air mobility is in the phase change interval, when the altitude of the air mobility is greater than or equal to zero and is less than or equal to a reference altitude.
According to an embodiment, the processor may reflect a first weight in the first position information to obtain first correction position information, may reflect a second weight, increasing as the first weight decreases, in the second position information to obtain second correction position information, and may add the first correction position information and the second correction position information to obtain the information about the position of the runway in the phase change interval.
According to an embodiment, the processor may determine the first weight to be smaller as the air mobility comes closer to an end point of the phase change interval and may determine the second weight such that the sum of the first weight and the second weight is constant.
According to an embodiment, the processor may determine the first weight to be smaller in magnitude as the altitude of the air mobility decreases.
According to an embodiment, the processor may determine the first weight to be smaller as the distance between the air mobility and the runway decreases.
According to an embodiment, the apparatus may further include a camera mounted on the air mobility. The processor may determine a relative position of the runway with respect to the air mobility from a first image obtained by the camera, using the first algorithm.
According to an embodiment, the processor may transform the first image into a first bird's eye view (BEV) image, may determine a reference projection point corresponding to coordinates onto which a camera principal point is projected on the first BEV image, and may determine a position error of a reference point of the runway for the air mobility as an actual error, based on a coordinate error between the reference projection point and the reference point of the runway.
According to an embodiment, the processor may obtain information about a position of a centerline of the runway from a second image obtained by the camera.
According to an embodiment, the processor may transform the second image into a second BEV image, may transform the second BEV image into a hue-saturation-value (HSV) image, and may extract a centerline of the runway from the HSV image.
According to an embodiment of the present disclosure, a method for recognizing a runway using an image may include determining a flight status of air mobility, based on flight phase information, obtaining first position information about a runway position based on a first algorithm, in response to the flight status being an approach phase, obtaining second position information about the runway position based on a second algorithm different from the first algorithm, in response to the flight status being a ground roll phase, and fusing the first position information obtained using the first algorithm with the second position information obtained using the second algorithm to obtain information about a position of the runway, in response to the flight status being in a phase change interval between the approach phase and the ground roll phase.
According to an embodiment, the determining of the flight status of the air mobility may include identifying an altitude of the air mobility and determining that the air mobility is in the phase change interval, when the altitude of the air mobility is greater than or equal to zero and is less than or equal to a reference altitude.
According to an embodiment, the fusing of the first position information and the second position information to obtain the information about the position of the runway may include reflecting a first weight in the first position information to obtain first correction position information, reflecting a second weight, increasing as the first weight decreases, in the second position information to obtain second correction position information, and adding the first correction position information and the second correction position information.
According to an embodiment, the fusing of the first position information and the second position information to obtain the information about the position of the runway may include determining the first weight to be smaller as the air mobility comes closer to an end point of the phase change interval and determining the second weight such that the sum of the first weight and the second weight is constant.
According to an embodiment, the determining of the first weight to be smaller as the air mobility comes closer to the end point of the phase change interval may include determining the first weight to be smaller in magnitude as the altitude of the air mobility decreases.
According to an embodiment, the determining of the first weight to be smaller as the air mobility comes closer to the end point of the phase change interval may include determining the first weight to be smaller as the distance between the air mobility and the runway decreases.
According to an embodiment, the obtaining of the first position information may include obtaining a first image using a camera mounted on the air mobility and determining a relative position of the runway with respect to the air mobility from the first image.
According to an embodiment, the determining of the relative position of the runway with respect to the air mobility from the first image may include transforming the first image into a first BEV image, determining a reference projection point corresponding to coordinates onto which a camera principal point is projected on the first BEV image, and determining a position error of a reference point of the runway for the air mobility as an actual error, based on a coordinate error between the reference projection point and the reference point of the runway.
According to an embodiment, the obtaining of the second position information may include obtaining information about a position of a centerline of the runway from a second image obtained by the camera.
According to an embodiment, the obtaining of the information about the position of the centerline of the runway from the second image may include transforming the second image into a second BEV image, transforming the second BEV image into an HSV image, and extracting a centerline of the runway from the HSV image.
The above and other features and advantages of the present disclosure can be more apparent from the following detailed description taken in conjunction with the accompanying drawings, in which:
Hereinafter, some example embodiments of the present disclosure will be described in detail with reference to the example drawings. In adding reference numerals to the components of each drawing, identical components can be designated by identical numerals even when they are displayed in different drawings. In addition, a detailed description of well-known features or functions can be omitted so as not to unnecessarily obscure the gist of the present disclosure.
In the present disclosure, terms such as “first”, “second”, “A”, “B”, “(a)”, “(b)”, and the like, may be used. Such terms can be used merely to distinguish one element from another element and do not necessarily limit the corresponding elements irrespective of the order or priority of those elements. Furthermore, unless otherwise defined, terms including technical and scientific terms used herein can have the same meaning as is generally understood by those skilled in the art to which the present disclosure pertains. Terms such as those defined in a generally used dictionary can be interpreted as having meanings equal to contextual meanings in the relevant field of art and are not to be interpreted as having ideal or excessively formal meanings unless clearly defined as such in the present application.
Hereinafter, some example embodiments of the present disclosure will be described in detail with reference to the accompanying drawings.
Referring to the drawings, an apparatus for recognizing a runway according to an embodiment may include a camera 10, a sensor device 20, and a processor 30.
The camera 10 may obtain an image in a heading direction of air mobility AM. For example, the camera 10 may be located on a lower portion of the air mobility AM at a position at which it may capture the forward direction. The image obtained by the camera 10 may be used in a process of determining a position of the runway.
The sensor device 20 may obtain body status information and flight phase information.
The body status information may include yaw, pitch, and roll of an airframe. To obtain the body status information, the sensor device 20 may use an acceleration sensor and a gyro sensor, for example. The acceleration sensor may be a 3-axis acceleration sensor capable of measuring acceleration of the air mobility AM in three axial directions. The gyro sensor may measure an angular velocity of the air mobility AM, that is, the angle through which the air mobility rotates over time.
The flight phase information may be information for dividing a flight status of the air mobility AM.
The processor 30 may obtain information about a position of the runway, based on the image provided from the camera 10.
A description will be given of the method for obtaining the information about the position of the runway depending on the flight status with reference to the drawings.
Referring to the drawings, a flight status of the air mobility AM may be divided into an approach phase and a ground roll phase.
The approach phase may be a phase in which the air mobility AM descends in a state in which it is close to a runway RL.
The ground roll phase may include a duration when the air mobility AM rolls out on the ground. The ground roll phase may include a timing when the air mobility AM touches down on the runway RL. A starting point of the ground roll phase may be a point at which the air mobility AM touches down or may be a point prior to the point at which the air mobility AM touches down. For example, the starting point of the ground roll phase may be determined within a range in which the altitude of the air mobility AM is between 0 m and 5 m.
As shown in the drawings, the processor 30 may use a first image processing algorithm in the approach phase and may use a second image processing algorithm in the ground roll phase.
The first image processing algorithm may recognize the runway RL from the image and may evaluate a position of the runway RL. The evaluation of the position of the runway RL may include a procedure for determining a horizontal position of the runway RL and may particularly include a procedure for obtaining a center point of a starting line of the runway RL.
The second image processing algorithm may perform a procedure for detecting a centerline of the runway RL from the image.
A description will be given in detail of the method for obtaining the information about the position of the runway in the processor 30.
In operation S510, a processor 30 may determine a flight status of air mobility AM based on flight phase information.
The processor 30 may receive the flight phase information from a sensor device 20 and/or a flight control computer (FCC) 50.
The flight phase information may be classified as an approach phase or a ground roll phase. The approach phase may be determined on the basis of a timing when the air mobility AM descends to land on the runway RL.
In operation S520, the processor 30 may obtain information about a position of the runway RL from a first image, based on a first image processing algorithm, in response to the flight status being the approach phase.
The first image may be obtained by a camera 10 mounted on the air mobility AM.
The first image processing algorithm may detect a runway from the first image and determine an actual runway position based on back projection of the image.
The first image processing algorithm may detect a runway starting line. The runway starting line may be the end of the runway RL close to the point at which the air mobility AM touches down and may refer to a line perpendicular to the direction in which the air mobility AM rolls out on the ground.
The processor 30 may detect a polygon using a mask from the first image and may determine the two laterally farthest points on the polygon as both end points of the runway starting line.
Because the runway starting line appears large in the runway area of the first image and is well recognized, the processor 30 may increase recognition performance by obtaining runway position information based on the runway starting line.
The processor 30 may determine a relative position of the runway RL with respect to the air mobility AM on a heading coordinate system.
The processor 30 may transform the first image into a bird's eye view (BEV) image and may determine the relative position of the runway RL with respect to the air mobility AM on the heading coordinate system, based on a position deviation between an image center, which can be the center of the BEV image, and a center point corresponding to the center of the runway starting line.
The processor 30 may transform the first image into the first BEV image and may determine the relative position of the runway RL based on the first BEV image. The relative position of the runway RL may be determined based on a coordinate error between a reference projection point onto which a principal point of the camera 10 is projected on the first BEV image and a reference point of the runway RL. The reference projection point may be coordinates on the first BEV image, which correspond to principal point coordinates of the camera 10, which are obtained on the heading coordinate system. The reference point of the runway RL may be a center point of the runway starting line.
The processor 30 may obtain a projection distance between the camera 10 and the runway RL along a projection direction of the camera 10 and may obtain the principal point coordinates of the camera 10 on the heading coordinate system based on the projection distance and the camera principal point on an image coordinate system.
The processor 30 may reflect a scale factor in the coordinate error between the reference projection point of the camera principal point and the reference point on the first BEV image and may determine an actual distance error between an extension line in a heading direction of the air mobility AM and the center point of the runway starting line. The reference point may be set to the center point of the starting line of the runway RL. The scale factor may be obtained based on an actual length of the runway starting line and a length of the runway starting line on the first BEV image.
Furthermore, the processor 30 may reflect body status information of the air mobility AM in the process of transforming the first image into the first BEV image. The body status information may include yaw, pitch, roll, or any combination thereof, of the air mobility AM. The processor 30 may receive information about yaw, pitch, and roll of the air mobility AM from an inertial navigation system (INS) 40.
In operation S530, the processor 30 may obtain information about a position of the runway from the second image, based on a second image processing algorithm, in response to the flight status being the ground roll phase.
The second image may be an image obtained by the camera 10 and can be the same image as the first image.
The second image processing algorithm may recognize a centerline of the runway RL from the second image and determine a lateral relative position of the runway RL.
The processor 30 may transform the second image into a second BEV image, based on the second image processing algorithm. The processor 30 may extract a centerline of the runway RL included in the second BEV image and may represent the centerline of the runway RL as a straight line equation on a coordinate axis of the second BEV image.
The centerline of the runway RL may refer to a straight line formed along a direction in which the air mobility AM rolls out on the ground. The processor 30 may transform the second BEV image into a hue-saturation-value (HSV) image and may extract the centerline of the runway RL from the HSV image.
The FCC 50 may control the air mobility AM based on the runway position information, and for example, may control a motor, a transmission, or the like.
A description will be given below of a method for recognizing a runway according to an embodiment of the present disclosure with reference to the drawings.
In operation S610, a processor 30 may divide a flight status into an approach phase, a phase change interval, and a ground roll phase, based on flight phase information.
The approach phase may be a phase in which air mobility AM descends in a state in which it is close to a runway RL.
The phase change interval may be an interval that changes from the approach phase to the ground roll phase. An end point of the phase change interval may correspond to a starting point of the ground roll phase. For example, the end point of the phase change interval may be determined within a range in which the altitude of the air mobility AM is between 0 m and 5 m. A starting point of the phase change interval may be determined on the basis of altitude and may be set to an altitude 5 m to 10 m higher than the end point of the phase change interval.
The ground roll phase may include a duration when the air mobility AM rolls out on the ground. The starting point of the ground roll phase may be a point at which the air mobility AM touches down or a point prior to the point at which the air mobility AM touches down.
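As an illustrative sketch only (not part of the original disclosure), the altitude-based phase decision described above might look as follows; the threshold values are the example figures given in this description.

```python
def classify_flight_phase(h: float, h0: float = 5.0, h1: float = 10.0) -> str:
    """Classify the flight status from altitude h in meters.

    h0 and h1 are the example interval bounds from the description
    (end point 5 m, starting point 10 m); a real system would also use
    touchdown detection and other flight phase information.
    """
    if h > h1:
        return "approach"        # above the phase change interval
    if h > h0:
        return "phase_change"    # fuse both algorithms here
    return "ground_roll"         # at or below the interval end point
```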
In operation S620, the processor 30 may obtain first position information about a runway position based on a first algorithm, in response to the flight status being the approach phase.
The first algorithm may use, but is not limited to, the first image processing algorithm described above.
The first position information may include a center point of a runway starting line.
In operation S630, the processor 30 may obtain second position information about a runway position based on a second algorithm, in response to the flight status being the ground roll phase.
The second algorithm may use, but is not limited to, the second image processing algorithm described above.
The second position information may include information about the centerline of the runway, and the information about the centerline of the runway may include a center point of the runway starting line.
In operation S640, the processor 30 may obtain information about a position of the runway RL using the first algorithm and the second algorithm, in response to the flight status being the phase change interval. In the phase change interval, the processor 30 may reflect a first weight in first position information obtained using the first algorithm to obtain first correction position information.
The closer the air mobility AM is to the end point of the phase change interval, the smaller the processor 30 may set the first weight to be and the larger the processor 30 may set a second weight to be.
The first weight and the second weight may be determined based on an altitude of the air mobility AM. For example, the lower the altitude of the air mobility AM, the smaller the first weight may be set to be and the larger the second weight may be set to be.
The first weight and the second weight may be determined based on a distance between the air mobility AM and the runway RL. For example, the closer the air mobility AM is to the runway RL, the smaller the first weight may be set to be and the larger the second weight may be set to be.
As shown in the drawings, the sum of the first weight and the second weight may be “1”. For example, when the first weight is W1 and when the second weight is W2, W2 may be set as 1−W1.
A description will be given of an embodiment of determining the first weight W1 and the second weight W2 based on the altitude of the air mobility AM. When the phase change interval is between altitude h1 and altitude h0, the first weight W1 may be obtained based on Equation 1 below.
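The body of Equation 1 is not reproduced in this text; from the worked example in the next paragraph, it may be reconstructed as:

$$W_1=\frac{h-h_0}{h_1-h_0},\qquad h_0\le h\le h_1 \tag{1}$$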
In Equation 1 above, h may be the altitude of the air mobility AM, h0 may be the altitude of the end point of the phase change interval, and h1 may be the altitude of the starting point of the phase change interval. For example, when h0 is set to 5 m and h1 is set to 10 m, the first weight W1 may be calculated as (h−5)/5 according to the altitude h of the air mobility AM. Thus, when the altitude h of the air mobility AM is 10 m, the first weight W1 may be calculated as “1” and the second weight W2 may be calculated as “0”. Similarly, when the altitude h of the air mobility AM is 8 m, the first weight W1 may be calculated as “0.6” and the second weight W2 may be calculated as “0.4”. Furthermore, when the altitude h of the air mobility AM is 5 m, the first weight W1 may be calculated as “0” and the second weight W2 may be calculated as “1”.
The processor 30 may multiply the first position information by the first weight W1 to obtain the first correction position information and may multiply the second position information by the second weight W2 to obtain the second correction position information.
The processor 30 may fuse the first correction position information with the second correction position information to obtain information for a position of the runway RL. For example, the processor 30 may add the first correction position information and the second correction position information to obtain information about a position of the runway RL.
The first position information may be coordinates of a center point of a runway starting line. The second position information may be a centerline of the runway RL and may include coordinates of the center point of the runway starting line.
When the first position information includes (x1, y1) and the second position information includes (x2, y2), the processor 30 may obtain information about a position of a runway as (x1×W1+x2×W2, y1×W1+y2×W2) in the phase change interval.
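A minimal sketch of the fusion just described (illustrative only; the function and variable names are assumptions, not part of the original disclosure):

```python
def fuse_runway_position(p1, p2, h, h0=5.0, h1=10.0):
    """Blend the first and second position information in the phase
    change interval.

    p1, p2: (x, y) runway reference points from the first and second
    algorithms; h: altitude in meters; h0, h1: example interval bounds.
    """
    w1 = min(max((h - h0) / (h1 - h0), 0.0), 1.0)  # first weight (Equation 1)
    w2 = 1.0 - w1                                  # second weight, W2 = 1 - W1
    return (p1[0] * w1 + p2[0] * w2,
            p1[1] * w1 + p2[1] * w2)

# Worked example from the description: at h = 8 m, W1 = 0.6 and W2 = 0.4.
```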
Hereinafter, a description will be given of a detailed embodiment for respective procedures.
A description will be given below of a method for recognizing a runway based on the first image processing algorithm with reference to the drawings.
The first image processing algorithm may obtain information about a position of the runway RL based on a first image obtained from the sky by the air mobility AM. The first image processing algorithm may use coordinate transformation for the image coordinate system to which the first image belongs. The coordinate transformation of the image coordinate system may be performed based on Euler angles including roll, pitch, and yaw of the air mobility AM.
A description will be given below of a coordinate system used in an embodiment of the present disclosure with reference to the drawings.
A body coordinate system may refer to a coordinate system fixed to air mobility AM and may be a 3-axis coordinate system including an XB-axis, a YB-axis, and a ZB-axis. The XB-axis may refer to a heading direction of the air mobility AM, the YB-axis may refer to a wing direction of the air mobility AM, and the ZB-axis may refer to a lower direction of the air mobility AM. The body coordinate system may be obtained using Euler angles ϕ, θ, and ψ of the air mobility AM on a world coordinate system.
The heading coordinate system may be a coordinate system that rotates by the yaw angle ψ of the air mobility AM on the world coordinate system.
An image coordinate system may be a two-dimensional (2D) coordinate system of an image obtained by the camera 10 and may include an XI-axis and a YI-axis around the origin OI at a right upper end.
A camera coordinate system may be a 3-axis coordinate system including an XC-axis, a YC-axis, and a ZC-axis and may be obtained using a camera calibration matrix K on the image coordinate system.
A runway coordinate system may refer to a coordinate system for control of an airframe and may be a 3-axis coordinate system including an XR-axis, a YR-axis, and a ZR-axis. The runway coordinate system may be a coordinate system that rotates by a yaw angle ψr of the runway on the world coordinate system.
The world coordinate system may be a 3-axis coordinate system defined by the origin OW and an XW-axis, a YW-axis, and a ZW-axis and may be a north-east-down (NED) coordinate system.
A projection distance between a camera 10 and a runway RL may refer to a distance to a projection point CH at which a principal point of the camera 10 is projected onto a heading coordinate system. A description will be given below of a method for calculating a distance between the principal point of the camera 10 and a camera principal point projected onto a world coordinate system with reference to the drawings.
When the basis vectors of the world coordinate system are defined as $\{\hat{x}_w, \hat{y}_w, \hat{z}_w\}$, the basis vectors of the body coordinate system as $\{\hat{x}_b, \hat{y}_b, \hat{z}_b\}$, and the basis vectors of the camera coordinate system as $\{\hat{x}_c, \hat{y}_c, \hat{z}_c\}$, relationships such as Equations 2 to 4 may be established between the respective basis vectors.
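The bodies of Equations 2 to 4 are not reproduced in this text; the Euler rotation matrices they rely on have the standard forms below (the sign convention is an assumption):

$$R_x(\phi)=\begin{bmatrix}1&0&0\\0&\cos\phi&\sin\phi\\0&-\sin\phi&\cos\phi\end{bmatrix},\quad
R_y(\theta)=\begin{bmatrix}\cos\theta&0&-\sin\theta\\0&1&0\\\sin\theta&0&\cos\theta\end{bmatrix},\quad
R_z(\psi)=\begin{bmatrix}\cos\psi&\sin\psi&0\\-\sin\psi&\cos\psi&0\\0&0&1\end{bmatrix}$$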
In Equations 2 to 4 above, Rx, Ry, and Rz may be rotational matrices based on the Euler angles.
A distance Z′ between a point CH at which a camera principal point is projected onto a heading coordinate system and the camera principal point may be represented as Equation 5 below by a relationship between the world coordinate system and the camera coordinate system.
A description will be given below of a method for obtaining 3D coordinates of the camera principal point.
Hereinafter, in the specification, RAB may be the rotational matrix for transforming a vector from frame A to frame B, and tAB may be the translation matrix expressing frame A with respect to frame B.
A relationship such as Equation 7 below may be established between any first reference point $(\hat{x}_i, \hat{y}_i)$ on the image coordinate system and a corresponding second reference point $(\hat{x}_c, \hat{y}_c, \hat{z}_c)$ on the camera coordinate system.
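The body of Equation 7 is not reproduced in this text; given that K is the camera intrinsic matrix, the standard pinhole relationship it describes is plausibly of the form (s denotes an arbitrary scale):

$$s\begin{bmatrix}\hat{x}_i\\\hat{y}_i\\1\end{bmatrix}=K\begin{bmatrix}\hat{x}_c\\\hat{y}_c\\\hat{z}_c\end{bmatrix}$$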
In Equation 7 above, K may be the intrinsic parameter matrix of the camera.
A rotational matrix RiC of the image coordinate system and the camera coordinate system may be represented as Equation 8 below.
The 3D coordinates of the principal point of the camera may be obtained as Equation 9 below.
In Equation 9 above, CI may refer to the camera principal point on the image coordinate system, CC may refer to the coordinates corresponding to the camera principal point on the camera coordinate system, and CB may refer to the coordinates corresponding to the camera principal point on the body coordinate system. CH may refer to the coordinates corresponding to the camera principal point on the heading coordinate system.
When there is no transformation by the translation matrix between the body coordinate system and the heading coordinate system and when there is no rotation transformation by roll and pitch, the coordinates of the camera principal point on the heading coordinate system may be represented as Equation 10 below.
When substituting Equation 10 above into Equation 9 above, the coordinates of the camera principal point on the heading coordinate system may be represented as Equation 11 below.
A processor 30 may transform pixels of a first image into a first BEV image using the translation matrix. The translation matrix may be obtained using a translation matrix generation function. Source points of the first image may be used to obtain the translation matrix. Coordinates of the source points may be corrected by reflecting yaw, pitch, and roll of air mobility AM.
A description will be given below of a method for obtaining a translation matrix for transforming the first image into the first BEV image with reference to the drawings.
First to fourth source points src1 to src4 in the first image may be used as inputs for generating the translation matrix. As shown in the drawings, the coordinates of the source points may be corrected by reflecting the yaw, pitch, and roll of the air mobility AM.
Referring to the drawings, a vanishing line VL on the first image may be obtained based on Equation 12 below.
In Equation 12 above, fy may refer to the focal length of the camera in the y-axis direction.
Referring to the drawings, when the vertical distance between a principal point VP0 of the image and the vanishing line VL is dθ, a tilt angle θtilt of the camera 10 may be represented as Equation 13 below.
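The body of Equation 13 is not reproduced in this text; from the standard geometry relating a vanishing-line offset dθ to the focal length fy, a plausible reconstruction is:

$$\theta_{tilt}=\tan^{-1}\!\left(\frac{d_\theta}{f_y}\right) \tag{13}$$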
Referring to the drawings, the translation matrix TMb may be obtained using a translation matrix generation function. For example, the first to fourth source points src1 to src4 shown in the drawings may be provided as inputs of the OpenCV getPerspectiveTransform function to generate the translation matrix TMb.
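Since the description names the OpenCV getPerspectiveTransform function, a minimal sketch of the BEV transformation may look as follows (illustrative only; the source and destination coordinates are placeholders, not values from the disclosure):

```python
import cv2
import numpy as np

def to_bev(image, src_points, dst_points):
    """Warp a camera image into a bird's eye view (BEV) image.

    src_points: the four source points src1 to src4 on the input image,
    already corrected for yaw, pitch, and roll as described above.
    dst_points: where those points should land on the BEV image.
    """
    src = np.float32(src_points)                  # shape (4, 2)
    dst = np.float32(dst_points)                  # shape (4, 2)
    tm_b = cv2.getPerspectiveTransform(src, dst)  # translation matrix TMb
    h, w = image.shape[:2]
    return cv2.warpPerspective(image, tm_b, (w, h))
```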
A description will be given below of a method for determining a relative position of a runway with respect to air mobility with reference to the drawings.
Referring to the drawings, the processor 30 may determine a coordinate error between a reference projection point T_CH and a reference point XE_C of the runway RL on the first BEV image.
The coordinate error may include an x-error xE and a y-error yE. The x-error xE may indicate an x-coordinate difference between the reference projection point T_CH and the reference point XE_C of the runway RL. The y-error yE may indicate a y-coordinate difference between the reference projection point T_CH and the reference point XE_C of the runway RL.
When coordinates of both ends of a runway starting line on the first BEV image are (u1, v1) and (u2, v2), respectively, each of the x-error xE and the y-error yE may be obtained using Equation 14 below.
A processor 30 may calculate an actual error (XE, YE), based on the x-error xE and the y-error yE. The actual error (XE, YE) may be obtained by multiplying a coordinate error (xE, yE) by a scale factor Ks. The scale factor Ks may be obtained based on a distance w between coordinates (u1, v1) and coordinates (u2, v2) and an actual length of the runway starting line RSL. When the actual length of the runway starting line RSL is WR, the scale factor Ks may be obtained by dividing WR by w.
As a result, a center point PH of the runway starting line RSL for the air mobility AM on a heading coordinate system may be represented as Equation 15 below.
In Equation 15 above, CHx may refer to the x-coordinate of the camera principal point on the heading coordinate system, and CHy may refer to the y-coordinate of the camera principal point on the heading coordinate system.
As in Equation 16 below, the center point PR of the runway starting line RSL on the runway coordinate system may be obtained using the rotational matrix RHR and the center point PH of the runway starting line RSL on the heading coordinate system.
The processor 30 may obtain the center point PR of the runway starting line RSL on the runway coordinate system, based on a first image processing algorithm. The center point PR of the runway starting line RSL on the runway coordinate system may be the first position information described above.
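As an illustrative sketch (not part of the original disclosure; the bodies of Equations 14 to 16 are not reproduced in this text, so the exact forms below, including the PH sum and the form of R_HR, are assumptions consistent with the surrounding description):

```python
import numpy as np

def runway_center_on_runway_frame(p1, p2, t_ch, c_h, W_R, psi_rel):
    """Estimate the center point PR of the runway starting line RSL.

    p1, p2: endpoints (u1, v1), (u2, v2) of the starting line on the
        first BEV image (pixels).
    t_ch: reference projection point T_CH on the first BEV image (pixels).
    c_h: camera principal point (CHx, CHy) on the heading frame (meters).
    W_R: actual length of the runway starting line (meters).
    psi_rel: assumed relative yaw between heading and runway frames (rad).
    """
    (u1, v1), (u2, v2) = p1, p2
    ref = np.array([(u1 + u2) / 2.0, (v1 + v2) / 2.0])  # reference point XE_C
    xe, ye = ref - np.asarray(t_ch, dtype=float)        # coordinate error
    ks = W_R / np.hypot(u2 - u1, v2 - v1)               # scale factor Ks
    XE, YE = ks * xe, ks * ye                           # actual error (meters)
    ph = np.array([c_h[0] + XE, c_h[1] + YE])           # PH (assumed Equation 15 form)
    r_hr = np.array([[np.cos(psi_rel),  np.sin(psi_rel)],
                     [-np.sin(psi_rel), np.cos(psi_rel)]])  # assumed R_HR
    return r_hr @ ph                                    # PR (Equation 16)
```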
Furthermore, the processor 30 may remove noise of the first position information using a filter. For example, the processor 30 may increase the reliability of the first position information, using a Kalman filter.
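The disclosure names only a Kalman filter without details; one illustrative realization uses OpenCV's generic filter (the constant-velocity dynamics and noise covariances below are assumptions):

```python
import cv2
import numpy as np

kf = cv2.KalmanFilter(4, 2)  # state (x, y, vx, vy); measurement (x, y)
kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                [0, 1, 0, 1],
                                [0, 0, 1, 0],
                                [0, 0, 0, 1]], np.float32)
kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                 [0, 1, 0, 0]], np.float32)
kf.processNoiseCov = 1e-3 * np.eye(4, dtype=np.float32)      # assumed tuning
kf.measurementNoiseCov = 1e-1 * np.eye(2, dtype=np.float32)  # assumed tuning

def filter_position(measured_xy):
    """Smooth the first position information with the Kalman filter."""
    kf.predict()
    est = kf.correct(np.float32(measured_xy).reshape(2, 1))
    return float(est[0, 0]), float(est[1, 0])
```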
Referring to the drawings, the processor 30 may perform camera calibration for coordinate transformation of the second image IMG2 obtained by a camera 10.
The processor 30 may transform the second image IMG2, on which the camera calibration has been performed, into a second BEV image. The second BEV image may be an RGB image.
The processor 30 may transform the second BEV image in RGB format into an HSV image.
The processor 30 may extract a feature point of a runway RL from the HSV image. For example, the processor 30 may extract a centerline of the runway RL from the HSV image.
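As a minimal sketch of this pipeline (illustrative only; the HSV thresholds and the ROI parameter are placeholders, not values from the disclosure):

```python
import cv2
import numpy as np

def extract_centerline(bev_bgr, roi,
                       lower_hsv=(0, 0, 200), upper_hsv=(179, 40, 255)):
    """Extract the runway centerline from a BEV image.

    roi: (x, y, w, h) region of interest, sized per the altitude as
    described below. The HSV bounds roughly select white paint and are
    placeholder values.
    """
    x, y, w, h = roi
    hsv = cv2.cvtColor(bev_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv[y:y + h, x:x + w],
                       np.array(lower_hsv, dtype=np.uint8),
                       np.array(upper_hsv, dtype=np.uint8))
    pts = cv2.findNonZero(mask)
    if pts is None:
        return None
    # Fit a straight line (vx, vy, x0, y0) through the centerline pixels.
    return cv2.fitLine(pts, cv2.DIST_L2, 0, 0.01, 0.01)
```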
A description will be given below of a detailed procedure of the second image processing algorithm.
Referring to the drawings, the processor 30 may transform the coordinate system of the second image into a world coordinate system.
Because the altitude of the air mobility AM is “0” or very low, the second image processing algorithm may disregard the altitude of the air mobility AM in the process of transforming the coordinate system of the second image. A horizontal coordinate and a vertical coordinate on the image may correspond to an x-coordinate and a y-coordinate, respectively.
The reference points P1, P2, P3, and P4 on the image may be transformed into (X, Y) coordinates of the world coordinate system based on Equation 17 below.
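The body of Equation 17 is not reproduced in this text; given the proportional constant w defined below, it plausibly has the standard homography form:

$$X=\frac{p_{11}x+p_{12}y+p_{13}}{w},\qquad Y=\frac{p_{21}x+p_{22}y+p_{23}}{w},\qquad w=p_{31}x+p_{32}y+p_{33} \tag{17}$$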
In Equation 17 above, w may be the proportional constant and may be represented as $p_{31}x+p_{32}y+p_{33}$.
Furthermore, the processor 30 may use actual information about a subject in the image so that pixel units correspond to meter units on the world coordinate system.
Referring to the drawings, the processor 30 may set a region of interest (ROI) in the HSV image to detect the centerline CL of the runway RL.
The size of the ROI may be set to exclude other objects capable of being confused with the centerline CL of the runway RL. For example, the size of the ROI may be set such that side lines SL of the runway RL are not detected. Thus, the size of the ROI may be set differently according to the altitude of the air mobility AM.
According to an embodiment, the higher the altitude of the air mobility AM, the smaller the ROI may be set to be, and the lower the altitude of the air mobility AM, the larger the ROI may be set to be, as shown in the drawings.
After detecting the centerline CL of the runway RL, the processor 30 may output information about a position of the centerline CL. The information about the position of the centerline CL may be second position information. Alternatively, the information about the position of the centerline CL may be a point at which the centerline CL of the runway RL starts in a runway starting line.
According to an embodiment, as in Equation 18 below, the processor 30 may represent a position of the runway centerline CL as a linear equation.
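The body of Equation 18 is not reproduced in this text; given that l is described below as a lateral offset in meters, one plausible form is:

$$y = a\,x + l \tag{18}$$

where the x-axis runs along the roll-out direction and the intercept l is the horizontal error between the air mobility AM and the centerline CL.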
In Equation 18 above, l may be the error between the runway centerline CL and the center line of the HSV image. The center line of the HSV image may be a vertical line parallel to the x-axis on the world coordinate system. l may be a metric indicating the horizontal error between the air mobility AM and the runway centerline CL and may be output in meters.
The information about the position of the runway RL, which can be obtained using the above-mentioned method, may be used in a process of generating a control signal of the air mobility AM.
For example, a center point of the runway starting line, which can be obtained based on a first image processing algorithm, may be used to determine a touchdown position of the air mobility AM.
A lateral relative distance of the runway RL, which can be obtained based on a second image processing algorithm before touchdown, may be used to determine a touchdown position. Furthermore, a slope of the runway centerline CL, which can be obtained based on the second image processing algorithm after the touchdown, may be used in a process of generating a control command for ground roll of the air mobility AM.
The control signal of the air mobility AM may include a control signal for controlling ailerons and a control signal for controlling a rudder and a steering device.
Referring to the drawings, a computing system may include a processor 1100, a memory 1300, and a storage 1600 connected with each other.
The processor 1100 may be a central processing unit (CPU) or a semiconductor device that processes instructions stored in the memory 1300 and/or the storage 1600. A storage medium may include the memory 1300 and the storage 1600, which may include various types of volatile or non-volatile storage media. For example, the memory 1300 may include a read only memory (ROM) 1310 and a random access memory (RAM) 1320.
Accordingly, the operations of the method or algorithm described in connection with the embodiments disclosed in the specification may be directly implemented with a hardware module, a software module, or a combination of the two, executed by the processor 1100. The software module may reside on a storage medium (that is, the memory 1300 and/or the storage 1600) such as a RAM, a flash memory, a ROM, an EPROM, an EEPROM, a register, a hard disk, a removable disk, or a CD-ROM.
The example storage medium may be coupled to the processor 1100. The processor 1100 may read information from the storage medium and may write information to the storage medium. Alternatively, the storage medium may be integrated with the processor 1100. The processor and the storage medium may reside in an application specific integrated circuit (ASIC). The ASIC may reside within a user terminal. Alternatively, the processor and the storage medium may reside in the user terminal as separate components. The processor 1100 can be implemented as one or more processors, which can be co-located and/or remote.
According to an embodiment, the apparatus for recognizing the runway may determine a position of a runway without using an instrument landing system (ILS) and a global positioning system (GPS), thus guiding air mobility to land even in an airport without the ILS.
Furthermore, according to an embodiment, the apparatus for recognizing the runway may differently use an image processing algorithm depending on a position or an altitude of the air mobility, thus more accurately determining a position of the runway depending on the position or the altitude of the air mobility.
Furthermore, according to an embodiment, the apparatus for recognizing the runway may reflect a weight that varies with the altitude or the position of the air mobility in information about the position of the runway obtained in a process of changing an image processing algorithm, thus reducing the position information deviation generated by different image processing algorithms.
In addition, various effects ascertained directly or indirectly through the present disclosure may be provided.
Hereinabove, although the present disclosure has been described with reference to example embodiments and the accompanying drawings, the present disclosure is not limited thereto and may be variously modified and altered by those skilled in the art to which the present disclosure pertains without departing from the spirit and scope of the present disclosure claimed in the following claims.
Therefore, the example embodiments of the present disclosure are not intended to limit the technical spirit of the present disclosure but are provided for illustrative purposes. The scope of the present disclosure can be construed on the basis of the accompanying claims, and technical ideas within scopes equivalent to the claims can be included in the scope of the present disclosure.