The present invention relates to an airspace information processing device, an airspace information processing method, and a non-transitory computer-readable medium storing an airspace information processing program.
Today, various navigation systems for monitoring vehicles on the Earth have been put into practical use. In order to manage the operation of aircraft, whose travel distance is longer than that of other means of transportation, it is necessary to calculate the azimuth and distance of the aircraft over a wide area. Aircraft navigation systems are generally required to process large-scale spatial information accurately and efficiently over a wide area, such as a country's territory and airspace or a flight information region (FIR).
For example, each air route of an aircraft or the like can be represented by a line segment connecting two points on a true sphere. In this case, in order to ensure the safety of the aircraft or the like, it is extremely important to determine whether or not two air routes intersect with each other. Further, each aircraft flies in an airspace set in the air in which the operation of the aircraft is allowed, thereby ensuring the safety of the aircraft. In this case, if adjacent airspaces overlap one another, several aircraft may enter the overlapping airspace, which poses a problem in terms of safety. Accordingly, it is necessary for the navigation systems mentioned above to appropriately design the airspace for ensuring the safety of the aircraft.
As an example of such navigation systems, a method for determining a positional relationship, that is, whether an arbitrary point is inside or outside a polygon on the Earth, has been proposed. In this example, a search direction of each side of a polygon (in other words, a circumferential direction of a closed curve) for defining an airspace is taken into consideration to determine which one of the right and left regions with respect to the circumferential direction is the airspace.
Japanese Patent Application No. 2013-271712 proposes a technique for detecting, for various airspaces, an intersection point between line segments forming each airspace, and determining whether a vehicle is on the inside or outside of the airspace.
[Patent Literature 1] Japanese Unexamined Patent Application Publication No. 2012-88902
However, the present inventor has found that the above-mentioned techniques have the following problems. That is, depending on flight rules or airspace design specifications, it may be required to manage a large airspace extending across countries or continents. In this case, for example, it can be assumed that the circumferential direction of a closed curve for defining an airspace differs from country to country, or from airspace to airspace. The technique disclosed in Patent Literature 1 takes into consideration the circumferential direction of a closed curve (the search direction of each side of a polygon), but does not take into consideration a case where the circumferential direction of a closed curve for defining an airspace to be managed varies. If a plurality of airspaces, including airspaces defined by closed curves with different circumferential directions, are managed by the technique disclosed in Patent Literature 1, an unacceptable error in airspace design, such as false recognition as to the inside or outside region of an airspace due to a difference in the circumferential direction, may occur.
The present invention has been made in view of the above-mentioned circumstances, and an object of the present invention is to manage, in a unified manner, a plurality of airspaces each having an unspecified circumferential direction.
Another object of the present invention made in view of the above-mentioned circumstances is to correctly and accurately determine positional relationships in regions having arbitrary shapes and sizes on the ground.
An airspace information processing device according to an aspect of the present invention includes: transfer means for generating a transferred image by transferring a whole or part of a closed curve representing an outline of an airspace from its original position to another position on a spherical surface in such a manner that the transferred image has no intersection point with the closed curve, the closed curve being formed of one or more line segments on the spherical surface; line segment generation means for generating, from the one or more line segments forming the closed curve, a determination line segment having an intersection point with the transferred image and having no intersection point with other line segments forming the closed curve; and airspace recognition means for recognizing, as the airspace, a region in which the determination line segment is present, the region being one of two regions on the spherical surface that are defined by the closed curve.
An airspace information processing method according to another aspect of the present invention includes: causing transfer means to generate a transferred image by transferring a whole or part of a closed curve representing an outline of an airspace from its original position to another position on a spherical surface in such a manner that the transferred image has no intersection point with the closed curve, the closed curve being formed of one or more line segments on the spherical surface; causing line segment generation means to generate, from the one or more line segments forming the closed curve, a determination line segment having an intersection point with the transferred image and having no intersection point with other line segments forming the closed curve; and causing airspace recognition means to recognize, as the airspace, a region in which the determination line segment is present, the region being one of two regions on the spherical surface that are defined by the closed curve.
A non-transitory computer-readable medium storing an airspace information processing program according to still another aspect of the present invention causes a computer to execute: processing for generating a transferred image by transferring a whole or part of a closed curve representing an outline of an airspace from its original position to another position on a spherical surface in such a manner that the transferred image has no intersection point with the closed curve, the closed curve being formed of one or more line segments on the spherical surface; processing for generating, from the one or more line segments forming the closed curve, a determination line segment having an intersection point with the transferred image and having no intersection point with other line segments forming the closed curve; and processing for recognizing, as the airspace, a region in which the determination line segment is present, the region being one of two regions on the spherical surface that are defined by the closed curve.
According to the present invention, a plurality of airspaces each having an unspecified circumferential direction can be managed in a unified manner.
An airspace information processing device 100 according to a first exemplary embodiment will be described. The airspace information processing device 100 is a device that manages, in a unified manner, pieces of information on a plurality of airspaces which are each defined by a closed curve formed of one or more line segments and have an unspecified circumferential direction. The airspace information processing device 100 is configured using hardware resources such as a computer system.
First, line segments forming a closed curve will be described as premises for understanding an airspace. Line segments on a true sphere can be roughly divided into the following three types.
A line segment connecting two points on the true sphere in the shortest distance
A line segment connecting a point P1 and a point P2 to each other on a true sphere CB (on the ground) will be described.
Assuming that P represents a point on the line segment L connecting the point P1 and the point P2 to each other on the true sphere CB and sa represents the cosine of the angle formed between the unit normal vector Va and the position vector of the point P, sa is represented by the following formula (2).
[Formula 2]
$(\vec{V_a} \cdot \vec{P}) = s_a \qquad (2)$
Since it is apparent that the unit normal vector Va and the line segment L are orthogonal to each other, the cosine sa is 0. Accordingly, the point P on the line segment L can be defined as a point that satisfies the following formula (3).
[Formula 3]
$(\vec{V_a} \cdot \vec{P}) = 0 \qquad (3)$
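The membership test of formula (3) can be sketched in code. This is an illustrative sketch, not part of the specification; the helper names (`unit`, `cross`, `dot`, `on_great_circle`) are assumptions, and points are given as XYZ unit vectors on the true sphere:

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def unit(v):
    n = math.sqrt(dot(v, v))
    return tuple(c / n for c in v)

def on_great_circle(p1, p2, p, eps=1e-9):
    # Va is the unit normal of the plane through P1, P2, and the center;
    # formula (3): a point P on the great circle satisfies (Va . P) = 0.
    va = unit(cross(p1, p2))
    return abs(dot(va, p)) < eps
```

Note that formula (3) selects the whole great circle through P1 and P2; restricting the test to the shorter segment between the two points requires an additional range check not shown here.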
A circle on the true sphere CB will be described.
[Formula 4]
$\vec{V_d} = \vec{P_0}$
$(\vec{V_d} \cdot \vec{P}) = s_d \qquad (4)$
where sd represents the cosine of the angle formed between the point P0 and the point P on the true sphere CB, and is expressed by the following formula (5).
An arc on the true sphere CB will be described. The arc on the true sphere CB can be understood as being a set of points at the distance r from the point P0 on the true sphere CB.
A case where the direction from a start point to an end point of the arc is counterclockwise will be described.
[Formula 6]
$\vec{V_e} = \vec{P_0}$
$(\vec{V_e} \cdot \vec{P}) = s_e \qquad (6)$
where se represents the cosine of the angle formed between the point P0 and the point P on the true sphere, and is expressed by the following formula (7).
A case where the direction from a start point to an end point of an arc is clockwise will be described.
[Formula 8]
$\vec{V_e} = -\vec{P_0}$
$(\vec{V_e} \cdot \vec{P}) = s_e \qquad (8)$
se is equal in magnitude to the cosine of the angle formed between the point P0 on the true sphere CB and an arbitrary point P on the arc, but has a negative sign. se is represented by the following formula (9).
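The three curve types above share one algebraic form: each is the set of points P satisfying (V · P) = s for some axis vector V and constant s. The sketch below collects this representation; it is illustrative only (hypothetical helper names, angles in radians), not the device's actual data format:

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def unit(v):
    n = math.sqrt(dot(v, v))
    return tuple(c / n for c in v)

def segment_constraint(p1, p2):
    # Formula (3): V = Va (unit normal of the plane through P1, P2, O), s = 0.
    return unit(cross(p1, p2)), 0.0

def circle_constraint(p0, r):
    # Formula (4): V = Vd = P0, s = sd = cos(r).
    return p0, math.cos(r)

def arc_constraint(p0, r, clockwise):
    # Formulas (6) and (8): Ve = P0 for a counterclockwise arc;
    # Ve = -P0, with the sign of se inverted, for a clockwise arc.
    sign = -1.0 if clockwise else 1.0
    return tuple(sign * c for c in p0), sign * math.cos(r)
```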
Next, an airspace set on the true sphere will be described.
In summary, it can be understood that, when an airspace is defined, the following two pieces of information are required.
Specification of one or more line segments surrounding the airspace.
Specification of a direction (counterclockwise or clockwise) when the closed curve formed of the one or more line segments surrounding the airspace is viewed from the outside of the true sphere.
Here, it is assumed that the airspace information processing device 100 according to this exemplary embodiment manages a considerably large airspace on the true sphere. Accordingly, it is necessary to collectively manage pieces of airspace information created by different subjects, such as organizations, corporations, countries, and the like.
In this case, a start point and an end point (for example, the points P1 and P2 shown in
On the other hand, it is necessary to carefully manage the direction information for the following reason. That is, the direction of the closed curve is determined artificially. Therefore, the direction of the closed curve may vary among the organizations, corporations, countries, and the like that manage the airspace. For example, it can be assumed that the direction of the closed curve is specified as counterclockwise in a country A, while the direction of the closed curve is specified as clockwise in a country B. In this case, the direction of the closed curve is defined as counterclockwise in a system using the airspace information of the country A. Accordingly, if the line segment information created in the country B is input to the system of the country A to recognize the airspace, the system of the country A recognizes the region indicated by the line segment information of the country B as being outside of the airspace. That is, in such a case, false recognition of the airspace occurs.
In order to avoid this, it is possible to specify the direction information for each piece of line segment information created by different subjects, such as an organization, a corporation, and a country. However, in existing systems, it is not assumed that a wide range of airspace is managed like in the airspace information processing device 100 according to this exemplary embodiment. Accordingly, the existing systems do not have any function for adding the direction information for specifying the direction of the closed curve to the line segment information for specifying the airspace. Even if the direction information is added, the amount of information to be input to the system increases, and if the direction information is erroneously specified, a problem similar to that described above arises.
The area of an airspace defined by a closed curve is generally smaller than half of the surface area of the Earth, as is obvious from the intended use thereof. Therefore, when the area of the airspace is compared with the area of the region outside of the airspace, the smaller area can be discriminated as being the airspace. However, a vast number of calculations are required to obtain the area of each region defined by a closed curve on the sphere, which is not suitable for processing for simply recognizing an airspace. Particularly when a plurality of airspaces are managed, a vast number of calculations are required merely for enabling the system to recognize an airspace, and thus it is not practical.
On the other hand, the airspace information processing device 100 according to this exemplary embodiment can recognize an airspace accurately with a small number of calculations based on the airspace information with various directions of closed curves. The airspace information processing device 100 will be described in detail below.
An operation of the airspace information processing device 100 according to this exemplary embodiment will be described. The airspace information processing device 100 performs an inside/outside determination on an airspace defined by a determination target closed curve, based on the relationship between two closed curves (the determination target closed curve and a transferred image) that represent the outlines of two airspaces spatially isolated from each other.
First, the closed curve reading unit 1 reads the determination target closed curve AZ1. At this time, a circumferential direction is not given to the determination target closed curve AZ1, and the determination target closed curve AZ1 represents only the outline of the airspace. Therefore, it is unclear which one of the two regions on the true sphere CB defined by the determination target closed curve AZ1 corresponds to the airspace. Specifically, in this case, the closed curve reading unit 1 reads line segment information specifying the determination target closed curve AZ1 which is preliminarily stored in the storage unit 5. In the example shown in
The transfer processing unit 21 of the transfer unit 2 generates the transferred image AZ2 by transferring the determination target closed curve AZ1 read by the closed curve reading unit 1 from its original position to another position on the true sphere. In this exemplary embodiment, the transfer processing unit 21 generates, as the transferred image AZ2, an inverted transferred image by transferring the determination target closed curve AZ1 to a point-symmetrical position about the center of the true sphere CB.
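Point-symmetric transfer about the center of the true sphere amounts to negating every coordinate of each vertex. A minimal sketch, assuming a closed curve is held as a list of XYZ vertices (an illustrative representation, not the device's actual data format):

```python
def invert_transfer(curve):
    # Map each vertex (x, y, z) to its antipode (-x, -y, -z),
    # producing the inverted transferred image AZ2.
    return [(-x, -y, -z) for (x, y, z) in curve]
```

Note that this mapping reverses the circumferential direction of the curve.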
As described above, the airspace information processing device 100 performs the inside/outside determination on the determination target closed curve AZ1 based on the positional relationship between two airspaces, which are spatially isolated from each other, and a line segment drawn between the two airspaces. Accordingly, it is necessary to ensure that the determination target closed curve AZ1 and the transferred image AZ2 are spatially isolated. Therefore, in this case, the transfer processing unit 21 of the transfer unit 2 determines whether or not the determination target closed curve AZ1 and the transferred image AZ2 have an intersection point. The intersection point described herein does not include a contact point between the determination target closed curve AZ1 and the transferred image AZ2. When the determination target closed curve AZ1 and the transferred image AZ2 have an intersection point, it is impossible to determine the circumferential direction, and thus the processing is cancelled.
When the determination target closed curve AZ1 and the transferred image AZ2 are spatially isolated (i.e., when they have no intersection point), the line segment generation unit 3 generates a line segment that extends to the transferred image AZ2 from a point on the line segment closest to the transferred image AZ2 among the line segments LA1 to LA4 of the determination target closed curve AZ1.
The generation of a line segment (step S14) will be described in more detail.
An arbitrary point P0 (also referred to as a first point) is set on an arbitrary line segment among the line segments forming an airspace.
A temporary line segment Lp (also referred to as a first line segment) having an intersection point with a line segment forming the transferred image AZ2 is drawn from the point P0.
Intersection points between the line segment Lp and the line segments of the determination target closed curve AZ1 other than the line segment on which the point P0 is set are obtained.
Among the intersection points obtained as described above, an intersection point closest to the transferred image AZ2 is selected as a point PA. Assume herein that the intersection point includes the point P0 which is an endpoint of the line segment Lp.
In the temporary line segment Lp, the interval between the point PA and a point on the transferred image AZ2 is set as a determination line segment Ld. In this case, as the point on the transferred image AZ2, for example, a point (also referred to as a second point) that is closest to the determination target closed curve AZ1 among the intersection points between the temporary line segment Lp and the line segments forming the transferred image AZ2 is used. However, the definition of the point on the transferred image AZ2 is not limited to this.
By the above-described steps S41 to S45, the generation of a line segment can be carried out in the above-described step S14.
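The selection in steps S43 to S45 can be reduced to bookkeeping along the temporary line segment Lp. The sketch below is an illustrative abstraction, not the device's implementation: intersection points are given as parameter values along Lp, parameterized so that larger values lie closer to the transferred image AZ2:

```python
def determination_interval(az1_hits, az2_hits):
    # az1_hits: parameters where Lp crosses the determination target
    #           closed curve AZ1 (including the endpoint P0);
    # az2_hits: parameters where Lp crosses the transferred image AZ2.
    pa = max(az1_hits)   # PA: the AZ1 intersection closest to AZ2
    p2 = min(az2_hits)   # second point: the AZ2 intersection closest to AZ1
    return pa, p2        # the determination line segment Ld spans [pa, p2]
```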
Referring again to
In the two regions defined by the closed curve representing the airspace, the region located on the left side when the boundary between the regions is followed in the direction in which the airspace is defined is represented by A1, and the region located on the right side is represented by A2. Since it is apparent that the transferred image AZ2 is located outside of the determination target closed curve AZ1, it is apparent that the determination line segment Ld extends outward from the line segment that defines the determination target closed curve AZ1.
In this case, when the determination line segment Ld is present on the right side as viewed from the line segment having an intersection point (that is, the point PA) with the determination line segment Ld, i.e., in the right-side region A2, it can be determined that the left-side region A1 represents the airspace.
Further, when the determination line segment Ld is present on the left side as viewed from the line segment having an intersection (that is, the point PA) with the determination line segment Ld, i.e., in the left-side region A1, it can be determined that the right-side region A2 represents the airspace.
As described above, in step S5, it can be recognized which one of the right and left regions of the determination target closed curve AZ1 represents the airspace by determining in which one of the right and left regions of the closed curve (the line segments forming the airspace) the determination line segment Ld is present.
After that, the circumferential direction of the recognized airspace may be set so as to be identical to the circumferential direction of the closed curve set by the airspace information processing device 100. For example, when the circumferential direction of the airspace is defined to be counterclockwise, the circumferential direction is the direction in which the determination line segment Ld is seen on the right side. When the circumferential direction of the airspace is defined to be clockwise, the circumferential direction is the direction in which the determination line segment Ld is seen on the left side.
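The left/right decision can be sketched with vector products: at a point PA on the sphere the outward normal is PA itself, so the local "left" of a travel direction t is PA × t. This is an illustrative sketch under that convention, with assumed helper names:

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def ld_side(pa, seg_dir, ld_dir):
    # On which side of the boundary line segment does Ld leave,
    # as viewed along seg_dir at PA (outward normal = PA)?
    left = cross(pa, seg_dir)
    return 'left' if dot(ld_dir, left) > 0 else 'right'

def airspace_side(pa, seg_dir, ld_dir):
    # Ld extends outward, so the airspace is the region on the
    # opposite side from Ld.
    return 'right' if ld_side(pa, seg_dir, ld_dir) == 'left' else 'left'
```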
Note that the above description is made assuming that the transferred image AZ2 is generated, but the entire airspace need not necessarily be transferred. Instead, only a part of the airspace on the closed curve forming the determination target closed curve AZ1 may be transferred. Further, a part of the airspace on the closed curve to be transferred is not necessarily a line segment, but instead may be a point. Furthermore, the line segment Lp passing through the transferred line segment or the transferred point may be generated. When the line segment Lp passes through the transferred point, the location where the transferred point is present on the line segment Lp is also referred to as an intersection point, for convenience of explanation. However, this is applicable only when it is apparent that the transferred point is not included in the determination target closed curve AZ1. In this case, the above-mentioned detection of an intersection point (step S12) may be omitted, which is advantageous as the number of calculations is reduced.
Instead of the transferred point, another point that is apparently not included in the determination target closed curve AZ1 may be used.
For example, practically, it is highly unlikely that an airspace including the south pole is set, and thus the south pole can be used as another point described above.
An airspace information processing device according to a second exemplary embodiment will be described. In this exemplary embodiment, modified examples for the method of generating the transferred image AZ2 will be described. In the first exemplary embodiment, the inverted transferred image of the determination target closed curve AZ1 is used as the transferred image AZ2. However, any image having no intersection point with the determination target closed curve AZ1 can be used as the transferred image AZ2, and thus the modified examples for the method of generating the transferred image AZ2 can be applied.
A center-of-gravity point G of the determination target closed curve AZ1 is obtained, and a vector OG connecting the center-of-gravity point G and the center O of the true sphere CB is obtained. Further, an image obtained by rotating and duplicating the determination target closed curve AZ1 by a predetermined angle (for example, 90°, 120°, or 180°) using, as a rotation axis, a vector perpendicular to the vector OG and passing through the center O of the true sphere CB is set as the transferred image AZ2. In this case, the calculation of the center-of-gravity point G of the determination target closed curve AZ1 requires a considerable number of calculations.
For example, a plurality of points (XYZ perpendicular coordinates) are set at regular intervals on a closed curve surrounding the determination target closed curve AZ1, and the average vector of the position vectors of the plurality of set points is obtained. Further, an image obtained by rotating and duplicating the determination target closed curve AZ1 by a predetermined angle (for example, 90°, 120°, or 180°) using a vector perpendicular to the average vector passing through the center O of the true sphere CB as a rotation axis is set as the transferred image AZ2. In this case, the calculation of the average vector is easier than the calculation of the center-of-gravity point G of the determination target closed curve AZ1, and thus the number of calculations can be reduced.
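The average vector of this modified example, together with a perpendicular rotation axis through O, can be sketched as follows (hypothetical helper names; a robust implementation would also guard against a near-zero average vector):

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def unit(v):
    n = math.sqrt(dot(v, v))
    return tuple(c / n for c in v)

def average_vector_axis(points):
    # Average the sampled position vectors, then cross the average with
    # a basis vector that is not parallel to it, giving a rotation axis
    # through O perpendicular to the average vector.
    n = len(points)
    avg = tuple(sum(p[i] for p in points) / n for i in range(3))
    basis = (0.0, 0.0, 1.0) if abs(avg[2]) <= max(abs(avg[0]), abs(avg[1])) else (1.0, 0.0, 0.0)
    return avg, unit(cross(avg, basis))
```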
For example, a plurality of points are set at regular intervals on a closed curve surrounding the determination target closed curve AZ1. Average latitude and longitude coordinates composed of average values of the latitudes and longitudes of the plurality of set points are obtained and a vector connecting the average latitude and longitude coordinates and the center O of the true sphere CB is obtained. Further, an image obtained by rotating and duplicating the determination target closed curve AZ1 by a predetermined angle (for example, 90°, 120°, or 180°) using, as a rotation axis, a vector which is perpendicular to the obtained vector and passes through the center O of the true sphere CB is set as the transferred image AZ2. In this case, the calculation of the average latitude and longitude coordinates is easier than the calculation of the center-of-gravity point G of the determination target closed curve AZ1, and thus the number of calculations can be reduced.
A vector connecting the center O of the true sphere CB and an arbitrary point on a closed curve surrounding the determination target closed curve AZ1 is obtained. Further, an image obtained by rotating and duplicating the determination target closed curve AZ1 by a predetermined angle (for example, 90°, 120°, or 180°) using, as a rotation axis, a vector which is perpendicular to the obtained vector and passes through the center O of the true sphere CB is set as the transferred image AZ2. In this case, it is only necessary to obtain an arbitrary point on a closed curve surrounding the determination target closed curve AZ1, and thus the number of calculations can be reduced.
Two points are set in such a manner that the distance between the two points on a closed curve surrounding the determination target closed curve AZ1 is maximum, and a middle point between the set two points is obtained. Further, a vector connecting the middle point and the center O of the true sphere CB is obtained. Furthermore, an image obtained by rotating and duplicating the determination target closed curve AZ1 by a predetermined angle (for example, 90°, 120°, or 180°) using, as a rotation axis, a vector which is perpendicular to the obtained vector and passes through the center O of the true sphere CB is set as the transferred image AZ2. In this case, it is only necessary to obtain a middle point, and thus the number of calculations can be reduced.
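Each of Modified Examples 1 to 5 then rotates and duplicates AZ1 about an axis through the center O. A minimal Rodrigues-rotation sketch (an assumed implementation, not the device's; the axis must be a unit vector and the angle is in radians):

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def rotate_about_axis(p, axis, angle):
    # Rodrigues' formula:
    # p' = p cos(a) + (axis x p) sin(a) + axis (axis . p)(1 - cos(a)).
    c, s = math.cos(angle), math.sin(angle)
    k = cross(axis, p)
    d = dot(axis, p)
    return tuple(p[i] * c + k[i] * s + axis[i] * d * (1.0 - c) for i in range(3))

def rotate_curve(curve, axis, angle):
    # Duplicate a closed curve (list of XYZ vertices) rotated about the
    # axis to obtain a candidate transferred image AZ2.
    return [rotate_about_axis(p, axis, angle) for p in curve]
```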
The rotation of coordinates by an arbitrary angle in three dimensions in the modified examples described above involves floating-point calculations with three parameters and trigonometric functions, so that a computing error is likely to occur. On the other hand, Modified Example 6 relates to a method of generating the transferred image AZ2 in which such a computing error does not occur in principle.
The XYZ axes are set on the true sphere CB as shown in, for example,
The rotation about the first rotation axis (Z-axis) indicates that an X-Y plane (X-Y coordinates) rotates about the Z-axis. The rotation about the second rotation axis (X-axis) indicates that a Y-Z plane (Y-Z coordinates) rotates about the X-axis. The rotation about the third rotation axis (Y-axis) indicates that a Z-X plane (Z-X coordinates) rotates about the Y-axis.
The triaxial rotation method will be described in detail below.
First, the determination target closed curve AZ1 is rotated about the Z-axis by 180° to generate a transferred image. The image obtained in this case is represented as a transferred image AZ2_Z. In this case, coordinates (x, y, z) on the determination target closed curve AZ1 are transferred to coordinates (−x, −y, z). Although the term “triaxial rotation method” includes “rotation”, it can be understood that, in practice, it is only necessary to perform a simple operation for inverting the signs of x and y coordinates of coordinate information defining a closed curve surrounding the determination target closed curve AZ1.
It is detected whether the determination target closed curve AZ1 and the transferred image AZ2_Z have an intersection point. In this case, the intersection point detection process can be performed by a method similar to that in the above-mentioned step S12.
When the determination target closed curve AZ1 and the transferred image AZ2_Z have no intersection point, the transferred image AZ2_Z is set as the transferred image AZ2, and the processing is terminated.
When the determination target closed curve AZ1 and the transferred image AZ2_Z have an intersection point, the determination target closed curve AZ1 is rotated about the X-axis by 180° to generate a new transferred image. The image obtained in this case is represented as a transferred image AZ2_X. In this case, coordinates (x, y, z) on the determination target closed curve AZ1 are transferred to coordinates (x, −y, −z). Although the term “triaxial rotation method” includes “rotation”, it can be understood that, in practice, it is only necessary to perform a simple operation for inverting the signs of y and z coordinates of coordinate information defining a closed curve surrounding the determination target closed curve AZ1.
It is detected whether the determination target closed curve AZ1 and the transferred image AZ2_X have an intersection point. The intersection point detection process in this case may be performed by a method similar to the above-mentioned step S12.
When the determination target closed curve AZ1 and the transferred image AZ2_X have no intersection point, the transferred image AZ2_X is set as the transferred image AZ2, and the processing is terminated.
When the determination target closed curve AZ1 and the transferred image AZ2_X have an intersection point, the determination target closed curve AZ1 is rotated about the Y-axis by 180° to generate a new transferred image. The image obtained in this case is represented as a transferred image AZ2_Y. In this case, coordinates (x, y, z) on the determination target closed curve AZ1 are transferred to coordinates (−x, y, −z). Although the term “triaxial rotation method” includes “rotation”, it can be understood that, in practice, it is only necessary to perform a simple operation for inverting the signs of x and z coordinates of coordinate information defining a closed curve surrounding the determination target closed curve AZ1.
It is detected whether the determination target closed curve AZ1 and the transferred image AZ2_Y have an intersection point. The intersection point detection process in this case can be performed by a method similar to the above-mentioned step S12.
When the determination target closed curve AZ1 and the transferred image AZ2_Y have no intersection point, the transferred image AZ2_Y is set as the transferred image AZ2, and the processing is terminated.
When the determination target closed curve AZ1 and the transferred image AZ2_Y have an intersection point, the creation of the transferred image AZ2 is cancelled and the processing is terminated.
Although the term “triaxial rotation method” includes “rotation”, in practice, it is only necessary to perform a simple operation for inverting the sign of coordinate information defining a closed curve surrounding the determination target closed curve AZ1. Therefore, the number of calculations can be reduced as compared with those in Modified Examples 1 to 5 described above.
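The three-step fallback of the triaxial rotation method can be sketched as pure sign flips. In this illustrative sketch, a curve is assumed to be a list of XYZ vertices, and `curves_intersect` is a stand-in for the step-S12-style intersection check (both names are assumptions, not the device's API):

```python
def triaxial_transfer(curve, curves_intersect):
    # Try 180-degree rotations about Z, X, and Y in turn; each is a
    # sign flip of two coordinates, so no rounding error can occur.
    flips = [
        lambda v: (-v[0], -v[1], v[2]),   # about Z: (x, y, z) -> (-x, -y, z)
        lambda v: (v[0], -v[1], -v[2]),   # about X: (x, y, z) -> (x, -y, -z)
        lambda v: (-v[0], v[1], -v[2]),   # about Y: (x, y, z) -> (-x, y, -z)
    ]
    for flip in flips:
        image = [flip(v) for v in curve]
        if not curves_intersect(curve, image):
            return image   # adopt this candidate as the transferred image AZ2
    return None            # all three candidates intersect: creation is cancelled
```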
Since the inverted transferred image described in the first exemplary embodiment is generated by transferring the determination target closed curve AZ1 to a point-symmetrical position about the center of the true sphere CB, the circumferential direction of the determination target closed curve AZ1 is opposite to the circumferential direction of the inverted transferred image. On the other hand, in the triaxial rotation method, the transferred image AZ2 can be generated while maintaining the circumferential direction unchanged.
In addition, for example, in Modified Examples 1 to 5, the amount of rotation of a closed curve can be arbitrarily determined. A closed curve may be rotated a plurality of times until the determination target closed curve AZ1 and the transferred image AZ2 have no intersection point. When a closed curve is rotated a plurality of times, Modified Examples 1 to 5 may be combined as appropriate.
A geographic information management device according to a third exemplary embodiment will be described. This exemplary embodiment illustrates a specific example of the detection of an intersection point described above with reference to step S13.
The storage device 31 can store a database storing data and programs to be supplied for processing in the operation unit 32. For example, various types of storage devices, such as a hard disk drive and a flash memory, can be applied to the storage device 31. Specifically, the storage device 31 stores a basic form database D1 and an airspace information database D2.
The basic form database D1 holds information provided in advance that is unique to the true sphere CB, such as its radius R.
The airspace information database D2 includes coordinate information indicating a line segment or an airspace on the true sphere CB.
The storage device 31 can also store a program PRG1 for specifying arithmetic processing for detecting an intersection point between line segments to be described later.
The operation unit 32 is capable of reading the program and database from the storage device 31, and performing necessary arithmetic processing. The operation unit 32 is composed of, for example, a CPU (Central Processing Unit).
Next, the intersection point detecting operation of the intersection point detection unit 22 will be described.
First, the operation unit 32 reads the program PRG1. The program PRG1 is a program for determining whether or not two line segments on the true sphere CB have an intersection point by using the basic form database D1 and the airspace information database D2. Thus, the operation unit 32 functions as a shape determination device including the candidate point detection unit 34 and the detection unit 35. The program PRG1 is read out from, for example, the storage device 31.
This exemplary embodiment has been described above assuming that the operation unit 32 is composed of a computer that reads the program PRG1. However, the operation unit 32 may instead be configured as a device in which the candidate point detection unit 34 and the detection unit 35 are each formed as a physical entity.
Next, the operation unit 32 reads out the basic form database D1 and the airspace information database D2 from the storage device 31.
The operation unit 32 substitutes the information included in the basic form database D1 and the airspace information database D2 into the formula specified by the program PRG1, thereby performing the intersection point detecting operation.
The intersection point detecting operation in step S23 will be described in detail below. In the following formulas and drawings, a superscript arrow denotes a vector quantity indicating a point on the true sphere CB (on the ground surface). For ease of explanation, all vector quantities are normalized. Specifically, a position vector representing a point on the true sphere CB is normalized by dividing it by the radius R of the true sphere CB included in the basic form database D1. The normalized vector is hereinafter referred to simply as a vector.
On the true sphere CB, an airspace can be defined as a region surrounded by one or more line segments that do not intersect with each other. In general, a line segment on the true sphere CB is an arc. An arc can be represented as an interval between a start point and an end point on a circle as a closed curve. As premises for understanding the intersection point detecting operation according to this exemplary embodiment, a method for representing a line segment on the true sphere CB will be described below.
A line segment connecting two points as a shortest route on the true sphere, a circle on the true sphere, and an arc connecting two points on the true sphere have already been described in the first exemplary embodiment, and thus the descriptions thereof are herein omitted.
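As an illustrative sketch of the normalization described above (not part of the original disclosure; the radius value and function name are hypothetical), a point on the true sphere CB can be represented as a normalized position vector as follows:

```python
import math

R = 6371.0  # assumed radius of the true sphere CB (illustrative value, km)

def position_vector(lat_deg, lon_deg):
    """Normalized position vector for a point on the true sphere CB."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    # Cartesian coordinates on the sphere of radius R ...
    x = R * math.cos(lat) * math.cos(lon)
    y = R * math.cos(lat) * math.sin(lon)
    z = R * math.sin(lat)
    # ... normalized by dividing by R, so that the result lies on the unit sphere.
    return (x / R, y / R, z / R)
```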
A latitude line connecting the points P1 and P2 at the same latitude on the true sphere CB (on the ground surface) will be described. A latitude line on the true sphere CB (on the ground surface) can be understood as being a rhumb line between two points at the same latitude on the true sphere CB.
A case where the azimuth from the point P1 (start point) to the point P2 (end point) is eastward will be described.
[Formula 10]
$$\vec{V}_b=\vec{N}=(0,0,1),\qquad \vec{V}_b\cdot\vec{P}=s_b \tag{10}$$
where sb represents the sine of the latitude θ at which the point P1 and the point P2 are present (i.e., of the angle formed between the equatorial plane and the position vector of each point), and is expressed by the following formula (11).
[Formula 11]
$$s_b=\sin\theta \tag{11}$$
A case where the azimuth from the point P1 (start point) to the point P2 (end point) is westward will be described.
[Formula 12]
$$\vec{V}_c=\vec{S}=(0,0,-1),\qquad \vec{V}_c\cdot\vec{P}=s_c \tag{12}$$
where sc represents the sine of the latitude θ at which the point P1 and the point P2 are present, with its sign inverted, and is expressed by the following formula (13).
[Formula 13]
$$s_c=-\sin\theta \tag{13}$$
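The plane parameters of Formulas (10) to (13) can be sketched as follows (an illustrative sketch, not part of the original disclosure; the function name is hypothetical):

```python
import math

def latitude_plane(theta_deg, eastward=True):
    """Return (V, s) such that V . P = s holds for every point P on the
    latitude line at latitude theta (degrees)."""
    s = math.sin(math.radians(theta_deg))
    if eastward:
        # Eastward azimuth: normal is the north pole N, Formulas (10)/(11).
        return (0.0, 0.0, 1.0), s
    # Westward azimuth: normal is the south pole S, Formulas (12)/(13).
    return (0.0, 0.0, -1.0), -s
```

Either choice describes the same plane; the sign convention keeps the traversal direction (eastward or westward) consistent with the normal vector.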
Next, how to manage line segments in the intersection point detecting operation will be described. A circle including an arc which is a line segment on the true sphere CB is hereinafter referred to as a reference circle, and an expression that the arc belongs to the reference circle is used.
The position vector for the point P on the reference circle C satisfies the following formula (14). In Formula (14), s represents a parameter indicating the radius (curvature radius) of the reference circle C, and V represents a unit normal vector for the plane to which the reference circle C belongs.
[Formula 14]
$$\vec{V}\cdot\vec{P}=s \tag{14}$$
On the basis of the above premises, an example in which two line segments L1 and L2 are present on the true sphere CB will be discussed. The line segment L1 belongs to a reference circle C1 and the line segment L2 belongs to a reference circle C2, which are represented by the following formula (15).
[Formula 15]
$$\vec{V}_1\cdot\vec{P}_1=s_1,\qquad \vec{V}_2\cdot\vec{P}_2=s_2 \tag{15}$$
The candidate point detection unit 34 of the operation unit 32 detects an intersection point (candidate point) between the reference circle C1 and the reference circle C2. In the detection process, an intersection point is detected using a discriminant D as described below. The derivation of the discriminant D will be described below.
An intersection point between the reference circle C1 and the reference circle C2 is represented by Pc. A position vector for the intersection point Pc can be defined by the following formula (16). In Formula (16), β, γ, and δ are arbitrary real numbers.
[Formula 16]
$$\vec{P}_c=\beta\vec{V}_1+\gamma\vec{V}_2+\delta\,\vec{V}_1\times\vec{V}_2 \tag{16}$$
The intersection point Pc needs to satisfy each equation in Formula (15). Accordingly, Formula (16) is substituted into each equation of Formula (15), thereby obtaining the following formula (17).
[Formula 17]
$$\vec{V}_1\cdot\vec{P}_c=\beta+\gamma\,(\vec{V}_1\cdot\vec{V}_2)=s_1,\qquad \vec{V}_2\cdot\vec{P}_c=\beta\,(\vec{V}_1\cdot\vec{V}_2)+\gamma=s_2 \tag{17}$$
When Formula (17) is solved for β and γ, the following formula (18) is obtained.

[Formula 18]

$$\beta=\frac{s_1-s_2(\vec{V}_1\cdot\vec{V}_2)}{1-(\vec{V}_1\cdot\vec{V}_2)^2},\qquad \gamma=\frac{s_2-s_1(\vec{V}_1\cdot\vec{V}_2)}{1-(\vec{V}_1\cdot\vec{V}_2)^2} \tag{18}$$
At the intersection point Pc, the following formula (19) is established.
[Formula 19]

$$\vec{P}_c\cdot\vec{P}_c=1 \tag{19}$$
When Formula (19) is expanded using Formula (16), the following formula (20) is obtained.
[Formula 20]
$$\beta^2+\gamma^2+2\beta\gamma\,(\vec{V}_1\cdot\vec{V}_2)+\delta^2\left\{1-(\vec{V}_1\cdot\vec{V}_2)^2\right\}=1 \tag{20}$$
Formula (18) is substituted into Formula (20) and the resulting equation is solved for δ, thereby obtaining the following formula (21).

[Formula 21]

$$\delta=\pm\frac{\sqrt{D}}{1-(\vec{V}_1\cdot\vec{V}_2)^2} \tag{21}$$
D shown in Formula (21) represents a discriminant representing whether there is an intersection point, and is expressed by the following formula (22).
[Formula 22]
$$D=1-(\vec{V}_1\cdot\vec{V}_2)^2-s_1^2-s_2^2+2s_1s_2(\vec{V}_1\cdot\vec{V}_2) \tag{22}$$
Formula (21) includes the square root of the discriminant D. Therefore, to obtain the solution of Formula (16) representing the intersection point Pc, it is necessary to sort the cases according to the value of the discriminant D.
When the discriminant D takes a positive value, δ takes two values of the same absolute value and opposite signs. Accordingly, two solutions are obtained for Formula (16) representing the intersection point Pc. Specifically, in this case, the reference circle C1 and the reference circle C2 intersect with each other at two intersection points Pc1 and Pc2 on the true sphere CB.
Formula (18) and Formula (21) are substituted into Formula (16), with the result that the position vectors for the intersection points Pc1 and Pc2 are represented by the following formula (23).

[Formula 23]

$$\vec{P}_{c1},\ \vec{P}_{c2}=\beta\vec{V}_1+\gamma\vec{V}_2\pm\frac{\sqrt{D}}{1-(\vec{V}_1\cdot\vec{V}_2)^2}\,\vec{V}_1\times\vec{V}_2 \tag{23}$$
When the discriminant D takes a negative value, δ has no real solution. Accordingly, the reference circle C1 and the reference circle C2 have no intersection point. In this case, the reference circle C1 and the reference circle C2 are in a separation or inclusion relationship.
When the discriminant D is 0, δ is also 0. In this case, the reference circle C1 and the reference circle C2 are in contact with each other. It can be considered that the reference circle C1 and the reference circle C2 are in contact with each other in the following two cases. One is a case where the reference circle C1 and the reference circle C2 are circumscribed or inscribed at the intersection point Pc as a contact point. The other one is a case where the reference circle C1 and the reference circle C2 match.
A Case Where the Reference Circle C1 and the Reference Circle C2 are Circumscribed or Inscribed
When the discriminant D is 0 and the following formula (24) is satisfied, the reference circle C1 and the reference circle C2 have one intersection point.
[Formula 24]
$$(\vec{V}_1\cdot\vec{V}_2)^2<1 \tag{24}$$
In this case, the position vector for an intersection point Pc0 between the reference circle C1 and the reference circle C2 is represented by the following formula (25), which is obtained by substituting Formula (18) and δ=0 from Formula (21) into Formula (16).

[Formula 25]

$$\vec{P}_{c0}=\beta\vec{V}_1+\gamma\vec{V}_2 \tag{25}$$
When the Reference Circle C1 and the Reference Circle C2 Match
When the discriminant D is 0 and the following formula (26) is satisfied, the reference circle C1 and the reference circle C2 match.
[Formula 26]
$$(\vec{V}_1\cdot\vec{V}_2)^2=1 \tag{26}$$
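The candidate-point detection of Formulas (16) to (26) can be sketched as follows (an illustrative sketch, not part of the original disclosure; function names and the tolerance eps are hypothetical):

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def candidate_points(V1, s1, V2, s2, eps=1e-12):
    """Candidate points shared by reference circles V1.P=s1 and V2.P=s2
    on the unit sphere."""
    c = dot(V1, V2)
    denom = 1.0 - c * c
    if abs(denom) < eps:
        return []  # parallel planes: circles are separate or match (Formula (26))
    D = 1.0 - c * c - s1 * s1 - s2 * s2 + 2.0 * s1 * s2 * c  # Formula (22)
    if D < -eps:
        return []  # no intersection: separation or inclusion relationship
    beta = (s1 - s2 * c) / denom   # Formula (18)
    gamma = (s2 - s1 * c) / denom
    base = tuple(beta * u + gamma * v for u, v in zip(V1, V2))
    if D < eps:
        return [base]  # tangency: single contact point (Formula (25))
    delta = math.sqrt(D) / denom   # Formula (21)
    n = cross(V1, V2)
    return [tuple(b + delta * k for b, k in zip(base, n)),
            tuple(b - delta * k for b, k in zip(base, n))]  # Formula (23)
```

For example, the equator (normal (0,0,1), s=0) and the Greenwich meridian circle (normal (1,0,0), s=0) yield the two candidate points (0,±1,0).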
The determination as to whether or not two reference circles have an intersection point and the determination as to whether or not two reference circles match have been described above. However, it is necessary to consider the interval between line segments on the reference circle in the determination as to whether two line segments have an intersection point. Specifically, when the intersection point between the reference circle C1 and the reference circle C2 is not present in the interval between the line segment L1 and the line segment L2, the line segment L1 and the line segment L2 have no intersection point.
Accordingly, in this exemplary embodiment, the intersection point between the reference circle C1 and the reference circle C2 does not necessarily correspond to the intersection point between the line segment L1 and the line segment L2. Therefore, in order to distinguish the intersection point between the reference circle C1 and the reference circle C2 from the intersection point between the line segment L1 and the line segment L2, the detected intersection point between the reference circle C1 and the reference circle C2 is referred to as a candidate point.
A method in which the detection unit 35 determines whether or not the line segment L1 on the reference circle C1 includes the candidate point Pc detected as described above will be described below. In the determination, the cases are sorted according to the center angle Ψ of the line segment L1.

When the Center Angle Ψ is Equal to or Greater than π (π≦Ψ<2π)
[Formula 27]
$$(\vec{P}_S\times\vec{P}_E)\cdot\vec{V}_1\leq 0 \tag{27}$$
When the following formula (28) or (29) is satisfied, the candidate point Pc is present on the line segment L1.
[Formula 28]
$$\vec{P}_c\cdot(\vec{V}_1\times\vec{P}_S)\geq 0 \tag{28}$$
[Formula 29]
$$\vec{P}_c\cdot(\vec{V}_1\times\vec{P}_E)\leq 0 \tag{29}$$
When the Center Angle Ψ is Smaller than π (0<Ψ<π)
[Formula 30]
$$(\vec{P}_S\times\vec{P}_E)\cdot\vec{V}_1>0 \tag{30}$$
When both Formulas (28) and (29) are satisfied, the candidate point Pc is present on the line segment L1.
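The case sorting of Formulas (27) to (30) can be sketched as follows (an illustrative sketch, not part of the original disclosure; function names are hypothetical). The arc runs from the start point PS to the end point PE in the direction consistent with the normal V1 of the reference circle:

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def on_arc(Pc, PS, PE, V1):
    """Whether a candidate point Pc on the reference circle lies on the
    arc from PS to PE."""
    after_start = dot(Pc, cross(V1, PS)) >= 0.0   # Formula (28)
    before_end = dot(Pc, cross(V1, PE)) <= 0.0    # Formula (29)
    if dot(cross(PS, PE), V1) > 0.0:              # Formula (30): center angle < pi
        return after_start and before_end
    return after_start or before_end              # Formula (27): center angle >= pi
```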
While the method for determining whether the line segment L1 includes the candidate point has been described above, whether or not the line segment L2 includes the candidate point can be determined in the same manner.
Thus, when the line segment L1 and the line segment L2 include the same candidate point Pc, it can be determined that the candidate point is an actual intersection point Pc. In this case, the line segment L1 and the line segment L2 intersect with each other (this state is referred to as an intersecting state), are in contact with each other, or match, depending on the number of shared candidate points and the value of the discriminant D.
The above-described procedure for detecting an intersection point (step S23) proceeds in the following steps.
Step SS1: The candidate point detection unit 34 calculates the discriminant D expressed by Formula (22).
Step SS2: The candidate point detection unit 34 determines whether or not the discriminant D is smaller than 0. Accordingly, it can be determined whether there is a candidate point. When the discriminant D is smaller than 0, there is no candidate point. When the discriminant D is equal to or greater than 0, there is at least one candidate point.
When the discriminant D is equal to or greater than 0, the detection unit 35 determines whether the discriminant D is 0.
When the discriminant D is greater than 0, the detection unit 35 calculates the candidate point Pc1.
The detection unit 35 performs intersection point determination processing on the candidate point Pc1. The intersection point determination processing will be described later.
The detection unit 35 calculates the candidate point Pc2.
The detection unit 35 performs the intersection point determination processing on the candidate point Pc2. The intersection point determination processing will be described later.
When the discriminant D is 0, the detection unit 35 determines whether Formula (31) is satisfied.
[Formula 31]
$$(\vec{V}_2\cdot\vec{V}_1)^2<1 \tag{31}$$
When Formula (31) is satisfied, the detection unit 35 calculates the candidate point Pc0.
The detection unit 35 performs the intersection point determination processing on the candidate point Pc0. The intersection point determination processing will be described later.
When Formula (31) is not satisfied, the detection unit 35 performs the intersection point determination processing on the start point PS1 of the line segment L1.
The detection unit 35 performs the intersection point determination processing on the end point PE1 of the line segment L1.
The detection unit 35 performs the intersection point determination processing on the start point PS2 of the line segment L2.
The detection unit 35 performs the intersection point determination processing on the end point PE2 of the line segment L2.
Next, the intersection point determination processing will be described.
Step SR1: The candidate point calculated in the previous step is set as the determination target point PJ.
Step SR2: A range verification process for determining whether the determination target point PJ is present on the line segment L1 is carried out. The range verification process will be described in detail later. When the determination target point PJ is not present on the line segment L1, the processing is terminated.
Step SR3: When the determination target point PJ is present on the line segment L1, a range verification process for determining whether the determination target point PJ is present on the line segment L2 is carried out. The range verification process will be described in detail later. When the determination target point PJ is not present on the line segment L2, the processing is terminated.
Step SR4: When the determination target point PJ is present on both the line segments L1 and L2, the determination target point PJ is registered as an intersection point.
The range verification process in the above-mentioned steps SR2 and SR3 will be described.
It is determined whether the determination target line segment LJ is a circle. When the determination target line segment LJ is a circle, the determination target point PJ, which lies on the reference circle to which the line segment LJ belongs, is always present on the determination target line segment LJ (determination result shows “YES”).
When the determination target line segment LJ is not a circle, it is determined whether the line segment is a superior arc.
When the determination target line segment LJ is a superior arc or a half-arc, it is determined whether at least one of Formula (28) and Formula (29) is satisfied. When at least one of Formula (28) and Formula (29) is satisfied, the determination target point PJ is present on the determination target line segment LJ (determination result shows “YES”). When neither Formula (28) nor Formula (29) is satisfied, the determination target point PJ is not present on the determination target line segment LJ (determination result shows “NO”).
When the determination target line segment LJ is a minor arc, it is determined whether both Formulas (28) and (29) are satisfied. When both Formulas (28) and (29) are satisfied, the determination target point PJ is present on the determination target line segment LJ (determination result shows “YES”). When at least one of Formula (28) and Formula (29) is not satisfied, the determination target point PJ is not present on the determination target line segment LJ (determination result shows “NO”).
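The range verification process described above can be sketched as follows (an illustrative sketch, not part of the original disclosure; the dictionary keys and the 'kind' classification are hypothetical representations of the line-segment data):

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def in_range(PJ, LJ):
    """Range verification for a determination target point PJ.
    LJ is a dict with normal 'V', endpoints 'PS'/'PE', and 'kind'
    in {'circle', 'superior', 'minor'} (illustrative structure)."""
    if LJ['kind'] == 'circle':
        return True  # a full circle contains every point of its reference circle
    f28 = dot(PJ, cross(LJ['V'], LJ['PS'])) >= 0.0  # Formula (28)
    f29 = dot(PJ, cross(LJ['V'], LJ['PE'])) <= 0.0  # Formula (29)
    if LJ['kind'] == 'minor':
        return f28 and f29   # minor arc: both conditions required
    return f28 or f29        # superior arc or half-arc: either condition suffices
```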
As described above, according to this exemplary embodiment, it is possible to reliably determine whether or not two line segments set on the true sphere have an intersection point. Consequently, it is possible to reliably determine whether or not two air routes each represented by an arc on the true sphere intersect with each other, or whether or not line segments each constituting an airspace intersect with each other.
In the above description, the determination as to whether or not two typical line segments have an intersection point has been described. However, it can be understood that the intersection point detection unit 22 can easily detect whether or not the determination target closed curve AZ1 and the transferred image AZ2 have an intersection point, by applying the above-described detection of an intersection point between two line segments to the line segments forming the closed curve surrounding the determination target closed curve AZ1 and the line segments forming the closed curve surrounding the transferred image AZ2.
Note that the present invention is not limited to the above exemplary embodiments and can be modified as appropriate without departing from the scope of the invention.
The airspace information processing device and the airspace information processing method performed in the device have been described above. However, the present invention is not limited to these. According to the present invention, arbitrary processing can be implemented by causing a CPU (Central Processing Unit) to execute a computer program.
The program can be stored and provided to a computer using any type of non-transitory computer-readable media. Non-transitory computer-readable media include any type of tangible storage media. Examples of non-transitory computer-readable media include magnetic storage media (such as floppy disks, magnetic tapes, hard disk drives, etc.), optical magnetic storage media (e.g. magneto-optical disks), CD-ROM (Read Only Memory), CD-R, CD-R/W, and semiconductor memories (such as mask ROM, PROM (Programmable ROM), EPROM (Erasable PROM), flash ROM, RAM (random access memory), etc.). The program may be provided to a computer using any type of transitory computer-readable media. Examples of transitory computer-readable media include electric signals, optical signals, and electromagnetic waves. Transitory computer-readable media can provide the program to a computer via a wired communication line (e.g. electric wires, and optical fibers) or a wireless communication line.
While the present invention has been described above with reference to exemplary embodiments, the present invention is not limited to the above exemplary embodiments. The configuration and details of the present invention can be modified in various ways which can be understood by those skilled in the art within the scope of the invention.
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2014/003783 | 7/17/2014 | WO | 00 |