This application claims the benefit of priority to Chinese patent application No. 202310892450.5, titled “AIRCRAFT GROUND ANTI-COLLISION SYSTEM AND METHOD”, filed with the China National Intellectual Property Administration on Jul. 19, 2023, the entire contents of which are incorporated herein by reference.
The present application relates to an aircraft ground anti-collision system and method, and in particular to an aircraft ground anti-collision system and method for eliminating a vision blind area through multi-system cooperation.
The content in this section merely provides background information related to the present application and may not constitute prior art.
An aircraft usually moves on the airport ground in one of two ways: a pilot controls the aircraft to taxi on the ground, or a trailer driver controls a trailer to tow the aircraft on the ground. During operation on the ground, the aircraft must at all times avoid people, other aircraft, buildings and other obstacles around it, so as to prevent collisions. Such collisions are particularly likely to occur at a wing tip of the aircraft. At present, there are on-board anti-collision schemes based on devices mounted on the aircraft and off-board anti-collision schemes based on devices mounted outside the aircraft.
A first on-board anti-collision scheme is based on automatic dependent surveillance-broadcast (ADS-B) technology: the aircraft determines its position through a satellite navigation system and periodically broadcasts that position, so that the aircraft is trackable. The ADS-B scheme is suitable for monitoring the position of a moving aircraft and routing the aircraft to avoid collision. However, many aircraft ground collisions occur when an ADS-B device has not been activated, for example, when one aircraft is being towed by a trailer while another aircraft is stationary. Moreover, the accuracy of position detection based on the ADS-B scheme is low, and the error may reach 10 m or more.
In a second on-board anti-collision scheme, obstacles around the aircraft are detected by radar. This scheme can be realized at low cost and is relatively insensitive to light and bad weather, but the spatial resolution of radar detection is low. In other words, the shape of an object cannot be accurately identified by a radar device. For example, the aircraft can be displayed only as a dot or a cross mark on a monitoring dashboard, and the position and shape of a wing tip of the aircraft (such as the outline of the wing tip) cannot be accurately determined by radar. Therefore, collisions between the aircraft and other obstacles cannot be reliably avoided by radar detection alone.
In a third on-board anti-collision scheme, obstacles around the aircraft are detected by a vision sensor (for example, a pair of cameras mounted on a vertical stabilizer of the aircraft). However, the image quality and sensing accuracy of the vision sensor are significantly degraded under poor lighting and/or bad weather conditions.
In addition, some off-board anti-collision schemes have been proposed, in which, for example, a portable sensor system is attached to a movable object such as a trailer to detect surrounding obstacles. A disadvantage of this scheme is that the view angle of the portable sensor system is limited and easily blocked, which results in a vision blind area. For example, a camera fixed to the trailer is mounted at a limited height, captures only a low view angle close to the ground, and is easily obscured by the fuselage of the aircraft.
On the other hand, in addition to the distance misjudgment and vision blind areas that may be caused by the above hardware issues, collision accidents are often caused by human errors of the pilot, the trailer driver, the wing protector, the commander and other ground operators, such as inefficient communication among the parties or distraction caused by fatigue. Therefore, the cooperation and scene awareness among all parties need to be improved.
In summary, a reliable aircraft ground anti-collision scheme is still needed, in particular a cooperative aircraft ground anti-collision scheme in which the view angles of all parties can be fused to eliminate vision blind areas.
An object of the present application is to provide an improved aircraft ground anti-collision system and method, so as to eliminate a vision blind area for a pilot and a ground operator during the ground movement of an aircraft. Another object of the present application is to improve the cooperation and scene awareness between the pilot and the ground operator and reduce the accident rate.
According to an aspect of the present application, an aircraft ground anti-collision system is provided. The aircraft ground anti-collision system includes multiple sensors and an obstacle detection processing unit, where the multiple sensors include an on-board sensor located on an aircraft and an off-board sensor located outside the aircraft. The obstacle detection processing unit is configured to: process data received from the multiple sensors to detect an object around the aircraft and/or a trailer for towing the aircraft, and fuse sensing ranges of the multiple sensors and unify information about the object detected by the multiple sensors in a same coordinate system, to generate a view-angle fused view indicating the object.
In some embodiments, the view-angle fused view includes an aerial view and/or a three-dimensional rendering view.
In some embodiments, the on-board sensor may include a distance sensor and/or a vision sensor mounted on the aircraft, and the off-board sensor may include a distance sensor and/or a vision sensor mounted on the trailer.
In some embodiments, the distance sensor may include lidar, and the vision sensor may include a camera.
In some embodiments, at least one sensor mounted on the trailer is configured to be lifted and lowered relative to the trailer.
In some embodiments, the off-board sensor may include a wearable sensor configured to be worn or held by an operator, where the operator includes a wing protector located behind a wing of the aircraft during ground movement of the aircraft.
In some embodiments, the obstacle detection processing unit is configured to perform a risk assessment based on the data received from the multiple sensors and output alarm information in response to detection of an unsafe event. The alarm information may include at least one of a visual alarm, an auditory alarm, and a tactile alarm.
In some embodiments, the obstacle detection processing unit is configured to output the alarm information in response to detection of overspeed and/or presence of a person or an obstacle in a predetermined danger area.
In some embodiments, the aircraft ground anti-collision system may further include multiple user interfaces, configured to communicate with the obstacle detection processing unit and communicate with each other to synchronously share information among multiple operators.
In some embodiments, the multiple user interfaces are configured to receive and display the view-angle fused view generated by the obstacle detection processing unit.
In some embodiments, the multiple user interfaces are configured to transmit the alarm information outputted by the obstacle detection processing unit to an operator.
In some embodiments, each of the multiple user interfaces is configured to send an encoded instruction to the obstacle detection processing unit and/or to the other user interfaces.
In some embodiments, the multiple user interfaces may include a head-up display and/or a head-mounted display device.
According to another aspect of the present application, an aircraft ground anti-collision method is provided. The method includes: processing data received from multiple sensors to detect an object around an aircraft and/or a trailer for towing the aircraft; and fusing sensing ranges of the multiple sensors, unifying information of the object detected by the multiple sensors in a same coordinate system, and outputting a view-angle fused view indicating the object. The multiple sensors include an on-board sensor located on the aircraft and an off-board sensor located outside the aircraft.
In some embodiments, the view-angle fused view may include an aerial view and/or a three-dimensional rendering view.
In some embodiments, the method may further include: performing a risk assessment based on the data from the multiple sensors and outputting alarm information in response to detection of an unsafe event. The alarm information may include at least one of a visual alarm, an auditory alarm, or a tactile alarm.
Further areas of applicability of the present application will become apparent from the following detailed description. It should be understood that, although the detailed description and specific examples show preferred embodiments of the present application, they are intended for purposes of illustration only and are not intended to limit the present application.
Embodiments of the present application will be described hereinafter by way of example only, with reference to the accompanying drawings. In the drawings, the same features or components are indicated by the same reference numerals, and the drawings are not necessarily drawn to scale. In the drawings:
The following description is merely exemplary in nature and is not intended to limit the present application or its application or uses. It should be appreciated that, throughout the drawings, similar reference numerals indicate the same or similar parts or features. The drawings merely schematically show the concepts and principles of the embodiments of the present application and do not necessarily show the specific dimensions and scales of the embodiments. Specific parts in specific drawings may be exaggerated to illustrate related details or structures of the embodiments of the present application.
As shown in
The view-angle fused view generated by the obstacle detection processing unit 110 preferably includes an aerial view. For example, each of the sensors 120 may first be spatially located, and then information, such as the position and the size, of the object(s) detected by the sensors 120 may be unified in the same coordinate system (for example, taking a point on the aircraft 200 as the coordinate origin) by means of an algorithm, so that the object(s) detected by all of the sensors 120 can be displayed from an aerial angle of view. The view-angle fused view may further include a three-dimensional rendering view. Similar to the aerial view, each of the sensors 120 may first be spatially located and the information, such as the position and the size, of the object(s) detected by the sensors 120 may be unified in the same coordinate system, and then the three-dimensional rendering view may be generated by rendering an outline of each object three-dimensionally based on the size information. Alternatively, the three-dimensional rendering view may be generated by stitching images captured by multiple vision sensors. The three-dimensional rendering view may be configured as a panoramic image with a 360-degree view angle.
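By way of a purely illustrative, non-limiting sketch, the following Python snippet shows one possible way to carry out the unification step described above: detections from individual sensors are transformed into a common aircraft-centered coordinate system and then rasterized into a coarse aerial-view occupancy grid. The class and function names, the two-dimensional rigid-transform model, and the grid parameters are assumptions introduced for illustration only and do not represent a definitive implementation of the obstacle detection processing unit 110.

```python
# Illustrative sketch only: unify detections from multiple sensors into one
# aircraft-centered frame and rasterize them into an aerial-view grid.
# All names, the 2-D transform model and grid parameters are assumptions.
import math
from dataclasses import dataclass

@dataclass
class SensorPose:
    x: float      # sensor mounting position in the aircraft frame (m)
    y: float
    yaw: float    # sensor heading relative to the aircraft frame (rad)

@dataclass
class Detection:
    range_m: float   # distance from the sensor to the detected object (m)
    bearing: float   # bearing of the object in the sensor frame (rad)
    size_m: float    # rough object extent (m), e.g. for outline rendering

def to_aircraft_frame(pose: SensorPose, det: Detection) -> tuple[float, float]:
    """Convert one detection from its sensor frame to the aircraft frame."""
    sx = det.range_m * math.cos(det.bearing)   # object position in sensor frame
    sy = det.range_m * math.sin(det.bearing)
    # Rotate by the sensor yaw and translate by the sensor mounting position.
    ax = pose.x + sx * math.cos(pose.yaw) - sy * math.sin(pose.yaw)
    ay = pose.y + sx * math.sin(pose.yaw) + sy * math.cos(pose.yaw)
    return ax, ay

def aerial_view(detections, grid_size=200, cell_m=0.5):
    """Rasterize unified detections into a coarse aerial-view occupancy grid.

    The aircraft reference point sits at the grid center; each cell covers
    cell_m x cell_m meters; 1 marks an occupied cell and 0 a free cell.
    """
    grid = [[0] * grid_size for _ in range(grid_size)]
    half = grid_size // 2
    for pose, det in detections:
        ax, ay = to_aircraft_frame(pose, det)
        col = half + int(round(ax / cell_m))
        row = half + int(round(ay / cell_m))
        if 0 <= row < grid_size and 0 <= col < grid_size:
            grid[row][col] = 1
    return grid

# Example: one sensor mounted on the aircraft and one mounted on the trailer
# each report an obstacle; both detections end up in the same aerial view.
onboard = SensorPose(x=-20.0, y=0.0, yaw=0.0)
offboard = SensorPose(x=5.0, y=-3.0, yaw=math.pi / 2)
view = aerial_view([(onboard, Detection(25.0, 0.6, 1.0)),
                    (offboard, Detection(12.0, -0.4, 1.0))])
```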
The aerial view may provide a global view angle, allowing the pilot and the ground operator to fully know the positions of all obstacles within a larger area around the aircraft 200 and/or the trailer 300. The obstacle detection processing unit 110 may further be configured to, based on the generated aerial view, plan possible obstacle avoidance routes for the aircraft 200 and the trailer 300 for reference by the operator. The three-dimensional rendering view may provide a local view angle, allowing the pilot and the ground operator to intuitively perceive the positions and shapes of obstacles within a smaller range around the aircraft 200 and/or the trailer 300, thereby improving the scene awareness of the pilot and the ground operator.
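As a purely illustrative continuation of the previous sketch, the following snippet shows how a coarse obstacle avoidance route might be derived from such an aerial-view occupancy grid. The breadth-first search below is a hypothetical, simplified stand-in for whatever planner the obstacle detection processing unit 110 actually employs; the grid convention (0 for free, 1 for occupied) and the function name are assumptions.

```python
# Illustrative sketch only: plan a coarse obstacle-free route on an
# aerial-view occupancy grid (0 = free cell, 1 = occupied cell).
from collections import deque

def plan_route(grid, start, goal):
    """Breadth-first search over grid cells.

    Returns a list of (row, col) cells from start to goal, or None if no
    obstacle-free route exists.
    """
    rows, cols = len(grid), len(grid[0])
    came_from = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:          # walk back to the start
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and nxt not in came_from):
                came_from[nxt] = cell
                queue.append(nxt)
    return None

# Example: route around a single blocked cell on a small demonstration grid.
demo_grid = [[0] * 10 for _ in range(10)]
demo_grid[5][5] = 1
route = plan_route(demo_grid, start=(5, 0), goal=(5, 9))
```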
In addition, the obstacle detection processing unit 110 may further be configured to perform a risk assessment based on the data received from the sensors 120 and output alarm information to the operator (especially a controller controlling the operation of the aircraft 200, such as a pilot and/or a trailer driver) in response to detection of an unsafe event, so as to avoid accidents. For example, the obstacle detection processing unit 110 may be configured to output the alarm information in response to detection of overspeed and/or presence of a person or an obstacle in the 3D safety protection frame 400, especially in a predetermined area of interest and danger area. The area of interest may include, for example, the area in the vicinity of the wing tip and the tail wing of the aircraft 200. The danger area may be set based on the minimum safety distance specified by the operation requirements. For example, when the aircraft 200 is towed by the trailer 300, the minimum safety distance from an aircraft landing gear or the trailer should be greater than 3 m.
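Again by way of a non-limiting sketch, the risk assessment described above might resemble the following check. The 3 m minimum safety distance comes from the operation requirement mentioned in the text, while the speed limit, the object fields, and the function name are assumptions introduced for illustration.

```python
# Illustrative sketch only: alarm on overspeed or on a person/obstacle
# entering a predetermined danger area around the aircraft or trailer.
import math
from dataclasses import dataclass

MIN_SAFETY_DISTANCE_M = 3.0   # minimum clearance stated in the text
MAX_TOWING_SPEED_MPS = 2.5    # assumed towing speed limit (not from the text)

@dataclass
class TrackedObject:
    x: float          # object position in the aircraft frame (m)
    y: float
    is_person: bool

def assess_risk(speed_mps, objects, danger_points):
    """Return alarm messages for overspeed and danger-area intrusion.

    danger_points are reference points (e.g. landing gear, trailer hitch)
    around which the minimum safety distance must be maintained.
    """
    alarms = []
    if speed_mps > MAX_TOWING_SPEED_MPS:
        alarms.append("overspeed")
    for obj in objects:
        for px, py in danger_points:
            if math.hypot(obj.x - px, obj.y - py) < MIN_SAFETY_DISTANCE_M:
                kind = "person" if obj.is_person else "obstacle"
                alarms.append(f"{kind} within safety distance of ({px}, {py})")
                break
    return alarms

# Example: a person standing 2 m from the nose landing gear triggers an alarm.
alarms = assess_risk(speed_mps=1.5,
                     objects=[TrackedObject(x=2.0, y=0.0, is_person=True)],
                     danger_points=[(0.0, 0.0)])
```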
Referring back to
According to the present application, multiple sensors 120 are arranged at different positions, such as on the aircraft 200, the trailer 300 and/or the operators, and the view-angle fused view, particularly an aerial view, is generated by fusing the view angles or sensing ranges of the multiple sensors 120. Integrated scene information may thus be generated in an interconnected manner from multiple different view angles, such as those of the aircraft, the trailer and the ground operators, thereby effectively eliminating the vision blind area of any single operator or single sensor and improving the safety of the ground movement of the aircraft. In particular, the wearable sensors carried by the operators may focus mainly on areas of interest such as the wing tip and the tail wing, so that the vision blind area can be further reduced in real time through the operators' own initiative. In addition, the present application enables quick and effective communication between the operators involved in the ground movement of the aircraft, so that operational situation awareness is shared synchronously and alarm information is transmitted quickly. As a result, the cooperation and scene awareness of all of the operators can be effectively improved.
The exemplary embodiments of the aircraft ground anti-collision system and method according to the present application have been described in detail herein, but it should be understood that the present application is not limited to the specific embodiments described and illustrated in detail above. Various modifications and variations can be made by those skilled in the art to the present application, without departing from the spirit and scope of the present application. All the variations and modifications shall fall within the scope of the present application. Moreover, all the components described herein can be replaced by other technically equivalent components.