This nonprovisional application claims the benefit of European patent application No. 20217106.2 filed Dec. 23, 2020, the entirety of which is incorporated by reference herein.
The present application relates to a system and method for measuring weight and balance on an aircraft.
Center of gravity calculation is a safety critical task that is performed prior to departure of an aircraft. According to the value of the center of gravity, pilots trim the horizontal stabilizers in order to compensate for the pitch moment generated by a forward or aft center of gravity position. A center of gravity outside the aircraft safety envelope can lead to severe consequences, including tail strike, pitch-up during climb, stall and crash.
In a passenger aircraft, determining the total weight and distribution of the occupants, and of the hand baggage, is required in order to compute the center of gravity of the aircraft. Conventionally, flight attendants count passengers manually and a rough average weight is applied for each adult and child. The estimate may change according to the season in order to, for example, account for the weight of the clothes worn by the passengers. Once the passenger distribution is known, the weight distribution in predetermined cabin zones is calculated and then entered manually into the Flight Management System. The Flight Management System then computes the center of gravity of the aircraft. This process is time-consuming and prone to inaccuracy. Moreover, the weight of the hand baggage is usually not considered separately: it is either included in the overall passenger weight estimate or simply neglected prior to flight.
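By way of illustration only, the computation of a center of gravity from per-zone weights is a weighted average of moments. The following sketch assumes a single longitudinal axis and illustrative values; the function name, zone arms and weights are not taken from the application:

```python
def center_of_gravity(zone_weights, zone_arms, empty_weight, empty_cg_arm):
    """Weighted-average CG along the longitudinal axis.

    zone_weights: estimated weight per cabin zone (kg)
    zone_arms:    distance of each zone from the reference datum (m)
    empty_weight / empty_cg_arm: aircraft empty weight and its CG arm
    (All names, values and the single-axis simplification are illustrative.)
    """
    total_moment = empty_weight * empty_cg_arm
    total_weight = empty_weight
    for w, x in zip(zone_weights, zone_arms):
        total_moment += w * x
        total_weight += w
    return total_moment / total_weight  # CG arm in metres from the datum


# Example with three hypothetical cabin zones:
cg = center_of_gravity([1500.0, 1800.0, 1200.0], [10.0, 18.0, 26.0],
                       empty_weight=42000.0, empty_cg_arm=17.5)
```

A shift of weight toward the aft zones moves the computed CG arm aft, which is what the stabilizer trim must then compensate for.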
In one aspect, there is provided a system for calculating weight and distribution on an aircraft. The system comprises an aircraft, at least one cabin camera configured to view the cabin of the aircraft, an image collector configured to collect images from the at least one cabin camera and a processor. The processor is configured to perform the following steps: a) detecting passengers and/or hand baggage from the images collected from the image collector, b) estimating positions of the passengers and/or hand baggage from the images collected from the image collector, and c) estimating weight of the passengers and/or hand baggage from the images collected from the image collector. The processor also calculates a real-time weight distribution of the aircraft based on the estimates provided in steps a)-c).
The system may further include a first module configured to determine passenger weight distribution based on the images collected from the image collector, and a second module configured to determine hand baggage weight distribution based on the images collected from the image collector. The system may further comprise a third module configured to calculate cabin weight distribution based on the data from the first and second modules.
The first module may be configured to determine biometric information of the detected passengers, and the passenger weight distribution may be based on the biometric information and the positions of the passengers.
The second module may be configured to determine the size and shape of detected hand baggage, and the hand baggage weight distribution may be based on the size, shape and position of the hand baggage.
The real-time weight distribution of the aircraft may be determined pre-flight, during flight and/or on landing. The real-time weight distribution of the aircraft may be continuously updated and provided to a pilot and/or a Flight Management System.
The image collector may be hardware or software.
In another aspect, there is provided a method of calculating weight distribution on an aircraft. The method comprises collecting images from at least one cabin camera that is configured to view the cabin of the aircraft, detecting passengers and/or hand baggage from the images collected, estimating positions of the passengers and/or hand baggage from the collected images, estimating weight of the passengers and/or hand baggage from the images collected, and calculating a real-time weight distribution of the aircraft based on the estimated weight and estimated positions of the passengers and/or hand baggage.
There may be provided a first module for detecting passengers, estimating positions of the passengers and estimating weight of the passengers. There may also be provided a second module for detecting hand baggage, estimating positions of the hand baggage and estimating the weight of the hand baggage. There may also be provided a third module for calculating the weight distribution of the aircraft based on the data from the first and second modules.
The first module may be configured to determine biometric information of the passengers.
The second module may be configured to determine the size and shape of detected hand baggage.
The real-time weight distribution of the aircraft may be determined pre-flight, during flight and/or on landing.
The real-time weight distribution of the aircraft may be continuously updated and provided to a pilot and/or a Flight Management System.
An example of a system 10 for calculating weight and distribution of an aircraft is shown in
As shown in
In the instance where there are overlapping passengers, the system 10 compensates for this by comparing the estimated positions of the passengers in the cabin. When the positions of the passengers are estimated from the cabin cameras 100a-100n, the same passenger detected at the same cabin position by more than one camera can be recognised by the system 10, and compensations can be made in order to remove the duplicate detection of that passenger. This results in an adjustment that compensates for passengers appearing in the view of more than one of the cabin cameras 100a-100n.
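The compensation described above amounts to merging detections that resolve to the same cabin position. A simplified sketch, in which the tuple layout, merging-by-mean strategy and all values are illustrative assumptions rather than details from the application:

```python
def deduplicate_detections(detections):
    """Remove duplicate passenger detections from overlapping camera views.

    detections: list of (camera_id, seat, estimated_weight) tuples, where
    `seat` is the cabin position each detection was matched to.
    When several cameras report the same seat, a single entry is kept
    (here: the mean of the weight estimates). Illustrative only.
    """
    by_seat = {}
    for camera_id, seat, weight in detections:
        by_seat.setdefault(seat, []).append(weight)
    return {seat: sum(ws) / len(ws) for seat, ws in by_seat.items()}


merged = deduplicate_detections([
    ("cam1", "12A", 80.0),
    ("cam2", "12A", 82.0),   # same passenger seen by a second camera
    ("cam2", "12B", 65.0),
])
# one entry per seat; the two estimates for 12A are averaged
```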
As mentioned, the first module 120 can detect passenger objects in the images collected by the image collector 110. An object detection model, such as MobileNet SSD, could be used as an example; of course, other detection models could be used. Once a passenger has been detected by the first module 120, the position of the passenger can be estimated. As the position of the cameras and the layout of the cabin are known by the processor, the first module 120 can match detected passengers to positions in the cabin (e.g. specific seats). On take-off and landing, the passengers are located in their seats, and, therefore, a static frame image process may be used to determine the position of the passengers at their seats via pixel association. It may also be necessary to know the passenger positions during flight, i.e. to compute the passenger locations at all times. If the position of passengers is to be detected continuously (e.g. during flight), a dynamic algorithm may be used in order to detect the passengers and update their positions as the passengers move around the cabin.
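The static pixel-association step can be pictured as checking which pre-calibrated seat region of the image contains a detection's bounding-box centre. The rectangular regions, names and coordinates below are illustrative assumptions, not part of the application:

```python
def assign_seat(bbox_center, seat_pixel_regions):
    """Static pixel association: map a detected bounding-box centre to a seat.

    seat_pixel_regions: {seat: (x_min, y_min, x_max, y_max)} in image
    coordinates, derived from the known camera position and cabin layout.
    Returns the seat whose region contains the centre, or None.
    (Names and the rectangular-region simplification are illustrative.)
    """
    x, y = bbox_center
    for seat, (x0, y0, x1, y1) in seat_pixel_regions.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return seat
    return None


regions = {"14A": (0, 0, 100, 100), "14B": (100, 0, 200, 100)}
assign_seat((150, 40), regions)  # falls inside the "14B" region
```

A dynamic algorithm would re-run this association on each frame (or track detections between frames) so that positions stay current as passengers move.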
The first module 120 can then estimate the passenger weight. In order to do this, the first module 120 may determine some biometric information from the images collected by the image collector 110. As an example, the first module 120 may be able to determine whether a detected passenger is an adult or a child, and estimate a weight based on that determination. Further examples, which may be used in conjunction, include: height estimation via pixel detection or otherwise; body build estimation via posture detection; weight estimation based on facial landmark extraction; and clothing weight estimation based on skin detection, which consists of assessing the amount of skin that is detected for each passenger by the first module 120. Based on one or more of these estimation techniques, the first module 120 can then provide an estimation of the weights of the passengers and their positions.
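How such cues might be combined can be sketched as follows. The baseline averages, the BMI-style height adjustment and the clothing allowance are all illustrative placeholder values, not figures from the application:

```python
def estimate_passenger_weight(is_adult, height_m=None, heavy_clothing=False):
    """Combine simple biometric cues into a weight estimate (kg).

    - is_adult: adult/child classification from the detection model
    - height_m: optional height estimate; when present it replaces the
      baseline via an assumed reference BMI of 24
    - heavy_clothing: skin-detection cue for winter clothing
    All constants are illustrative assumptions.
    """
    base = 82.0 if is_adult else 35.0      # assumed average weights
    if height_m is not None:
        base = 24.0 * height_m ** 2        # weight = BMI * height^2
    if heavy_clothing:
        base += 3.0                        # assumed winter-clothing allowance
    return base
```

In practice such a hand-tuned rule would likely be replaced by a regression model trained on the extracted features; the sketch only shows how several cues feed one estimate.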
The second module 130 can detect hand baggage objects in the images collected by the image collector 110. As mentioned above, and as an example, an object detection model such as MobileNet SSD could be used; of course, other detection models could be used. Once the hand baggage has been detected by the second module 130, the position of the hand baggage can be estimated. As the position of the cameras and the layout of the cabin are known by the processor, the second module 130 can match detected hand baggage to positions in the cabin (e.g. in an overhead locker, or under the seat in front of a passenger's seat). On take-off and landing, the passengers are located in their seats, and, usually, the hand baggage is placed under the seat in front of them or in an overhead locker. Therefore, hand baggage positions may be gathered statically or dynamically. In the static approach, the cabin cameras 100a-100n have the overhead lockers in full view and a static frame image process may be used to determine the position of the hand baggage via pixel association. If the position of hand baggage is to be detected continuously (e.g. during flight), a dynamic algorithm may be used in order to detect the hand baggage and update once the hand baggage has been stowed.
The second module 130 can then estimate the hand baggage weight. In order to do this, the second module 130 may determine the size and shape of the hand baggage to determine an estimate of the weight. Further, the second module 130 may be able to use statistical analysis and regulations for baggage allowed in the cabin (e.g., based on the airline) and provide an average weight estimate. A computer vision algorithm could then compute the weight based on the statistical analysis and the hand baggage detected in the images. Based on one or more of these estimation techniques, the second module 130 can then provide an estimation of the weights of the hand baggage and their positions.
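One simple way to turn a detected size into a weight estimate is volume times an assumed packing density, capped at the airline's cabin allowance. The density and cap below are illustrative statistical values, not figures from the application:

```python
def estimate_bag_weight(width_m, height_m, depth_m,
                        density_kg_per_m3=160.0, airline_cap_kg=10.0):
    """Estimate hand-baggage weight (kg) from its detected dimensions.

    Volume x an assumed average packing density, capped at the airline's
    cabin-baggage allowance. Both constants are illustrative assumptions.
    """
    volume = width_m * height_m * depth_m
    return min(volume * density_kg_per_m3, airline_cap_kg)


estimate_bag_weight(0.55, 0.40, 0.23)   # a typical cabin-bag size
```

The cap stands in for the statistical/regulatory prior mentioned above: whatever the estimated volume, the expected weight is bounded by what the airline permits in the cabin.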
The passenger weight and position data from the first module 120, and the hand baggage weight and position data from the second module 130, are then fed to a third module 140. The third module 140 is a cabin weight distribution module, and calculates a cabin weight distribution based on the data from the first and second modules 120, 130. The cabin weight distribution can then be used to calculate a center of gravity for the aircraft. Optionally, these values may then be fed to the Flight Management System so that the center of gravity of the aircraft may be updated and the pilot may be informed of a change in the weight distribution. As shown in
An example of a method of using the system 10 of
The steps above may also be repeated throughout the entire flight (including take-off and landing) for up-to-date weight distribution calculations.
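The combination performed by the third module can be sketched as aggregating the two modules' (position, weight) outputs into per-zone totals. The tuple layout, zone mapping and all values are illustrative assumptions, not details from the application:

```python
def cabin_weight_distribution(passengers, bags, zone_of):
    """Combine passenger and hand-baggage estimates into per-zone weights.

    passengers / bags: lists of (position, weight_kg) from the first and
    second modules; zone_of maps a position (seat or locker) to a cabin
    zone. All names are illustrative.
    """
    zones = {}
    for position, weight in list(passengers) + list(bags):
        zone = zone_of(position)
        zones[zone] = zones.get(zone, 0.0) + weight
    return zones


# Hypothetical zoning: rows 1-15 forward, rows 16-30 aft.
zone_of = lambda pos: "FWD" if int(pos[:-1]) <= 15 else "AFT"
dist = cabin_weight_distribution([("12A", 81.0), ("22C", 70.0)],
                                 [("12A", 8.0)], zone_of)
# per-zone totals for the forward and aft cabin zones
```

The resulting per-zone totals are what a Flight Management System would combine with the zone arms and the aircraft's empty weight to obtain an updated center of gravity.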
An example of passenger weight estimation is shown in
Although identifying passengers wearing winter or summer clothes is described and shown in
The systems and methods described above may, therefore, be used to determine passenger and hand baggage weight distribution to provide an overall weight distribution of the aircraft. The overall weight distribution of the aircraft may then be determined in real-time before flight, during flight and during landing of the aircraft. The system and method provide an accurate weight distribution to a pilot or Flight Management System during the entire process of boarding, seating, flight, landing and disembarkation of the aircraft.
Although this disclosure has been described in terms of preferred examples, it should be understood that these examples are illustrative only and that the claims are not limited to those examples. Those skilled in the art will be able to make modifications and alternatives in view of the disclosure which are contemplated as falling within the scope of the appended claims.