The invention relates to a method for automated docking of a passenger boarding bridge to an aircraft.
Nowadays the docking of a passenger boarding bridge (PBB) to an aircraft is performed manually. An operator uses a joystick to give immediate operation signals to the drive means. This kind of docking is time consuming and requires an operator. If the operator is still busy at another PBB, the docking is delayed, which leads to costly delays in the ground handling of aircraft. Consequently, the market requires automatic docking, in which a fully trained and certified operator is no longer needed throughout the complete docking procedure.
A camera based automatic docking is disclosed in WO 2017/198605 A1. To detect a door contour, a line of windows is detected and the door contour is searched at a position adjacent to the line of windows. Based on the detected door, an automatic movement of the PBB is initiated to bring the PBB into alignment with the door of the aircraft.
It is an object of the present invention to provide an improved method for automated docking of a PBB to an aircraft.
The invention comprises a method according to claim 1 and an arrangement according to claim 21; embodiments are disclosed in the subclaims and the description.
The aircraft has a fuselage and a door, to which door a bridgehead of the passenger boarding bridge is to be aligned. The method comprises the following steps: determining a target position in relation to the door, and controlling a movement of the bridgehead based on the determined target position.
In an embodiment the PBB comprises a tunnel to which the bridgehead is connected in a rotatable manner, when viewed in top view. The tunnel is located between the bridgehead and a terminal building.
In an embodiment, in a first, in particular prepositioning, phase an assumed position of a first accuracy is detected using a first detection technology; in a subsequent phase, in particular after moving the bridgehead in the direction of the assumed position, the target position of a second accuracy is determined using a second detection technology, which is different from the first technology; the first accuracy is lower than the second accuracy. Here the first technology is in particular used for bringing the PBB into a condition in which the second technology can work properly. In particular a camera based system has a limited field of view; to start the camera based system, the door to be docked must be brought into the field of view.
In particular the second accuracy is equal to or better than 5 cm. That means in particular that the target position is determined with an error of at most +/−5 cm.
In particular the first accuracy is worse than 5 cm, in particular worse than 20 cm. That means in particular that the assumed position may be determined with an error of more than +/−5 cm, in particular more than +/−20 cm.
In an embodiment a type of aircraft is determined. Here e.g. the flight coordination of the airport can be consulted, which provides the type of aircraft expected to arrive next at the respective PBB. Then a database can be consulted which has information about the door position within each aircraft type and about the parking position of the aircraft at the respective PBB. Together these data lead to a rough position of the door to be docked, with an accuracy which is sufficient for the prepositioning and for starting final docking, but not sufficient for the final phase of docking. The database may also comprise information about which door of a plurality of doors is to be docked and can provide this information to the PBB control.
In an embodiment the stand comprises at least two passenger boarding bridges. The method comprises the steps of: automatically selecting one of the at least two passenger boarding bridges, and allocating the target position to the selected passenger boarding bridge for controlling the movement of the bridgehead of the selected passenger boarding bridge. In a more particular embodiment two passenger boarding bridges are selected and an individual target position is allocated to each of them. The selection can be made in dependency on which centerline of a plurality of centerlines of the single stand the aircraft is positioned on, and/or in dependency on the determined aircraft type, and/or in dependency on the selected door.
The type of aircraft may be determined by a visual docking guidance system (VDGS), which may be part of a control arrangement to control the bridgehead movement. Here no flight coordination database needs to be consulted; the control arrangement can obtain the aircraft type information in a self-sustaining manner.
In an embodiment the target position is determined via an optical scan of the door. The optical scan may be performed by laser measurement and/or image recognition.
In an embodiment a longitudinal coordinate and a transverse coordinate of the target position are determined by analyzing a scanned door contour and/or a painted contour marking of the door; a height coordinate of the target position is determined by analyzing the position of a scanned u-shaped marking below the door. In particular the scanned u-shaped marking is distinct from the painted contour marking.
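Purely by way of illustration, the following Python sketch shows one possible way such scan results could be combined into the three coordinates of a target position; the function, the point format and the sample values are assumptions and not part of the claimed method.

```python
# Illustrative sketch (not the claimed implementation): combining scan results
# into a target position. All names and data structures are hypothetical.

def target_from_scan(contour_points, u_mark_points):
    """Derive a target position (x, y, z) from scanned door features.

    contour_points: (x, y, z) points on the door contour or painted contour
                    marking (PCM), used for the longitudinal (x) and
                    transverse (y) coordinates.
    u_mark_points:  (x, y, z) points on the u-shaped marking below the door,
                    whose upper edge is assumed collinear with the door sill
                    and therefore used for the height (z) coordinate.
    """
    # Longitudinal coordinate: centre between foremost and rearmost contour point.
    xs = [p[0] for p in contour_points]
    x_target = 0.5 * (min(xs) + max(xs))

    # Transverse coordinate: mean distance of the contour points to the bridge.
    y_target = sum(p[1] for p in contour_points) / len(contour_points)

    # Height coordinate: the upper edge of the u-shaped mark approximates the sill.
    z_target = max(p[2] for p in u_mark_points)

    return (x_target, y_target, z_target)


if __name__ == "__main__":
    contour = [(0.0, 5.2, 2.8), (1.1, 5.2, 2.8), (0.0, 5.2, 4.7), (1.1, 5.2, 4.7)]
    u_mark = [(0.4, 5.2, 2.6), (0.7, 5.2, 2.6), (0.4, 5.2, 2.8), (0.7, 5.2, 2.8)]
    print(target_from_scan(contour, u_mark))
```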
In an embodiment, in a second, in particular modeling, phase a digital three-dimensional model of the door is determined. In particular the digital three-dimensional model may be created during the docking procedure by scanning the door. Alternatively or in combination therewith, the three-dimensional model is determined by retrieving a prestored digital three-dimensional model from a database. The prestored model can be identified with the help of the identified aircraft type as described above.
A combination of both mentioned determining methods can be used to improve the quality of the process. So, in a first substep the digital three-dimensional model can be created by scanning; in a second substep the created digital three-dimensional model can be compared with any prestored door model. If the comparison gives positive feedback (e.g. the created model conforms to a prestored model), the docking procedure may continue; otherwise a failure mode can be initiated.
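A minimal Python sketch of such a plausibility check is given below; the point-cloud comparison, the tolerance value and all names are illustrative assumptions, not the claimed procedure.

```python
# Hypothetical sketch of the check described above: a door model created by
# scanning is compared against prestored models; a mismatch triggers a
# failure mode. Tolerances and the point format are assumptions.

import math

def model_distance(created, prestored):
    """Mean point-to-point distance between two door models given as equally
    ordered lists of (x, y, z) points (a deliberately simple comparison)."""
    dists = [math.dist(a, b) for a, b in zip(created, prestored)]
    return sum(dists) / len(dists)

def check_model(created, prestored_models, tolerance_m=0.05):
    """Return the matching prestored model name, or None to signal failure mode."""
    for name, model in prestored_models.items():
        if model_distance(created, model) <= tolerance_m:
            return name            # positive feedback: docking may continue
    return None                    # no match: initiate failure mode

if __name__ == "__main__":
    a320_door = [(0.0, 0.0, 0.0), (0.9, 0.0, 0.0), (0.0, 0.0, 1.9), (0.9, 0.0, 1.9)]
    scanned   = [(0.01, 0.0, 0.0), (0.91, 0.0, 0.01), (0.0, 0.0, 1.89), (0.9, 0.0, 1.9)]
    result = check_model(scanned, {"A320-200": a320_door})
    print("match:", result if result else "failure mode")
```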
In an embodiment it occurs that in a third, in particular final, phase the door is merely partially in the field of view of the main scanning device, so that the target position is not located in the field of view. That may lead to a situation where the camera based docking is not able to follow the target position. Instead of the target position, an auxiliary position, which is in relation to the door, is monitored by the main scanning device 50. A spatial relationship between the target position and the auxiliary position is thereby provided by the digital model. With the help of this spatial relationship it is possible to infer the target position by calculating it based on the monitored auxiliary position and the spatial relationship.
The aforementioned feature enables a certain freedom when looking for a suitable position of the camera, since there are other requirements which need to be considered. The position of the scanning device should be out of reach of the passengers, in particular so that a passenger cannot hit the scanning device with his head, cannot damage the device and cannot stumble over the scanning device. So, on the one hand the scanning device should be located outside of the space in which a user may be located at any time during boarding. On the other hand the scanning device position should support the best possible field of view, capturing the door as far as possible over the entire docking procedure. Further, it is preferred that commercially available scanners may be used, such as a stereo camera or a laser scanner.
For supporting the aforementioned requirements, a particularly suitable position has been developed. In an embodiment the main camera for detecting the target position is located
In an embodiment, during the third phase an auxiliary scanning device is used to determine the target position. The auxiliary scanning device is in particular located in such a manner that the target position is within its field of view until docking has finished. Here the combination of both scanning devices enables that the target position is always in the field of view of at least one of the scanning devices. So, the main scanning device may be located for the best possible view of the target position during an early phase; the auxiliary scanning device may be located for the best possible view of the target position during the final phase.
In an embodiment in which both the main and the auxiliary scanning device are used, during a phase in which the target position is within the fields of view of both the main scanning device and the auxiliary scanning device, the functionality of the scanning devices is checked by comparing the scanning results of both devices to each other. In particular it is important that both devices provide the same position results for the identical target position; if there is a deviation in the results, at least one of the scanning devices has a malfunction or is not properly calibrated, and a failure mode can be issued.
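The following short sketch illustrates such a consistency check; the deviation threshold and the variable names are assumptions made for illustration only.

```python
# Illustrative cross-check between the main and the auxiliary scanning device
# while both see the target position; threshold and names are assumptions.

import math

def cross_check(pos_main, pos_aux, max_deviation_m=0.05):
    """Compare the target position reported by both devices.

    Returns True if the results agree within the allowed deviation; otherwise
    a failure mode should be issued (at least one device is faulty or out of
    calibration).
    """
    deviation = math.dist(pos_main, pos_aux)
    return deviation <= max_deviation_m

if __name__ == "__main__":
    main_result = (12.34, 5.20, 3.41)   # target position from the main camera
    aux_result  = (12.37, 5.22, 3.40)   # same target from the auxiliary camera
    if cross_check(main_result, aux_result):
        print("scanning devices consistent")
    else:
        print("failure mode: scanning devices disagree")
```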
In an embodiment in case the step of automated determination of the target position leads to a faulty result, a user interaction is requested. A possible user interaction may be:
Identifying the target position by the user interaction. The user indicates the target position on a screen with a suitable HMI input device, e.g. by a mouse click on the visualization of the target position on the screen.
Continuing the docking procedure by the operator manually.
The operator does not need to be present at the PBB; rather, the operator can be located at a distant location and may use a remote control to interact with the PBB control unit.
In an embodiment the method comprises the following steps: establishing a trajectory defining the movement of the bridgehead to align the bridgehead with the target position, the trajectory comprising in particular a path of movement and in particular a course of orientations; and moving the bridgehead along the established trajectory. The path may comprise a group of coordinates. The orientations indicate the angular orientation of the bridgehead. During automated docking the bridgehead's movement is performed in accordance with the established trajectory.
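A minimal data-structure sketch for such a trajectory, a path of bridgehead coordinates plus a course of orientation vectors, is shown below; the field names and the sample waypoints are hypothetical and not taken from the application.

```python
# Illustrative trajectory structure: path of positions plus course of orientations.

from dataclasses import dataclass
from typing import List, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class Trajectory:
    path: List[Vec3]          # bridgehead positions from start to target
    orientations: List[Vec3]  # pointing direction of the bridgehead per waypoint

    def waypoint(self, i: int) -> Tuple[Vec3, Vec3]:
        """Position and orientation the bridgehead should assume at step i."""
        return self.path[i], self.orientations[i]

if __name__ == "__main__":
    # Situations B, C, D as three coarse waypoints of the movement (assumed values).
    traj = Trajectory(
        path=[(2.0, 8.0, 3.2), (6.0, 6.5, 3.3), (9.5, 5.3, 3.4)],
        orientations=[(0.9, -0.4, 0.0), (0.8, -0.6, 0.0), (0.0, -1.0, 0.0)],
    )
    print(traj.waypoint(2))   # final waypoint: oriented perpendicular to the fuselage
```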
In an embodiment the method comprises the following steps: during moving, continuing to determine the target position and reviewing the trajectory based on the continuously detected target position; in particular adapting the trajectory if a deviation from a previously determined target position is determined, and/or applying a safety mode if a deviation from a previously determined target position exceeds a predefined critical value.
So, in principle the target position may be continuously monitored by the scanning device. In contrast to measuring only once, this improves the accuracy of the docking procedure. If a trajectory is used, the trajectory is permanently updated when deviations in the measurements occur. When a deviation reaches a critical value, a safety mode can be issued; in the safety mode the movement speed can be decreased or the movement can be stopped.
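A sketch of this continuous review is given below; the critical deviation value is an assumed example, and the decision logic is simplified for illustration.

```python
# Sketch of the continuous monitoring described above: the target position is
# re-measured during movement, small deviations update the trajectory, large
# deviations trigger a safety mode. The critical value is an assumed example.

import math

def review_target(previous_target, new_target, critical_deviation_m=0.30):
    """Return ("adapt", new_target) for small deviations, ("safety", previous)
    when the deviation from the previously determined target exceeds the limit."""
    deviation = math.dist(previous_target, new_target)
    if deviation > critical_deviation_m:
        return "safety", previous_target   # slow down or stop the movement
    return "adapt", new_target             # update the trajectory to the new target

if __name__ == "__main__":
    print(review_target((9.5, 5.3, 3.4), (9.52, 5.31, 3.40)))  # small drift -> adapt
    print(review_target((9.5, 5.3, 3.4), (9.0, 6.1, 3.4)))     # large jump -> safety
```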
In an embodiment the method comprises the following steps: observing the apron with respect to an obstacle; detecting the position of the obstacle; assessing the relevance of the obstacle by comparing the position of the obstacle with the trajectory. The position of the obstacle may also comprise a short term position during movement of the obstacle. The comparison may come to the result that a collision between parts of the PBB, in particular the drive, and the obstacle is likely; then the safety mode is issued. Comparing the position of the obstacle with the trajectory leads to a filtering of the objects, so that not all objects detected in the area of the PBB will cause issuance of the safety mode.
In an embodiment the movement of the bridgehead is controlled in a manner, in particular the trajectory is established in a manner, that in a subsequent phase of movement, in particular when the door distance (distance between the door and the approaching edge of the bridgehead) reaches a value of 0.5 m, the bridgehead is moved only in the transverse direction y, perpendicular to the fuselage.
That leads to a condition in which the movement of the bridgehead does not have a movement component along the fuselage in x or z direction, resulting in a decreased risk of damage to the fuselage.
In an embodiment the movement of the bridgehead is controlled in such a manner that the movement speed is dependent on the distance between the bridgehead and the door, in particular that the movement speed decreases as the distance between the bridgehead and the door decreases, and/or
in particular that the movement speed is lower than 0.2 m/s, in particular lower than 0.15 m/s, if the door distance is smaller than 1 m, and/or
in particular that the movement speed is higher than 0.4 m/s if the door distance is larger than 2.5 m.
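An assumed example of such a distance-dependent speed profile is sketched below, using the figures mentioned above (below 1 m: under 0.2 m/s, above 2.5 m: over 0.4 m/s); the linear interpolation in between is purely an illustrative choice.

```python
# Hypothetical speed profile for the bridgehead approach, consistent with the
# thresholds stated in the text; intermediate values are interpolated here
# only for illustration.

def movement_speed(door_distance_m: float) -> float:
    """Bridgehead approach speed in m/s as a function of the door distance."""
    if door_distance_m > 2.5:
        return 0.45                       # fast approach far away from the door
    if door_distance_m < 1.0:
        return 0.15                       # slow, careful approach near the door
    # Linear interpolation between the two regimes for 1.0 m .. 2.5 m.
    return 0.15 + (door_distance_m - 1.0) / (2.5 - 1.0) * (0.45 - 0.15)

if __name__ == "__main__":
    for d in (3.0, 2.0, 1.0, 0.5):
        print(f"distance {d:.1f} m -> speed {movement_speed(d):.2f} m/s")
```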
In an embodiment a calibration step is performed before docking, using a calibration tag located at a fixed location of the bridgehead within the field of view of a camera to be calibrated. Due to the continuous movements and resulting vibrations in a PBB, the orientation of the scanning device may change slightly, which may result in inaccurate results. With the help of the calibration step these influences are compensated.
In an embodiment, after the movement of the bridgehead for aligning the passenger boarding bridge with the target position is finished, a validating step is performed; in the validating step it is checked whether the bridgehead is aligned properly to the door. Here the quality may be checked, and if a sufficient level of quality is not determined, an operator may be prompted to confirm or disconfirm that the passenger boarding bridge is properly docked. In particular, a visual check may be performed between a marking on the bridgehead floor and the door contour; if both are aligned in a predetermined manner, docking was successful.
In an embodiment the aircraft is parked on a MARS stand, the MARS stand having a plurality of separate centerlines associated with one passenger boarding bridge. For determining the target position and/or for detecting the assumed position, information is used on which of the centerlines of the MARS stand the aircraft is parked. This information may be retrieved from a database and/or a VDGS system. By knowing on which centerline the aircraft is parked, the area in which to look for the target position and/or the assumed position is greatly reduced. Consequently the bridgehead can be brought into a proper preposition, which is a good starting position for the camera based docking procedure.
A MARS stand comprises a plurality of separate centerlines associated with one passenger boarding bridge. The centerlines indicate different parking positions for different aircraft. The term “centerlines associated with one passenger boarding bridge” means: said passenger boarding bridge can be connected to an aircraft parked on a first one of the centerlines as well as to an aircraft parked on a second one of the centerlines. Separate centerlines means: the centerlines are a) not parallel or b) parallel but at a distance. The MARS stand may comprise two or more PBBs.
The inventive arrangement comprises a passenger boarding bridge and a control arrangement to control a movement of a bridgehead of the passenger boarding bridge. The control arrangement is configured to carry out a method according to any of the preceding claims.
The invention is explained in more detail by means of the figures; the figures show:
Within the scope of the present application a coordinate system is defined which is relevant for the docking procedure; it comprises a longitudinal direction x, a transverse direction y and a height direction z.
At first reference is made to
A ribbon of paint is applied to the fuselage 2 and/or the door 3, which highlights the door contour 31. This ribbon is called painted contour mark (PCM) 32. As the different configurations of
For example in the embodiment of
In addition to the PCM 32 a u-shaped mark 33 is provided at the lower part of the aircraft door 3. The upper line of the u-shaped mark 33 is collinear with the door sill 31L as shown in all three embodiments of
It is mandatory for aircraft manufacturers to add the PCM 32 and the u-shaped mark 33; details are described in technical manuals and in the US federal regulation 14 CFR 25.811, “Emergency exit marking”.
Herein first and second lower reference points T1, T2 are shown, which may be used as a target position within the present invention. Each lower reference point T1, T2 is in particular
Respectively, third and fourth upper reference points T3, T4 are shown, which are of interest for the present invention. Each upper reference point T3, T4 is in particular
If possible, the foremost/rearmost points P1, P2, which are used for defining the planes X1 and X2, are the foremost/rearmost points of the door contour 31 as shown in
If the contour 31 itself cannot be extracted clearly, it is also sufficient that the foremost/rearmost points of the PCM 32 are used as the foremost/rearmost points P1, P2 for defining the planes X1, X2. For detecting the position of the door in the longitudinal direction x it is merely important to have positional information which is roughly centered with the door. Each of the reference points Ti has the coordinates xti, yti, zti (for i=1, 2, 3 or 4).
As can be seen in the different illustrations, in different embodiments the PCM 32 does not match the door contour 31; however, in any case the PCM 32 is sufficiently centered in the longitudinal direction with respect to the door contour 31.
In this example the VDGS 94 recognizes that the aircraft 1 is an Airbus A320-200 and should be located at a predetermined parking position. In addition, in a database 91, in particular connected to the flight control center, information about the type and identification of the next aircraft expected at the gate may be stored. In fact the aircraft parking position will slightly deviate from the exact predetermined parking position, which can be detected by certain types of VDGS 94. The VDGS 94 is connected to the database 91; the database 91 may comprise structural information of the aircraft, in particular the relative position of the door 3 to be docked within the aircraft coordinate system. Based on the available information with respect to the position of the aircraft 1 on the apron and the relative door position within the aircraft 1, an assumed position 8 of the aircraft door 3 can be calculated. Here the assumed position is an area 8 in which the position of the aircraft door may be located.
Alternatively or in combination, the database 91 may comprise immediate positional information of the assumed door position, if the type and/or identification of the next arriving aircraft is stored, since each aircraft of the same type has to be parked at the same parking position and comprises identically located doors. The database may also comprise individual information on which door is to be docked. This is in particular of interest for wide-body aircraft, which comprise two or more left doors in front of the wings that may be considered for being docked by a standard (not overwing) PBB. Optional details are described later with reference to
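A sketch of how the assumed door position 8 could be derived from these data sources follows; the lookup table, the door offset and the uncertainty radius are hypothetical values used only to illustrate the combination of parking position and stored door geometry.

```python
# Illustrative derivation of the assumed door position 8 from the parking
# position (e.g. reported by the VDGS 94) and a per-type door offset
# (e.g. stored in the database 91). All values are assumptions.

# Relative door position within the aircraft coordinate system per aircraft type
# (x along the fuselage, y transverse, z height).
DOOR_OFFSETS = {
    "A320-200": (5.9, -1.9, 2.6),   # hypothetical offset of door 1L from the nose
}

def assumed_door_position(aircraft_type, parking_position, uncertainty_m=0.3):
    """Return the assumed door position and an uncertainty radius (the area 8)."""
    nose_x, nose_y, nose_z = parking_position          # position of the parked aircraft
    off_x, off_y, off_z = DOOR_OFFSETS[aircraft_type]  # stored door offset for the type
    position = (nose_x + off_x, nose_y + off_y, nose_z + off_z)
    return position, uncertainty_m

if __name__ == "__main__":
    pos, radius = assumed_door_position("A320-200", parking_position=(40.0, 12.0, 0.0))
    print("assumed door position:", pos, "+/-", radius, "m")
```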
The PBB 10 comprises, as usual, a tunnel 11 which is connected at one end in a conventional manner to a rotunda located at the airport building (not shown). At the other end the PBB comprises a bridgehead 13, which is to be brought into alignment with the aircraft door 3, so that passengers can leave the aircraft 1 via the door 3 and the tunnel 11 in direction 21 to the airport terminal building.
Conventional drive means 12 are provided to adjust the position of the bridgehead 13 by adapting the length and orientation of the tunnel 11. A conventional lift system (not shown) may be provided to adjust the height position of the bridgehead 13. Additionally, the relative angular orientation between the bridgehead 13 and the tunnel 11 can be adapted, since a pivotable joint, in particular a round cabin 22, is provided between the bridgehead 13 and the tunnel 11.
The operation of the drive means 12 is controlled by a control unit 93 of the PBB 10. The control unit 93, the VDGS 94, the database 91 and a data connection 92, connecting the aforementioned components are part of a control arrangement 90.
At the bridgehead 13 a main camera 50 is provided which is used for automatic docking of the bridgehead 13 to the door 3. The main camera 50 has a field of view 51. In situation A the passenger boarding bridge 10 is located in parking position. Here the door 3 is not located in the field of view of the main camera 50. Hence the camera based docking system cannot operate yet. Consequently, at first a prepositioning step has to be performed to bring the main camera 50 into a position, in which the door 3 is in the field of view 51 of the main camera 50.
During or before prepositioning, positional information is obtained with the help of the database 91, in particular in combination with the VDGS 94. This positional information is used for determining an assumed position 8 of the door 3. Based on that available information, the control unit 93 initiates a first movement of the bridgehead 13 into a condition where the door 3 is in the field of view 51 of the main camera 50 (situation B in
In a subsequent phase B-C, which is the time between situation B (
For a proper docking it is essential that the approaching edge 20 is properly aligned in parallel to the door sill 31L. Additionally the approaching edge should be aligned in a predetermined way in the longitudinal direction x; in particular the bridgehead 13 may be centered to the center of the door gap 31 or may be aligned slightly offset to the center of the door gap 31 (to enable an opening of the door). A large door may collide with the side wall of the canopy if the bridgehead is centered exactly. For aligning the bridgehead 13 in the longitudinal direction x, the first reference point T1 and the second reference point T2 are used.
During phase B-C the main camera 50 is used for scanning the door 3. A result of that scanning is the creation of a digital three-dimensional model 3d of the door 3, which is shown in
In another embodiment a plurality of models 3d is already prepared and stored in a database 91. Here, to each of a plurality of aircraft types an individual door model 3d is allocated. As discussed before, the VDGS 94 may be used for determining the aircraft type, or the expected aircraft type may be stored in the database 91. The database 91 may be queried to provide the prestored model 3d associated with the determined aircraft type. “Prestored” means that the door model 3d is already available in a database before the docking procedure begins and that the door model 3d can be retrieved from the database 91 during docking. This may be used for determining the model 3d instead of creating a door model 3d during each docking process. Alternatively, a prestored three-dimensional model 3d and a created three-dimensional model 3d can be used together to verify the aircraft type or to improve the quality of the created model 3d.
The obtained coordinates of the reference points T1-T4 are shown in
The result to be achieved in situation D is to properly align the approaching edge 20 with the door sill 31L. But since the fuselage itself increasingly covers the door sill 31L, at a certain point in phase C-D it cannot be assured that the lower reference points T1 and T2, or any other point on the door sill 31L, can be seen reliably by the camera. Here situation C1 is the point in time when the reference points T1, T2 leave the field of view 51.
It is to be noted that “properly aligned” does not mean that the approaching edge is in an exactly overlapping condition with the door sill 31L. Rather, a proper alignment may require a safety gap between the approaching edge and the fuselage of about 5 cm, and the floor of the bridgehead should be aligned slightly below the level of the door sill (about 15 cm), so that a safety shoe can be placed between the door and the bridgehead floor 17.
During phase B-C1 the upper reference points T3, T4 and the lower reference points T1, T2 are within the field of view 51 of the main camera 50. The main camera 50 is a stereo camera, through which the positions of the reference points relative to the camera position can be calculated. This is done by usual stereoscopic analysis of the obtained pictures using available picture recognition algorithms.
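For illustration only, the following sketch shows a textbook stereo triangulation as one possible element of such an analysis; the camera parameters and pixel coordinates are invented, and a real system would use a complete picture recognition pipeline.

```python
# Simplified stereo triangulation: the depth of a reference point is recovered
# from its disparity between the left and right image of a calibrated stereo
# camera. All parameters are hypothetical.

def triangulate(u_left, u_right, v, focal_px, baseline_m, cx, cy):
    """Return the point position (x, y, depth) in the camera frame.

    u_left/u_right: horizontal pixel coordinate of the same reference point in
    the left and right image; v: vertical pixel coordinate; focal_px: focal
    length in pixels; baseline_m: distance between the two lenses; cx, cy:
    principal point.
    """
    disparity = u_left - u_right
    depth = focal_px * baseline_m / disparity        # distance to the point
    x = (u_left - cx) * depth / focal_px
    y = (v - cy) * depth / focal_px
    return (x, y, depth)

if __name__ == "__main__":
    # Hypothetical stereo match of reference point T1 in both images.
    print(triangulate(u_left=820.0, u_right=790.0, v=560.0,
                      focal_px=1400.0, baseline_m=0.12, cx=960.0, cy=540.0))
```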
In case the picture recognition algorithm does not provide a valid position of the reference points, the user can be prompted to assist, e.g. by clicking on the illustration of the corners of the door, which is presented to the user on a screen.
Based on that the spatial coordinates (xt1, yt1, zt1), (xt2, yt2, zt2), (xt3, yt3, zt3), (xt4, yt4, zt4) of all four reference points T1, T2, T3, T4 are calculated (see box in
Based on the obtained model 3d a differential relationship, e.g. in the form of the vectors D13, D24, can be calculated. The first differential vector D13 constitutes the spatial difference between the third reference point T3 and the first reference point T1. The second differential vector D24 constitutes the spatial difference between the fourth reference point T4 and the second reference point T2.
In phase C1-D (C1 is a situation after situation C and before situation D), the first and second reference points T1, T2 are not visible to the main camera 50. However the third and fourth reference points T3, T4 are still visible and their position can be determined by the main camera. With the help of the model 3d, the coordinates of the first and second reference points T1, T2 can be obtained, in particular by calculating (see right column in box of
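Under the definitions above (D13 = T3 − T1, D24 = T4 − T2), this calculation reduces to a simple vector subtraction; the following sketch uses invented numeric values purely for illustration.

```python
# Sketch of the recovery described above: once T1, T2 leave the field of view,
# their coordinates are obtained from the still-visible points T3, T4 and the
# differential vectors D13 = T3 - T1 and D24 = T4 - T2 taken from the door
# model 3d. Numeric values are illustrative only.

def recover_lower_points(t3, t4, d13, d24):
    """Return (T1, T2) computed as T1 = T3 - D13 and T2 = T4 - D24."""
    t1 = tuple(a - b for a, b in zip(t3, d13))
    t2 = tuple(a - b for a, b in zip(t4, d24))
    return t1, t2

if __name__ == "__main__":
    # Differential vectors obtained earlier from the door model 3d (assumed values).
    d13 = (0.0, 0.05, 1.90)          # T3 lies 1.90 m above and slightly behind T1
    d24 = (0.0, 0.05, 1.90)
    t3 = (10.20, 5.25, 5.30)         # currently measured upper reference points
    t4 = (11.10, 5.25, 5.30)
    print(recover_lower_points(t3, t4, d13, d24))
```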
It is advantageous that the door 3 is visible to the main camera 50 for as long as possible. Therefore the position of the main camera 50 is an important aspect. It has been found that, for the function of the present application, it is advantageous that the main camera is positioned
Alternatively or additionally, an auxiliary camera 55 may be used for continuing to determine the target position T1, T2 during the phase C1-D, when the target position is not in the field of view of the main camera 50. By way of example, the position of the auxiliary camera 55 is shown in
As can be seen in
However, the lowered position of the auxiliary camera 55 carries an increased risk of “losing” the view of the complete door earlier than the upper position of the main camera 50. Having the door within the field of view of one single camera for as long as possible improves the scanning results and in particular the creation of the three-dimensional model 3d of the door. Consequently the upper position of camera 50 has advantages, although the target position will eventually be lost from the field of view 51. So, the critical distance d at which the upper main camera 50 loses parts of the complete door from its field of view is roughly about 1 m; the critical distance d at which the lower auxiliary camera 55 loses parts of the complete door from its field of view is roughly about 2 m. So, for establishing the model 3d, the upper camera is more advantageous.
As long as both cameras can see the target position T1, T2, the auxiliary camera 55 can be calibrated with the scan results of the main camera 50.
The trajectory 60 also comprises a course 62 of orientations 62b-d. Here the orientations 62b-d are vectors defining the direction in which the bridgehead 13 is pointing during the situations B, C and D. In the final docked situation D it is essential that the approaching edge is oriented in parallel to the door sill 31L. That means that in situation D the vector 62d is perpendicular to the fuselage 2 in the area of the door 3. Note that the fuselage may be curved, which is neglected in this description to keep the complexity low.
As obvious from
The trajectory 60 can also be used for assessing a possible collision between the PBB and an obstacle. Generally an obstacle may be detected by comparing a first image with a second image from an additional camera or another sensor, which can be attached in the area of the drive 12. The first image may be a prestored image showing the apron area without any obstacle. The second image is an actual image, showing the current situation on the apron. With the help of picture recognition, differences between the two images can be determined. Any object which is present in the second image but not in the first image may be considered as an obstacle.
But not every obstacle on the apron presents a problem. Within the scope of the invention, only those obstacles which lie in the area of the trajectory may present a problem.
The first obstacle 63 has a plan view distance to the trajectory of d63, which is larger than a required minimum clearance distance c. Consequently the first obstacle 63 is not considered as problematic. The second obstacle 64 has a plan view distance to the trajectory of d64, which is smaller than the required minimum clearance distance c. Consequently the second obstacle 64 is considered as problematic. The presence of the second obstacle 64 will induce the control unit to switch into a safety mode. In the safety mode, the movement of the PBB may be stopped or at least a warning signal may be issued. It is possible that there are distinct safety modes, to which different clearance distances are allocated.
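A plan-view relevance check of this kind is sketched below; the geometry helper, the clearance value and the sample coordinates are assumptions used only to illustrate the filtering of obstacles against the path.

```python
# Illustrative plan-view check of detected obstacles against the trajectory 60:
# an obstacle only triggers the safety mode if its distance to the path falls
# below the clearance distance c. All values are assumptions.

import math

def point_segment_distance(p, a, b):
    """2D distance between point p and the line segment a-b (plan view)."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def obstacle_is_relevant(obstacle_xy, path_xy, clearance_m=1.0):
    """True if the obstacle lies closer to the path than the clearance c."""
    return any(point_segment_distance(obstacle_xy, path_xy[i], path_xy[i + 1]) < clearance_m
               for i in range(len(path_xy) - 1))

if __name__ == "__main__":
    path = [(2.0, 8.0), (6.0, 6.5), (9.5, 5.3)]      # plan-view path of the drive
    print(obstacle_is_relevant((4.0, 10.5), path))   # far from the path -> False
    print(obstacle_is_relevant((6.5, 6.2), path))    # near the path -> True (safety mode)
```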
Due to vibrations and/or other environmental influences, the calibration status of the camera may become invalid during operation of the bridge. Therefore the system comprises an auto-calibration procedure, which is described with reference to
A calibration tag 53 is provided at a defined position within the field of view 51, 56 of the camera to be calibrated, in particular of the main camera 50 and/or of the auxiliary camera 55. The tag 53 may be fixed with a tag fixture 54 to the bridgehead 13. The fixture may be a separate part as shown in
The position of the calibration tag 53 relative to the position of the camera 50, 55 to be calibrated is prestored. So, in a calibration step before docking the camera is calibrated. Hereby the camera performs a step of detecting the relative position of the tag by image recognition. The camera is then calibrated by comparing the detected position with the prestored position.
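By way of illustration only, the sketch below reduces this comparison to a simple positional correction; a real calibration would typically also correct orientation, and the tag detection by image recognition is outside the scope of the sketch.

```python
# Hypothetical sketch of the auto-calibration step: the deviation between the
# detected and the prestored tag position yields a correction that is applied
# to subsequent measurements. Values and names are assumptions.

def calibration_offset(detected_tag_pos, prestored_tag_pos):
    """Correction vector to be added to positions reported by the camera."""
    return tuple(p - d for d, p in zip(detected_tag_pos, prestored_tag_pos))

def apply_calibration(measured_pos, offset):
    """Correct a measured position with the calibration offset."""
    return tuple(m + o for m, o in zip(measured_pos, offset))

if __name__ == "__main__":
    prestored = (1.20, 0.80, -0.40)        # known tag position relative to the camera
    detected  = (1.23, 0.79, -0.41)        # tag position found by image recognition
    offset = calibration_offset(detected, prestored)
    print("offset:", offset)
    print("corrected target:", apply_calibration((9.50, 5.30, 3.40), offset))
```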
The vertical speed in height direction z (not shown in
In an embodiment, to validate that the bridgehead is properly docked, a validation tag 57 is provided e.g. at the floor 17 of the bridgehead 13. The validation tag 57 may be an optical mark, which can be detected and located by the main camera 50. The camera checks the alignment, in particular in the longitudinal direction x, between the validation tag 57 and a hinged side 31 of the door 3 (in closed condition). The hinged side 31 of the door 3 is where the door hinge and the axis of door swing may be located. Here the hinged side is the left side, so when viewed from the bridgehead, the door swings to the left side. The hinged side 31 may be optically detected with the help of the door contour side 31S or of an area of the PCM marking 32 (see
The validation tag 57 and the calibration tag 53 can be the same tag.
Within the scope of the present invention a main camera is described which is enabled to detect e.g. the location of the target. From this formulation it becomes obvious that the term “camera” is also used for describing a more complex arrangement having, in addition to a pure photo sensor, extensive image analyzing capabilities; this camera may be split into separate devices and may comprise a computer.
The invention provides a method which does not require any coding at the fuselage of the aircraft containing coded information about the location of the door, e.g. a QR code or an RFID tag. Thus the invention does not require any preparation performed at the aircraft, and any individual aircraft arriving at the PBB can be processed with the inventive method.
In particular the preposition is/the prepositions are selected from a number of predefined prepositions based on the type of aircraft and/or based on the parking position of the aircraft. During docking it is determined on which centerline of the MARS stand the aircraft is parked; based on the determined centerline the target position can be determined in a rough way, in particular by retrieving appropriate positional information from a database.
The MARS stand comprises two or more PBBs. Before docking it needs to be decided which of the plurality of PBBs are to be docked. The decision can be made automatically using predefined allocation or selection rules, which can consider the type of aircraft to be docked, the specific centerline on which the aircraft is located, and/or a selection of the door which is to be connected by a PBB.
1 aircraft
2 aircraft fuselage
3 aircraft door
3a closed door
3b open door
3d door model
3l left side of door
3h door hinge
3s door swing range
4 reference points
5 apron ground
6 side window
7 cockpit window
8 assumed area of aircraft door
10 Passenger boarding bridge
11 tunnel
12 drive means
13 bridgehead
14 interior of bridgehead
15 canopy
16 canopy bumper
17 floor
18 floor bumper
19 cabin roof
20 approaching edge
21 direction to airport terminal building
22 round cabin
23 cabin side wall
24 centerline
25 MARS stand
31 door contour
31U door contour upper
31L door contour lower
31S door contour side
32 contour mark
33 U-shaped mark
50 main camera
51 field of view of main camera
52 picture
53 calibration tag
54 tag fixture
55 auxiliary camera
56 field of view of auxiliary camera
57 validation tag
60 trajectory
61 path
62 course of orientation
62b-d orientation vector
63 first obstacle
64 second obstacle
90 control arrangement
91 database
92 data connection
93 control unit
94 VDGS
x longitudinal direction
y transverse direction
z height direction
Z horizontal plane (within aircraft)
X vertical plane (within aircraft)
T1,T2 target position
T3,T4 auxiliary position
s rearward offset of main camera 50 behind approaching edge
tx,ty,tz coordinates of the position of the reference points
h height above ground
d distance between approaching edge and fuselage
Number | Date | Country | Kind |
---|---|---|---|
18382372.3 | May 2018 | EP | regional |
10 2018 211 492.7 | Jul 2018 | DE | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/EP2019/063906 | 5/29/2019 | WO | 00 |