In aviation, takeoffs and landings are relatively dangerous compared to other portions of a flight. Some of the risks associated with takeoffs and landings may be surface hazards that exist on runways or landing pads, such as snow, ice, or water that might not be seen by the pilot. Snow, ice, or water may cause the aircraft to skid or hydroplane, increasing the chance of an accident. Knowledge of such conditions allows the pilot or the autopilot system to make changes or take precautions to compensate for such conditions. However, seeing and recognizing such hazards can sometimes be difficult.
Cameras have been used in an effort to facilitate detection of hazardous conditions. If a surface hazard condition, such as water or ice, is detected on a runway or landing pad by a camera-based system, a pilot can be warned or otherwise notified of the surface hazard. For autonomous aircraft, hazardous conditions detected by a camera-based system may be used to make control decisions to mitigate or avoid the effects of the hazardous condition. However, it can be difficult for camera-based systems to detect at least some hazardous conditions. For example, water or ice on a runway is often substantially transparent and, therefore, can be difficult to detect. In this regard, ice or water allows light to pass and reflect from the surface of the runway or landing pad. Thus, in an image captured by a camera, a portion of a runway or landing pad covered by water or ice may appear similar to other portions of the runway or landing pad, thereby making it difficult to use segmentation or other known image processing techniques to detect the presence of water or ice on the runway or landing pad.
The disclosure can be better understood with reference to the following drawings. The elements of the drawings are not necessarily to scale relative to each other, emphasis instead being placed upon clearly illustrating the principles of the disclosure. Furthermore, like reference numerals designate corresponding parts throughout the several views.
The present disclosure generally pertains to systems and methods for detecting surface conditions. A system in accordance with one embodiment of the present disclosure is mounted or otherwise positioned on a vehicle and detects surface conditions external to the vehicle by capturing, processing, and analyzing multiple images of differing polarization. In this regard, the system uses at least one image sensor to capture a plurality of images of external areas (e.g., roadways, taxiways, or landing zones, such as runways or landing pads). At least one image is polarized differently than another image, and the two images with differing polarizations are compared to provide a comparison image. As an example, two images of the same surface may be orthogonally polarized and subtracted, though other types of polarization and comparisons may be performed in other embodiments. Certain types of surface conditions, such as water, ice, or snow, may have certain characteristics or signatures in the comparison image, thereby facilitating detection of the presence of these surface conditions in the comparison image. Thus, the comparison image may be analyzed to detect certain surface conditions, such as surface conditions that may be hazardous to the operation of the vehicle. When a hazardous surface condition is detected, a user (e.g., a pilot or driver) of the vehicle may be notified or information indicative of the detected surface condition may be used to control the vehicle.
Note that there are a variety of techniques that may be used to capture images of different polarizations. As an example, two or more cameras having different polarization configurations may be used. In this regard, each camera may have a polarizing filter that filters light differently than the polarizing filters of the other cameras. For example, one camera may have a filter that permits light polarized in a first direction (e.g., a vertical direction) to pass, and another camera may have a filter that permits light polarized in a direction (e.g., a horizontal direction) orthogonal to the first direction to pass. In other examples, a single sensor may be used to capture multiple images having different polarizations. For example, it is possible to use a camera with a polarizing filter that is moved, removed, replaced, or exchanged between successive shots. In another example, a single sensor may be used with a single polarizing filter that provides multiple polarizations to the sensor.
Once the system detects a hazardous surface condition for a surface on which the vehicle is traveling or will travel, the system may warn a user of the hazardous condition in a variety of ways. For example, the system may activate an indicator light or provide some other visual warning in response to a detection of a certain hazardous condition, such as ice, snow, or water. If desired, an audio warning, such as a buzzer or a recorded voice message, may be output to the user. In some embodiments, the output provided by the system may recommend certain user actions, such as certain types of braking maneuvers or other types of maneuvers for controlling operation of the vehicle. In some embodiments, the system may use information on the surface conditions to predict a braking distance required to bring the vehicle to a stop, and the system may output information indicative of the braking distance and/or whether the surface is suitable for operation. As an example, the system may compare the braking distance to the length of a runway and provide a warning if the runway is not sufficiently long to perform a safe braking maneuver. The system may also provide information indicating the location of a detected surface condition. For example, the system may generate an image of a roadway, runway, taxiway, or landing pad and indicate the location of the detected hazard on the generated image.
Based on the surface conditions detected by the system, various actions may be taken to control the operation of the vehicle, whether such control is implemented by a human operator (e.g., a pilot or driver) or by a control system, such as would be the case for an autonomous vehicle. As an example, a decision may be made to divert the vehicle away from a hazardous surface condition. In this regard, a decision may be made to land an aircraft at a different location in response to a detection of a hazardous surface condition at a landing zone. In another example, the vehicle may be controlled to come to a stop prior to reaching a hazardous surface condition detected by the system or otherwise controlled (e.g., steered) to avoid the hazardous surface condition. In other embodiments, decisions may be made to change vehicle operating characteristics (e.g., anti-lock braking thresholds or braking methods) based on the surface conditions. For example, for an aircraft, reverse thrust and air braking may be relied on to a greater extent in response to a detection of a hazardous surface condition on a runway. In some embodiments, in the presence of ice or snow, certain braking techniques (e.g., a reduction in the braking force applied to one or more wheels) may be implemented to reduce the likelihood of skidding or hydroplaning.
In some embodiments, the system may be configured to wirelessly transmit information indicative of detected surface conditions from the vehicle. For example, information indicative of the type and location of certain surface conditions on a runway or landing pad may be reported to an airport maintenance crew, who may attempt to remove or compensate for the surface condition. As an example, the maintenance crew may apply salt to the runway to melt the detected ice. In another example, the information may be reported to other vehicles to warn other pilots, drivers, or control systems for these vehicles. The information may be used to update a map showing hazardous surface conditions. In other embodiments, the information provided by the system may be used for other purposes.
Note that a surface hazard 70 generally refers to any surface condition or anomaly that may be a hazard to the safe operation of the vehicle 10 if the vehicle encounters the surface condition during operation. As an example, a surface hazard 70 may be a pothole in the pavement of the landing zone 100 or ice, snow, or water on a surface of the landing zone 100.
While in this example vehicle 10 is an airplane, in other embodiments, the vehicle 10 may be of any type, including motorcycles, cars, and trucks. The vehicle 10 may also be other types of aircraft, such as helicopters, drones, and vertical takeoff and landing (VTOL) aircraft. Further, the vehicle 10 may be controlled by a user (e.g., a pilot) on board the vehicle 10, or control of the vehicle 10 may be autonomous, such as by a controller on the vehicle or at another location. Exemplary autonomous vehicles are described by U.S. Application No. 16/302,263, entitled “Self-Piloted Aircraft for Passenger or Cargo Transportation” and filed on Nov. 16, 2018, which is incorporated herein by reference.
While the polarizing sensor 20 is depicted at the nose of the vehicle 10, it could be placed anywhere (e.g., on the wings or at the top or bottom of the fuselage) with a view of the area to be evaluated. As an example, for a VTOL aircraft, the sensor 20 may be positioned underneath the aircraft 10 to view the area directly below the aircraft during a takeoff or landing. The sensor 20 may be mounted in a fixed position, or it may be mounted such that it can move (e.g., rotate left, right, up, and down) to allow the image sensor to monitor different fields of view. In addition, any number of polarizing sensors 20 may be used on the vehicle 10. As will be described in more detail below, a polarizing sensor 20 is configured to capture a polarized image and may include one or more polarizing filters for providing polarized light.
As used herein, a “polarized image” refers to an image of polarized light. As an example, a polarized image may be formed by passing light through a polarizing filter (such as a vertically-polarizing or horizontally-polarizing filter) and then capturing the filtered light with an optical sensor (e.g., a camera). In accordance with some embodiments of the instant disclosure, images of light polarized in different directions are compared in order to identify surface conditions that may otherwise be difficult to see with the naked eye.
In this regard, light reflects differently from different types of surfaces. As an example, light may reflect from asphalt, such as may be used for runways, taxiways, landing pads, or roadways, differently than from water or ice formed on the asphalt. Such water or ice may be substantially transparent, making it difficult to see or identify the water or ice in an unpolarized image or with the naked eye. However, by comparing differently polarized images of the same scene, differences in the reflection properties of the water or ice relative to the reflection properties of the asphalt can be accentuated, thereby facilitating detection of the water or ice on the asphalt.
To better accentuate these differences, it may be desirable for the polarization directions of the compared images to be as different as possible. As an example, two images that are orthogonally polarized (e.g., a horizontally-polarized image and a vertically-polarized image) may be compared. However, it is possible for the difference in polarization to be less or otherwise different in other embodiments.
Note that there are various techniques that may be used to perform a comparison of images that are polarized differently. In some embodiments, the comparison may be performed by subtraction. As an example, a pixel-by-pixel subtraction may be performed such that a pixel in one image is subtracted from a corresponding pixel (e.g., a pixel representing the same geographic location) in the other image, resulting in a “differential image” where each pixel value of the differential image is the difference between corresponding pixels of the polarized images. In other embodiments, other types of comparisons may be performed. As an example, addition, multiplication, or other types of mathematical operations may be performed on corresponding pixel values.
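For illustration, a minimal sketch of such a pixel-by-pixel subtraction is shown below, assuming the two differently polarized captures are already co-registered, single-channel NumPy arrays; the function name and the use of NumPy are illustrative choices, not part of the disclosure.

```python
import numpy as np

def differential_image(pol_a: np.ndarray, pol_b: np.ndarray) -> np.ndarray:
    """Subtract two co-registered, differently polarized images pixel by
    pixel, producing a 'differential image' in which each pixel value is
    the difference between corresponding pixels of the polarized images.
    """
    if pol_a.shape != pol_b.shape:
        raise ValueError("polarized images must be co-registered and equal in size")
    # Use a signed type so the subtraction cannot wrap around, then keep
    # the magnitude of the difference.
    diff = pol_a.astype(np.int32) - pol_b.astype(np.int32)
    return np.abs(diff).astype(np.uint16)
```

A pixel whose value is large in one polarization and small in the other, as may occur for glare reflected from water or ice, then stands out in the result.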
As an example, the differences of the pixels representing a surface hazard 70 (such as a patch of ice or puddle of water) may be significantly greater (or otherwise different) than the differences of the pixels representing a surface (e.g., asphalt) of the landing zone 100, such as a runway. Thus, the surface hazard 70 may appear accentuated in the differential image 330 relative to the landing zone 100, thereby facilitating detection of the surface hazard 70.
Moreover, certain surface conditions may exhibit certain patterns or ranges of difference values, making it possible not just to detect the presence of the surface condition but also to identify the type of surface condition (and, hence, the hazard for surface conditions that are hazardous). In this regard, a surface condition or hazard of a certain type may have a signature in the differential image 330 that can be learned and then used to identify the type of, or in other words classify, the surface condition in the image 330. Thus, a system may use the differential image 330 not just to detect the presence of a surface hazard but also to identify its type.
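As a hedged sketch of how such a learned signature might be applied, the toy classifier below maps simple statistics of a candidate region of the differential image to a surface type; the thresholds and class labels are assumptions for illustration, since the disclosure does not specify a particular classification method.

```python
import numpy as np

def classify_region(diff_region: np.ndarray) -> str:
    """Map differential-image statistics to a surface-condition label.

    The mean/spread thresholds below are placeholders; in practice, the
    signature for each condition would be learned from labeled
    differential images.
    """
    mean = float(diff_region.mean())
    std = float(diff_region.std())
    if mean > 80.0 and std < 15.0:
        return "ice"    # strong, relatively uniform polarization difference
    if mean > 50.0:
        return "water"  # strong but less uniform difference
    return "bare surface"
```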
The optical sensors 420 may be cameras, arrays of photodetectors, or other types of sensors for capturing images. Images captured by these optical sensors 420 are transmitted to the controller 430, which compares the images to detect surface conditions (e.g., surface hazards) and provides information indicative of the detected surface conditions to an output interface 440 and/or flight control system 450, as will be discussed later in more detail.
Controller 430 can be implemented in a variety of ways including specific analog hardware, general-purpose machines running software, or a combination thereof.
At step 630, the resulting image is evaluated for surface conditions. The controller 430 performs segmentation, identification, and classification on the comparison image 530. In some embodiments, segmentation, identification, and classification may be performed on the original captured images 520 and used with the comparison image to further segment, identify, and classify the objects, features, and hazards in view. Segmentation can also be used to eliminate false positives or to change how a detected condition or hazard is processed. For example, a portion of the resulting image or original image may be identified as an area of no interest and then culled so that it is not analyzed for detection of surface hazards or other types of surface conditions. As an example, a portion of an image may be identified as sky for which no surface hazards should be present. Such a portion can be culled so that it is not further processed by the controller 430.
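One possible form of such culling is sketched below, under the assumption that an upstream segmentation step has already produced a boolean mask of the areas of interest; the mask source and function name are illustrative.

```python
import numpy as np

def cull_no_interest(diff: np.ndarray, interest_mask: np.ndarray) -> np.ndarray:
    """Zero out regions flagged by segmentation as areas of no interest
    (e.g., sky) so they are not analyzed for surface hazards.

    interest_mask is a boolean array of the same shape as diff: True for
    pixels belonging to surfaces worth analyzing, False for culled areas.
    """
    culled = diff.copy()
    culled[~interest_mask] = 0  # culled pixels are excluded from hazard analysis
    return culled
```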
External factors may also affect the evaluation (e.g., classification) of surface conditions. Such external factors may include location, date, reported air temperature, reported ground temperature, and other physical characteristics. Location may be detected through a location sensor, such as a global positioning system (GPS) sensor. As an example, in evaluating the surface conditions, the controller 430 may detect a surface condition having a signature similar to that of ice. However, if the surface temperature for the region is above a certain threshold above freezing, such as 50 degrees Fahrenheit, the controller 430 may be configured to refrain from classifying the surface condition as ice. If the vehicle 10 is located over a region with a high concentration of swamps, then the controller 430 may be configured to identify a region with a significant percentage of surface area covered with water as a “swamp.” However, if the vehicle 10 is located over a region known to be free of swamps, then the controller 430 may refrain from identifying such an area as a “swamp.” External factors may be used in other ways to aid in the identification of surface conditions in other examples.
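A minimal sketch of such temperature gating appears below; the fallback label and the exact comparison are assumptions, since the disclosure says only that the controller refrains from classifying the condition as ice.

```python
def gate_ice_classification(candidate: str, ground_temp_f: float,
                            threshold_f: float = 50.0) -> str:
    """Suppress an 'ice' classification when the reported ground
    temperature is above the threshold (50 degrees Fahrenheit in the
    example above), since ice is then implausible.
    """
    if candidate == "ice" and ground_temp_f > threshold_f:
        return "unclassified"  # refrain from classifying as ice
    return candidate
```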
In evaluating the surface conditions, certain surface hazards may be detected, such as water, ice, or snow. Features may also be detected during this process, such as the presence of route-significant objects (e.g., a road, runway, or taxiway). Feature detection may include interpreting signs and markings associated with an object (e.g., street signs, lights, runway markings, etc.). As an example, runway markings may be identified in a captured image and used to identify and define the boundaries of the corresponding runway. Features may also broadly include estimates of characteristics (e.g., the length of a runway or the flatness of a field). Such estimates may be determined based on the captured images. As an example, the length of a runway may be estimated based on its apparent length in one or more captured images. In other embodiments, the length of a runway may be predefined and stored in the system 400. As an example, a runway could be identified based on the vehicle’s location relative to the runway, and the length of the identified runway may be retrieved from memory. In other embodiments, other techniques for estimating characteristics of features are possible.
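A sketch of the stored-length alternative follows, with hypothetical runway identifiers and lengths that are not taken from the disclosure.

```python
# Hypothetical predefined runway data keyed by runway identifier.
RUNWAY_LENGTHS_M: dict[str, float] = {
    "RWY-09L": 3200.0,
    "RWY-27R": 2800.0,
}

def runway_length_m(runway_id: str) -> float | None:
    """Retrieve a predefined runway length from memory, as an alternative
    to estimating the length from captured images. Returns None if the
    runway is not in the stored data."""
    return RUNWAY_LENGTHS_M.get(runway_id)
```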
Hazards and features may be interrelated. For example, some surface conditions may be classified as a hazard depending on how the surface condition relates to detected features. In some embodiments, ice may be identified as a surface hazard only if it covers a certain percentage or certain areas of the vehicle’s landing zone 100, such as a runway, landing pad, or roadway segment. For instance, if ice is located near the edge of a runway but the center of the runway is substantially free of ice, then the ice may not be a threat to the safe operation of the vehicle 10 and, thus, might not be classified as a hazard.
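The coverage test described above might look like the following sketch, where the 10% threshold is an illustrative assumption (the disclosure says only “a certain percentage”).

```python
import numpy as np

def ice_is_hazard(ice_mask: np.ndarray, zone_mask: np.ndarray,
                  coverage_threshold: float = 0.10) -> bool:
    """Classify detected ice as a hazard only if it covers more than a
    threshold fraction of the landing zone.

    Both masks are boolean arrays over the same image grid: zone_mask
    marks the runway or landing pad, ice_mask marks detected ice.
    """
    zone_pixels = int(zone_mask.sum())
    if zone_pixels == 0:
        return False  # no landing zone in view
    ice_in_zone = int((ice_mask & zone_mask).sum())
    return ice_in_zone / zone_pixels > coverage_threshold
```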
At step 640, the controller 430 is configured to provide information on the surface conditions, including surface hazards. This information is provided to the output interface 440 and the flight control system 450. Based on the surface conditions detected, one or more actions may be performed.
These actions may come in the form of providing information, warnings, or alerts through an audio or visual medium of the output interface 440. In an operator-controlled vehicle 10, these alerts might come in the form of sounds or lights. For example, an indicator light may be illuminated to indicate particular hazards, a class of hazards, or hazards in general (e.g., an ice-on-road light, a slick-conditions light, or a hazard light). In some embodiments, the controller 430, through the output interface 440, may indicate the location of a surface hazard by use of a display (e.g., a heads-up display or a monitor displaying text, a map, augmented reality, or a display image with the hazard location indicated or highlighted). In some embodiments, this may come in the form of displaying the comparison image 530, or at least the subsets of the comparison image 530 determined to be a hazard or otherwise of interest, over one of the captured images 520. In other embodiments, the hazards or objects of interest may be highlighted, circled, or otherwise indicated.
In some embodiments, the output may include recommendations to the pilot or driver regarding operation of the vehicle, such as suggestions for certain maneuvers. As an example, based on the detected surface conditions, including hazards, the controller 430 may be configured to estimate a braking distance for the vehicle 10 and provide information on the braking distance to the pilot, driver, or other user. In this regard, the braking distance is generally the distance that the vehicle 10 travels while performing a braking maneuver to bring the vehicle 10 to a stop. The braking distance may be longer when the runway, roadway, or other pathway has certain surface conditions, such as ice, snow, or water. In some cases, the controller 430 may store predefined data indicative of the expected braking distance for the vehicle 10 for different types of surface conditions and look up or otherwise retrieve the braking distance associated with the type of surface condition detected. In other embodiments, the controller 430 may calculate the braking distance based on measured performance parameters of the vehicle 10, such as the vehicle’s ground speed or other parameters measured by the vehicle’s sensors. Such a calculation may take into account the types of surface conditions detected. If desired, the controller 430 may provide an output indicative of the estimated braking distance, and a user may make control decisions based on such information, such as whether to divert the vehicle 10 away from surface hazards (e.g., select a new landing location or new path for the vehicle 10) or select a certain braking procedure for slowing or stopping the vehicle 10. As an example, in response to adverse surface conditions that would increase braking distance for a normal braking procedure, a pilot may select a different braking procedure that tends to reduce braking distance or that is less affected by the detected surface conditions. For instance, when landing on a runway, a pilot may elect to utilize reverse thrust if a surface hazard, such as ice or water, is detected on the runway.
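As an illustration of the calculation-based approach, the sketch below estimates braking distance from ground speed using the simple friction model d = v² / (2µg); the model and the per-surface friction coefficients are assumptions, since the disclosure leaves the calculation open.

```python
# Illustrative friction coefficients per detected surface condition;
# real values would come from the vehicle's performance data.
FRICTION_BY_SURFACE: dict[str, float] = {
    "dry": 0.7,
    "water": 0.4,
    "snow": 0.25,
    "ice": 0.1,
}

G = 9.81  # gravitational acceleration, m/s^2

def braking_distance_m(ground_speed_mps: float, surface: str) -> float:
    """Estimate braking distance using d = v^2 / (2 * mu * g), where mu
    depends on the detected surface condition."""
    mu = FRICTION_BY_SURFACE.get(surface, FRICTION_BY_SURFACE["dry"])
    return ground_speed_mps ** 2 / (2.0 * mu * G)
```

Under these assumed coefficients, a touchdown speed of 70 m/s gives roughly 357 m on a dry runway but roughly 2,500 m on ice, illustrating why the detected condition matters for the recommendation.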
In some embodiments, the controller 430 may be configured to compare the estimated braking distance to a length of the runway or other pathway available for the braking procedure and provide a warning if the braking distance exceeds or is within a certain range of such length. For example, if the controller 430 determines that the estimated braking distance is close to the length of the runway or other pathway, the controller 430 may provide a warning indicating to the pilot or other user that the braking procedure may be unsafe for the types of surface conditions detected.
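A sketch of such a comparison follows; the 20% caution margin is an assumption, since the disclosure says only “within a certain range” of the runway length.

```python
def runway_warning(braking_distance_m: float, runway_length_m: float,
                   caution_margin: float = 0.20) -> str | None:
    """Return a warning when the estimated braking distance exceeds, or
    comes within a margin of, the available runway length."""
    if braking_distance_m >= runway_length_m:
        return "UNSAFE: estimated braking distance exceeds runway length"
    if braking_distance_m >= (1.0 - caution_margin) * runway_length_m:
        return "CAUTION: estimated braking distance is close to runway length"
    return None  # no warning needed
```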
In some embodiments, the controller 430 may be configured to wirelessly transmit the information indicative of the surface conditions from the vehicle. For example, information regarding surface hazards may be sent to a mapping service or other vehicles to warn other drivers or pilots of the surface hazards, or such information may be sent to ground crews who may then take actions to mitigate or remove the detected surface hazards.
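For illustration, one possible report payload is sketched below; the disclosure does not define a message format, so the fields and the JSON encoding are assumptions.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class HazardReport:
    """Illustrative report describing a detected surface hazard."""
    hazard_type: str          # e.g., "ice", "water", "snow"
    latitude: float
    longitude: float
    runway_id: str | None = None

def encode_report(report: HazardReport) -> bytes:
    """Serialize a report for wireless transmission to ground crews,
    other vehicles, or a mapping service."""
    return json.dumps(asdict(report)).encode("utf-8")
```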
In some embodiments, information similar to that described above as being output to the output interface 440 may also or alternatively be output to the flight control system 450, which may automatically control operation of the vehicle 10 based on such information. As an example, the vehicle 10 may be autonomously controlled by the flight control system 450, which may control the vehicle 10 based on surface conditions detected by the system 400. For instance, the flight control system 450 may be configured to select an appropriate braking maneuver to use, as described above for user-based decisions, or decide whether to abort a landing or otherwise divert the vehicle 10 based on surface conditions. If the vehicle 10 is an aircraft, the flight control system 450 may select or suggest a suitable landing area based on the surface conditions.
For land-based vehicles, roads, shoulders, bridges, and the like may be evaluated. Braking characteristics may be changed based on surface conditions, such as changing when anti-lock braking is initiated, limiting braking over hazards, or limiting braking to hazard-free areas. The path of travel may be adjusted to position the tires on hazard-free sections of the pavement or pathway.
In some embodiments, the polarizing sensor 20 may be attached to a pivoting or movable mount allowing the sensor 20 to track areas of interest while the vehicle 10 is moving. In other embodiments, multiple sensors 20 could be used to expand the area around the vehicle 10 capable of being monitored for surface conditions. In the event that a landing zone 100 for the vehicle 10 is found unsuitable or for some reason an emergency landing becomes necessary, the sensor 20 can help scan the area around the vehicle 10 and locate potential landing areas by providing additional information about the surface conditions of various potential landing zones 100 to the vehicle operator or the flight control system 450.
In some embodiments, the controller 430 is configured to process an image based on a detected surface hazard, as described below.
For certain types (e.g., classifications) of surface hazards, the optical properties of the surface hazards may cause artifacts in the un-polarized images that could result in errors in processing or analyzing such images. In some embodiments, when the controller 430 detects a certain type of surface hazard from polarized images, as described above, the controller 430 may be configured to remove or otherwise adjust the detected surface hazard in an un-polarized image in an effort to prevent the surface hazard from affecting the subsequent processing or analysis of the un-polarized image.
In this regard, the controller 430 may be configured to determine the location of a detected surface hazard in one or more of the polarized images and then identify the corresponding location in the un-polarized image where the same surface hazard should be located. That is, upon detecting a surface hazard of a certain type from polarized images, the controller 430 may identify the surface hazard’s location in the un-polarized image and then remove or otherwise adjust the pixels at such location in the un-polarized image.
Note that there are various techniques that may be used to remove or adjust the pixels at the location of the surface hazard. As an example, the pixel values at such a location may be replaced with predefined pixel values. Alternatively, surrounding pixel values in the un-polarized image close to (e.g., within a certain distance of) the surface hazard may be averaged or otherwise combined to generate new pixel values to be used to replace the pixel values of the surface hazard. Yet other techniques for adjusting the pixel values of the surface hazard are possible in other embodiments. Moreover, by removing or otherwise adjusting the pixel values of certain types of surface hazards in the un-polarized images, at least some errors induced by artifacts that would otherwise be present in the un-polarized image may be prevented. Note that a similar technique may be used to adjust pixel values of un-polarized images in other embodiments.
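The neighborhood-averaging replacement described above might be sketched as follows, assuming a single-channel un-polarized image and a boolean hazard mask; the 5-pixel border of surrounding context is an illustrative choice.

```python
import numpy as np

def patch_hazard_pixels(image: np.ndarray, hazard_mask: np.ndarray,
                        border: int = 5) -> np.ndarray:
    """Replace pixels at a detected hazard's location with the mean of
    nearby non-hazard pixels, so the hazard's artifacts do not affect
    later processing of the un-polarized image.
    """
    ys, xs = np.nonzero(hazard_mask)
    if ys.size == 0:
        return image  # nothing to patch
    # Bounding box of the hazard, expanded by a small border of context.
    y0, y1 = max(ys.min() - border, 0), min(ys.max() + border + 1, image.shape[0])
    x0, x1 = max(xs.min() - border, 0), min(xs.max() + border + 1, image.shape[1])
    window = image[y0:y1, x0:x1]
    window_mask = hazard_mask[y0:y1, x0:x1]
    surround = window[~window_mask]  # non-hazard pixels near the hazard
    patched = image.copy()
    fill = surround.mean() if surround.size else 0
    patched[hazard_mask] = np.asarray(fill).astype(image.dtype)
    return patched
```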
The foregoing is merely illustrative of the principles of this disclosure, and various modifications may be made by those skilled in the art without departing from the scope of this disclosure. The above-described embodiments are presented for purposes of illustration and not of limitation. The present disclosure can also take many forms other than those explicitly described herein. Accordingly, it is emphasized that this disclosure is not limited to the explicitly disclosed methods, systems, and apparatuses, but is intended to include variations to and modifications thereof, which are within the spirit of the following claims.
As a further example, variations of apparatus or process parameters (e.g., configurations, components, process step order, etc.) may be made to further optimize the provided structures, devices and methods, as shown and described herein. In any event, the structures and devices, as well as the associated methods, described herein have many applications. Therefore, the disclosed subject matter should not be limited to any single embodiment described herein, but rather should be construed in breadth and scope in accordance with the appended claims.