Aircraft may encounter a wide variety of collision risks during flight, such as debris, other aircraft, equipment, buildings, birds, terrain, and other objects. Collision with any such object may cause significant damage to an aircraft and, in some cases, injure its occupants. Sensors can be used to detect objects that pose a collision risk and warn a pilot of the detected collision risks. If an aircraft is self-piloted, sensor data indicative of objects around the aircraft may be used by a controller to avoid collision with the detected objects. In other examples, objects may be sensed and classified for assisting with navigation or control of the aircraft in other ways.
One type of sensor that can be used on an aircraft to detect objects is a LIDAR (light detection and ranging) sensor. A LIDAR sensor works by using a laser to send a beam or pulse at an object and calculating the distance to the object from the measured time-of-flight and the intensity of the returning beam or pulse. The range of a LIDAR sensor can be defined by the sensitivity of the LIDAR sensor when collecting the returning beam or pulse. In applications where the LIDAR sensor is used near the ground, its range is typically limited to about 100-200 meters due to eye safety concerns related to operating the laser of the LIDAR sensor at a higher power. This relatively short range can limit the usefulness of the LIDAR sensor in detecting objects in front of moving aircraft, which typically operate at high speeds.
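For illustration, the following minimal sketch (in Python, with a hypothetical function name and example value not taken from this disclosure) shows the basic time-of-flight calculation underlying a LIDAR range measurement:

```python
# Illustrative sketch of the time-of-flight range calculation underlying a
# LIDAR measurement; the function name and example value are hypothetical.

SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def range_from_time_of_flight(round_trip_time_s: float) -> float:
    """Distance to the target computed from the measured round-trip time."""
    # The beam or pulse travels to the object and back, so divide by two.
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2.0

# A return received about 1.33 microseconds after transmission corresponds
# to an object roughly 200 meters away, near the typical eye-safe limit.
print(round(range_from_time_of_flight(1.334e-6)))  # ~200
```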
The disclosure can be better understood with reference to the following drawings. The elements of the drawings are not necessarily to scale relative to each other, emphasis instead being placed upon clearly illustrating the principles of the disclosure.
The present disclosure generally pertains to vehicular systems and methods for modulating the range of a LIDAR sensor used by a vehicle, such as an aircraft. In some embodiments, an aircraft includes an aircraft monitoring system having sensors that are used to sense the presence of objects around the aircraft for collision avoidance, navigation, or other purposes. At least one of the sensors is a LIDAR sensor that can be modulated to increase its range (i.e., the distance at which the LIDAR sensor is able to detect objects). The range of the LIDAR sensor can be increased by increasing the power to the laser of the LIDAR sensor when the aircraft (and correspondingly the LIDAR sensor) is in a position where the increased laser power does not pose a risk of eye damage to humans or animals.
The increased range of the LIDAR sensor can be used when the aircraft is operating in a cruise mode (e.g., engaged in forward flight or moving in a horizontal direction) at a cruising elevation. When operating in cruise mode, if the aircraft detects an object within the beam scan or scan range of the LIDAR sensor, a determination is made as to whether there are eye safety concerns associated with the object. If there are eye safety concerns associated with the object (e.g., if the object is a bird, helicopter, or building), the power level (and corresponding range) of the LIDAR sensor is reduced to avoid any risk of eye damage to a person or animal. The power level of the LIDAR sensor can be reduced for the portion of the scan range associated with the object (e.g., a safety range associated with the angular heading of the object). For the portions of the beam scan that are not associated with the object, the LIDAR sensor can remain at the increased range and power level. Once the object has moved out of the scan range of the LIDAR sensor, the range and power level of the LIDAR sensor can be increased for the portion of the scan range that was at the reduced power level. If there are no eye safety concerns associated with the object detected by the aircraft, the LIDAR sensor can continue to operate at the increased range and power level.
During takeoff and landing operations in hover flight, the LIDAR sensor of the aircraft can be operated at the reduced range and power level to prevent eye damage to any people or animals that may be in the vicinity of the takeoff/landing area or hover area for the aircraft. As the aircraft transitions from a takeoff operation in hover flight to a cruising operation, the range and power level of the LIDAR sensor can be increased since people or animals are not expected to be present at a cruising elevation and the possibility of eye damage is therefore unlikely. Conversely, as the aircraft transitions from a cruising operation to a landing operation or hover flight, the range and power level of the LIDAR sensor are reduced to avoid the possibility of eye damage to people or animals since the aircraft is moving into an area where people or animals are expected to be present.
Note that the object 15 can be of various types that aircraft 10 may encounter during flight. As an example, the object 15 may be another aircraft, such as a drone, airplane, or helicopter. The object 15 also can be a bird, debris, or terrain that is close to a path of the aircraft 10. In some embodiments, object 15 can be any of various types of objects that may damage the aircraft 10 if the aircraft 10 and object 15 collide. In this regard, the aircraft monitoring system 5 is configured to sense any object 15 that poses a risk of collision and classify it as described herein.
The object 15 of
The aircraft 10 may be of various types, but in the embodiment of
Although the embodiments disclosed herein generally concern functionality attributed to aircraft monitoring system 5 as implemented in an aircraft, in other embodiments, systems having similar functionality may be used with other types of vehicles 10, such as automobiles or watercraft. As an example, it is possible for a boat or ship to increase the power level and range of a LIDAR sensor once it has moved a certain distance from shore or port.
In the embodiment of
When the aircraft 10 transitions from cruise mode into takeoff/landing mode, aircraft monitoring system 5 may process data from sensors 20, 30 that are configured and oriented in the direction of motion of the aircraft 10. In this regard, aircraft 10 and aircraft monitoring system 5 are configured to receive sensor data from sensors 20, 30 that are configured and oriented to sense in the space that is in the direction of motion of the aircraft 10. The aircraft monitoring system 5 may also receive sensor data from sensors 20, 30 that are configured and oriented to sense in other space so that the system 5 can detect an object 15 approaching the aircraft 10 from any direction.
Moreover, when an object 15 is identified in data sensed by sensors 20, 30, the aircraft monitoring system 5 may use information about the aircraft 10 to determine an escape envelope 25 that represents a possible range of paths that aircraft 10 may safely follow (e.g., within a predefined margin of safety or otherwise). Based on the escape envelope 25, the system 5 then selects an escape path within the envelope 25 for the aircraft 10 to follow in order to avoid the detected object 15. In this regard,
The sense and avoid element 207 of aircraft monitoring system 5 may perform processing of data received from sensors 20, 30 and aircraft control system 225 to modulate the range and power level of the LIDAR sensor 30. In addition, the sense and avoid element 207 can control a shut-off system 37 for each LIDAR sensor 30. The shut-off system 37 can be used to stop the transmission of a laser beam or pulse from a laser of the LIDAR sensor 30. The shut-off system 37 may incorporate mechanical devices (e.g., a shutter device) and/or electrical devices (e.g., a disconnect switch) to stop the transmission of the laser beam or pulse. In some embodiments, as shown by
In some embodiments, the aircraft control system 225 may include various components (not specifically shown) for controlling the operation of the aircraft 10, including the velocity and route of the aircraft 10. As an example, the aircraft control system 225 may include thrust-generating devices (e.g., propellers), flight control surfaces (e.g., one or more ailerons, flaps, elevators, and rudders), and one or more controllers and motors for controlling such components. The aircraft control system 225 may also include sensors and other instruments for obtaining information about the operation of the aircraft components and flight.
As shown by
Note that the sense and avoid logic 350 and LIDAR control logic 355, when implemented in software, can be stored and transported on any computer-readable medium for use by or in connection with an instruction execution apparatus that can fetch and execute instructions. In the context of this document, a “computer-readable medium” can be any means that can contain or store code for use by or in connection with the instruction execution apparatus.
The sense and avoid logic 350 is configured to receive data sensed by sensors 20, 30, classify an object 15 based on the data and assess whether there is a collision risk between object 15 and aircraft 10. Sense and avoid logic 350 is configured to identify a collision threat based on various information such as the object's location and velocity.
In some embodiments, the sense and avoid logic 350 is configured to classify the object 15 in order to better assess its possible flight performance, such as speed and maneuverability, and threat risk. In this regard, the sense and avoid element 207 may store object data 344 indicative of various types of objects, such as birds or other aircraft, that might be encountered by the aircraft 10 during flight. For each object type, the object data 344 defines a signature that can be compared to sensor data 343 to determine when a sensed object corresponds to the object type. As an example, the object data 344 may indicate the expected size and shape for an object type, which can be compared to an object's actual size and shape to determine whether the object 15 matches the object type. It is possible to identify not just categories of objects (e.g., bird, drone, airplane, helicopter, etc.) but also specific object types within a category. As an example, it is possible to identify an object as a specific type of airplane (e.g., a Cessna 172). In some embodiments, the sense and avoid element 207 may employ a machine learning algorithm to classify object types. For each object type, the object data 344 also defines information indicative of the object's performance capabilities and threat risk.
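Purely as an illustrative sketch of the signature comparison described above, the following example matches a sensed object's size against stored signatures; the signature fields, example values, and tolerance are assumptions rather than data from the disclosure:

```python
# Hypothetical sketch of matching a sensed object against stored signatures
# (object data 344); fields, values, and the tolerance are placeholders.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ObjectSignature:
    object_type: str          # e.g., "bird", "drone", "helicopter"
    length_m: float           # expected length
    span_m: float             # expected wingspan or rotor span
    eye_safety_concern: bool  # whether this type implies a person or animal

SIGNATURES = [
    ObjectSignature("bird", 0.9, 1.8, True),
    ObjectSignature("drone", 0.5, 0.5, False),
    ObjectSignature("helicopter", 12.0, 10.0, True),
]

def classify(measured_length_m: float, measured_span_m: float,
             tolerance: float = 0.3) -> Optional[ObjectSignature]:
    """Return the signature whose expected size best matches the sensed
    object, or None if no signature matches within the relative tolerance."""
    best, best_err = None, tolerance
    for sig in SIGNATURES:
        err = (abs(measured_length_m - sig.length_m) / sig.length_m
               + abs(measured_span_m - sig.span_m) / sig.span_m) / 2.0
        if err < best_err:
            best, best_err = sig, err
    return best
```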
The sense and avoid logic 350 is configured to process sensor data 343 dynamically as new data becomes available. As an example, when sense and avoid element 207 receives new data from sensors 20, 30, the sense and avoid logic 350 processes the new data and updates any previously made determinations as may be desired. The sense and avoid logic 350 thus may update an object's location, velocity, threat envelope, etc. when it receives new information from sensors 20, 30. Thus, the sensor data 343 is repetitively updated as conditions change.
In an exemplary operation of aircraft monitoring system 5, each of the sensors 20, 30 may sense the object 15 and provide data that is indicative of the object's position and velocity to sense and avoid element 207, as described above. Sense and avoid element 207 (e.g., logic 350) may process the data from each sensor 20, 30 and may note discrepancies between information indicated by data from each sensor (e.g., based on sensor data 343 or otherwise). Sense and avoid logic 350 further may resolve discrepancies present within data from sensors 20, 30 based on various information such as calibration data for each sensor 20, 30 that may be stored as sensor data 343 or otherwise in other embodiments. In this regard, sense and avoid logic 350 may be configured to ensure that information about objects sensed by sensors 20, 30 of the aircraft 10 is accurate for use by the LIDAR control logic 355 in modulating the range and power level of the LIDAR sensor 30.
Note that, in some embodiments, sense and avoid logic 350 may be configured to use information from other aircraft 10 for detecting the presence or location of objects 15. For example, in some embodiments, the aircraft 10 may be one unit of a fleet of aircraft which may be similarly configured for detecting objects within a vicinity of the aircraft. Further, the aircraft may be configured to communicate with one another in order to share information about sensed objects. As an example, the sense and avoid element 207 may be coupled to a transceiver 399, as shown by
The LIDAR control logic 355 can be used to modulate the range of the LIDAR sensor 30 by controlling the power level provided to a laser for the LIDAR sensor 30. The LIDAR control logic 355 can provide signals to the laser for the LIDAR sensor 30 to control the output power level from the laser. In one embodiment, the signals provided by the LIDAR control logic 355 to the laser for the LIDAR sensor 30 can be pulse width modulated signals. However, the LIDAR control logic 355 can provide other types of signals to the laser for the LIDAR sensor 30 in other embodiments. In addition, the LIDAR control logic 355 can continuously receive signals from the LIDAR sensor 30 indicating the current power level for the laser of the LIDAR sensor 30. The LIDAR control logic 355 can use the information regarding the current power level of the laser for the LIDAR sensor 30 when generating the signals to adjust the power level of the laser for the LIDAR sensor 30.
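As a rough sketch of this kind of power control, the example below maps a commanded laser power to a pulse-width-modulated duty cycle and steps the command toward a target using the reported current power; the interface names, full-scale power, and step size are assumptions:

```python
# Hedged sketch of power-level control via pulse width modulation; the
# full-scale power and step size are placeholder values.

MAX_LASER_POWER_W = 5.0  # assumed full-scale output of the laser driver

def duty_cycle_for_power(target_power_w: float) -> float:
    """Return a PWM duty cycle (0.0-1.0) proportional to the target power."""
    clamped = min(max(target_power_w, 0.0), MAX_LASER_POWER_W)
    return clamped / MAX_LASER_POWER_W

def step_toward_target(current_power_w: float, target_power_w: float,
                       step_w: float = 0.5) -> float:
    """Move the commanded power toward the target in small steps, using the
    current power level reported back by the sensor."""
    if abs(target_power_w - current_power_w) <= step_w:
        return target_power_w
    if target_power_w > current_power_w:
        return current_power_w + step_w
    return current_power_w - step_w
```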
When the aircraft 10 is in an area where there may be people or animals susceptible to eye damage from the laser in the LIDAR sensor 30, such as when the aircraft is in a takeoff/landing mode (i.e., performing a takeoff or landing operation), the LIDAR control logic 355 can operate the laser in the LIDAR sensor 30 at an “eye safe” level that corresponds to a power level of the beams or pulses from the laser that is deemed safe for the eyes of a person or animal. In contrast, if the aircraft 10 is at a cruising elevation (i.e., a predefined distance above ground level (AGL) where people or animals are not expected to be located) and in a cruise mode (i.e., performing (or about to perform) a cruising operation for forward flight), the LIDAR control logic 355 can operate the laser in the LIDAR sensor 30 at an “extended range” level, such that the beams or pulses from the laser are able to detect objects at a greater distance from the LIDAR sensor 30 relative to the range available to the LIDAR sensor 30 when operated at the eye safe level. In one embodiment, the detection range of the LIDAR sensor 30 operating at the extended range level can be about 1000 meters. However, in other embodiments, the range of the LIDAR sensor 30 operating at the extended range level can be greater than or less than 1000 meters. The range of the LIDAR sensor 30 operating at the extended range level can be about 5 to 10 (or more) times greater than the range of the LIDAR sensor 30 operated at the eye safe level, which can be about 100-200 meters. The power level for the laser when operated at the extended range level can vary based on many different factors, such as the size and configuration of the aircraft 10 and the velocity of the aircraft 10 during a cruising operation. For example, an aircraft 10 that is operated at a higher velocity during a cruising operation may require a larger range (and corresponding higher power level) from the LIDAR sensor 30 in order to detect objects 15 with sufficient time to avoid collisions relative to an aircraft 10 that is operated at a lower velocity.
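The following sketch selects between the two levels from the flight state; the 1000-meter and 100-200-meter ranges follow the examples above, while the mode names and the cruising-elevation threshold are assumptions:

```python
# Sketch of choosing the LIDAR detection range from the flight state; the
# cruise-altitude threshold and mode names are assumptions for illustration.

EYE_SAFE_RANGE_M = 200.0    # example eye-safe detection range
EXTENDED_RANGE_M = 1000.0   # example extended-range detection range

def select_detection_range(flight_mode: str, altitude_agl_m: float,
                           cruising_elevation_agl_m: float = 300.0) -> float:
    """Return the detection range permitted for the current flight state."""
    if flight_mode in ("takeoff", "landing", "hover"):
        return EYE_SAFE_RANGE_M
    if flight_mode == "cruise" and altitude_agl_m >= cruising_elevation_agl_m:
        return EXTENDED_RANGE_M
    return EYE_SAFE_RANGE_M  # default to the conservative level
```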
During operation of the aircraft 10 in a cruising mode for forward flight, the sense and avoid logic 350 can determine if an object 15 is within the scan range (or sweep) of the LIDAR sensor 30. The scan range of the LIDAR sensor 30 corresponds to the angular displacement of a beam or pulse from the laser of the LIDAR sensor 30 between the beginning of a scan by the LIDAR sensor 30 and the end of a scan by the LIDAR sensor 30. In one embodiment, as shown in
After the sense and avoid logic 350 determines that there is an object 15 in the scan range for the LIDAR sensor 30, the LIDAR control logic 355 can determine whether the power level for the laser of the LIDAR sensor 30 should be adjusted from the extended range level due to eye safety concerns associated with the object 15. The LIDAR control logic 355 can determine whether the object 15 has an associated eye safety concern based on object identification information, distance information (i.e., the distance between the LIDAR sensor 30 and the object 15), and environment information provided to the LIDAR control logic 355 by the sense and avoid logic 350. If the object 15 raises eye safety concerns, such as when the object 15 is an animal (e.g., a goose) or contains one or more people (e.g., a building or helicopter) and is at a distance from the LIDAR sensor 30 where the increased power level of the beam or pulse from the laser for the LIDAR sensor 30 may be unsafe and cause eye damage to a person or animal, the LIDAR control logic 355 reduces the power level of the laser for the LIDAR sensor 30 from the extended range level. For example, the LIDAR control logic 355 can modulate or limit the power of the LIDAR sensor 30 based on the proximity of the aircraft 10 to a known static object, such as a building. The LIDAR control logic 355 can obtain the location of the building from 3D map information provided to (or generated by) the LIDAR control logic 355. The LIDAR control logic 355 can then determine the position of the aircraft 10 in the 3D map and calculate the distance and/or direction of the aircraft 10 relative to the building. The LIDAR control logic 355 can then use the distance and/or direction information to adjust the power to the LIDAR sensor 30.
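A minimal sketch of the building example, assuming the 3D map simply provides object coordinates; the safety radius and power values are placeholders:

```python
# Sketch of limiting laser power near a known static object (e.g., a
# building) located in a 3D map; the radius and power values are placeholders.
import math

def power_near_static_object(aircraft_pos_m, object_pos_m,
                             extended_power_w=5.0, eye_safe_power_w=1.0,
                             safety_radius_m=500.0) -> float:
    """Return a reduced power when the aircraft is within the safety radius
    of an object whose (x, y, z) position is known from the 3D map."""
    separation_m = math.dist(aircraft_pos_m, object_pos_m)
    if separation_m < safety_radius_m:
        return eye_safe_power_w
    return extended_power_w

# Example: about 295 m from a mapped building, the power drops to the
# eye-safe level.
print(power_near_static_object((0.0, 0.0, 150.0), (250.0, 100.0, 30.0)))
```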
The LIDAR control logic 355 can reduce the power level for the laser of the LIDAR sensor 30 to either the eye safe level or an intermediate level between the eye safe level and the extended range level. In one embodiment, the intermediate level is based on a distance of the aircraft 10 from the object 15. In another embodiment, the intermediate level can correspond to a power level that does not raise eye safety concerns at the location of the object. In other words, the power level of the beam or pulse transmitted by the laser is reduced by an amount sufficient that, by the time the beam or pulse reaches the object, it has dissipated enough energy that it no longer raises eye safety concerns for a person or animal. In still other embodiments, the intermediate level can be based on the type of object (e.g., an animal and a human may have different intermediate levels) or on the velocity of the object (e.g., faster moving objects and slower moving objects may have different intermediate levels). If the object 15 does not raise eye safety concerns, such as when the object is part of the terrain (e.g., a mountain) or a drone, the LIDAR control logic 355 can continue to keep the power level for the laser of the LIDAR sensor 30 at the extended range level.
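As a rough illustration of an intermediate level derived from the object's distance — real laser-safety limits require standards-based calculations (e.g., under IEC 60825), and every constant below is a placeholder:

```python
# Placeholder sketch: allow more power for more distant objects on the
# simplifying assumption that permissible power grows with the square of
# range (beam divergence), clamped between the eye-safe and extended levels.

EYE_SAFE_POWER_W = 1.0        # assumed power that is eye safe at close range
EXTENDED_POWER_W = 5.0        # assumed full extended-range power
REFERENCE_DISTANCE_M = 200.0  # assumed distance at which eye-safe power applies

def intermediate_power(object_distance_m: float) -> float:
    """Return an intermediate power level for an object at the given distance."""
    scaled = EYE_SAFE_POWER_W * (object_distance_m / REFERENCE_DISTANCE_M) ** 2
    return max(EYE_SAFE_POWER_W, min(scaled, EXTENDED_POWER_W))
```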
When the LIDAR control logic 355 determines that the power level for the laser of the LIDAR sensor 30 is to be reduced, the LIDAR control logic 355 may reduce the power level for only a portion of the scan range that corresponds to an area or zone in which the object 15 is located. The LIDAR control logic 355 can determine the location or position of the object 15 relative to the LIDAR sensor 30 using information from sensors 20, 30 and the sense and avoid logic 350. Once the position of the object 15 is known, the LIDAR control logic 355 can operate the laser of the LIDAR sensor 30 at a reduced power level, as discussed above, for the portion of the scan range corresponding to the object. In one embodiment, the LIDAR control logic 355 operates the laser at a reduced power in the direction of the object 15 plus an angular offset to provide a desired margin of error. In one embodiment, the angular offset can be about ±10 degrees, but other offsets are possible in other embodiments. The LIDAR control logic 355 can operate the remainder of the scan range for the LIDAR sensor 30 at the extended range level. By reducing the power level of the LIDAR sensor 30 in the area or zone of an object with eye safety concerns, while maintaining the extended range power level for the remainder of the scan range, the LIDAR sensor 30 is able to continue to receive information at an extended range without introducing an eye safety concern to people or animals associated with the object 15. Once the object 15 has moved out of the scan range of the LIDAR sensor 30, the LIDAR control logic 355 can operate the laser for the LIDAR sensor 30 at the extended range level for the entire scan range of the LIDAR sensor unless a new object 15 with eye safety concerns has been detected. In one embodiment, if multiple objects 15 with eye safety concerns have been detected within the scan range of the LIDAR sensor 30, the LIDAR control logic 355 can reduce the power level for each of the objects 15 in the scan range, as described above.
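The per-sector behavior might be sketched as follows; the ±10 degree offset comes from the example above, while the data structure and power values are assumptions:

```python
# Sketch of a per-angle power profile for one scan: reduce power only within
# an angular window around each detected object's bearing, keep the extended
# level elsewhere. Power values and structure are illustrative assumptions.

def angular_difference_deg(a: float, b: float) -> float:
    """Smallest absolute difference between two headings, in degrees."""
    return abs((a - b + 180.0) % 360.0 - 180.0)

def scan_power_profile(scan_angles_deg, object_bearings_deg,
                       extended_power_w=5.0, reduced_power_w=1.0,
                       offset_deg=10.0):
    """Return the laser power to command at each scan angle."""
    profile = []
    for angle in scan_angles_deg:
        near_object = any(angular_difference_deg(angle, bearing) <= offset_deg
                          for bearing in object_bearings_deg)
        profile.append(reduced_power_w if near_object else extended_power_w)
    return profile

# Example: a 90-degree sweep with one object bearing 30 degrees commands
# reduced power from 20 to 40 degrees and the extended level elsewhere.
profile = scan_power_profile(range(0, 91), [30.0])
```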
As the aircraft 10 transitions from cruise mode to takeoff/landing mode, such as when the aircraft 10 has reached the end of a flight path and is preparing to land, the LIDAR control logic 355 can modulate the power level for the laser of the LIDAR sensor 30 from the extended range level back to the eye safe level. In one embodiment, if the aircraft 10 is a VTOL aircraft that has a hover mode (i.e., the aircraft 10 maintains a predefined position and elevation), the LIDAR control logic 355 can provide different power levels for the LIDAR sensor 30 for different types of scans. For example, a vertical scan from the LIDAR sensor may be at the eye safe level, while a horizontal scan from the LIDAR sensor 30 may be at the extended range level depending on the elevation of the aircraft 10 and the environment surrounding the aircraft 10.
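One way to express the hover-mode behavior is sketched below; the elevation threshold and the notion of a single level per scan direction are assumptions for illustration:

```python
# Sketch of per-scan-direction levels in hover mode: the vertical scan stays
# at the eye-safe level while the horizontal scan may use the extended level
# above an assumed elevation threshold.

def hover_scan_levels(elevation_agl_m: float,
                      safe_elevation_agl_m: float = 150.0) -> dict:
    vertical_level = "eye_safe"  # scanning toward the ground below the aircraft
    horizontal_level = ("extended_range"
                        if elevation_agl_m >= safe_elevation_agl_m
                        else "eye_safe")
    return {"vertical": vertical_level, "horizontal": horizontal_level}
```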
The LIDAR control logic 355 is configured to process data dynamically as new data becomes available from the sense and avoid logic 350 when the aircraft 10 is operating in cruise mode. For example, the LIDAR control logic 355 can receive new data from the sense and avoid logic 350 indicating that an object 15 having eye safety concerns has either exited the scan range for the LIDAR sensor 30 or changed position relative to the LIDAR sensor 30. If the object 15 has exited the scan range, the LIDAR control logic 355 can operate the laser for the LIDAR sensor 30 at the extended range level. If the object 15 has moved closer to the LIDAR sensor 30, the LIDAR control logic 355 can lower the power level to the laser for the LIDAR sensor 30 (if not already at the eye safe level), and if the object 15 has moved away from the LIDAR sensor 30, the LIDAR control logic 355 can increase the power level to the laser for the LIDAR sensor 30 to a level that still addresses eye safety concerns.
In one embodiment, if the LIDAR control logic 355 determines that the beam or pulse from the laser for the LIDAR sensor 30 poses an immediate eye safety concern, the LIDAR control logic 355 can send a signal to the shut-off system 37 to prevent or stop the laser for the LIDAR sensor 30 from transmitting a beam or pulse. As an example, if the LIDAR control logic 355 initially detects an object susceptible to eye damage in close proximity to the LIDAR sensor 30 (e.g., less than a threshold distance away), the LIDAR control logic 355 may completely shut off the laser rather than just reduce its power. In one embodiment, the shut-off system 37 can incorporate a shutter device or cover that can be closed to prevent the laser for the LIDAR sensor 30 from transmitting a beam or pulse. In another embodiment, the shut-off system 37 can incorporate a disconnect switch that can remove power from the laser for the LIDAR sensor 30 and prevent any transmission of a beam or pulse from the laser. In still other embodiments, other mechanical or electrical devices can be used to prevent transmission of a pulse or beam by the laser for the LIDAR sensor 30. The LIDAR control logic 355 can then send a subsequent signal to the shut-off system 37 to return to an operational state that permits the laser for the LIDAR sensor 30 to transmit a beam or pulse.
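The decision between shutting off the laser and merely reducing power might look like the sketch below; the close-in threshold distance is an assumption:

```python
# Sketch of choosing between a full shut-off and a power reduction when an
# object susceptible to eye damage is detected; the threshold is assumed.

SHUT_OFF_THRESHOLD_M = 50.0

def laser_command(object_distance_m: float, eye_safety_concern: bool) -> str:
    """Return 'shut_off', 'reduce_power', or 'extended_range'."""
    if not eye_safety_concern:
        return "extended_range"
    if object_distance_m < SHUT_OFF_THRESHOLD_M:
        return "shut_off"     # e.g., close the shutter or open the disconnect switch
    return "reduce_power"     # drop to the eye-safe or an intermediate level
```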
An exemplary use and operation of the system 5 in order to modulate the range and power level of a LIDAR sensor 30 of the aircraft 10 will be described in more detail below with reference to
At step 802, the LIDAR control logic 355 can operate the LIDAR sensor 30 at the eye safe level since the aircraft 10 is either located on the ground or initiating a takeoff operation. A determination is then made as to whether the aircraft 10 has satisfied a predefined flight characteristic (e.g., reached a predetermined phase of flight) associated with the aircraft 10 at step 804. The predefined flight characteristic may correspond to a measurement of altitude, a transition to a particular flight configuration (e.g., a configuration for hover flight or forward flight), or a location of the aircraft. Further, reaching a predetermined phase of flight can be one or more of the aircraft 10 reaching a predefined altitude or entering a new altitude range, the aircraft transitioning to a new flight configuration (e.g., transitioning between a configuration for hover flight and forward flight), or the aircraft reaching a predefined location along a flight plan (e.g., entering or arriving at a less populated area or an urban area). As an example, once the aircraft 10 reaches a certain altitude (e.g., cruise altitude), transitions to a configuration for forward flight, or leaves an urban area to a sparsely populated area, it can be assumed that the risk of eye injury has sufficiently diminished such that the transmit power of the LIDAR sensor may be increased, as will be described below.
Referring to step 804, if the aircraft 10 has not satisfied the flight characteristic, the process returns to step 802 and the LIDAR control logic 355 can continue to operate the LIDAR sensor 30 at the eye safe level. However, if the aircraft 10 has satisfied the flight characteristic, the LIDAR control logic 355 can operate the LIDAR sensor 30 at the extended range level in step 806. As shown in
Next, at step 808, a determination is made as to whether the aircraft 10 is initiating a landing operation. If the aircraft 10 is initiating a landing operation, the LIDAR control logic 355 can operate the LIDAR sensor 30 at the eye safe level at step 810, since there is an expectation that people or animals are within the scan range of the LIDAR sensor 30, and the process can end. If the aircraft 10 is not performing a landing operation at step 808, a determination can be made as to whether the aircraft 10 has detected an object 15 within the scan range of the LIDAR sensor 30 at step 812. The sense and avoid logic 350 can receive signals from sensors 20, 30 to make a determination as to whether there is an object 15 within the scan range of the LIDAR sensor 30. If the sense and avoid logic 350 has not detected an object 15 in the scan range of the LIDAR sensor 30, the process returns to step 806 and the LIDAR control logic 355 can continue to operate the LIDAR sensor 30 at the extended range level. However, if the sense and avoid logic 350 has detected an object 15 in the scan range of the LIDAR sensor 30, the LIDAR control logic 355 can then determine if the object 15 poses an eye safety concern at step 814. As discussed above, if the object 15 is associated with a person or animal and is at a sufficiently close distance to the LIDAR sensor 30, then the object 15 poses an eye safety concern.
If the LIDAR control logic 355 determines that the object 15 does not have an eye safety concern, the process returns to step 806 and the LIDAR control logic 355 can continue to operate the LIDAR sensor 30 at the extended range level. However, if the LIDAR control logic 355 determines that the object 15 does have an eye safety concern, the LIDAR control logic 355 can reduce the power level of the LIDAR sensor 30 near the object 15 at step 816. As discussed above, the portion of the scan range of the LIDAR sensor 30 that is associated with the object 15 having eye safety concerns can be operated at a reduced power level that corresponds to either the eye safe level or an intermediate level that does not pose a risk of eye damage to the person or animal associated with the object 15 at the corresponding distance between the object 15 and LIDAR sensor 30.
After the LIDAR control logic 355 adjusts the power level of the LIDAR sensor 30 near the object 15, the LIDAR control logic 355 determines whether the object has exited the scan range for the LIDAR sensor 30 at step 818. The LIDAR control logic 355 can determine if the object 15 has exited the scan range for the LIDAR sensor 30 by receiving updated information from the sense and avoid logic 350 that indicates the object 15 has exited the scan range. An object 15 can exit the scan range for the LIDAR sensor 30 by travelling in a direction or elevation away from the scan range of the LIDAR sensor or by having the aircraft 10 alter its flight path or elevation as part of a collision avoidance algorithm. If the object 15 has not exited the scan range for the LIDAR sensor 30, the process returns to step 816 and the LIDAR control logic 355 can continue to operate the LIDAR sensor 30 at the reduced power level for the corresponding portion of the scan range as discussed above. However, if the object 15 has exited the scan range for the LIDAR sensor 30, the process returns to step 806 and the LIDAR control logic 355 can operate the LIDAR sensor 30 at the extended range level.
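Condensing steps 802-818 into code, the control flow might be sketched as follows; the aircraft and lidar helpers are placeholders standing in for the sense and avoid logic 350 and LIDAR control logic 355 rather than an actual interface:

```python
# Condensed sketch of the flow in steps 802-818; `aircraft` and `lidar` are
# placeholder objects whose methods stand in for the checks described above.

def lidar_range_control(aircraft, lidar):
    lidar.set_level("eye_safe")                        # step 802
    while not aircraft.flight_characteristic_met():    # step 804
        pass                                           # keep the eye-safe level
    lidar.set_level("extended_range")                  # step 806
    while True:
        if aircraft.initiating_landing():              # step 808
            lidar.set_level("eye_safe")                # step 810
            return
        obj = aircraft.object_in_scan_range()          # step 812
        if obj is None or not obj.eye_safety_concern:  # step 814
            lidar.set_level("extended_range")          # back to step 806
            continue
        while not obj.exited_scan_range():             # step 818
            lidar.reduce_power_near(obj)               # step 816
        lidar.set_level("extended_range")              # back to step 806
```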
In one exemplary embodiment as shown in
The foregoing is merely illustrative of the principles of this disclosure and various modifications may be made by those skilled in the art without departing from the scope of this disclosure. The above described embodiments are presented for purposes of illustration and not of limitation. The present disclosure also can take many forms other than those explicitly described herein. Accordingly, it is emphasized that this disclosure is not limited to the explicitly disclosed methods, systems, and apparatuses, but is intended to include variations to and modifications thereof, which are within the spirit of the following claims.
As a further example, variations of apparatus or process parameters (e.g., dimensions, configurations, components, process step order, etc.) may be made to further optimize the provided structures, devices and methods, as shown and described herein. In any event, the structures and devices, as well as the associated methods, described herein have many applications. Therefore, the disclosed subject matter should not be limited to any single embodiment described herein, but rather should be construed in breadth and scope in accordance with the appended claims.
This application claims priority to International Application PCT/US2017/040461, entitled “SYSTEMS AND METHODS FOR MODULATING THE RANGE OF A LIDAR SENSOR ON AN AIRCRAFT” and filed on Jun. 30, 2017, which is incorporated herein by reference.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/US17/40461 | 6/30/2017 | WO | 00