Aspects of the present disclosure generally relate to unmanned aerial vehicle (UAV) operations, and more particularly to techniques and apparatuses for UAV illumination systems.
An unmanned aerial vehicle (UAV), commonly known as a drone, is an aircraft without a human pilot on board. UAVs may operate with various degrees of autonomy, either autonomously using on-board computers or under remote control by a human operator. More generally, a UAV is a powered aerial vehicle that does not carry a human operator, uses aerodynamic forces to provide lift, can fly autonomously or by remote piloting, may be expendable or recoverable, and can carry a payload. UAVs are used in military, commercial, scientific, and agricultural applications, including policing, surveillance, product delivery, aerial photography, infrastructure inspection, and drone racing.
UAVs are a component of an unmanned aircraft system, which includes a ground-based controller and a communication system linking the controller and UAV. The unmanned aircraft system may include ground control stations, data links, and support equipment. The UAV may be a quadcopter that has a body containing a power supply and a microcontroller unit (MCU). The system hardware for a UAV includes a flight controller, sensors, and actuators. System software is known as the flight stack or autopilot and is designed to provide a real-time rapid response to changing sensor data.
Sensors provide information about the state of the aircraft and include position and movement sensors. The multiple sensors may also connect to camera systems and digital video recording systems. As UAVs have developed and become more widely used, the need for nighttime operations has also grown. A UAV uses multiple vision systems to facilitate navigation and information collection. Two types of camera systems may be used: red, green, blue (RGB) cameras and stereo depth cameras. The RGB cameras primarily provide video streaming and image capture, and the stereo depth cameras primarily provide state estimation and navigation.
In order to function in both daytime and nighttime, the vision systems utilize a series of illuminators placed around the UAV body. The RGB cameras require one type of illumination and the stereo depth cameras require another, so two separate illumination systems are needed. In some situations, the needs of the vision systems adversely affect each other. If the infrared (IR) pattern projector illuminators are active while the RGB cameras are exposing, the RGB image will carry the IR projector illuminator pattern. In addition, users of night vision goggles may be affected when the pattern and floodlight illuminators are triggered at a frequency lower than a sampling rate of the human eye, producing a strobe light effect. The night vision goggles may also see the IR projector illuminator pattern projected onto the scene. There is a need to coordinate each illumination system with dynamic camera operations to enhance image quality and vision system performance, and to mitigate adverse effects on nearby users with night vision goggles.
The disclosure provides an apparatus for controlling camera systems and illumination systems. The apparatus includes a depth camera system and a vision camera system. A microcontroller unit is in communication with the depth camera system and the vision camera system. The microcontroller unit provides multiple options to control the camera and illumination systems, including variable exposure times and brightness, among other parameters.
In addition, the disclosure provides a method of controlling camera systems and illumination systems. The method starts with triggering a first camera system to start exposing. The method also provides for triggering a first illumination system to illuminate an exposure of the first camera system. The method then provides for stopping the first camera system and the first illumination system at a first later time. A wait period of a selected time interval follows. After the wait period, the method continues with triggering a second camera system to start exposing and triggering a second illumination system to illuminate an exposure of the second camera system. The method concludes with stopping the second camera system and the second illumination system at a second later time.
Furthermore, the disclosure also provides a method of identification by a transmitter. The method provides for modulating an infrared light source with a unique pattern to identify an unmanned aerial vehicle (UAV). The method continues with sending the modulated light pattern to a receiver.
The foregoing has outlined rather broadly the features and technical advantages of examples according to the disclosure in order that the detailed description that follows may be better understood. Additional features and advantages will be described. The conception and specific examples disclosed may be readily utilized as a basis for modifying or designing other structures for carrying out the same purposes of the present disclosure. Such equivalent constructions do not depart from the scope of the appended claims. Characteristics of the concepts disclosed, both their organization and method of operation, together with associated advantages will be better understood from the following description when considered in connection with the accompanying figures. Each of the figures is provided for the purposes of illustration and description, and not as a definition of the limits of the claims.
So that the above-recited features of the present disclosure can be understood in detail, a more particular description, briefly summarized above, may be had by reference to aspects, some of which are illustrated in the appended drawings. It is to be noted, however, that the appended drawings illustrate only certain typical aspects of this disclosure and are therefore not to be considered limiting of its scope, for the description may admit to other equally effective aspects. The same reference numbers in different drawings may identify the same or similar elements.
Various aspects of the disclosure are described more fully below with reference to the accompanying drawings. This disclosure may, however, be embodied in many different forms and should not be construed as limited to any specific structure or function presented throughout this disclosure. Rather, these aspects are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art. Based on the teachings, one skilled in the art should appreciate that the scope of the disclosure is intended to cover any aspect of the disclosure disclosed, whether implemented independently of or combined with any other aspect of the disclosure. For example, an apparatus may be implemented or a method may be practiced using any number of the aspects set forth. In addition, the scope of the disclosure is intended to cover such an apparatus or method, which is practiced using other structure, functionality, or structure and functionality in addition to or other than the various aspects of the disclosure set forth. It should be understood that any aspect of the disclosure disclosed may be embodied by one or more elements of a claim.
Several aspects of unmanned aerial vehicle (UAV) systems will now be presented with reference to various apparatuses and techniques. These apparatuses and techniques will be described in the following detailed description and illustrated in the accompanying drawings by various blocks, modules, components, circuits, steps, processes, algorithms, and/or the like (collectively referred to as “elements”). These elements may be implemented using hardware, software, or combinations thereof. Whether such elements are implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system.
Unmanned aerial vehicles (UAVs) may use multiple vision systems to facilitate navigation and information collection. These vision systems can be placed into two general categories: red, green, blue (RGB) cameras for video streaming and image capture, and stereo depth cameras for state estimation and navigation. UAVs operate in both day and night. For nighttime operation, UAV vision systems have a series of illuminators placed around the body of the UAV. The RGB cameras have a dedicated illumination system and the stereo depth cameras have a separate, dedicated illumination system.
Each camera system has its own specifications for an illumination system. The specifications for one illumination system can negatively affect the other illumination system. These negative effects may include degraded image quality, degraded depth measurement, and impaired user vision. User vision may be particularly affected if night vision goggles are worn. Aspects of this disclosure address the negative effects by linking control of the illumination system with control of dynamic camera operation in order to enhance image quality and vision system performance.
An unmanned aerial vehicle (UAV) vision system may incorporate multiple camera systems each dedicated to a particular function. Each of the vision systems may use multiple cameras.
The disclosure provides an apparatus for controlling camera systems and illumination systems. The apparatus includes a depth camera system and a vision camera system. A microcontroller unit is in communication with the depth camera system and the vision camera system. The microcontroller may control separate triggers for each camera and illumination system.
In addition, the disclosure provides a method of controlling camera systems and illumination systems. The method begins with triggering a first camera system to start exposing. A first illumination system is also triggered to illuminate an exposure of the first camera system. At a first later time, the first camera system and the first illumination system are stopped. After stopping the first camera system and the first illumination system, the method waits a selected time interval. The method then continues with triggering a second camera system to start exposing. A second illumination system is also triggered to illuminate an exposure of the second camera system. At a second later time, the second camera system and the second illumination system are stopped.
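For illustration only, the sequence above may be sketched in a few lines of Python. The trigger() and stop() helpers below are hypothetical placeholders for whatever sync-line or GPIO interface a flight controller would actually expose, and the durations are assumed values, not figures from this disclosure.

```python
import time

def trigger(line: str) -> None:
    # Hypothetical stand-in for asserting a camera or illuminator sync line.
    print(f"{time.monotonic():.4f}s: {line} ON")

def stop(line: str) -> None:
    # Hypothetical stand-in for de-asserting a sync line.
    print(f"{time.monotonic():.4f}s: {line} OFF")

def run_frame(depth_exposure_s: float, gap_s: float, rgb_exposure_s: float) -> None:
    # First camera system and its illumination system start together.
    trigger("depth cameras")
    trigger("IR pattern projectors")
    time.sleep(depth_exposure_s)      # first exposure window
    stop("depth cameras")
    stop("IR pattern projectors")     # stopped at a first later time

    time.sleep(gap_s)                 # wait a selected time interval

    # Second camera system and its illumination system start together.
    trigger("RGB cameras")
    trigger("floodlight illuminators")
    time.sleep(rgb_exposure_s)        # second exposure window
    stop("RGB cameras")
    stop("floodlight illuminators")   # stopped at a second later time

if __name__ == "__main__":
    run_frame(depth_exposure_s=0.005, gap_s=0.002, rgb_exposure_s=0.010)
```

Because the two exposure windows never overlap, each camera system sees only its own illumination.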
Furthermore, the disclosure also provides a method of identification by a transmitter. The method begins with modulating an infrared light source with a unique pattern to identify an unmanned aerial vehicle (UAV). The modulated light pattern is then sent to a receiver.
The UAV commences operation at time t0. In operation, at time t1, a depth synchronization pulse triggers the depth cameras 106 to commence operation. All five depth cameras 106 begin operating at time t2. The depth forward camera 204 and the depth down camera 212 of the depth cameras 106 operate until time t4, while the depth left camera 206, the depth right camera 208, and the depth up camera 210 of the depth cameras 106 operate until time t3. This allows cameras that need more exposure time (due to a darker scene or different camera settings) to take longer to complete their exposures, without being limited by the need to finish at the same time as a camera with a shorter exposure.
By design, the exposure times for each vision system do not overlap. This allows each vision system to use a unique illumination system without interference from the other vision system.
Infrared (IR) illumination is used for the depth cameras 106 and also for the RGB cameras 108. For each camera, there are corresponding illuminators pointed in the same direction as the camera. For example, the forward facing RGB camera 108 (not shown) has forward facing floodlight illuminators 404, shown by the markers in areas 1 and 8 of the accompanying drawing.
Depth camera 1 614 receives input from a vision processing unit (VPU) 1 622, while depth cameras 2-5 616 receive input from VPUs 2-5 624. VPU 1 622 and VPUs 2-5 624 are also in communication with one another and receive illumination operation instructions through a master synchronization (e.g., MASTER SYNC) command sent from the flight controller 610. Depth projector 5 618 is in communication with a driver 626. The driver 626 receives commands for projector brightness and projector timing from the flight controller 610. The projector brightness command is a pulse width modulated (PWM) signal. Depth projectors 1-4 620 are each in communication with a depth AND gate 628, which is in communication with the illumination enable unit 608 and receives the projector timing command sent from the flight controller 610.
The RGB cameras 1-4 630 receive an RGB illumination command (e.g., RGB ILLUM) from the RGB timer in the flight controller 610. RGB floodlights 1-8 632 are in communication with an RGB AND gate 634, which communicates with the illumination enable unit 608.
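The gating performed by the two AND gates reduces to simple combinational logic: an illuminator output is asserted only when its timing command and the global illumination enable are both asserted. A minimal sketch, with hypothetical signal names mirroring the elements above:

```python
def projector_output(projector_timing: bool, illumination_enable: bool) -> bool:
    # Depth AND gate 628: projectors fire only when the flight controller's
    # timing command and the global illumination enable are both asserted.
    return projector_timing and illumination_enable

def floodlight_output(rgb_illum: bool, illumination_enable: bool) -> bool:
    # RGB AND gate 634: the same gating applies to the floodlights.
    return rgb_illum and illumination_enable

# The enable signal acts as a master cutoff for both illumination systems.
assert projector_output(True, True)
assert not projector_output(True, False)
assert not floodlight_output(False, True)
```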
The UAV commences operation at time t0. At time t1, a trigger pulse is transmitted on the depth sync input to the depth cameras 204, 206, 208, 210, and 212 (e.g., depth cameras 106). Also at time t1, the IR pattern projector illuminators 402 turn on in response to the trigger pulse. The IR pattern projector illuminators 402 remain on through time t3. At time t2, the depth cameras 106, 204, 206, 208, 210, and 212 turn on and remain on in conjunction with the IR pattern projector illuminators 402. At time t3, both the IR pattern projector illuminators 402 and the depth cameras 106, 204, 206, 208, 210, and 212 turn off.
At time t4, a trigger pulse is transmitted on the RGB sync input to the RGB cameras 214, 216, and 218 (e.g., RGB cameras 108) and also to the floodlight illuminators 404. The RGB cameras 214, 216, and 218 (e.g., RGB cameras 108) start their exposure at time t4 and turn off shortly before time t5. The floodlight illuminators 404 turn off at time t5. After time t6, the sequence repeats at a selected ‘n’ frames per second (FPS).
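One way to picture the repeating sequence is as a fixed set of event offsets inside each frame period, with the period determined by the selected frame rate. The Python sketch below uses assumed offsets; only the ordering of events (projectors lead the depth cameras, and the floodlights follow after a gap) is taken from the description above.

```python
def frame_schedule(fps: float) -> dict[str, float]:
    """Event offsets in milliseconds within one frame period.

    The specific offsets are illustrative assumptions, not values from the
    disclosure; only the ordering of events follows the described sequence.
    """
    period_ms = 1000.0 / fps
    return {
        "t1: depth sync pulse, IR projectors on": 0.0,
        "t2: depth cameras on": 0.5,
        "t3: IR projectors and depth cameras off": 5.0,
        "t4: RGB sync pulse, floodlights on": 7.0,
        "t5: floodlights off": 17.0,
        "t6: frame period ends, sequence repeats": period_ms,
    }

print(frame_schedule(fps=30))  # ~33.3 ms period at n = 30 FPS
```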
A further problem may be solved using the synchronization systems discussed above. With night vision goggles, the IR pattern projector illuminators 402 and the floodlight illuminators 404 may cause a strobing effect. The strobing effect is similar to a strobe light and is caused when the IR pattern projector illuminators 402 and the floodlight illuminators 404 are triggered at a frequency lower than the sampling rate of the human eye, for example, at 30 Hz. This strobe effect is distracting and potentially nauseating for a human operator.
An alternative approach also eliminates the strobe light effect and does not use dummy pulses. This approach leaves the floodlight illuminators 404 on for the majority of the time period and turns them off only when the IR pattern projector illuminators 402 are turned on. In this approach, the floodlight illuminators 404 may run at 60 Hz, to give one example, while the IR pattern projector illuminators 402 operate at 30 Hz.
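A sketch of this complementary gating, with assumed on-times: the floodlights are treated as on by default and are blanked only during each projector window.

```python
def floodlight_on(t_s: float, projector_hz: float = 30.0,
                  projector_on_s: float = 0.005) -> bool:
    """Floodlights stay on except during the projector's on-window.

    The projector fires at projector_hz for projector_on_s each period;
    both values are illustrative assumptions, not disclosed figures.
    """
    phase = t_s % (1.0 / projector_hz)
    return phase >= projector_on_s

# At these example values the floodlights are dark only ~15% of the time
# (5 ms of every 33.3 ms period), so the eye perceives steady illumination.
```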
Another approach adjusts the magnitude of the IR pattern projector illuminators 402 to reduce the brightness of the dot pattern while the brightness of the floodlight illuminators 404 is increased. This approach also eliminates the strobe light effect of the dots for the human operator because the dots comprise a smaller portion of the total light the human operator sees over a period of time. It should be noted that operation of the floodlight illuminators 404 while the depth cameras 106, 204, 206, 208, 210, and 212 are operating reduces the accuracy and the effective range of the depth camera vision system.
While the approaches described above halt the strobing effect for an operator wearing night vision goggles, the constant dot pattern projected onto all surfaces viewed through the illumination systems remains. The dot pattern projected onto the landscape makes distinguishing objects difficult, if not impossible; objects may appear simply as blobs. Much of this problem arises because night vision goggles amplify available light, and many do not incorporate technologies that rely on frame rates or sampling techniques. If, however, the night vision goggles incorporate an image capture technology that uses a frame rate similar to the frame rate of the RGB cameras 214, 216, and 218 (e.g., RGB cameras 108), then the synchronization techniques described above can remove the dot pattern. The effect may also be mitigated from the perspective of the wearer by increasing the time the floodlight illuminators 404 are turned on, or by adjusting the relative brightness so that the IR pattern projector illuminators 402 are less bright than the floodlight illuminators 404.
In military UAV systems, especially swarm systems, identification friend or foe (IFF) systems are critical to distinguish between friendly and hostile assets. UAVs may be grouped into swarms, which are coordinated groups of UAVs operating together. IFF is challenging for swarm UAV systems due to size, weight, and power constraints. UAVs may incorporate light emitting diodes (LEDs) and vertical-cavity surface-emitting laser (VCSEL) diodes. VCSEL diodes emit light perpendicularly from their top surface, which can be useful for communication within a swarm of UAVs. IFF between swarming UAVs relies on a high speed camera, with a speed significantly greater than the illumination frame rates (e.g., 120 frames per second). A receiving camera recognizes the unique frame rate pattern emitted by the VCSEL and identifies the target as a swarm member. In addition, simple image recognition provides distance and altitude information once the pattern is acquired. Minor variations in the illumination pattern may be used to transmit additional information such as a unique vehicle identifier or “squawk” for each UAV. The minor variations in the illumination pattern may also convey vehicle state information, such as incoming vehicle, departing vehicle, battery level, or damage state. The vehicle state information may be used by a receiving vehicle to estimate what route the emitting UAV will take, or to inform an operator about the state of the emitting UAV.
A modulation scheme may also be varied by adding an extra pulse during a time period when no cameras are exposing. A pulse may also be lengthened to extend into the time between camera exposures, also known as “dead time.” Further variations include intentionally dropping an illumination time when the operational environment permits, such as when moving slowly, and varying the intensity. Intensity modulation that is insignificant from the illumination perspective can still be received and decoded for the transmission of information. The amount of data that may be sent depends largely on the frame rate of the receiving camera. Both LEDs and VCSELs are capable of modulation rates into the MHz range, while even a fast camera often operates below 100 Hz, so the modulation scheme is limited by the receiving frame rate rather than the transmission rate.
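A minimal sketch of the frame-rate identification idea: the emitter keys its IR source on and off with a repeating bit pattern, one bit per illumination frame, and the receiver matches the observed on/off samples against cyclic rotations of the expected pattern, since frame alignment between emitter and receiver is unknown. The pattern and its length are hypothetical choices for illustration.

```python
# Hypothetical unique ID pattern ("squawk") for one UAV, one bit per frame.
SQUAWK = [1, 0, 1, 1, 0, 0, 1, 0]

def emitter_state(frame_index: int) -> int:
    # Emitter: key the IR source on/off with the repeating pattern.
    return SQUAWK[frame_index % len(SQUAWK)]

def decode(samples: list[int], pattern: list[int]) -> bool:
    # Receiver: accept any cyclic rotation of the expected pattern,
    # because the receiving camera's frame alignment is unknown.
    n = len(pattern)
    return any(samples[:n] == pattern[k:] + pattern[:k] for k in range(n))

# Receiver samples starting mid-pattern (offset 3) and still matches.
observed = [emitter_state(i) for i in range(3, 3 + len(SQUAWK))]
assert decode(observed, SQUAWK)  # recognized as a swarm member
```

Because the receiving camera is the bottleneck, the usable bit rate is roughly the camera frame rate, regardless of how fast the LED or VCSEL itself can be modulated.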
Additional aspects of the illumination management system provide knowledge of the camera systems' exposure times. The sensors on the UAV may also receive a signal from each camera system when exposing stops. A further alternative fires the illumination systems 402 and 404 very brightly and briefly so that precise timing of the turn on/off is not necessary.
In addition, the system can automatically adjust each type of illumination system relative to the other, minimizing the user's view of the projector pattern. As one example, if a brighter dot pattern is needed to increase the visible range of the depth camera system (e.g., cameras 204, 206, 208, 210, and 212), then the brightness of the floodlight illumination system 400 also increases.
Illuminators can be selected with a light spectrum output that does not fall within the sensitivity band of night vision goggles. This may mitigate the pattern projection for a user of night vision goggles. The problem may also be mitigated by using light filters that pass only a specific band of light. Night vision goggle users may also share a common synchronization signal with the illumination systems on the UAV to further mitigate the pattern projection seen when wearing the night vision goggles. Such a synchronization signal may also be shared with multiple UAVs flying in the same space, such as a room. The synchronization signal may be carried over a separate channel (e.g., a radio signal) or may come directly from the illuminators. For example, the night vision goggles may receive an IR signal from the illuminators and synchronize with it directly.
Each camera system, both the depth cameras (e.g., 204, 206, 208, 210, and 212) and the RGB cameras (e.g., 214, 216, and 218), may have its own illuminators. With dedicated illuminators for each camera, brightness adjustments for each camera, depending on camera type, are possible. These individually controlled illuminators also allow the system to dim or turn off illumination when the UAV points directly at the user, which is particularly helpful when recovering the UAV. When the UAV is flying away from the user, the illumination system can delay illumination to avoid revealing the user's location. This is of particular concern for military operations.
Users can also manually adjust the illumination system through the separately controllable illuminators. If a user does not want the UAV seen, all illumination systems can be turned off.
Camera exposure time is also controllable. While each camera in a camera system receives an exposure trigger simultaneously, individual cameras may have a unique brightness or exposure time duration based on the view of each camera. In mobile vision systems, it is best to minimize camera exposure time, which mitigates motion blur caused by system movement in the captured image. However, as the UAV, or mobile vision system, moves through varying light conditions, it may be helpful to increase exposure times in low light areas. The camera systems' exposure timers are coupled with the illumination systems to provide this control. The illumination system can be turned on first and its brightness adjusted, avoiding the need for longer camera exposure times. The brightness of the illumination system can also be controlled by selectively turning on illuminators as desired, not only by brightness adjustments. The brightness adjustments occur quickly so that an individual camera's exposure control loop remains stable.
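The coupling between exposure control and illumination might be sketched as follows: when the scene is too dark, the controller first raises illumination brightness, and lengthens exposure only once illumination is at its limit, keeping motion blur bounded. The gains, limits, and scalar scene-brightness measure are all illustrative assumptions.

```python
def adjust(scene_brightness: float, exposure_ms: float, illum_level: float,
           target: float = 0.5, max_exposure_ms: float = 5.0):
    """One step of a sketched exposure/illumination control loop.

    scene_brightness and illum_level are normalized to [0, 1]; all
    constants are assumed values, not figures from the disclosure.
    """
    error = target - scene_brightness
    if error > 0 and illum_level < 1.0:
        # Scene too dark and headroom remains: brighten the illuminators
        # first, so exposure (and therefore motion blur) stays short.
        illum_level = min(1.0, illum_level + 0.5 * error)
    else:
        # Illumination maxed out (or scene too bright): adjust exposure,
        # clamped so motion blur remains bounded.
        exposure_ms = min(max_exposure_ms, max(0.1, exposure_ms * (1 + error)))
    return exposure_ms, illum_level

exposure, illum = adjust(scene_brightness=0.2, exposure_ms=1.0, illum_level=0.4)
```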
The illumination control systems minimize ripple in the drive current through the illuminators to provide constant illumination over the exposure period. The drive current can be regulated with pulse-width modulation (PWM), a DC reference, or another desired current regulation scheme.
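As one concrete example of the PWM option, a target drive current can be mapped to a duty cycle on the driver's PWM input. The full-scale current below is an assumed figure, and a real driver would filter the PWM output to keep ripple low over the exposure window.

```python
def pwm_duty_for_current(target_ma: float, full_scale_ma: float = 1000.0) -> float:
    # Map a target illuminator drive current to a PWM duty cycle in [0, 1].
    # full_scale_ma is an illustrative assumption for the driver's maximum.
    return max(0.0, min(1.0, target_ma / full_scale_ma))

assert pwm_duty_for_current(250.0) == 0.25   # 250 mA -> 25% duty cycle
```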
The foregoing disclosure provides illustration and description, but is not intended to be exhaustive or to limit the aspects to the precise form disclosed. Modifications and variations may be made in light of the above disclosure or may be acquired from practice of the aspects.
As used, the term “component” is intended to be broadly construed as hardware, firmware, and/or a combination of hardware and software. As used, a processor is implemented in hardware, firmware, and/or a combination of hardware and software.
Some aspects are described in connection with thresholds. As used, satisfying a threshold may, depending on the context, refer to a value being greater than the threshold, greater than or equal to the threshold, less than the threshold, less than or equal to the threshold, equal to the threshold, not equal to the threshold, and/or the like.
It will be apparent that the systems and/or methods described in this disclosure may be implemented in different forms of hardware, firmware, and/or a combination of hardware and software. The actual specialized control hardware or software code used to implement these systems and/or methods is not limiting of the aspects. Thus, the operation and behavior of the systems and/or methods were described without reference to specific software code, it being understood that software and hardware can be designed to implement the systems and/or methods based, at least in part, on the description.
Even though particular combinations of features are recited in the claims and/or disclosed in the specification, these combinations are not intended to limit the disclosure of various aspects. In fact, many of these features may be combined in ways not specifically recited in the claims and/or disclosed in the specification. Although each dependent claim listed below may directly depend on only one claim, the disclosure of various aspects includes each dependent claim in combination with every other claim in the claim set. A phrase referring to “at least one of” a list of items refers to any combination of those items, including single members. As an example, “at least one of: a, b, or c” is intended to cover a, b, c, a-b, a-c, b-c, and a-b-c, as well as any combination with multiples of the same element (e.g., a-a, a-a-a, a-a-b, a-a-c, a-b-b, a-c-c, b-b, b-b-b, b-b-c, c-c, and c-c-c or any other ordering of a, b, and c).
No element, act, or instruction used should be construed as critical or essential unless explicitly described as such. Also, as used, the articles “a” and “an” are intended to include one or more items, and may be used interchangeably with “one or more.” Furthermore, as used, the terms “set” and “group” are intended to include one or more items (e.g., related items, unrelated items, a combination of related and unrelated items, and/or the like), and may be used interchangeably with “one or more.” Where only one item is intended, the phrase “only one” or similar language is used. Also, as used, the terms “has,” “have,” “having,” and/or the like are intended to be open-ended terms. Further, the phrase “based on” is intended to mean “based, at least in part, on” unless explicitly stated otherwise.
The present application claims the benefit of U.S. provisional patent application No. 63/071,322, filed Aug. 27, 2020, in the names of WESTER et al., the disclosure of which is expressly incorporated by reference in its entirety.