This relates generally to image processing and informatics, including but not limited to remote sensing with airborne robotic drones and mounted sensor devices.
The range of applications in which unmanned aerial vehicles, or drones, can be employed for data capture has expanded dramatically. In many situations, however, commercially available drones still have limited sensing and operational capabilities. Also apparently lacking is the ability of a fleet of drones to effectively and efficiently coordinate operations such that captured data can be easily synthesized in a meaningful way.
Accordingly, there is a need for faster, more efficient methods, systems, and interfaces for remote sensing with airborne robotic drones and mounted sensor devices. By utilizing the aerial functionality of drone devices, in combination with the robust sensing capabilities of sensor devices, such as smart phones equipped with cameras, accelerometers, and gyroscopes, images and associated meta data may be captured and coordinated among a fleet of devices, and subsequently processed and analyzed. Such methods and interfaces optionally complement or replace conventional methods for remote sensing with robotic drones.
In accordance with some embodiments, a method is performed at a first self-contained battery-operated device (e.g., a first client device/sensor device, such as a smart phone) fixedly mounted to a first drone device in a plurality of drone devices. The first self-contained battery-operated device includes one or more processors, an accelerometer, a gyroscope, a location detection device, a two-dimensional pixilated detector, and memory for storing one or more programs for execution by the one or more processors, all connected by a common bus. The method includes obtaining (e.g., capturing) one or more time-stamped images of an environmental region using the two-dimensional pixilated detector at a time when the drone device is airborne. For each respective time-stamped image of the captured one or more time-stamped images, a respective set of meta data is obtained. The respective set includes (1) rotational values obtained using the gyroscope, the rotational values indicating a relative orientation of the self-contained battery-operated device with respect to a respective reference orientation, and (2) location information obtained using the location detection device at a time when the corresponding time-stamped image is captured. The one or more time-stamped images and the respective sets of meta data for the one or more time-stamped images are sent to a remote processing device.
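By way of illustration only, one plausible shape for such a per-image record is sketched below in Python; the field names and types are assumptions made for this sketch and are not prescribed by the embodiments described above.

```python
from dataclasses import dataclass

@dataclass
class ImageMetadata:
    """Respective set of meta data obtained for one time-stamped image."""
    timestamp: float     # capture time, e.g., seconds since the epoch
    roll: float          # rotational values from the gyroscope, relative to a
    pitch: float         # reference orientation (degrees)
    yaw: float
    latitude: float      # location information from the location detection device
    longitude: float
    altitude: float

@dataclass
class TimeStampedImage:
    """One captured image paired with its respective set of meta data."""
    pixels: bytes        # raw frame from the two-dimensional pixilated detector
    meta: ImageMetadata
```

In such a sketch, the client device would accumulate a list of TimeStampedImage records during flight and send the list to the remote processing device.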
In accordance with some embodiments, an airborne remote sensing apparatus includes a first drone device and a first self-contained battery-operated device mounted to the first drone device. The self-contained battery-operated device includes one or more processors and memory for storing one or more programs for execution by the one or more processors, the one or more programs including instructions for performing the operations of the client-side method described above.
In accordance with some embodiments, a computer-readable storage medium has stored therein instructions that, when executed by the first self-contained battery-operated device mounted to the first drone device, cause the first self-contained battery-operated device to perform the operations described above.
Thus, drones (e.g., robotic drones) and mounted sensor devices are provided with faster, more efficient methods for remote sensing, thereby increasing the value, effectiveness, efficiency, and user satisfaction with such devices.
For a better understanding of the various described embodiments, reference should be made to the Description of Embodiments below, in conjunction with the following drawings. Like reference numerals refer to corresponding parts throughout the figures and description.
Reference will now be made to embodiments, examples of which are illustrated in the accompanying drawings. In the following description, numerous specific details are set forth in order to provide an understanding of the various described embodiments. However, it will be apparent to one of ordinary skill in the art that the various described embodiments may be practiced without these specific details. In other instances, well-known methods, procedures, components, circuits, and networks have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.
It will also be understood that, although the terms first, second, etc. are, in some instances, used herein to describe various elements, these elements should not be limited by these terms. These terms are used only to distinguish one element from another. For example, a first smart phone could be termed a second smart phone, and, similarly, a second smart phone could be termed a first smart phone, without departing from the scope of the various described embodiments. The first smart phone and the second smart phone are both smart phones, but they are not the same smart phone.
The terminology used in the description of the various embodiments described herein is for the purpose of describing particular embodiments only and is not intended to be limiting. As used in the description of the various described embodiments and the appended claims, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term “and/or” as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms “includes,” “including,” “comprises,” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
As used herein, the term “if” is, optionally, construed to mean “when” or “upon” or “in response to determining” or “in response to detecting” or “in accordance with a determination that,” depending on the context. Similarly, the phrase “if it is determined” or “if [a stated condition or event] is detected” is, optionally, construed to mean “upon determining” or “in response to determining” or “upon detecting [the stated condition or event]” or “in response to detecting [the stated condition or event]” or “in accordance with a determination that [a stated condition or event] is detected,” depending on the context.
As used herein, the term “exemplary” is used in the sense of “serving as an example, instance, or illustration” and not in the sense of “representing the best of its kind.”
The device environment 100 includes a number of client devices (also called “client systems,” “client computers,” “clients,” “sensor devices,” etc.) 104-1, 104-2, . . . 104-n fixedly mounted to respective drone devices 102-1, 102-2, . . . 102-n. The client devices 104 and/or the drone device 102 are communicably connected to each other, to one or more processing devices 108, and/or to one or more control devices 110, by one or more networks 106 (e.g., the Internet, cellular telephone networks, mobile data networks, other wide area networks, local area networks, metropolitan area networks, and so on).
In some embodiments, the one or more networks 106 include a public communication network (e.g., the Internet and/or a cellular data network), a private communications network (e.g., a private LAN or leased lines), or a combination of such communication networks. In some embodiments, the one or more networks 106 use the Hypertext Transfer Protocol (HTTP) and the Transmission Control Protocol/Internet Protocol (TCP/IP) to transmit information between devices or systems. HTTP permits client devices to access various resources available via the one or more networks 106. In some embodiments, the one or more networks 106 are wireless communications channels based on various custom or standard wireless communications protocols (e.g., IEEE 802.15.4, Wi-Fi, ZigBee, 6LoWPAN, Thread, Z-Wave, Bluetooth Smart, ISA100.11a, WirelessHART, MiWi, etc.), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document. Alternatively, in some embodiments, at least a portion of the one or more networks 106 comprises physical interfaces based on wired communications protocols (e.g., Ethernet, USB, etc.). In some implementations, any of the aforementioned devices or systems are communicably connected with each other, or with other devices via any combination of the aforementioned networks 106 (e.g., client devices 104 communicate with one another via Bluetooth, transmit time-stamped images to the processing device 108 via a cellular network, and receive control commands from the control device 110 via Wi-Fi). The various embodiments of the invention, however, are not limited to the use of any particular communication protocol.
In some embodiments, client devices 104 are computing devices or self-contained battery-operated devices, such as smart phones, cameras, video recording devices, smart watches, personal digital assistants, portable media players, tablet computers, 2D devices, 3D (e.g., virtual reality) devices, laptop computers with one or more processors embedded therein or coupled thereto, in-vehicle information systems (e.g., an in-car computer system that provides navigation, entertainment, and/or other information), and/or other appropriate computing devices that can be used to communicate with other client devices, a control device 110, and/or a processing device 108.
In some embodiments, drone devices 102 (also referred to as “unmanned aerial vehicles” (UAV), “remotely piloted aircraft” (RPA), etc.) are robotic aircraft that do not have human pilots aboard. In some embodiments, drone devices 102 execute control commands that include or correspond to instructions and parameters for a flight pattern (e.g., flight line/path, speed, altitude, etc.). In some embodiments, drone devices 102 are autonomous and execute pre-programmed flight patterns. Additionally and/or alternatively, drone devices 102 may be controlled in real-time by a user (e.g., via a remote control device 110) through detected user inputs or generated control commands. Control commands may be received from mounted devices (e.g., a mounted client device 104), or directly from one or more control devices (e.g., control device 110). In some embodiments, drone devices 102 include one or more processors and memory storing instructions (e.g., received control commands, pre-programmed flight patterns, flight instructions, etc.) for execution by the one or more processors. In some embodiments, the drone device 102 includes at least some of the same operational capabilities and features of the client devices 104, which may be used additionally, alternatively, and/or in conjunction with the client devices 104 (e.g., drone devices 102 include additional sensors that may be used in conjunction with sensors of the client devices 104).
Client devices 104 are mounted to respective drone devices 102, the combination of which may be employed to obtain or generate data for transmission to the processing device 108 and/or other client devices, or to receive, display, and/or manipulate data (e.g., data generated, obtained, or produced on the device itself, data received from the processing device 108 or other client devices, etc.). In some embodiments, the client devices 104 and/or the drone device 102 capture multimedia data (e.g., time-stamped images, video, audio, etc.) and acquire associated meta data (e.g., environmental information (time, geographic location), device readings (sensor readings from accelerometers, gyroscopes, barometers), etc.) for specified environmental regions (e.g., any terrain or geographic area that may be aerially surveyed). Data captured by the client devices 104 and/or the drone device 102 are communicated to the processing device 108 for further processing and analysis. The same or other client devices 104 may subsequently receive data from the processing device 108 and/or other client devices for display (e.g., data visualizations).
In some embodiments, a control device 110 is any electronic device (e.g., a smart phone, a laptop device, a workstation in a control center, etc.) that generates and transmits control commands for manipulating a flight pattern of one or more airborne drone devices 102. Control commands include instructions and parameters for a flight pattern (e.g., flight line/path, speed, altitude, etc.), and may include pre-programmed instructions (e.g., transferred to the drone device 102 prior to flight, and later executed) or manual user controls that are transmitted to the client device 104 (and/or the drone device 102) in real-time. In some embodiments, control commands are generated and transmitted by the processing device 108 and/or one or more client devices 104 (used as control devices for synchronizing operational processes of one or more client devices 104 and/or respective drone device 102). Control commands are described in greater detail with respect to the methods 500 and 550 of
In some embodiments, data is sent to and viewed by the client devices in a variety of output formats, and/or for further processing or manipulation (e.g., CAD programs, 3D printing, virtual reality displays, holography applications, etc.). In some embodiments, data is sent for display to the same client device that performs the image capture and acquires sensor readings (e.g., client devices 104), and/or other systems and devices (e.g., processing device 108). In some embodiments, client devices 104 access data and/or services provided by the processing device 108 by execution of various applications. For example, in some embodiments client devices 104 execute web browser applications that can be used to access services provided by the processing device 108. As another example, one or more of the client devices 104-1, 104-2, . . . 104-n execute software applications that are specific to viewing and manipulating data (e.g., visualization “apps” running on smart phones or tablets).
The processing device 108 stores, processes, and/or analyzes data received from one or more client devices 104 and/or drone devices 102 (e.g., multimedia data, respective sets of meta data, etc.). Data resulting from processing and analytical operations are in turn disseminated to the same and/or other client devices for viewing, manipulation, and/or further processing and analysis. In some embodiments, the processing device 108 is a single computing device such as a computer server, while in other embodiments, the processing device 108 is implemented by multiple computing devices working together to perform the actions of a server system (e.g., cloud computing).
The processing device 108 typically includes one or more processing units (processors or cores) 202, one or more network or other communications interfaces 204, memory 206, and one or more communication buses 208 for interconnecting these components. The communication buses 208 optionally include circuitry (sometimes called a chipset) that interconnects and controls communications between system components. The processing device 108 optionally includes a user interface (not shown). The user interface, if provided, may include a display device and optionally includes inputs such as a keyboard, mouse, trackpad, and/or input buttons. Alternatively or in addition, the display device includes a touch-sensitive surface, in which case the display is a touch-sensitive display.
Memory 206 includes high-speed random-access memory, such as DRAM, SRAM, DDR RAM, or other random-access solid-state memory devices; and may include non-volatile memory, such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, and/or other non-volatile solid-state storage devices. Memory 206 optionally includes one or more storage devices remotely located from the processor(s) 202. Memory 206, or alternately the non-volatile memory device(s) within memory 206, includes a non-transitory computer-readable storage medium. In some embodiments, memory 206 or the computer-readable storage medium of memory 206 stores the following programs, modules and data structures, or a subset or superset thereof:
The data store 214 (and any other data storage modules) stores data associated with one or more subjects in one or more types of databases, such as graph, dimensional, flat, hierarchical, network, object-oriented, relational, and/or XML databases, or other data storage constructs.
The client device 104 (e.g., a self-contained battery-operated device, a smart phone, etc.) typically includes one or more processing units (processors or cores) 302, one or more network or other communications interfaces 304, memory 306, and one or more communication buses 308 for interconnecting these components. The communication buses 308 optionally include circuitry (sometimes called a chipset) that interconnects and controls communications between system components. The client device 104 includes a user interface 310. The user interface 310 typically includes a display device 312. In some embodiments, the client device 104 includes inputs such as a keyboard, mouse, and/or other input buttons 316. Alternatively or in addition, in some embodiments, the display device 312 includes a touch-sensitive surface 314, in which case the display device 312 is a touch-sensitive display. In client devices that have a touch-sensitive display 312, a physical keyboard is optional (e.g., a soft keyboard may be displayed when keyboard entry is needed). The user interface 310 also includes an audio output device 318, such as speakers or an audio output connection connected to speakers, earphones, or headphones. Furthermore, some client devices 104 use a microphone and voice recognition to supplement or replace the keyboard. Optionally, the client device 104 includes an audio input device 320 (e.g., a microphone) to capture audio (e.g., speech from a user). Optionally, the client device 104 includes a location detection device 322, such as a GPS (Global Positioning System) or other geo-location receiver, for determining the location of the client device 104.
The client device 104 also optionally includes an image/video capture device 324, such as a camera or webcam. In some embodiments, the image/video capture device 324 includes a two-dimensional pixilated detector/image sensor configured to capture images of environmental regions or other subjects/surfaces, in accordance with one or more predefined capture settings (e.g., resolutions, capture frequencies, etc.). In some embodiments, the client device 104 includes a plurality of image/video capture devices 324 (e.g., a front facing camera and a back facing camera), where in some implementations, each of the multiple image/video capture devices 324 captures a distinct set of images (e.g., capturing images at different resolutions, ranges of light, etc.). Optionally, the client device 104 includes one or more illuminators (e.g., a light emitting diode) configured to illuminate a subject or environment. In some embodiments, the one or more illuminators are configured to emit specific wavelengths of light (e.g., ultraviolet, infrared, polarized, or fluorescent light, for example for nighttime operations when there is less than a threshold level of ambient light), and the image/video capture device 324 includes a two-dimensional pixilated detector/image sensor configured with respect to wavelength(s) of the illuminated light. Additionally and/or alternatively, the image/video capture device 324 includes one or more filters configured with respect to wavelength(s) of the illuminated light (i.e., configured to selectively filter out wavelengths outside the range of the illuminated light).
In some embodiments, the client device 104 includes one or more sensors 326 including, but not limited to, accelerometers, gyroscopes, compasses, magnetometers, light sensors, near field communication transceivers, barometers, humidity sensors, temperature sensors, proximity sensors, lasers, range finders (e.g., laser-based), and/or other sensors/devices for sensing and measuring various environmental conditions. In some embodiments, the one or more sensors operate and obtain measurements at respective predefined frequencies.
Memory 306 includes high-speed random-access memory, such as DRAM, SRAM, DDR RAM or other random-access solid-state memory devices; and may include non-volatile memory, such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid-state storage devices. Memory 306 may optionally include one or more storage devices remotely located from the processor(s) 302. Memory 306, or alternately the non-volatile memory device(s) within memory 306, includes a non-transitory computer-readable storage medium. In some embodiments, memory 306 or the computer-readable storage medium of memory 306 stores the following programs, modules and data structures, or a subset or superset thereof:
In some embodiments, the control device 110 (
Each of the above identified modules and applications correspond to a set of executable instructions for performing one or more functions as described above and/or in the methods described in this application (e.g., the computer-implemented methods and other information processing methods described herein). These modules (i.e., sets of instructions) need not be implemented as separate software programs, procedures or modules, and thus various subsets of these modules are, optionally, combined or otherwise re-arranged in various embodiments. In some embodiments, memory 206 and/or 306 store a subset of the modules and data structures identified above. Furthermore, memory 206 and/or 306 optionally store additional modules and data structures not described above.
Furthermore, in some implementations, the functions of any of the devices and systems described herein (e.g., client devices 104, drone device 102, processing device 108, control device 110, etc.) are interchangeable with one another and may be performed by any other devices or systems, where the corresponding sub-modules of these functions may additionally and/or alternatively be located within and executed by any of the devices and systems. As one example, although the client device 104 (
Specifically, the environment shown in
The client device 104-1 is used to capture one or more still-frame images, video sequences, and/or audio recordings of the environmental region 400-1 (i.e., multimedia data) from one or more aerial positions and angles.
Concurrently with image capture, the client device 104-1 also acquires respective meta data for the time-stamped images as the location and orientation of the client device changes throughout the flight pattern. Meta data includes rotational values indicating a relative orientation of the client device 104-1 with respect to a respective reference orientation (e.g., roll (φ), pitch (θ), yaw (ψ)), location information of the client device (e.g., GPS coordinates, relative position with respect to the environmental region, etc.), and/or other various environmental conditions (e.g., altitude, temperature, etc.).
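One way such relative rotational values could be derived, sketched here with assumed angle conventions (degrees, wrapped to [-180, 180)) and a reference orientation recorded at takeoff:

```python
def relative_orientation(current, reference):
    """Return (roll, pitch, yaw) of the client device relative to a reference
    orientation; both arguments are (roll, pitch, yaw) tuples in degrees."""
    def wrap(angle_deg):
        # wrap a difference of angles into [-180, 180)
        return (angle_deg + 180.0) % 360.0 - 180.0
    return tuple(wrap(c - r) for c, r in zip(current, reference))

# Hypothetical values: reference recorded at takeoff, current read at image capture time.
reference = (0.0, 0.0, 90.0)
current = (2.5, -1.0, 95.0)
print(relative_orientation(current, reference))  # (2.5, -1.0, 5.0)
```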
Once obtained, time-stamped images and respective sets of meta data are sent to a processing device (e.g., processing device 108,
The use of multiple client devices 104 and drone devices 102 is advantageous for obtaining and acquiring comprehensive data, and ultimately for enabling an enhanced analytical approach to processing data. By concurrently using multiple devices for image capture, data may be captured for larger environmental regions with greater efficiency and effectiveness. Moreover, additional client devices 104 may be used to acquire time-stamped images and sensor readings from a variety of viewpoints and angles.
In addition to providing coverage for designated environmental regions, individual drone devices 102 (or subsets of a plurality of drone devices 102) may also be programmed to execute specific and distinct flight patterns (e.g., an orbital trajectory for a first drone device 102-1, and a linear trajectory for a second drone device 102-2).
Multiple client devices 104 may also be used to capture images in accordance with different sets of capture settings. For example, each client device 104 of a fleet can capture images at a different resolution (e.g., a first client device 104-1 for capturing low-resolution images, a second client device 104-2 for capturing high-resolution images), and/or to capture images representing a distinct frequency (or range of frequencies) of light (e.g., a first client device 104-1 configured to detect visible light frequencies, a second client device 104-2 configured to detect IR light frequencies).
The use of multiple client devices 104 and drone devices 102 is described in greater detail below with respect to
The method 500, described with respect to
In some embodiments, the first client device is fixedly mounted to the first drone device such that sensor readings by the first client device are substantially representative of environmental conditions associated with the first drone device. For example, sensor readings obtained by the first client device that indicate an orientation of the first client device, also indicate an orientation of the first drone device to which the first client device is mounted. In other words, in some embodiments, because the first client device and the first drone device are fixedly mounted, their respective orientations are substantially the same. Similarly, as another example, a location of the first client device (derived from sensor readings acquired by the first client device) is substantially the same as a location of the first drone device.
In some embodiments, the first client device is (504) a first smart phone. In some embodiments, the first client device further includes (506) a communications device configured for communications using a cellular communications protocol (e.g., communications module 212 for communicating using a CDMA based protocol,
The first client device obtains (508) one or more first time-stamped images of a first environmental region using the first two-dimensional pixilated detector at a time when the first drone device is airborne. An environmental region is any viewable surface, terrain, or geographic area that may be aerially surveyed, such as crop fields, urban landscapes, etc. In some embodiments, once a flight pattern is executed and the first drone device becomes airborne, the first client device commences image capture and obtains the one or more first time-stamped images of the first environmental region (e.g., environmental region 400-1,
In some embodiments, the obtaining (508) includes storing (510) the one or more first time-stamped images in a first memory location of the memory of the first client device (e.g., image/video capture module 332,
In some embodiments, the one or more first time-stamped images are obtained in accordance with one or more image capture settings (e.g., a frequency of image capture, capture duration, capture start/end time, configuration settings for captured images, such as a resolution, zoom, focal length, etc.). For example, in some embodiments, the first two-dimensional pixilated detector is exposed (514) to a discrete first wavelength range reflected off the surface of the first environmental region, wherein the discrete first wavelength range is a subset of the visible spectrum. In some embodiments, the one or more image capture settings are based on parameters of the flight pattern of the first drone device (e.g., the frequency of image capture is based on a current speed of the first drone device, a type of trajectory, a height above the environmental region, etc.). In some embodiments, one or more image capture settings are adjusted in accordance with detected changes in the flight pattern of the first drone device. For example, a headwind that reduces a speed of the first drone device causes a frequency of image capture to be adjusted such that the one or more first time-stamped images are captured at a lower frequency. Conversely, a tailwind that increases the speed of the first drone device causes the frequency of image capture to be adjusted such that the one or more first time-stamped images are captured at a higher frequency.
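As a rough sketch of this speed-dependent adjustment, the capture frequency could be chosen so that the along-track distance covered between consecutive frames stays constant; the target spacing used below is an assumed parameter, not a value prescribed by the embodiments.

```python
def capture_frequency_hz(ground_speed_m_s: float, target_spacing_m: float = 5.0) -> float:
    """Capture frequency that keeps the ground distance between frames constant.

    A headwind that reduces ground speed yields a lower frequency; a tailwind
    that increases ground speed yields a higher frequency.
    """
    if ground_speed_m_s <= 0:
        return 0.0  # hovering or stationary: suspend timed capture
    return ground_speed_m_s / target_spacing_m

print(capture_frequency_hz(10.0))  # 2.0 images per second at 10 m/s
print(capture_frequency_hz(6.0))   # 1.2 images per second into a headwind
```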
In some embodiments, after obtaining (508) the one or more first time-stamped images of the first environmental region (e.g., after completing a first image capture session), the first client device obtains one or more subsequent time-stamped images of an additional environmental region. In some embodiments, the additional environmental region is at least partially distinct from the first environmental region (e.g., adjacent to, but partially overlapping the first environmental region). In some embodiments, while obtaining the one or more subsequent time-stamped images, the first drone device executes a flight pattern in accordance with the flight pattern executed by the first drone device while obtaining the one or more first time-stamped images. As an example, while obtaining a first set and a second set of time-stamped images, the drone device executes respective flight patterns having parallel flight lines (e.g., linear trajectory) so as to methodically canvas a large geographic area having multiple, parallel environmental regions.
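A minimal sketch of generating such parallel flight lines over a rectangular area, with assumed local x/y coordinates (metres) and line spacing:

```python
def parallel_flight_lines(x_min, x_max, y_min, y_max, line_spacing):
    """Waypoint pairs for parallel, linear-trajectory flight lines that canvas a
    rectangular area, alternating direction on each successive pass."""
    lines, y, left_to_right = [], y_min, True
    while y <= y_max:
        start, end = ((x_min, y), (x_max, y)) if left_to_right else ((x_max, y), (x_min, y))
        lines.append((start, end))
        left_to_right = not left_to_right
        y += line_spacing
    return lines

# Example: a 100 m x 100 m area with 20 m between adjacent flight lines.
for start, end in parallel_flight_lines(0, 100, 0, 100, 20):
    print(start, "->", end)
```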
Referring now to
In some embodiments, the first client device receives (522), from a remote control device over a wireless communications channel, and provides, to the first airborne drone device, one or more control commands for manipulating a flight pattern of the first airborne drone device. Control commands include instructions (e.g., to be executed by the first client device and/or the first drone device) for one or more parameters of a flight pattern, including but not limited to a speed (e.g., changing current speed, enabling variable speed, etc.), an altitude, an orientation (e.g., yaw, pitch, roll), a flight duration, a trajectory (e.g., linear, orbital), a flight line (e.g., a specific path/environmental region over which the drone device spans), and/or any other parameters affecting the flight behavior of the first drone device. In some embodiments, the one or more control commands correspond (524) to commands for executing a flight pattern having an orbital trajectory. In some embodiments, the one or more control commands correspond (526) to commands for executing a flight pattern having a linear trajectory. Additionally, control commands may also include instructions for obtaining time-stamped images in accordance with one or more image capture settings (e.g., instructions for modifying existing image capture settings for the first client device, such as changing a frequency of image capture, a capture resolution, a capture duration, a start/end time, etc.).
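By way of example only, a control command carrying these flight-pattern parameters and optional image capture settings might be encoded as below; all field names are hypothetical and not defined by this disclosure.

```python
from dataclasses import dataclass, field
from typing import Any, Dict, List, Optional, Tuple

@dataclass
class ControlCommand:
    """Control command received by the client device and provided to the drone device."""
    speed_m_s: Optional[float] = None                        # change or enable variable speed
    altitude_m: Optional[float] = None
    yaw_deg: Optional[float] = None                          # orientation parameters
    pitch_deg: Optional[float] = None
    roll_deg: Optional[float] = None
    duration_s: Optional[float] = None                       # flight duration
    trajectory: Optional[str] = None                         # "linear" or "orbital"
    flight_line: Optional[List[Tuple[float, float]]] = None  # waypoints defining the path
    capture_settings: Dict[str, Any] = field(default_factory=dict)

# Example: a linear pass at 12 m/s and 60 m altitude, capturing images at 2 Hz.
cmd = ControlCommand(speed_m_s=12.0, altitude_m=60.0, trajectory="linear",
                     capture_settings={"frequency_hz": 2.0})
```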
In some embodiments, the one or more control commands include a pre-programmed set of instructions (e.g., corresponding to predefined parameters of a flight pattern to be executed, such as a flight line, speed, altitude, etc.). By executing control commands that include a pre-programmed flight pattern, drone devices operate autonomously (without requiring user intervention) from the moment they are powered on or the flight pattern is initiated. In some embodiments, the one or more control commands (e.g., a pre-programmed flight pattern) are received by the first client device (and subsequently provided to the drone device) before initiating a capture session and before executing a flight pattern of the first drone device (i.e., before the drone device is airborne).
Alternatively, in some embodiments, the remote control device is (528) a second client device (e.g., client device 104 of
In some embodiments, the control device is an electronic device or system (e.g., a smart phone, a laptop device, a workstation in a control center, etc.) that includes memory storing one or more modules (e.g., a drone control application) for generating and transmitting control commands to the first client device and/or the first drone device.
In some embodiments, the wireless communications channel is based on a cellular communications protocol (e.g., CDMA, GSM). In other embodiments, the wireless communications channel is based on a wireless communications protocol (e.g., IEEE 802.11 Wi-Fi, Wi-Fi direct, Bluetooth, IEEE 802.15.4, etc.).
Alternatively, in some embodiments, the first drone device receives, from the remote control device (e.g., a laptop, smart phone, etc.), one or more control commands for execution (e.g., via a cellular connection, Wi-Fi direct, Bluetooth, etc.). That is, in some embodiments, the first drone device receives control commands directly, rather than from the first client device. In some embodiments, the first drone device is preloaded with the one or more control commands that include a pre-programmed flight pattern (e.g., the drone device is autonomous upon being powered on, mounted, operationally initiated, etc.).
Referring now to
In some embodiments, the sending includes (532) transmitting the one or more first time-stamped images and the respective sets of meta data over the wireless communications channel (of step 522) (i.e., the same wireless communications channel over which control commands are received by the first client device). In other embodiments, the sending includes transmitting the one or more first time-stamped images and the respective sets of meta data over a wireless communications channel that is distinct from the wireless communications channel over which the one or more control commands were received (e.g., transmitting images and meta data over cellular communications channel, while receiving control commands over Wi-Fi).
In some embodiments, the sending includes (534) transferring the one or more first time-stamped images and the respective sets of meta data to the remote processing device via a wired interface (e.g., transferring over a USB cable interface between the processing device 108 and the client device 104-1 at the conclusion of an image capture and flight session,
In some embodiments, the sending includes (536) transmitting the one or more first time-stamped images and the respective sets of meta data to the remote processing device via a wireless communications interface (e.g., IEEE 802.11 Wi-Fi, Bluetooth, etc.). In some embodiments, the sending includes (538) transmitting to the remote processing device, with the cellular communications protocol (e.g., GSM, CDMA), the one or more first time-stamped images and the respective sets of meta data. In some embodiments, the transmitting is performed (540) concurrently with obtaining the one or more first time-stamped images. In other words, the one or more first time-stamped images and the respective sets of meta data are transmitted during, rather than at the conclusion of, an image capture and flight session (i.e., streaming images and meta data in flight and in real-time as they are obtained). In some embodiments, the sending is performed subsequent to obtaining the one or more first time-stamped images and the respective sets of meta data (e.g., transferring to the remote processing device at the conclusion of the image capture and flight session).
In some embodiments, a first portion of the one or more time-stamped images and the respective sets of meta data is sent while in flight (e.g., via a wireless communications channel), while a second portion (e.g., a remaining portion) is transferred after the mission is complete (e.g., via a removable storage device).
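A sketch of this split delivery is shown below; the stream and bulk_transfer callables and the in-flight byte budget are assumptions standing in for whatever wireless channel and post-flight transfer mechanism a given embodiment uses.

```python
def send_capture_session(records, stream, bulk_transfer, max_stream_bytes=50_000_000):
    """Send a first portion of (image, meta data) records in flight and defer the
    remainder to a post-mission transfer, under an assumed in-flight byte budget."""
    sent_bytes, deferred = 0, []
    for record in records:
        size = len(record.pixels)
        if sent_bytes + size <= max_stream_bytes:
            stream(record)            # e.g., transmit over the wireless channel in flight
            sent_bytes += size
        else:
            deferred.append(record)   # e.g., transfer via removable storage or a wired interface
    for record in deferred:
        bulk_transfer(record)
```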
In some embodiments, the one or more first time-stamped images are formatted (542) to be stitched together using respective sets of meta data, wherein stitching together the one or more first time-stamped images thereby forms a composite image. In some embodiments, stitching includes determining an order in which all or a subset of the time-stamped images are arranged to form the composite image, wherein the order is determined by matching time-stamped images based on the obtained meta data (e.g., matching time stamps, identifying images having adjacent locations, etc.).
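An illustrative ordering step based only on the obtained meta data (time stamps and capture locations from the record sketch above) is given below; an actual stitching pipeline would additionally match image features, which is beyond this sketch.

```python
import math

def order_for_stitching(records, adjacency_threshold_m=30.0):
    """Order time-stamped images by time stamp and flag pairs whose capture
    locations are adjacent, using only the respective sets of meta data."""
    ordered = sorted(records, key=lambda r: r.meta.timestamp)
    adjacent_pairs = []
    for a, b in zip(ordered, ordered[1:]):
        # crude flat-earth distance between consecutive capture locations (illustrative only)
        dx = (b.meta.longitude - a.meta.longitude) * 111_320 * math.cos(math.radians(a.meta.latitude))
        dy = (b.meta.latitude - a.meta.latitude) * 111_320
        if math.hypot(dx, dy) <= adjacency_threshold_m:
            adjacent_pairs.append((a, b))
    return ordered, adjacent_pairs
```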
In some embodiments, aspects of the composite image are enhanced by using the respective set of meta data (e.g., sensor readings). Enhanced aspects of the composite image include straightness of line boundaries, color accuracy, and/or alignment of portions of the composite image. For example, inertial meta data obtained from gyroscopes of the first client device may be used to compensate and straighten distorted image data resulting from unsteady flight lines due to environmental conditions.
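For instance, the per-image roll angle from the gyroscope could be used to counter-rotate each frame before stitching, as in the following sketch (assuming OpenCV is available; the actual correction applied by the processing device may differ).

```python
import cv2
import numpy as np

def compensate_roll(image: np.ndarray, roll_deg: float) -> np.ndarray:
    """Counter-rotate an image by the roll angle recorded in its meta data so that
    unsteady flight lines do not skew line boundaries in the composite image."""
    h, w = image.shape[:2]
    matrix = cv2.getRotationMatrix2D((w / 2, h / 2), -roll_deg, 1.0)
    return cv2.warpAffine(image, matrix, (w, h))
```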
As described with respect to
In some embodiments, the method 550 is performed (552) at a second client device fixedly mounted to a second drone device (e.g., client device 104-2 mounted to drone device 102-2,
In some embodiments, the second client device obtains (554) one or more second time-stamped images of a second environmental region using the second two-dimensional pixilated detector at a time when the second drone device is airborne. In some embodiments, the first client device obtaining the one or more first time-stamped images (step 508,
In some embodiments, the second environmental region is distinct from the first environment region that corresponds to the one or more first time-stamped images obtained by the first client device (step 508,
In some embodiments, the one or more second time-stamped images are obtained in accordance with one or more image capture settings (e.g., a frequency of image capture, configuration settings for captured images, such as a resolution, zoom, focal length, etc.) that are the same, or at least partially distinct from, the one or more image capture settings used for obtaining the first time-stamped images (step 508, described in
In some embodiments, for each of the captured one or more second time-stamped images, the second client device obtains (558) a respective set of meta data. The respective set of meta data includes: (1) rotational values (e.g., with respect to a predefined axis,
In some embodiments, the second client device receives (560), from a remote control device over a wireless communications channel, and provides, to the first airborne drone device and the second airborne drone device, one or more control commands for manipulating a respective flight pattern of the first airborne drone device and the second airborne drone device. In some embodiments, the remote control device (e.g., a laptop, a smart phone with a drone control application, etc.) is the same remote control device from which the first client device receives control commands (step 522,
In some embodiments, the one or more control commands include a first respective set of control commands for the first drone device, and a second respective set of control commands for the second drone device. In some embodiments, the respective sets of control commands for the first and second drone device are distinct with respect to at least some parameters of their respective flight patterns (e.g., respective control commands received by the first and the second client device having different trajectory types and speeds). In some embodiments, respective control commands sent to and received by the first and the second client devices are selected such that image capture by the first and second client devices is synchronized (e.g., images are captured and obtained at aligned locations so they can be stitched).
In some embodiments, the one or more control commands are based on an identified characteristic of the one or more first and/or second time-stamped images, or of the first and/or second environmental regions. Characteristics may correspond to features of interest observable in the images or environmental regions (e.g., specific portions of an image, objects observed in an image or environmental region, such as animals, insects, sections of crops, etc.). Additionally and/or alternatively, characteristics may correspond to, relate to, or indicate technical aspects or a status of image data or respective meta data, such as a quality (e.g., clarity, sharpness, focus, color accuracy, etc.), completeness (e.g., missing, incomplete, or deficient image data/meta data of an environmental region, feature of interest, etc.), or resolution (e.g., resolution below a predefined threshold) of the image data or respective meta data. Characteristics may be identified manually (e.g., portions of images selected by a user upon reviewing the captured images) or through image processing (e.g., performed by the remote processing device). In some embodiments, characteristics are identified (and thus control commands are generated) in real-time (i.e., based on real-time image processing performed during image capture) or alternatively, after the first and/or second time-stamped images and respective sets of meta data have been received and processed by a remote processing device (e.g., performed by processing device 108, the identified characteristics being sent to the client/drone devices thereafter).
In some embodiments, the one or more control commands are commands for remedying (e.g., adjusting, correcting, compensating) an identified characteristic (e.g., incomplete image data) of the captured images or environmental regions. The commands instruct the client devices and/or drone devices to obtain, for example, more image data (e.g., commands for capturing additional time-stamped images at specific coordinates), image data at higher resolutions, and/or image data at closer/farther distances. The control commands may include modified parameters of a flight pattern (e.g., instructing a drone device to fly closer to a region of interest) and/or image capture settings (e.g., increasing an image capture resolution) for remedying the identified characteristics. In some embodiments, an identified characteristic corresponds to a feature of interest observable in the images or environmental regions that is mobile (i.e., location information changes over time) (e.g., bird, animal, insect, etc.), and the one or more control commands include modified parameters of a flight pattern (e.g., instructing a drone device to track and fly closer to the feature of interest) and/or image capture settings (e.g., increasing an image capture resolution, capturing additional images, etc.) for tracking, and capturing images and/or respective meta data, for the feature of interest.
In some embodiments, the one or more control commands are sent to one or more client devices of the plurality of client devices (or alternatively, one or more drone devices of the plurality of drone devices) based on a respective ability to remedy the identified characteristic, wherein the respective ability to remedy is based on technical capabilities of the client/drone devices (e.g., client device with most available storage, highest resolution image sensor, etc.) or meta data (e.g., commands sent to client devices in closest proximity to region of interest that has insufficient image data).
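One simple way such routing could be decided is sketched below, picking the device closest to the region of interest and breaking ties by image-sensor resolution; the device descriptions and field names are hypothetical.

```python
import math

def choose_device_to_remedy(devices, region_lat, region_lon):
    """Select the client/drone device best able to remedy an identified characteristic:
    closest to the region of interest, then highest sensor resolution."""
    def distance(d):
        return math.hypot(d["latitude"] - region_lat, d["longitude"] - region_lon)
    return min(devices, key=lambda d: (distance(d), -d["resolution_mp"]))

devices = [
    {"id": "104-1", "latitude": 40.010, "longitude": -105.270, "resolution_mp": 12},
    {"id": "104-2", "latitude": 40.020, "longitude": -105.250, "resolution_mp": 20},
]
print(choose_device_to_remedy(devices, 40.020, -105.260)["id"])  # device in closest proximity
```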
In some embodiments, the second client device receives, from the first client device (and/or the first drone device), one or more control commands for manipulating a flight pattern of the second airborne drone device. In some embodiments, the first client device receives control commands (e.g., from a remote control device) and re-broadcasts the control commands to the second client device, for execution by the second drone device. In some embodiments, the first client device identifies one or more characteristics of the one or more first time-stamped images or of the first environmental region (e.g., identifying in real-time portions of the captured images having insufficient image data) and, based on the one or more identified characteristics, generates and sends to the second client device (and/or second drone device) one or more control commands for remedying the one or more identified characteristics (e.g., commands modifying a flight pattern to navigate to a region of interest for which there is insufficient image data, and commands including image capture settings for obtaining high-resolution images of the region of interest).
In some embodiments, the second client device sends (562), to the remote processing device, the one or more second time-stamped images and the respective sets of meta data for the one or more second time-stamped images. In some embodiments, the one or more first time-stamped images and the one or more second time-stamped images are formatted (564) to be stitched together using the respective sets of meta data, wherein stitching together the one or more first time-stamped images and the one or more second time-stamped images thereby forms a composite image (e.g., using location information obtained by the first and second client devices to align the first and second time-stamped images for stitching).
Any operations performed by the second client device (e.g., steps of the method 550,
For situations in which the systems discussed above collect information about users, the users may be provided with an opportunity to opt in/out of programs or features that may collect personal information (e.g., information about a user's preferences or a user's contributions to social content providers). In addition, in some embodiments, certain data may be anonymized in one or more ways before it is stored or used, so that personally identifiable information is removed. For example, a user's identity may be anonymized so that the personally identifiable information cannot be determined for or associated with the user, and so that user preferences or user interactions are generalized (for example, generalized based on user demographics) rather than associated with a particular user.
Although some of various drawings illustrate a number of logical stages in a particular order, stages which are not order dependent may be reordered and other stages may be combined or broken out. While some reordering or other groupings are specifically mentioned, others will be apparent to those of ordinary skill in the art, so the ordering and groupings presented herein are not an exhaustive list of alternatives. Moreover, it should be recognized that the stages could be implemented in hardware, firmware, software or any combination thereof.
The foregoing description, for purpose of explanation, has been described with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the scope of the claims to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The embodiments were chosen in order to best explain the principles underlying the claims and their practical applications, to thereby enable others skilled in the art to best use the embodiments with various modifications as are suited to the particular uses contemplated.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/US2015/057420 | 10/26/2015 | WO | 00

Number | Date | Country
---|---|---
62068738 | Oct 2014 | US
62203310 | Aug 2015 | US
62203312 | Aug 2015 | US
62206754 | Aug 2015 | US
62209787 | Aug 2015 | US