Some embodiments described herein relate generally to methods and apparatus for unmanned aerial vehicle enabled video recording. In particular, but not by way of limitation, some embodiments described herein relate to methods and apparatus for Unmanned Aerial Vehicles (UAVs) enabled automatic video editing.
An Unmanned Aerial Vehicle (UAV) (also referred to herein as a drone) is an aircraft without a human pilot on board. Its flight is often controlled either autonomously by onboard computers or remotely by a human operator on the ground. Drones have been used, for example, to keep a skier in the frame of a camera while the skier travels along a ski path. Video recorded by drones during such sporting activities, however, often includes segments that are less interesting. For example, when a skier is preparing to ski but has not yet moved, the drone may have already started recording the video. Such video segments are of little interest, and an accumulation of such uninteresting video segments can needlessly consume server and network bandwidth resources.
Accordingly, a need exists for methods and apparatus for Unmanned Aerial Vehicles (UAVs) enabled automatic video editing.
In some embodiments, an apparatus includes a processor and a memory. The memory is connected to the processor and stores instructions executed by the processor to receive a video segment of a moving object recorded by an Unmanned Aerial Vehicle (UAV). The memory also stores instructions executed by the processor to receive a measured moving object parameter and edit the video segment of the moving object based on the measured moving object parameter to form an edited video segment. The memory stores instructions executed by the processor to send the edited video segment.
As used in this specification, the singular forms “a,” “an” and “the” include plural referents unless the context clearly dictates otherwise. Thus, for example, the term “a moving object” is intended to mean a single moving object or a combination of moving objects.
In one embodiment, a mobile device 105 associated with the moving object can communicate with the wearable device via Bluetooth. In addition, the mobile device 105 can be used to control the drone, view and/or share recorded videos. A kiosk 106, which can be disposed locally at the sporting activity site, can receive the video recorded by the drone and upload the video to a server 107. The server 107 can communicate with a video editor 108 and/or video sharing websites 104 for post-editing and sharing.
The wearable device 102 may also include a display device for the user to view analytics associated with the user and/or analytics associated with the drone. The analytics may include location, altitude, temperature, pressure, humidity, date, time, and/or flight route. In some instances, the display device can also be used to view the recorded video. A control inputs unit 124 can be included in the wearable device 102 to allow the user to provide control commands to the wearable device or to the drone. As discussed above with regard to
In some embodiments, the wearable device 102 can be configured to communicate with the drone in order to update it with the user's current position and velocity vector. In some embodiments, the wearable device 102 can be configured to communicate with the backend server to log the status of a user. In some embodiments, the wearable device 102 can be configured to communicate with the user's phone to interact with a smartphone app. In some embodiments, the wearable device 102 can be configured to give a user the ability to control the drone via buttons. In some embodiments, the wearable device 102 can be configured to give a user insight into system status via audio output, graphical display, LEDs, etc. In some embodiments, the wearable device 102 can be configured to measure environmental conditions (temperature, wind speed, humidity, etc.).
In some embodiments, the wearable device is a piece of hardware worn by the user. Its primary purpose is to notify the drone of the user's position, thus enabling the drone to follow the user and to keep the user in the camera frame.
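As an illustrative, non-limiting sketch of how such a status update could be structured, the following Python fragment models one message from the wearable device to the drone carrying the user's position and velocity vector. The field names and the JSON wire format are assumptions for illustration only; the embodiments described herein do not prescribe a particular message layout or transport.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class WearableStatus:
    """One status update sent from the wearable device to the drone."""
    user_id: str
    timestamp: float       # seconds since epoch
    lat: float             # latitude, degrees
    lon: float             # longitude, degrees
    altitude_m: float      # altitude, metres
    velocity_mps: tuple    # (vx, vy, vz) in metres per second

    def to_message(self) -> bytes:
        """Serialize the update for transmission over the radio link."""
        return json.dumps(asdict(self)).encode("utf-8")

    @classmethod
    def from_message(cls, payload: bytes) -> "WearableStatus":
        """Reconstruct an update on the receiving (drone) side."""
        fields = json.loads(payload.decode("utf-8"))
        fields["velocity_mps"] = tuple(fields["velocity_mps"])
        return cls(**fields)
```

A drone-side tracker could decode each such message and use the position and velocity vector to keep the user in the camera frame.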
In some embodiments, the wearable device can be in a helmet or a wristband, embedded in clothing (e.g., jackets, boots, etc.), embedded in sports equipment (e.g., a snowboard, a surfboard, etc.), and/or embedded in accessories (e.g., goggles, glasses, etc.).
In some embodiments, the UAV video editor 500 includes a processor 510, a memory 520, a communications interface 590, a synchronizer 530, a video eliminator 550, and a video enhancer 560. In some embodiments, the UAV video editor 500 can be a single physical device. In other embodiments, the UAV video editor 500 can include multiple physical devices (e.g., operatively coupled by a network), each of which can include one or multiple modules and/or components shown in
Each module or component in the UAV video editor 500 can be operatively coupled to each remaining module and/or component. Each module and/or component in the UAV video editor 500 can be any combination of hardware and/or software (stored and/or executing in hardware) capable of performing one or more specific functions associated with that module and/or component.
The memory 520 can be, for example, a random-access memory (RAM) (e.g., a dynamic RAM, a static RAM), a flash memory, a removable memory, a hard drive, a database and/or so forth. In some embodiments, the memory 520 can include, for example, a database, process, application, virtual machine, and/or some other software modules (stored and/or executing in hardware) or hardware modules configured to execute a UAV automatic video editing process and/or one or more associated methods for UAV automatic video editing. In such embodiments, instructions for executing the UAV automatic video editing process and/or the associated methods can be stored within the memory 520 and can be executed at the processor 510.
The communications interface 590 can include and/or be configured to manage one or multiple ports of the UAV video editor 500. In some embodiments, the communications interface 590 can be configured to, among other functions, send and receive data, information, commands, and/or instructions to and from various devices including, but not limited to, the drone, the wearable device, the mobile device, the kiosk, the server, and/or the World Wide Web.
The processor 510 can be configured to control, for example, the operations of the communications interface 590, write data into and read data from the memory 520, and execute the instructions stored within the memory 520. The processor 510 can also be configured to execute and/or control, for example, the synchronizer 530, the video eliminator 550, and the video enhancer 560, as described in further detail herein. In some embodiments, under the control of the processor 510 and based on the methods or processes stored within the memory 520, the synchronizer 530, the video eliminator 550, and the video enhancer 560 can be configured to execute a UAV automatic video editing process, as described in further detail herein.
The synchronizer 530 can be any hardware and/or software module (stored in a memory such as the memory 520 and/or executing in hardware such as the processor 510) configured to synchronize a video segment (also referred to herein as a video clip, a video track, a video snippet, or video footage) with a measured moving object parameter. The measured moving object parameter can be selected from a velocity vector, a gravitational force value, an audio clip, compass readings, magnetometer readings, barometer readings, altitude readings, an analysis of the recorded video itself, and/or the like. For instance, using various computer vision approaches, the processor 510 can be configured to determine that the moving subject being filmed by the UAV is not in the frame of the video; in such circumstances, that video section can be automatically removed because it is uninteresting. For example, as shown in
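One minimal way such a synchronizer could align a measured moving object parameter with the video timeline is nearest-timestamp matching: for each video frame time, pick the sensor reading recorded closest in time. The sketch below is an illustrative assumption, not the specific method of the synchronizer 530; function names are hypothetical.

```python
from bisect import bisect_left

def nearest_reading(times, values, t):
    """Return the sensor value whose timestamp is closest to t.
    `times` must be sorted in ascending order."""
    i = bisect_left(times, t)
    if i == 0:
        return values[0]
    if i == len(times):
        return values[-1]
    before, after = times[i - 1], times[i]
    return values[i] if (after - t) < (t - before) else values[i - 1]

def synchronize(frame_times, sensor_times, sensor_values):
    """Pair every video frame timestamp with the closest-in-time
    measured moving object parameter (e.g., speed in m/s)."""
    return [(t, nearest_reading(sensor_times, sensor_values, t))
            for t in frame_times]
```

The result is a per-frame stream of (timestamp, parameter) pairs that downstream modules, such as a video eliminator, can consume.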
The video eliminator 550, as shown in
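As a hedged illustration of what a video eliminator could do with synchronized parameter data, the fragment below keeps only the intervals in which the subject's measured speed stays above a threshold for a minimum duration, discarding, for example, footage of a skier standing still before a run. The thresholds and function name are hypothetical; the embodiments are not limited to this heuristic.

```python
def segments_to_keep(samples, min_speed=1.0, min_duration=2.0):
    """Given synchronized (timestamp, speed) samples, return the
    (start, end) intervals in which the subject is actually moving.
    Everything else is treated as uninteresting and eliminated."""
    keep, start = [], None
    for t, speed in samples:
        if speed >= min_speed:
            if start is None:
                start = t          # a moving interval begins
        elif start is not None:
            if t - start >= min_duration:
                keep.append((start, t))
            start = None           # the moving interval ended
    # Close an interval still open at the end of the recording.
    if start is not None and samples and samples[-1][0] - start >= min_duration:
        keep.append((start, samples[-1][0]))
    return keep
```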
The video enhancer 560, as shown in
In some embodiments, the above-mentioned parameters (such as 904 and 906) can be learned by analyzing video metrics after the videos are posted on the various social media outlets. For example, the video metrics 910 can include how many views a YouTube® video has received, how many Facebook® likes the post containing the video has received, how many times a video has been tweeted, and the like. The video metrics can be input to a machine learning process to learn how the parameters (such as 904 and 906) can be adjusted to increase the video metrics. In other words, a feedback loop is created such that how the drone flies and how the video is edited can be influenced by how well the output video performs on social media.
An embodiment of the present invention relates to a computer storage product with a non-transitory computer readable storage medium having computer code thereon for performing various computer-implemented operations. The media and computer code may be those specially designed and constructed for the purposes of the present invention, or they may be of the kind well known and available to those having skill in the computer software arts. Examples of computer-readable media include, but are not limited to: magnetic media, optical media, magneto-optical media and hardware devices that are specially configured to store and execute program code, such as application-specific integrated circuits (“ASICs”), programmable logic devices (“PLDs”) and ROM and RAM devices. Examples of computer code include machine code, such as produced by a compiler, and files containing higher-level code that are executed by a computer using an interpreter. For example, an embodiment of the invention may be implemented using JAVA®, C++, or other object-oriented programming language and development tools. Another embodiment of the invention may be implemented in hardwired circuitry in place of, or in combination with, machine-executable software instructions.
The foregoing description, for purposes of explanation, used specific nomenclature to provide a thorough understanding of the invention. However, it will be apparent to one skilled in the art that specific details are not required in order to practice the invention. Thus, the foregoing descriptions of specific embodiments of the invention are presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the invention to the precise forms disclosed; obviously, many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, thereby enabling others skilled in the art to best utilize the invention and various embodiments with various modifications as are suited to the particular use contemplated. It is intended that the following claims and their equivalents define the scope of the invention.
This application claims priority to U.S. Provisional Patent Application Ser. No. 62/041,009, filed on Aug. 22, 2014, and U.S. Provisional Patent Application Ser. No. 62/064,434, filed on Oct. 15, 2014. The contents of the aforementioned applications are incorporated herein by reference in their entirety. The contents of a related application Ser. No. ______, filed on Aug. 21, 2015, entitled “Methods and Apparatus for Unmanned Aerial Vehicle Autonomous Aviation” (Attorney Docket No. CAPE-001/02US 322555-2003), are incorporated herein by reference in their entirety.
Number | Date | Country
---|---|---
62041009 | Aug. 2014 | US
62064434 | Oct. 2014 | US