A portion of the disclosure of this patent document contains material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever.
The disclosed embodiments relate generally to video broadcasting and more particularly, but not exclusively, to systems and methods for supporting video broadcasting from one or more mobile platforms.
Traditional aerial imaging systems lack the capacity to broadcast captured pictures in a real-time manner. The pictures captured by such aerial imaging systems are usually presented in a time-delayed manner via a storage device of some sort. This delay can diminish the entertainment value and/or slow the news-propagation speed of the captured pictures.
In view of the foregoing, there is a need for a system and method for broadcasting, via the Internet, pictures captured with an aerial imaging system in a real-time manner.
In accordance with a first aspect disclosed herein, there is set forth a system for video broadcasting, comprising:
one or more mobile nodes, each mobile node operating to capture one or more pictures; and
a terminal node that operates to upload the captured pictures from the mobile nodes to a video server.
In an exemplary embodiment of the disclosed systems, the mobile nodes are associated with a plurality of mobile platforms.
In another exemplary embodiment of the disclosed systems, each of the mobile nodes is associated with a respective mobile platform.
In another exemplary embodiment of the disclosed systems, the terminal node receives the captured pictures from the mobile nodes.
In another exemplary embodiment of the disclosed systems, the video server is accessible via one or more client receivers.
In another exemplary embodiment of the disclosed systems, at least one of the mobile nodes is an aerial node.
In another exemplary embodiment of the disclosed systems, the mobile nodes exchange control signals via a peer-to-peer protocol.
In another exemplary embodiment of the disclosed systems, at least one of the mobile nodes is configured to collect a first audio signal.
Exemplary embodiments of the disclosed systems further comprise a control node that operates to coordinate the mobile nodes and/or the terminal node.
In another exemplary embodiment of the disclosed systems, the control node is associated with at least one of the mobile nodes and the terminal node.
In another exemplary embodiment of the disclosed systems, at least one of the terminal node and the client receivers is enabled to control the mobile nodes.
In another exemplary embodiment of the disclosed systems, the terminal node is associated with a ground node or an aerial node.
In another exemplary embodiment of the disclosed systems, the mobile nodes are configured to transmit the captured pictures to the terminal node as a first bitstream.
In another exemplary embodiment of the disclosed systems, the terminal node is configured to receive the first bitstream from the mobile nodes via a datalink.
In another exemplary embodiment of the disclosed systems, the terminal node operates to upload the captured pictures to the video server as a second bitstream.
In another exemplary embodiment of the disclosed systems, the video server operates to receive the second bitstream for broadcasting the captured pictures.
In another exemplary embodiment of the disclosed systems, each of the mobile nodes comprises at least one imaging device that operates to capture the pictures.
In another exemplary embodiment of the disclosed systems, each of the mobile nodes is configured to encode the captured pictures to generate the first bitstream.
In another exemplary embodiment of the disclosed systems, the captured pictures are encoded in accordance with a private protocol.
In another exemplary embodiment of the disclosed systems, the captured pictures are encoded before or while being transmitted to the terminal node.
Exemplary embodiments of the disclosed systems further comprise a datalink configured to transmit the first bitstream from a selected mobile node to the terminal node.
In another exemplary embodiment of the disclosed systems, the mobile node is an unmanned aerial vehicle (“UAV”).
In another exemplary embodiment of the disclosed systems, the terminal node is a mobile device.
In another exemplary embodiment of the disclosed systems, the mobile device is at least one of a laptop, a desktop, a tablet and a mobile phone.
In another exemplary embodiment of the disclosed systems, the terminal node comprises an audio device that operates to capture a second audio signal.
In another exemplary embodiment of the disclosed systems, the audio device is a microphone.
In another exemplary embodiment of the disclosed systems, the terminal node further comprises an audio mixer that operates to merge the second audio signal with the captured pictures.
In another exemplary embodiment of the disclosed systems, the terminal node is configured to pack the captured pictures in accordance with a public protocol to generate the second bitstream for transmission to the video server.
In another exemplary embodiment of the disclosed systems, the terminal node transmits the second bitstream to the video server via the Internet.
In another exemplary embodiment of the disclosed systems, the public protocol includes at least one of a Real Time Messaging Protocol (“RTMP”) protocol and a Real Time Streaming Protocol (“RTSP”) protocol.
In another exemplary embodiment of the disclosed systems, the video server is provided by a web service provider.
In another exemplary embodiment of the disclosed systems, the mobile nodes capture the pictures from a plurality of view-angles and/or elevations.
In another exemplary embodiment of the disclosed systems, the client receivers have access to each of the video servers for displaying the captured pictures.
In another exemplary embodiment of the disclosed systems, the client receivers access the video server via the Internet.
In accordance with another aspect disclosed herein, there is set forth a method for video broadcasting, comprising:
receiving, by a terminal node, one or more pictures captured by one or more mobile nodes; and
uploading the captured pictures from the terminal node to a video server accessible from a plurality of client receivers.
Exemplary embodiments of the disclosed methods further comprise capturing the pictures with the mobile nodes.
In another exemplary embodiment of the disclosed methods, capturing the pictures comprises capturing the pictures with the mobile nodes associated with respective mobile platforms.
In another exemplary embodiment of the disclosed methods, capturing pictures with one or more mobile nodes comprises capturing pictures with one or more aerial nodes.
Exemplary embodiments of the disclosed methods further comprise communicating control signals among the mobile nodes in accordance with a peer-to-peer protocol.
In another exemplary embodiment of the disclosed methods, capturing the pictures comprises collecting a first audio signal with at least one mobile node.
Exemplary embodiments of the disclosed methods further comprise coordinating the mobile nodes and/or the terminal node with a control node.
In another exemplary embodiment of the disclosed methods, the control node is associated with at least one of the mobile nodes and the terminal node.
Exemplary embodiments of the disclosed methods further comprise enabling at least one of the terminal node and the client receivers to control the mobile nodes.
In another exemplary embodiment of the disclosed methods, uploading comprises uploading the captured pictures by the terminal node as a second bitstream.
Exemplary embodiments of the disclosed methods further comprise positioning the mobile nodes on one or more respective aerial platforms.
In another exemplary embodiment of the disclosed methods, uploading the second bitstream of the captured pictures comprises uploading the second bitstream to the Internet.
Exemplary embodiments of the disclosed methods further comprise encoding the pictures by the mobile node to generate the first bitstream.
In another exemplary embodiment of the disclosed methods, encoding the pictures comprises encoding the pictures in accordance with a private protocol.
Exemplary embodiments of the disclosed methods further comprise transmitting the first bitstream to the terminal node.
In another exemplary embodiment of the disclosed methods, transmitting the first bitstream comprises transmitting the first bitstream through a datalink.
In another exemplary embodiment of the disclosed methods, the mobile node is an Unmanned Aerial Vehicle (“UAV”).
In another exemplary embodiment of the disclosed methods, transmitting the first bitstream to the terminal node comprises transmitting the first bitstream to a mobile device.
In another exemplary embodiment of the disclosed methods, transmitting the first bitstream to a mobile device comprises transmitting the first bitstream to at least one of a computer and a mobile phone.
Exemplary embodiments of the disclosed methods further comprise capturing audio data via an audio device from the terminal node.
In another exemplary embodiment of the disclosed methods, capturing the audio data via an audio device comprises capturing the audio data via a microphone.
Exemplary embodiments of the disclosed methods further comprise merging the audio data with the pictures.
Exemplary embodiments of the disclosed methods further comprise converting the second bitstream to a public protocol.
Exemplary embodiments of the disclosed methods further comprise transmitting the second bitstream to a video server via the Internet.
Exemplary embodiments of the disclosed methods further comprise transmitting the second bitstream by the terminal node to the video server via the Internet.
In another exemplary embodiment of the disclosed methods, converting the second bitstream to a public protocol comprises converting the second bitstream to at least one of a Real Time Messaging Protocol (“RTMP”) protocol and a Real Time Streaming Protocol (“RTSP”) protocol.
In another exemplary embodiment of the disclosed methods, capturing the pictures comprises capturing the pictures from a plurality of view-angles and/or elevations.
Exemplary embodiments of the disclosed methods further comprise displaying the pictures.
In another exemplary embodiment of the disclosed methods, displaying the pictures comprises making the pictures accessible to the client receivers.
In accordance with another aspect disclosed herein, there is set forth a system for broadcasting videos captured from one or more aerial platforms, the system being configured to perform the broadcasting process in accordance with any one of the previous embodiments of the disclosed methods.
In accordance with another aspect disclosed herein, there is set forth a computer program product comprising instructions for broadcasting videos captured from one or more aerial platforms, the instructions being configured to perform the broadcasting process in accordance with any one of the previous embodiments of the disclosed methods.
It should be noted that the figures are not drawn to scale and that elements of similar structures or functions are generally represented by like reference numerals for illustrative purposes throughout the figures. It also should be noted that the figures are only intended to facilitate the description of the embodiments. The figures do not illustrate every aspect of the described embodiments and do not limit the scope of the present disclosure.
In an aerial imaging system, pictures captured by an imaging device from a mobile platform, such as an Unmanned Aerial Vehicle (“UAV”), are stored in a storage device installed on the mobile platform for display at a later time.
In other aerial imaging systems, the captured pictures are transferred, via a datalink connection, to a ground device that saves the pictures in a storage device on the ground. The ground device can present the captured pictures at any time after receiving the pictures. The ground device, however, does not broadcast the pictures in real-time to client display devices.
In some other aerial imaging systems, Internet-based video servers can make the captured pictures available to viewers. The captured pictures are uploaded to the video servers in a time-delayed manner and thus are available for viewing only at a later time. Accordingly, currently-available aerial imaging systems are unable to broadcast the captured pictures in a real-time manner.
Since currently-available aerial imaging systems lack means for broadcasting pictures captured from an aerial vehicle, a system and method that can transmit the pictures captured from the aerial vehicle to a video server and enable client receivers connected to the Internet to view the motion pictures in a real-time manner can prove desirable. This result can be achieved, according to one embodiment illustrated in
The mobile node 110 can capture pictures, including, but not limited to, still pictures, motion pictures and videos. The mobile node 110 can transfer (or transmit) the pictures to the terminal node 510 via the wired and/or wireless first connection 308. The transfer can allow the captured pictures to be presented at the terminal node 510 as the pictures are being captured. With the mobile node 110 and the transfer from the mobile node 110 to the terminal node 510, the terminal node 510 can acquire the captured pictures in a real-time manner.
The video broadcasting system 100 is shown and described with one mobile node 110 for purposes of illustration only and not for purposes of limitation. In the embodiments of the system 100, a plurality of mobile nodes 110 can be employed in a coordinated manner to capture the pictures.
The terminal node 510 can receive the captured pictures via the first connection 308 from the mobile node 110. At the terminal node 510, the captured pictures can be processed for certain purposes. Such purposes can include, but are not limited to, merging captured pictures, merging other data with the captured pictures and/or improving quality of the captured pictures. For example, audio data can be mixed with the captured pictures. Additional detail of the terminal node 510 will be shown and described below with reference to
After being processed at the terminal node 510, the pictures can be transferred (or transmitted) to a video server 810 for purposes of distribution. The terminal node 510 can transfer the captured pictures in accordance with a public protocol that is acceptable to the video server 810. Additional detail regarding the transmission will be shown and described below with reference to
The video server 810 can receive the captured pictures from the terminal node 510 via the second connection 806. The video server 810 can notify or alert viewers with regard to availability of the captured pictures and make the pictures available to client receivers 910 (shown in
Since the captured pictures can be transferred from the terminal node 510 to the video server 810 as they are received, the client receivers 910 can present the captured pictures as the pictures are received by the video server 810 in a real-time manner. Thereby, the system 100 can advantageously present the pictures, captured by the mobile node 110, with the client receivers 910 in a real-time manner.
Although shown and described as using the video server 810 for purposes of illustration only, other suitable web services that are accessible through the Internet can be used to broadcast the pictures captured by the mobile node 110.
The terminal node 510 can upload the pictures, at 180, to the video server 810. The pictures can be uploaded, at 180, in any conventional manner, such as via the Internet 808 (shown in
The video server 810 can make the uploaded pictures accessible from the client receivers 910 (shown in 10). Thereby, the pictures captured from the one or more mobile nodes 110 can be transferred to the video server 810 and be presented to the client receivers 910 in a real-time manner. Detail regarding accessing the pictures will be discussed below with reference to
In some embodiments, the mobile node 110 can have an audio input device (not shown) for capturing audio data. For purposes of illustration and not for purposes of limitation, the audio input device can be a microphone associated with the imaging device 210 or the first processor 218. The audio input device can be used to capture on-site audio data while the imaging device 210 is capturing pictures.
Although shown and described as having one imaging device 210 for purposes of illustration only, the mobile platform 118 can include any preselected number of imaging devices 210 for capturing the pictures.
Without limitation, the first processor 218 can include one or more general purpose microprocessors, for example, single or multi-core processors, application-specific integrated circuits, application-specific instruction-set processors, graphics processing units, physics processing units, digital signal processing units, coprocessors, network processing units, audio processing units, encryption processing units, and the like. The first processor 218 can be configured to perform any of the functions described herein, including but not limited to, a variety of operations relating to image processing. In some embodiments, the first processor 218 can include specialized hardware for processing specific operations relating to obstacle detection and avoidance—for example, processing time-of-flight data, processing ultrasound data, determining an obstacle distance based on collected data, and controlling the mobile platform 118 based on the determined distance.
At 162, the captured pictures can be streamed (and/or segmented) with a first protocol. The first protocol can be a proprietary protocol agreed upon by the mobile node 110 and a terminal node 510. The first protocol can be the only communication protocol running on both the mobile node 110 and the terminal node 510. Alternatively, if the mobile node 110 and/or the terminal node 510 run a plurality of protocols, a negotiation between the mobile node 110 and the terminal node 510 can be conducted to select a proper protocol for streaming the captured pictures into a first bitstream 111.
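The proprietary first protocol itself is not specified in the disclosure. As a purely hypothetical sketch, the streaming step at 162 could frame each encoded picture with a small length-prefixed header; the sync byte, header layout and field widths below are illustrative assumptions, not the actual protocol:

```python
import struct

MAGIC = 0xD1     # illustrative sync byte; the real proprietary protocol is not disclosed
HEADER = ">BII"  # sync byte, frame index, payload length (big-endian)

def pack_frame(picture: bytes, index: int) -> bytes:
    """Frame one encoded picture for inclusion in the first bitstream 111."""
    return struct.pack(HEADER, MAGIC, index, len(picture)) + picture

def unpack_frame(data: bytes):
    """Recover (index, picture, remaining bytes) from the front of a buffer."""
    magic, index, length = struct.unpack_from(HEADER, data, 0)
    if magic != MAGIC:
        raise ValueError("not a first-protocol frame")
    start = struct.calcsize(HEADER)
    return index, data[start:start + length], data[start + length:]

# Stream two captured pictures into one first bitstream.
bitstream = pack_frame(b"picture-0", 0) + pack_frame(b"picture-1", 1)
index, picture, rest = unpack_frame(bitstream)
print(index, picture)  # 0 b'picture-0'
```

A receiving terminal node would call `unpack_frame` repeatedly on the buffer until the remainder is empty.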
At 164, the captured pictures can be transferred to the terminal node 510 in the form of the first bitstream 111. The transfer can be via a wired and/or wireless connection with any suitable transmission protocol. Additional detail regarding the packing and transferring will be discussed below with reference to
Optionally, a selected mobile node 110 can communicate with each of the other mobile nodes 110. The mobile nodes 110, for example, can communicate with each other for purposes of coordination. By being enabled to communicate, the mobile nodes 110 can cooperate to achieve a common goal, such as capturing pictures of a common scene 125 (shown in
In some embodiments, at least one of the mobile nodes 110 can be configured, as a control node, to issue commands to other mobile nodes 110. The control node can be enabled to control at least one of the other mobile nodes 110 via the commands. Such control can include, but is not limited to, synchronization of the mobile nodes 110 and/or coordination of each of the mobile nodes 110 to capture a complete view of the object of interest 120. The coordination of the mobile nodes 110 can be conducted in the same manner shown and described with reference to
In some other embodiments, at least one of the mobile nodes 110 can have the audio input device described above with reference to
Although shown and described as having three aerial nodes 110A, 110B and 110C for purposes of illustration only, the system 100 can employ any suitable type and/or number of mobile nodes 110 for capturing pictures from different perspectives of the scene 125. In some embodiments, at least one of the mobile nodes 110 can be an aerial node for capturing the scene 125 from an elevation.
The displays 612 can be associated with the second processor 518 and can be attached to or placed in proximity of the terminal node 510. The pictures, captured by the one or more aerial nodes 110, can be displayed on the respective displays 612 for facilitating processing of the pictures. The processing can include, but is not limited to, improving a quality of the pictures and/or mixing other data with the pictures. The other data can include, but is not limited to, video data, audio data and/or caption data. The other data can be either captured with any nodes described herein or with any other devices for capturing video data, audio data and/or textual data. The audio data can include, but is not limited to, comments and/or instructions to the pictures. In an exemplary embodiment, the pictures captured by the one or more mobile nodes 110 (not shown) can be merged to generate a combined video clip.
Without limitation, the second processor 518 can comprise any commercially-available graphic processor. The second processor 518, for example, can be a custom-designed graphic chip specially produced for the terminal node 510. Additionally and/or alternatively, the second processor 518 can include one or more general purpose microprocessors (for example, single or multi-core processors), application-specific integrated circuits, application-specific instruction-set processors, graphics processing units, physics processing units, digital signal processing units, coprocessors, network processing units, audio processing units, encryption processing units, and the like. The second processor 518 can be configured to perform any of the functions described herein, including but not limited to, a variety of operations relating to image processing. In some embodiments, the second processor 518 can include specialized hardware for processing specific operations relating to image processing.
The microphone 610 can be operably associated with the mixer 710. The microphone 610 can be any commercially-available microphone, including any type of device that can be used to capture audio signals. The microphone 610 can convert audio signals into electric data that is transmitted to the mixer 710. With the microphone 610, a user, e.g. a commentator, can record his/her voice while watching the captured pictures on the display 612 as the first bitstream 111 is being unpacked and displayed. Since the captured pictures can be displayed while the first bitstream 111 is being unpacked, the user can give comments and/or instructions regarding the captured pictures in a real-time manner. Although shown and described as using the microphone 610 for purposes of illustration only, any other suitable audio input device 610 can be used for capturing the audio signals.
The mixer 710 can take the audio data captured by the microphone 610 and merge the audio data with the pictures unpacked by the second processor 518. In some embodiments, the mixer 710 can merge the pictures captured by different mobile nodes 110, e.g. the three mobile nodes 110A, 110B, 110C (shown in
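The mixer's merge step can be viewed as a timestamp-ordered interleave of the audio and video streams. The sketch below assumes each picture and audio chunk carries a capture timestamp in milliseconds; the tuple layout and field names are illustrative only, not part of the disclosed system:

```python
from heapq import merge

def mix_streams(video_frames, audio_chunks):
    """Merge two timestamp-sorted streams into one presentation-ordered
    sequence, keeping the audio synchronized with the pictures.
    Each item is a (timestamp_ms, kind, payload) tuple."""
    return list(merge(video_frames, audio_chunks, key=lambda item: item[0]))

video = [(0, "video", "frame-0"), (40, "video", "frame-1")]
audio = [(0, "audio", "chunk-0"), (20, "audio", "chunk-1")]
mixed = mix_streams(video, audio)
print([t for t, _, _ in mixed])  # [0, 0, 20, 40]
```

Because `heapq.merge` consumes its inputs lazily, the same approach works on live streams rather than complete lists.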
Although shown and described as being contained in the terminal node 510 for purposes of illustration only, the microphone 610 and/or the mixer 710 can be external to the terminal node 510 and be associated with the terminal node 510 for capturing and merging the audio data with the pictures.
The first bitstream 111 can be packed in a proprietary protocol as shown and described with reference to
At 560, audio data can be acquired from an audio device, such as a microphone 610. The audio data can include, but is not limited to, commentary and/or a dubbing voice. The audio data can be mixed with the unpacked pictures, at 570. The terminal node 510 can mix the audio data with the pictures via a mixer 710. In an embodiment, the audio data can be recorded and merged while repacking the pictures, at 580. The repacking of the pictures can be conducted in accordance with a second protocol. The second protocol can comprise any suitable conventional protocol that can be the same as, or different from, the first protocol. In one embodiment, the second protocol can be a protocol accepted by a video server 810, e.g. YouTube® or YouKu®.
The terminal node 510 can transfer the second bitstream 222 via the Internet 808 to the video server 810, at 590. As an exemplary embodiment, a plurality of video servers 810 can receive the second bitstream 222 at the same time. For purposes of illustration, and not limitation, the pictures can be repacked into a plurality of second bitstreams 222, each being streamed and/or segmented in accordance with a separate protocol acceptable to a respective video server 810.
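As one concrete but hypothetical way to publish to several servers at once, the mixed output could be repacked into an RTMP-compatible FLV container with a stock tool such as ffmpeg, one process per server. The source file name and RTMP URLs below are placeholders:

```python
def publish_commands(source: str, rtmp_urls):
    """Build one ffmpeg command line per video server, each repacking the
    mixed output into an FLV container for RTMP without re-encoding."""
    return [
        ["ffmpeg", "-re", "-i", source, "-c", "copy", "-f", "flv", url]
        for url in rtmp_urls
    ]

# Each command could then be launched with subprocess.Popen.
commands = publish_commands(
    "mixed.mp4",
    ["rtmp://server-a.example/live/key1", "rtmp://server-b.example/live/key2"],
)
print(len(commands))  # 2
```

`-re` paces the input at its native frame rate, which approximates live behavior when streaming from a file; a true live source would not need it.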
As shown and described with reference to
Although shown and described as using one control node 618 from the terminal node 510 for purposes of illustration, any number of control nodes 618, in any locations, can be employed for coordinating the one or more mobile nodes 110 from any suitable locations.
The coordination of the mobile nodes 110 can include controlling at least one of the mobile platforms 118 and the imaging device 210 for each of the mobile nodes 110 (collectively shown in
The user can control the one or more imaging devices 210 via one centralized control node 618 and/or via a plurality of distributed control nodes 618 (not shown). The one or more control nodes 618 can be a portion of, or connected with, the terminal node 510. The control nodes 618 can connect with the terminal node 510 and/or the mobile node 110 via wired or wireless connections. The control nodes 618 can be any type of device that can send control signals to the mobile nodes 110, including, but not limited to, a desktop, a laptop, a tablet, a smartphone, and the like.
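The fan-out from a centralized control node 618 to the mobile nodes 110 can be sketched as a simple dispatch; the class and method names below are illustrative assumptions, since the disclosure does not specify a command interface:

```python
class MobileNode:
    """Stand-in for a mobile node 110 that records the commands it receives."""
    def __init__(self, name):
        self.name = name
        self.received = []

    def handle(self, command):
        self.received.append(command)

class ControlNode:
    """Centralized control node 618 that issues a command to one node or to all."""
    def __init__(self, nodes):
        self.nodes = {node.name: node for node in nodes}

    def send(self, command, target=None):
        targets = [self.nodes[target]] if target else list(self.nodes.values())
        for node in targets:
            node.handle(command)

nodes = [MobileNode("110A"), MobileNode("110B"), MobileNode("110C")]
control = ControlNode(nodes)
control.send("start-capture")            # coordinate all mobile nodes
control.send("pan-left", target="110B")  # steer one imaging device
print(nodes[1].received)  # ['start-capture', 'pan-left']
```

A distributed deployment would replace the in-process `handle` call with transmission over the wired or wireless connection to each node.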
Although shown and described as coordinating the one or more mobile nodes 110 after capturing the pictures from the mobile nodes 110, the coordinating can be conducted at any time before and/or while capturing the pictures.
The client receivers 910 can comprise any device that can have access to the Internet 808, including, but not limited to, a desktop, a laptop, a tablet and other handheld devices, e.g. a smartphone. In some embodiments, the client receivers 910 can serve as a control node 618. A user can issue a command, directed to a mobile node 110, to the terminal node 510 via the video server 810. The terminal node 510 can pass the command to the respective mobile node 110.
At 816, the second bitstream 222 can be made accessible, via the Internet 808, to the client receivers 910. Each of the client receivers 910 can connect to the video server 810 and be authenticated and/or authorized when each of the client receivers 910 selects to access the second bitstream 222.
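The authentication step at 816 can be sketched as a minimal token check performed before the second bitstream is served; the token store and viewer names below are purely hypothetical:

```python
AUTHORIZED_TOKENS = {"token-abc": "viewer-1"}  # hypothetical credential store

def access_stream(token: str) -> str:
    """Authenticate a client receiver 910 before granting access to the
    second bitstream 222; unknown tokens are rejected."""
    viewer = AUTHORIZED_TOKENS.get(token)
    if viewer is None:
        raise PermissionError("client receiver not authorized")
    return f"second bitstream granted to {viewer}"

print(access_stream("token-abc"))  # second bitstream granted to viewer-1
```

A production video server would of course use its own account system; the point is only that authentication and/or authorization gates access to the stream.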
The captured pictures can be video reflecting real-time views of a scene 125 (shown in
The microphone 610 can capture a sound signal and convert the audio signal into electrical data. The electrical data can be transmitted to the mixer 710 and then merged with the pictures. The audio signal can represent comments on and/or explanations of the pictures. A user can, for example, commentate on the pictures while watching the pictures on the display 612. The commentating voice can be converted into an electrical signal and mixed, via the mixer 710, with the captured pictures in a synchronized manner.
The second processor 518 can stream and/or segment the pictures into a second bitstream 222 (shown in
The terminal node 510 can have a connection 807 to the Internet 808, which can be a wired or a wireless connection. A video server 810 can receive the second bitstream 222 from the Internet 808 via an Internet connection 809. The second bitstream can be accessible to one or more client receivers 910 that have Internet access. In some embodiments, the second bitstream can be unpacked to facilitate the accessibility of the one or more client receivers 910.
The described embodiments are susceptible to various modifications and alternative forms, and specific examples thereof have been shown by way of example in the drawings and are herein described in detail. It should be understood, however, that the described embodiments are not to be limited to the particular forms or methods disclosed, but to the contrary, the present disclosure is to cover all modifications, equivalents, and alternatives.
This application is a continuation application of International Application No. PCT/CN2015/090749, filed on Sep. 25, 2015, the entire contents of which are incorporated herein by reference.
 | Number | Date | Country
---|---|---|---
Parent | PCT/CN2015/090749 | Sep 2015 | US
Child | 15912025 | | US