The present invention relates to live broadcast/streaming technology, and more particularly to a live broadcasting/streaming device and related live broadcasting/streaming method with cross-platform synchronous live broadcasting/streaming capabilities and independent control of screen layout.
Advancements in data transmission over the internet and image encoding technologies have significantly fueled the growth of live streaming platforms. Today, live broadcasts/streams of various styles and contents have become an integral part of modern entertainment. To cater to this trend, the market has introduced a range of software and hardware solutions designed for individual users to facilitate live streaming. These live streaming devices enable users to conveniently stream local video content to live streaming platforms in the form of streaming media for viewership. However, existing live streaming devices have several noticeable limitations. Firstly, existing live streaming devices do not support simultaneous multi-platform broadcasting/streaming. Live-streamers are unable to stream video content simultaneously to different live streaming platforms for cross-platform synchronous broadcasting. Additionally, existing live streaming devices lack dynamic bitrate adjustment capabilities. When the network status between a live streaming device and a live streaming platform fluctuates, these devices fail to automatically adjust the bitrate of the streaming media, thus compromising the viewer's experience. Furthermore, existing live streaming devices do not allow live-streamers to adjust the screen layout during a live streaming event. Once a live streaming event starts, the screen layout becomes fixed and cannot be modified in real-time according to the needs of the live-streamers or specific application scenarios. Lastly, existing live streaming devices are unable to differentiate between a local monitoring video and a live output video. These live streaming devices limit the content of the monitoring video and the live output video to be identical, lacking adaptability to different scenes or requirements. From the above, it is evident that existing live streaming devices have numerous areas that urgently need improvement.
In light of the above, it is one of the objectives of the present invention to provide a novel live streaming device and method. The live streaming device and method of the present invention are characterized by the following features. Firstly, the present invention allows for direct interfacing with multiple live streaming platforms without the need for third-party cloud service platforms. It enables simultaneous transmission of live streaming media to multiple live streaming platforms, thereby facilitating cross-platform synchronous broadcasting/streaming, and it features dynamic bitrate adjustment for the streaming media. Furthermore, the live streaming device and method of the present invention support real-time adjustments to screen layouts. Even after a live streaming event has commenced and the live streaming media is being streamed to the live streaming platforms, users retain the ability to modify the screen layout at any moment. In addition, the present invention also supports independent control of screen layouts, allowing users to employ separate screen layouts for local monitoring and live output.
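For illustration only, the following non-limiting Python sketch shows one plausible way dynamic bitrate adjustment could be realized in software. The disclosure names the capability but does not specify an algorithm; the function name, parameters, and thresholds below are assumptions of this example, which merely assumes the device periodically measures the achieved upload throughput toward a platform and steers the encoder bitrate toward that measurement.

```python
# Illustrative sketch only: not an algorithm taken from the disclosure.
def adjust_bitrate(current_kbps, measured_throughput_kbps,
                   min_kbps=800, max_kbps=8000, headroom=0.8):
    """Return a new target bitrate bounded by the encoder's limits.

    `headroom` keeps the target below the measured network capacity so that
    transient dips do not immediately cause dropped frames.
    """
    target = int(measured_throughput_kbps * headroom)
    # Step gradually toward the target to avoid visible quality oscillation.
    step = max(100, abs(target - current_kbps) // 4)
    if target < current_kbps:
        return max(current_kbps - step, min_kbps)
    return min(current_kbps + step, max_kbps)

# Example: the measured throughput drops to 4000 kbps while streaming at 6000 kbps.
print(adjust_bitrate(6000, 4000))  # 5300 -- stepping down toward ~3200 kbps
```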
According to one embodiment, a multi-platform live streaming device is provided. The multi-platform live streaming device comprises: a user interface service module and a streaming output service module. The user interface service module is configured to control one or more image elements based on a user interface setting, thereby determining screen layouts of a local monitoring video and a live output video respectively. The streaming output service module is configured to encode the live output video to generate live streaming media and to stream the live streaming media simultaneously to multiple live streaming platforms. Specifically, the screen layout of the local monitoring video can differ from that of the live output video.
According to one embodiment, a multi-platform live streaming method is provided. The multi-platform live streaming method comprises: controlling one or more image elements based on a user interface setting, thereby determining screen layouts of a local monitoring video and a live output video respectively; and encoding the live output video to generate live streaming media and streaming the live streaming media simultaneously to multiple live streaming platforms. Specifically, the screen layout of the local monitoring video can differ from that of the live output video.
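For concreteness, the following is a minimal, non-limiting structural sketch of the two modules described above, written in Python. The class names, fields, and methods are illustrative assumptions of this example rather than an implementation disclosed herein; the per-element visibility flags are what allow the monitoring layout and the output layout to diverge.

```python
from dataclasses import dataclass, field

@dataclass
class ImageElement:
    name: str
    position: tuple = (0, 0)
    size: tuple = (1920, 1080)
    z_index: int = 0
    visible_in_monitor: bool = True   # shown in the local monitoring video
    visible_in_output: bool = True    # shown in the live output video

@dataclass
class UserInterfaceService:
    """Determines the screen layouts of the monitoring and output videos."""
    elements: list = field(default_factory=list)

    def apply_setting(self, setting: dict):
        # A user interface setting maps element names to attribute overrides.
        for element in self.elements:
            for key, value in setting.get(element.name, {}).items():
                setattr(element, key, value)

    def layout(self, target: str):
        # Return the elements visible for the requested output ("monitor" or "output").
        flag = "visible_in_monitor" if target == "monitor" else "visible_in_output"
        return [e for e in self.elements if getattr(e, flag)]

class StreamingOutputService:
    """Encodes the live output video and streams it to every configured platform."""

    def __init__(self, platform_urls):
        self.platform_urls = list(platform_urls)

    def stream(self, encoded_media: bytes):
        for url in self.platform_urls:
            self._send(url, encoded_media)    # e.g., one streaming session per platform

    def _send(self, url, payload):
        pass                                  # network transmission is outside this sketch
```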
These and other objectives of the present invention will no doubt become obvious to those of ordinary skill in the art after reading the following detailed description of the preferred embodiment that is illustrated in the various figures and drawings.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present embodiments. It will be apparent, however, to one having ordinary skill in the art that these specific details need not be employed to practice the present embodiments. In other instances, well-known materials or methods have not been described in detail in order to avoid obscuring the present embodiments.
Reference throughout this specification to “one embodiment” or “an embodiment” means that a particular feature, structure or characteristic described in connection with the embodiment or example is included in at least one embodiment of the present embodiments. Thus, appearances of the phrases “in one embodiment” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures or characteristics may be combined in any suitable combinations and/or sub-combinations in one or more embodiments.
Please refer to
The live streaming device 100 may comprise ports A (111) and B (112), a network interface 113, and signal transmission interfaces A (114) and B (115). The ports A (111) and B (112) are coupled respectively to image capturing devices 130 and 140 (e.g., cameras with video or image sequence output capabilities), incorporating captured images or videos from the cameras 130 and 140 as part of the live streaming/broadcasting content. In one embodiment, the ports A (111) and B (112) can be ports compliant with the Universal Serial Bus (USB) standard. The signal transmission interface A (114) can be coupled to a display device 135 to display a local monitoring video produced by the live streaming device 100 for the streamer/broadcaster to view the current live streaming/broadcasting content, where the signal transmission interface A (114) may be a V-by-One interface. The signal transmission interface B (115) can be coupled to a video source 145 to utilize videos or images outputted by the video source 145 as base videos or images for the live streaming/broadcasting content, where the signal transmission interface B (115) can be a High Definition Multimedia Interface (HDMI) or a DisplayPort (DP) interface. In one embodiment, the video source 145 can be a personal computer, mobile phone, tablet, game console, etc. The network interface 113 is coupled to the internet; the live streaming device 100 encodes a locally produced live output video into live streaming media and then sends the live streaming media to platforms A, B, and C via real-time messaging protocols.
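Purely as an illustration of how the interfaces above might be represented in software, the following configuration sketch mirrors the components just described; the field names, interface strings, and URLs are hypothetical placeholders, not values disclosed herein.

```python
from dataclasses import dataclass

@dataclass
class DeviceConfig:
    camera_ports: tuple = ("USB", "USB")        # ports A (111) and B (112)
    monitor_interface: str = "V-by-One"         # signal transmission interface A (114)
    source_interface: str = "HDMI"              # signal transmission interface B (115); "DP" also possible
    stream_targets: tuple = (                   # platforms A, B, and C
        "rtmp://platform-a.example/live/streamkey",
        "rtmp://platform-b.example/live/streamkey",
        "rtmp://platform-c.example/live/streamkey",
    )

config = DeviceConfig()
print(len(config.stream_targets))               # the same media is pushed to every target
```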
Please note that types and quantities of the image elements 191-195 mentioned in the aforementioned embodiment are not limitations of the present invention. In some embodiments, the screen layouts of the local monitoring video and the live output video may include more or fewer image elements, as well as image elements different from those shown in
At step S19, it is determined whether the information UI has been enabled; if not, the flow proceeds to step S20; if yes, it goes to step S21. At step S20, as the information UI is disabled, it can be enabled at step S23. On the other hand, at step S21, since the information UI is already enabled, the flow may proceed to the next step, or enter step S22 to disable the information UI. When the information UI is enabled, steps S24, S25, and S26 allow the streamer or the broadcaster to adjust individual image elements in the information UI according to their needs and operations, including movement of individual image elements (position adjustment), scaling of individual image elements (size adjustment), and Z-axis positions of individual image elements (a stacking order/occlusion relationship). Once adjustments to the information UI are completed, relevant parameters are saved as new user interface setting data or utilized to update existing user interface setting data in the database (at step S27). At steps S28 and S29, the editing mode is exited and a live streaming/broadcasting mode is entered. At step S30, the real-time information service provided by the user interface service module 160 is enabled (i.e., displaying the information UI in the local monitoring video or the live output video). At step S31, previously set user interface setting data is loaded from the database. At step S32, composition and display settings of the information UI are updated based on the loaded user interface setting data. At step S33, a streaming service provided by the streaming output service module 170 starts, outputting the local monitoring video to the local display device 135 and encoding the live output video into the live streaming media to be streamed to the live streaming platforms. At step S34, the composition and display settings of the information UI are updated. At this step, since the network connection status UI in the information UI (if present) needs to display real-time network connection information, and the streamer or the broadcaster might adjust the information UI during the live streaming/broadcasting (e.g., hiding specific image elements, or adjusting transparency, sizes, positions, and Z-axis positions of specific image elements), the composition and display settings of the information UI need to be continuously updated after the start of the live streaming/broadcasting event. Finally, at step S35, the information UI is displayed in the local monitoring video or the live output video based on the updated composition and display settings.
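As a non-limiting illustration of steps S27 and S31 through S35, the following Python sketch assumes a simple key-value store for user interface setting data; the helpers compose, encode, and display_locally are hypothetical placeholders, and apply_setting/layout reuse the illustrative UserInterfaceService class from the earlier sketch.

```python
# Simplified flow sketch (steps S27 and S31-S35); none of the names below are
# taken from the disclosure, and the database is a plain dict.

def compose(frame, visible_elements):
    return frame                      # placeholder: overlay the elements onto the frame

def encode(video):
    return b"encoded"                 # placeholder: encode into streaming media

def display_locally(video):
    pass                              # placeholder: output to the local display device

def save_ui_setting(db, setting):
    db["ui_setting"] = dict(setting)  # step S27: persist the edited layout

def run_live_streaming(db, ui_service, stream_service, frames):
    ui_service.apply_setting(db.get("ui_setting", {}))        # steps S31-S32
    for frame in frames:                                       # step S33: streaming runs
        # Step S34: refresh composition every cycle, because the network-status
        # UI changes continuously and the streamer may still edit the layout.
        ui_service.apply_setting(db.get("ui_setting", {}))
        monitor_video = compose(frame, ui_service.layout("monitor"))
        output_video = compose(frame, ui_service.layout("output"))
        display_locally(monitor_video)                         # step S35 (monitoring side)
        stream_service.stream(encode(output_video))            # live output to all platforms
```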
In the present invention, the user interface service module 160 enables the screen layout of the local monitoring video to differ from that of the live output video. This requires an architecture that allows independent control of individual image elements. Please refer to
Further, the status information 310 provides transparency information, size information, position information, and actual content information of various status information UIs to be displayed (including the timer UI and the network connection status UI) to the composer 320 for rendering related graphics. The system UI 311 provides UI information regarding an underlying operating system to the composer 320 for rendering the related graphics. The cursor 312 provides position information of the cursor to the composer 320 for rendering related graphics. The video source 313 provides base videos or images to the composer 320. The camera sources 314 and 315 provide captured videos or images to the composer 320. Furthermore, the composer 320, based on the Z-axis position corresponding to each image element, blends or mixes the rendered graphics with the correct stacking order and occlusion relationships, thereby rendering the local monitoring video and the live output video. The composer 320 can also decide whether to display or hide the rendered graphics. Thus, the local monitoring video and the live output video can have different screen layouts. For example, the streamer or the broadcaster can choose to display certain image elements in the local monitoring video but not in the live output video, or vice versa. Additionally, the streamer or the broadcaster can, during the live streaming/broadcasting event, automatically or manually hide specific image elements already displayed in the local monitoring video or the live output video, thereby dynamically changing the screen layout in real-time.
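To make the compositing behavior concrete, the following non-limiting sketch blends per-element graphics in Z-axis order and honors a separate visibility flag for each output. The element fields, the straight alpha blend, and all names are assumptions of this example only, not details taken from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class OverlayElement:
    color: tuple                     # (r, g, b) of the rendered graphic at a given pixel
    alpha: float                     # transparency information, 0.0 (invisible) to 1.0 (opaque)
    z_index: int                     # Z-axis position (stacking order)
    visible_in_monitor: bool = True
    visible_in_output: bool = True

def blend(base_rgb, overlay_rgb, alpha):
    """Straight alpha blend of two (r, g, b) tuples."""
    return tuple(int(o * alpha + b * (1 - alpha)) for o, b in zip(overlay_rgb, base_rgb))

def compose_pixel(base_rgb, elements, target):
    """Blend every element visible for `target` ("monitor" or "output") in Z order."""
    flag = "visible_in_monitor" if target == "monitor" else "visible_in_output"
    result = base_rgb
    # Lower z_index values are drawn first, so higher values end up on top.
    for element in sorted(elements, key=lambda e: e.z_index):
        if getattr(element, flag):
            result = blend(result, element.color, element.alpha)
    return result

# Example: a timer UI shown only in the local monitoring video.
timer_ui = OverlayElement((255, 255, 255), 0.8, z_index=2, visible_in_output=False)
network_ui = OverlayElement((0, 255, 0), 0.5, z_index=1)
print(compose_pixel((20, 20, 20), [timer_ui, network_ui], "monitor"))  # both overlays blended
print(compose_pixel((20, 20, 20), [timer_ui, network_ui], "output"))   # timer UI hidden
```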
Based on the above descriptions,
Step S110: controlling one or more image elements based on a user interface setting to determine screen layouts of a local monitoring video and a live output video respectively; and
Step S120: encoding the live output video to generate live streaming media and streaming the live streaming media simultaneously to multiple live streaming platforms, where the screen layout of the local monitoring video can differ from that of the live output video.
The principles and specific details of the above steps have been thoroughly explained through previous embodiments and are not reiterated here. It should be noted that the aforementioned flow can be enhanced by adding additional steps or through appropriate modifications and adjustments to better achieve the generation and streaming of the live output video, thereby obtaining an improved live broadcasting/streaming experience.
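As a further illustration, and reusing the hypothetical UserInterfaceService, ImageElement, and StreamingOutputService classes from the sketch earlier in this description, steps S110 and S120 could be exercised as follows; the element names and URLs are placeholders only.

```python
# Step S110: one user interface setting drives both layouts, which may differ per output.
ui_service = UserInterfaceService(elements=[
    ImageElement("timer", visible_in_output=False),   # shown only while monitoring locally
    ImageElement("network_status"),
])
ui_service.apply_setting({"timer": {"position": (32, 32)}})
monitor_layout = ui_service.layout("monitor")         # includes the timer UI
output_layout = ui_service.layout("output")           # the timer UI is hidden here

# Step S120: the encoded live output video is streamed to every platform at once.
stream_service = StreamingOutputService([
    "rtmp://platform-a.example/live/streamkey",
    "rtmp://platform-b.example/live/streamkey",
])
stream_service.stream(b"encoded-live-output-media")
```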
Embodiments in accordance with the present embodiments can be implemented as an apparatus, method, or computer program product. Accordingly, the present embodiments may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects that can all generally be referred to herein as a “module” or “system.” Furthermore, the present embodiments may take the form of a computer program product embodied in any tangible medium of expression having computer-usable program code embodied in the medium. In terms of hardware, the present invention can be accomplished by applying any of the following technologies or related combinations: an individual operation logic with logic gates capable of performing logic functions according to data signals, an application specific integrated circuit (ASIC), a programmable gate array (PGA), or a field programmable gate array (FPGA) with a suitable combinational logic.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present embodiments. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It is also noted that each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions. These computer program instructions can be stored in a computer-readable medium that directs a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable medium produce an article of manufacture including instruction means which implement the function/act specified in the flowchart and/or block diagram block or blocks.
Those skilled in the art will readily observe that numerous modifications and alterations of the device and method may be made while retaining the teachings of the invention. Accordingly, the above disclosure should be construed as limited only by the metes and bounds of the appended claims.
Foreign application priority data: No. 202311612385.2, filed Nov. 2023, CN (national).