The present invention relates to the field of video production. More specifically, the present invention relates to a cloud-based video production system that enables real time or near real time manipulation of video from multiple sources.
In particular, the present invention addresses competing demands in the video production process: interaction, preview, and control require low latency video, while the signal delivered to the consumer must be of very high quality. It should be appreciated that the same technology disclosed herein for use with cloud-based video production can also be used for conventional video production.
Transmitting a broadcast quality video signal often incurs more latency than real time interaction allows in certain situations, such as a news anchor or television host conducting a live video conference with one or more remote guests. Conversely, real time video conferencing systems have difficulty delivering broadcast quality video.
It would be advantageous to provide methods and apparatus that enable real time or near real time interaction with video content while still enabling broadcast quality video production. It would also be advantageous to provide such features as part of a cloud-based video production system to produce live video content.
The methods and apparatus of the present invention provide the foregoing and other advantages.
The present invention relates to a cloud-based video production system that enables real time or near real time manipulation of video from multiple sources.
In accordance with an example embodiment of a method for real time video production using multiple streams, video content from each of one or more video sources is encoded into at least two separate streams, a first video stream from each of the one or more video sources comprising a low quality version of the video content having a low latency and a second video stream from each of the one or more video sources comprising a higher quality version of the video content having a higher latency. Corresponding video frames of the first video stream and the second video stream are provided with identical video timestamps. A first signal path is provided for communicating the first video stream from the one or more video sources to a user interface at the low latency. A second signal path is provided for communicating the second video stream from the one or more video sources to the user interface at the higher latency. Control signals comprising commands are sent (e.g., from the user interface to a control unit) for selecting and manipulating the video content from the one or more video sources. The commands may be based on one or more of the first video streams received on the first signal path. The commands are carried out on video content from one or more of the second video streams received on the second signal path to produce a video program.
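Purely by way of illustration, the dual encoding step might be sketched as follows. This is a minimal Python sketch and not the disclosed implementation; the `EncodedFrame` structure and the `proxy_encode` and `production_encode` callables are hypothetical stand-ins for whatever codecs and container formats are actually used.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class EncodedFrame:
    timestamp_ms: int  # identical across both streams for the same source frame
    quality: str       # "proxy" (low quality, low latency) or "production"
    payload: bytes     # compressed frame data

def encode_dual(frame_pixels: bytes, timestamp_ms: int,
                proxy_encode: Callable[[bytes], bytes],
                production_encode: Callable[[bytes], bytes]) -> tuple[EncodedFrame, EncodedFrame]:
    """Encode one captured frame into the two streams described above.

    The proxy stream trades quality for latency (e.g., low bitrate); the
    production stream uses a higher quality, higher latency configuration.
    Both frames carry the same capture timestamp so that commands issued
    against the proxy stream can later be applied to the production stream.
    """
    proxy = EncodedFrame(timestamp_ms, "proxy", proxy_encode(frame_pixels))
    production = EncodedFrame(timestamp_ms, "production", production_encode(frame_pixels))
    return proxy, production
```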
The second video streams may be buffered at one or more buffers to account for latencies in carrying out the commands. The buffering may enable rewind and instant replay features. A user device comprising the user interface may be adapted to receive the first video streams over the first signal path. A cloud production server may be adapted to receive the second video streams over the second signal path. The commands may be communicated from the user device to a control unit of the cloud production server.
The user device may comprise a display for displaying the first video streams from the one or more video sources.
The control signal may further comprise the video timestamp for each of the video frames subject to the command. The commands may then be executed when the video timestamp at an output of the buffer matches the video timestamp received in the control signal.
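A minimal sketch of this timestamp-matched execution follows, assuming frames leave the buffer in timestamp order and that `pending_commands` collects the commands received in control signals, keyed by their timestamps (all names are hypothetical):

```python
def drain_buffer(buffer, pending_commands, execute):
    """Yield frames from the high-latency buffer, applying each queued
    command when the frame whose timestamp matches the command's recorded
    timestamp reaches the buffer output.

    pending_commands: dict mapping a video timestamp (ms) to the list of
    commands recorded against that moment on the low-latency proxy stream.
    """
    for frame in buffer:  # frames emerge in timestamp order
        for command in pending_commands.pop(frame.timestamp_ms, []):
            execute(command, frame)  # e.g., switch source, toggle overlay
        yield frame  # frame continues toward the program output
```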
The second video streams may comprise production quality video. Further, the video content may comprise live video content.
The commands may comprise commands for: switching between the video content provided by the one or more video sources; selecting the video content or portions of the video content provided by one or more of the one or more video sources; combining the video content or portions of the video content from the one or more video sources; manipulating the video content or the portions of the video content; adjusting video brightness; providing a graphics overlay; turning on and off a video graphics overlay; adjusting an audio level; and adding special effects to the video content or portions of the video content.
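Such a command set might be represented, for example, as a simple enumeration. The names below are illustrative only and are not part of the disclosure:

```python
from enum import Enum, auto

class Command(Enum):
    # One possible enumeration of the command set listed above
    SWITCH_SOURCE = auto()       # switch between video sources
    SELECT_CONTENT = auto()      # select content or portions of content
    COMBINE_CONTENT = auto()     # combine content from multiple sources
    MANIPULATE_CONTENT = auto()  # manipulate content or portions thereof
    ADJUST_BRIGHTNESS = auto()   # adjust video brightness
    GRAPHICS_OVERLAY = auto()    # provide a graphics overlay
    TOGGLE_OVERLAY = auto()      # turn a graphics overlay on or off
    ADJUST_AUDIO_LEVEL = auto()  # adjust an audio level
    ADD_SPECIAL_EFFECT = auto()  # add special effects to the content
```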
The present invention also includes apparatus and systems for carrying out the method. An example embodiment of a system for real time video production using multiple streams may comprise one or more video sources each providing respective video content, each of the one or more video sources encoding the respective video content into at least two separate streams, a first video stream from each of the one or more video sources comprising a low quality version of the video content having a low latency and a second video stream from each of the one or more video sources comprising a higher quality version of the video content having a higher latency. The system may also comprise a first signal path for communicating the first video stream from the one or more video sources to a user interface at the low latency and a second signal path for communicating the second video stream from the one or more video sources to the user interface at the higher latency. A user interface may be provided for sending control signals comprising commands for selecting and manipulating the video content from the one or more video sources, the commands being based on one or more of the first video streams received on the first signal path. The system may also comprise a video production server for carrying out the commands on one or more of the second video streams to produce a video program. Corresponding video frames of the first video stream and the second video stream may be provided with identical video timestamps.
The systems and apparatus of the present invention may also include various features of the method embodiments discussed above.
The present invention will hereinafter be described in conjunction with the appended drawing figures, wherein like reference numerals denote like elements.
The ensuing detailed description provides exemplary embodiments only, and is not intended to limit the scope, applicability, or configuration of the invention. Rather, the ensuing detailed description of the exemplary embodiments will provide those skilled in the art with an enabling description for implementing an embodiment of the invention. It should be understood that various changes may be made in the function and arrangement of elements without departing from the spirit and scope of the invention as set forth in the appended claims.
The present invention relates to a cloud-based video production system that delivers real time or near real time video with broadcast quality. Video from the same source is encoded into at least two separate streams, one with low latency and another with higher latency and higher quality. Corresponding frames of each stream have identical timestamps. Encoding the same video content into two streams with identical timestamps but different latencies and quality enables advantageous methods of video production and control, as will be explained in detail below in connection with exemplary embodiments.
The video signals may comprise any combination of one or more of video, audio, and data. The video sources 10 may comprise one or more of a video camera, a camcorder, a television camera, a movie camera, a portable electronic device, an IP or web camera, a tablet computer, a laptop or computer monitor with built-in web camera, a smart phone, or the like. The first signal path 1 is a real time control (RTC) path having low latency. The first signal path 1 is used to communicate the lower quality video content from video sources 10 to a producer user interface (UI) 12. The second signal path 2 provides the same video content at a higher video quality than that provided over the first signal path 1, but with longer latency. The video provided over the second signal path 2 may be, for example, production quality video.
The video streams sent along the first signal path 1 may be seen as proxy signals. All interactions between the video sources 10 and the producer UI 12 take place over the first signal path 1. For example, commands initiated from the producer UI 12 over the first signal path 1 may comprise commands for switching between the video sources 10, selecting the video content (or portions of the video content) from one or more of the video sources 10, combining the video content (or portions of the video content) from one or more of the video sources 10, manipulating the video content or portions of the video content, adjusting video brightness, providing a graphics overlay, turning on and off a graphics overlay, adjusting an audio level, adding special effects (either via embedding or overlay) to the video content or portions of the video content, and the like.
The video streams sent along the second signal path 2 are used to create high quality video output based upon the commands from the producer UI 12.
A user may operate the system remotely through a network 20 via the producer UI 12. The producer UI 12 receives the video streams over the first signal path 1. The producer UI 12 may be an application or a web browser running on an Internet enabled user device 15 (e.g., a computer, a laptop, a portable computer, a tablet computer, a smart phone, a smart watch, or any other type of personal computing device or other Internet or network enabled device). The producer UI 12 may include a display for displaying the video content from the one or more video sources 10 received via the first signal path 1 (e.g., via the network 20). The producer UI 12 enables a user to select and control which content to use in the video program via control signals containing commands sent to the cloud production server 14. Selection of the video content and execution of commands from the producer UI 12 are carried out via a control unit 18 located at, or in communication with, the cloud production server 14.
In order to apply the commands to the program output of the buffer 16 at exactly the time location instructed by the producer UI 12, the video streams sent along signal paths 1 and 2 use identical timestamps for the same moment of the video content, which may include video, audio, and/or data.
The one or more buffers 16 are provided to account for anticipated delay between the cloud production server 14 and the producer UI 12. The high quality video streams sent from the video sources 10 to the cloud production server 14 along the second signal path 2 are buffered in the one or more buffers 16. When a user initiates an action on the producer UI 12 based on the low latency video streams received along the first signal path 1, the producer UI 12 creates a corresponding command and also records the timestamps of the video showing on the producer UI 12 at that moment. The producer UI 12 then sends a control signal containing the command(s), together with the timestamp(s) of the corresponding video frames subject to the commands, to the cloud production server 14 for execution on the program content.
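The control signal itself could take many forms. The following sketch assumes a JSON message; the field names and the use of JSON are assumptions made for illustration, not details of the disclosure:

```python
import json
import time

def make_control_signal(command: str, frame_timestamps_ms: list[int],
                        args: dict | None = None) -> bytes:
    """Build a control signal, as sent from the producer UI 12 to the
    control unit 18: the command, the timestamp(s) of the frames it
    applies to, and any command-specific arguments.
    """
    return json.dumps({
        "command": command,                    # e.g., "SWITCH_SOURCE"
        "timestamps_ms": frame_timestamps_ms,  # frames subject to the command
        "args": args or {},                    # e.g., {"target_source": 2}
        "sent_at": time.time(),                # wall-clock send time
    }).encode("utf-8")

# Hypothetical usage: switch to source 2 at the frame the producer saw.
signal = make_control_signal("SWITCH_SOURCE", [1693452], {"target_source": 2})
```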
Once the control unit 18 receives a command together with the timestamp from the producer UI 12, it will execute the command on the video streams sent along the second signal path 2 at the buffer 16, according to the timestamps in the video signal. The cloud production server 14 then produces a high-quality video output 22 based on the video streams from the second signal path 2.
For example, during a video conference among multiple participants, each participant may join using a separate video source 10 (e.g., a web cam). Each video source 10 for each participant may simultaneously send video streams over the first and second signal paths 1 and 2, with a lower quality stream being sent with low latency over the first signal path 1 and a higher quality video stream with higher latency being sent over the second signal path 2. The video streams are received by the cloud production server 14 and the producer UI 12, with the video streams sent over the first signal path 1 being made available to a user on the producer UI 12 and the video streams sent over the second signal path 2 being buffered at the one or more buffers 16. During the video conference, the video output 22 may show a first participant in the video conference.
If the user decides to switch from viewing or speaking with the first participant in the video conference, an instruction may be sent to switch to a different video source 10 for a second participant. The command and a timestamp of the video stream at the time the command was sent will be provided from the producer UI 12 to the control unit 18. The control unit 18 will execute the command at a time when the video timestamp at the buffer 16 matches the timestamp received from the producer UI 12. After execution of the command, the buffer 16 will output video received along the second signal path 2 from the video source 10 of the second participant, resulting in a high-quality video output 22.
Since the video content received from the video sources 10 along the second signal path 2 is delayed by the one or more buffers 16, any network delay in carrying out the command from the producer UI 12 is accounted for. Moreover, such delay is kept small since the commands are based on the low latency video streams received along the first signal path 1. Using the timestamps from the video content received over the first signal path 1 and the corresponding timestamps in the commands, the desired actions specified in the commands can be synchronized to affect the desired video content output from the second signal path 2 at the appropriate time.
The video output 22 may be provided for live broadcast or distribution to other media platforms, such as one or more social media outlets, a digital media distribution platform, or the like. The completed video program 22 may also be downloaded to the user device 15 and distributed to media outlets, or further modified or edited prior to such distribution.
The system is particularly advantageous for the production of live video conferences among multiple participants. However, the system may also be used to create various other types of video programs, including news programs, sports, weather, live events, entertainment, and more. The video sources 10 may provide live video content for the production of live video programs. The video content can also be stored for later use. For example, the system may be used to produce a live sporting event, where each video source comprises a different camera or camera angle of the sporting event. Instant replays can be generated by sending commands to the control unit 18 requesting the addition of content, from one or more of the video sources, that includes the play to be shown in the instant replay.
The one or more buffers 16 may be large enough to record all input feeds from the corresponding video source 10 for the entire event, facilitating instant replay and rewind features. The buffers 16 may also be used for storing the raw video content for future production needs, such as reproducing the video program. The buffers 16 may be implemented with short term memory (RAM), local storage (hard disk on the cloud production server 14 or otherwise associated with the video source 10), and long term storage (cloud storage, such as AWS S3 or the like). Access to long term storage of the buffers 16 may be seamless, similar to access to short term or local storage.
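One way such seamless tiering could look in code is sketched below. This is a sketch under the assumption of simple `put`/`get` key-value backends for the disk and cloud tiers; neither the tier sizes nor the storage backends are specified by the disclosure:

```python
class TieredBuffer:
    """Illustrative three-tier frame buffer: recent frames in RAM, older
    frames spilled to local disk, oldest frames archived to cloud object
    storage. The lookup order makes access to every tier appear seamless
    to callers, as described above.
    """
    def __init__(self, ram_capacity: int, disk, cloud):
        self.ram: dict[int, bytes] = {}  # timestamp_ms -> encoded frame
        self.ram_capacity = ram_capacity
        self.disk = disk    # hypothetical local key/value store
        self.cloud = cloud  # hypothetical S3-like object store

    def put(self, timestamp_ms: int, frame: bytes) -> None:
        self.ram[timestamp_ms] = frame
        if len(self.ram) > self.ram_capacity:  # spill oldest frame to disk
            oldest = min(self.ram)
            self.disk.put(oldest, self.ram.pop(oldest))

    def get(self, timestamp_ms: int) -> bytes:
        if timestamp_ms in self.ram:         # hot tier: RAM
            return self.ram[timestamp_ms]
        frame = self.disk.get(timestamp_ms)  # warm tier: local storage
        if frame is not None:
            return frame
        return self.cloud.get(timestamp_ms)  # cold tier: cloud storage
```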
It should now be appreciated that the present invention provides advantageous methods and apparatus for providing a high-quality video output from a cloud-based video production system.
Although the invention has been described in connection with various illustrated embodiments, numerous modifications and adaptations may be made thereto without departing from the spirit and scope of the invention as set forth in the claims.
This application claims the benefit of U.S. Provisional Application No. 63/018,717 filed on May 1, 2020, and is a continuation-in-part of commonly owned co-pending U.S. application Ser. No. 16/369,957 filed on Mar. 29, 2019, which claims the benefit of U.S. Provisional Application No. 62/652,978 filed on Apr. 5, 2018, each of which is incorporated herein and made a part hereof by reference for all purposes.