The present invention relates to the field of video production. More specifically, the present invention relates to a cloud-based video production system.
Live video production, for example for news or sports broadcasts, requires a physical production switcher along with production staff on site, and is a costly process. It would be advantageous to be able to carry out the entire production process on a cloud-based video production system without requiring the physical presence of production staff on site.
The methods and apparatus of the present invention provide the foregoing and other advantages.
The present invention relates to a cloud-based video production system.
In an example embodiment of a cloud-based video production system in accordance with the present invention, a cloud-based video production server is provided. A remote user interface running on a user device is also provided. One or more video sources are in communication with the cloud-based video production server and the remote user interface via a network. A control unit, located at or in communication with the cloud-based video production server, is in communication with the remote user interface via the network. A buffer, corresponding to each of the one or more video sources, is disposed between each of the one or more video sources and the control unit to account for network delays. Each frame of video content is provided with a video timestamp. Commands for selecting and manipulating video content from the one or more video sources are sent from the user interface to the control unit, each of the commands containing a command timestamp corresponding to the video timestamp of the video frame displayed on the user interface when the command is issued. The control unit executes each command at a time when the video timestamp at an output of the corresponding buffer corresponds to the command timestamp. The commands may comprise at least one of commands for providing the video content or a portion of the video content provided by the one or more video sources with a graphics overlay and commands for turning on or off the graphics overlay. The control unit outputs a video program in accordance with the commands.
The user interface may comprise one of an application or a web browser running on an Internet-enabled user device. The user device may comprise one of a computer, a laptop computer, a portable computer, a tablet computer, a smart phone, a smart watch, a personal computing device, an Internet-enabled device, or the like.
The content from the one or more video sources is viewable on the user interface.
The commands may further comprise commands for selecting from among the video content or portions of the video content provided by the one or more video sources, commands for combining the video content or portions of the video content, commands for switching between the one or more video sources, commands for manipulating the video content or portions of the video content, commands for adjusting an audio level for the video content or the portions of the video content, and the like.
Each of the one or more video sources comprises one of a video camera, a camcorder, a television camera, a movie camera, a portable electronic device, a tablet computer, a smart phone, an IP or web camera, or the like.
The video program is output for at least one of live broadcast, distribution to one or more social media platforms, and distribution to a digital media distribution platform. The video program may also be downloaded to the user device, and the downloaded video program may either be distributed to media outlets or social media platforms in its original downloaded form, or modified at the user device prior to such distribution.
The video program may comprise one of a news program, a sports program, a weather program, a live event program, an entertainment program, or the like.
The corresponding buffer delays the video content to account for network delay in carrying out the commands.
The instructions for the commands may be sent via an API. The instructions may be scriptable.
The present invention also encompasses a method for cloud-based video production. The method may comprise providing a cloud-based video production server, providing a remote user interface running on a user device, providing video content from one or more video sources to the cloud-based video production server and the remote user interface via a network, providing a control unit located at or in communication with the cloud-based video production server which is also in communication with the remote user interface via the network, and buffering the video content via a corresponding buffer for each of the one or more video sources. The corresponding buffer may be disposed between each of the one or more video sources and the control unit to account for network delays. The method may further comprise providing each frame of video content with a video timestamp, sending commands from the user interface to the control unit for selecting and manipulating video content from the one or more video sources, each of the commands containing a command timestamp corresponding to the video timestamp of the video frame displayed on the user interface when the command is issued, executing each command at the control unit at a time when the video timestamp at an output of the corresponding buffer corresponds to the command timestamp, and outputting a video program in accordance with the commands. The commands may comprise at least one of commands for providing the video content or a portion of the video content provided by the one or more video sources with a graphics overlay and commands for turning on or off the graphics overlay.
The method embodiments of the present invention may also include various features and functionality of the apparatus and system embodiments discussed above.
The present invention will hereinafter be described in conjunction with the appended drawing FIGURE.
The ensuing detailed description provides exemplary embodiments only, and is not intended to limit the scope, applicability, or configuration of the invention. Rather, the ensuing detailed description of the exemplary embodiments will provide those skilled in the art with an enabling description for implementing an embodiment of the invention. It should be understood that various changes may be made in the function and arrangement of elements without departing from the spirit and scope of the invention as set forth in the appended claims.
The present invention relates to a cloud-based video production system. Such a system reduces the cost of live video production by eliminating the need for a physical production switcher and on-site production staff.
The commands may comprise commands for switching between the video sources 16, selecting the video content (or portions of the video content) from one or more of the video sources 16, combining the video content (or portions of the video content) from one or more of the video sources 16, manipulating the video content or portions of the video content, providing the video content (or one or more portions of the video content) with a graphics overlay, turning on and off a graphics overlay, adjusting an audio level, and the like.
The raw video content (the input video provided by the video sources 16) comprises the unprocessed materials for use by the video production server 10 in producing the finished video program 22. These raw materials may also be recorded by the buffers 20 or associated storage for later use, as discussed below. Each frame of the raw video content, as well as each frame of the final video program 22, is provided with a time stamp or time code. The time stamps/time codes of the raw video content may be different from those of the video program 22.
In one example embodiment, instructions for the commands may be sent via an application program interface (API), and all instructions may be scriptable. It should be appreciated that the video production process can be driven by a user interface, by scripts, or by an AI engine which generates the commands. In addition, all the instructions and/or commands provided to the system during the video production process may be recorded with the corresponding time stamp(s)/time code(s) of the original video content used for the production, and stored along with that original video content. The resulting video program 22 can then be reproduced using the raw video content and the recorded/stored command scripts or instructions.
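By way of a non-limiting illustration, the following sketch shows how such scriptable commands might be recorded with their timecodes and later replayed to reproduce the video program. The Command and CommandLog names, the field layout, and the control unit's submit() method are hypothetical conveniences, not part of the disclosure:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Command:
    timestamp: int   # timecode of the raw video frame the command targets
    action: str      # e.g. "switch_source", "overlay_on", "set_audio_level"
    params: dict = field(default_factory=dict)

class CommandLog:
    """Records every command with its timecode so the finished program
    can be reproduced later from the archived raw video content."""
    def __init__(self) -> None:
        self.entries: List[Command] = []

    def record(self, cmd: Command) -> None:
        self.entries.append(cmd)

    def replay(self, control_unit) -> None:
        # Re-issue the stored commands in timecode order against the
        # archived raw content to regenerate the video program 22.
        for cmd in sorted(self.entries, key=lambda c: c.timestamp):
            control_unit.submit(cmd)
```

Because the log is plain data, the same entries can equally be generated by a script or an AI engine rather than by a human operator.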
The video sources 16 may comprise one or more of a video camera, a camcorder, a television camera, a movie camera, a portable electronic device, a tablet computer, a smart phone, an IP or web camera, or the like.
One challenge in making such a cloud-based system work efficiently is the presence of network delay. When there is a network delay, the actual execution of any command on the cloud-based production system occurs some time after the corresponding video frame appears on a display of the user device 15.
To solve any problems associated with network delay, a buffer 20 is added between each video source 16 and the control unit 18. Each video frame of the video content from all the video sources 16 is provided with a timestamp. When a user enters a command into the user interface 14, the command will be sent to the cloud-based video production server 10 with a command timestamp corresponding to the video timestamp of the video frame displayed on the user interface 14 when the command is issued. The control unit 18 in the cloud-based video production server 10 will execute the command at a time when the video timestamp at an output of the video buffer 20 matches or passes the command timestamp.
Since the video content from the video sources 16 is delayed by the corresponding buffer 20, any network delay in carrying out the command from the user interface 14 is accounted for. Using the timestamps from the video content and the corresponding timestamps from the commands, the desired actions specified in the commands can be synchronized to affect the desired video content at the appropriate time.
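A minimal sketch of this synchronization is given below, assuming a per-frame timestamp attribute and hypothetical FrameBuffer and ControlUnit classes (none of these names come from the disclosure); the matches-or-passes rule described above appears as the `<=` comparison:

```python
import heapq
from collections import deque

class FrameBuffer:
    """Per-source FIFO that delays frames long enough to absorb the
    network delay between the user interface and the server."""
    def __init__(self, delay_frames: int):
        self.delay_frames = delay_frames
        self._queue = deque()

    def push(self, frame) -> None:       # each frame carries a .timestamp
        self._queue.append(frame)

    def pop_ready(self):
        # Release the oldest frame only once the configured delay has built up.
        if len(self._queue) > self.delay_frames:
            return self._queue.popleft()
        return None

class ControlUnit:
    """Holds pending commands and applies each one when the frame at the
    buffer output matches or passes the command's timestamp."""
    def __init__(self) -> None:
        self._pending = []               # min-heap keyed on command timestamp
        self._seq = 0                    # tie-breaker for equal timestamps

    def submit(self, cmd) -> None:
        heapq.heappush(self._pending, (cmd.timestamp, self._seq, cmd))
        self._seq += 1

    def on_frame(self, frame):
        # Execute every command whose timestamp the buffer output has reached.
        while self._pending and self._pending[0][0] <= frame.timestamp:
            _, _, cmd = heapq.heappop(self._pending)
            self._apply(cmd, frame)      # e.g. switch sources, toggle overlay
        return frame                     # the frame flows on into the program

    def _apply(self, cmd, frame) -> None:
        pass                             # production actions go here
```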
The completed video program 22 may be output from the cloud-based video production server 10 for live broadcast or distribution to other media platforms, such as one or more social media outlets, a digital media distribution platform, or the like. The completed video program 22 may also be downloaded to the user device 15 and distributed to media outlets or further modified or edited prior to such distribution.
The system may be used to create various types of video programs, including news programs, sports, weather, live events, entertainment, and more. The video sources 16 may provide live video content for the production of live video programs. The video content can also be stored for later use. For example, the system may be used to produce a live sporting event, where each video source comprises a different camera or camera angle of the sporting event. Instant replays can be generated by sending commands to the control unit 18 requesting the addition of content from one or more of the video sources which includes the play to be shown in the instant replay. Each buffer 20 may be large enough to record all input feeds from the corresponding video source 16 for the entire event, facilitating instant replay and rewind features. The buffers 20 may also be used for storing the raw video content for future production needs, such as reproducing the video program. The buffers may be implemented with short-term memory (RAM), local storage (a hard disk on the video production server 10 or otherwise associated with the video source 16), and long-term storage (cloud storage, such as AWS S3 or the like). Access to the long-term storage of the buffers 20 may be seamless, similar to access to short-term or local storage.
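A minimal sketch of such a tiered buffer follows, assuming dict-like back-ends for the disk and cloud tiers; the TieredBuffer name and its spill policy are illustrative only, since the disclosure specifies merely that RAM, local storage, and cloud storage (such as AWS S3) may be combined with seamless access:

```python
class TieredBuffer:
    """Records every input frame for the entire event, spilling the oldest
    frames from RAM toward slower tiers so replay access stays seamless."""
    def __init__(self, ram_capacity: int, disk_tier, cloud_tier):
        self._ram = {}                    # timestamp -> frame (newest frames)
        self._ram_capacity = ram_capacity
        self._disk = disk_tier            # dict-like wrapper over local files
        self._cloud = cloud_tier          # dict-like wrapper over object storage

    def record(self, frame) -> None:
        self._ram[frame.timestamp] = frame
        if len(self._ram) > self._ram_capacity:
            oldest = min(self._ram)       # spill the oldest frame to disk
            self._disk[oldest] = self._ram.pop(oldest)

    def replay(self, start_ts: int, end_ts: int):
        """Yield a frame range for instant replay, whichever tier holds it."""
        for ts in range(start_ts, end_ts + 1):
            frame = self._ram.get(ts)
            if frame is None:
                frame = self._disk.get(ts)
            if frame is None:
                frame = self._cloud.get(ts)
            yield frame
```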
It should now be appreciated that the present invention provides advantageous methods and apparatus for a cloud-based video production system.
Although the invention has been described in connection with various illustrated embodiments, numerous modifications and adaptations may be made thereto without departing from the spirit and scope of the invention as set forth in the claims.
This application is a continuation of commonly owned co-pending U.S. application Ser. No. 16/369,957 filed on Mar. 29, 2019, which claims the benefit of U.S. Provisional Application No. 62/652,978 filed on Apr. 5, 2018, each of which is incorporated herein and made a part hereof by reference for all purposes.