The present invention relates to an architecture in which data can be streamed and streamed data can be accessed.
It is known in the art for data to be streamed from various sources, and for various other devices to access the streamed data. Thus capture devices may capture events, for example using the video camera functionality of a mobile phone, and stream the associated data. Viewing devices may receive the streamed data and view the associated video.
It is an aim of the invention to provide an improved content streaming environment.
There is provided a system for providing streaming services, the system comprising: a plurality of users each for generating a stream of an event on a connection of a public network; and a server configured to: receive a plurality of the generated streams on connections of the public network; determine content for at least one output stream in dependence on one or more of: the content received on the input streams, the content requested by a viewer, and the user profiles of the contributors; and output the at least one output stream on a connection of the public network; the system further comprising at least one user for receiving the at least one output stream on a connection of the public network.
A key feature is the provision of control, namely the determination feature. In this aspect multiple input streams go into one output stream, but under some form of control: a decision is made. This aspect puts intelligence in the middle: the output stream uses part of the collection of input streams.
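By way of illustration only, the determination step might be sketched as a selection function over the pool of input streams. The following Python sketch uses hypothetical names (InputStream, select_output_streams, a tags field standing in for stream content and a contributor_rating field standing in for the user profile); it is not the specified implementation.

```python
from dataclasses import dataclass, field

@dataclass
class InputStream:
    stream_id: str
    event_id: str
    contributor_rating: float = 0.0          # drawn from the contributor's user profile
    tags: set = field(default_factory=set)   # crude stand-in for the stream content

def select_output_streams(inputs, requested_event, requested_tags, limit=1):
    """Decide which input streams feed the output stream for a viewer.

    The decision depends on (a) the content of the input streams (their tags),
    (b) what the viewer requested, and (c) the contributor's profile.
    """
    candidates = [s for s in inputs if s.event_id == requested_event]
    candidates.sort(
        key=lambda s: (len(s.tags & requested_tags), s.contributor_rating),
        reverse=True,
    )
    return candidates[:limit]

streams = [
    InputStream("s1", "match-1", 4.5, {"goal", "home"}),
    InputStream("s2", "match-1", 3.0, {"crowd"}),
]
chosen = select_output_streams(streams, "match-1", {"goal"})
```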
The server may include a first interface connected to the public network for receiving the plurality of the generated streams, and a second interface connected to the public network for outputting the at least one output stream. The server may be associated with a private network connected to the first and second interfaces.
At least one of the plurality of users may provide a live stream on a connection of the public network, and the at least one output stream provided on the public network may include that live stream.
The at least one user may be one or more of: a content originator, a content consumer, a content requestor, a viewer, a contributor, a director, or a content responder. A director may be a client configured to control the output stream of the server. In general, all devices are users of the system, whether they are identified as a consumer or a client in parts of this description and/or parts of the drawings.
The at least one user may be one or more of a content requestor, an editor, an analyst, a syndicator, a curator, a moderator, an advertiser, or a content responder. The syndicator may identify input streams associated with certain events. The curator may choose specific input streams. The input streams may be associated with events, and the curator may choose specific events. The moderator may apply predetermined rules to the input streams.
Each user may generate a stream, each stream being allocated a timing reference. The timing reference may be relative to a master clock, an audio trigger or a visual trigger. The timing reference may be relative to a master clock, the timing reference being applied to each stream received by the server. Each stream may have one or more of a start time, a finish time and a continuity time or offset defined relative to the master clock. The timing reference may allow rights management to be applied. The timing reference may provide a time stamp on each stream generated by a user device. At least one stream may be allocated a geo-reference. The geo-reference may vary over time. The geo-reference may indicate or include a pointing direction of a camera associated with the device generating the stream. The geo-reference allocated to a stream may be updated in dependence on the device moving by a distance which is greater than a threshold. The threshold value may be set in dependence on an event. The geo-reference may be included as metadata associated with or embedded in a stream.
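A minimal sketch of the per-stream timing reference and geo-reference described above is given below; the field names, the 50 m default threshold and the distance approximation are assumptions for illustration.

```python
import math

class StreamReference:
    """Per-stream timing and geo-reference metadata (illustrative field names)."""

    def __init__(self, start_time, lat, lon, threshold_m=50.0):
        # Timing reference defined relative to a master clock.
        self.start_time = start_time      # master-clock start time
        self.finish_time = None           # set when the stream ends
        self.offset = 0.0                 # continuity time / offset
        # Geo-reference, which may vary over time.
        self.lat, self.lon = lat, lon
        self.threshold_m = threshold_m    # may be set in dependence on the event

    def maybe_update_geo(self, lat, lon):
        """Update the geo-reference only if the device moved further than the threshold."""
        if self._distance_m(lat, lon) > self.threshold_m:
            self.lat, self.lon = lat, lon
            return True
        return False

    def _distance_m(self, lat, lon):
        # Equirectangular approximation; adequate for the short distances involved.
        dlat = math.radians(lat - self.lat)
        dlon = math.radians(lon - self.lon) * math.cos(math.radians(self.lat))
        return 6_371_000 * math.hypot(dlat, dlon)
```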
The server may group a plurality of input streams into an event. An input stream may be grouped into one or more events. A stream may be grouped according to its metadata.
Each consumer may be associated with an appliance. An appliance may be one of: a website, an application running on a mobile device, an application running on a television, a server application.
A server may comprise an API (application programmable interface) and a media gateway. The API may provide an interface to metadata services, and the media gateway has an interface to the streams. The media gateway may be configured to receive the input streams and generate one or more output streams. The media gateway may be configured to store the input streams. Each input stream may be allocated to an event. If it is identified that the input stream should be associated with an existing event, it is allocated to that event. If it is identified that the input stream is not associated with any existing event, then a new event is created and the input stream associated with that new event. Each input stream is allocated to an event in accordance with its metadata. The media gateway may be configured to receive data, metadata, and data with embedded metadata. The media gateway may be configured to process streamed time dependent data or metadata. The API may be configured to process data which is not time dependent.
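The allocation of input streams to events might be sketched as follows; the matching rule (a location tag in the metadata) and the class name EventRegistry are assumptions, chosen only to illustrate the "join an existing event or create a new one" behaviour.

```python
class EventRegistry:
    """Holds the mapping of events to the input streams allocated to them."""

    def __init__(self):
        self.events = {}          # event_id -> list of stream ids

    def allocate(self, stream_id, metadata):
        """Allocate a stream to an event in accordance with its metadata."""
        event_id = metadata.get("event_hint") or metadata.get("location")
        if event_id in self.events:
            self.events[event_id].append(stream_id)   # join the existing event
        else:
            self.events[event_id] = [stream_id]       # create a new event
        return event_id

registry = EventRegistry()
registry.allocate("s1", {"location": "stadium-north"})
registry.allocate("s2", {"location": "stadium-north"})   # grouped into the same event
```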
The data may be control related. The control data is derived from metadata.
A request from a user is received by the API, and the request defines the rules which are applied to the media gateway and data streams that come into it. The rights holders may create rules that instruct the server to store content. The rights holders may assign rights to the streams.
An input stream of data and an input stream of metadata may be received by the media gateway. The media gateway may output an output stream. The media gateway may communicate with the API. The API receives control (e.g. a request) and user rules.
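A hedged sketch of this split is shown below: the API receives a rule from a user (for example a rights holder instructing the server to store content) and applies it to the media gateway; the Rule fields and class names are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Rule:
    event_id: str
    store_content: bool = False   # e.g. a rights holder instructing the server to store
    allow_relay: bool = True

class MediaGateway:
    def __init__(self):
        self.rules = {}           # event_id -> list of Rule

    def on_chunk(self, event_id, chunk):
        for rule in self.rules.get(event_id, []):
            if rule.store_content:
                pass              # persist the chunk to the media store here
        # relay the chunk onto output streams if allowed

class Api:
    """Receives control (requests, user rules) and configures the gateway."""
    def __init__(self, gateway):
        self.gateway = gateway

    def handle_request(self, user, rule):
        # Control data only; the media itself never passes through the API.
        self.gateway.rules.setdefault(rule.event_id, []).append(rule)

gateway = MediaGateway()
api = Api(gateway)
api.handle_request("rights-holder-1", Rule("match-1", store_content=True))
```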
This provides, in an aspect, a server in which the determination of the association of each stream with an event is made by the API, with the media gateway then acting on that determination.
Static metadata includes, for example, time, and is preferably generated by the originating device. Dynamic metadata includes, for example, point of view, and is likewise preferably generated by the originating device. A password is static metadata. The media gateway is preferably only interested in data that changes with time; static metadata goes to the API.
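One possible way to express this routing is sketched below; the concrete field names in STATIC_FIELDS and DYNAMIC_FIELDS are assumptions for illustration.

```python
# Static metadata is handled by the API; time-varying metadata flows with the
# media through the media gateway. The classification below is illustrative only.
STATIC_FIELDS = {"password", "owner", "title", "start_time"}
DYNAMIC_FIELDS = {"point_of_view", "camera_direction", "geo_position"}

def route_metadata(metadata):
    """Split incoming metadata into an API-bound part and a gateway-bound part."""
    to_api = {k: v for k, v in metadata.items() if k in STATIC_FIELDS}
    to_gateway = {k: v for k, v in metadata.items() if k in DYNAMIC_FIELDS}
    return to_api, to_gateway

api_part, gateway_part = route_metadata(
    {"password": "secret", "title": "Match", "point_of_view": 123.0}
)
```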
The API may be configured to authenticate and/or authorize each input stream. The API may be configured to provide functionality associated with streaming services. The functionality may be applied to the input stream received at the media gateway in dependence on the metadata associated with the input stream.
There is provided a system for streaming data comprising: a plurality of capture devices for capturing events and generating captured data streams; a server for receiving the captured data streams and for generating at least one viewing data stream in dependence thereon, the server comprising: a media gateway for receiving the captured data streams and for generating the at least one viewing data stream; and an application programmable interface for receiving control data associated with the captured data streams, wherein the at least one viewing data stream is output by the media gateway under the control of the application programmable interface in dependence on the control data, the system further comprising: at least one viewing device for receiving the viewing data stream.
A user is preferably enabled, on receipt of an input stream, to join the input stream to an event. Such a user may also be referred to as a client.
If it is identified that the input stream should be associated with an existing event, it is joined to that event. There may be more than one streamed event.
If it is identified that the input stream is not associated with an existing event, a new event is created and it is joined to that event.
A user is enabled, on determining that an invitation key or password is provided with the input stream, to join the input stream to an associated event. Such a user may also be referred to as a client.
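A hypothetical sketch of joining a stream to an event via an invitation key follows; the key-to-event table and the function name are assumptions.

```python
INVITATION_KEYS = {"bday-2016": "thomas-birthday-party"}   # illustrative key table

def join_with_key(stream_id, invitation_key, events):
    """Join the stream to the event associated with the invitation key, if any."""
    event_id = INVITATION_KEYS.get(invitation_key)
    if event_id is None:
        return None                                # unknown key: do not join
    events.setdefault(event_id, []).append(stream_id)
    return event_id

events = {}
join_with_key("s3", "bday-2016", events)
```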
The invention is now described by way of reference to the following figures, in which:
With reference to
With reference to
Each of the devices 12a, 12b, 12c is referred to as a capture device as in the described embodiments of the invention the devices capture content. However the devices are not limited to capturing content, and may have other functionality and purposes. In examples each capture device 12a, 12b, 12c may be a mobile device such as a mobile phone.
Each of the capture devices 12a, 12b, 12c may capture an image utilising a preferably integrated image capture device (such as a video camera), and may thus generate a video stream on a respective communication line 14a, 14b, 14c. The respective communication lines 14a, 14b, 14c provide inputs to the network 4, which is preferably a public network such as the Internet. The communication lines 14a, 14b, 14c are illustrated as bi-directional, to show that the capture devices 12a, 12b, 12c may receive signals as well as generate signals.
The server 2 is configured to receive inputs from the capture devices 12a, 12b, 12c as denoted by the bi-directional communication lines 6, connected between the server 2 and the network 4. In embodiments, the server 2 receives a plurality of video streams from the capture devices, as the signals on lines 14a, 14b, 14c are video streams.
The server 2 may process the video streams received from the capture devices as will be discussed further hereinbelow.
The server 2 may generate further video streams on the bi-directional communication line 6 to the network 4, which are delivered on the bi-directional communication lines 18a, 18b associated with the devices 16a, 16b respectively.
Each of the devices 16a, 16b is referred to as a viewing device as in the described embodiments of the invention the devices allow content to be viewed. However the devices are not limited to providing viewing of content, and may have other functionality and purposes. In examples each viewing device 16a, 16b may be a mobile device such as a mobile phone.
The viewing devices 16a and 16b may be associated with a display (preferably an integrated display) for viewing the video streams provided on the respective communication lines 18a, 18b.
A single device may be both a capture device and a viewing device. Thus, for example, a mobile phone device may be enabled in order to operate as both a capture device and a viewing device.
A device operating as a capture device may generate multiple video streams; for example, capture device 12a may be connected to the network 4 via multiple video streams, the multiple video streams being provided on communication line 14a.
A viewing device may be arranged in order to receive multiple video streams. Thus a viewing device such as viewing device 16a may be arranged to receive multiple video streams on communication line 18a.
A single device may be a capture device providing multiple video streams and may be a viewing device receiving multiple video streams.
Each capture device and viewing device is connected to the network 4 with a bi-directional communication link, and thus one or all of the viewing devices 16a, 16b may provide a signal to the network 4 in order to provide a feedback or control signal to the server 2. The server 2 may provide control signals to the network 4 in order to provide control signals to one or more of the capture devices 12a, 12b, 12c.
The capture devices 12a, 12b, 12c are preferably independent of each other, and are independent of the server 2. Similarly the viewing devices 16a, 16b are preferably independent of each other, and are independent of the server 2.
The capture devices 12a, 12b, 12c are shown in
The system architecture of
Each role requires various services to function and, in turn, provides new or improved services for other roles to use.
As shown in
All of those appliances benefit from an aggregated API as well as a media gateway.
With further reference to
Each of these applications 300 operates by interfacing with two platform end points: an API 302 and a media gateway 304.
The API 302 is an outward facing aggregation of each of the services wrapped with authentication, authorisation, load-balancing etc. This API manages the metadata of the platform: the users, the events—past, present and future—and the streams within those events; feedback; ratings; analysis of content for grouping and so on. The API 302 is shown as interfacing with a collection of services contributing to the aggregated API, as denoted by reference numeral 306.
The media gateway 304 manages the flow of the media itself from capture to playback and recording. Imagine the application as a water utility company: the media gateway is the pipes through which the water flows, while the API 302 is the management, planning, customer and oversight function.
The media gateway 304 is shown as communicating with adapters 308 and a WebRTC 310, with the adapters 308 also being in communication with the API 302.
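The two platform end points can be pictured as a control plane and a media plane. The sketch below is illustrative only: the class and method names (Appliance, authorise, open_stream) are assumptions and the bodies are stand-ins for the real authentication and media session set-up.

```python
class Api:
    """Metadata and control plane: users, events, streams, ratings, auth."""
    def authorise(self, user, event_id):
        return f"token-{user}-{event_id}"           # stand-in for real auth/authz

class MediaGateway:
    """Media plane: capture, playback and recording (e.g. via WebRTC or adapters)."""
    def open_stream(self, event_id, token):
        return {"event": event_id, "token": token}  # stand-in for a media session

class Appliance:
    """An application (website, mobile app, TV app) talking to both end points."""
    def __init__(self, api, media_gateway):
        self.api = api
        self.media_gateway = media_gateway

    def start_contribution(self, user, event_id):
        token = self.api.authorise(user, event_id)               # control via the API
        return self.media_gateway.open_stream(event_id, token)   # media via the gateway

session = Appliance(Api(), MediaGateway()).start_contribution("alice", "match-1")
```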
With reference to
All access to this API is conducted through an authentication and authorisation layer 402 which sits on top.
Whereas the media gateway 304 backs onto a media store (as shown in
As shown in
With reference to
The WebRTC 412 provides only the means by which media is captured and distributed.
The event coordinator 410 organises this content into logical bundles by drawing on the metadata available to it through the metadata adapter 420. It is further assisted by the moderator 406 and live edit coordinator 404 which deliver the enabling features for the director, curator and moderator personae (see
With reference to
The “unit” the platform deals with is always the event. A viewer watches an event and, within that, watches one (or more) of the contributing streams. The viewer may choose which stream from an event to watch, or a director/editor might have chosen, or might have created an output stream by editing other streams together.
Streams and events have separate metadata. An event might be ‘Manchester United vs. Chelsea’ but the streams might be ‘Goal Camera 1, Home’, ‘Goal Camera 2, Away’, etc.
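A minimal sketch of this event/stream data model, with separate metadata at each level, might look as follows; the field names are assumptions.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Stream:
    stream_id: str
    metadata: Dict[str, str]        # e.g. {"label": "Goal Camera 1, Home"}

@dataclass
class Event:
    event_id: str
    metadata: Dict[str, str]        # e.g. {"title": "Manchester United vs. Chelsea"}
    streams: List[Stream] = field(default_factory=list)

event = Event("ev1", {"title": "Manchester United vs. Chelsea"})
event.streams.append(Stream("s1", {"label": "Goal Camera 1, Home"}))
event.streams.append(Stream("s2", {"label": "Goal Camera 2, Away"}))
```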
There are separate flows for editing event/stream metadata; merging events (e.g. two people who create separate events for 'Thomas' Birthday Party' but then want to merge them into one); and marketplace content requests.
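Reusing the Event and Stream classes from the sketch above, the merge flow might be sketched as follows; the merge rule shown (pooling the streams and keeping the target event's metadata) is an assumption.

```python
def merge_events(target, source):
    """Move all contributing streams from `source` into `target`."""
    target.streams.extend(source.streams)
    source.streams.clear()
    # The source event's metadata is discarded here; a real flow might
    # instead reconcile the two sets of event metadata.
    return target

party_a = Event("ev-a", {"title": "Thomas' Birthday Party"})
party_b = Event("ev-b", {"title": "Thomas' Birthday Party"})
merged = merge_events(party_a, party_b)
```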
Each step or box on each flowchart of
All the examples and embodiments described herein may be implemented as processes in software. When implemented as processes in software, the processes (or methods) may be provided as executable code which, when run on a device having computer capability, implements a process or method as described. The executable code may be stored on a computer device, or may be stored on a memory and may be connected to or downloaded to a computer device.
The invention has been described above with reference to particular examples. The invention is not limited to any particular example set out. In particular the invention may be implemented by using parts of any example described. The invention also may be implemented by combining different parts of any described example. The scope of the invention is set forth in the appended claims.