Information provided in a multimedia format has become an important feature in the interplay among individuals in a modern society. Improvements to the flow of media data in the production of a multimedia presentation enhance one's ability to interact with others.
Visual information captured by cameras can be used to enhance informational presentations or various entertainment productions. As one example, video provided by cameras can be used to provide a lecture to individuals unable to attend the physical presentation of the lecture. As another example, captured video may be provided to editing equipment, so that portions of the video can be inserted into other video presentations. Such captured video may include audio corresponding to the images collected. In other examples, audio alone can be captured in a recording and provided to individuals distant from the source, either substantially coincident with the generation of the audio or at a later date. Accordingly, inventive apparatus and methods provide, among other things, new structures and processes for capturing and processing video, audio, or combinations thereof.
Embodiments of the invention are illustrated by way of example and not limitation in the figures of the accompanying drawings in which:
The following detailed description refers to the accompanying drawings that show, by way of illustration, details and embodiments in which the invention may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice embodiments of the present invention. Other embodiments may be utilized and structural, logical, and electrical changes may be made without departing from the inventive subject matter. The various embodiments disclosed herein are not necessarily mutually exclusive, as some embodiments can be combined with one or more other embodiments to form new embodiments. The following detailed description is, therefore, not to be taken in a limiting sense.
Media production server 105 can operatively communicate with media client 110 over a network 120. Media production server 105 can also operatively communicate with media capture machine 115 over network 120. Media production server 105 may communicate with media capture machine 115 over a network different from the network over which media production server 105 communicates with media client 110. In an embodiment, network 120 is an open network. An open network herein is a network that is not limited to a private network. Network 120 may include various combinations of open networks and private networks. Network 120 may be a local area network (LAN). Network 120 may be a wide area network (WAN). Network 120 may include a WAN and a LAN. Network 120 may include a wireless network.
Media production server 305 can operatively communicate with media client 310 and media capture machines 315-1, 315-2, 315-3 . . . 315-N. A communication conveyance is not shown in
Capture application 314 includes a graphical user interface (GUI) application to provide users with the capability to initiate and complete capture of media content. The capture of media content provides a recording or a memorialization in a storage medium. Capture application 314 provides a user of media client 310 with the capability to define a media capture event. Through interaction of command tool 312 with media production server 305, capture application 314 initiates, controls, and completes capture of media content. The media content may include video, audio, text, or combinations thereof. Capture application 314 allows a user to set parameters in the capture and processing of media content of an event including metadata such as title and description of the media content. Metadata are various pieces of information from different sources that are associated with a capture of media content. Capture application 314 allows a user to choose from among a set of available workflows to be applied to the captured media content. The available workflows and associated parameters are provided from media production server 305 through queries from command tool 312.
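By way of illustration only, the media capture event defined through capture application 314, with its user-set metadata and workflow selection, might be sketched as follows. The function and field names are assumptions for illustration and do not appear in the embodiments above.

```python
def define_capture_event(title, description, workflow, sources):
    """Bundle user-supplied metadata and a workflow choice for a capture event.

    The workflow is assumed to be chosen from the set of available workflows
    that the media production server returns in response to a query.
    """
    return {
        "metadata": {"title": title, "description": description},
        "workflow": workflow,      # one of the server-provided workflows
        "sources": sources,        # e.g. cameras, screen capture, files
    }

# Hypothetical example values for a lecture-capture event.
event = define_capture_event(
    title="Physics 101, Lecture 4",
    description="Introductory mechanics",
    workflow="encode_and_publish",
    sources=["camera-1", "screen-capture"],
)
```

The metadata fields travel with the captured content so that post-processing can associate titles, descriptions, and other identifying information with the recording.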
In an embodiment, capture application 314 provides local screen capture as a content source. Capture application 314 may provide a mechanism to submit previously edited media files through direct file submission to media production server 305, which allows the use of various media editing software to submit media content, via capture application 314, to media production server 305. In an embodiment, capture application 314 is an application to generate and/or manage a podcast. A podcast episode typically refers to a digital media file, or set of digital media files, that is distributed over a network for playback. The network used for the distribution of a podcast includes, but is not limited to, the Internet. In general, digital media files typically contain audio content and/or video content, but can also include images, text, a file having Portable Document Format (PDF) as a file format, or any file type containing content to provide information or entertainment. The term podcast has also been used to mean the act to distribute (multimedia files) over the Internet for playback typically on a mobile device or a personal computer. Podcasting, typically, is an automatic mechanism whereby multimedia files are transferred from a server to a client. An entity that authors or hosts a podcast is referred to as a podcaster. In an embodiment, media production server 305 can be configured as a podcast producer server to capture video of an event, process the video content, and publish the processed video content to provide a podcast of the event.
Media application 316 provides a user of media client 310 with the capability to access multimedia content. The accessed multimedia content may be transmitted to media production server 305 via command tool 312. The accessed multimedia content may be correlated to a media capture event such that the accessed multimedia content can be partially incorporated, fully incorporated, or edited with media content in a post-processing of the media content through operation of the media production server 305. The multimedia content provided by media application 316 may include video content, audio content, text-type content, or combinations thereof.
Media capture machines 315-1, 315-2, 315-3 . . . 315-N controllable by media production server 305 include a variety of multimedia machines. The multimedia machines may include, but are not limited to, audio capture machines, video capture machines, machines to collect and manage documents, and combinations thereof. In an embodiment, media capture machines 315-1, 315-2, 315-3 . . . 315-N include cameras. Each media capture machine 315-1, 315-2, 315-3, . . . 315-N may interact with media production server 305 through media capture agents 317-1, 317-2, 317-3, . . . 317-N, respectively. A media capture agent accepts and performs commands, such as commands to start and stop a capture, to control a media capture machine. In an embodiment, capture agents 317-1, 317-2, 317-3, . . . 317-N are configured as a single agent for media capture machines 315-1, 315-2, 315-3, . . . 315-N. An agent includes a software entity, or a software entity and its associated hardware, where the software entity is a set of instructions executable by a machine, capable of acting with a certain degree of autonomy in order to accomplish tasks on behalf of its user machine. An agent configured as a software entity and its associated hardware, such as a machine-readable medium and controllers, may be referred to as an agent machine. An agent is typically defined in terms of its behavior and may execute its function based on the criteria contained within its instruction set. Agent code typically runs continuously to perform one or more activities based on its instructions without being executed directly on demand. Capabilities associated with agents include, but are not limited to, task selection, prioritization, goal-directed behavior, decision-making without human intervention, and engagement of other components through a form of communication and coordination. Agents typically analyze the context of activity in which they operate and react to it appropriately. Media capture agents 317-1, 317-2, 317-3 . . . 317-N may be incorporated in their associated media capture machines 315-1, 315-2, 315-3 . . . 315-N. Media capture agents 317-1, 317-2, 317-3 . . . 317-N may be agent machines separate from, but communicatively coupled with, their associated media capture machines 315-1, 315-2, 315-3, . . . 315-N.
Architecture 300 includes a shared file system 330. Shared file system 330 includes submissions 332, which contains media content submitted via command tool 312 in response to completion of a media capture event by one or more of the media capture machines 315-1, 315-2, 315-3 . . . 315-N. Submissions 332 may also collect metadata and associate it with the contained media content. In one example of the invention, submissions 332 may collect metadata originating from one or more possible sources, such as metadata originating with media production server 305, for example, identifying the type of content presented; metadata originating with the user, such as the title and other specific identifying information, as discussed relative to capture application 314; and metadata originating from an administrative function, which might include, for example, publishing information. Command tool 312 may submit the media content as it is being acquired by the respective media capture machines 315-1, 315-2, 315-3 . . . 315-N. Shared file system 330 includes resources 336, which are resources used to process the media content.
Shared file system 330 includes workflows 334. Workflows are encapsulated sets of instructions that are processed by media production server 305 and injected into job control system 320 to process submitted media files. Each workflow contains step-by-step instructions with variables that act as locations for replacement of data. The variables allow media production server 305 to customize execution of each workflow to the capture-specific metadata associated with the submitted media files. For example, one workflow may be selected to operate on media content from two different events. The variables allow the user to replace, for example, title and description variables with the different titles and descriptions of the media content from the two different events. Properties of media production server 305 may also be used to replace the variables. Some variables may be automatically replaced by media production server 305; for example, a variable may be replaced automatically with the absolute location at which the media content files for a specific capture event may be found within shared file system 330.
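The variable-replacement scheme described above can be sketched as a simple template substitution. This is an illustrative assumption about how such replacement might work, assuming $-style placeholders; the variable names and storage format are hypothetical.

```python
import string

# Hypothetical workflow fragment with placeholder variables.
workflow_template = string.Template(
    "title=$title\ndescription=$description\nsource=$content_path"
)

def customize_workflow(template, user_values, server_values):
    """Replace workflow variables with capture-specific metadata.

    Server-supplied values (e.g. the absolute location of the media content
    within the shared file system) are merged with user-supplied metadata;
    user values take precedence on a name collision.
    """
    values = {**server_values, **user_values}
    return template.substitute(values)

steps = customize_workflow(
    workflow_template,
    user_values={"title": "Lecture 4", "description": "Mechanics"},
    server_values={"content_path": "/shared/submissions/event-42"},
)
```

The same template can thus serve two different capture events simply by substituting different titles, descriptions, and content locations.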
Architecture 300 includes a job control system 320, where job control system 320 has a multi-task controller 325 and a number of task agents 327-1, 327-2, 327-3 . . . 327-M that are managed by multi-task controller 325. Task agents 327-1, 327-2, 327-3 . . . 327-M may be configured on respective task agent machines. Media production server 305 is operatively coupled to multi-task controller 325 to submit a job to multi-task controller 325 to work on post-processing of the media content. Multi-task controller 325 receives a job to execute, queues the job until task agents are available, and assigns the job, categorized into sections, to the appropriate task agent for each section. Task agents 327-1, 327-2, 327-3 . . . 327-M are bound to the multi-task controller that assigns the respective tasks to execute. In one example of the system, multi-task controller 325 will monitor available task agents and exclude those that are inappropriate for a specific task to be assigned. For example, multi-task controller 325 may check factors such as the operating system of the task agent, the availability of shared file system 330 to the agent, and the hardware configuration of the task agent to assure that the task agent is compatible and suitable for one or more tasks to be assigned. An example of one system and method suitable for this purpose is described in U.S. application Ser. No. 11/303,105, entitled Assigning Tasks In A Distributed System Based On Ranking, filed Dec. 16, 2005, assigned to the assignee of the present application, and hereby incorporated herein by reference for all purposes.
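The suitability check that the multi-task controller applies before assignment can be sketched as a filter over candidate agents. The field names (operating system, shared file-system access, core count) are illustrative assumptions, not the patent's data model.

```python
def suitable_agents(agents, task):
    """Exclude task agents that are inappropriate for a specific task.

    An agent qualifies only if its operating system is supported by the
    task, it can reach the shared file system, and its hardware meets a
    minimum requirement (here sketched as a core count).
    """
    return [
        a for a in agents
        if a["os"] in task["supported_os"]
        and a["has_shared_fs"]
        and a["cores"] >= task["min_cores"]
    ]

# Hypothetical agent pool: agent-2 lacks shared file-system access.
agents = [
    {"name": "agent-1", "os": "macos", "has_shared_fs": True, "cores": 8},
    {"name": "agent-2", "os": "linux", "has_shared_fs": False, "cores": 16},
]
task = {"supported_os": {"macos", "linux"}, "min_cores": 4}
eligible = suitable_agents(agents, task)
```

The controller would then queue or assign the task only among the eligible agents.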
In addition to source material archived in an archive 339 of shared file system 330, copies of the results from job control system 320 may be archived in archive 339. The results may also be published to podcast files 333 and/or stream files 337 in shared file system 330. Podcast files 333 provide an avenue to distribute the processed media content as podcasts to one or more web servers 340. Stream files 337 provide an avenue to distribute the processed media content as streaming media to one or more media streaming servers 345. Task agents 327-1, 327-2, 327-3 . . . 327-M can also post results to a weblog 350, e-mail results to a mail server 355, update a media repository 360, and post results to a media hosting service 365.
At 420, a communication channel is established between media production server 305 and a camera agent 317. In an exchange of information, media production server 305 provides media client 310 with information to establish a communication channel between the camera agent 317 for camera machine 315 and media production server 305. The information is in the form of a secret for the camera agent 317 to set up secure communication, allowing media production server 305 to control camera machine 315 for the capture of the recording event. The secret may be an encryption key for joint use by camera agent 317 and media production server 305.
At 430, the capture of the media content is launched. With the information exchange completed in a pre-capture phase, the user, through capture application 314, can launch the recording by camera machine 315 through camera agent 317. The launch can include transmitting a command to media production server 305, which controls operation of camera machine 315 over a persistent, encrypted connection to media production server 305 established by camera agent 317. The operation commands from media production server 305 include, but are not limited to, commands to start, stop, pause, resume, and cancel a recording.
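The handling of the operation commands on the agent side can be sketched as a small state machine. This is an illustrative sketch only; a real agent would receive these commands over the persistent, encrypted connection described above, and the state names here are assumptions.

```python
class CameraAgent:
    """Tracks recording state in response to server operation commands."""

    def __init__(self):
        self.state = "idle"

    def handle(self, command):
        # Each command maps to (allowed current states, next state).
        transitions = {
            "start":  ({"idle"}, "recording"),
            "pause":  ({"recording"}, "paused"),
            "resume": ({"paused"}, "recording"),
            "stop":   ({"recording", "paused"}, "idle"),
            "cancel": ({"idle", "recording", "paused"}, "idle"),
        }
        allowed, nxt = transitions.get(command, (set(), None))
        if self.state in allowed:
            self.state = nxt
        return self.state

agent = CameraAgent()
agent.handle("start")   # idle -> recording
agent.handle("pause")   # recording -> paused
```

Commands that are invalid for the current state (for example, "resume" while idle) are simply ignored in this sketch.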
At 440, the captured media content is submitted to media client 310. Upon completion of the recording of the media event, video content from camera machine 315 is submitted to media production server 305 and to submissions 332 using command tool 312, which is part of media client 310. The submission of the video content to media production server 305 and submissions 332 may be performed automatically by the command tool 312 depending on its underlying instruction set. The described submission of the video content may be directed under control of the user through capture application 314. In addition, media application 316 can provide media content that may be incorporated with the video content in post-processing of the video content to generate a completed media presentation. Such incorporation in post-processing may include copying portions of the media content and editing the video content to include the portions at various sections of the video content. The media content from media application 316 can be submitted to media production server 305 via command tool 312.
At 450, the media content is submitted to the media production server 305 via the media client 310. The submission of the video content and/or other media content from command tool 312 to media production server 305 may be realized by transmitting the video content and/or other media content to the media production server 305. The submission may be realized as a notification of the completion of the camera recording of the event and a notification of the transmission of the video content and/or other media content to the submissions 332 section of shared file system 330. The location of the video content and/or other media content in submissions 332 section may be provided to media production server 305 by command tool 312. The location of the video content and/or other media content in submissions 332 section may be provided to command tool 312 by media production server 305 in an exchange of information upon the notification of the completion of the camera recording of the event. Shared file system 330 may provide the location of the video content and/or other media content in submissions 332 section upon the reception of this material along with appropriate identifying information and in an information exchange with command tool 312 and/or media production server 305. Various permutations of the manner of exchanging information on the storage of the video content and/or other media content may be realized.
At 460, post-processing of the media content is conducted. With media production server 305 notified of a submission and/or receiving the submission, media production server 305 provides synchronization (sync) control of workflows 334 and resources 336 to generate post-processing of the media content by job control system 320. Media production server 305 submits a job to multi-task controller 325 to work on post-processing of the media content. The job is categorized into tasks by multi-task controller 325, which assigns the tasks to task agents 327-1, 327-2, 327-3 . . . 327-M. The results from task agents 327-1, 327-2, 327-3 . . . 327-M may be provided in various forms. Post-processing can be initiated by media production server 305 using workflows 334 and identified resources 336 in shared file system 330. Multi-task controller 325 may divide a submitted job into one or more tasks that may be performed substantially independently. Tasks may include combining two or more results from a set of accomplished tasks to provide other results directed by the job submission. Multi-task controller 325 assigns the different tasks to different task agents 327-1 . . . 327-M, which operate on the media content according to their respective task functions. The results from operation by task agents 327-1 . . . 327-M may be published to shared file system 330 and/or provided to external systems by posting, e-mailing, updating, or performing combinations of such transmissions.
Camera capture agent 710 has an outbound connection 713 that receives commands from podcast producer server 705. Outbound connection 713 is a control tunnel that may be realized as an outbound transmission control protocol (TCP) connection from camera capture agent 710 to podcast producer server 705. The control tunnel is provided as an outbound connection to support placing camera capture agent 710 behind network address translated (NAT) routers. Camera capture agent 710 connects to podcast producer server 705 and waits for instructions or commands. The instructions may be initiated by authorized users that connect to podcast producer server 705 using a command tool of a media client (not shown in
Podcast producer server 705 provides an upload endpoint 709 for the recordings using file transfer protocol (FTP), hypertext transfer protocol secure (HTTPS), a copy command, or another mechanism. Podcast producer server 705 includes a tunnel listener 707 to receive a signal from camera capture agent 710 to establish a connection for podcast producer server 705 to send commands to camera capture agent 710. Commands sent from podcast producer server 705 to camera capture agent 710 include commands for starting, stopping, pausing, resuming, and canceling a recording. In an embodiment, camera capture agent 710 may only be configured and run by a user supplying appropriate local administrative credentials. An administrator on a media client machine may bind camera capture agent 710 to a specific podcast producer server 705. This binding mechanism establishes a shared secret between a specific camera capture agent 710 and a specific podcast producer server 705. Podcast producer server 705 maintains a list of camera agents along with their names, universally unique identifiers (UUIDs), and a shared secret for each camera agent. The exchange of this information may occur over a secure sockets layer (SSL) connection between the camera agent machine and the podcast producer server. The arrangement depicted in
At 810, a user supplies information through use of a user interface for a capture application. The information includes local administration credentials, selection of audio/video devices, a camera agent name, the name of the podcast producer server as host, and administrative credentials for the podcast producer server. At 820, the command line of a media client is invoked. At 830, the command line writes settings to a local memory. At 840, the command line creates a shared secret and stores the shared secret to the local memory. At 850, the command line sends a bind request to the podcast producer server over SSL. At 860, the podcast producer server creates a camera agent entry in its database and stores the shared secret in its database. The stored shared secret is distinct from the SSL credentials. At 870, a capture application starts the camera agent after binding with the podcast producer server. At 880, on launch, the camera agent retrieves the podcast producer server location and the shared secret from local memory. The podcast producer server location information includes the host/port of the podcast producer server. At 890, the camera capture agent establishes a persistent, encrypted connection to the server using the shared secret. A bind process may also be implemented for other media production servers and media capture agents.
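The bind steps above can be sketched in outline. This is a minimal illustration, not the disclosed implementation: the storage dictionaries stand in for the client's local memory and the server's database, the transport over SSL is omitted, and the use of an HMAC to later prove possession of the shared secret is an assumption.

```python
import hashlib
import hmac
import secrets

local_store = {}   # stands in for the media client's local memory
server_db = {}     # stands in for the podcast producer server's database

def bind_agent(agent_name, server_host):
    """Create a shared secret, store it locally, and register it with the server."""
    shared_secret = secrets.token_hex(32)
    local_store.update({"server": server_host, "secret": shared_secret})
    # In the described flow, this registration travels over an SSL
    # connection; here the server database is written directly.
    server_db[agent_name] = {
        "uuid": secrets.token_hex(16),
        "secret": shared_secret,
    }
    return shared_secret

def authenticate(nonce):
    """On launch, prove possession of the shared secret (HMAC is an assumption)."""
    key = local_store["secret"].encode()
    return hmac.new(key, nonce, hashlib.sha256).hexdigest()

secret = bind_agent("camera-agent-1", "producer.example.com")
```

After binding, both sides hold the same secret, which the agent can use to establish the persistent, encrypted connection.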
At 1070, a job in a multi-task controller is created and associated with a job from the media production server. The multi-task controller assigns tasks to task agent machines from the submitted job and may perform such procedures as video transformations, transcoding, and auto extraction. Tasks defined in workflows and executed by task agents may include, but are not limited to, such tasks as unpack, encode, annotate, watermark, title, merge, update a media repository, send to media hosting service, mail, archive, publish, groupblog, template, and approval. Each action may be invoked by the command line. These actions can be chained together to process the media content files. The resulting processed content is published by a task agent to destinations defined in the workflow template. These destinations can include e-mail notifications, weblog postings, media hosting services, and other user-defined destinations. At 1080, the job status of the job in the multi-task controller is monitored and updated in the media production server. In an embodiment, the media production server is configured as a podcast producer server. The multi-task controller and the media production server may be structured in various embodiments according to the teaching described with respect to the previous Figures.
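The chaining of workflow actions over the media content can be sketched as a pipeline of functions, each taking content in and passing transformed content on. The action bodies here are placeholders; only the chaining pattern is illustrated, and the dictionary keys are assumptions.

```python
# Placeholder actions named after tasks listed above; real actions would
# transform the media files themselves.
def unpack(content):
    return {**content, "unpacked": True}

def encode(content):
    return {**content, "format": "h264"}

def watermark(content):
    return {**content, "watermarked": True}

def run_chain(content, actions):
    """Apply each action in order, feeding each result to the next action."""
    for action in actions:
        content = action(content)
    return content

result = run_chain({"file": "lecture.mov"}, [unpack, encode, watermark])
```

A workflow template would name the actions and their order; the processed result is then published to the destinations the template defines.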
In an embodiment, a media production server may operatively communicate with a task agent to perform a callback mechanism to provide for communications, over a network, between the task agent and a system or service that requires an authentication. The task agent is the entity that is authenticating with the external system or service. Such authentication typically requires a username and password. The callback mechanism provides a procedure for the task agent to acquire a password to authenticate with the external system or service without the task agent maintaining user names, passwords, and service-based authentication procedures. The media production server can be configured to allow a one-time use password that permits the task agent to access the media production server to have a single authentication challenge performed. Such a procedure allows the user names and passwords to be kept completely within the media production server.
The callback mechanism may include embedding a one-time use password in the job submission by the media production server. The one-time use password extracted from the job information allows the task agent to access the media production server. In accessing the media production server, the task agent presents the one-time password and the challenge from the external system or service. The media production server validates the one-time password by looking up the one-time password and the associated user name, which are administratively stored and kept secret. The callback mechanism may be considered a credential callback mechanism or a credential service. The media production server performs the calculation necessary to respond properly to the challenge from the external system or service. The determined response is provided to the task agent, which uses the response to the challenge to authenticate to the external system or service. In this arrangement, the callback from the task agent directly accesses the media production server; communicating back through the task controller is an alternative procedure. For each external system or service that requires an authentication with the task agent, the media production server may contain the various authentication methods to address the challenges from these systems or services. The media production server and the task agent may be configured according to any of the embodiments described herein.
At 1230, the password is sent to an originator of the job request along with the challenge. A task agent can send the password directly to a media production server without the password being processed by the multi-task controller that controls the task agent. At 1240, a response to the challenge is received from the originator. With a media production server having access to the authentication procedures of the services with which the task agents communicate, the media production server can calculate a response to the authentication challenge from the service. The response to the challenge can be sent directly to the task agent without processing by the multi-task controller that manages the task agent. At 1250, the response is sent to the service to complete the authentication with the service. With the authentication challenge received as a result of activity by the task agent to send results of a job request, the use of the password is related to the job request. The embedded password can be a one-time password corresponding to an identification of the job request. With a different job request, the task agent has a different one-time password with which to solicit aid from the media production server that originated the job request in addressing an authentication challenge. In various embodiments, task agents and media production servers, as taught herein, may be configured to operate with each other in authentication sessions with other services and systems.
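The credential callback can be sketched as follows. The one-time-password key, the stored credentials, and the use of an HMAC as the service's challenge-response scheme are all assumptions for illustration; the point is that the task agent holds only the one-time password from the job, never the user name or password.

```python
import hashlib
import hmac

# Kept administratively on the media production server, never on agents:
# one-time password -> (user name, password) for the external service.
credentials = {"otp-job-17": ("publisher", "s3cret")}

def server_answer_challenge(one_time_password, challenge):
    """Validate the one-time password and compute the challenge response.

    The server looks up the stored user name and password for this job's
    one-time password and performs the calculation the external service
    expects (here sketched as an HMAC over the challenge).
    """
    username, password = credentials[one_time_password]
    return hmac.new(password.encode(), challenge, hashlib.sha256).hexdigest()

# Task agent side: it forwards the challenge plus its embedded OTP, then
# relays the computed response to the external service.
response = server_answer_challenge("otp-job-17", b"service-challenge")
```

A different job would carry a different one-time password, so each callback is scoped to a single job request.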
At 1320, the soft-cutoff time in the media capture agent is updated by a fixed time period in response to receiving a status request from a media production server. The soft-cutoff time can be updated by adding a fixed time period to the soft-cutoff time to generate a new soft-cutoff time in response to receiving the status request from the media production server. In an embodiment, with the soft-cutoff set as a length of time for recording an event, the soft-cutoff time is updated by adding a fixed time period to the length of time for recording the event. A status request may be received periodically. In an embodiment, the status request is received at regular intervals.
The status request is received via a communication tunnel established between the media capture agent and the media production server such that the media production server controls the recording of the event, while the communication tunnel itself is established by the media capture agent. A current status of recording the event can be reported to the media production server.
In an embodiment, a fixed time period used for updating the soft-cutoff can be modified. The modification can be provided to the media capture agent from the media production server. In addition, during recording of the event, a level of available storage space to store recorded data can be monitored so as to stop recording the event when a threshold level in the storage space is reached. Various embodiments for media capture machines, media capture agents, media clients, and media production servers may be arranged to implement an embodiment of a sliding window recording procedure.
In an embodiment, a media production server controls a media capture agent in which a mechanism is provided such that, at the occurrence of a loss of communication between the two entities, the media production server maintains at least some control of a recording by a media capture machine corresponding to the media capture agent. Such a mechanism may be described, for example, in terms of features of the entities associated with the architectures of
Media production server 305 can, in a continual fashion, update its view of the status of media capture machine 315 by issuing status requests to media capture agent 317. The status request may inquire as to the current status of media capture machine 315 and/or its media capture agent 317. The status request is not limited to requesting status of a recording. Media capture agent 317 may reply by providing its current status, whether or not an associated media capture machine is recording, the length of time that the associated media capture machine has been recording, and other status information. Such information may be provided in a variety of formats.
In an embodiment, the status request generated by media production server 305 to media capture agent 317 can be used to provide a sliding window of time for recording an event while a media capture machine is conducting the recording. Each time that media capture agent 317 receives a status request, it bumps the window to a future time at which media capture agent 317 will stop on its own if media capture agent 317 loses communication with media production server 305. At the onset of the recording event, media production server 305 may set parameters for media capture agent 317 including a maximum time for recording and a time for the sliding window. With the occurrence of a loss of connectivity in communications from media production server 305 to media capture agent 317, the parameters set at media capture agent 317 by media production server 305 control the recording. This is accomplished by providing that the recording must stop at the maximum time limit, but that the recording may continue to a time corresponding to the sliding window, if the time corresponding to the sliding window is less than the maximum time. For example, a video recording may be set to have a maximum recording of 4 hours from the start of recording or be set to record for 1 hour once it has lost connectivity with media production server 305. The 1 hour set from loss of connectivity is restricted such that the total recording time is less than or equal to the maximum setting of 4 hours. Use of 4 hours and 1 hour is provided as an example; other maximum times and sliding windows may be used. The selection of the maximum time may depend on the capacity of a media capture machine to record or other capacities of the hardware used in the media capture event. The sliding window time may depend on the application for which the media capture machine is being used.
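The sliding-window stop logic above can be sketched with a few lines of arithmetic. Each status request pushes the soft cutoff forward by the window size, but the recording never runs past the hard cutoff; the minute-based representation is an illustrative assumption, and the constants mirror the 4-hour/1-hour example.

```python
HARD_CUTOFF = 4 * 60   # maximum recording length, in minutes
WINDOW = 60            # sliding window after the last status request

def stop_time(start, last_status_request):
    """Time at which the agent stops on its own after losing connectivity.

    The soft cutoff is the last status-request time plus the window size;
    the recording stops at the soft cutoff or the hard cutoff, whichever
    comes first.
    """
    soft_cutoff = last_status_request + WINDOW
    return min(start + HARD_CUTOFF, soft_cutoff)

# A status request 30 minutes in: if connectivity is then lost, the agent
# stops at minute 90, well before the 240-minute hard cutoff.
assert stop_time(start=0, last_status_request=30) == 90

# Near the end of the maximum time, the hard cutoff dominates.
assert stop_time(start=0, last_status_request=230) == 240
```

As long as status requests keep arriving, the soft cutoff keeps sliding forward, so the recording continues under server control up to the hard cutoff.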
Various factors may be used to determine a maximum record time. For instance, a media capture machine may have a limited amount of storage medium on which to store the recording. In addition to running out of storage space for actual media storage, an additional amount of storage space may be used in the procedure to transfer the media content from the media capture machine to the storage medium. A maximum record time may be set to avoid occurrence of a state in which the captured video could not be offloaded to a storage medium. The storage medium may be realized as any medium for electronically or optically storing data. Since the maximum time is correlated to a time at which media capture agent 317 must cut off the recording event, this maximum time is a hard-cutoff time. The hard-cutoff time may be represented as an actual time set or as a length of recording time. With the hard-cutoff time set as a length of recording time, media capture agent 317 may use data on the start of the recording and the time periods at which the recording is paused or halted to determine the actual time to stop the recording. In an embodiment, media capture agent 317 and its associated media capture machine 315 may be configured to monitor the time at which media capture machine 315 records and to collect data regarding on and off periods of time during the scheduled recording. Such a configuration allows media capture agent 317 to determine local periods of recording inactivity that may have occurred without being controlled by media production server 305.
In an embodiment, if a recording terminates at a time earlier than expected, the media capture agent can direct the automatic uploading of the captured media content to media production server 305. The upload may be transmitted in a manner as discussed with respect to submitting media content in the architecture of
In an embodiment, each time media capture agent 317 receives a status request from media production server 305, it updates its sliding recording lease. In this manner, media production server 305 initiates the update. Though media capture agent 317 receives status requests from media production server 305, it is media capture agent 317 that connects to media production server 305. With media production server 305 issuing all of the commands to media capture agent 317, the communication arrangement may be considered a tunnel: media capture agent 317 connects to media production server 305 and opens the tunnel, and media production server 305 issues requests across the tunnel back to media capture agent 317. The status requests from media production server 305 to media capture agent 317 may be issued in a periodic manner. The period for issuing the status request is maintained by media production server 305 and may be adjusted administratively at media production server 305 or through an application that uses media production server 305 to record the event. In an embodiment, the status request is sent every minute. Other polling periods may be used by media production server 305. Alternatively, media production server 305 may issue the status request on a random basis or in a manner controlled by an application initiating the control activity of media production server 305.
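The tunnel arrangement described above can be simulated in miniature: the agent opens the channel, and the server then pushes status requests back over it. This sketch is an assumption-laden simplification (an in-process queue stands in for the network tunnel, and the poll period is shortened from one minute for demonstration); none of the names come from the document.

```python
import queue
import threading
import time

POLL_SECONDS = 0.01  # stands in for the one-minute default polling period

def server(tunnel, stop):
    # The server issues all commands; here it sends periodic status requests.
    while not stop.is_set():
        tunnel.put("STATUS")
        time.sleep(POLL_SECONDS)

def agent(tunnel, stop, renewals):
    # The agent opened the tunnel and now services requests arriving on it.
    while not stop.is_set():
        try:
            request = tunnel.get(timeout=POLL_SECONDS * 5)
        except queue.Empty:
            continue
        if request == "STATUS":
            renewals.append(time.monotonic())  # slide the recording lease

tunnel = queue.Queue()   # the agent-opened channel the server talks over
stop = threading.Event()
renewals = []
threads = [threading.Thread(target=server, args=(tunnel, stop)),
           threading.Thread(target=agent, args=(tunnel, stop, renewals))]
for t in threads:
    t.start()
time.sleep(0.1)
stop.set()
for t in threads:
    t.join()
```

After the run, `renewals` holds one timestamp per serviced status request, each of which would slide the recording lease forward.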
In an embodiment, a size of the sliding window is set in media capture agent 317 that defines a soft-cutoff time. Media production server 305, configured with a default polling period such as one minute, checks with media capture agent 317 once a minute. Each time that media capture agent 317 receives a request from media production server 305, media capture agent 317 takes the time of the status request and adds to it the size of the window to generate a new stop time. The new stop time becomes the soft-cutoff time. Alternatively, the sliding window may be viewed as a soft-cutoff interval that is added to a running time. In an embodiment, one hour may be set as the default size of the sliding window. In an embodiment, media capture agent 317 may be configured to sense a loss of communication connectivity with media production server 305 and to apply the size of the sliding window to the time at which the loss of connectivity is determined.
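The soft-cutoff bookkeeping on the agent side can be sketched as a small state holder: each status request, or a detected loss of connectivity, sets the soft cutoff to the current time plus the window size. The class and method names here are illustrative assumptions, not identifiers from the document.

```python
class RecordingLease:
    """Minimal sketch of the soft-cutoff handling described above."""

    def __init__(self, window_seconds=3600):
        self.window = window_seconds   # default sliding window: one hour
        self.soft_cutoff = None

    def on_status_request(self, now):
        # Each status request slides the stop time to now + window.
        self.soft_cutoff = now + self.window
        return self.soft_cutoff

    def on_connectivity_loss(self, now):
        # Apply the window size to the time the loss is detected.
        self.soft_cutoff = now + self.window
        return self.soft_cutoff

    def should_stop(self, now):
        # The agent stops on its own once the soft cutoff passes
        # (subject, separately, to the hard-cutoff time).
        return self.soft_cutoff is not None and now >= self.soft_cutoff
```

In use, the hard cutoff would still be enforced independently, so the effective stop time is the earlier of the two cutoffs.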
System 1900 may also include a communication interface 1940 and peripheral devices 1950. Controller 1910, memory 1920, communication interface 1940, and peripheral devices 1950 can be communicatively coupled to each other by a bus 1930. Bus 1930 may include an address bus, a data bus, and a control bus, each independently configured. Alternatively, bus 1930 may use common conductive lines for providing one or more of address, data, or control, the use of which may be regulated by controller 1910. Bus 1930 may provide a set of individual, independent connections between the various components of system 1900.
Peripheral devices 1950 may include one or more displays, alphanumeric input devices, cursor controls, memories, or other control devices that may operate in conjunction with one or more of controller 1910, memory 1920, and communication interface 1940. In various embodiments, communication interface 1940 may include a connection 1947 to couple to an antenna. In various embodiments, communication interface 1940 may include a connection 1943 to couple to a transmission medium 1941. Transmission medium 1941 may be an optical fiber medium. Transmission medium 1941 may couple to a wired network. Transmission medium 1941 may be a cable, such as a coaxial cable, an unshielded twisted pair cable, or a shielded twisted pair cable. Communication interface 1940 may include one or more network interfaces to communicate over various networks.
Various embodiments or combinations of embodiments of apparatus and methods, as described herein, can be realized in hardware implementations, software implementations, and combinations of hardware and software implementations. These implementations may include a machine-readable medium having machine-executable instructions, such as a computer-readable medium having computer-executable instructions, for performing various embodiments associated with the capture and processing of media content of an event under the control of a media production server. The machine-readable medium is not limited to any one type of medium; the machine-readable medium used will depend on the application using an embodiment of a media production server.
Although specific embodiments have been illustrated and described herein, it will be appreciated by those of ordinary skill in the art that any arrangement that is calculated to achieve the same purpose may be substituted for the specific embodiments shown. It is to be understood that the above description is intended to be illustrative, and not restrictive, and that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Combinations of the above embodiments and other embodiments will be apparent to those of skill in the art upon studying the above description.
Number | Date | Country
---|---|---
60953877 | Aug 2007 | US