Many video game players wish to relive or share their gaming accomplishments and experiences with others in video form. For example, there are many videos of gameplay footage available at online video services highlighting impressive displays of skill or interesting glitches, as well as artistic pieces incorporating gameplay footage. The most readily available method of acquiring such footage is for a player to record imagery from their television with a video camera, a process that can suffer greatly from capture quality issues and environmental noise.
In the past, players who wanted to make high-quality replay recordings would have to obtain expensive video capture equipment to record directly from the game console output. Some games have added replay recording and playback facilities, including sharing recorded replays with other users, but this is typically accomplished by saving proprietary object position and animation data, and a copy of the game is required to view them. Other games have extended this by including a server-side component that transforms data uploaded to it from the game into a standard movie file format.
Accordingly, the present disclosure provides a system and method for producing video replay output of a gameplay sequence occurring during execution of a video game. In one embodiment, the method includes recording the gameplay sequence during gameplay, so as to yield a relatively lower quality recording. The method may also include associating a tag with an event occurring during the gameplay sequence, and receiving, using the tag, a user selection of the gameplay sequence. The method further includes, subsequent to gameplay, producing a viewable representation of relatively higher quality of the gameplay sequence via processing of the relatively lower quality recording, and may further include outputting the viewable representation as a standard-format video file.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
The systems and methods described herein may be used to produce high-quality versions of gameplay sequences that occur during execution of a video game. Initial recording may be carried out at a relatively lower quality level, in order to maintain an appropriate allocation of processing resources for execution of the video game. Users can select desired gameplay sequences, which may then be processed subsequent to gameplay to produce higher quality output, typically in a standard-format video file produced directly on the game console without need for uploading or other remote processing. In many examples, the systems and methods apply tags to denote gameplay events of interest, and the user then selects desired sequences to be processed by using the tags. The user may also specify characteristics associated with the final output, such as quality level, vantage point (e.g., camera angles), video effects, etc. Post-production tags can also be applied to facilitate organizing, sharing and other uses of the video output.
The resulting high-quality output may be saved on the video game console, a separate computer or peripheral storage device, and/or via uploading to a server. The present systems and methods have the advantage of allowing players to generate high-quality, standard-format video files of gameplay sequences with desired video effects and camera angles directly on their gaming console. The standard-format files may be viewed without needing a copy of the video game software, and may be stored, organized, shared, uploaded, etc., as desired by the user.
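By way of non-limiting illustration, the following Python sketch shows one possible way to represent the tag-and-selection flow described above. The class and function names (GameplayTag, ReplayRecording, select_sequence) and the lead-in/lead-out windowing are hypothetical and not drawn from this disclosure; the sketch simply associates tags with timestamped events in a lower quality recording and uses them to pick out sequences for later high-quality processing.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class GameplayTag:
    label: str        # e.g. "lap_record", "knockout"
    timestamp: float  # seconds into the gameplay sequence

@dataclass
class ReplayRecording:
    """Relatively lower quality recording captured during gameplay."""
    game_id: str
    frames: List[bytes] = field(default_factory=list)   # compact per-tick state, not full video
    tags: List[GameplayTag] = field(default_factory=list)

    def add_tag(self, label: str, timestamp: float) -> None:
        # Associate a tag with an event occurring during the gameplay sequence.
        self.tags.append(GameplayTag(label, timestamp))

def select_sequence(recording: ReplayRecording, label: str,
                    lead_in: float = 5.0, lead_out: float = 5.0) -> List[Tuple[float, float]]:
    """Return (start, end) windows around every tagged event matching `label`,
    ready to be queued for higher quality post-processing."""
    return [(max(0.0, t.timestamp - lead_in), t.timestamp + lead_out)
            for t in recording.tags if t.label == label]

# Example: tag an event during gameplay, then select it afterwards.
rec = ReplayRecording(game_id="race_42")
rec.add_tag("photo_finish", timestamp=312.0)
print(select_sequence(rec, "photo_finish"))  # [(307.0, 317.0)]
```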
In typical use, the gameplay sequence is recorded during gameplay, at a relatively lower quality level, from the point of view experienced by the player.
For example, in a car racing game, the entire race may be recorded as experienced during gameplay from a point of view localized to a single player-controlled car. In an alternate example, in a boxing game, a match may be recorded as experienced during gameplay from the point of view of a specific user. In both examples, after the game is completed, the replay recording may be used to produce a cinematic presentation of the replay, for example in the style of television broadcast coverage of the entire race/match. Additionally, during replay, the user may fast-forward or rewind and choose from various camera angles and/or player views. The relatively lower quality recordings may be optionally saved as stored game items, for example in memory/storage of the video game console.
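As a further non-limiting illustration, the sketch below assumes the lower quality recording stores compact per-tick game state (e.g., object positions) rather than rendered video, so that the same replay data can be re-framed from different vantage points after gameplay. The tick structure and camera functions shown are hypothetical and serve only to illustrate the idea of selecting camera angles at replay time.

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class ReplayTick:
    """One tick of compact replay state: object id -> world position."""
    time: float
    positions: Dict[str, Vec3]

def chase_camera(target: Vec3, offset: Vec3 = (0.0, 3.0, -8.0)) -> Vec3:
    # Camera held at a fixed offset behind and above the followed object.
    return (target[0] + offset[0], target[1] + offset[1], target[2] + offset[2])

def trackside_camera(target: Vec3, post: Vec3 = (50.0, 5.0, 0.0)) -> Vec3:
    # Fixed "broadcast" camera post; only the look-at target changes per tick.
    return post

def camera_path(ticks: List[ReplayTick], followed: str, mode: str) -> List[Vec3]:
    """Compute a per-tick camera position for the chosen vantage point."""
    pick = chase_camera if mode == "chase" else trackside_camera
    return [pick(t.positions[followed]) for t in ticks]

# The same replay data re-framed two different ways after the race.
ticks = [ReplayTick(0.0, {"car_1": (0.0, 0.0, 0.0)}),
         ReplayTick(1.0, {"car_1": (0.0, 0.0, 20.0)})]
print(camera_path(ticks, "car_1", "chase"))      # chase view follows the car
print(camera_path(ticks, "car_1", "broadcast"))  # fixed trackside view
```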
Returning to the example method, a tag may be associated with an event of interest occurring during the gameplay sequence, as discussed above.

At 106 of the example method, a user selection of desired gameplay sequences may be received, using the tags.

At 110 of the example method, the user may specify characteristics of the final output, such as quality level, vantage point (e.g., camera angles), video effects and other features.
After selection of gameplay sequences and video effects and features, the game console 340 may process the relatively lower quality recordings, subsequent to gameplay, to produce the higher quality output, typically without need for uploading or other remote processing.
A multi-pass processing procedure/operation may be used to generate the high-quality video output. Initiation of the procedure is indicated at 114 of the example method.
It will often be desirable that the processing culminate in a standard-format video file that can be viewed independently of the source video game (i.e., without a copy of the game). The discussion herein will often refer to the WMV (Windows Media Video) format, though it will be appreciated that other formats may be employed.
In any case, as indicated above, the video file export typically occurs in one or more passes (multi-pass processing). In one example, shown in sub-method 200, the export is performed on the game console in separate passes for video, audio and multiplexing.

At 202 of the example sub-method, the selected replay is played back and the resulting video frames are rendered and encoded as they are obtained.

WMV exporter 300 may carry out the encoding and container operations, building the WMV file 336 on cache 338.

At 204 of the example sub-method, the replay is processed again to obtain the audio, which is then encoded.

At 206 of the example sub-method, the encoded audio and video streams are multiplexed into the finalized WMV file.
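The following sketch illustrates, in simplified form, one possible reading of such a multi-pass export: a video pass, an audio pass, and a final multiplexing step. The encoder and multiplexer functions here are placeholders that merely wrap bytes; a real exporter would invoke an actual encoder and container writer, and the assignment of work to each pass is an assumption made for illustration only.

```python
from dataclasses import dataclass
from typing import Iterable, List

@dataclass
class EncodedStream:
    kind: str            # "video" or "audio"
    packets: List[bytes]

def encode_video_frame(frame: bytes) -> bytes:
    # Placeholder: a real implementation would hand the rendered frame
    # to an actual video encoder here.
    return b"V" + frame

def encode_audio_block(block: bytes) -> bytes:
    # Placeholder for a real audio encoder.
    return b"A" + block

def video_pass(rendered_frames: Iterable[bytes]) -> EncodedStream:
    # First pass: play the replay back, render each frame, encode it as obtained.
    return EncodedStream("video", [encode_video_frame(f) for f in rendered_frames])

def audio_pass(audio_blocks: Iterable[bytes]) -> EncodedStream:
    # Second pass: play the replay back again to capture the audio, then encode it.
    return EncodedStream("audio", [encode_audio_block(b) for b in audio_blocks])

def multiplex(video: EncodedStream, audio: EncodedStream) -> bytes:
    # Final pass: interleave the encoded streams into a single container.
    # A real exporter would emit proper container structures instead.
    out = bytearray(b"CONTAINER")
    for v, a in zip(video.packets, audio.packets):
        out += v + a
    return bytes(out)

# Example export over two rendered frames and two audio blocks.
container = multiplex(video_pass([b"f0", b"f1"]), audio_pass([b"a0", b"a1"]))
print(len(container))  # 21 bytes of placeholder "container" data
```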
Some information from the replay may be converted to one or more post-production tags, such as metadata tags 322, to facilitate organizing, sharing and other uses of the video output.
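As a non-limiting illustration, post-production tagging might be as simple as mapping selected replay details to key/value metadata that accompanies the exported file; the helper below and its field names are hypothetical.

```python
from typing import Dict

def build_metadata_tags(replay_info: Dict[str, str]) -> Dict[str, str]:
    """Convert selected replay details into post-production metadata tags
    that can accompany the exported video for organizing and sharing."""
    wanted = ("game_title", "player", "event", "duration")
    return {key: replay_info[key] for key in wanted if key in replay_info}

tags = build_metadata_tags({"game_title": "Speedway", "player": "P1",
                            "event": "photo_finish", "duration": "00:00:10"})
print(tags)
```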
Following finalization of the WMV file 336 on cache 338, the user may be prompted to save WMV file 336 by choosing a game save location, as also shown at 118 of the example method.
The user may also be given an option to upload the resulting WMV file 344 to a game server 346 immediately after export, for example to allow download via a web portal. Such a web portal could allow the user or others to download the WMV file 344. The file may be downloaded to any practicable location, for example to a user computer or a peripheral storage device 350, so as to be saved within the memory/storage 348 of the device. Another option could be uploading to video-sharing or social media websites (e.g., YouTube, Facebook, MySpace).
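A minimal, hypothetical sketch of such an upload is shown below using Python's standard urllib; the endpoint URL, HTTP method and absence of authentication are assumptions for illustration and do not reflect any particular service's API.

```python
import urllib.request

def upload_wmv(path: str, url: str) -> int:
    """Upload an exported video file with an HTTP PUT and return the status code.
    The endpoint URL and method are hypothetical; a real service would define
    its own API and authentication."""
    with open(path, "rb") as f:
        req = urllib.request.Request(url, data=f.read(), method="PUT")
    req.add_header("Content-Type", "video/x-ms-wmv")
    with urllib.request.urlopen(req) as resp:
        return resp.status

# Example call against a hypothetical endpoint (not executed here):
# status = upload_wmv("replay.wmv", "https://example.com/uploads/replay.wmv")
```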
Other possible implementations of the present video export system and method could store the raw video and encode it on a subsequent pass, or encode the audio immediately upon obtaining it. Still other implementations could store some or all of the intermediate streams in memory, encrypt some or all of the streams, obtain audio and video simultaneously, or do any combination of recording, encoding, and multiplexing the audio and video in one or more passes on the game console.
The present systems and methods can provide many advantages and enrich the experience of a video game. For example, a user can easily export a series of higher quality videos with different camera angles that match up end-to-end in terms of progress through the gameplay, then assemble these clips on their computer using standard video editing suites and add music and other artistic elements. A video could be easily made showing gameplay from perspectives of multiple different players. A user could make a “best-of” video highlighting his/her skill or achievements in a variety of different gaming sessions.
The systems and methods described above may be implemented by a computing system, such as computing system 400. Computing system 400 includes a processing subsystem 402 and a data-holding subsystem 404. Computing system 400 may optionally include a display subsystem 406, communication subsystem 408, and/or other components not shown.
Processing subsystem 402 may include one or more physical devices configured to execute one or more instructions. For example, the processing subsystem may be configured to execute instructions that carry out and implement the video production systems and methods described above. Furthermore, processing subsystem 402 may employ single core or multicore processors, and the programs executed thereon may be configured for parallel or distributed processing. The processing subsystem may optionally include individual components that are distributed throughout two or more devices, which may be remotely located and/or configured for coordinated processing. One or more aspects of the processing subsystem may be virtualized and executed by remotely accessible networked computing devices configured in a cloud computing configuration.
Data-holding subsystem 404 may include one or more physical, non-transitory, devices configured to hold data and/or instructions executable by the processing subsystem to implement the herein described methods and processes. When such methods and processes are implemented, the state of data-holding subsystem 404 may be transformed (e.g., to hold different data).
Data-holding subsystem 404 may include removable media and/or built-in devices. Data-holding subsystem 404 may include optical memory devices (e.g., CD, DVD, HD-DVD, Blu-Ray Disc, etc.), semiconductor memory devices (e.g., RAM, EPROM, EEPROM, etc.) and/or magnetic memory devices (e.g., hard disk drive, floppy disk drive, tape drive, MRAM, etc.), among others. Data-holding subsystem 404 may include devices with one or more of the following characteristics: volatile, nonvolatile, dynamic, static, read/write, read-only, random access, sequential access, location addressable, file addressable, and content addressable. In some embodiments, processing subsystem 402 and data-holding subsystem 404 may be integrated into one or more common devices, such as an application specific integrated circuit or a system on a chip.
The terms “module,” “program,” and “engine” may be used in connection with aspects of the described video production systems and methods. In some cases, such a module, program, or engine may be instantiated via processing subsystem 402 executing instructions held by data-holding subsystem 404. For example, WMV exporter 300 can be implemented via execution by processing subsystem 402 of instructions stored in data-holding subsystem 404. It is to be understood that different modules, programs, and/or engines may be instantiated from the same application, service, code block, object, library, routine, API, function, etc. Likewise, the same module, program, and/or engine may be instantiated by different applications, services, code blocks, objects, routines, APIs, functions, etc. The terms “module,” “program,” and “engine” are meant to encompass individual or groups of executable files, data files, libraries, drivers, scripts, database records, etc.
When included, display subsystem 406 may be used to present a visual representation of data held by data-holding subsystem 404 (e.g., video output occurring during gameplay, or the exported high-quality video output described herein). As the example methods and processes change the data held by the data-holding subsystem, and thus transform the state of the data-holding subsystem, the state of display subsystem 406 may likewise be transformed to visually represent changes in the underlying data. Display subsystem 406 may include one or more display devices utilizing virtually any type of technology. Such display devices may be combined with processing subsystem 402 and/or data-holding subsystem 404 in a shared enclosure, or such display devices may be peripheral display devices.
When included, communication subsystem 408 may be configured to communicatively couple computing system 400 with one or more other computing devices. Communication subsystem 408 may include wired and/or wireless communication devices compatible with one or more different communication protocols. As non-limiting examples, the communication subsystem may be configured for communication via a wireless telephone network, a wireless local area network, a wired local area network, a wireless wide area network, a wired wide area network, etc. In some embodiments, the communication subsystem may allow computing system 400 to send and/or receive data to and/or from other devices via a network such as the Internet.
It is to be understood that the configurations and/or approaches described herein are exemplary in nature, and that these specific embodiments or examples are not to be considered in a limiting sense, because numerous variations are possible. The specific routines or methods described herein may represent one or more of any number of processing strategies. As such, various acts illustrated may be performed in the sequence illustrated, in other sequences, in parallel, or in some cases omitted. Likewise, the order of the above-described processes may be changed.
The subject matter of the present disclosure includes all novel and nonobvious combinations and subcombinations of the various processes, systems and configurations, and other features, functions, acts, and/or properties disclosed herein, as well as any and all equivalents thereof.
This application claims priority to U.S. Provisional Patent Application No. 61/405,093, filed Oct. 20, 2010, the entirety of which is hereby incorporated herein by reference.
| | Number | Date | Country |
|---|---|---|---|
| Parent | 61405093 | Oct 2010 | US |
| Child | 13029926 | | US |