Online entertainment (including video game) services such as Xbox® LIVE offer many features to users. One contemporary feature is to generate game video clips on behalf of users during game play.
Over time, the user-generated content clips for one user's own game play may become numerous. A gallery may be a suitable way to organize and access such clips if the number of clips is small, but is likely inadequate for most users as the number of clips increases. If friends of that user also have clips that the user may access, the available number of clips will be overwhelming for gallery-style organization and access. Still further, clips may be available from users in general, whereby the number of available clips may be on the order of millions. In such large numbers, there is no way for a user to know what videos are the most interesting and relevant at a given time.
This Summary is provided to introduce a selection of representative concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used in any way that would limit the scope of the claimed subject matter.
Briefly, one or more of various aspects of the subject matter described herein are directed towards returning relevant or otherwise desired videos to users, in which the videos were generated during game play. One or more aspects are directed towards receiving event metadata corresponding to game play for which a video is generated. Video-related metadata of the video is located, and at least some of the event metadata is associated with the video-related metadata of the video.
In one or more aspects, a video and event metadata connector is configured to associate metadata events corresponding to video game play with video content generated during that game play. A video locator queries for one or more video identifiers that match a request for video content based upon event metadata that is associated with the video content by the video and event metadata connector.
One or more aspects are directed towards receiving a request for video content that was generated during the playing of a video game, and searching for the video content based upon one or more search criteria provided via the request and event metadata that was produced during playing of the video game. One or more video identifiers of video having event metadata that matches the one or more search criteria are returned in response to the request.
Other advantages may become apparent from the following detailed description when taken in conjunction with the drawings.
The present invention is illustrated by way of example and not limitation in the accompanying figures, in which like reference numerals indicate similar elements and in which:
Various aspects of the technology described herein are generally directed towards associating metadata that is generated during game play with a video clip (or clips) of that game play; thereafter, clips may be located by a search based upon the associated metadata. In this way, for example, a clip that is relevant, desired and/or otherwise appropriate for a user may be located for the user via the metadata as needed.
It should be understood that any of the examples herein are non-limiting. For example, video other than game play, such as television recordings, movies and personal video, may benefit from the technology described herein. As such, the present invention is not limited to any particular embodiments, aspects, concepts, structures, functionalities or examples described herein. Rather, any of the embodiments, aspects, concepts, structures, functionalities or examples described herein are non-limiting, and the present invention may be used in various ways that provide benefits and advantages in computing, video games and video content in general.
In general, some contemporary titles are written so as to collect and produce metadata related to the game play, including metadata describing what users are doing, where they are in their games, what inventory or abilities they have, and so on. Described herein is connecting at least some of that metadata with the game video or videos. This allows querying of videos by the metadata, whereby, for example, developers can create experiences that showcase or otherwise use videos in contextually relevant scenarios for a user currently playing that game.
As used herein, a video may comprise one or more clips or other subsections, and/or a game for which video is generated may be broken up into multiple videos. The metadata may be used to differentiate between subsections of a larger video, between clips that make up a video, and so on. In this way, for example, a user may still be playing a game and be able to retrieve video from earlier in that game, e.g., in an almost live replay scenario, and/or from another person's video, such as a friend. Each video has its own video identifier (ID) and a user ID corresponding to the player for whom the video is generated.
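By way of a non-limiting sketch, the per-video records described above (a video ID, the user ID of the player for whom the video is generated, a time span, and associated event metadata) might be modeled as follows; all field and class names here are illustrative assumptions, not taken from any actual implementation:

```python
from dataclasses import dataclass, field

# Hypothetical records; every name below is an illustrative assumption.
@dataclass
class GameEvent:
    user_id: str
    timestamp: float            # when the event occurred during play
    event_type: str             # e.g., "kill", "section_exit"
    data: dict = field(default_factory=dict)

@dataclass
class VideoRecord:
    video_id: str               # each video has its own identifier
    user_id: str                # the player for whom the video is generated
    start_time: float
    end_time: float
    events: list = field(default_factory=list)  # associated event metadata

clip = VideoRecord("vid-1", "player-7", start_time=100.0, end_time=160.0)
```

Time-range fields on each record are what allow one larger video to be differentiated into clips or subsections by the metadata.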
As represented in
At a suitable time, such as after the video ends, the video and metadata connector 114 (or optionally a different component) writes queries to run in the metadata store 116 to find which events occurred between the video's start and end time for that user. As represented by arrows two (2) through four (4), those events and their related metadata are added to (or otherwise associated/connected with) the corresponding video's metadata by querying the video database 112. Note that not all received game-related events/metadata need be added to the video's metadata; only those relevant to video may be maintained for retrieval purposes, for example.
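The time-window association performed by the video and metadata connector 114 might be sketched as follows, using simple in-memory dictionaries as stand-ins for the metadata store 116 and video database 112 (all names and field layouts are assumptions):

```python
def associate_events(video, all_events, relevant_types=None):
    """Attach to the video the events that occurred between the video's
    start and end time for the same user, optionally keeping only the
    event types deemed relevant to video (per the filtering noted above)."""
    for ev in all_events:
        if ev["user_id"] != video["user_id"]:
            continue                      # event belongs to another user
        if not (video["start"] <= ev["timestamp"] <= video["end"]):
            continue                      # event outside the video's time span
        if relevant_types and ev["type"] not in relevant_types:
            continue                      # not relevant to video retrieval
        video["events"].append(ev)
    return video

video = {"video_id": "v1", "user_id": "u1", "start": 0, "end": 60, "events": []}
events = [
    {"user_id": "u1", "timestamp": 30, "type": "kill"},
    {"user_id": "u1", "timestamp": 90, "type": "kill"},   # after the clip ends
    {"user_id": "u2", "timestamp": 30, "type": "kill"},   # different user
]
associate_events(video, events)
```

Only the first event survives the user and time-window checks, mirroring the "events between the video's start and end time for that user" query described above.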
It should be noted that performing the association at the data collection service/warehouse level is only one implementation. It is feasible for the title or other console code to perform the association while the game is being played and the video is being generated. For example, events including metadata may be written with timestamps to a data structure that accompanies the video content data. In general, the association may occur at the user level, at the data platform level, or at some combination thereof.
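The console-side alternative, in which events with timestamps are written to a data structure that accompanies the video content data as it is generated, might be sketched as follows (a minimal in-memory illustration; the class and method names are assumptions):

```python
class VideoCapture:
    """Console-side sketch: as the video is generated, events are
    timestamped relative to the capture start and written to a structure
    that accompanies the video content data."""
    def __init__(self, video_id, start_time):
        self.video_id = video_id
        self.start_time = start_time
        self.events = []                     # accompanies the video content

    def log_event(self, timestamp, event_type, **data):
        # Store the offset into the clip rather than an absolute time.
        self.events.append({"t": timestamp - self.start_time,
                            "type": event_type, **data})

cap = VideoCapture("v1", start_time=1000.0)
cap.log_event(1005.0, "kill", weapon="sniper")
```

Under this approach the association exists at the user level as soon as capture ends, with no later join against a data platform required.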
Turning to retrieving the video based upon the associated event metadata, as represented in
Note that the user is not necessarily the same user corresponding to the video's creation. Further, the title need not be the same game. For example, a separate title (or dashboard component) may be provided for interfacing with the data platform simply to view videos, without any game being played. Such a title or component may automatically convert user requests or the like (“see my friend Joe's last ten kills”) to the actual parameters (one or more search criteria corresponding to the video's metadata) by which the corresponding video clip or clips are searched and retrieved.
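The automatic conversion of a user request such as "see my friend Joe's last ten kills" into actual search parameters might be sketched as follows (the parameter names are assumptions chosen for illustration):

```python
def to_search_criteria(friend, event_type, count):
    """Convert a request like "see my friend Joe's last ten kills" into
    search criteria over the associated video metadata."""
    return {
        "user_id": friend,          # whose videos to search
        "event_type": event_type,   # metadata event to match, e.g., "kill"
        "limit": count,             # "last ten" -> ten most recent
        "sort": "timestamp_desc",   # most recent first
    }

criteria = to_search_criteria("Joe", "kill", 10)
```

The resulting criteria are what the data platform would match against the event metadata previously associated with each video.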
The one or more requests 222 (or at least one) may be made by a companion device 226 (or multiple companion devices). This is represented in
When a request is received, the parameters corresponding to the video's metadata are provided by the requesting entity. By way of example, consider a user who asks for help while in a certain location in a game's map. The parameters may include the game character's x, y, z coordinates, by which a video or set of videos is returned. Such a video may, for example, show how an adversary was overcome by another user. Another type of help is to show the user a video (e.g., generated by another user) of surrounding scenes or options to take when the user is lost.
The video requests/parameters correspond to or may be in the form of database queries. By way of example, various queries are set forth below, and may include "Player Section Exit Queries," such as queries to "return the Halo® videos to me where my friends completed &lt;section id&gt; on &lt;difficulty level&gt;" or "return all the race game videos where my friends failed &lt;section id&gt; on &lt;difficulty level&gt;."
“Multiplayer Instance Exit Queries” may include queries derived from some request like “return all the videos of my friends who were in <my same multiplayer session>.” “Objective Updated Queries” may include queries derived from some request like “Return all the videos of my friends where they completed <ObjectiveID>”.
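An "Objective Updated" query such as "Return all the videos of my friends where they completed &lt;ObjectiveID&gt;" might be expressed as a filter over the associated event metadata; a sketch under assumed field names:

```python
def friends_completed_objective(videos, friend_ids, objective_id):
    """Return the IDs of friends' videos whose associated event metadata
    records completion of the given objective."""
    matches = []
    for v in videos:
        if v["user_id"] not in friend_ids:
            continue                          # not a friend's video
        for ev in v["events"]:
            if (ev.get("type") == "objective_updated"
                    and ev.get("objective_id") == objective_id
                    and ev.get("completed")):
                matches.append(v["video_id"])
                break                         # one matching event suffices
    return matches

videos = [
    {"video_id": "v1", "user_id": "joe",
     "events": [{"type": "objective_updated",
                 "objective_id": "obj7", "completed": True}]},
    {"video_id": "v2", "user_id": "stranger",
     "events": [{"type": "objective_updated",
                 "objective_id": "obj7", "completed": True}]},
]
hits = friends_completed_objective(videos, {"joe"}, "obj7")
```

Only the friend's video is returned, even though a stranger's video carries the same objective event.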
"Location-Based Queries" may request the system to "Sort all the public videos by distance from &lt;location X, Y, Z&gt;," for example. This allows a user to see the videos of others (or possibly the user from a previous game) who were close to the user's specified position, from the point of view of the other user (which may be different from the current user's point of view).
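Such a location-based sort might be sketched as follows, assuming each video's associated metadata carries an x, y, z location (the field names are assumptions):

```python
import math

def sort_public_videos_by_distance(videos, pos):
    """Sort the public videos by Euclidean distance of their recorded
    location metadata from the specified (x, y, z) position."""
    def dist(video):
        lx, ly, lz = video["location"]
        return math.sqrt((lx - pos[0]) ** 2
                         + (ly - pos[1]) ** 2
                         + (lz - pos[2]) ** 2)
    # Non-public videos are excluded before sorting.
    return sorted((v for v in videos if v.get("public")), key=dist)

videos = [
    {"video_id": "far", "public": True, "location": (100.0, 0.0, 0.0)},
    {"video_id": "near", "public": True, "location": (1.0, 0.0, 0.0)},
    {"video_id": "hidden", "public": False, "location": (0.0, 0.0, 0.0)},
]
ranked = sort_public_videos_by_distance(videos, (0.0, 0.0, 0.0))
```

The nearest public video sorts first; the non-public video is filtered out regardless of its distance.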
“Element-in-Common Queries” search videos from multiple different titles where there is some element in common, e.g., “Return all of the videos of my friends where they complete an objective wearing chain-mail armor.” “Just Like Me” queries generally take a significant amount of information about the player, e.g., armor, inventory, level, location and so forth, and return videos that are of other players in similar situations. These may be used for “game help” scenarios, where the user would like to see how other people who have similar attributes have completed a task he or she is attempting to complete, and/or be given help corresponding to one or more others who had a similar or close XYZ location.
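A "Just Like Me" query might be sketched as a weighted similarity score over player attributes such as armor, inventory and level; this is only one plausible scoring scheme, and all names are assumptions:

```python
def just_like_me(videos, player, weights=None):
    """Rank videos by how closely the recorded player's attributes
    (armor, level, and so forth) match the requesting player's."""
    weights = weights or {"armor": 1.0, "level": 1.0}
    def score(video):
        attrs = video["player_attrs"]
        # Sum the weights of every attribute that matches exactly.
        return sum(w for key, w in weights.items()
                   if attrs.get(key) == player.get(key))
    return sorted(videos, key=score, reverse=True)

me = {"armor": "chain-mail", "level": 12}
candidates = [
    {"video_id": "a", "player_attrs": {"armor": "plate", "level": 12}},
    {"video_id": "b", "player_attrs": {"armor": "chain-mail", "level": 12}},
]
ranked = just_like_me(candidates, me)
```

A production scheme would likely treat numeric attributes such as location as distances rather than exact matches, but exact matching suffices to illustrate the ranking idea.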
If the metadata is available, queries may be from a perspective or direction, e.g., videos looking in a certain direction, or videos of players walking west. Temporal and/or current game state may be factored into the selection of which video or videos are returned, e.g., in a quest-type game, one user may not see a video in which a treasure is shown because the user has not yet earned that level through some accomplishment, while another user may see such a video.
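The game-state gating described above might be sketched as a simple visibility filter (the `required_level` field is an assumed placeholder for whatever accomplishment the title tracks):

```python
def visible_videos(videos, player_level):
    """Filter out videos the player has not yet earned the right to see,
    e.g., a treasure reveal gated on reaching some accomplishment level."""
    return [v for v in videos
            if v.get("required_level", 0) <= player_level]

videos = [
    {"video_id": "treasure_reveal", "required_level": 5},
    {"video_id": "intro_area", "required_level": 0},
]
viewable = visible_videos(videos, player_level=3)
```

A level-3 player sees only the ungated video, while a level-5 player would see both.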
Many other usage models are feasible. For example, a user may be given an option to re-live moments in various recently played worlds.
When the video is done (or at some other suitable point, such as if a video is divided into clips and a clip is done) as evaluated at step 406, step 408 queries the metadata store, e.g., to find the events that occurred between the video's start and end time for the user's video. It is feasible to perform such filtering here as well, e.g., one component may log all events, and another component may filter only those related to video for adding to the video's metadata. Step 410 represents the adding of the events and their accompanying metadata to the corresponding video's metadata.
Step 508 represents selecting the video or videos to return, e.g., as a list of video identifiers. Depending on the query, some filtering may need to be done, for example if the query was not sufficiently narrow. Sorting and/or ranking may be done as part of the selection process. For example, “Sort all the public videos by distance” may need a sort operation after the retrieval operation. Note that a query may be reformulated, e.g., “Sort all the public videos by distance” may result in too many videos being identified via the corresponding query, and thus some distance limit may be specified in the query, e.g., retrieve the public videos by distance within ten yards of these coordinates. However, this may not result in a sufficient number, and thus a larger radius may be provided in a reformulated query until some threshold number of identified videos is met.
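The query reformulation described above, in which a distance limit is widened until some threshold number of videos is identified, might be sketched as follows (the function names, doubling strategy and cap are all assumptions):

```python
def query_with_expanding_radius(search, pos, threshold=10,
                                initial_radius=10.0, max_radius=1000.0):
    """Reformulate a distance-limited query with a growing radius until
    at least `threshold` videos are identified, or a cap is reached."""
    results = []
    radius = initial_radius
    while radius <= max_radius:
        results = search(pos, radius)   # e.g., public videos within radius
        if len(results) >= threshold:
            return results              # enough videos identified
        radius *= 2                     # widen the search and retry
    return results                      # best effort if the cap is hit

# One-dimensional stand-in: video "positions" along a single axis.
points = list(range(20))
search = lambda pos, radius: [p for p in points if abs(p - pos) <= radius]
found = query_with_expanding_radius(search, pos=0, threshold=10,
                                    initial_radius=5.0)
```

Starting at radius 5 finds too few positions, so the radius doubles to 10 and the reformulated query returns enough matches; the same loop run in the other direction could instead tighten an over-broad query.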
Step 510 sends the list of matching videos to the requester, which as set forth above may be to a different device. The list may also include some or all of the metadata (or data derived from the metadata). For example, in
Note that returned videos need not be limited to the same title from which they were generated. For example, a user may request to see his friends' videos from multiple titles where his friend did something noteworthy, such as won a race, beat a boss and so on.
The following sets forth some possible examples of events; more may be present than those shown, and not all those shown need be present. One or more implementations allow title developers to create custom events as well.
In-Game Genre Events are for titles that belong to the corresponding genre, e.g., Racing, Action &amp; Adventure, Shooter, Sports &amp; Recreation, Fighter, and so on. Any title can optionally log these events regardless of its genre if there are game mechanics that make them applicable, and the technology described herein may use them as desired.
It can be readily appreciated that the above-described implementation and its alternatives may be implemented on any suitable computing device, including a gaming system, personal computer, tablet, DVR, set-top box, smartphone and/or the like. Combinations of such devices are also feasible when multiple such devices are linked together. For purposes of description, a gaming (including media) system is described as one exemplary operating environment hereinafter.
The CPU 602, the memory controller 603, and various memory devices are interconnected via one or more buses (not shown). The details of the bus that is used in this implementation are not particularly relevant to understanding the subject matter of interest being discussed herein. However, it will be understood that such a bus may include one or more of serial and parallel buses, a memory bus, a peripheral bus, and a processor or local bus, using any of a variety of bus architectures. By way of example, such architectures can include an Industry Standard Architecture (ISA) bus, a Micro Channel Architecture (MCA) bus, an Enhanced ISA (EISA) bus, a Video Electronics Standards Association (VESA) local bus, and a Peripheral Component Interconnects (PCI) bus also known as a Mezzanine bus.
In one implementation, the CPU 602, the memory controller 603, the ROM 604, and the RAM 606 are integrated onto a common module 614. In this implementation, the ROM 604 is configured as a flash ROM that is connected to the memory controller 603 via a Peripheral Component Interconnect (PCI) bus or the like and a ROM bus or the like (neither of which are shown). The RAM 606 may be configured as multiple Double Data Rate Synchronous Dynamic RAM (DDR SDRAM) modules that are independently controlled by the memory controller 603 via separate buses (not shown). The hard disk drive 608 and the portable media drive 609 are shown connected to the memory controller 603 via the PCI bus and an AT Attachment (ATA) bus 616. However, in other implementations, dedicated data bus structures of different types can also be applied in the alternative.
A three-dimensional graphics processing unit 620 and a video encoder 622 form a video processing pipeline for high speed and high resolution (e.g., High Definition) graphics processing. Data are carried from the graphics processing unit 620 to the video encoder 622 via a digital video bus (not shown). An audio processing unit 624 and an audio codec (coder/decoder) 626 form a corresponding audio processing pipeline for multi-channel audio processing of various digital audio formats. Audio data are carried between the audio processing unit 624 and the audio codec 626 via a communication link (not shown). The video and audio processing pipelines output data to an A/V (audio/video) port 628 for transmission to a television or other display/speakers. In the illustrated implementation, the video and audio processing components 620, 622, 624, 626 and 628 are mounted on the module 614.
In the example implementation depicted in
Memory units (MUs) 650(1) and 650(2) are illustrated as being connectable to MU ports “A” 652(1) and “B” 652(2), respectively. Each MU 650 offers additional storage on which games, game parameters, and other data may be stored. In some implementations, the other data can include one or more of a digital game component, an executable gaming application, an instruction set for expanding a gaming application, and a media file. When inserted into the console 601, each MU 650 can be accessed by the memory controller 603.
A system power supply module 654 provides power to the components of the gaming system 600. A fan 656 cools the circuitry within the console 601.
An application 660 comprising machine instructions is typically stored on the hard disk drive 608. When the console 601 is powered on, various portions of the application 660 are loaded into the RAM 606, and/or the caches 610 and 612, for execution on the CPU 602. In general, the application 660 can include one or more program modules for performing various display functions, such as controlling dialog screens for presentation on a display (e.g., high definition monitor), controlling transactions based on user inputs and controlling data transmission and reception between the console 601 and externally connected devices.
The gaming system 600 may be operated as a standalone system by connecting the system to a high definition monitor, a television, a video projector, or other display device. In this standalone mode, the gaming system 600 enables one or more players to play games, or enjoy digital media, e.g., by watching movies or listening to music. However, with the integration of broadband connectivity made available through the network interface 632, the gaming system 600 may further be operated as a participating component in a larger network gaming community or system.
While the invention is susceptible to various modifications and alternative constructions, certain illustrated embodiments thereof are shown in the drawings and have been described above in detail. It should be understood, however, that there is no intention to limit the invention to the specific forms disclosed, but on the contrary, the intention is to cover all modifications, alternative constructions, and equivalents falling within the spirit and scope of the invention.