Statistically defined game channels

Information

  • Patent Grant
  • Patent Number
    11,612,816
  • Date Filed
    Monday, October 26, 2020
  • Date Issued
    Tuesday, March 28, 2023
Abstract
The subject disclosure relates to the creation of channels providing customizable video feeds in an online gaming platform. In some aspects, a process of the disclosed technology can include steps for receiving a first set of event attributes from a first user, the first set of event attributes comprising information identifying a first game title and a first game-play event associated with the first game title, monitoring play of the first game title by a second user to detect occurrence of the game-play event, and in response to a detected occurrence of the first game-play event, automatically providing a video stream of the first game title to the first user. Systems and machine-readable media are also provided.
Description
BACKGROUND
1. Technical Field

Aspects of the subject technology relate to an online gaming platform, and in particular, to a platform for facilitating user creation of customizable gaming channels for viewing competitive game-play.


2. Description of the Related Art

Like any popular competitive activity, such as football, card games, and board games, online games have a large following of fans who appreciate competitive games and highly skilled players. As with other games, such fans also enjoy structured competition amongst peers of comparable skill level. For example, by encouraging a competitive atmosphere amongst peers, fantasy sports leagues and competitions have become a widespread activity. While fantasy leagues for a variety of sports are now widely available, comparable leagues for online games are not readily available for non-professional players.


SUMMARY OF THE CLAIMED INVENTION

Embodiments of the claimed invention include methods and systems for managing an online gaming league and in particular, for enabling customizable channels for viewing competitive game-play. Such systems can include a network interface configured to receive event attributes that define information about gaming titles and game environment pre-conditions that are required to initiate video streaming of an associated game-play, for example, into a user-curated gaming channel. In some aspects, one or more processors of such systems can be configured to execute operations to perform steps for receiving a first set of event attributes from a first user, the first set of event attributes comprising information identifying a first game title and a first game-play event associated with the first game title, monitoring play of the first game title by a second user to detect occurrence of the game-play event, and automatically providing a video stream of the first game title to the first user in response to a detected occurrence of the first game-play event.





BRIEF DESCRIPTION OF THE DRAWINGS

Certain features of the subject technology are set forth in the appended claims. However, the accompanying drawings, which are included to provide further understanding, illustrate disclosed aspects and together with the description serve to explain the principles of the subject technology. In the drawings:



FIG. 1 illustrates an example of a network environment in which some aspects of the technology can be implemented.



FIG. 2 conceptually illustrates various gaming channels that can be defined based on user-specified preconditions (e.g., events or event attributes).



FIG. 3 illustrates steps of an example process for defining a game channel and automatically providing a video stream for a selected game title to a user.



FIG. 4 illustrates an example of an electronic system with which some aspects of the subject technology can be implemented.





DETAILED DESCRIPTION

The detailed description set forth below is intended as a description of various configurations of the subject technology and is not intended to represent the only configurations in which the technology can be practiced. The appended drawings are incorporated herein and constitute a part of the detailed description. The detailed description includes specific details for the purpose of providing a more thorough understanding of the technology. However, it will be clear and apparent that the technology is not limited to the specific details set forth herein and may be practiced without these details. In some instances, structures and components are shown in block diagram form in order to avoid obscuring the concepts of the subject technology.


Competitive gaming, commonly referred to as electronic sports or “eSports,” involves the play of video games in a competitive environment. Video game competitions have existed for nearly as long as video games themselves. Early competitions were set up as tournament-style matches between players focused on one particular game, often tied to a new product release. Eventually, competitive leagues and a constant stream of tournaments evolved to provide structure for the eSports community. Players now have the option of competing in a vast array of professional and amateur competitions held at physical locations or in virtual competitions conducted online. Leagues and content providers such as Major League Gaming (MLG) and Global StarCraft® League provide opportunities for competition and for finding information about competitive gaming. Until recently, participation in competitive gaming grew at a gradual and steady pace. However, competitive gaming is presently undergoing a rapid expansion in participation and interest.


Unlike fantasy gaming leagues, such as fantasy football, non-professional players of online games have limited ability to create and manage their own leagues in which players of a similar skill level can compete. Online gamers often lack access to a centralized platform that can be readily configured to perform player rankings, or that can enable users to make easy comparisons between different players or different game titles on the basis of user-defined statistical criteria. Additionally, gamers (e.g., users/players) are often spectators of games played by both professionals and non-professionals, but lack the ability to filter available gaming video feeds based on their own criteria.


The subject technology addresses the foregoing limitations by providing a gaming league platform that facilitates the creation and management of customized gaming leagues, as well as dashboards that enable players to define statistical criteria for selecting video feeds for games that they wish to watch.


In some aspects, the gaming platform can enable player-spectators to define event attributes that can be used to filter available gaming feeds. In some aspects, filtered gaming feeds can be curated to form gaming channels that are available to subscribing users/players. As used herein, event attributes can include any type of game identifying information (e.g., title names or versions), as well as player identifying information (e.g., player names or online handles), associated with competitive gameplay. Event attributes can also include statistical parameters or pre-conditions that can be used to filter which gaming feeds (video feeds) are captured and provided.


By way of example, event attribute information can be provided to the gaming platform using a dashboard. Event attribute information can identify one or more game titles, one or more game players, and/or one or more statistical parameters that are pre-conditions for video capture and streaming. As such, players can conveniently “flag” potentially interesting game events, player bouts, or other phenomena in an online gaming environment for capture and viewing. Such features can enable players to tune into potentially interesting gaming events while they are happening (in real time), or to capture interesting events for later viewing. In some aspects, players may combine feeds from multiple different games that are played either concurrently or asynchronously to create content channels based on their own pre-defined criteria. By way of example, user defined event attribute information can be used to create gaming mashups, such as to enable a user to watch a speed run versus a pitcher's duel versus a pugilistic battle, e.g., to create customized game streaming channels that would not otherwise have been available.
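
By way of illustration, the sketch below models one possible event-attribute record as plain data. The field names (game_title, players, stat_preconditions) and the dataclass layout are assumptions made for exposition; the disclosure does not specify a concrete schema.

    from dataclasses import dataclass, field

    @dataclass
    class EventAttributes:
        """Hypothetical event-attribute record a player might submit via the dashboard."""
        game_title: str                    # name/version of the game title to monitor
        players: list[str] = field(default_factory=list)  # player names or online handles; empty = any
        stat_preconditions: dict[str, float] = field(default_factory=dict)  # statistical triggers

    # A player "flags" close finishes in a hypothetical racing title for capture:
    close_race = EventAttributes(
        game_title="Example Racer v2",
        stat_preconditions={"lap_time_spread_s": 8.0},
    )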


In some aspects, user selected gaming feeds can be re-run or re-rendered, e.g., to provide the consuming user with a preferred viewing angle or other perspectives with which to select and view interesting game events.



FIG. 1 illustrates an example of network environment 100 in which some aspects of the technology can be implemented. Environment 100 includes public network 102, which can include one or more private networks, such as a local area network (LAN), a wide area network (WAN), or a network of public/private networks, such as the Internet. Public network 102 is communicatively coupled to gaming network 104, which represents a network of computing resources configured for implementing gaming league management system 112 of the subject technology.


Public network 102 and gaming network 104 provide player and developer access to league management system 112. As illustrated, players 108, 109, and 110 are permitted access to league management system 112 using respective client devices (e.g., 108A, 109A, and 110A). Although client devices 108A, 109A, and 110A are depicted as personal computing devices, it is understood that the client devices can include various types of processor-based systems, including but not limited to: game consoles, smart phones, tablet computer systems, and the like. Examples of hardware systems that can be used to implement the client devices are discussed in further detail below with respect to FIG. 4. Similarly, developers (e.g., 105, 106, and 107) are permitted access to league management system 112 via their respective computing systems (e.g., 105A, 106A, and 107A).


It is understood that a greater (or fewer) number of players and/or developers can be engaged with environment 100, without departing from the scope of the technology.


In practice, management system 112 is configured to create and support tournament-style competitions between various players and for multiple different game titles, and to provide the interfaces (dashboard) necessary to enable users/players to define parameters for game capture and playback. As such, management system 112 is configured to facilitate player competition across game genres, as well as to facilitate the selection, capture, and distribution of competitive gameplay events.


Because the metrics used to evaluate player performance for a particular game can vary widely between game titles and types, in some aspects, game developers are encouraged to provide an indication of the specific game-performance attributes that should be used to evaluate player performance for the developer's game.


By way of example, developers 105, 106, and/or 107 can be developers of different game titles, each of which is associated with a different genre. To facilitate league competition for their games, each of the developers can submit game-performance attributes to the league, e.g., using respective computing systems 105A, 106A, and 107A. In a similar manner, the creation of gaming leagues and corresponding league parameters can be managed at the player level, for example, by one or more of players 108, 109, and/or 110. That is, individual players or player collectives can provide specific game-performance attributes that they would like implemented to structure their own individualized league play.


Although game-performance attributes can include virtually any type of information that can be used in conjunction with the creation, execution, management, and/or promotion of a gaming league, in some aspects, game-performance attribute information can include game characteristics, including but not limited to one or more of: a “match type,” “event information,” and/or a player access list, etc. Game attribute information can also include game statistics, including but not limited to “high-level” player statistics such as: win/loss records, play time duration, levels achieved, points scored, and/or an Elo rating, etc. In some aspects, game statistics include “low-level” player statistics, including but not limited to: map-position information, map-path information, shots fired, time with rifle, time in kneel, time running, time to first fix, time to acquire a target (e.g., “quick-draw” metrics), and/or measures of player/avatar health, etc.
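
Of the “high-level” statistics enumerated above, the Elo rating has a well-known update rule, reproduced below as a minimal sketch. The K-factor of 32 is a common convention, not a value specified by this disclosure.

    def expected_score(rating_a: float, rating_b: float) -> float:
        """Standard Elo expected score for player A against player B."""
        return 1.0 / (1.0 + 10 ** ((rating_b - rating_a) / 400.0))

    def update_elo(rating_a: float, rating_b: float, score_a: float, k: float = 32.0) -> float:
        """Return player A's new rating; score_a is 1 for a win, 0.5 for a draw, 0 for a loss."""
        return rating_a + k * (score_a - expected_score(rating_a, rating_b))

    # Example: a 1500-rated player defeats a 1600-rated opponent.
    print(round(update_elo(1500, 1600, 1.0), 1))  # ~1520.5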


The game-performance attributes, including game characteristics and game-play statistics, can be received and recorded by league management system 112, for example, using an application programming interface (API), such as API 114. Subsequently, the game-performance attributes can be passed from API 114 to statistics module 116 for use in player analysis. In a similar manner, game-play statistics can be collected by API 114 and passed to statistics module 116.
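
The hand-off from API 114 to statistics module 116 can be pictured as in the following sketch. The text publishes no interface for API 114, so every name here is a hypothetical stand-in rather than the actual API.

    class StatisticsModule:
        """Stand-in for statistics module 116: records attribute payloads for later analysis."""
        def __init__(self) -> None:
            self.records: list[dict] = []

        def ingest(self, payload: dict) -> None:
            # A real system would validate and index per player/title here.
            self.records.append(payload)

    class LeagueAPI:
        """Stand-in for API 114: receives game-performance attributes and forwards them."""
        def __init__(self, stats: StatisticsModule) -> None:
            self.stats = stats

        def submit_attributes(self, source_id: str, attributes: dict) -> None:
            self.stats.ingest({"source": source_id, **attributes})

    api = LeagueAPI(StatisticsModule())
    api.submit_attributes("dev-105", {"match_type": "elimination", "stats": {"shots_fired": 212}})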


Depending on implementation, game-play statistics can be collected via active monitoring of an online game environment, or received from one or more external systems, for example, as part of a game performance data set that includes information describing individual player performance for one or more game titles and/or genre categories.


As discussed above, monitoring of a game environment (e.g., for a single game title or for multiple game titles) can be performed based on user-defined event attributes that indicate when game capture is to be performed, and which behaviors/events are to be recorded. In practice, capture/streaming module 120 is configured to record and/or stream those portions of gameplay that have been preselected by user-defined event attributes submitted to a dashboard provided by management system 112.
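
Functionally, the monitoring described above reduces to evaluating a user-defined predicate against a stream of game-state events and starting capture when it holds. A minimal sketch, assuming a simple dict-based event representation, follows.

    from typing import Callable, Iterable

    def monitor_gameplay(events: Iterable[dict],
                         precondition: Callable[[dict], bool],
                         start_capture: Callable[[dict], None]) -> None:
        """Scan game-state events; trigger capture whenever the user-defined precondition holds."""
        for event in events:
            if precondition(event):
                start_capture(event)

    # Example: capture whenever any player's kill count exceeds 25.
    events = [{"player": "p1", "kill_count": 10}, {"player": "p2", "kill_count": 26}]
    monitor_gameplay(events,
                     precondition=lambda e: e["kill_count"] > 25,
                     start_capture=lambda e: print(f"capturing feed for {e['player']}"))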



FIG. 2 conceptually illustrates the user creation of customized game streaming channels. In the example of FIG. 2, multiple users 206 are communicatively coupled to a gaming platform 204 via network 202. Platform 204 can be configured to provide a dashboard and monitoring systems necessary to receive user-defined statistical variables (e.g., event attributes), and to provide streaming video content of captured gaming events.


In the illustrated example, each of users 206 is subscribed to a different “channel,” for example, that is defined based on different predetermined events, e.g., as defined by respectively provided event attribute information. That is, user 206A is subscribed to Channel 1, which contains gaming feeds based on events (event attributes) A, B, and C; user 206B is subscribed to Channel 2, which contains various gaming feeds based on events A, C, and D; and user 206N is subscribed to Channel N, comprised of gaming feeds based on events X, Y, and Z.


As discussed above, the event attributes may define any characteristics relating to the gameplay environment, including but not limited to: player parameters, title parameters, and/or event parameters on which the respective gaming feeds are to be based, etc. For example, Event A may specify the selection of all gaming events played in a first person shooter game, for which each participating opponent has accumulated a point total (or “kill count”) exceeding a predetermined threshold. Event B may specify a racing game title in which competing player lap times are within 8 seconds of one another; Event C may specify all events associated with 4 specific players in the online gaming environment, and Event D may specify all gaming events associated with soccer-related games. As such, player 206A, subscribed to Channel 1, would receive streaming coverage for Events A, B, and C, whereas player 206B would receive streaming coverage of feeds from various game titles/players/environments matching the criteria of Events A, C, and D, etc. It is understood that the scope of the technology is not limited by player/subscriber count, and that each channel may be defined based on virtually any number and any type of user-defined characteristic or statistical attribute. As such, each user/player may optionally create one or more channels that are filtered by highly specific gaming events, player actions, or achievements, etc.
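
One way to model the channel logic of FIG. 2 is as a union of event filters per channel: a feed belongs to a channel if it matched any of the channel's events. The sketch below mirrors the channel/event labels of the figure; the dict-based representation is an assumption for illustration.

    # Each channel is a set of event labels; each captured feed lists the events it matched.
    channels = {"Channel 1": {"A", "B", "C"}, "Channel 2": {"A", "C", "D"}}

    def feeds_for_channel(channel: str, captured_feeds: list[dict]) -> list[dict]:
        """Return the captured feeds whose matched events intersect the channel's event set."""
        wanted = channels[channel]
        return [f for f in captured_feeds if wanted & set(f["matched_events"])]

    captured = [
        {"feed_id": "fps-match-7", "matched_events": ["A"]},  # high kill-count FPS bout
        {"feed_id": "race-42", "matched_events": ["B"]},      # close lap times
        {"feed_id": "soccer-3", "matched_events": ["D"]},     # soccer-related event
    ]
    print([f["feed_id"] for f in feeds_for_channel("Channel 2", captured)])
    # ['fps-match-7', 'soccer-3']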



FIG. 3 illustrates steps of an example process 300 for defining a game channel and automatically providing a video stream for a selected game title to a user. Process 300 begins with step 302, in which a first set of event attributes is received, for example, at a game platform dashboard. As discussed above, the first set of event attributes may be received from a first user, and can include information identifying a first game title and a first gameplay event associated with the first game title.


In step 304, gameplay of the first game title is monitored according to the first set of event attributes provided by the first user. Gameplay monitoring can be performed either for the user defining the event attributes (e.g., the first user), or for one or more other users that engage with the event title, e.g., a second user. As such, a first user can “flag” gameplay events for a specific gaming title for which later notifications or video streaming are to be provided.


In step 306, a video stream of the first game title is automatically provided to the first user in response to a detected occurrence of the first gameplay event. It is understood that the provided video stream may include clips, highlights, or notifications drawn from one or more game titles that are either played concurrently or asynchronously. That is, aspects of the technology are not limited to the monitoring of a single game environment, or a single game title, etc.
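
Tying steps 302 through 306 together, process 300 can be compressed into the sketch below. The trigger callable and the stream-provider callback are hypothetical stand-ins; the disclosure does not prescribe how a detected event is materialized into a stream.

    def process_300(event_attributes: dict, gameplay_events, provide_stream) -> None:
        """Steps 302-306: receive attributes, monitor play, stream upon a detected occurrence."""
        title = event_attributes["game_title"]       # step 302: attributes from the first user
        trigger = event_attributes["trigger"]
        for event in gameplay_events:                # step 304: monitor play of the first title
            if event.get("title") == title and trigger(event):
                provide_stream(event)                # step 306: automatically provide the stream

    process_300(
        {"game_title": "Example Racer v2",
         "trigger": lambda e: e.get("lap_time_spread_s", 99.0) <= 8.0},
        [{"title": "Example Racer v2", "lap_time_spread_s": 6.5}],
        provide_stream=lambda e: print("streaming flagged event:", e),
    )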



FIG. 4 illustrates an exemplary user device 400. User device 400 (e.g., desktop, laptop, tablet, mobile device, console gaming system) is a device that the user can utilize to facilitate carrying out features of the present invention pertaining to the viewing of third-party content.


The user device 400 may include various elements as illustrated in FIG. 4. It should be noted that the elements are exemplary and that other embodiments may incorporate more or fewer elements than those illustrated. With reference to FIG. 4, the user device 400 includes a main memory 402, a central processing unit (CPU) 404, at least one vector unit 406, a graphics processing unit 408, an input/output (I/O) processor 410, an I/O processor memory 412, a controller interface 414, a memory card 416, a Universal Serial Bus (USB) interface 418, an IEEE 1394 interface 420, and an auxiliary (AUX) interface 422 for connecting a tracking device 424, although other bus standards and interfaces may be utilized. The user device 400 further includes an operating system read-only memory (OS ROM) 426, a sound processing unit 428, an optical disc control unit 430, and a hard disc drive 432, which are connected via a bus 434 to the I/O processor 410. The user device 400 further includes at least one tracking device 424.


The tracking device 424 may be a camera, which includes eye-tracking capabilities. The camera may be integrated into or attached as a peripheral device to user device 400. In typical eye-tracking devices, infrared non-collimated light is reflected from the eye and sensed by a camera or optical sensor. The information is then analyzed to extract eye rotation from changes in reflections. Camera-based trackers focus on one or both eyes and record their movement as the viewer looks at some type of stimulus. Camera-based eye trackers use the center of the pupil and infrared light to create corneal reflections (CRs). The vector between the pupil center and the CR can be used to compute the point of regard on a surface or the gaze direction. A simple calibration procedure of the viewer is usually needed before using the eye tracker.
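
The pupil-center/corneal-reflection (PCCR) technique described above reduces to a vector difference followed by a calibrated mapping. The sketch below shows that vector step with a placeholder 2x2 linear calibration; the coefficients and coordinate conventions are illustrative assumptions, not taken from any particular device.

    import numpy as np

    def point_of_regard(pupil_center: np.ndarray, corneal_reflection: np.ndarray,
                        calib: np.ndarray) -> np.ndarray:
        """Map the pupil-center-to-CR vector to a point of regard via a linear calibration."""
        v = pupil_center - corneal_reflection   # the PCCR vector, in image coordinates
        return calib @ v                        # surface coordinates after calibration

    # Illustrative values only: a diagonal calibration learned during the viewer's calibration step.
    calib = np.array([[12.0, 0.0], [0.0, 12.0]])
    print(point_of_regard(np.array([320.0, 240.0]), np.array([315.0, 238.0]), calib))  # [60. 24.]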


Alternatively, more sensitive trackers use reflections from the front of the cornea and the back of the lens of the eye as features to track over time. Even more sensitive trackers image features from inside the eye, including retinal blood vessels, and follow these features as the eye rotates.


Most eye tracking devices use a sampling rate of at least 30 Hz, although 50/60 Hz is most common. Some tracking devices run as high as 1250 Hz, which is needed to capture the detail of very rapid eye movement.


A range camera may instead be used with the present invention to capture gestures made by the user; such a camera is also capable of facial recognition. A range camera is typically used to capture and interpret specific gestures, which allows hands-free control of an entertainment system. This technology may use an infrared projector, a camera, a depth sensor, and a microchip to track the movement of objects and individuals in three dimensions. This user device may also employ a variant of image-based three-dimensional reconstruction.


The tracking device 424 may include a microphone integrated into or attached as a peripheral device to user device 400 that captures voice data. The microphone may conduct acoustic source localization and/or ambient noise suppression.


Alternatively, tracking device 424 may be the controller of the user device 400. The controller may use a combination of built-in accelerometers and infrared detection to sense its position in 3D space when pointed at the LEDs in a sensor nearby, attached to, or integrated into the console of the entertainment system. This design allows users to control functionalities of the user device 400 with physical gestures as well as button-presses. The controller connects to the user device 400 using wireless technology that allows data exchange over short distances (e.g., 30 feet). The controller may additionally include a “rumble” feature (i.e., a shaking of the controller during certain points in the game) and/or an internal speaker.


The controller may additionally or alternatively be designed to capture biometric readings using sensors in the remote to record data including, for example, skin moisture, heart rhythm, and muscle movement.


As noted above, the user device 400 may be an electronic gaming console. Alternatively, the user device 400 may be implemented as a general-purpose computer, a set-top box, or a hand-held gaming device. Further, similar user devices may contain more or fewer operating components.


CPU 404, vector unit 406, graphics processing unit 408, and I/O processor 410 communicate via system bus 436. Further, the CPU 404 communicates with the main memory 402 via a dedicated bus 438, while the vector unit 406 and the graphics processing unit 408 may communicate through a dedicated bus 440. The CPU 404 executes programs stored in the OS ROM 426 and the main memory 402. The main memory 402 may contain pre-stored programs and programs transferred through the I/O processor 410 from a CD-ROM, DVD-ROM, or other optical disc (not shown) using the optical disc control unit 430. The I/O processor 410 primarily controls data exchanges between the various devices of the user device 400 including the CPU 404, the vector unit 406, the graphics processing unit 408, and the controller interface 414.


The graphics processing unit 408 executes graphics instructions received from the CPU 404 and the vector unit 406 to produce images for display on a display device (not shown). For example, the vector unit 406 may transform objects from three-dimensional coordinates to two-dimensional coordinates, and send the two-dimensional coordinates to the graphics processing unit 408. Furthermore, the sound processing unit 428 executes instructions to produce sound signals that are output to an audio device such as speakers (not shown).
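
In the simplest case, the three-dimensional to two-dimensional transformation attributed to the vector unit is a pinhole-style perspective projection. The sketch below is a minimal illustration of that math; the focal length and camera-space conventions are assumptions, not details of the device.

    def project(point3d: tuple[float, float, float], focal: float = 1.0) -> tuple[float, float]:
        """Pinhole-style perspective projection of a camera-space point onto the image plane."""
        x, y, z = point3d
        if z <= 0:
            raise ValueError("point must be in front of the camera (z > 0)")
        return (focal * x / z, focal * y / z)

    print(project((2.0, 1.0, 4.0)))  # (0.5, 0.25)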


A user of the user device 400 provides instructions via the controller interface 414 to the CPU 404. For example, the user may instruct the CPU 404 to store certain information on the memory card 416 or instruct the user device 400 to perform some specified action.


Other devices may be connected to the user device 400 via the USB interface 418, the IEEE 1394 interface 420, and the AUX interface 422. Specifically, a tracking device 424, including a camera or a sensor, may be connected to the user device 400 via the AUX interface 422, while a controller may be connected via the USB interface 418.


It is understood that any specific order or hierarchy of steps in the processes disclosed is an illustration of exemplary approaches. Based upon design preferences, it is understood that the specific order or hierarchy of steps in the processes may be rearranged, or that only a portion of the illustrated steps be performed. Some of the steps may be performed simultaneously. For example, in certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.


The previous description is provided to enable any person skilled in the art to practice the various aspects described herein. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects. Thus, the claims are not intended to be limited to the aspects shown herein, but are to be accorded the full scope consistent with the language of the claims, wherein reference to an element in the singular is not intended to mean “one and only one” unless specifically so stated, but rather “one or more.”


A phrase such as an “aspect” does not imply that such aspect is essential to the subject technology or that such aspect applies to all configurations of the subject technology. A disclosure relating to an aspect may apply to all configurations, or one or more configurations. A phrase such as an aspect may refer to one or more aspects and vice versa. A phrase such as a “configuration” does not imply that such configuration is essential to the subject technology or that such configuration applies to all configurations of the subject technology. A disclosure relating to a configuration may apply to all configurations, or one or more configurations. A phrase such as a configuration may refer to one or more configurations and vice versa.


The word “exemplary” is used herein to mean “serving as an example or illustration.” Any aspect or design described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other aspects or designs.

Claims
  • 1. A system for providing game play channels, the system comprising: a user device that communicates over a communication network; and a non-transitory memory coupled to one or more processors, the memory comprising instructions stored therein executable by the processors to perform operations comprising: storing a set of criteria in the memory, the set of criteria specifying statistical parameters of one or more attributes of in-stream events; evaluating a plurality of streams to identify one or more in-stream events that meet the statistical parameters specified by the set of criteria, wherein at least one of the streams is evaluated in real time; filtering the streams based on the identified in-stream events that meet the statistical parameters specified by the set of criteria, wherein a filtered set of multiple streams are identified as including the identified in-stream events that meet the set of criteria; and providing a combination stream over the communication network to the user device, the combination stream corresponding to the identified in-stream events from the filtered set of streams.
  • 2. The system of claim 1, wherein the set of criteria is defined by a user of the user device.
  • 3. The system of claim 2, wherein the user defines the set of criteria by flagging the in-stream events within the plurality of streams.
  • 4. The system of claim 1, wherein the one or more attributes identify one or more game titles, one or more game players, or one or more statistical parameters.
  • 5. The system of claim 4, wherein the one or more attributes are pre-conditions for video capture.
  • 6. The system of claim 1, further providing the combination stream over the communication network to one or more other user devices.
  • 7. The system of claim 1, wherein the combination stream is formed into a gaming channel to be provided over the communication network.
  • 8. The system of claim 7, wherein the gaming channel is provided to one or more user devices with a subscription to the gaming channel.
  • 9. The system of claim 1, wherein the combination stream corresponds to a preferred viewing angle of the identified in-stream events.
  • 10. A method for providing customized streams, the method comprising: storing a set of criteria in memory, the set of criteria specifying statistical parameters of one or more attributes of in-stream events; evaluating a plurality of streams to identify one or more in-stream events that meet the statistical parameters specified by the set of criteria, wherein at least one of the streams is evaluated in real time; filtering the streams based on the identified in-stream events that meet the statistical parameters specified by the set of criteria, wherein a filtered set of multiple streams are identified as including the identified in-stream events that meet the set of criteria; and providing a combination stream over a communication network to a user device, the combination stream corresponding to the identified in-stream events from the filtered set of streams.
  • 11. The method of claim 10, wherein the set of criteria is defined by a user of the user device.
  • 12. The method of claim 11, wherein the user defines the set of criteria by flagging the in-stream events within the plurality of streams.
  • 13. The method of claim 10, wherein the one or more attributes identify one or more game titles, one or more game players, or one or more statistical parameters.
  • 14. The method of claim 13, wherein the one or more attributes are pre-conditions for video capture.
  • 15. The method of claim 10, further providing a combination stream over the communication network to one or more other user devices.
  • 16. The method of claim 10, wherein the combination stream is formed into a gaming channel to be provided over the communication network.
  • 17. The method of claim 16, wherein the gaming channel is provided to one or more user devices with a subscription to the gaming channel.
  • 18. The method of claim 10, wherein the combination stream corresponds to a preferred viewing angle of the identified in-stream events.
  • 19. A non-transitory computer-readable storage medium, having embodied thereon a program executable by a processor to perform operations comprising: storing a set of criteria in memory, the set of criteria specifying statistical parameters of one or more attributes of in-stream events; evaluating a plurality of streams to identify one or more in-stream events that meet the statistical parameters specified by the set of criteria, wherein at least one of the streams is evaluated in real time; filtering the streams based on the identified in-stream events that meet the statistical parameters specified by the set of criteria, wherein a filtered set of multiple streams are identified as including the identified in-stream events that meet the set of criteria; and providing a combination stream over a communication network to a user device, the combination stream corresponding to the identified in-stream events from the filtered set of streams.
  • 20. The non-transitory computer-readable storage medium of claim 19, further providing a combination stream over the communication network to one or more other user devices.
CROSS-REFERENCE TO RELATED APPLICATION

The present application is a continuation and claims the priority benefit of U.S. patent application Ser. No. 15/908,635 filed Feb. 28, 2018, now U.S. Pat. No. 10,814,228, which is incorporated herein by reference in its entirety.

US Referenced Citations (179)
Number Name Date Kind
5729471 Jain et al. Mar 1998 A
5995518 Burns et al. Nov 1999 A
6415317 Yelon et al. Jul 2002 B1
6546421 Wynblatt Apr 2003 B1
6631522 Erdelyi Oct 2003 B1
6877134 Fuller et al. Apr 2005 B1
7499475 Kashiwagi et al. Mar 2009 B2
8025572 Spanton et al. Sep 2011 B2
8187104 Pearce May 2012 B2
8202167 Ackely et al. Jun 2012 B2
8645844 Strobel et al. Feb 2014 B1
8702520 Seelig et al. Apr 2014 B2
9216356 Grier Dec 2015 B2
9233305 Laakkonen et al. Jan 2016 B2
9433855 Keeker et al. Sep 2016 B1
9473825 Gossweiler et al. Oct 2016 B2
9479602 Paradise et al. Oct 2016 B1
9497424 Kosseifi et al. Nov 2016 B2
9782678 Long et al. Oct 2017 B2
9814987 Lawson Nov 2017 B1
9860477 Kummer et al. Jan 2018 B2
9871997 Kosseifi et al. Jan 2018 B2
9968856 Ninoles et al. May 2018 B1
10277813 Thomas et al. Apr 2019 B1
10471360 Trombetta Nov 2019 B2
10751623 Trombetta Aug 2020 B2
10765938 Trombetta Sep 2020 B2
10765957 Trombetta Sep 2020 B2
10792576 Trombetta Oct 2020 B2
10792577 Trombetta Oct 2020 B2
10814228 Trombetta Oct 2020 B2
10818142 Trombetta Oct 2020 B2
10953322 Trombetta Mar 2021 B2
10953335 Thielbar Mar 2021 B2
11065548 Trombetta Jul 2021 B2
11241630 Trombetta Feb 2022 B2
11426654 Trombetta Aug 2022 B2
11439918 Trombetta Sep 2022 B2
11439919 Trombetta Sep 2022 B2
11452943 Trombetta Sep 2022 B2
20020034980 Lemmons et al. Mar 2002 A1
20020107040 Crandall et al. Aug 2002 A1
20030177347 Schneier et al. Sep 2003 A1
20030220143 Shteyn et al. Nov 2003 A1
20040147306 Randall et al. Jul 2004 A1
20040189701 Badt Sep 2004 A1
20040257994 Paskett Dec 2004 A1
20040266535 Reeves Dec 2004 A1
20050020359 Ackely et al. Jan 2005 A1
20050026699 Kinzer et al. Feb 2005 A1
20060105827 Metzger et al. May 2006 A1
20060247060 Hanson et al. Nov 2006 A1
20060287096 O'Kelley et al. Dec 2006 A1
20070018952 Arseneau et al. Jan 2007 A1
20070021058 Aresneau et al. Jan 2007 A1
20070070914 Abigail Mar 2007 A1
20070117617 Spanton et al. May 2007 A1
20070117635 Spanton et al. May 2007 A1
20070149288 Nickell et al. Jun 2007 A1
20070191102 Coliz et al. Aug 2007 A1
20070248261 Zhou et al. Oct 2007 A1
20080079752 Gates et al. Apr 2008 A1
20080113815 Weingardt et al. May 2008 A1
20080119286 Brunstetter et al. May 2008 A1
20080200254 Cayce et al. Aug 2008 A1
20090036214 Dahl Feb 2009 A1
20090042646 Sarkar et al. Feb 2009 A1
20090082110 Relyea et al. Mar 2009 A1
20090144448 Smith Jun 2009 A1
20090208181 Cottrell Aug 2009 A1
20090225828 Perlman et al. Sep 2009 A1
20090271821 Zalewski Oct 2009 A1
20100099330 Digiovanni Apr 2010 A1
20100100512 Brodin et al. Apr 2010 A1
20100240443 Baerlocher et al. Sep 2010 A1
20110092282 Gary Apr 2011 A1
20110207523 Filipour et al. Aug 2011 A1
20110250939 Hobler Oct 2011 A1
20110263332 Mizrachi Oct 2011 A1
20110263333 Dokei et al. Oct 2011 A1
20110275431 Hirzel et al. Nov 2011 A1
20120093481 McDowell Apr 2012 A1
20120142411 Thompson et al. Jun 2012 A1
20120283017 Ahiska et al. Nov 2012 A1
20130002949 Raveendran et al. Jan 2013 A1
20130007013 Geisner et al. Jan 2013 A1
20130083173 Geisner et al. Apr 2013 A1
20130123019 Sullivan et al. May 2013 A1
20130231189 Beeler Sep 2013 A1
20130254680 Buhr et al. Sep 2013 A1
20130296051 Gault et al. Nov 2013 A1
20130324239 Ur et al. Dec 2013 A1
20130331192 Betti et al. Dec 2013 A1
20140004951 Kern et al. Jan 2014 A1
20140031121 Kern et al. Jan 2014 A1
20140087846 Bryan et al. Mar 2014 A1
20140087851 Low et al. Mar 2014 A1
20140113718 Norman et al. Apr 2014 A1
20140142921 Gleadall et al. May 2014 A1
20140171039 Bjontegard Jun 2014 A1
20140171182 Versaci Jun 2014 A1
20140179440 Perry Jun 2014 A1
20140228112 Laakkonen Aug 2014 A1
20140274307 Gonzalez Sep 2014 A1
20140274368 Cotter Sep 2014 A1
20140274370 Shah Sep 2014 A1
20140297408 Zabala Oct 2014 A1
20140331265 Mozell et al. Nov 2014 A1
20150005052 Harrington et al. Jan 2015 A1
20150011283 Sanford et al. Jan 2015 A1
20150018990 Shachar et al. Jan 2015 A1
20150024850 Kokami et al. Jan 2015 A1
20150113548 Stern et al. Apr 2015 A1
20150141140 Lampe May 2015 A1
20150238859 Fear Aug 2015 A1
20150248792 Abovitz et al. Sep 2015 A1
20150281029 Callahan et al. Oct 2015 A1
20150348373 Weingardt et al. Dec 2015 A1
20150375117 Thompson et al. Dec 2015 A1
20160051895 Hood Feb 2016 A1
20160193530 Parker et al. Jul 2016 A1
20160214012 Nishikawa Jul 2016 A1
20160243450 Cotter Aug 2016 A1
20160253865 Men et al. Sep 2016 A1
20160286244 Chang et al. Sep 2016 A1
20160310843 Webb Oct 2016 A1
20160365121 DeCaprio Dec 2016 A1
20160373499 Wagner et al. Dec 2016 A1
20170001111 Willette et al. Jan 2017 A1
20170001112 Gilmore et al. Jan 2017 A1
20170001122 Leung et al. Jan 2017 A1
20170003740 Verfaillie et al. Jan 2017 A1
20170003784 Garg et al. Jan 2017 A1
20170006074 Oates Jan 2017 A1
20170072324 Navok et al. Mar 2017 A1
20170113143 Marr Apr 2017 A1
20170157512 Long et al. Jun 2017 A1
20170182426 Loeb et al. Jun 2017 A1
20170209795 Harvey et al. Jul 2017 A1
20170266549 Paradise Sep 2017 A1
20170266552 Paradise et al. Sep 2017 A1
20170270751 Paradise Sep 2017 A1
20170282075 Michot et al. Oct 2017 A1
20170282082 Hubbard Oct 2017 A1
20170304724 Cotter Oct 2017 A1
20170332131 Opsenica et al. Nov 2017 A1
20170354875 Marks et al. Dec 2017 A1
20170374402 Pogorelik et al. Dec 2017 A1
20180001199 Gary Jan 2018 A1
20180077438 Hensen et al. Mar 2018 A1
20180139257 Ninoles et al. May 2018 A1
20180167656 Ortiz et al. Jun 2018 A1
20180192144 McElroy Jul 2018 A1
20180250598 Trombetta Sep 2018 A1
20180250600 Trombetta Sep 2018 A1
20180367820 Abulikemu Dec 2018 A1
20190099675 Khan et al. Apr 2019 A1
20190102941 Khan et al. Apr 2019 A1
20190118098 Payzer et al. Apr 2019 A1
20190262705 Trombetta Aug 2019 A1
20190262706 Trombetta Aug 2019 A1
20190262712 Trombetta Aug 2019 A1
20190262713 Trombetta Aug 2019 A1
20190262717 Thielbar Aug 2019 A1
20190262720 Trombetta Aug 2019 A1
20190262723 Trombetta Aug 2019 A1
20190262724 Trombetta Aug 2019 A1
20190262727 Trombetta Aug 2019 A1
20190266845 Trombetta Aug 2019 A1
20200179812 Trombetta Jun 2020 A1
20200384364 Trombetta Dec 2020 A1
20200398157 Trombetta Dec 2020 A1
20200398169 Trombetta Dec 2020 A1
20200406152 Trombetta Dec 2020 A1
20210016190 Trombetta Jan 2021 A1
20210043044 Trombetta Feb 2021 A1
20210178257 Trombetta Jun 2021 A1
20210205711 Thielbar Jul 2021 A1
20210346809 Trombetta Nov 2021 A1
Foreign Referenced Citations (60)
Number Date Country
110201395 Sep 2019 CN
110201398 Sep 2019 CN
110201399 Sep 2019 CN
110201401 Sep 2019 CN
110201402 Sep 2019 CN
110201404 Sep 2019 CN
110573221 Dec 2019 CN
111971097 Nov 2020 CN
112423854 Feb 2021 CN
112423855 Feb 2021 CN
112543669 Mar 2021 CN
3 058 996 Aug 2016 EP
3 758 814 Jan 2021 EP
3 758 816 Jan 2021 EP
3 758 818 Jan 2021 EP
3 758 819 Jan 2021 EP
3 758 821 Jan 2021 EP
3 759 930 Jan 2021 EP
3 759 934 Jan 2021 EP
3 758 820 Feb 2021 EP
2001-170360 Jun 2001 JP
2007-036830 Feb 2007 JP
2011-512172 Apr 2011 JP
2011-224204 Nov 2011 JP
2012-176127 Sep 2012 JP
2021-514748 Jun 2021 JP
2021-514749 Jun 2021 JP
2021-514750 Jun 2021 JP
2021-514751 Jun 2021 JP
2021-514752 Jun 2021 JP
2021-514753 Jun 2021 JP
2021-514754 Jun 2021 JP
2021-515485 Jun 2021 JP
2021-516087 Jul 2021 JP
2021-516088 Jul 2021 JP
10-2020-0126975 Nov 2020 KR
10-2020-0127169 Nov 2020 KR
10-2020-0127172 Nov 2020 KR
10-2020-0127173 Nov 2020 KR
10-2020-0128523 Nov 2020 KR
10-2020-0135946 Dec 2020 KR
10-2020-0136894 Dec 2020 KR
WO 2010030313 Mar 2010 WO
WO 2014109435 Jul 2014 WO
WO 2016048204 Mar 2016 WO
WO 2016201333 Dec 2016 WO
WO 2017004433 Jan 2017 WO
WO 2018004812 Jan 2018 WO
WO 2018160274 Sep 2018 WO
WO 2018165191 Sep 2018 WO
WO 2019168614 Sep 2019 WO
WO 2019168615 Sep 2019 WO
WO 2019168619 Sep 2019 WO
WO 2019168620 Sep 2019 WO
WO 2019168630 Sep 2019 WO
WO 2019168631 Sep 2019 WO
WO 2019168636 Sep 2019 WO
WO 2019168637 Sep 2019 WO
WO 2019168638 Sep 2019 WO
WO 2019168646 Sep 2019 WO
Non-Patent Literature Citations (99)
Entry
EP Application No. 19761465.4 Extended European Search Report dated Sep. 1, 2021.
EP Application No. 19761341.7 Extended European Search Report dated Aug. 17, 2021.
EP Application No. 19761523.0 Extended European Search Report dated Aug. 12, 2021.
IN Application No. 202027037125 First Examination Report dated Aug. 5, 2021.
EP Application No. 19760493.7 Extended European Search Report dated Sep. 2, 2021.
U.S. Appl. No. 15/448,356 Final Office Action dated Jul. 28, 2021.
U.S. Appl. No. 17/207,679, Christopher Thielbar, Online Tournament Integration, filed Mar. 21, 2021.
U.S. Appl. No. 17/188,544, Steven Trombetta, Scaled VR Engagement and Views in an E-Sports Event, filed Mar. 1, 2021.
EP Application No. 18763374.8, Extended European Search Report dated Dec. 14, 2020.
U.S. Appl. No. 16/681,477 Final Office Action dated Nov. 30, 2020.
NAHL ranked #1 by The Junior Hockey News_North American Hockey League_NAHL.pdf, http://nahl.com/news/story.cfm?id=15090, Jul. 16, 2015 (Year: 2015).
PCT Application No. PCT/US2018/013378 International Preliminary Report On Patentability dated Sep. 3, 2019.
PCT Application No. PCT/US2018/013378 International Search Report and Written Opinion dated Mar. 8, 2018.
PCT Application No. PCT/US2018/021197 International Preliminary Report on Patentability dated Sep. 10, 2020.
PCT Application No. PCT/US2018/021197 International Search Report and Written Opinion dated May 30, 2018.
PCT Application No. PCT/US2019/015120 International Preliminary Report on Patentability dated Sep. 1, 2020.
PCT Application No. PCT/US2019/015120 International Search Report and Written Opinion dated Apr. 15, 2019.
PCT Application No. PCT/US2019/015273 International Preliminary Report on Patentability dated Sep. 1, 2020.
PCT Application No. PCT/US2019/015273 International Search Report and Written Opinion dated Apr. 23, 2019.
PCT Application No. PCT/US2019/015124 International Preliminary Report on Patentability dated Sep. 1, 2020.
PCT Application No. PCT/US2019/015124 International Search Report and Written Opinion dated Apr. 15, 2019.
PCT Application No. PCT/US2019/015275 International Preliminary Report on Patentability dated Sep. 1, 2020.
PCT Application No. PCT/US2019/015275 International Search Report and Written Opinion dated Apr. 23, 2019.
PCT Application No. PCT/US2019/016167 International Preliminary Report on Patentability dated Sep. 1, 2020.
PCT Application No. PCT/US2019/016167 International Search Report and Written Opinion dated Aug. 26, 2019.
PCT Application No. PCT/US2019/016180 International Preliminary Report on Patentability dated Sep. 1, 2020.
PCT Application No. PCT/US2019/016180 International Search Report and Written Opinion dated Apr. 23, 2019.
PCT Application No. PCT/US2019/016686 International Preliminary Report on Patentability dated Sep. 1, 2020.
PCT Application No. PCT/US2019/016686 International Search Report and Written Opinion dated Apr. 10, 2019.
PCT Application No. PCT/US2019/016698 International Preliminary Report on Patentability dated Sep. 1, 2020.
PCT Application No. PCT/US2019/016698 International Search Report and Written Opinion dated Apr. 11, 2019.
PCT Application No. PCT/US2019/016694 International Preliminary Report on Patentability dated Sep. 1, 2020.
PCT Application No. PCT/US2019/016694 International Search Report and Written Opinion dated Apr. 15, 2019.
PCT Application No. PCT/US2019/017100 International Preliminary Report on Patentability dated Sep. 1, 2020.
PCT Application No. PCT/US2019/017100 International Search Report and Written Opinion dated Apr. 17, 2019.
U.S. Appl. No. 15/448,356 Final Office Action dated Oct. 21, 2020.
U.S. Appl. No. 15/448,356 Office Action dated May 28, 2020.
U.S. Appl. No. 15/448,356 Final Office Action dated Aug. 6, 2019.
U.S. Appl. No. 15/448,356 Office Action dated Jan. 8, 2019.
U.S. Appl. No. 15/448,356 Final Office Action dated Aug. 31, 2018.
U.S. Appl. No. 15/448,356 Office Action dated Mar. 5, 2018.
U.S. Appl. No. 15/450,602 Final Office Action dated Nov. 2, 2018.
U.S. Appl. No. 15/450,602 Office Action dated Apr. 6, 2018.
U.S. Appl. No. 16/681,477 Office Action dated Apr. 16, 2020.
U.S. Appl. No. 15/908,569 Office Action dated Mar. 27, 2020.
U.S. Appl. No. 15/908,569 Office Action dated Jun. 28, 2019.
U.S. Appl. No. 15/908,722 Office Action dated Oct. 30, 2020.
U.S. Appl. No. 15/908,722 Final Office Action dated Jun. 12, 2020.
U.S. Appl. No. 15/908,722 Office Action dated Jun. 28, 2019.
U.S. Appl. No. 15/908,635 Office Action dated Jun. 28, 2019.
U.S. Appl. No. 15/908,531 Office Action dated Jun. 28, 2019.
U.S. Appl. No. 15/908,657 Office Action dated Jun. 28, 2019.
U.S. Appl. No. 15/908,438 Office Action dated Oct. 3, 2019.
U.S. Appl. No. 15/908,345 Office Action dated Jan. 10, 2020.
U.S. Appl. No. 15/908,704 Final Office Action dated Jun. 12, 2020.
U.S. Appl. No. 15/908,704 Office Action dated Jun. 28, 2019.
U.S. Appl. No. 15/908,712 Office Action dated Aug. 8, 2019.
U.S. Appl. No. 15/908,707 Final Office Action dated Nov. 18, 2019.
U.S. Appl. No. 15/908,707 Office Action dated Jul. 17, 2019.
U.S. Appl. No. 17/014,149, Steven Trombetta, Integrating Commentary Content and Gameplay Content Over a Multi-User Platform, filed Sep. 8, 2020.
U.S. Appl. No. 17/014,182, Steven Trombetta, De-Interleaving Gameplay Data, filed Sep. 8, 2020.
U.S. Appl. No. 17/000,841, Steven Trombetta, Incentivizing Players to Engage in Competitive Gameplay, filed Aug. 24, 2020.
U.S. Appl. No. 17/015,845, Steven Trombetta, Player to Spectator Handoff and Other Spectator Controls, filed Sep. 9, 2020.
U.S. Appl. No. 17/060,458, Steven Trombetta, Discovery and Detection of Events in Interactive Content, filed Oct. 1, 2020.
U.S. Appl. No. 17/080,580, Steven Trombetta, Creation of Winner Tournaments With Fandom Influence, filed Oct. 26, 2020.
U.S. Appl. No. 15/448,356 Office Action dated Apr. 12, 2021.
U.S. Appl. No. 16/681,477 Office Action dated Mar. 19, 2021.
U.S. Appl. No. 17/380,373, Steven Trombetta, Statistical Driven Tournaments, filed Jul. 20, 2021.
Dietrich et al., Carlos; “Baseball4D: A tool for baseball game reconstruction & visualization”, 2014 IEEE Conference on Visual Analytics Science and Technology (VAST), IEEE, Oct. 25, 2014, pp. 23-32.
EP Application No. 19760732.8 Extended European Search Report dated Oct. 1, 2021.
KR Application No. 10-2019-7029332 Office Action dated Oct. 19, 2021.
EP Application No. 19760888.8 Extended European Search Report dated Oct. 27, 2021.
EP Application No. 19760016.6 Extended European Search Report dated Oct. 8, 2021.
EP Application No. 19759929.3 Extended European Search Report dated Oct. 25, 2021.
EP Application No. 19760240.2 Extended European Search Report dated Feb. 2, 2022.
EP Application No. 19760240.2 Partial Supplementary European Search Report dated Oct. 13, 2021.
EP Application No. 19760890.4 Extended European Search Report dated Nov. 10, 2021.
IN Application No. 202027037244 First Examination Report dated Dec. 15, 2021.
U.S. Appl. No. 17/014,149 Office Action dated Sep. 29, 2021.
U.S. Appl. No. 17/014,182 Office Action dated Sep. 29, 2021.
U.S. Appl. No. 17/000,841 Office Action dated Sep. 20, 2021.
U.S. Appl. No. 17/015,845 Office Action dated Oct. 6, 2021.
U.S. Appl. No. 17/060,458 Office Action dated Oct. 6, 2021.
U.S. Appl. No. 17/188,544 Office Action dated Jun. 27, 2022.
U.S. Appl. No. 17/080,580 Office Action dated May 26, 2022.
PCT Application No. PCT/US22224943 International Search Report and Written Opinion dated Jul. 28, 2022.
U.S. Appl. No. 17/207,679 Office Action dated Jul. 22, 2022.
U.S. Appl. No. 17/380,373 Office Action dated Aug. 5, 2022.
U.S. Appl. No. 17/362,416 Office Action dated Sep. 6, 2022.
Racing game video curation application “Motor Tube” starts delivering its service, 4Gamer.net [online], Nov. 17, 2016 [retrieved on: Nov. 25, 2022].
Support is announced for battle game video curation application "Kak-gei Tube", MSY [online], Jul. 14, 2016 [retrieved on: Nov. 25, 2022].
JP Application No. 2020-545251 Non-Final Notification of Reasons for Refusal dated Oct. 11, 2022.
JP Application No. 2020-545273 Non-Final Notification of Reasons for Refusal dated Oct. 18, 2022.
JP Application No. 2020-55313 Non-Final Notification of Reasons for Refusal dated Dec. 6, 2022.
JP Application No. 2020-545317 Non-Final Notification of Reasons for Refusal dated Nov. 15, 2022.
JP Application No. 2020-545263 Non-Final Notification of Reasons for Refusal dated Dec. 6, 2022.
JP Application No. 2020-545320 Notification of Reasons for Refusal dated Nov. 15, 2022.
JP Application No. 2020-545275 Non-Final Notification of Reasons for Refusal dated Oct. 18, 2022.
JP Application No. 2020-545245 Non-Final Notification of Reason(s) for Refusal dated Oct. 4, 2022.
Related Publications (1)
Number Date Country
20210052982 A1 Feb 2021 US
Continuations (1)
Number Date Country
Parent 15908635 Feb 2018 US
Child 17080551 US