The present technology pertains to peer recommendations. More specifically, the present technology may provide for experience-based peer recommendations.
Social gaming experiences—whether competitive, team-based, or otherwise involving one's social network—are a popular segment of digital gameplay. Conventional social games may require a user to solicit peers to join their social gaming network or may suggest random peers with no common interest to the user. Typically, a user may want to add more peers to their social gaming network, but may not know which peers to add based on a common interest (e.g., style of gameplay, complementary skills, etc.) as such interests may not be readily available or immediately discernible.
Due to the popularity of social gaming, lack of peers or social interactions may detract from the user experience. For example, a user may want to play a multiplayer game with their peers, but the user may not know enough peers to fulfill a required minimum number of players. In another example, random selection of players for a multiplayer game may mismatch the peers to the user (e.g., peers with different skillsets, styles, etc.). Such mismatch of peers or lack of peers may discourage the user from continuing play of the multiplayer game.
There is, therefore, a need in the art for systems and methods for experience-based peer recommendations.
Embodiments of the present invention include systems and methods for providing experience-based peer recommendations in a network environment. At least one set of activity data and user generated content (UGC) may be stored in memory in a database. Such UGC may depict an activity, and the set of activity data may include data about the activity depicted by the UGC. One of the sets of activity data and an associated UGC may be received. Such set of activity data may regard an activity that a user and at least one peer have participated in simultaneously on the network environment. A user-peer indication may be identified that identifies a peer with whom the user experienced a significant event during the activity. Such significant event may be based on one or more event factors. A peer recommendation may be generated based on such user-peer indication, and the peer recommendation and at least a portion of the UGC associated with the significant event may be provided to the user.
Various embodiments may include methods for providing peer recommendations in a network environment. Such methods may include receiving a set of activity data and user generated content (UGC) associated with the set of activity data. Such set of activity data may include data about an activity that a user and at least one peer have participated in simultaneously on the network environment. Such methods may include identifying a user-peer indication that identifies a peer with whom the user experienced a significant event during the activity. Such significant event may be based on one or more event factors. Such methods may include generating a peer recommendation based on the user-peer indication and providing the peer recommendation and at least a portion of the UGC associated with the significant event to the user.
Additional embodiments may include systems for providing experience-based peer recommendations. Such systems may include memory that stores at least one set of activity data and at least one user generated content (UGC). Each of the at least one UGC may be associated with one of the sets of activity data. Each set of activity data may include data about an activity. Such systems may include a processor that executes instructions stored in memory. Execution of the instructions by the processor may receive a set of activity data regarding an activity that a user and at least one peer have participated in simultaneously on the network environment. Execution of the instructions by the processor may identify a user-peer indication that identifies a peer with whom the user experienced a significant event during the activity. Such significant event may be based on one or more event factors. Execution of the instructions by the processor may generate a peer recommendation based on the user-peer indication and may provide the peer recommendation and at least a portion of the UGC associated with the significant event to the user.
Further embodiments include non-transitory computer-readable storage media having embodied thereon a program executable by a processor to provide peer recommendations in a network environment.
The detailed description set forth below is intended as a description of various configurations of the subject technology and is not intended to represent the only configurations in which the technology can be practiced. The appended drawings are incorporated herein and constitute a part of the detailed description. The detailed description includes specific details for the purpose of providing a more thorough understanding of the technology. However, it will be clear and apparent that the technology is not limited to the specific details set forth herein and may be practiced without these details. In some instances, structures and components are shown in block diagram form in order to avoid obscuring the concepts of the subject technology.
Embodiments of the present invention include systems and methods for providing experience-based peer recommendations. A set of activity data and user generated content (UGC) associated with the set of activity data may be stored in a database. Such set of activity data may include data regarding an activity that a user and at least one peer participated in together over a network environment. The UGC may depict the activity. The set of activity data and UGC may be received by a server. A user-peer indication may be identified by the server that identifies a peer with whom the user experienced a significant event during the activity. Such significant event may be based on one or more event factors. A peer recommendation may be generated based on the user-peer indication, and the peer recommendation and at least a portion of the UGC associated with the significant event may be provided to the user.
Interactive content source servers 110 may maintain and host interactive content titles (e.g., video games, interactive books, interactive movies, etc.) available for play to a user device 130 over a communication network. Such interactive content servers 110 may be implemented in the cloud (e.g., one or more cloud servers). Each interactive content title may include one or more activities available within the content title. The one or more activities may be playable by a single user or by multiple users. In one example, the interactive content title is a video game title having different modes of competitive gameplay available within that game title. In another example, the interactive content title is another video game title having an interactive storyline for single user play. Such interactive content servers 110 may also stream and capture UGC (i.e., video, screenshots, commentary, audio, etc.) during a user's and/or peer's participation in an activity, though such content may also be captured and/or streamed by the platform servers 120.
The platform servers 120 may be responsible for communicating with the different interactive content servers 110, databases 140, and user devices 130. Such platform servers 120 may be implemented on one or more cloud servers. The interactive content servers 110 may communicate with multiple platform servers 120. The platform servers 120 may also carry out instructions, for example, for receiving an activity file 216, shown in
The interactive content titles, streaming media, and corresponding activity information may be provided through an application programming interface (API) 160, which allows various types of interactive content source servers 110 to communicate with different platform servers 120 and different user devices 130. API 160 may be specific to the particular computer programming language, operating system, protocols, etc., of the interactive content source servers 110 providing the streaming media, the platform servers 120 providing the associated activity and/or object information, and user devices 130 receiving the same. In a network environment 100 that includes multiple different types of interactive content source servers 110 (or platform servers 120 or user devices 130), there may likewise be a corresponding number of APIs 160.
The user device 130 may include a plurality of different types of computing devices. For example, the user device 130 may include any number of different gaming consoles, mobile devices, laptops, and desktops. In another example, the user device 130 may be implemented in the cloud (e.g., one or more cloud servers). Such user device 130 may also be configured to access data from other storage media, such as, but not limited to memory cards or disk drives as may be appropriate in the case of downloaded services. Such devices 130 may include standard hardware computing components such as, but not limited to network and media interfaces, non-transitory computer-readable storage (memory), and processors for executing instructions that may be stored in memory. These user devices 130 may also run using a variety of different operating systems (e.g., iOS, Android), applications or computing languages (e.g., C++, JavaScript). An exemplary user device 130 is described in detail herein with respect to
The databases 140 may be stored on the platform server 120, the interactive content source servers 110, any of the servers 218 (shown in
In the exemplary network environment 200 of
Concurrent to the content recorder 202 receiving and recording content from the interactive content title 230, a UDS library 204 receives data from the interactive content title 230, and a UDS activity recorder 206 tracks the data to determine when an activity begins and ends. The UDS library 204 and the UDS activity recorder 206 may be implemented on the platform server 120, a cloud server, or on any of the servers 218. When the UDS activity recorder 206 detects an activity beginning, the UDS activity recorder 206 receives activity data (e.g., user interaction with the activity, activity ID, activity start times, activity end times, activity results, activity types, etc.) from the UDS library 204 and records the activity data onto a UDS ring-buffer 210 (e.g., ActivityID1, START_TS; ActivityID2, START_TS; ActivityID3, START_TS). Such activity data recorded onto the UDS ring-buffer 210 may be stored in a UDS activity file 216. Such UDS activity file 216 may also include activity start times, activity end times, an activity ID, activity results, activity types (e.g., competitive match, quest, task, etc.), and user or peer data related to the activity. For example, a UDS activity file 216 may store data regarding an item used during the activity. Such UDS activity file 216 may be stored on the UDS server 226, though the UDS activity file 216 may be stored on any server, a cloud server, any console 228, or any user device 130.
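For illustration only, the recording flow above can be sketched in Python. The class and method names below (`ActivityRecorder`, `on_activity_start`, `on_activity_end`) and the field layout are hypothetical assumptions, not part of the disclosed system; the bounded deque stands in for the UDS ring-buffer 210, and the returned snapshot stands in for a UDS activity file 216.

```python
from collections import deque

class ActivityRecorder:
    """Minimal sketch of a UDS-style activity recorder backed by a ring buffer."""

    def __init__(self, capacity=3):
        # A bounded deque drops its oldest entry once capacity is reached,
        # mimicking a ring buffer such as the UDS ring-buffer 210.
        self.ring = deque(maxlen=capacity)

    def on_activity_start(self, activity_id, start_ts):
        # Record the beginning of an activity (e.g., "ActivityID1, START_TS").
        self.ring.append({"activity_id": activity_id, "start_ts": start_ts})

    def on_activity_end(self, activity_id, end_ts, results=None):
        # Locate the open entry and finalize it into an activity-file record.
        for entry in self.ring:
            if entry["activity_id"] == activity_id and "end_ts" not in entry:
                entry["end_ts"] = end_ts
                entry["results"] = results
                return dict(entry)  # snapshot suitable for an activity file
        return None

rec = ActivityRecorder(capacity=3)
rec.on_activity_start("ActivityID1", 100)
file_216 = rec.on_activity_end("ActivityID1", 250, results={"won": True})
```

The same recorder could track several overlapping activities, since each entry is keyed by its activity ID.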
Such UDS activity data (e.g., the UDS activity file 216) may be associated with the content data (e.g., the media file 212 and/or the content time stamp file 214) by the console 228 or any of the servers 218. In one example, the UGC server 232 stores and associates the content time stamp file 214 with the UDS activity file 216 based on a match between the streaming ID or GSS ID of the content time stamp file 214 and a corresponding activity ID of the UDS activity file 216. In another example, the UDS server 226 may store the UDS activity file 216 and may receive a query from the UGC server 232 for a UDS activity file 216. Such query may be executed by searching for an activity ID of a UDS activity file 216 that matches a streaming ID or GSS ID of a content time stamp file 214 transmitted with the query. In yet another example, a query of stored content time stamp files 214 may be executed by matching a start time and end time of a content time stamp file 214 with a start time and end time of a corresponding UDS activity file 216 transmitted with the query. Such UDS activity file 216 may also be associated with the matched content time stamp file 214 by the UGC server 232, though the association may be performed by any server, a cloud server, any console 228, or any user device 130. In another example, a UDS activity file 216 and a content time stamp file 214 may be associated by the console 228 during creation of each file 216, 214.
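The association logic in the examples above (a primary match on streaming/GSS ID against activity ID, with matching start and end times as a fallback) can be illustrated with a minimal Python sketch. The field names (`stream_id`, `activity_id`, `start`, `end`) are hypothetical stand-ins for the identifiers in the content time stamp file 214 and the UDS activity file 216.

```python
def associate(content_files, activity_files):
    """Pair each content time stamp file with a matching activity file.

    Primary key: the streaming/GSS ID equals the activity ID.
    Fallback: the start and end times of the two files match.
    """
    pairs = []
    for c in content_files:
        match = next(
            (a for a in activity_files
             if a["activity_id"] == c.get("stream_id")
             or (a["start"] == c["start"] and a["end"] == c["end"])),
            None,
        )
        if match is not None:
            pairs.append((c, match))
    return pairs

content_files = [
    {"stream_id": "GSS-1", "start": 0, "end": 90},
    {"stream_id": "GSS-9", "start": 300, "end": 360},
]
activity_files = [
    {"activity_id": "GSS-1", "start": 0, "end": 90},
    {"activity_id": "A-7", "start": 300, "end": 360},
]
pairs = associate(content_files, activity_files)
```

Here the first pairing succeeds on the ID match and the second falls back to the start/end-time match.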
In step 310, a set of activity data and UGC associated with the set of activity data are received by the platform server 120 or the interactive content servers 110, though such set of activity data and UGC may be received by any server, a cloud server, any console 228, or any user device 130. Such UGC may have at least one time stamp, and the set of activity data may include a corresponding at least one time stamp. Such UGC may be captured at the same time as the set of activity data and stored with the set of activity data, though the UGC may be stored elsewhere. The UGC may be generated by the user and/or peers during participation in the activity.
The set of activity data may have data regarding an activity that a user and at least one peer have simultaneously participated in on the network environment 100. Such set of activity data may include information about the activity (e.g., activity start times, activity end times, an activity ID, activity results, activity types (e.g., competitive match, quest, task, etc.), and user or peer data related to the activity). Such activity may be stored in an activity file 216 and may be part of a gaming environment.
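A set of activity data carrying the information listed above might be modeled as a simple record; this Python dataclass is a sketch only, and its field names (`activity_id`, `activity_type`, `participants`, etc.) are illustrative assumptions rather than the disclosed file format.

```python
from dataclasses import dataclass, field

@dataclass
class ActivityData:
    """Sketch of one set of activity data as described above."""
    activity_id: str
    activity_type: str              # e.g. "competitive match", "quest", "task"
    start_ts: float                 # activity start time
    end_ts: float                   # activity end time
    results: dict = field(default_factory=dict)        # activity results
    participants: list = field(default_factory=list)   # user and peer IDs

quest = ActivityData(
    activity_id="ActivityID1",
    activity_type="quest",
    start_ts=100.0,
    end_ts=250.0,
    results={"completed": True},
    participants=["user-1", "peer-2"],
)
```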
Each set of activity data may also include a direct link to the associated activity. Such link allows a user to access an activity from UGC. For example, the user may wish to participate in an activity shown by the UGC. In the same example, the user can select an option to play the activity, and the activity may be automatically launched after selection by the user. Alternatively, the activity file may block the user from accessing the activity if the user does not own the interactive content title and may prompt the user to purchase such title.
In step 320, a user-peer indication is identified by the platform server 120 or the interactive content servers 110. Such user-peer indication may identify a peer with whom the user experienced a significant event during participation in the activity. Such significant event may be determined based on one or more event factors, such as user interaction with at least one peer, a duration of the interaction, a type of event (e.g., battling the same boss, travelling through a dungeon together, participating in a match competition, etc.), a status of the user and each peer (e.g., a quantity of health or mana of the user and/or peer's character, a skill level of the user and/or peer's character, etc.), an amount and type of contribution to the event by the user and each peer (e.g., a peer leads a raid of a dungeon compared to another peer who follows, the user's character kills a boss and a peer's character continually heals the user's character, etc.), a result of the event (e.g., a peer's character saving the life of a user's character, the user discovering a rare item with help from the peer, experiencing a twist in a storyline, the user assisting with a scoring event, etc.), UGC captured (e.g., whether the user and/or peers captured UGC, when the user and/or peers captured UGC, a duration of the UGC, whether the UGC was uploaded, etc.), and conversations captured via camera, microphone, or keyboard during the event (e.g., the user thanking the peer for the peer's help or the user and/or the peer expressing emotion (i.e., laughing, exclaiming, etc.)).
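One way to combine such event factors is a weighted score compared against a threshold. The following Python sketch is purely illustrative: the factor names, weights, and threshold are hypothetical assumptions, and the disclosure does not prescribe any particular scoring formula.

```python
def significance_score(event, weights=None):
    """Combine event factors into a single score (illustrative weights)."""
    weights = weights or {
        "interaction_seconds": 0.01,  # duration of the interaction
        "peer_contribution": 1.0,     # amount/type of contribution (0..1)
        "ugc_captured": 2.0,          # whether UGC was captured
        "emotive_chat": 1.5,          # e.g. thanks, laughing, exclaiming
    }
    score = 0.0
    score += event.get("interaction_seconds", 0) * weights["interaction_seconds"]
    score += event.get("peer_contribution", 0.0) * weights["peer_contribution"]
    if event.get("ugc_captured"):
        score += weights["ugc_captured"]
    if event.get("emotive_chat"):
        score += weights["emotive_chat"]
    return score

def is_significant(event, threshold=3.0):
    """An event whose score crosses the threshold may be treated as significant."""
    return significance_score(event) >= threshold

event = {
    "interaction_seconds": 120,
    "peer_contribution": 0.8,   # e.g. the peer continually healed the user
    "ugc_captured": True,
    "emotive_chat": True,       # e.g. the user thanked the peer
}
```

Under these assumed weights, the example event scores 5.5 and would be flagged as significant.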
Such significant event may be flagged or marked by the platform server 120. For example, a time stamp indicating a start of the significant event and another time stamp indicating an end of the significant event may be stored in the activity file 216 by the platform server 120. Such start and end of the significant event may be determined based on the one or more factors described above. For example, the start time may be flagged when the user's character health is almost fully depleted by a boss and the end time may be flagged when the user thanks the peer for saving the user's character. In another example, the start time may be flagged when a boss is almost defeated and the end time may be flagged when the boss is defeated. In yet another example, the start time and the end time may be flagged based on when UGC was captured by the user. Further, such UGC captured by the user and/or the peer during participation by the user and/or peer in the activity may be associated with the significant event by matching the start time stamp and the end time stamp of the significant event to a time stamp captured with the UGC in the content time stamp file 214.
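The first example above (flag the start when the user's character health is nearly depleted, flag the end when the user thanks the peer) can be sketched as a simple scan over time-stamped gameplay frames. The frame fields (`ts`, `user_health`, `chat`) and the 10% health cutoff are hypothetical assumptions for illustration.

```python
def flag_event_bounds(frames):
    """Flag start/end time stamps of a significant event, per the first example.

    Start: the user's character health is nearly depleted (<= 10%).
    End: a thanking message appears in captured chat after the start.
    """
    start = end = None
    for f in frames:
        if start is None and f.get("user_health", 1.0) <= 0.1:
            start = f["ts"]
        if start is not None and "thank" in f.get("chat", "").lower():
            end = f["ts"]
            break
    return start, end

frames = [
    {"ts": 10, "user_health": 0.9},
    {"ts": 12, "user_health": 0.05},                   # nearly depleted: start
    {"ts": 30, "user_health": 0.6, "chat": "Thanks for the save!"},  # end
]
start, end = flag_event_bounds(frames)
```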
Such user-peer indication may be further based on a relationship between the user and peer within or outside of the network environment 100. For example, a third party account (e.g., social media) of the user may indicate that the user has a relationship with the peer outside of the network environment (e.g., the user and peer are related, enrolled in the same school, have mutual friends, etc.). In another example, the user and the peer may have participated in previous activities together and/or have previously communicated with each other within the network environment 100. Such user-peer indication may also be based on similarities between the user and the peer (e.g., similar skill level, character levels, interactive content titles, geographical location, etc.). In one example, the user and peer may own the same interactive content title and/or live within a predetermined distance from each other. In yet another example, the user and peer may participate in the same types of activities across different interactive content titles, thus suggesting that the user and peer enjoy similar types of activities.
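Such relationship and similarity cues could be tallied into a simple signal count; the Python sketch below is an assumption-laden illustration (field names like `titles`, `skill`, `prior_coplayers`, `external_contacts`, and the skill-difference cutoff of 5 are all hypothetical).

```python
def relationship_signals(user, peer):
    """Count relationship/similarity signals between a user and a peer."""
    signals = 0
    # Shared interactive content titles suggest common interests.
    signals += len(set(user["titles"]) & set(peer["titles"]))
    # Similar skill level (within an assumed tolerance).
    if abs(user["skill"] - peer["skill"]) <= 5:
        signals += 1
    # Previous activities together within the network environment.
    if peer["id"] in user.get("prior_coplayers", []):
        signals += 1
    # Relationship outside the network environment (e.g., social media).
    if peer["id"] in user.get("external_contacts", []):
        signals += 1
    return signals

user = {"id": "user-1", "titles": ["Title A", "Title B"], "skill": 42,
        "prior_coplayers": ["peer-2"], "external_contacts": []}
peer = {"id": "peer-2", "titles": ["Title B", "Title C"], "skill": 40}
signals = relationship_signals(user, peer)
```

A higher count would strengthen the user-peer indication; how the count feeds into the recommendation is left open by the disclosure.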
In step 330, a peer recommendation is generated based on the user-peer indication and is provided to the user by the platform server 120 or the interactive content servers 110. At least a portion of the UGC associated with the significant event is also provided to the user. Such at least the portion of the UGC may depict the entire significant event via a streaming video or may show only a clip or a screenshot of the significant event. In one example, such at least the portion of the UGC may be determined based on a match between a time stamp of the significant event and one of the time stamps of the UGC. In another example, such portion of the UGC may be a streaming video or a commentary having a duration. Such duration may be determined based on a match between a start time stamp and an end time stamp of the significant event and a first time stamp and a second time stamp of the at least one time stamp of the UGC.
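Selecting the portion of the UGC by matching the event's start and end time stamps against the UGC's time stamps can be sketched as follows; the function name and the representation of UGC time stamps as a plain list are hypothetical.

```python
def clip_bounds(event_start, event_end, ugc_timestamps):
    """Return (first, last) UGC time stamps inside the significant event,
    defining the portion (clip) of the UGC to present, or None if no
    UGC time stamp falls within the event."""
    inside = [t for t in ugc_timestamps if event_start <= t <= event_end]
    if not inside:
        return None
    return (min(inside), max(inside))

# Event flagged from ts 12 to ts 30; UGC was stamped at five moments.
bounds = clip_bounds(12, 30, ugc_timestamps=[5, 14, 22, 29, 41])
```

The difference between the two returned time stamps would give the clip's duration.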
Such peer recommendation and portion of the UGC may be displayed to the user after the user and the peer have completed the activity, whether via a separate pop-up, within the interactive content title (e.g., in game), on a home screen after the user has exited the interactive content title, or after the user selects a new activity and/or a new interactive content title in which to participate. Further, such peer recommendation and portion of the UGC may be displayed to the user when the user selects a new activity of a similar type to the activity in which the significant event occurred or when the user selects the same interactive content title in which the significant event occurred.
Providing at least the portion of the UGC with the peer recommendation may remind the user of the user's experience with the peer, which may incentivize the user to accept the recommendation. For example, the UGC may depict a character of the peer saving a character of the user, which may be of significance to the user. As previously described, the UGC may include a link to the activity depicted. Such link may allow the user to automatically launch the activity depicted. Such link may further send a notification and/or invitation to one or more peers depicted in the activity that the user is beginning participation in the activity. Such one or more peers may have been recommended to the user.
Such user-peer indication and portion of the UGC may be blocked by the user. Such blocking may occur if a user has selected a user setting that may block all recommendations or may block some recommendations based on certain criteria (e.g., the user and the peer do not have any mutual peers in common, the user and the peer do not have a relationship outside of the network environment 100, the peer does not have a skillset desired by the user, etc.). For example, the user may only wish to receive peer recommendations if the peer owns two or more of the same interactive content titles. In another example, the user may only wish to receive peer recommendations from peers that the user knows personally outside of the network environment 100. As such, the user may further customize identification of a peer recommendation.
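The user-configured blocking criteria described above could be applied as a filter over candidate recommendations. This Python sketch uses hypothetical setting keys (`block_all`, `require_mutual_peers`, `min_shared_titles`, `known_outside_only`) and record fields for illustration only.

```python
def filter_recommendations(recs, settings):
    """Apply user-configured blocking criteria to candidate peer recommendations."""
    if settings.get("block_all"):
        return []  # the user has blocked all recommendations
    kept = []
    for r in recs:
        # Block peers with no mutual peers in common, if so configured.
        if settings.get("require_mutual_peers") and not r["mutual_peers"]:
            continue
        # Block peers owning fewer shared titles than the user's minimum.
        if r["shared_titles"] < settings.get("min_shared_titles", 0):
            continue
        # Block peers the user does not know outside the network, if so configured.
        if settings.get("known_outside_only") and not r["known_outside"]:
            continue
        kept.append(r)
    return kept

recs = [
    {"peer": "peer-2", "mutual_peers": ["peer-5"], "shared_titles": 3,
     "known_outside": False},
    {"peer": "peer-9", "mutual_peers": [], "shared_titles": 1,
     "known_outside": True},
]
kept = filter_recommendations(recs, {"min_shared_titles": 2})
```

With a minimum of two shared titles, only the first candidate survives the filter.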
Systems and methods for experience-based peer recommendations may enhance a user experience by providing targeted peer suggestions for a user to add to their selected peer list (i.e., a friends list). Providing experience-based peer recommendations filters out many peers that a user would likely not have any commonality with and presents peers that a user has already experienced a significant and/or meaningful event with. Further, experience-based peer recommendations may provide a peer that matches the user's skills and/or style and may incentivize each player to continue participation in an interactive content title or other similar title.
Entertainment system 400 may be an electronic game console. Alternatively, the entertainment system 400 may be implemented as a general-purpose computer, a set-top box, a hand-held game device, a tablet computing device, a virtual reality device, an augmented reality device, or a mobile computing device or phone. Entertainment systems may contain more or fewer operating components depending on a particular form factor, purpose, or design.
The CPU 410, the vector unit 415, the graphics processing unit 420, and the I/O processor 425 of
The graphics processing unit 420 of
A user of the entertainment system 400 of
The present invention may be implemented in an application that may be operable by a variety of end user devices. For example, an end user device may be a personal computer, a home entertainment system (e.g., Sony PlayStation2® or Sony PlayStation3® or Sony PlayStation4®), a portable gaming device (e.g., Sony PSP® or Sony Vita®), or a home entertainment system of a different albeit inferior manufacturer. The present methodologies described herein are fully intended to be operable on a variety of devices. The present invention may also be implemented with cross-title neutrality wherein an embodiment of the present system may be utilized across a variety of titles from various publishers.
The present invention may be implemented in an application that may be operable using a variety of devices. Non-transitory computer-readable storage media refer to any medium or media that participate in providing instructions to a central processing unit (CPU) for execution. Such media can take many forms, including, but not limited to, non-volatile and volatile media such as optical or magnetic disks and dynamic memory, respectively. Common forms of non-transitory computer-readable media include, for example, a floppy disk, a flexible disk, a hard disk, magnetic tape, any other magnetic medium, a CD-ROM disk, digital video disk (DVD), any other optical medium, RAM, PROM, EPROM, a FLASHEPROM, and any other memory chip or cartridge.
Various forms of transmission media may be involved in carrying one or more sequences of one or more instructions to a CPU for execution. A bus carries the data to system RAM, from which a CPU retrieves and executes the instructions. The instructions received by system RAM can optionally be stored on a fixed disk either before or after execution by a CPU. Various forms of storage may likewise be implemented as well as the necessary network interfaces and network topologies to implement the same.
The foregoing detailed description of the technology has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the technology to the precise form disclosed. Many modifications and variations are possible in light of the above teaching. The described embodiments were chosen in order to best explain the principles of the technology, its practical application, and to enable others skilled in the art to utilize the technology in various embodiments and with various modifications as are suited to the particular use contemplated. It is intended that the scope of the technology be defined by the claims.
This application is a continuation-in-part of U.S. patent application Ser. No. 16/220,465 filed Dec. 14, 2018, which is incorporated by reference herein in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
5913013 | Abecassis | Jun 1999 | A |
8448095 | Haussila et al. | May 2013 | B1 |
8460108 | Hendrickson et al. | Jun 2013 | B2 |
8764555 | Quan et al. | Jul 2014 | B2 |
8918728 | Hamilton et al. | Dec 2014 | B2 |
9155963 | Baynes et al. | Oct 2015 | B2 |
9168460 | Pearce | Oct 2015 | B2 |
9333433 | Cotter | May 2016 | B2 |
9381425 | Curtis et al. | Jul 2016 | B1 |
9468851 | Pieron | Oct 2016 | B1 |
9795879 | Colenbrander | Oct 2017 | B2 |
10109003 | Jenkins et al. | Oct 2018 | B1 |
10564820 | Cabanero et al. | Feb 2020 | B1 |
10569164 | Bleasdale-Shepherd | Feb 2020 | B1 |
10843085 | Stephens | Nov 2020 | B2 |
10848805 | Mattar et al. | Nov 2020 | B1 |
10881962 | Stephens | Jan 2021 | B2 |
11080748 | Stephens | Aug 2021 | B2 |
11090568 | Mattar et al. | Aug 2021 | B1 |
11213748 | Jarzebinski | Jan 2022 | B2 |
11247130 | Stephens | Feb 2022 | B2 |
11269944 | Stephens | Mar 2022 | B2 |
11420130 | Clingman | Aug 2022 | B2 |
11442987 | Clingman | Sep 2022 | B2 |
11465053 | Stephens | Oct 2022 | B2 |
11602687 | Clingman | Mar 2023 | B2 |
20020077170 | Johnson | Jun 2002 | A1 |
20020183105 | Cannon et al. | Dec 2002 | A1 |
20040021684 | Millner | Feb 2004 | A1 |
20070198740 | Peters et al. | Aug 2007 | A1 |
20070198939 | Gold | Aug 2007 | A1 |
20080045335 | Garbow et al. | Feb 2008 | A1 |
20080262858 | Broady | Oct 2008 | A1 |
20090115776 | Bimbra et al. | May 2009 | A1 |
20090170609 | Kang | Jul 2009 | A1 |
20090176557 | Hall et al. | Jul 2009 | A1 |
20090197681 | Krishnamoorthy | Aug 2009 | A1 |
20090276713 | Eddy | Nov 2009 | A1 |
20100070613 | Chen et al. | Mar 2010 | A1 |
20100105484 | Horneff et al. | Apr 2010 | A1 |
20100304348 | Lehavi | Dec 2010 | A1 |
20110067061 | Karaoguz et al. | Mar 2011 | A1 |
20110092282 | Gary | Apr 2011 | A1 |
20110113149 | Kaal | May 2011 | A1 |
20110250971 | van Os et al. | Oct 2011 | A1 |
20110314029 | Fischer et al. | Dec 2011 | A1 |
20110319229 | Corbalis et al. | Dec 2011 | A1 |
20120004956 | Huston | Jan 2012 | A1 |
20120030123 | Ocko | Feb 2012 | A1 |
20120094762 | Khan | Apr 2012 | A1 |
20120115580 | Homik et al. | May 2012 | A1 |
20120206574 | Shikata et al. | Aug 2012 | A1 |
20120252583 | Mikkelsen | Oct 2012 | A1 |
20120284292 | Rechsteiner et al. | Nov 2012 | A1 |
20120309533 | Horita et al. | Dec 2012 | A1 |
20120317198 | Patton et al. | Dec 2012 | A1 |
20120322561 | Kohlhoff | Dec 2012 | A1 |
20130064527 | Maharajh et al. | Mar 2013 | A1 |
20130084969 | Knoles et al. | Apr 2013 | A1 |
20130086484 | Antin et al. | Apr 2013 | A1 |
20130165234 | Hall | Jun 2013 | A1 |
20130190094 | Ronen et al. | Jul 2013 | A1 |
20130212342 | McCullough et al. | Aug 2013 | A1 |
20130244785 | Gary | Sep 2013 | A1 |
20140012922 | Wu | Jan 2014 | A1 |
20140080601 | Knutsson | Mar 2014 | A1 |
20140179440 | Perry | Jun 2014 | A1 |
20140199045 | Lee et al. | Jul 2014 | A1 |
20140204014 | Thorn et al. | Jul 2014 | A1 |
20140206456 | Koplar | Jul 2014 | A1 |
20140228112 | Laakkonen | Aug 2014 | A1 |
20140235338 | Hansson et al. | Aug 2014 | A1 |
20140243097 | Yong et al. | Aug 2014 | A1 |
20140243098 | Yong et al. | Aug 2014 | A1 |
20140274297 | Lewis et al. | Sep 2014 | A1 |
20140364210 | Murray et al. | Dec 2014 | A1 |
20150026728 | Carter et al. | Jan 2015 | A1 |
20150081777 | Laine | Mar 2015 | A1 |
20150094139 | Kargar | Apr 2015 | A1 |
20150142799 | Eronen et al. | May 2015 | A1 |
20150224396 | Okada | Aug 2015 | A1 |
20150245084 | Downing et al. | Aug 2015 | A1 |
20150296250 | Casper | Oct 2015 | A1 |
20150306499 | Chimes et al. | Oct 2015 | A1 |
20150331856 | Choi et al. | Nov 2015 | A1 |
20150381689 | Ganesh et al. | Dec 2015 | A1 |
20160005326 | Syrmis et al. | Jan 2016 | A1 |
20160029153 | Linn et al. | Jan 2016 | A1 |
20160078471 | Hamedi | Mar 2016 | A1 |
20160147890 | Wissner et al. | May 2016 | A1 |
20160149956 | Birnbaum et al. | May 2016 | A1 |
20160277349 | Bhatt et al. | Sep 2016 | A1 |
20160287997 | Laakkonen et al. | Oct 2016 | A1 |
20160350813 | Balasubramanian et al. | Dec 2016 | A1 |
20160366483 | Joyce et al. | Dec 2016 | A1 |
20170001111 | Willette et al. | Jan 2017 | A1 |
20170001122 | Leung et al. | Jan 2017 | A1 |
20170050111 | Perry et al. | Feb 2017 | A1 |
20170087460 | Perry | Mar 2017 | A1 |
20170126757 | Kuo et al. | May 2017 | A1 |
20170157512 | Long et al. | Jun 2017 | A1 |
20170188116 | Major et al. | Jun 2017 | A1 |
20170189815 | Tweedale et al. | Jul 2017 | A1 |
20170246544 | Agarwal et al. | Aug 2017 | A1 |
20170301041 | Schneider | Oct 2017 | A1 |
20170339093 | Pesavento et al. | Nov 2017 | A1 |
20170354888 | Benedetto et al. | Dec 2017 | A1 |
20180001194 | Sherwani et al. | Jan 2018 | A1 |
20180001216 | Bruzzo | Jan 2018 | A1 |
20180014077 | Hou et al. | Jan 2018 | A1 |
20180021684 | Benedetto | Jan 2018 | A1 |
20180033250 | O'Heeron et al. | Feb 2018 | A1 |
20180101614 | Kuipers et al. | Apr 2018 | A1 |
20180126279 | Stelovsky et al. | May 2018 | A1 |
20180192142 | Paul | Jul 2018 | A1 |
20180295175 | Smith et al. | Oct 2018 | A1 |
20180302761 | Rizzolo | Oct 2018 | A1 |
20180318708 | Rom et al. | Nov 2018 | A1 |
20180343505 | Loheide et al. | Nov 2018 | A1 |
20180359477 | Yang | Dec 2018 | A1 |
20190052471 | Panattoni et al. | Feb 2019 | A1 |
20190208242 | Bates et al. | Jul 2019 | A1 |
20190246149 | Reza et al. | Aug 2019 | A1 |
20190282906 | Yong | Sep 2019 | A1 |
20190297376 | McCarty et al. | Sep 2019 | A1 |
20200061465 | Benedetto et al. | Feb 2020 | A1 |
20200101382 | Wheeler et al. | Apr 2020 | A1 |
20200111306 | Oberberger et al. | Apr 2020 | A1 |
20200114267 | Sakurai | Apr 2020 | A1 |
20200147489 | Mahlmeister et al. | May 2020 | A1 |
20200169793 | Akerfeldt | May 2020 | A1 |
20200184041 | Andon et al. | Jun 2020 | A1 |
20200188781 | Stephens | Jun 2020 | A1 |
20200188792 | Stephens | Jun 2020 | A1 |
20200188794 | Stephens | Jun 2020 | A1 |
20200188800 | Stephens | Jun 2020 | A1 |
20200192929 | Stephens | Jun 2020 | A1 |
20200193476 | Stephens | Jun 2020 | A1 |
20200193477 | Stephens | Jun 2020 | A1 |
20210077907 | Stephens | Mar 2021 | A1 |
20210129023 | Jarzebinski | May 2021 | A1 |
20210370169 | Clingman | Dec 2021 | A1 |
20210370185 | Clingman | Dec 2021 | A1 |
20210374180 | Clingman | Dec 2021 | A1 |
20220088474 | Dicken et al. | Mar 2022 | A1 |
20220143516 | Thielbar | May 2022 | A1 |
20220193546 | Jarzebinski | Jun 2022 | A1 |
20220401845 | Clingman | Dec 2022 | A1 |
Number | Date | Country |
---|---|---|
113710337 | Nov 2021 | CN |
113710340 | Nov 2021 | CN |
113727764 | Nov 2021 | CN |
113727765 | Nov 2021 | CN |
114599432 | Jun 2022 | CN |
116457066 | Jul 2023 | CN |
2014342 | Jan 2009 | EP |
3894030 | Oct 2021 | EP |
3894031 | Oct 2021 | EP |
3894032 | Oct 2021 | EP |
4240505 | Sep 2023 | EP |
H11179050 | Jul 1999 | JP |
2009522853 | Jun 2009 | JP |
2011217803 | Nov 2011 | JP |
2017-182603 | Oct 2017 | JP |
2022-512425 | Feb 2022 | JP |
2022-512492 | Feb 2022 | JP |
2022-513485 | Feb 2022 | JP |
2022-513849 | Feb 2022 | JP |
2023-500868 | Jan 2023 | JP |
2018-0094833 | Aug 2018 | KR |
WO 2009094611 | Jul 2009 | WO |
WO 2014047490 | Mar 2014 | WO |
WO 2015200737 | Dec 2015 | WO
WO 2017182642 | Oct 2017 | WO |
WO 2017188677 | Nov 2017 | WO |
WO 2020123115 | Jun 2020 | WO |
WO 2020123116 | Jun 2020 | WO |
WO 2020123117 | Jun 2020 | WO |
WO 2020123118 | Jun 2020 | WO |
WO 2021086561 | May 2021 | WO |
WO 2021242476 | Dec 2021 | WO |
WO 2021242477 | Dec 2021 | WO |
WO 2021242478 | Dec 2021 | WO |
WO 2022098707 | May 2022 | WO |
Entry |
---|
PCT Application No. PCT/US2019/062626 International Search Report and Written Opinion dated Jan. 29, 2020. |
PCT Application No. PCT/US2019/062606 International Search Report and Written Opinion dated Jan. 30, 2020. |
PCT Application No. PCT/US2019/062613 International Search Report and Written Opinion dated Feb. 3, 2020. |
U.S. Appl. No. 16/220,460 Office Action dated Jan. 28, 2020. |
U.S. Appl. No. 16/220,397, Mischa Stephens, Targeted Gaming News and Content Feeds, filed Dec. 14, 2018. |
U.S. Appl. No. 16/359,160, Mischa Stephens, Targeted Gaming News and Content Feeds, filed Mar. 20, 2019. |
U.S. Appl. No. 16/379,683, Mischa Stephens, Interactive Objects in Streaming Media and Marketplace Ledgers, filed Apr. 9, 2019. |
U.S. Appl. No. 16/380,760, Mischa Stephens, Media-Activity Binding and Content Blocking, filed Apr. 10, 2019. |
U.S. Appl. No. 16/220,443, Mischa Stephens, Interactive Objects in Streaming Media and Marketplace Ledgers, filed Dec. 14, 2018. |
U.S. Appl. No. 16/220,460, Mischa Stephens, Media-Activity Binding and Content Blocking, filed Dec. 14, 2018. |
U.S. Appl. No. 16/220,465, Mischa Stephens, Experience-Based Peer Recommendations, filed Dec. 14, 2018. |
U.S. Appl. No. 16/220,465 Office Action dated Jun. 15, 2020. |
U.S. Appl. No. 16/220,397 Office Action dated Sep. 25, 2020. |
U.S. Appl. No. 16/359,160 Office Action dated Nov. 13, 2020. |
U.S. Appl. No. 16/220,443 Office Action dated Oct. 19, 2020. |
U.S. Appl. No. 16/379,683 Office Action dated Nov. 6, 2020. |
U.S. Appl. No. 16/885,629, Dustin S. Clingman, Media-Object Binding for Displaying Real-Time Play Data for Live-Streaming Media, filed May 28, 2020. |
U.S. Appl. No. 16/885,653, Dustin S. Clingman, Media-Object Binding for Predicting Performance in a Media, filed May 28, 2020. |
U.S. Appl. No. 16/885,641, Dustin S. Clingman, Media-Object Binding for Dynamic Generation and Displaying of Play Data Associated With Media, filed May 28, 2020. |
PCT Application No. PCT/US2019/062602 International Search Report and Written Opinion dated Feb. 14, 2020. |
U.S. Appl. No. 16/380,760 Office Action dated Mar. 6, 2020. |
PCT Application No. PCT/US2020/054603 International Search Report and Written Opinion dated Jan. 28, 2021. |
U.S. Appl. No. 16/220,465 Final Office Action dated Dec. 24, 2020. |
U.S. Appl. No. 16/359,160 Final Office Action dated Mar. 12, 2021. |
U.S. Appl. No. 16/220,443 Final Office Action dated Apr. 13, 2021. |
U.S. Appl. No. 16/379,683 Final Office Action dated May 7, 2021. |
U.S. Appl. No. 16/679,795 Office Action dated May 10, 2021. |
U.S. Appl. No. 17/102,881, Mischa Stephens, Media-Activity Binding and Content Blocking, filed Nov. 24, 2020. |
U.S. Appl. No. 16/679,795, Alexander Jarzebinski, Content Streaming With Gameplay Launch, filed Nov. 11, 2019. |
PCT Application No. PCT/US2020/054603, Content Streaming With Gameplay Launch, filed Oct. 7, 2020. |
Li et al., "Distributed Multimedia Systems", IEEE, Jul. 1997, retrieved on [Feb. 7, 2021]. Retrieved from the Internet <URL: https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.128.9759&rep=rep1&type=pdf>. |
PCT Application No. PCT/US2019/062602 International Preliminary Report on Patentability dated Jun. 8, 2021. |
PCT Application No. PCT/US2019/062606 International Preliminary Report on Patentability dated Jun. 8, 2021. |
PCT Application No. PCT/US2019/062613 International Preliminary Report on Patentability dated Jun. 8, 2021. |
PCT Application No. PCT/US2019/062626 International Preliminary Report on Patentability dated Jun. 8, 2021. |
PCT Application No. PCT/US2021/030378 International Search Report and Written Opinion dated Aug. 5, 2021. |
PCT Application No. PCT/US2021/030379 International Search Report and Written Opinion dated Aug. 5, 2021. |
PCT Application No. PCT/US2021/030380 International Search Report and Written Opinion dated Aug. 12, 2021. |
U.S. Appl. No. 16/359,160 Office Action dated Jul. 12, 2021. |
U.S. Appl. No. 16/220,443 Office Action dated Aug. 6, 2021. |
U.S. Appl. No. 16/220,465 Office Action dated Jul. 26, 2021. |
U.S. Appl. No. 17/566,964, Alexander Jarzebinski, Content Streaming With Gameplay Launch, filed Dec. 31, 2021. |
U.S. Appl. No. 17/517,875, Christopher Thielbar, Replayable Activities for Interactive Content Titles, filed Nov. 3, 2021. |
PCT Application No. PCT/US2021/057832, Replayable Activities for Interactive Content Titles, filed Nov. 3, 2021. |
EP Application No. 19896543.6 Extended European search report dated Aug. 12, 2022. |
EP Application No. 19895486.9 Extended European search report dated Oct. 5, 2022. |
EP Application No. 19897134.3 Extended European search report dated Oct. 5, 2022. |
PCT Application No. PCT/US2020/054603 International Preliminary Report on Patentability dated May 2, 2022. |
U.S. Appl. No. 17/517,875 Office Action dated Oct. 13, 2022. |
PCT Application No. PCT/US2021/057832 International Search Report and Written Opinion dated Feb. 16, 2022. |
U.S. Appl. No. 16/885,635 Office Action dated Mar. 30, 2022. |
EP Application No. 19896349.8 Extended European search report dated Jul. 5, 2022. |
PCT Application No. PCT/US2021/030378 International Preliminary Report on Patentability dated Nov. 17, 2022. |
PCT Application No. PCT/US2021/030379 International Preliminary Report on Patentability dated Nov. 17, 2022. |
PCT Application No. PCT/US2021/030380 International Preliminary Report on Patentability dated Nov. 17, 2022. |
U.S. Appl. No. 17/566,964 Office Action dated Nov. 23, 2022. |
U.S. Appl. No. 18/114,482, Dustin S. Clingman, Media-Object Binding for Predicting Performance in a Media, filed Feb. 27, 2023. |
U.S. Appl. No. 17/517,875 Final Office Action dated Mar. 31, 2023. |
Anonymous: “New Replay and Resume Features Coming in Heart of the Swarm—StarCraft II—Blizzard News”, Jan. 24, 2013 (Jan. 24, 2013), XP093053633, Retrieved from the Internet: URL:https://news.blizzard.com/en-gb/starcraft2/10054757/new-replay-and-resume-features-coming-in-heart-of-the-swarm [retrieved on Jun. 12, 2023] * pp. 3,5 *. |
Chinese Application No. 201980089787.4 First Office Action dated Aug. 18, 2023. |
Japanese Application No. 2021-533796 Non Final Notification of Reasons for Refusal dated Nov. 2, 2023. |
Japanese Application No. 2021-533797 Non Final Notification of Reasons for Refusal dated Oct. 17, 2023. |
European Application No. 20882882.2 Extended European Search Report dated Oct. 13, 2023. |
U.S. Appl. No. 17/517,875 Office Action dated Oct. 31, 2023. |
Number | Date | Country | |
---|---|---|---|
20200188796 A1 | Jun 2020 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 16220465 | Dec 2018 | US |
Child | 16358546 | US |