The present document relates to a user interface for interacting with customized highlight shows for sporting events, entertainment events, news, and/or the like.
There are many services that provide game highlights, box scores, and commentary for sporting events. Such services include, for example, news programs, sports channels, websites, and the like. In general, however, such services provide highlights based on some generalized determination as to what sporting events are likely to be of interest to the general viewer, and what portions of a sporting event are most likely to be of interest to the general viewer.
In general, such services do not take into account the interests, preferences, and context of an individual viewer. What may be of interest to one sports fan may be uninteresting to another sports fan. Currently available services merely broadcast a game's highlights without considering the myriad preferences of individual viewers that can make a sporting event more interesting or less interesting for them. This results in the inefficient use of the sports fans' time, potential loss of viewership, and a concomitant decrease in advertisement revenue. In addition, such services do not generally provide any mechanisms for automatically generating customized highlight shows having a narrative component.
Further, existing services that provide game highlights generally do not provide user interfaces that enable the user to customize the content and/or display options for the content. Yet further, existing services generally do not facilitate the tracking of fantasy sports teams or related sports content. Thus, the user is unable to customize the service to suit his or her preferences.
Various embodiments of the technology described herein provide mechanisms for automatically generating and outputting customized highlight shows for sporting events, entertainment events, and/or the like. User interfaces may be provided that facilitate the generation of a customized highlight show based on the preferences of the user.
In some embodiments, the customized highlight show may be provided by obtaining source content representing one or more events such as sporting events, receiving a first user selection designating a first attribute of the source content and/or a viewing length, selecting a subset of the source content that includes highlights from within the source content and that has the first attribute and/or fits the viewing length, and generating a customized highlight show from the subset. The customized highlight show may then be provided to the user, for example, on a display screen.
The first user selection may select a viewing length from among a plurality of possible viewing lengths displayed on the display screen. Similarly, the first user selection may select an attribute from one or more possible attributes. In the context of sporting events, such attributes may include, but are not limited to, a team playing in the one or more sports events, a player playing in the one or more sports events, and a type of play occurring in the one or more sports events. Any of these attributes may be selected from a list, such as a list of teams, play types, and/or players. Excitement levels may be displayed in connection with any of the attributes to indicate levels of excitement expected to be experienced by a user viewing the corresponding highlights. The customized highlight show may then be generated by gathering highlights that feature the selected team(s), play type(s), and/or player(s). Multiple attributes may be optionally selected and used to generate the customized highlight show.
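As one illustrative sketch of how such attribute- and length-based selection might work, consider the following Python example. The highlight records (dictionaries with "teams", "players", "play_type", "duration_sec", "excitement", and "start_time" fields) are hypothetical and are shown only to make the filtering step concrete; they do not reflect any particular implementation.

```python
def select_highlights(highlights, teams=None, players=None, play_types=None,
                      viewing_length_sec=None):
    """Filter highlights by the selected attributes, then trim to the viewing length."""
    def matches(h):
        if teams and not (set(teams) & set(h["teams"])):
            return False
        if players and not (set(players) & set(h["players"])):
            return False
        if play_types and h["play_type"] not in play_types:
            return False
        return True

    chosen = [h for h in highlights if matches(h)]
    # When a viewing length is given, keep the most exciting highlights that fit.
    chosen.sort(key=lambda h: h["excitement"], reverse=True)
    if viewing_length_sec is not None:
        kept, total = [], 0
        for h in chosen:
            if total + h["duration_sec"] <= viewing_length_sec:
                kept.append(h)
                total += h["duration_sec"]
        chosen = kept
    # Present the selected highlights in chronological order.
    return sorted(chosen, key=lambda h: h["start_time"])
```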
A time indicator may be displayed in connection with the customized highlight show. The time indicator may indicate a time currently being viewed in relation to the total length of the customized highlight show. Further, indicators may be displayed in association with the time indicator to indicate portions of the customized highlight show for which levels of excitement expected to be experienced by a user are relatively high and/or portions that pertain to one or more of the selected attributes.
In some embodiments, a customized highlight show may be generated for a fantasy team, such as a fantasy sports team provided by the user via manual or other input. The customized highlight show may simulate a match between two fantasy sports teams by gathering highlights in which real players of the two fantasy sports teams were involved, from real sports events occurring within a user-selectable timeframe.
In such a case, two separate sets of indicators may be used to permit the user to readily differentiate the plays involving one fantasy sports team's players from those involving players on the other fantasy sports team. A fantasy sports score may be provided to represent aggregate, comparative performance of the two teams within the customized highlight show. A menu may enable the user to select from multiple matches between different pairings of fantasy sports teams. Each match may have an excitement level that indicates the levels of excitement expected to be experienced by a user watching the match.
The user may have the option to focus the customized highlight show on one or more starting players on a fantasy sports team, one or more benched players on the fantasy sports team, and/or one or more players that are not on the fantasy sports team. Players on other teams may optionally be selected based on field position or other characteristics so that a user may easily evaluate players for inclusion on his or her fantasy sports team.
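The following sketch illustrates one plausible way to assemble such a simulated fantasy match-up. The roster and highlight structures, and the flat per-play point values in FANTASY_POINTS, are illustrative assumptions rather than a real fantasy scoring system.

```python
# Hypothetical per-play fantasy point values, used only for illustration.
FANTASY_POINTS = {"touchdown": 6, "field_goal": 3, "interception": 2}

def build_fantasy_matchup(highlights, roster_a, roster_b):
    """Collect highlights involving either fantasy roster and tally a fantasy score."""
    segments, score = [], {"team_a": 0, "team_b": 0}
    for h in sorted(highlights, key=lambda h: h["start_time"]):
        involved_a = set(h["players"]) & set(roster_a)
        involved_b = set(h["players"]) & set(roster_b)
        if not (involved_a or involved_b):
            continue
        # Separate indicators let the viewer tell the two fantasy teams apart.
        tag = "both" if (involved_a and involved_b) else ("team_a" if involved_a else "team_b")
        segments.append({**h, "fantasy_tag": tag})
        points = FANTASY_POINTS.get(h["play_type"], 1)
        if involved_a:
            score["team_a"] += points
        if involved_b:
            score["team_b"] += points
    return segments, score
```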
Further details and variations are described herein.
The accompanying drawings, together with the description, illustrate several embodiments. One skilled in the art will recognize that the particular embodiments illustrated in the drawings are merely exemplary, and are not intended to limit scope.
Definitions
The following definitions are presented for explanatory purposes only, and are not intended to limit scope.
In at least one embodiment, the technology disclosed herein relates to generating and outputting a customized highlight show having a narrative component, for events such as sporting events, entertainment events, news events, and/or the like. The highlight show can be automatically generated to convey a narrative, and can also incorporate one or more theme(s) and project a particular perspective.
The technology disclosed herein is able to obtain or extract segments from any suitable source, whether local or remote, and whether recorded or live. Examples include: live or recorded broadcasts of sporting events; online clips from video-sharing sites such as Vimeo or YouTube; archived video; local media such as a hard drive, optical drive, or magnetic drive; set-top boxes; local or remote servers; mobile computing devices such as smartphones or tablets; cameras; camcorders; or the like. Combinations of any such media can also be used. Source video can include the entire event (such as an entire game), or it can be a pre-curated highlight video from which a customized highlight show can be extracted.
Source video and/or other source content can come from any available source, whether linear (such as available via a cable box), or on-demand/IP-enabled (such as available from a website or on-demand service).
In another embodiment, video can be obtained from an online video-sharing website (such as Vimeo or YouTube). Such videos may be identified by title, metadata, and/or other means. In many cases, however, title or metadata for such video-sharing websites may be inaccurate; accordingly, in at least one embodiment, the system analyzes the video coming from such sources and determines correct information from the video analysis before using such video in generating a customized highlight show. In at least one embodiment, the system identifies and associates pre-curated, individual segments with specific occurrences in the event. For example, if the event is a sporting event such as a baseball game, the system can identify a set of videos that are available via a video-sharing website, depicting individual plays of the game. In order to prioritize these videos correctly, the system associates the videos with individual occurrences (such as plays) that took place in the course of the game. In at least one embodiment, this is done by automated analysis of metadata associated with the videos. In at least one embodiment, such analysis is supplemented by additional techniques to improve accuracy, such as natural language processing and/or fuzzy logic; in this manner, each video can be correctly associated with the correct occurrence within the sporting event.
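As a rough sketch of this association step, the following Python example matches clip titles and descriptions against play-by-play descriptions using simple string similarity (difflib) as a stand-in for the natural language processing and/or fuzzy logic mentioned above; the clip and occurrence fields, and the threshold value, are assumptions made for illustration.

```python
from difflib import SequenceMatcher

def associate_clips(clips, occurrences, threshold=0.6):
    """Match each clip's title/metadata to the most similar play-by-play description."""
    matched = []
    for clip in clips:
        text = (clip.get("title", "") + " " + clip.get("description", "")).lower()
        best, best_score = None, 0.0
        for occ in occurrences:
            score = SequenceMatcher(None, text, occ["description"].lower()).ratio()
            if score > best_score:
                best, best_score = occ, score
        if best is not None and best_score >= threshold:
            matched.append({"clip": clip, "occurrence": best, "confidence": best_score})
    return matched
```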
In another embodiment, video for a customized highlight show can come from the user's (or another user's) own video capture device, such as a smartphone, camera, or camcorder belonging to someone who attended the event.
In another embodiment, video from different sources can be used, and can be combined to generate the customized highlight show. In at least one embodiment, the system may include multiple angles of the same occurrence (such as a particularly remarkable occurrence), which angles may come from different video sources. For example, a customized highlight show can include the television feed for a grand slam, followed by a YouTube video of the same grand slam as captured by a fan who attended the game; since the YouTube video captures the occurrence from a different perspective, it may be effective to include it for emphasis and to show the crowd's reaction. In another example, the system can combine the video feed from one source (such as a network broadcast) with the audio feed from another source (such as a local radio commentator for one of the teams); such a combination may be more entertaining or interesting to a fan of that local commentator's team.
In at least one embodiment, the system takes into account what video sources or other content are available to the user. For example, if the user is a subscriber to a premium sports package, he or she may have access to on-demand video for certain sporting events that are not available to a non-subscriber. In at least one embodiment, the described system detects this, and uses such on-demand video (or other content) when available to construct the customized highlight show. For non-subscribers, the system looks for other available sources of content.
In order to effectively customize a highlight show, the technology disclosed herein is able to ascertain preferences and interests of an individual user (or group of users). This can be done, for example, by any of: querying the user; observing his or her behavior; pulling preferences from a profile, such as one collected and maintained by a social network; making inferences based on content viewed by the user, demographics, or other factors; observing the user's friends or associates; and/or any combination thereof. In short, any suitable mechanism(s) can be used for determining such preferences and interests. In addition, the technology disclosed herein takes into account the amount of time available to the user for viewing a highlight show; this can be specified explicitly by the user, or ascertained based on previous viewing habits, industry standards, and/or any other factors. In addition, the technology disclosed herein takes into account the desire of the user to see spoiler transitions (which reveal outcomes and/or events before they are displayed) vs. discreet transitions (which do not).
In at least one embodiment, the disclosed technology is capable of generating different customized highlight shows for different users, based on factors that might make particular sporting events (or occurrences within such events) more exciting or less exciting for different categories of users. In at least one embodiment, the disclosed technology takes into account the degree to which a user is likely to be interested in a particular type of performance, team, league, player, division, conference, game, sport, genre, or other variable. In one example, different highlight shows might be generated for home team fans as opposed to visiting team fans or neutral fans. As used herein, a home team fan refers to a subscriber who is a fan of (or otherwise has an affinity for) the team that hosts the event; a visiting team fan refers to a subscriber who is a fan of (or otherwise has an affinity for) the team opposing the home team; and a neutral fan has no preference or affinity for either the home team or the visiting team. In some embodiments, the event may involve more than two teams and/or one or more individuals. In some embodiments, the customized highlight shows described herein can be generated separately for home team fans, visiting team fans, and neutral fans. When the event involves more than two teams and/or one or more individuals, the customized highlight shows described herein can be generated separately for fans of each of the multiple teams and/or individuals.
Customized highlight shows can also be generated for other groups of people. For example, customized highlight shows can be generated separately for different users based on each user's affinity for fast-paced games, games with large momentum swings, games with great historical context or importance, or other categories. For example, a customized highlight show can include segments that are of a type that a particular user finds exciting, such as a crash within an auto race or a fight during a hockey game.
In at least one embodiment, customized highlight shows include segments from a single event. In at least one other embodiment, customized highlight shows can include segments from more than one event, such as a number of games that took place on a given day or over some other period of time, or that are part of a series, or the like. The particular assembly of segments from the different events is selected based on the user's individual affinities and characteristics.
In at least one embodiment, customized highlight shows can be automatically constructed to focus on a particular player, team, division, league, playoff series, or the like. Customized highlight shows can be generated which show highlights for all of a user's favorite players, even if they are on different teams, or for players belonging to a user's fantasy team in a fantasy league. In such an embodiment, the system obtains information about which players are in the user's fantasy team league, so that appropriate selections can be made as to which highlights to include; these selections can be made based on excitement level and/or priority as described below, but can also take into account the degree to which the user's players were involved in each occurrence (play) of the game.
In at least one embodiment, customized highlight shows can be automatically constructed so that they present or reinforce a narrative or theme. The narrative may relate to a particular player, team, division, league, playoff series, or the like, or it may be external to any of those entities. Customized highlight shows can be generated which show highlights relating to such a narrative; alternatively, as described in the above-cited related applications and in the preceding paragraph, such customized highlight shows can relate to a user's favorite players, even if they are on different teams, or to players belonging to a user's fantasy team in a fantasy league, with selections again based on excitement level and/or priority as well as the degree to which the user's players were involved in each occurrence (play) of the game.
In at least one embodiment, customized highlight shows can be accompanied by scores, explanatory text, commentary, or other auxiliary content. Such content may be automatically generated, or may be composed by a human author. Such content can take any suitable form, such as audio (spoken commentary or voice-over), text (caption, title, or heading), graphical (icon or symbol on the screen), or the like. An example is a caption that appears at the beginning of a highlight segment, giving a context (such as a score, on-base situation, pitch count, possession, introductory text, or the like) for the highlight segment. Such auxiliary content may appear within the highlight show itself (such as before each segment of the highlight show), and/or it can appear on a screen that summarizes the overall highlight show, such as a navigation screen that allows a user to see individual segments within the highlight show, as illustrated in more detail below. Other arrangements are possible.
In at least one embodiment, such contextual information can be derived from any suitable source, and can include items such as the current game situation, the current ball situation, and/or the like. In at least one embodiment, a transition effect can be used between highlight segments; such transition effect can be informational or non-informational.
In at least one embodiment, such contextual information can contain spoilers elucidating what is about to be shown in the upcoming highlight segment. In an alternative embodiment, such contextual information can be devoid of spoilers and simply help set up the situation at the time the upcoming highlight begins. In at least one embodiment, a user can specify whether he or she wishes to see spoilers; in another embodiment, the system can make an automatic determination as to whether or not to include spoilers.
In at least one embodiment, the user can interact with a customized highlight show as it is being displayed. For example, the user can click on a link or perform some other input operation while a highlight segment is being displayed, to access more information about that portion of the sporting event. Clicking on the link can take the user to a more detailed description of the highlight segment, or to full video of the depicted event, or the like. A user interface can be provided that allows different highlight segments to be accessed, for example via a menu or other user interface element. In this manner, the displayed customized highlight show can be used as a mechanism for navigating within a more complete depiction of an event.
Customized highlight shows can be provided to users via any suitable mechanism. In at least one embodiment, such highlight shows can be shown via a website, app (mobile or desktop), set-top box, software application, or the like. Any suitable hardware can be used for presenting customized highlight shows, including a desktop computer, laptop computer, television, smartphone, tablet, music player, audio device, kiosk, set-top box, game system, wearable device, consumer electronic device, and/or the like. Such devices are generally referred to herein as client devices. Content can be transmitted to client devices via any suitable means, such as for example a computing network, cable network, satellite connection, wireless network, and/or the like. Any suitable video format can be used, such as for example MP4 or HTTP Live Streaming (HLS).
Content, including customized highlight shows, can be transmitted to a client device from a server, cable provider, on-demand provider, satellite provider, and/or the like. Alternatively, the described technology can be implemented on a stand-alone device, such as a DVR containing a recording of a sporting event, wherein the device generates a customized highlight show from such a locally stored recording and presents it to the user. Thus, the technology can be implemented without requiring a connection to a remote server or other source of content.
User Interface
Referring now to
The visual depiction of excitement level is optional, and is presented here for illustrative purposes. Excitement level is one possible factor that can be considered when determining which portions of the event to include in the customized highlight show. Other factors can be used, such as novelty, as well as personalized factors that relate to an individual user's affinity for a particular team, player, type of play, and/or the like, and such factors can be combined with the excitement level (or can modify the excitement level) if appropriate to determine which segments to include. In other embodiments, other metrics can be used instead of or in addition to excitement level. In at least one embodiment, a derived metric called “priority” is used to determine which portions of an event to include in the customized highlight show; priority can be derived from excitement level, novelty, and/or other factors, as described in more detail below.
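A minimal sketch of how such a priority metric might be derived is shown below; the weights, field names, and the simple affinity and narrative terms are assumptions chosen purely for illustration, not the actual derivation described elsewhere herein.

```python
def priority(segment, user_profile,
             w_excitement=0.5, w_novelty=0.2, w_affinity=0.2, w_narrative=0.1):
    """Return a priority score for one candidate highlight segment.

    Assumes excitement and novelty are already normalized to the range 0..1.
    """
    affinity = 0.0
    if segment["team"] in user_profile.get("favorite_teams", []):
        affinity = 1.0
    elif set(segment["players"]) & set(user_profile.get("favorite_players", [])):
        affinity = 0.8
    narrative_boost = 1.0 if segment.get("supports_narrative") else 0.0
    return (w_excitement * segment["excitement"]
            + w_novelty * segment["novelty"]
            + w_affinity * affinity
            + w_narrative * narrative_boost)
```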
In the example of
In at least one embodiment, a summary caption 107 is shown for each thumbnail 101. In at least one embodiment, the user can choose to omit such captions 107 so as to avoid “spoilers”.
Interface 100 also contains other controls that can be used to specify parameters for the customized highlight show. Buttons 105 allow the user to specify whether he or she is more interested in a baseline (neutral) view, or is a fan of one team or the other, or is a fan of a particular player on one of the teams; different sets of highlight segments can be selected based on which affinity is specified. Buttons 106 allow the user to specify how long the customized highlight show should be; in at least one embodiment, different sets of highlight segments may be chosen depending on how much time is available. In at least one embodiment, in response to the user clicking on one of buttons 105 or 106, a different customized highlight show is dynamically generated; accordingly, graph 103 and thumbnails 101, 102 can be automatically updated in response to such an input operation.
In at least one embodiment, a “why was I shown this” link can be provided (not shown in
As described herein, the highlight show can contain highlight segments from a single event or multiple events, and in at least one embodiment can contain highlight segments that describe or reinforce a narrative. Thus, as described herein, highlight segments can be selected in a manner that takes into account the degree to which they reinforce a particular narrative and incorporate one or more themes. The selection mechanism can also take into account other factors at the same time, including for example a determined excitement level for each highlight segment, as well as novelty and/or priority.
In other embodiments, the customized highlight show can simply be presented as a video (or audio track), without any graphical representation of levels, and without any indication of a timeline or the like. Such a presentation may be particularly useful in a context where a user is viewing the highlight show on a television rather than on a website. Such a presentation can still allow for user navigation and interaction, for example by allowing a user to press a “forward” or “back” button to skip to the next or previous highlight segment within the highlight show. Such a presentation can also allow a user to obtain more information or see more detailed highlights (or even switch to a view of the unexpurgated event itself, such as the entire game) by pressing an “Enter” button or the like (or performing some other suitable input operation).
Referring now to
In the example of
System Architecture
According to various embodiments, the system can be implemented on any electronic device, or set of electronic devices, equipped to receive, store, and present information. Such an electronic device may be, for example, a desktop computer, laptop computer, television, smartphone, tablet, music player, audio device, kiosk, set-top box, game system, wearable device, consumer electronic device, and/or the like.
Although the system is described herein in connection with an implementation in particular types of computing devices, one skilled in the art will recognize that the techniques described herein can be implemented in other contexts, and indeed in any suitable device capable of receiving and/or processing user input, and presenting output to the user. Accordingly, the following description is intended to illustrate various embodiments by way of example, rather than to limit scope.
Referring now to
Client device 206 can be any electronic device, such as a desktop computer, laptop computer, television, smartphone, tablet, music player, audio device, kiosk, set-top box, game system, wearable device, consumer electronic device, and/or the like. In at least one embodiment, client device 206 has a number of hardware components well known to those skilled in the art. Input device(s) 251 can be any component(s) that receive input from user 250, including, for example, a keyboard, mouse, stylus, touch-sensitive screen (touchscreen), touchpad, gesture receptor, trackball, accelerometer, five-way switch, microphone, or the like. Input can be provided via any suitable mode, including for example, one or more of: pointing, tapping, typing, dragging, gesturing, tilting, shaking, and/or speech. Display screen 252 can be any component that graphically displays information, video, content, and/or the like, including depictions of events, highlights, and/or the like. Such output may also include, for example, audiovisual content, data visualizations, navigational elements, graphical elements, queries requesting information and/or parameters for selection of content, or the like. Additionally or alternatively, display screen 252 may display status information in a wide variety of formats, including but not limited to status reports, summary reports, comparative reports, and the like. In at least one embodiment where only some of the desired output is presented at a time, a dynamic control, such as a scrolling mechanism, may be available via input device(s) 251 to change which information is currently displayed, and/or to alter the manner in which the information is displayed.
Processor 257 can be a conventional microprocessor for performing operations on data under the direction of software, according to well-known techniques. Memory 256 can be random-access memory, having a structure and architecture as are known in the art, for use by processor 257 in the course of running software for performing the operations described herein. Client device 206 can also include local storage (not shown), which may be a hard drive, flash drive, optical or magnetic storage device, web-based (cloud-based) storage, and/or the like.
Any suitable type of communications network 204, such as the Internet, can be used as the mechanism for transmitting data between client device 206 and various server(s) 202, 214, 216 and/or content provider(s) 224 and/or data provider(s) 222, according to any suitable protocols and techniques. In addition to the Internet, other examples include cellular telephone networks, EDGE, 3G, 4G, long term evolution (LTE), Session Initiation Protocol (SIP), Short Message Peer-to-Peer protocol (SMPP), SS7, Wi-Fi, Bluetooth, ZigBee, Hypertext Transfer Protocol (HTTP), Secure Hypertext Transfer Protocol (SHTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), and/or the like, and/or any combination thereof. In at least one embodiment, client device 206 transmits requests for data and/or content via communications network 204, and receives responses from server(s) 202, 214, 216 containing the requested data and/or content.
In at least one embodiment, the system of
In at least one embodiment, system 200 generates customized highlight shows including a narrative component by analyzing live feeds and/or recordings of events, including any or all of video content, audio content, play-by-play statistics and metrics, closed-captioning, and/or any other available data related to the event.
In one embodiment, system 200 includes one or more web server(s) 202 coupled via a network 204 to one or more client devices 206. Network 204 may be a public network, a private network, or a combination of public and private networks such as the Internet. Network 204 can be a LAN, WAN, wired, wireless and/or combination of the above. Client device 206 is, in at least one embodiment, capable of connecting to network 204, either via a wired or wireless connection. In at least one embodiment, client device may also include a recording device capable of receiving and recording events, such as a DVR, PVR, or other media recording device. Such recording device can be part of client device 206, or can be external; in other embodiments, such recording device can be omitted. Although
Web server(s) 202 include one or more physical computing devices and/or software that can receive requests from client device(s) 206 and respond to those requests with data, as well as send out unsolicited alerts and other messages. Web server(s) 202 may employ various strategies for fault tolerance and scalability such as load balancing, caching and clustering. In at least one embodiment, web server(s) 202 may include caching technology, as known in the art, for storing client requests and information related to events.
Web server(s) 202 maintain, or otherwise designate, one or more application server(s) 214 to respond to requests received from client device(s) 206. In at least one embodiment, application server(s) 214 provide access to business logic for use by client application programs in client device(s) 206. Application server(s) 214 may be co-located, co-owned, or co-managed with web server(s) 202. Application server(s) 214 may also be remote from web server(s) 202. In at least one embodiment, application server(s) 214 interact with one or more analytical server(s) 216 and one or more data server(s) 218 to perform one or more operations of the disclosed technology.
In an exemplary operation of system 200, one or more users 250 of client devices 206 make a request to view a customized highlight show for an event or set of events, which may include sporting event(s) or non-sporting event(s). In another embodiment, such customized highlight show can be presented to user 250 without a specific request having been made by user 250. In one embodiment, user 250 can specify, via input device(s) 251 at client device 206, certain parameters for the customized highlight show (such as, for example, what event/games/teams to include, how much time the user 250 has available to view the highlight show, and/or any other parameters). User preferences can also be extracted from storage, such as from user data 255 stored in storage device 253, so as to customize the highlight show without necessarily requiring user 250 to specify preferences. User preferences can be determined based on observed behavior and actions of user 250 (for example, by observing website visitation patterns, television watching patterns, music listening patterns, online purchases, and/or the like); in addition or alternatively, user preferences can be retrieved from previously stored preferences that were provided by user 250. Such user preferences may indicate which teams, sports, players, and/or types of events are of interest to user 250, and/or they may indicate what type of narrative user 250 might be interested in. Such preferences can therefore be used to guide selections of highlight segments for highlight shows.
Analytical server(s) 216, which may include one or more computing devices, analyze live or recorded feeds of play-by-play statistics related to one or more events from data provider(s) 222. Examples of data provider(s) 222 may include, but are not limited to, providers of real-time sports information such as STATS™, Perform (available from Opta Sports of London, UK), and SportRadar of St. Gallen, Switzerland. In one embodiment, analytical server(s) 216 generate different sets of excitement levels for events; such excitement levels can then be used (possibly in combination with other data) for selecting highlight segments according to the techniques described herein. The operations performed by analytical server(s) 216 are described in more detail in the above-cited related U.S. Utility Applications.
Application server(s) 214 receive the different sets of excitement levels generated by analytical server(s) 216, and use such data to generate customized highlight shows for user 250 according to the techniques described herein. In at least one embodiment, application server(s) 214 derive a priority metric for each sequence, possession, occurrence, string, or other portion of events; the priority metric can be derived from the excitement level and/or from other information. The priority metric can then be used to select highlight segments for inclusion in a customized highlight show. In other embodiments, application server(s) 214 use excitement level alone, and do not generate a priority. In at least one embodiment, application server(s) 214 take into account the degree to which various sequences, possessions, occurrences, strings, or other portions of events support a particular narrative, in order to determine whether to include such elements in the highlight show.
Content for highlight shows can come from any suitable source, including from content provider(s) 224 (which may include websites such as YouTube, MLB.com, and the like; sports data providers; television stations; client- or server-based DVRs; and/or the like). Alternatively, content can come from a local source such as a DVR or other recording device associated with (or built into) client device 206. In at least one embodiment, application server(s) 214 make the customized highlight show available to user 250, either as a download, or streaming content, or on-demand content, or by some other means. In another embodiment, application server(s) 214 send information to client device 206 to identify specific highlight segments for a highlight show, and client device 206 then retrieves or obtains the identified highlight segments for presentation to user 250. Such an embodiment avoids the need for video content or other high-bandwidth content to be transmitted via network 204 unnecessarily, particularly if such content is already available at client device 206.
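For instance, the kind of lightweight message such an embodiment might send could resemble the following sketch; the JSON layout and field names are purely hypothetical and are not a defined protocol.

```python
import json

def describe_highlight_show(show_id, segments):
    """Serialize segment identifiers and time codes instead of sending video itself."""
    payload = {
        "show_id": show_id,
        "segments": [
            {
                "event_id": s["event_id"],        # which recording the client should use
                "start_sec": s["start_sec"],      # offset into that recording
                "end_sec": s["end_sec"],
                "caption": s.get("caption", ""),  # optional spoiler-aware caption
            }
            for s in segments
        ],
    }
    return json.dumps(payload)
```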
For example, referring now to
Returning to
In at least one embodiment, one or more data server(s) 218 are provided. Data server(s) 218 respond to requests for data from any of server(s) 202, 214, 216, for example to obtain event data 254 and/or user data 255. In at least one embodiment, such information can be stored at any suitable storage device 253 accessible by data server 218, and can come from any suitable source, such as from client device 206 itself, content provider(s) 224, data provider(s) 222, and/or the like. Event data 254 can include any information describing any number of events, as well as occurrences, excitement levels, and/or other information. User data 255 can include any information describing users 250, including for example, demographics, purchasing behavior, web viewing behavior, interests, preferences, and/or the like.
Referring now to
The specific hardware architectures depicted in
In one embodiment, the system can be implemented as software written in any suitable computer programming language, whether in a standalone or client/server architecture. Alternatively, it may be implemented and/or embedded in hardware.
Referring now to
In at least one embodiment, excitement level results module 226 receives different sets of excitement levels related to one or more events, from excitement level generation module 230 of analytical server 216 (described below). Priority determination module 225 uses data from excitement level results module 226, along with other data concerning sequences, possessions, strings, or occurrences within the event, to generate priority metrics for each sequence, possession, string, or occurrence. In at least one embodiment, based on the priority metrics generated by priority determination module 225, along with user preferences obtained by user preferences module 224, highlight show generation module 227 generates customized highlight show(s) for presentation to user 250, according to the techniques described herein. In another embodiment, priority determination module 225 can be omitted, and highlight show generation module 227 generates customized highlight show(s) based on excitement levels received from excitement level results module 226, along with user preferences obtained by user preferences module 224.
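One possible way these modules could be chained is sketched below; the class and method names are hypothetical placeholders standing in for modules 224, 225, 226, and 227, not actual implementations.

```python
class HighlightShowPipeline:
    """Illustrative chaining of the application-server modules described above."""

    def __init__(self, excitement_results, priority_determination,
                 user_preferences, show_generator):
        self.excitement_results = excitement_results          # module 226
        self.priority_determination = priority_determination  # module 225 (may be None)
        self.user_preferences = user_preferences              # module 224
        self.show_generator = show_generator                  # module 227

    def build(self, user_id, event_ids):
        levels = self.excitement_results.get(event_ids)
        prefs = self.user_preferences.get(user_id)
        if self.priority_determination is not None:
            scores = self.priority_determination.score(levels, prefs)
        else:
            scores = levels  # fall back to raw excitement levels
        return self.show_generator.generate(scores, prefs)
```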
Referring now to
Referring now to
In at least one embodiment, device 300 includes memory 256, a processor 257, and a system bus 306 that couples various system components including memory 256 to processor 257. System bus 306 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures.
Memory 256 stores, in part, instructions and data for execution by processor 257 in order to perform the process described herein. Memory 256 includes computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) and random access memory (RAM). A basic input/output system (BIOS), containing the basic routines that help to transfer information between elements within device 300, such as during start-up, is typically stored in the ROM. RAM typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processor 257.
Device 300 further includes a mass storage device 308. Storage device 308, which may be implemented with a magnetic disk drive, an optical disk drive or flash memory, or the like, is a non-volatile storage device for storing data and instructions for use by processor 257. In one embodiment, storage device 308 stores the system software for implementing the processes described herein for purposes of loading to memory 256. Storage device 308 may be internal or external to device 300.
A user (such as user 250) may enter commands and information into device 300 through any suitable input device(s) 251. Input device(s) 251 can be any element that receives input from user 250, including, for example, a keyboard, mouse, stylus, touch-sensitive screen (touchscreen), touchpad, trackball, accelerometer, five-way switch, microphone, remote control, or the like. Input can be provided via any suitable mode, including for example, one or more of: pointing, tapping, typing, dragging, gesturing, tilting, shaking, and/or speech. These and other input devices are often connected to processor 257 through a user input interface 310 that is coupled to system bus 306, but may be connected by other interface and bus structures, such as a parallel port, game port, or a universal serial bus (USB). A display screen 252 and/or other type of output device are also connected to system bus 306 via an interface, such as a video interface 318. Display screen 252 can be any element that graphically displays information, video, content, and/or the like, including depictions of events, segments, and/or the like. In at least one embodiment, in addition to or instead of display screen 252, device 300 may also include other output devices, such as speakers 322 and printer 324, which may be connected through an output peripheral interface 320 or other suitable interface.
Device 300 may operate in a networked environment using logical connections to one or more remote devices 330. Each remote device 330 may be a desktop computer, laptop computer, television, smartphone, tablet, music player, audio device, kiosk, set-top box, game system, wearable device, consumer electronic device, or other common network node, and typically includes many or all of the elements described above relative to device 300. In at least one embodiment, when used in a networking environment, device 300 is connected to a remote network (such as network 204) through a network interface or adapter 328.
The components contained in the device of
Conceptual Architecture
In various embodiments, customized highlight shows including a narrative component can be generated in different ways. In general, as described in more detail below, the techniques involve identifying a number of highlight segments for an event, determining start/end times for the identified highlight segments, and presenting a customized highlight show including a narrative component to a user, including the identified highlight segments. In some embodiments, the full representation of the event (for example, an audiovisual recording of a sporting event) can be stored at a server; in other embodiments, it is stored at a client or at another location. Further details on the method are provided below.
Referring now to
Highlight show generation logic 606 identifies segments 601 by specifying start/end times for each segment 601. In at least one embodiment, such start/end times are measured with respect to a video clock, which measures the actual elapsed time since the beginning of an event. In other embodiments, start/end times may be measured with respect to other timekeeping measures. Further descriptions of the video clock, along with other timekeeping measures such as a game clock, are provided below.
In the example of
In the embodiment shown in
Referring now to
Any suitable data structure format can be used for storing and delivering time codes 604. For example, a set of discrete playlists can be provided, for serving specific requested highlight segments. Alternatively, a complete playlist for the entire highlight show can be provided, from which time codes 604 for specific highlight segments can be extracted.
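By way of illustration, a complete playlist of the kind just described might look like the following sketch, from which the time codes for any single segment can be pulled; the field names and layout are assumptions, not a prescribed format.

```python
# Hypothetical playlist for an entire highlight show, keyed by segment identifier.
PLAYLIST = {
    "event_id": "game-2014-10-01",
    "segments": [
        {"segment_id": 1, "start": "00:12:31", "end": "00:13:05"},
        {"segment_id": 2, "start": "00:47:10", "end": "00:47:58"},
        {"segment_id": 3, "start": "01:22:03", "end": "01:23:11"},
    ],
}

def time_codes_for(playlist, segment_id):
    """Extract the start/end time codes for one requested highlight segment."""
    for seg in playlist["segments"]:
        if seg["segment_id"] == segment_id:
            return seg["start"], seg["end"]
    raise KeyError(f"segment {segment_id} not in playlist")
```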
Referring now to
Referring now to
Referring now to
Data Structures
Any suitable data structures can be used for storing data concerning events. The following are examples of data structures that can be used for various sporting events; however, one skilled in the art will recognize that different types of data may be relevant to different types of events.
For illustrative purposes, methods are described herein in terms of application to a sporting event such as a baseball game. These descriptions make reference to particular exemplary data structures that are applicable to such sporting events, as described above. However, one skilled in the art will recognize that the claimed system and method can be applied to other types of events, and can use different data structures; therefore, the specifics of the following description are merely exemplary.
Referring now to
As shown in
In at least one embodiment, user 250 may be prompted to approve or decline such attempts to automatically obtain information about him or her.
User information may be obtained at any suitable time. If such information is obtained in advance (for example, when registering upon initial use of the system), such information can be stored, for example in user data 255 of server-based storage device 253. Alternatively, such information can be stored locally at client device 206. Stored user information can be updated as appropriate when new or additional information becomes available (for example, if additional tracking information is available, or if the user updates his or her profile). Alternatively, user information can be obtained at the time that user 250 requests a customized highlight show; in such a case, step 410 can take place after step 418. In at least one embodiment, no user information is collected, and the system generates the highlight show automatically without taking into account personal characteristics of the user.
A request is received 418 for a customized highlight show. In at least one embodiment, web server 202 receives the request, and passes the request to application server(s) 214 for processing, although in other embodiments, any suitable component can receive the request. The request can be made by user 250, for example at a website or by activating an app on device 206, or by any suitable means. The request may be for a highlight show for a particular event, or for any number of events. For example, in at least one embodiment, the described system generates a customized highlight show that includes a number of sporting events that took place on a given day; in such an embodiment, user 250 may request a “daily roundup” of sporting events that are determined to be of interest to him or her. Alternatively, user 250 may request a customized highlight show for a particular sport, such as baseball, causing the system to generate a customized highlight show for that sport, including those highlight segments of that sport that are likely to be of interest. Alternatively, user 250 may request a customized highlight show for a particular series, such as a playoff series, causing the system to generate a customized highlight show for that series, including those highlight segments of that series that are likely to be of interest. Alternatively, user 250 may request a customized highlight show for a single game, portion of a game, or other event. Alternatively, user 250 may request a customized highlight show for a single player, across a number of games, for a single game, or for a portion of a game. Alternatively, user 250 may request a customized highlight show relating to user's 250 fantasy team in a fantasy sports league, an opposing fantasy team, multiple fantasy teams or match-ups, and/or the like. Segments can also include non-event coverage for a given event, which may include, for example, pre-game, in-game, and post-game interviews, analysis, commentary, and/or the like. These segments can be appended to an overall highlight show in much the same way that individual event segments from within a specific event are.
In another embodiment, wherein “push” technology is enabled, a customized highlight show can be provided to user 250 without user 250 having specifically requested it. For example, the system can be configured to make a customized highlight show available to user 250 on a periodic basis (such as daily, weekly, or according to some other schedule), or automatically at the conclusion of any game that user 250 is likely to be interested in, or in response to some other trigger event. The customized highlight show can be transmitted to user 250 for immediate viewing, or placed on device 206 for viewing at user's 250 convenience. Alternatively, an email message or other message can be transmitted to user 250 with a link that permits viewing of the highlight show. User 250 may sign up in advance for such customized highlight shows to be provided; alternatively, in at least one embodiment, user 250 may be automatically enrolled based on a determination that user 250 would likely be interested in such customized highlight shows (for example, based on viewing or purchasing behaviors). As with the on-request mechanisms described above, such “push”-based highlight shows can be for a single event or any number of events.
The length of time available for the customized highlight show is determined 412. In at least one embodiment, user 250 can click on a button or link to select a length of time (as shown in
Dynamic excitement level(s) for the selected event(s) is/are determined 413. In at least one embodiment, this step includes generating excitement levels for possessions, occurrences, and/or strings within the event(s), so that the excitement level rises and falls during the course of the event(s). In at least one additional embodiment, this step includes generating excitement levels for interviews, analysis, and/or commentary before, during, and after a given event. The dynamic excitement level(s) can be combined with an excitement level for the event as a whole, which may be based on a determination of how interested a particular user 250 would be in that event. The dynamic excitement level(s) can further be combined with an outline of a storyline, themes, and/or narrative that upwardly adjusts the potential interest level for those occurrences, analysis, and interviews that contribute best to communicating the drama, intrigue, suspense, and excitement of the given event(s). In this manner, those occurrences that contribute the most to a given narrative will tend to be scored with higher interest levels, and thus be more likely to be included, than those occurrences that do not contribute to the narrative. The result of such combination, which may also take into account other available information about the sequence, possession, string, or event, is a priority value. Additional details concerning generation of priority values are provided below.
Techniques for determining excitement levels are described in the above-cited related U.S. Utility Applications. In at least one embodiment, the excitement level(s) are determined based on personal characteristics of user 250 as obtained in step 410; for example, if user 250 is a fan of a particular team or player, occurrences involving scoring by that team or player may be deemed to have a higher excitement level for user 250 than occurrences involving scoring by the other team or other players. In embodiments where the customized highlight show includes highlights from multiple events, the selection of which events to draw highlights from may depend on whether user's 250 teams are involved in one event or another. Thus, step 413 may involve determining that the overall excitement level for an event may be higher if user 250 is a fan of one of the teams involved in the event.
Some events may have high levels of excitement even for non-fans of one of the teams. For example, if an event is a World Series game, the game may be of interest because of the importance of the game, even if user 250 is not a fan of either team competing in the event.
One skilled in the art will recognize that priority is merely one possible metric for determining which highlight segments should be included in a particular highlight show. Other metrics can also be used, either instead of or in addition to priority. In at least one embodiment, for example, excitement level alone is used, without taking into account other factors.
Segments (including possessions, occurrences, and/or strings having high priority (based on excitement level and/or other factors)) are then identified 414 and selected 415. These steps may be performed by, for example, setting a threshold priority and determining which possessions, occurrences, and/or strings in the selected event(s) have a priority (for user 250) that meets or exceeds the threshold. The threshold priority can be selected in such a manner as to ensure that the generated highlight show is of the correct length. Alternatively, the steps may be performed by, for example, selecting a certain number of possessions, occurrences, and/or strings in the selected event(s) that have the highest priorities (for user 250).
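The following sketch shows one way such threshold-based selection could be implemented, including a simple loop that lowers the threshold until enough material is found to fill the allotted time; the field names, the 0.9 fill factor, and the step size are assumptions for illustration only.

```python
def select_by_priority(segments, target_length_sec, start_threshold=0.9, step=0.05):
    """Pick the highest-priority segments whose total duration fits the target length."""
    threshold = start_threshold
    while threshold > 0:
        candidates = [s for s in segments if s["priority"] >= threshold]
        total = sum(s["duration_sec"] for s in candidates)
        if total >= 0.9 * target_length_sec:
            # Enough material at this threshold: keep the highest-priority segments that fit.
            candidates.sort(key=lambda s: s["priority"], reverse=True)
            kept, running = [], 0
            for s in candidates:
                if running + s["duration_sec"] <= target_length_sec:
                    kept.append(s)
                    running += s["duration_sec"]
            return sorted(kept, key=lambda s: s["start_sec"])
        threshold -= step
    # Fallback: not enough material even with no threshold, so use everything available.
    return sorted(segments, key=lambda s: s["start_sec"])
```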
Once segments have been selected 415, a determination is made 416 as to the start/end times of the selected segments. For example, if an occurrence is a goal, the few seconds or minutes preceding the goal, wherein the play is set up, may be included in the highlight segment, and a few seconds or minutes of celebration after the goal may also be included. The determination as to when the highlight segment should start and stop can be made based on any suitable factors, including for example a determination of when the particular possession began, or when the play began, or the most recent clock stoppage, inning, at-bat, video analysis of camera cuts or angle changes, end of a sentence in the audio feed, or any other suitable demarcation. A change in excitement level may be used to determine suitable start/end points for the highlight segment. In at least one embodiment, start/end times for highlight segments are chosen based on the duration of a possession, or on some portion of a possession. Where appropriate, an instant replay of the occurrence may be included, which may show a different angle than the primary angle, or may be in slow motion; such instant replay may be extracted from the event content in the same manner as primary content. In addition, where appropriate, independent analysis of a given occurrence or one or more relevant interviews of a player, coach, analyst, fan, etc. may be included.
In at least one embodiment, start/end times can be identified 416 before segments have been selected, so that demarcations of segments that include occurrences may be made in advance. For example, a video of a sporting event may be available, along with start/end times for various possessions, plays, occurrences, innings, time-outs, and the like. Such data can be available from any suitable source, such as for example data provider(s) 222. Such data can be generated manually or in an automated fashion. In at least one embodiment, data available from data provider(s) 222 can be supplemented with derived data. For example, if data from data provider(s) 222 includes raw data such as descriptions, event text, event identifiers, and the like, additional information can be derived by applying natural language processing or other automated techniques to event text and/or descriptions.
In at least one embodiment, in some situations, the system adjusts start/end times based on the available time for the customized highlight show. For example, if it is determined that a particular occurrence has very high priority, but the start/end times of the occurrence are too long to reasonably fit in the allotted time for the customized highlight show, a shorter excerpt of the event (still including the occurrence but having shorter duration than the specified start/end times indicate) may be included in the customized highlight show. Conversely, start/end times can be adjusted to lengthen the highlight segment if more time is needed to fill the allotted time for the customized highlight show.
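A hypothetical helper for that kind of adjustment is sketched below; it assumes the key occurrence lies near the middle of the segment and that times are expressed in seconds, both of which are illustrative simplifications.

```python
def fit_segment(segment, max_duration_sec=None, min_duration_sec=None):
    """Shrink or stretch a segment around its key occurrence, within the given bounds."""
    start, end = segment["start_sec"], segment["end_sec"]
    duration = end - start
    if max_duration_sec is not None and duration > max_duration_sec:
        # Trim lead-in and celebration evenly, keeping the occurrence near the center.
        excess = duration - max_duration_sec
        start += excess / 2
        end -= excess / 2
    elif min_duration_sec is not None and duration < min_duration_sec:
        # Extend the lead-in to provide more context and fill the allotted time.
        start -= (min_duration_sec - duration)
    return {**segment, "start_sec": start, "end_sec": end}
```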
Further details and variations concerning the determination of start/end times for segments are described below.
Highlight segments are then assembled 417 to generate the highlight show. In at least one embodiment, highlight segments are assembled in chronological order, although in certain situations it may be preferable to use a different order. The highlight show can be supplemented, for example with automatically or manually generated segment transitions, and/or with captions, titles, descriptions, voice-overs, contextual information, and/or the like, for example to indicate the score, possession, game situation, or the like. Such supplemental information can be visual, text-based, graphical, audio, spoken word, or any combination thereof. User 250 may have the option to turn on or off such supplemental information. Spoiler and spoiler-free supplemental information can be provided, giving user 250 a choice as to which he or she prefers.
The highlight show is presented 418 to user 250. In at least one embodiment, this is done by displaying a video (with accompanying audio) to user 250, containing the highlight show. In another embodiment, user 250 may be presented with a screen that allows him or her to navigate to individual highlight segments or to see the entire highlight show via a “play all” function; examples of such screens are shown in
The method then ends 499.
In at least one embodiment, as depicted in
If, despite any adjustments and/or tolerances, the generated highlight show is not of the correct length, the threshold priority is adjusted 421, and steps 414 through 417, along with step 420, are repeated with the new threshold priority. In this manner, an iterative process is performed and, if necessary, repeated until a highlight show of acceptable length has been assembled.
Referring now to
Once the parameters of the event have been acquired, and all variables have been initialized, data associated with the event is loaded 452 so as to enable selection of segments. Such data can come from any suitable source, and can include any data that is or may be useful in selecting segments for inclusion in the highlight show. Examples of such data can be play-by-play data and/or the like. Such data can include anything relevant to the event or the segment, and may come from any source. For example, if the event is a baseball game, such data can include a current score, current inning, how many runners are on base, how many tweets have been communicated about this game and what was said, the current excitement level, the current novelty level, how the current segment fits a particular narrative, and the like.
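By way of example, a single play-by-play record of the kind described here might carry fields such as those in the following sketch; the field names and values are invented for illustration and do not represent any particular data provider's format.

```python
# Hypothetical per-occurrence record for a baseball game.
pbp_record = {
    "inning": 7,
    "half": "bottom",
    "score": {"home": 3, "away": 2},
    "runners_on_base": 2,
    "outs": 1,
    "description": "Two-run double to deep center field",
    "excitement": 0.82,
    "novelty": 0.40,
    "tweet_count": 1250,          # social-media volume about this game
    "supports_narrative": True,   # contributes to the chosen storyline
}
```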
Referring now to
Loaded data is then cleaned 453. This step can include correcting errors and reconciling inconsistencies in the data loaded in step 452. This can include, for example, determining that multiple occurrences (plays) should actually be combined into a single occurrence, or correcting OCR data that indicates an inconsistent score. It can also include, for example, correcting an error that may be introduced due to a video transition, such as a blur or fade, that was present in the video feed, or ensuring that the current score is correct when a play is reviewed and a goal is disallowed after it previously appeared on the on-screen scoreboard.
Referring now to
Next, occurrences (plays) are identified 454. Depending on the nature of the event, this can include, for example, determining when a whistle is blown, or a pitch is thrown, or a batter has completed an at-bat, or a change of possession takes place, or the like. Any available information can be used for this step. Data from pbpData, videoData, audioData, and/or any combination thereof can be used to identify 454 occurrences.
Once occurrences have been identified 454, play data is augmented and reconciled 455 with play-by-play data (for example, from pbpData). This can include, for example, correlating the play data to the play-by-play data from pbpData. If any discrepancies are found, they can be reconciled; for example if videoData or pbpData is missing an occurrence, and the other data source includes that occurrence, the discrepancy can be resolved by using information from the source that includes the occurrence. In addition, such discrepancies can be noted so that modifications can be made to the detection algorithms to improve detection in the future.
Step 455 can also include reconciling the game clock with elapsed time on the game video, and performing other reconciliation tasks.
Referring now to
Next, the play data is corrected 456; this can include correcting pbpData, videoData, and/or the like. Play data can be corrected based on some detected discrepancy, for example, between the OCR data and the play-by-play data. In at least one embodiment, this step can be a manual step, performed by a user, or it can be performed semi-automatically.
Next, highlight segments are identified 457 from the available data, using information such as the start and stop points of all occurrences, possessions, sequences, strings, and the like, within the video stream. In at least one embodiment, highlight segments can have different ingress/egress points; for example, if a highlight segment depicts a goal, the segment may be extended for a few seconds so as to include celebrations and/or fan reactions following the goal. Accordingly, certain attributes can be used for different types of segments, such as for example, a start-of-sequence extension, start-of-play attenuation, end-of-play attenuation, end-of-inning extension, end-of-game extension, and the like. The start and end of any segment can be automatically adjusted, via a “start offset” and/or “end offset”, based on any number of conditions or parameters, including overall duration of the highlight show and/or other factors. Thus, for example, depending on duration constraints for the highlight show, segments can be automatically extended or attenuated as appropriate. Other techniques can also be used for determining start and end points for highlight segments.
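For illustrative purposes, the following Python sketch shows one possible way to apply a "start offset" and "end offset" per segment type, as described above. The table of segment types and the numeric offsets are hypothetical examples only; actual values would depend on the sport and on the duration constraints of the highlight show.

# Hypothetical per-type offsets, in seconds.
SEGMENT_OFFSETS = {
    "goal":          {"start_offset": -3.0, "end_offset": +8.0},  # extend to include celebrations
    "routine_play":  {"start_offset": +1.0, "end_offset": -1.0},  # attenuate
    "end_of_inning": {"start_offset":  0.0, "end_offset": +5.0},  # end-of-inning extension
}

def apply_offsets(segment, segment_type):
    """Adjust a segment's start/end times according to its type."""
    offsets = SEGMENT_OFFSETS.get(segment_type, {"start_offset": 0.0, "end_offset": 0.0})
    return {
        "start": segment["start"] + offsets["start_offset"],
        "end": segment["end"] + offsets["end_offset"],
    }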
Referring now to
As mentioned above, each possession is a time-delimited portion of an event. An example is a set of occurrences in sequential order by one specific team.
As mentioned above, a sequence is a time-delimited portion of an event that includes one continuous time period of action, for example from a face-off or tip-off to the next whistle.
As mentioned above, a string is a series of occurrences that are somehow linked or related to one another. The occurrences may take place within a possession (defined above), or may span multiple possessions. The occurrences may take place within a sequence (defined above), or may span multiple sequences. In at least one embodiment, the occurrences within a string are in sequential order by one specific team.
Not every occurrence need be part of a possession, sequence, or string. Some occurrences, such as free throws, do not take place during any sequence or string.
Once highlight segments have been generated 457, one or more narrative(s) are created 458. Referring now to
For example, if a particular pitcher was recently traded to a team, his pitches can be emphasized as reinforcing the narrative highlighting his performance with his new team.
As another example, if a particular part of the game was very interesting for some reason, plays from that portion of the game can be emphasized as fitting the narrative.
Additional narratives that could be used, for example for a baseball game, include:
Whichever narrative(s) are identified, the system can determine 476 a “narrative bonus” for each occurrence in the event(s), indicating the degree to which the excitement level for the occurrence should be boosted because it relates to a narrative. The narrative bonus can be different for different narratives (depending on their relative interest level or importance), and can also be different depending on whether a particular occurrence is closely or tangentially related to the narrative.
Any number of narratives can take place simultaneously, and the final highlight show can include more than one narrative, or can offer the viewer a choice among several narratives.
Then, segment priorities are calculated 477. Here, based on the narrative bonus as well as other factors (such as excitement level), particular segments are identified as being the best choices for inclusion in the highlight show. Segment priorities can be determined for occurrences (plays), strings, possessions, and/or sequences.
Next, play novelty is stored 478. As mentioned above, novelty is a metric indicating a level of interest for an occurrence, independent of the context of the occurrence within the event. Thus, for example, novelty can be a measure of how unusual an occurrence is, as described in more detail below.
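As one purely illustrative combination of steps 476 through 478, the following Python sketch boosts a segment's priority by a narrative bonus and records its novelty. The weighting (simple addition of excitement and bonus), the field names, and the narrative data structure are assumptions made for this example and may differ from any actual embodiment.

def segment_priority(segment, narratives):
    """Compute and store a hypothetical segment priority from excitement,
    narrative bonus (step 476), and novelty (step 478)."""
    bonus = 0.0
    for narrative in narratives:
        if segment["id"] in narrative["related_segment_ids"]:
            bonus += narrative["bonus"]               # narrative bonus for related occurrences
    priority = segment["excitement"] + bonus          # step 477: priority from excitement and bonus
    segment["narrative_bonus"] = bonus
    segment["priority"] = priority
    segment["novelty"] = segment.get("novelty", 0.0)  # step 478: store play novelty
    return priority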
Tags are then added 459 to selected segments, to indicate that they should be included in the highlight show. In at least one embodiment, these tags are a type of metadata that flags a segment for inclusion, and also indicates the order in which the segments should appear within the highlight show.
In at least one embodiment, the tags define a highlight show that is assembled based on the segment priorities and on the play novelty. In this manner, the highlight show includes particular occurrences based on the excitement of the game at that point, as well as the importance of the occurrence to the narrative, and the novelty of the occurrence. For sporting events, different highlight shows can be assembled for a neutral fan, for a fan of one team or the other, or for fans of particular players; in each case, excitement level, narrative bonuses, and novelty may differ depending on the disposition of the fan (viewer). For example, a fan of one team may be interested in different narratives than a fan of the other team. In this manner, the techniques described herein allow the system to provide different highlight shows, and thereby tell different types of stories for different fans, providing a truly customized experience.
In at least one embodiment, other factors can be considered in assembling the highlight show. For example, so as to provide a suitable beginning and ending for the highlight show, it may be desirable to include the beginning and end of the event in the highlight show, even if the first and last occurrences of the event are not particularly exciting or would not otherwise be selected for inclusion.
In at least one embodiment, a narrative may dictate that the highlight show not be chronological. For example, a highlight show might include all double plays shown in a row, followed by all strikeouts, even though such an assembly would not be in chronological order.
Referring now to
Segment tags, which provide descriptors of segments, are then added 480 to the various highlight segments that may be included in the highlight show. Such tags represent various types of metadata that can be useful in constructing the highlight show or modifying it (for example, if the selected duration changes). Examples of segment tags include:
Next, output file(s) are generated 460, using the segment tags added in step 459. The file(s) may be stored locally, transmitted to a playback device, made available via cloud services, or the like. Referring now to
First, individual segment profiles are created 481; these can include, for example, profiles for plays, strings, possessions, and sequences. These are sorted 482 by segment start time, and a default JSON file is created 483 and saved, containing ordered segments and summary information for the event.
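The following is a simplified Python sketch, for illustration only, of steps 481 through 483: building segment profiles, sorting them by start time, and saving a default JSON file containing ordered segments and summary information. The JSON schema and field names shown here are hypothetical.

import json

def write_default_file(event_summary, segments, path="event_default.json"):
    """Create segment profiles, sort by start time, and save a default JSON file."""
    profiles = [
        {
            "type": s["type"],                  # play, string, possession, or sequence
            "start": s["start"],
            "end": s["end"],
            "priority": s.get("priority", 0.0),
            "tags": s.get("tags", []),
        }
        for s in segments
    ]
    profiles.sort(key=lambda p: p["start"])     # step 482: sort by segment start time
    with open(path, "w") as f:                  # step 483: save default file
        json.dump({"summary": event_summary, "segments": profiles}, f, indent=2)

Customer-specific file formats (step 484) could then be derived by transforming this default file into whatever structure a particular industry partner requires.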
Then, customer-specific file format(s) are created 484 from the default file. In at least one embodiment, different file formats may be made available to different industry partners, according to their particular needs. These can be derived from the default file(s) saved in step 483, and can be used to orchestrate consumer-specific highlight shows in a server-side or client-side environment.
Once step 484 is complete, user-specific highlight shows can be generated by assembling highlight segments 417. In at least one embodiment, users can be given a list of highlight shows from which to select, or they can be provided with a user interface including controls to customize highlight shows to their individual specifications. As described herein, highlight shows can be presented to users via any suitable means, such as for example by streaming from a server-side video delivery platform, selective playback from a client device (such as a set-top box) on which the video has been previously recorded, or the like.
In at least one embodiment, highlight shows can be created on-the-fly by intelligently sequencing through specific portions of the full recording of an event, based on the information contained in the default JSON file generated in step 483 and/or the preferences specified by the industry partner or customer in the file format(s) generated in step 484. In at least one embodiment, individual, discrete highlight shows can also be created by the industry partner.
In at least one embodiment, the system provides a mechanism by which a single highlight segment is presented. For example, based on a determination of user interests, excitement levels, event analysis, and the like, the system determines a recommended excerpt of an event (or multiple events). The recommended excerpt may be defined, for example, in terms of start and end times (measured according to a video clock or other clock), or in terms of a start time and playing through to the end of the event. The system can then present the recommended excerpt to the user, either as a selectable clip or in connection with other excerpts for other events. In at least one embodiment, such a technique allows the system to present a particularly exciting or interesting set of occurrences that take place within an event.
Occurrences, Strings, Possessions, and Sequences
In at least one embodiment, step 416 of the above-described method involves identifying start/end times for highlight segments to be included in the highlight show. In at least one embodiment, an event is subdivided into a series of segments, which can include occurrences (plays), strings, possessions, and sequences.
As mentioned above, an occurrence is something that takes place during an event; a string is a series of occurrences that are somehow linked or related to one another; a possession is a time-delimited portion of an event; and a sequence is a time-delimited portion of an event that includes one continuous time period of action.
Possessions can be defined as beginning with a specific action within an event and ending with another specific action; this definition yields a start and end time for the possession. Demarcation of start/end times of possessions can depend on the type of event. For example, for certain sporting events wherein one team may be on the offensive while the other team is on the defensive (such as basketball or football, for example), a possession can be defined as a time period while one of the teams has the ball. In sports such as hockey or soccer, where puck or ball possession is more fluid, a possession can be considered to extend to a period of time wherein one of the teams has substantial control of the puck or ball, ignoring momentary contact by the other team (such as blocked shots or saves). For baseball, possessions are clearly defined as corresponding to a half-inning. For other types of sporting events as well as for non-sporting events, the term “possession” may be somewhat of a misnomer, but is still used herein for illustrative purposes. One skilled in the art will recognize that the term is intended to apply broadly to any time-delimited portion of an event. Examples in a non-sporting context may include a chapter, scene, act, television segment, or the like. A possession can include any number of occurrences.
Similarly, sequences can also be defined as beginning with a specific action within an event and ending with another specific action, but including a continuous time period of action. For example, in a sporting event, a sequence may begin when action begins (such as a face-off, tipoff, or the like), and may end when the whistle is blown to signify a break in the action. In a sport such as baseball or football, a sequence may be equivalent to a play, which is a form of occurrence. A sequence can include any number of possessions, or may be a portion of a possession.
For illustrative purposes, the following are examples of ways in which possessions and sequences can be defined for various types of sporting events:
Identification of strings, possessions, and sequences can help construct a narrative in the context of generating a customized highlight show. Excitement levels, novelty levels, and priorities can be determined for strings, possessions, and sequences, and such excitement levels, novelty levels, and priorities (which may be customized for a particular user 250) can be used as factors in selecting which segments to include in the customized highlight show. Thus, in an embodiment where excitement levels can be determined for entire events (such as games), strings, possessions, sequences, and/or individual occurrences, and wherein such excitement levels can be customized for a particular user 250, a more coherent narrative that is more likely to be interesting to user 250 can be constructed.
For example, in some situations, a less exciting occurrence may be selected for inclusion because it is part of a string, possession, or sequence that is judged, on the whole, to have a high level of excitement. Such a methodology facilitates improved narrative cohesion, for example by showing a number of plays (occurrences) that led up to an exciting scoring opportunity at the end of a string, possession, or sequence. The individual plays may not individually have high excitement levels, but they are included because they lead up to an exciting conclusion of the string, possession, or sequence. In at least one embodiment, the priority value takes into account such situations, as described in more detail below.
In at least one embodiment, one or more of strings, possessions, and sequences are used as the time unit by which excitement levels and priorities are assessed and compared. For example, step 414 described above, in which occurrences having high priority are identified, can be performed by ranking strings, possessions, or sequences to identify those having the highest priority; then selecting individual occurrences within those identified strings, possessions, or sequences, based on the determined priority. Other techniques can be used for combining priority for strings, possessions, or sequences with priority for occurrences.
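For illustration, the following Python sketch shows one possible way to combine possession-level and occurrence-level priority as described above: rank possessions, then select individual occurrences within the top-ranked possessions. The ranking depth and occurrence threshold are arbitrary example values, and the dictionary layout is hypothetical.

def select_occurrences(possessions, top_n_possessions=5, occurrence_threshold=2.0):
    """Rank possessions by priority, then pick occurrences within the
    top-ranked possessions whose own priority meets a threshold."""
    ranked = sorted(possessions, key=lambda p: p["priority"], reverse=True)
    selected = []
    for possession in ranked[:top_n_possessions]:
        for occurrence in possession["occurrences"]:
            if occurrence["priority"] >= occurrence_threshold:
                selected.append(occurrence)
    return selected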
Referring now to
Table 500 is an example of the output of step 413 according to at least one embodiment, and includes the following columns:
Data such as that shown in
Excitement Level
Any suitable mechanism can be used for determining excitement level for an event, possession, string, sequence, or occurrence. In at least one embodiment, techniques as described in the above-cited related U.S. Utility Application Ser. No. 13/601,915 for “Generating Excitement Levels for Live Performances,” filed Aug. 31, 2012, are used. In general, in order to customize the generation of highlight shows for a user 250, the described system determines excitement levels based on particular characteristics of that user 250.
In at least one embodiment, various factors are considered when determining excitement level. Such factors may include, for example:
Any or all of the above factors can be used, singly or in any suitable combination, to determine excitement level for an event, possession, string, sequence, or occurrence.
In various embodiments, priority can be determined for events, possessions, strings, sequences, and/or occurrences, and can be used by components of the described system to determine which segments to include in a customized highlight show. Any suitable and available data can be used for deriving priority, including for example data available from data provider(s) 222.
At a high level, priorities for occurrences are a function of the novelty of a specific occurrence, the novelty and excitement level of the possession in which the occurrence took place, the excitement level of the event (for example, from the perspective of a neutral fan) at the time that the occurrence took place, and the contribution of the specific occurrence to the overall narrative of the event.
Priority for possessions and/or occurrences can be modified by the overall excitement level of the event as a whole. For example, in the context of sporting events, priority for possessions and/or occurrences can be modified by the excitement level of the game at the time that the occurrences took place. Such modifications can take into account a particular team or player as well as an affinity for that team or player on the part of user 250. For specific teams, priority can be modified based on the effect of the occurrence or possession on that team. For specific players, priority can be modified by the contribution of the player to a particular occurrence.
In some embodiments, other types of priority can also be used, including team priority, player priority, and/or fantasy priority. Each of these different categories of priorities is discussed in turn.
One skilled in the art will recognize that there are many ways to calculate priorities. Thus, the particular methodologies described below are intended to be illustrative, but not limiting of scope. Additional methodologies can be derived to suit different situations or individual preference profiles.
Possession Priority
Possession priority refers to a priority, or rank, for a given possession within an event such as a game. In at least one embodiment, this priority can be generated from the perspective of a neutral fan, but can be adjusted based on user affinity to one or the other of the teams involved. The following is an example of a possession priority calculation for football:
Thus, in this example, possession priority is calculated as:
possession_priority=Sum(Possession Bonus)
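As a minimal sketch of this calculation, the following Python fragment sums a set of bonus elements to obtain a possession priority. The particular bonus rules shown here are invented for illustration; the actual bonus elements would be those set by the game excitement algorithms described above.

def possession_priority(possession):
    """possession_priority = Sum(Possession Bonus), with hypothetical bonus rules."""
    bonus = 0
    if possession.get("resulted_in_score"):
        bonus += 3
    if possession.get("lead_change"):
        bonus += 2
    if possession.get("late_in_game"):
        bonus += 1
    return bonus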
Referring again to
In at least one embodiment, possession priority can be used for generating customized highlight shows, without necessarily computing occurrence priority (see below). For example, a customized highlight show can be generated by applying one or more of the following parameters:
Occurrence Priority (or Play Priority)
Occurrence priority refers to a priority, or rank, for a given occurrence, such as a play within a sporting event. In at least one embodiment, this priority can be generated from the perspective of a neutral fan. In at least one embodiment, occurrence priority is generated for the following:
In at least one embodiment, if an occurrence qualifies in both categories, the segment bonus values assigned to this individual occurrence within each qualifying category are added together to produce an aggregate bonus value for the occurrence. This aggregate bonus value is then multiplied by the current excitement rating (EQ) of the game to generate an occurrence priority value. This particular calculation is merely exemplary; one skilled in the art will recognize that occurrence priority can be calculated in other ways.
In at least one embodiment, the bonus value for each occurrence is the sum of the occurrence's calculated novelty value, plus any bonus modifier an occurrence might receive from being within a key possession. For occurrences that are not in a key possession, a bonus value is not calculated unless the individual occurrence novelty is greater than or equal to 2.
In summary:
Thus, in this example, occurrence priority (also referred to as play priority) is calculated as:
play_priority=(Bonus Value+Bonus Modifier)*EQ_neutral
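For purposes of illustration, the following Python sketch combines the rules described above: the bonus value is derived from the occurrence's novelty, a bonus modifier is applied when the occurrence falls within a key possession, no bonus is calculated for non-key-possession occurrences with novelty below 2, and the result is scaled by the current excitement rating. The key-possession threshold of 4 is borrowed from the team priority discussion below, and the modifier value is hypothetical.

def play_priority(occurrence, possession, eq_neutral, key_possession_threshold=4):
    """Illustrative occurrence (play) priority calculation."""
    novelty = occurrence["novelty"]
    in_key_possession = possession["priority"] >= key_possession_threshold
    if not in_key_possession and novelty < 2:
        return 0.0                                   # no bonus value is calculated
    bonus_value = novelty                            # novelty contributes the bonus value
    bonus_modifier = 1 if in_key_possession else 0   # hypothetical key-possession modifier
    return (bonus_value + bonus_modifier) * eq_neutral

A team-specific variant can be obtained in the same way by substituting EQ_team (and team possession priority) for the neutral values, as reflected in the team_play_priority formula given below.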
In at least one embodiment, occurrence priority can be used for generating customized highlight shows, once occurrence priority has been calculated. For example, a customized highlight show can be generated by applying one or more of the following parameters:
Similar techniques can be used to determine sequence priority (referring to a priority, or rank, for a given sequence), and/or string priority (referring to a priority, or rank, for a given string).
Team Priority
Team priority refers to a priority, or rank, for a given occurrence, possession, string, or sequence within an event from the perspective of a fan of one or the other of the teams playing. In other words, the priority takes into account user affinity to one or the other of the teams involved. In at least one embodiment, team priority is calculated in the same manner as occurrence priority or possession priority, except that a fan-perspective excitement rating (referred to as EQ_away or EQ_home) is used to compute occurrence priority and possession priority, rather than the neutral-perspective statistic, EQ_neutral. Further adjustments can also be made; for example, in at least one embodiment, a +2 Possession_Bonus is added to any score by the fan's favorite team.
The following is an example of a team possession priority calculation for football:
Team possession priority is the sum of a number of individual bonus elements that are set by specific possession stats calculated by the main game excitement algorithms:
Thus, in this example, team possession priority is calculated as:
possession_priority=Sum(Possession Bonus)
In at least one embodiment, team occurrence priority is generated for the following:
In at least one embodiment, if an occurrence qualifies in both categories, the segment bonus values assigned to this individual occurrence within each qualifying category are added together to produce an aggregate bonus value for the occurrence. This aggregate bonus value is then multiplied by the current excitement rating (EQ) of the game (from the perspective of that team) to generate a team occurrence priority value. This particular calculation is merely exemplary; one skilled in the art will recognize that team occurrence priority can be calculated in other ways.
In at least one embodiment, the bonus value for each occurrence is the sum of the occurrence's calculated novelty value, plus any bonus modifier an occurrence might receive from being within a key possession (team_possession_priority>=4). For occurrences that are not in a key possession, a bonus value is not calculated unless the individual occurrence novelty is greater than or equal to 2.
In summary:
Bonus Modifier for occurrences within possessions where team_possession_priority>=4 (Key Possessions):
Thus, in this example, occurrence priority (also referred to as play priority) is calculated as:
team_play_priority=(Bonus Value+Bonus Modifier)*EQ_team
In at least one embodiment, team occurrence priority can be used for generating customized highlight shows based on team occurrences, once occurrence priority has been calculated. For example, a customized highlight show based on specific team occurrence priorities can be generated by applying one or more of the following parameters:
Player Priority
Player priority refers to a priority, or rank, for a given occurrence, possession, string, or sequence within an event involving a specific player from the perspective of a fan of the player's team or a neutral fan, i.e., a user who is a fan of a specific player but not the team on which the player plays. In other words, the priority can take into account user affinity for the player's team, if that preference happens to be relevant. In at least one embodiment, player priority is calculated as the product of player novelty (novelty_player_offense or novelty_player_defense) and the current team excitement rating (EQ_away, EQ_home, or EQ_neutral), as follows:
player_priority=novelty_player*EQ_team
In at least one embodiment, player priority can be used for generating customized highlight shows, once player priority has been calculated. For example, a customized highlight show based on specific player priorities can be generated by applying one or more of the following parameters:
Fantasy Priority
Fantasy priority refers to a priority, or rank, for a given occurrence or possession within an event involving a specific fantasy player (i.e., a player belonging to a user's team in a fantasy league), from the perspective of a fan of the player's team or a neutral fan. In at least one embodiment, fantasy priority is calculated as the product of abs(player priority) and occurrence priority (i.e., play priority) divided by 100, as follows:
fantasy_priority=[abs(player_priority)*play_priority]/100.0
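The two formulas above translate directly into the following Python sketch, provided purely for illustration; the inputs (novelty_player, EQ_team, play_priority) are assumed to have been computed as described earlier.

def player_priority(novelty_player, eq_team):
    # player_priority = novelty_player * EQ_team
    return novelty_player * eq_team

def fantasy_priority(player_prio, play_prio):
    # fantasy_priority = [abs(player_priority) * play_priority] / 100.0
    return (abs(player_prio) * play_prio) / 100.0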
In at least one embodiment, fantasy priority can be used for generating customized highlight shows for a specific roster of fantasy players, once fantasy priority has been calculated. For example, a customized highlight show based on specific fantasy priorities can be generated by applying one or more of the following parameters:
Similar rules can be applied to fantasy players on the opponent's team roster to create a fantasy highlight show that includes segments for both teams and yields a highlight show of a virtual fantasy game.
In various embodiments, any or all of the above types of priority can be used to determine which segments to include in a highlight show.
Referring now to
Referring now to
In at least one embodiment, rather than using a threshold priority to determine which segments to include, the system may select the N most exciting occurrences (plays) for inclusion in a highlight show, based on occurrence priority, possession priority, or any other priority. Referring now to
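As a brief illustration of this alternative, the following Python sketch selects the N highest-priority occurrences and returns them in chronological order for assembly; the dictionary fields are hypothetical.

import heapq

def top_n_occurrences(occurrences, n, key="priority"):
    """Select the N highest-priority occurrences (plays) and return them
    in chronological order for assembly into the highlight show."""
    best = heapq.nlargest(n, occurrences, key=lambda o: o[key])
    return sorted(best, key=lambda o: o["start"])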
As described above, the system can use fantasy priority to determine which segments to include. A fantasy roster can be consulted or generated, indicating which players are included in a user's fantasy league team. An example of an excerpt of such a roster is shown in table 535 of
Referring now to
Narrative and Theme
As described above, in at least one embodiment, the system constructs the customized highlight show so that it provides a cohesive narrative depicting an event (or set of events). The narrative provides a structural framework for presenting highlight segments in a manner that navigates from a beginning to a middle to an end. For example, the narrative can follow a journey from introduction to storyline development to suspenseful challenge(s) to challenge resolution(s) to closure with any number of “surprising developments” layered throughout. Such narrative can be automatically or manually constructed, and can be based, for example, on availability of particular segments, external factors, historical context, and/or the like. Individual segments are automatically selected so that they support the narrative; in at least one embodiment, a determination as to whether to include a particular segment is based, at least in part, on its ability to support the narrative.
In at least one embodiment, the system identifies those occurrences, possessions, strings, and/or sequences, that precede an exciting occurrence, possession, string, or sequence and are part of the set-up to that exciting occurrence, possession, string, or sequence. An example is a baseball player who gets walked immediately before a home run. These precedent plays may not necessarily be exciting in-and-of-themselves, but they may be included in the customized highlight show based on a determination that they contribute to the narrative of the overall sporting event, and in particular are precursors to an exciting event such as a home run. Accordingly, in at least one embodiment, the system described herein takes into account such narrative contributions by making associations between exciting occurrences and those occurrences that precede the exciting occurrence and are part of the narrative that leads up to the exciting occurrence.
Various techniques can be used for improving and enhancing the narrative quality of the customized highlight show. One approach is to take into account a notion of theme when constructing the highlight show. For example, a theme may emerge when a particular type of play appears several times within a single sporting event, or if a player has a particularly good game, or if some unusual event or coincidence occurs. Such identification of a theme can affect the selection of segments for the highlight show: for example, if a player has a particularly good game, the highlight show may be constructed so that it includes several occurrences (plays) involving that player. In at least one embodiment, this is accomplished by adjusting priorities for occurrences, possessions, strings, and/or sequences that reinforce the identified theme.
In at least one embodiment, theme can be identified in an automated way, by analyzing statistics associated with the event. Alternatively, theme can be specified manually. Unusual or remarkable patterns can be identified by virtue of their divergence from normally expected statistical distributions. Once a theme has been identified, priorities are automatically adjusted to emphasize the theme.
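By way of example only, the following Python sketch flags a theme when the observed count of a play type diverges markedly from its expected count. The z-score test, the threshold value, and the Poisson-style spread estimate are assumptions for this illustration; an actual embodiment may use any statistical measure of divergence.

def detect_theme(play_counts, expected_counts, z_threshold=2.0):
    """Flag play types whose observed count diverges markedly from expectation."""
    themes = []
    for play_type, observed in play_counts.items():
        expected = expected_counts.get(play_type, 0.0)
        std = max(expected ** 0.5, 1.0)          # rough Poisson-style spread
        z = (observed - expected) / std
        if z >= z_threshold:
            themes.append(play_type)
    return themes

# Example: an unusually high number of home runs suggests a theme
print(detect_theme({"home_run": 9, "strikeout": 14},
                   {"home_run": 2.3, "strikeout": 15.0}))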
In at least one embodiment, the determination as to whether an identifiable theme should be used in constructing a highlight show can depend on any of several factors. For example, if the specified allotted time for the highlight show is insufficient to effectively construct a narrative including a theme, the theme can be abandoned for that highlight show.
Game Clock and Video Clock
As described above, in at least one embodiment, start/end times for highlight segments can be defined in terms of elapsed time since the beginning of an event. Such a measure is referred to herein as a “video clock”, although it can also be referred to as a “real-time clock”. Such video clock, which measures actual elapsed time since the beginning of an event, is in contrast to a game clock. Many sports (such as basketball, hockey, or football) have a game clock that indicates the time remaining in the game, but such a game clock does not correspond to real elapsed time because of stoppages, time-outs, intermissions, and/or the like. In other sports, such as baseball, there is no game clock.
In many situations, event data from sources such as data provider(s) 222 is specified in terms of game time (for sports such as basketball, hockey, soccer, or football), or in terms of inning (for baseball). It is beneficial, therefore, to correlate such identifications of game time or inning with actual elapsed time, so that start/end times for highlight segments can be determined accurately. Any of a number of techniques can be used for determining such correlations, including the following, either singly or in any combination:
In at least one embodiment, the described system provides a mechanism by which a user can watch highlights of an event while the event is still in progress. For example, a user may be interested in watching a sporting event that is currently in progress. However, instead of watching the event live, the user may wish to start watching highlights of the first part of the event, and then catch up to the live event, so that he or she can watch the latter part of the event live. This format allows the user to view important occurrences from the first part of the event, which he or she would otherwise have missed had he or she merely started watching live.
In at least one embodiment, the described system provides an option for real-time catch-up viewing, wherein a highlight show is presented for a first part of an event, transitioning to a full (unexpurgated) version from a certain point until the end of the event. The user selects an event to watch. If the event is currently in progress, the system generates and displays a customized highlight show for the event up to that point in time, using any or all of the techniques described above. Upon conclusion of the highlight show, a transition may be presented and the system can then begin showing the unedited event from that point on, either live or delayed/recorded.
As described above, the user can be prompted to specify a total length of time for the highlight show; alternatively, the system can automatically select the length of time based on any suitable factors, such as for example the amount of time remaining in the live event.
Additional occurrences may take place in the event while the user is still watching the highlight show. In at least one embodiment, the system can dynamically add highlight segments depicting such additional occurrences to the highlight show, even as the highlight show is being viewed. In at least one embodiment, the system continues to analyze occurrences as they take place to determine whether highlight segments depicting such occurrences should be added to the highlight show. Dynamically adding such highlight segments extends the length of the highlight show; thus, in at least one embodiment, newly added highlight segments can replace previously selected highlight segments that are deemed to be of less interest (assuming the previously selected highlight segments have not yet been shown to the user). Alternatively, the user can be prompted as to whether he or she would like the highlight show to be extended by the dynamic addition of selected highlight segments.
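A minimal Python sketch of this dynamic updating behavior is shown below, for illustration only. It assumes each segment carries hypothetical 'priority' and 'watched' fields; when extension is not allowed, a newly detected segment replaces the least interesting segment that has not yet been shown to the user.

def update_live_show(show, new_segment, allow_extension=False):
    """Add a newly detected highlight segment to a show that is being viewed."""
    unwatched = [s for s in show["segments"] if not s["watched"]]
    if allow_extension or not unwatched:
        show["segments"].append(new_segment)            # extend the highlight show
        return show
    least = min(unwatched, key=lambda s: s["priority"])
    if new_segment["priority"] > least["priority"]:
        # Replace the least interesting not-yet-watched segment
        show["segments"][show["segments"].index(least)] = new_segment
    return show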
Such an embodiment allows a user to watch a condensed version of a game or other event to a certain point, and then to see the remainder of the event live. Such a technique can be used even if the event is not currently in progress: the user can watch the highlight show that covers the event to a certain point, and can then transition to an unedited version of the event from that point on, whether in a live format or in a delayed/recorded format.
In at least one embodiment, the user can be presented with multiple highlight shows for events in progress. For example, he or she can choose to see a highlight show for every game that is currently available for him or her to watch, or every game within a particular sport or league, or for teams in a particular geographic area. Then, after viewing some or all of the highlight shows, the user can make a decision as to which game to watch live. Other variations are possible.
Exemplary User Interface for Customizing Highlight Shows
In at least one embodiment, such architectures, systems, and methods can be augmented and/or supplemented as described herein to provide techniques for enabling improved interaction with customized highlight shows. Any of the techniques described herein can be used in conjunction with the methods and systems described in the above-referenced related applications, and/or in other systems.
In at least one embodiment, the system and method described herein use a multifactor approach to select highlights from a plurality of games (or other events) occurring over a specified period of time, such as a week. In addition, as described in the above-referenced related application, the system can take into account an excitement score for each highlight, when determining whether to include the highlight in a customized highlight show. The excitement score can measure overall excitement, and/or excitement from the point of view of a fan from one or the other of the teams involved in the game. In at least one embodiment, only those highlights having an excitement level above a particular threshold level are selected for inclusion in the customized highlight show. In at least one embodiment, the threshold level can be chosen based on the overall time available for the customized highlight show; in other words, a higher threshold level might be used when the customized highlight show should be shorter.
Although the user interface of
As shown, the screen shot 700 may be divided into a title area 702, a menu area 704, a display area 706, a time indicator area 708, and a playback control area 710. The title area 702 may indicate the event(s) from which the highlights being viewed have been obtained. The menu area 704 may be used to control which highlights are selected as part of the customized highlight show. In the display area 706, the highlights may be presented, for example, in the form of video and/or images. The time indicator area 708 may indicate the total length of the customized highlight show, and the position, within the customized highlight show, of the highlight being viewed. The playback control area 710 may have controls that facilitate user navigation through the customized highlight show.
The time indicator area 708 may have a time indicator that indicates the total time of the customized highlight show, and the portion currently being viewed. The time indicator may be represented as a timeline 712, which may be a progress bar representing the entire customized highlight show. To the right of timeline 712 is an indication of current position 714 as well as total length 716 of the customized highlight show. In at least one embodiment, a current position of timeline 712 can be indicated by a slidable thumb or other slider (not shown). The user may have the option to click and drag the slider to the left or right to advance or rewind the customized highlight show.
In this example, timeline 712 also includes indicators 718 indicating temporal positions of portions (i.e., “highlights”) of the customized highlight show that are determined to be of particular interest. For example, the indicators 718 may be used to indicate plays that have a sufficiently high excitement level, involve particular players of interest, and/or support a particular narrative that accompanies the customized highlight show.
A pause button 720 may be used to pause playback of the customized highlight show, and may also optionally be located in the time indicator area 708. Within the playback control area 710, a skip back button 722 may cause the previous highlight to be played again, and a skip forward button 724 may cause the display to skip to the next highlight.
A variety of menus may be displayed within the menu area 704. Such menus may be used to customize the highlights that are selected for inclusion in the customized highlight show and/or the manner in which the highlights are shown. This may be done by selecting one or more attributes of the source content that are to be preferentially included in the customized highlight show. In the context of sporting events, attributes may include, but are not limited to:
Those of skill in the art will recognize that other attributes may also be selected and used to generate a customized highlight show for a sporting event. Further, those of skill in the art will recognize that other attributes may be selected for events of other types, such as, but not limited to, those listed previously.
In the exemplary screen shot 700 of
The user may also have the option to specify whether he or she is a fan of one of the two teams playing in the game, or a neutral fan. For example, the highlights menu may provide a visiting team fan option 930, a home team fan option 932, and a neutral fan option 934, as shown. Radio buttons may be used such that a user is only able to select one of these options.
After the user makes selections of the viewing length and/or team allegiance in the highlights menu 732, the techniques described above and/or in the above-referenced related applications may be used to select highlights based on excitement level and/or other factors. The user's fan allegiance, the viewing duration, and/or other factors may be taken into account to automatically assemble the customized highlight show. The customized highlight show may then be presented to the user, and the user may interactively control the playback using the various user interface elements shown in the Figures.
The play type viewing length 1040 of the resulting customized highlight show may be shown next to each of the options of the plays menu 734. Thus, the user can easily see how long it would take to view all of the plays of a given type in the game.
After the user makes selections of the team allegiance and/or one or more play types in the plays menu 734, the techniques described above and/or in the above-referenced related applications may be used to select highlights that match the specified parameters, taking into account excitement level and/or other factors (including the user's fan allegiance), to assemble a customized highlight show. The customized highlight show may then be presented to the user. The user may interactively control the playback using the various user interface elements shown in the Figures.
In at least one embodiment, an excitement level 1120 can be determined for each of the players 1110. The excitement level 1120 may be based on the plays from the specific game and/or other factors, as detailed above and/or in the above-referenced applications. The excitement level 1120 for a player 1110 may be based on the level of excitement expected to be experienced by a user watching plays in which that player 1110 participates. As shown in
After the user makes selections of one or more of the players in the players menu 736, the techniques described above and/or in the above-referenced related applications may be used to select highlights that match the specified parameters, taking into account excitement level and/or other factors, to assemble a customized highlight show. The customized highlight show may then be presented to the user. The user may interactively control the playback using the various user interface elements shown in the Figures.
According to some embodiments, multiple attributes (for example, teams, players, and types of plays) may be selected at the same time. The customized highlight show may then be generated based on all of the attributes selected. One such example will be shown and described in connection with
After the user makes selections of the team, the viewing length, one or more play types, and/or one or more players from the team menu 1230, the duration menu 1232, the plays menu 1234, and/or the players menu 1236, the techniques described above and/or in the above-referenced related applications may be used to select highlights that match the specified parameters, taking into account excitement level and/or other factors, to assemble a customized highlight show. In some examples, each highlight of the customized highlight show may include all of the selected team(s), play type(s), and player(s). In other embodiments, the system may include highlights that feature fewer than all (for example, only one or two) of the selected attributes in order to provide a customized highlight show of the desired duration. The customized highlight show may then be presented to the user. The user may interactively control the playback using the various user interface elements shown in the Figures.
Customized Highlight Shows for Fantasy Teams
Recently, fantasy sports have become increasingly popular. In a fantasy sport, each participant assembles an imaginary team (a “fantasy team”) of real players in some sport. As the real players play games in the sport, statistics of the real players are aggregated for each of the fantasy teams. Thus, a participant's overall standings and performance in the fantasy sport are tied directly to the aggregate performance of the individual real players on the participant's fantasy team. Techniques are known for drafting (selecting) players for one's fantasy team, as well as for combining or aggregating statistical measures of performance for the players on one's fantasy team.
A participant in a fantasy sport may therefore have an interest in many different real-world games, because the participant has players on his or her fantasy team from many different real-world teams. In particular, the participant may be interested in seeing highlights featuring the various players on his or her fantasy team, but may not have the time or inclination to watch the large number of complete real-world games involving those players. Manually searching for and viewing such highlights can be difficult, time-consuming, and/or impossible.
Accordingly, in at least one embodiment, a system and method can be provided that automatically assembles a customized highlight show featuring plays that involve or feature particular players on a participant's fantasy team. The participant can manually input the names of players on his or her fantasy team, and/or the system can automatically extract this information from a website or other resource that runs or manages the fantasy sport. Based on this information, the system can automatically assemble the customized highlight show.
In at least one embodiment, the system and method described herein uses a multifactor approach to select highlights from a plurality of games (or other events) occurring over a specified period of time, such as a week. In at least one embodiment, the system can be used to automatically select highlights that feature at least one player that is a member of the viewer's fantasy team. In addition, as mentioned above, the system can take into account an excitement score for each highlight, when determining whether to include the highlight in a customized highlight show. The excitement score can measure overall excitement, and/or excitement from the point of view of a person interested in the particular player from the fantasy team. As mentioned above, in at least one embodiment, only those highlights having an excitement level above a particular threshold level are selected for inclusion in the customized highlight show. In at least one embodiment, the threshold level can be chosen based on the overall time available for the customized highlight show; in other words, a higher threshold level might be used when the customized highlight show should be shorter.
As in the screen shot 700 of
Within the title area 1302, each fantasy team's name 1342 may be indicated. Score 1344 may indicate a fantasy sports score, representing the aggregate, comparative performance of the players on each fantasy team. An indicator 1346 may indicate how many players on each fantasy participant's team have not yet played this week, as of the current position in the displayed video.
Within the display area 1306, the highlights may be presented, for example, in the form of video and/or images. Selection of highlights to be included in the customized highlight show may be performed using techniques described above and/or in the above-referenced related applications.
The time indicator area 1308 may have a time indicator that indicates the total time of the customized highlight show, and the portion currently being viewed. The time indicator may be represented as a timeline 1312, which may be a progress bar representing the entire customized highlight show. To the right of timeline 1312 is an indication of current position 1314 as well as total length 1316 of the customized highlight show. In at least one embodiment, a current position of timeline 1312 can be indicated by a slidable thumb or other slider (not shown). The user may have the option to click and drag the slider to the left or right to advance or rewind the customized highlight show.
In this example, timeline 1312 also includes a first set of indicators 1318 and a second set of indicators 1319 that, together, indicate temporal positions of portions (i.e., “highlights”) of the customized highlight show that are determined to be of particular interest. For example, the first set of indicators 1318 and the second set of indicators 1319 may be used to indicate plays that have a sufficiently high excitement level, involve particular players of interest, and/or support a particular narrative that accompanies the customized highlight show.
As shown in
A pause button 1320 may be used to pause playback of the customized highlight show, and may also optionally be located in the time indicator area 1308. Within the playback control area 1310, a skip back button 1322 may cause the previous highlight to be played again, and a skip forward button 1324 may cause the display to skip to the next highlight.
A variety of menus may be displayed within the menu area 1304. Such menus may be used to customize the highlights that are selected for inclusion in the customized highlight show and/or the manner in which the highlights are shown. This may be done by selecting one or more attributes of the source content that are to be preferentially included in the customized highlight show, as with the menus of
For example, a “My Team” menu 1330, a “Matchups” menu 1332, and an “Other Players” menu 1334 may be displayed in the menu area 1304. These menus will be further described in connection with
In at least one embodiment, an “All Games” option 1420 is provided, which results in generation of a customized highlight show including all fantasy games in the fantasy league. Highlights can be presented in chronological order, or they can be presented for each fantasy game (i.e. head-to-head competition) sequentially. In at least one embodiment, a higher threshold excitement level can be used for the “All Games” option, so as to ensure that the overall customized highlight show is not too long.
In at least one embodiment, each fantasy game can itself have an overall excitement level. In at least one embodiment, the overall length of each highlight show can vary depending on the excitement level for that fantasy game. In at least one embodiment, the excitement level 1430, represented as a numeric value, can be displayed in menu 110 alongside the name of each fantasy game. In the example of
In some fantasy sports leagues, a participant can designate a subset of his or her players to be currently active. These players contribute to the participant's performance in the fantasy sport, while the non-selected players are “on the bench.” Typically, the participant can periodically (e.g. weekly) change the selection of which players should be “on the bench”, for example if an active player is injured. Selecting “All Bench” 1540 may generate a customized highlight show containing highlights featuring players that are currently “on the bench” on the participant's fantasy team. In at least one embodiment, another option can be provided (not shown) for selecting individual benched players, for example, by listing players on the bench for that fantasy sports team underneath the All Bench 1540 selection.
Excitement levels for the players on the bench can be shown in the “My Team” menu 1330, as with the players 1510 listed as starters in the “My Team” menu 1330. The names of the players may be displayed, for example, in connection with excitement levels as on the “My Team” menu 1330. If desired, check boxes or other selection indicators like the check boxes 1250 of
For any of the highlight shows generated using the “My Team” menu 1330, the user can select whether the highlight show should span this week 1550, last week 1560, or all season 1570, by clicking on the appropriate radio button 1580 at the top of the “My Team” menu 1330. Other time periods, and/or a customizable time period selector (not shown), can also be provided if desired.
As shown in
The names of the players may be displayed, for example, in connection with excitement levels as on the “My Team” menu 1330. If desired, check boxes or other selection indicators like the check boxes 1250 of
In this manner, the user may be able to select one or more players from any team(s). A customized highlight show may then be generated with highlights in which the player(s) participated from the selected time frame.
In alternative embodiments (not shown), fantasy teams may be used in connection with other events, such as musical performances, movies, and the like. Further, a customized highlight show for a fantasy sports team may be constructed based on a wide variety of user-selectable attributes, including but not limited to any of the attributes listed above, in connection with
The present system and method have been described in particular detail with respect to possible embodiments. Those of skill in the art will appreciate that the system and method may be practiced in other embodiments. First, the particular naming of the components, capitalization of terms, the attributes, data structures, or any other programming or structural aspect is not mandatory or significant, and the mechanisms and/or features may have different names, formats, or protocols. Further, the system may be implemented via a combination of hardware and software, or entirely in hardware elements, or entirely in software elements. Also, the particular division of functionality between the various system components described herein is merely exemplary, and not mandatory; functions performed by a single system component may instead be performed by multiple components, and functions performed by multiple components may instead be performed by a single component.
Reference in the specification to “one embodiment” or to “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiments is included in at least one embodiment. The appearances of the phrases “in one embodiment” or “in at least one embodiment” in various places in the specification are not necessarily all referring to the same embodiment.
Various embodiments may include any number of systems and/or methods for performing the above-described techniques, either singly or in any combination. Another embodiment includes a computer program product comprising a non-transitory computer-readable storage medium and computer program code, encoded on the medium, for causing a processor in a computing device or other electronic device to perform the above-described techniques.
Some portions of the above are presented in terms of algorithms and symbolic representations of operations on data bits within the memory of a computing device. These algorithmic descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. An algorithm is here, and generally, conceived to be a self-consistent sequence of steps (instructions) leading to a desired result. The steps are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical, magnetic or optical signals capable of being stored, transferred, combined, compared and otherwise manipulated. It is convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like. Furthermore, it is also convenient at times, to refer to certain arrangements of steps requiring physical manipulations of physical quantities as modules or code devices, without loss of generality.
It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussion, it is appreciated that throughout the description, discussions utilizing terms such as “processing” or “computing” or “calculating” or “displaying” or “determining” or the like, refer to the action and processes of a computer system, or similar electronic computing module and/or device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system memories or registers or other such information storage, transmission or display devices.
Certain aspects include process steps and instructions described herein in the form of an algorithm. It should be noted that the process steps and instructions can be embodied in software, firmware and/or hardware, and when embodied in software, can be downloaded to reside on and be operated from different platforms used by a variety of operating systems.
The present document also relates to an apparatus for performing the operations herein. This apparatus may be specially constructed for the required purposes, or it may comprise a general-purpose computing device selectively activated or reconfigured by a computer program stored in the computing device. Such a computer program may be stored in a computer-readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, DVD-ROMs, magneto-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, flash memory, solid state drives, magnetic or optical cards, application specific integrated circuits (ASICs), or any other type of medium suitable for storing electronic instructions, each coupled to a computer system bus. The program and its associated data may also be hosted and run remotely, for example on a server. Further, the computing devices referred to herein may include a single processor or may be architectures employing multiple processor designs for increased computing capability.
The algorithms and displays presented herein are not inherently related to any particular computing device, virtualized system, or other apparatus. Various general-purpose systems may also be used with programs in accordance with the teachings herein, or it may prove convenient to construct more specialized apparatus to perform the required method steps. The required structure for a variety of these systems will be apparent from the description provided herein. In addition, the system and method are not described with reference to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings described herein, and any references above to specific languages are provided for disclosure of enablement and best mode.
Accordingly, various embodiments include software, hardware, and/or other elements for controlling a computer system, computing device, or other electronic device, or any combination or plurality thereof. Such an electronic device can include, for example, a processor, an input device (such as a keyboard, mouse, touchpad, track pad, joystick, trackball, microphone, and/or any combination thereof), an output device (such as a screen, speaker, and/or the like), memory, long-term storage (such as magnetic storage, optical storage, and/or the like), and/or network connectivity, according to techniques that are well known in the art. Such an electronic device may be portable or nonportable. Examples of electronic devices that may be used for implementing the described system and method include: a desktop computer, laptop computer, television, smartphone, tablet, music player, audio device, kiosk, set-top box, game system, wearable device, consumer electronic device, server computer, and/or the like. An electronic device may use any operating system such as, for example and without limitation: Linux; Microsoft Windows, available from Microsoft Corporation of Redmond, Washington; Mac OS X, available from Apple Inc. of Cupertino, California; iOS, available from Apple Inc. of Cupertino, California; Android, available from Google, Inc. of Mountain View, California; and/or any other operating system that is adapted for use on the device.
While a limited number of embodiments have been described herein, those skilled in the art, having the benefit of the above description, will appreciate that other embodiments may be devised. In addition, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the subject matter. Accordingly, the disclosure is intended to be illustrative, but not limiting, of the scope.
The present application claims the benefit of U.S. Provisional Application Ser. No. 62/221,999 for “User Interface for Interaction with Customized Highlight Sequences”, filed Sep. 22, 2015, which is incorporated by reference herein in its entirety. The present application claims priority as a continuation-in-part of U.S. Utility application Ser. No. 14/877,691 for “Customized Generation of Highlight Show with Narrative Component”, filed Oct. 7, 2015, which is incorporated by reference herein in its entirety. U.S. Utility application Ser. No. 14/877,691 claims priority as a continuation-in-part of U.S. Utility application Ser. No. 14/510,481 for “Generating a Customized Highlight Sequence Depicting an Event”, filed Oct. 9, 2014, which is incorporated by reference herein in its entirety. U.S. Utility application Ser. No. 14/877,691 claims priority as a continuation-in-part of U.S. Utility application Ser. No. 14/710,438 for “Generating a Customized Highlight Sequence Depicting Multiple Events”, filed May 12, 2015, which is incorporated by reference herein in its entirety. U.S. Utility application Ser. No. 14/710,438 claims priority as a continuation of U.S. Utility application Ser. No. 14/510,481 for “Generating a Customized Highlight Sequence Depicting an Event”, filed Oct. 9, 2014, which is incorporated by reference herein in its entirety. U.S. Utility application Ser. No. 14/877,691 also claims the benefit of U.S. Provisional Application Ser. No. 62/221,999 for “User Interface for Interaction with Customized Highlight Sequences”, filed Sep. 22, 2015, which is incorporated by reference herein in its entirety. The present application is related to U.S. Utility application Ser. No. 13/601,915 for “Generating Excitement Levels for Live Performances,” filed Aug. 31, 2012, which is incorporated by reference herein in its entirety. The present application is related to U.S. Utility application Ser. No. 13/601,927 for “Generating Alerts for Live Performances,” filed Aug. 31, 2012, which is incorporated by reference herein in its entirety. The present application is related to U.S. Utility application Ser. No. 13/601,933 for “Generating Teasers for Live Performances,” filed Aug. 31, 2012 and issued on Nov. 26, 2013 as U.S. Pat. No. 8,595,763, which is incorporated by reference herein in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
6005562 | Shiga et al. | Dec 1999 | A |
6177931 | Alexander et al. | Jan 2001 | B1 |
6185527 | Petkovic et al. | Feb 2001 | B1 |
6195458 | Warnick et al. | Feb 2001 | B1 |
6557042 | He et al. | Apr 2003 | B1 |
6681396 | Bates et al. | Jan 2004 | B1 |
6721490 | Yao et al. | Apr 2004 | B1 |
6954611 | Hashimoto et al. | Oct 2005 | B2 |
7174512 | Martin et al. | Feb 2007 | B2 |
7197715 | Valeria | Mar 2007 | B1 |
7386217 | Zhang | Jun 2008 | B2 |
7543322 | Bhogal et al. | Jun 2009 | B1 |
7633887 | Panwar et al. | Dec 2009 | B2 |
7646962 | Ellis et al. | Jan 2010 | B1 |
7680894 | Diot et al. | Mar 2010 | B2 |
7742111 | Shiu et al. | Jun 2010 | B2 |
7774811 | Poslinski et al. | Aug 2010 | B2 |
7818368 | Yang et al. | Oct 2010 | B2 |
7825989 | Greenberg | Nov 2010 | B1 |
7831112 | Wang et al. | Nov 2010 | B2 |
7849487 | Vosseller | Dec 2010 | B1 |
7929808 | Seaman et al. | Apr 2011 | B2 |
8024753 | Kummer et al. | Sep 2011 | B1 |
8046798 | Schlack et al. | Oct 2011 | B1 |
8079052 | Chen et al. | Dec 2011 | B2 |
8099315 | Amento | Jan 2012 | B2 |
8103107 | Yamamoto | Jan 2012 | B2 |
8104065 | Aaby et al. | Jan 2012 | B2 |
8140570 | Ingrassia et al. | Mar 2012 | B2 |
8196168 | Bryan et al. | Jun 2012 | B1 |
8209713 | Lai et al. | Jun 2012 | B1 |
8296797 | Olstad et al. | Oct 2012 | B2 |
8296808 | Hardacker et al. | Oct 2012 | B2 |
8312486 | Briggs et al. | Nov 2012 | B1 |
8320674 | Guillou et al. | Nov 2012 | B2 |
8424041 | Candelore et al. | Apr 2013 | B2 |
8427356 | Satish | Apr 2013 | B1 |
8457768 | Hammer et al. | Jun 2013 | B2 |
8522300 | Relyea et al. | Aug 2013 | B2 |
8535131 | Packard et al. | Sep 2013 | B2 |
8595763 | Packard et al. | Nov 2013 | B1 |
8627349 | Kirby et al. | Jan 2014 | B2 |
8688434 | Birnbaum et al. | Apr 2014 | B1 |
8689258 | Kemo | Apr 2014 | B2 |
8702504 | Hughes | Apr 2014 | B1 |
8713008 | Negi | Apr 2014 | B2 |
8752084 | Ai et al. | Jun 2014 | B1 |
8793579 | Halliday et al. | Jul 2014 | B2 |
8973038 | Gratton | Mar 2015 | B2 |
8973068 | Kotecha et al. | Mar 2015 | B2 |
8990418 | Bragg et al. | Mar 2015 | B1 |
9038127 | Hastings et al. | May 2015 | B2 |
9060210 | Packard et al. | Jun 2015 | B2 |
9066156 | Kapa | Jun 2015 | B2 |
9213986 | Buchheit et al. | Dec 2015 | B1 |
9251853 | Jeong et al. | Feb 2016 | B2 |
9253533 | Morgan et al. | Feb 2016 | B1 |
9264779 | Kirby et al. | Feb 2016 | B2 |
9420333 | Martch et al. | Aug 2016 | B2 |
9451202 | Beals | Sep 2016 | B2 |
9565474 | Petruzzelli et al. | Feb 2017 | B2 |
9578377 | Malik et al. | Feb 2017 | B1 |
9583149 | Stieglitz | Feb 2017 | B2 |
9648379 | Howcroft | May 2017 | B2 |
9715902 | Coviello et al. | Jul 2017 | B2 |
9788062 | Dimov et al. | Oct 2017 | B2 |
9888279 | Ishtiaq et al. | Feb 2018 | B2 |
9967621 | Armstrong | May 2018 | B2 |
20010013123 | Freeman et al. | Aug 2001 | A1 |
20010026609 | Weinstein et al. | Oct 2001 | A1 |
20020041752 | Abiko et al. | Apr 2002 | A1 |
20020059610 | Ellis | May 2002 | A1 |
20020067376 | Martin et al. | Jun 2002 | A1 |
20020075402 | Robson et al. | Jun 2002 | A1 |
20020136528 | Dagtas | Sep 2002 | A1 |
20020157095 | Masumitsu et al. | Oct 2002 | A1 |
20020157101 | Schrader et al. | Oct 2002 | A1 |
20020174430 | Ellis et al. | Nov 2002 | A1 |
20020178444 | Trajkovic et al. | Nov 2002 | A1 |
20020180774 | Errico et al. | Dec 2002 | A1 |
20020194095 | Koren | Dec 2002 | A1 |
20030012554 | Zeidler et al. | Jan 2003 | A1 |
20030023742 | Allen et al. | Jan 2003 | A1 |
20030056220 | Thornton et al. | Mar 2003 | A1 |
20030063798 | Li et al. | Apr 2003 | A1 |
20030066077 | Gutta | Apr 2003 | A1 |
20030118014 | Iyer et al. | Jun 2003 | A1 |
20030126605 | Betz et al. | Jul 2003 | A1 |
20030126606 | Buczak et al. | Jul 2003 | A1 |
20030154475 | Rodriguez et al. | Aug 2003 | A1 |
20030172376 | Coffin | Sep 2003 | A1 |
20030177503 | Sull et al. | Sep 2003 | A1 |
20030188317 | Liew et al. | Oct 2003 | A1 |
20030189674 | Inoue et al. | Oct 2003 | A1 |
20030208763 | McElhatten et al. | Nov 2003 | A1 |
20030229899 | Thompson et al. | Dec 2003 | A1 |
20040003403 | Marsh | Jan 2004 | A1 |
20040041831 | Zhang | Mar 2004 | A1 |
20040167767 | Xiong et al. | Aug 2004 | A1 |
20040181807 | Theiste et al. | Sep 2004 | A1 |
20050005308 | Logan et al. | Jan 2005 | A1 |
20050015712 | Plastina | Jan 2005 | A1 |
20050030977 | Casey et al. | Feb 2005 | A1 |
20050044570 | Poslinski | Feb 2005 | A1 |
20050060641 | Sezan | Mar 2005 | A1 |
20050071865 | Martins | Mar 2005 | A1 |
20050071881 | Deshpande | Mar 2005 | A1 |
20050091690 | Delpuch et al. | Apr 2005 | A1 |
20050120368 | Goronzy et al. | Jun 2005 | A1 |
20050125302 | Brown et al. | Jun 2005 | A1 |
20050149965 | Neogi | Jul 2005 | A1 |
20050152565 | Jouppi et al. | Jul 2005 | A1 |
20050154987 | Otsuka et al. | Jul 2005 | A1 |
20050166230 | Gaydou et al. | Jul 2005 | A1 |
20050180568 | Krause | Aug 2005 | A1 |
20050182792 | Israel et al. | Aug 2005 | A1 |
20050191041 | Braun et al. | Sep 2005 | A1 |
20050198570 | Otsuka et al. | Sep 2005 | A1 |
20050204294 | Burke | Sep 2005 | A1 |
20050240961 | Jerding et al. | Oct 2005 | A1 |
20050264705 | Kitamura | Dec 2005 | A1 |
20060020962 | Stark et al. | Jan 2006 | A1 |
20060085828 | Dureau et al. | Apr 2006 | A1 |
20060117365 | Ueda et al. | Jun 2006 | A1 |
20060174277 | Sezan et al. | Aug 2006 | A1 |
20060190615 | Panwar et al. | Aug 2006 | A1 |
20060218573 | Proebstel | Sep 2006 | A1 |
20060238656 | Chen et al. | Oct 2006 | A1 |
20060253581 | Dixon et al. | Nov 2006 | A1 |
20060282852 | Purpura et al. | Dec 2006 | A1 |
20060282869 | Plourde | Dec 2006 | A1 |
20070033616 | Gutta | Feb 2007 | A1 |
20070058930 | Iwamoto | Mar 2007 | A1 |
20070083901 | Bond | Apr 2007 | A1 |
20070127894 | Ando et al. | Jun 2007 | A1 |
20070146554 | Strickland et al. | Jun 2007 | A1 |
20070154163 | Cordray | Jul 2007 | A1 |
20070154169 | Cordray et al. | Jul 2007 | A1 |
20070157235 | Teunissen | Jul 2007 | A1 |
20070157249 | Cordray et al. | Jul 2007 | A1 |
20070157253 | Ellis et al. | Jul 2007 | A1 |
20070157285 | Frank et al. | Jul 2007 | A1 |
20070162924 | Radhakrishnan et al. | Jul 2007 | A1 |
20070169165 | Crull et al. | Jul 2007 | A1 |
20070188655 | Ohta | Aug 2007 | A1 |
20070199040 | Kates | Aug 2007 | A1 |
20070204302 | Calzone | Aug 2007 | A1 |
20070212023 | Whillock | Sep 2007 | A1 |
20070226766 | Poslinski et al. | Sep 2007 | A1 |
20070239856 | Abadir | Oct 2007 | A1 |
20070245379 | Agnihotri | Oct 2007 | A1 |
20070250777 | Chen et al. | Oct 2007 | A1 |
20070288951 | Ray et al. | Dec 2007 | A1 |
20080022012 | Wang | Jan 2008 | A1 |
20080060006 | Shanks et al. | Mar 2008 | A1 |
20080064490 | Ellis | Mar 2008 | A1 |
20080086743 | Cheng et al. | Apr 2008 | A1 |
20080092168 | Logan et al. | Apr 2008 | A1 |
20080097949 | Kelly et al. | Apr 2008 | A1 |
20080109307 | Ullah | May 2008 | A1 |
20080115166 | Bhogal et al. | May 2008 | A1 |
20080134043 | Georgis et al. | Jun 2008 | A1 |
20080155602 | Collet et al. | Jun 2008 | A1 |
20080159708 | Kazama | Jul 2008 | A1 |
20080163305 | Johnson et al. | Jul 2008 | A1 |
20080168503 | Sparrell | Jul 2008 | A1 |
20080178219 | Grannan | Jul 2008 | A1 |
20080193016 | Lim et al. | Aug 2008 | A1 |
20080195457 | Sherman et al. | Aug 2008 | A1 |
20080235348 | Dasgupta | Sep 2008 | A1 |
20080239169 | Moon et al. | Oct 2008 | A1 |
20080244666 | Moon et al. | Oct 2008 | A1 |
20080270038 | Partovi et al. | Oct 2008 | A1 |
20080271078 | Gossweiler et al. | Oct 2008 | A1 |
20080300982 | Larson et al. | Dec 2008 | A1 |
20080307485 | Clement et al. | Dec 2008 | A1 |
20080320523 | Morris et al. | Dec 2008 | A1 |
20090025027 | Craner | Jan 2009 | A1 |
20090034932 | Oisel | Feb 2009 | A1 |
20090055385 | Jean et al. | Feb 2009 | A1 |
20090080857 | St. John-Larkin | Mar 2009 | A1 |
20090082110 | Relyea et al. | Mar 2009 | A1 |
20090102984 | Arlina et al. | Apr 2009 | A1 |
20090138902 | Kamen | May 2009 | A1 |
20090144777 | Mikami et al. | Jun 2009 | A1 |
20090158357 | Miller | Jun 2009 | A1 |
20090178071 | Whitehead | Jul 2009 | A1 |
20090210898 | Childress et al. | Aug 2009 | A1 |
20090228911 | Vrijsen | Sep 2009 | A1 |
20090234828 | Tu | Sep 2009 | A1 |
20090235313 | Maruyama et al. | Sep 2009 | A1 |
20090249412 | Bhogal et al. | Oct 2009 | A1 |
20090293093 | Igarashi | Nov 2009 | A1 |
20090299824 | Barnes | Dec 2009 | A1 |
20090325523 | Choi | Dec 2009 | A1 |
20100005485 | Tian et al. | Jan 2010 | A1 |
20100040151 | Garrett | Feb 2010 | A1 |
20100064306 | Tiongson et al. | Mar 2010 | A1 |
20100071007 | Meijer | Mar 2010 | A1 |
20100071062 | Choyi et al. | Mar 2010 | A1 |
20100086277 | Craner | Apr 2010 | A1 |
20100089996 | Koolar | Apr 2010 | A1 |
20100115554 | Drouet et al. | May 2010 | A1 |
20100122294 | Craner | May 2010 | A1 |
20100123830 | Vunic | May 2010 | A1 |
20100125864 | Dwyer et al. | May 2010 | A1 |
20100146560 | Bonfrer | Jun 2010 | A1 |
20100153856 | Russ | Jun 2010 | A1 |
20100153983 | Phillmon et al. | Jun 2010 | A1 |
20100153999 | Yates | Jun 2010 | A1 |
20100158479 | Craner | Jun 2010 | A1 |
20100166389 | Knee et al. | Jul 2010 | A1 |
20100169925 | Takegoshi | Jul 2010 | A1 |
20100218214 | Fan et al. | Aug 2010 | A1 |
20100251295 | Amento et al. | Sep 2010 | A1 |
20100251304 | Donoghue et al. | Sep 2010 | A1 |
20100251305 | Kimble et al. | Sep 2010 | A1 |
20100262986 | Adimatyam et al. | Oct 2010 | A1 |
20100269144 | Forsman et al. | Oct 2010 | A1 |
20100319019 | Zazza | Dec 2010 | A1 |
20100322592 | Casagrande | Dec 2010 | A1 |
20100333131 | Parker et al. | Dec 2010 | A1 |
20110016492 | Marita | Jan 2011 | A1 |
20110016493 | Lee et al. | Jan 2011 | A1 |
20110019839 | Nandury | Jan 2011 | A1 |
20110052156 | Kuhn | Mar 2011 | A1 |
20110072448 | Stiers et al. | Mar 2011 | A1 |
20110082858 | Yu et al. | Apr 2011 | A1 |
20110109801 | Thomas et al. | May 2011 | A1 |
20110161242 | Chung et al. | Jun 2011 | A1 |
20110170008 | Koch | Jul 2011 | A1 |
20110173337 | Walsh et al. | Jul 2011 | A1 |
20110202956 | Connelly et al. | Aug 2011 | A1 |
20110206342 | Thompson et al. | Aug 2011 | A1 |
20110212756 | Packard et al. | Sep 2011 | A1 |
20110217024 | Schlieski et al. | Sep 2011 | A1 |
20110231887 | West | Sep 2011 | A1 |
20110239249 | Murison et al. | Sep 2011 | A1 |
20110243533 | Stern et al. | Oct 2011 | A1 |
20110252451 | Turgeman et al. | Oct 2011 | A1 |
20110286721 | Craner | Nov 2011 | A1 |
20110289410 | Paczkowski et al. | Nov 2011 | A1 |
20110293113 | McCarthy | Dec 2011 | A1 |
20120020641 | Sakaniwa et al. | Jan 2012 | A1 |
20120047542 | Lewis et al. | Feb 2012 | A1 |
20120052941 | Mo | Mar 2012 | A1 |
20120060178 | Minakuchi et al. | Mar 2012 | A1 |
20120082431 | Sengupta et al. | Apr 2012 | A1 |
20120106932 | Grevers, Jr. | May 2012 | A1 |
20120110615 | Kilar et al. | May 2012 | A1 |
20120110616 | Kilar et al. | May 2012 | A1 |
20120124625 | Foote et al. | May 2012 | A1 |
20120131613 | Ellis et al. | May 2012 | A1 |
20120185895 | Wong et al. | Jul 2012 | A1 |
20120189273 | Folgner et al. | Jul 2012 | A1 |
20120204209 | Kuba | Aug 2012 | A1 |
20120216118 | Lin et al. | Aug 2012 | A1 |
20120230651 | Chen | Sep 2012 | A1 |
20120237182 | Eyer | Sep 2012 | A1 |
20120246672 | Sridhar et al. | Sep 2012 | A1 |
20120260295 | Rondeau | Oct 2012 | A1 |
20120263439 | Lassman et al. | Oct 2012 | A1 |
20120278834 | Richardson | Nov 2012 | A1 |
20120278837 | Curtis et al. | Nov 2012 | A1 |
20120284745 | Strange | Nov 2012 | A1 |
20120311633 | Mandrekar et al. | Dec 2012 | A1 |
20120324491 | Bathiche et al. | Dec 2012 | A1 |
20130014159 | Wiser et al. | Jan 2013 | A1 |
20130042179 | Cormack et al. | Feb 2013 | A1 |
20130055304 | Kirby et al. | Feb 2013 | A1 |
20130061313 | Cullimore et al. | Mar 2013 | A1 |
20130073473 | Heath | Mar 2013 | A1 |
20130074109 | Skelton et al. | Mar 2013 | A1 |
20130114940 | Merzon et al. | May 2013 | A1 |
20130128119 | Madathodiyil et al. | May 2013 | A1 |
20130138435 | Weber | May 2013 | A1 |
20130138693 | Sathish | May 2013 | A1 |
20130145023 | Li et al. | Jun 2013 | A1 |
20130160051 | Armstrong et al. | Jun 2013 | A1 |
20130174196 | Herlein | Jul 2013 | A1 |
20130194503 | Yamashita | Aug 2013 | A1 |
20130226983 | Beining et al. | Aug 2013 | A1 |
20130251331 | Sambongi | Sep 2013 | A1 |
20130263189 | Garner | Oct 2013 | A1 |
20130268620 | Osminer | Oct 2013 | A1 |
20130268955 | Conrad et al. | Oct 2013 | A1 |
20130283162 | Aronsson et al. | Oct 2013 | A1 |
20130291037 | Im et al. | Oct 2013 | A1 |
20130298146 | Conrad et al. | Nov 2013 | A1 |
20130298151 | Leske et al. | Nov 2013 | A1 |
20130315560 | Kritt et al. | Nov 2013 | A1 |
20130325869 | Reiley et al. | Dec 2013 | A1 |
20130326406 | Reiley et al. | Dec 2013 | A1 |
20130326575 | Robillard et al. | Dec 2013 | A1 |
20130332962 | Moritz et al. | Dec 2013 | A1 |
20130332965 | Seyller et al. | Dec 2013 | A1 |
20130337920 | Packard et al. | Dec 2013 | A1 |
20130346302 | Purves et al. | Dec 2013 | A1 |
20140023348 | O'Kelly et al. | Jan 2014 | A1 |
20140028917 | Smith et al. | Jan 2014 | A1 |
20140032709 | Saussy et al. | Jan 2014 | A1 |
20140062696 | Packard et al. | Mar 2014 | A1 |
20140067825 | Oztaskent et al. | Mar 2014 | A1 |
20140067828 | Archibong | Mar 2014 | A1 |
20140067939 | Packard et al. | Mar 2014 | A1 |
20140068675 | Mountain | Mar 2014 | A1 |
20140068692 | Archibong et al. | Mar 2014 | A1 |
20140074866 | Shah | Mar 2014 | A1 |
20140082670 | Papish | Mar 2014 | A1 |
20140088952 | Fife et al. | Mar 2014 | A1 |
20140114647 | Allen | Apr 2014 | A1 |
20140114966 | Bilinski et al. | Apr 2014 | A1 |
20140123160 | van Coppenolle et al. | May 2014 | A1 |
20140130094 | Kirby et al. | May 2014 | A1 |
20140139555 | Levy | May 2014 | A1 |
20140140680 | Jo | May 2014 | A1 |
20140150009 | Sharma | May 2014 | A1 |
20140153904 | Adimatyam et al. | Jun 2014 | A1 |
20140157327 | Roberts et al. | Jun 2014 | A1 |
20140161417 | Kurupacheril et al. | Jun 2014 | A1 |
20140215539 | Chen et al. | Jul 2014 | A1 |
20140223479 | Krishnamoorthi et al. | Aug 2014 | A1 |
20140282714 | Hussain | Sep 2014 | A1 |
20140282741 | Shoykhet | Sep 2014 | A1 |
20140282744 | Hardy et al. | Sep 2014 | A1 |
20140282745 | Chipman et al. | Sep 2014 | A1 |
20140282759 | Harvey et al. | Sep 2014 | A1 |
20140282779 | Navarro | Sep 2014 | A1 |
20140294201 | Johnson et al. | Oct 2014 | A1 |
20140298378 | Kelley | Oct 2014 | A1 |
20140310819 | Cakarel et al. | Oct 2014 | A1 |
20140313341 | Stribling | Oct 2014 | A1 |
20140321831 | Olsen et al. | Oct 2014 | A1 |
20140325556 | Hoang et al. | Oct 2014 | A1 |
20140331260 | Gratton | Nov 2014 | A1 |
20140333841 | Steck | Nov 2014 | A1 |
20140351045 | Abihssira et al. | Nov 2014 | A1 |
20140373079 | Friedrich et al. | Dec 2014 | A1 |
20150003814 | Miller | Jan 2015 | A1 |
20150012656 | Phillips et al. | Jan 2015 | A1 |
20150020097 | Freed et al. | Jan 2015 | A1 |
20150040176 | Hybertson et al. | Feb 2015 | A1 |
20150052568 | Glennon et al. | Feb 2015 | A1 |
20150058890 | Kapa | Feb 2015 | A1 |
20150082172 | Shakib | Mar 2015 | A1 |
20150095932 | Ren | Apr 2015 | A1 |
20150106842 | Lee | Apr 2015 | A1 |
20150110461 | Maisenbacher | Apr 2015 | A1 |
20150110462 | Maisenbacher et al. | Apr 2015 | A1 |
20150118992 | Wyatt et al. | Apr 2015 | A1 |
20150181132 | Kummer et al. | Jun 2015 | A1 |
20150181279 | Martch et al. | Jun 2015 | A1 |
20150189377 | Wheatley et al. | Jul 2015 | A1 |
20150243326 | Pacurariu | Aug 2015 | A1 |
20150249803 | Tozer et al. | Sep 2015 | A1 |
20150249864 | Tang et al. | Sep 2015 | A1 |
20150281778 | Xhafa et al. | Oct 2015 | A1 |
20150310725 | Koskan et al. | Oct 2015 | A1 |
20150310894 | Stieglitz | Oct 2015 | A1 |
20150334461 | Yu | Nov 2015 | A1 |
20150358687 | Kummer | Dec 2015 | A1 |
20150358688 | Kummer | Dec 2015 | A1 |
20150379132 | Cho | Dec 2015 | A1 |
20160066020 | Mountain | Mar 2016 | A1 |
20160066026 | Mountain | Mar 2016 | A1 |
20160066042 | Dimov et al. | Mar 2016 | A1 |
20160066049 | Mountain | Mar 2016 | A1 |
20160066056 | Mountain | Mar 2016 | A1 |
20160073172 | Sharples | Mar 2016 | A1 |
20160088351 | Petruzzelli | Mar 2016 | A1 |
20160105708 | Packard et al. | Apr 2016 | A1 |
20160105733 | Packard et al. | Apr 2016 | A1 |
20160105734 | Packard et al. | Apr 2016 | A1 |
20160191147 | Martch | Jun 2016 | A1 |
20160198229 | Keipert | Jul 2016 | A1 |
20160309212 | Martch et al. | Oct 2016 | A1 |
20170032630 | Gervais et al. | Feb 2017 | A1 |
20170164055 | Sohn | Jun 2017 | A1 |
20180014072 | Dimov et al. | Jan 2018 | A1 |
20190289372 | Merler et al. | Sep 2019 | A1 |
Number | Date | Country |
---|---|---|
101650722 | Oct 2011 | CN |
105912560 | Aug 2016 | CN |
1469476 | Oct 2004 | EP |
1865716 | Dec 2007 | EP |
2902568 | Dec 2007 | EP |
1919216 | May 2008 | EP |
2107477 | Oct 2009 | EP |
2309733 | Apr 2011 | EP |
2403239 | Jan 2012 | EP |
2464138 | Jun 2012 | EP |
10322622 | Dec 1998 | JP |
2001251581 | Sep 2001 | JP |
2002259720 | Sep 2002 | JP |
2003032654 | Jan 2003 | JP |
2004072540 | Mar 2004 | JP |
2004128795 | Apr 2004 | JP |
2004260297 | Sep 2004 | JP |
2005-317165 | Nov 2005 | JP |
2006211311 | Aug 2006 | JP |
2006-245745 | Sep 2006 | JP |
2006333451 | Dec 2006 | JP |
2007142900 | Jun 2007 | JP |
2007202206 | Aug 2007 | JP |
2007524160 | Aug 2007 | JP |
2008167019 | Jul 2008 | JP |
2011228918 | Nov 2011 | JP |
2012-029150 | Feb 2012 | JP |
5034516 | Sep 2012 | JP |
2013-175854 | Sep 2013 | JP |
2014067272 | Apr 2014 | JP |
2014-157460 | Aug 2014 | JP |
2014187687 | Oct 2014 | JP |
2018501533 | Jan 2018 | JP |
2004-0025073 | Mar 2004 | KR |
2006-0128295 | Dec 2006 | KR |
9837694 | Aug 1998 | WO |
0243353 | May 2002 | WO |
2005059807 | Jun 2005 | WO |
2007064987 | Jun 2007 | WO |
2007098067 | Aug 2007 | WO |
2009032046 | Mar 2009 | WO |
2009073925 | Jun 2009 | WO |
2011040999 | Apr 2011 | WO |
2013016626 | Jan 2013 | WO |
2013103580 | Jul 2013 | WO |
2013166456 | Nov 2013 | WO |
2014072742 | May 2014 | WO |
2014164782 | Oct 2014 | WO |
2014179017 | Nov 2014 | WO |
2016030380 | Mar 2016 | WO |
2016030384 | Mar 2016 | WO |
2016030477 | Mar 2016 | WO |
2016033545 | Mar 2016 | WO |
2016034899 | Mar 2016 | WO |
2016055761 | Apr 2016 | WO |
2016057416 | Apr 2016 | WO |
2016057844 | Apr 2016 | WO |
Entry |
---|
Boxfish/TV's API; www.boxfish.com, retrieved Mar. 28, 2017, 8 pages. |
International Search Report for PCT/US2014/060651 dated Jan. 19, 2015 (9 pages). |
International Search Report for PCT/US2014/060649 dated Jan. 8, 2015 (9 pages). |
Thuuz Sports, “Frequently Asked Questions”, www.thuuz.com/faq/, retrieved Mar. 28, 2017, 7 pages. |
Jin, S.H., et al., “Intelligent Broadcasting System and Services for Personalized Semantic Contents Consumption”, Expert Systems with Applications, Oxford, GB, vol. 31, No. 1, Jul. 1, 2006, pp. 164-173. |
Jin, S.H., et al., “Real-time content filtering for live broadcasts in TV terminals”, Multimedia Tools and Applications, Kluwer Academic Publishers, BO, vol. 36, No. 3, Jun. 29, 2007, pp. 285-301. |
R. Natarajan et al. “Audio-Based Event Detection in Videos—A Comprehensive Survey”, International Journal of Engineering and Technology, vol. 6, No. 4, Aug.-Sep. 2014. |
Q. Huang et al. “Hierarchical Language Modeling for Audio Events Detection in a Sports Game”, IEEE International Conference on Acoustics, Speech and Signal Processing, 2010. |
Q. Huang et al. “Inferring the Structure of a Tennis Game Using Audio Information”, IEEE Trans. on Audio Speech and Language Proc., Oct. 2011. |
M. Baillie et al. “Audio-based Event Detection for Sports Video”, International Conference on Image and Video, CIVR 2003. |
Y. Rui et al. “Automatically Extracting Highlights for TV Baseball Programs”, Proceedings of the eighth ACM International conference on Multimedia, 2000. |
D. A. Sadlier et al. “A Combined Audio-Visual Contribution to Event Detection in Field Sports Broadcast Video. Case Study: Gaelic Football”, Proceedings of the 3rd IEEE International Symposium on Signal Processing and Information Technology, Dec. 2003. |
E. Kijak et al. “Audiovisual Integration for Tennis Broadcast Structuring”, Multimedia Tools and Applications, Springer, vol. 30, Issue 3, pp. 289-311, Sep. 2006. |
A. Baijal et al. “Sports Highlights Generation Based on Acoustic Events Detection: A Rugby Case Study”, IEEE International Conference on Consumer Electronics (ICCE), 2015. |
J. Han et al. “A Unified and Efficient Framework for Court-Net Sports Video Analysis Using 3-D Camera Modeling”, Proceedings vol. 6506, Multimedia Content Access: Algorithms and Systems; 65060F (2007). |
Huang-Chia Shih “A Survey on Content-aware Video Analysis for Sports”, IEEE Trans. on Circuits and Systems for Video Technology, vol. 99, No. 9, Jan. 2017. |
A. Krizhevsky et al. “ImageNet Classification with Deep Convolutional Neural Networks”, In Proc. NIPS, pp. 1097-1105, 2012. |
D. A. Sadlier et al. “Event Detection in Field Sports Video Using Audio-Visual Features and a Support Vector Machine”, IEEE Trans. on Circuits and Systems for Video Technology, vol. 15, No. 10, Oct. 2005. |
P. F. Felzenszwalb et al. “Efficient Graph-Based Image Segmentation”, International Journal of Computer Vision, Sep. 2004, vol. 59, Issue 2, pp. 167-181. |
C. J. C. Burges “A Tutorial on Support Vector Machines for Pattern Recognition”, Springer, Data Mining and Knowledge Discovery, Jun. 1998, vol. 2, Issue 2, pp. 121-167. |
Y.A. LeCun et al. “Efficient BackProp” Neural Networks: Tricks of the Trade. Lecture Notes in Computer Science, vol. 7700, Springer, 2012. |
L. Neumann, J. Matas, “Real-Time Scene Text Localization and Recognition”, 5th IEEE Conference on Computer Vision and Pattern Recognition, Jun. 2012. |
R. Smith “An Overview of the Tesseract OCR Engine”, International Conference on Document Analysis and Recognition (ICDAR), 2007. |
M. Merler, et al., “The Excitement of Sports: Automatic Highlights Using Audio/Visual Cues”, Dec. 31, 2017, pp. 2520-2523. |
H. Harb, et al., “Highlights Detection in Sports Videos Based on Audio Analysis”, pp. 1-4, Sep. 2009. |
J. Ye, et al., “Audio-Based Sports Highlight Detection by Fourier Local-Auto-Correlations”, 11th Annual Conference of the International Speech Communication Association, Sep. 2010, pp. 2198-2201. |
Miyamori, Hisashi “Automatic Generation of Personalized Digest Based on Context Flow and Distinctive Events”, IEICE Technical Report, Jul. 10, 2003, vol. 103, No. 209, pp. 35-40. |
Number | Date | Country | |
---|---|---|
62221999 | Sep 2015 | US |
Relation | Number | Date | Country |
---|---|---|---|
Parent | 14510481 | Oct 2014 | US |
Child | 14710438 | | US |
Relation | Number | Date | Country |
---|---|---|---|
Parent | 14877691 | Oct 2015 | US |
Child | 15264928 | | US |
Parent | 14710438 | May 2015 | US |
Child | 14877691 | | US |
Parent | 14510481 | Oct 2014 | US |
Child | 14710438 | | US |