Television viewers often browse different channels to monitor multiple programs or find a more interesting program to watch. A viewer may exhibit this behavior in various situations, including checking a score in a sporting event, waiting for a particular program or segment to begin to air, or checking to see whether the program has resumed after a commercial break. Typically, viewers view only one or two channels on a television screen at a given moment and, therefore, have limited awareness of the other channels not displayed on the screen.
While a viewer may be able to watch one program and record a second program for later viewing, this approach is not perfect for keeping abreast of the two programs. The viewer may want to watch the programs live, or the programs may be enjoyed more if watched live. One example of a program that a viewer is more likely to desire to follow live is a sporting event and, often, many sporting events are being broadcast simultaneously. Thus, the viewer may be switching between the various sporting events in an attempt to keep up to date on the events' status, and the viewer may miss an opportunity to watch a sporting event, program, or other content that is more immediately of interest to them.
Thus, there remains an ever-present need to provide more useful tools to viewers so that they may be presented with information about content, such as a television program, that enables the viewers to learn of a development or event in the content or enables the viewers to switch to content that is more immediately of interest to them.
The following presents a simplified summary in order to provide a basic understanding of some aspects of the disclosure. The summary is not an extensive overview of the disclosure. It is neither intended to identify key or critical elements of the disclosure nor to delineate the scope of the disclosure. The following summary merely presents some concepts of the disclosure in a simplified form as a prelude to the description below.
Some aspects of this disclosure relate to methods and systems for providing a user with data, e.g., in the form of messages, that include information describing events that occurred in content, such as a scene in a television program or movie, a development or play in a sporting event, or some other occurrence in another video program. Each event message may enable a user to view video (or other content) of the event once the user desires to view the event. Some of the methods and systems described herein provide for the presentation, e.g., display, of event messages to a user, the storage of the event messages so that the user may be able to browse through the stored event messages and decide when to view the video of the event, and the display of the video upon selection by a user.
In one or more embodiments, event messages may be transmitted to a user device. A computing device may register a device that is to receive event messages. The computing device may receive event message criteria, such as information identifying a user's request to be notified of the occurrence of a predetermined event in a transmitted content. At an appropriate time (e.g., during transmission of the content from a content provider, or in accordance with event message criteria), a computing device may determine that the predetermined event has occurred in the transmitted content. Responsive to determining that the event has occurred, the event message may be generated. The event message may alert the user of the occurrence of the predetermined event in the transmitted content. The event message may also include an option to initiate presentation of a portion of the content during which the predetermined event occurred, such as a stream of video. In some embodiments, the option may be a link to video of the predetermined event or data of the video, so that the user may view the video when they select the event message. Subsequently, the event message may be transmitted to one or multiple user devices. For example, the user device could be a device in communication with a television being viewed by the user and the user may also be using a second device, such as a tablet computing device, smartphone, or personal computer. In such arrangements, the event message could also be transmitted to the second device, for example, to display the event message on both the television (via the user device) and the second device.
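The generate-and-transmit flow described above can be sketched as follows. This is a minimal illustration only: the class name, field names (e.g., `content_id`, `video_link`), and device identifiers are hypothetical and not part of any described implementation.

```python
from dataclasses import dataclass

# Hypothetical shape of an event message; field names are assumptions.
@dataclass
class EventMessage:
    content_id: str    # identifies the transmitted content (e.g., a game)
    description: str   # short text alerting the user to the predetermined event
    video_link: str    # option to initiate presentation of the event's video

def generate_event_message(content_id, description, video_link, devices):
    """On determining that a predetermined event occurred in the transmitted
    content, build the message and address one copy to each user device."""
    message = EventMessage(content_id, description, video_link)
    # Transmit to one or multiple user devices (e.g., a set-top box and a tablet).
    return {device: message for device in devices}

outbox = generate_event_message(
    "bears-packers", "Bears score six with long TD pass",
    "stream://events/td-pass-q2", ["stb-livingroom", "tablet-1"])
```

Here the same message is fanned out to both a first screen device and a second screen device, matching the multi-device arrangement described above.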
Event messages may also include information describing multiple events and, thus, include a link or data, e.g., of a composite video, that is for the multiple events. For example, a user may be able to enter his or her starting lineup of a fantasy team as event message criteria. At a specified time (e.g., after the sporting events are complete or in accordance with user-defined criteria), an event message can be generated that includes the fantasy scoring plays for the user's fantasy team. In other words, the event message can be used to provide a video summary of the scoring plays that contributed to the score of the user's fantasy team.
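The fantasy-lineup example above can be sketched as follows; the scoring-play record layout and the helper name are illustrative assumptions, shown only to make the composite-message idea concrete.

```python
# Sketch: assemble a composite event message for a user's fantasy lineup.
def composite_fantasy_message(lineup, scoring_plays):
    """Collect the scoring plays involving the user's starting lineup and
    return the clip references a composite summary video would draw from."""
    relevant = [p for p in scoring_plays if p["player"] in lineup]
    return {
        "description": f"{len(relevant)} fantasy scoring plays for your team",
        "clips": [p["clip"] for p in relevant],  # links for the composite video
    }

plays = [
    {"player": "Johnson", "clip": "clip-17"},
    {"player": "Smith", "clip": "clip-22"},
    {"player": "Jones", "clip": "clip-31"},
]
summary = composite_fantasy_message({"Johnson", "Jones"}, plays)
```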
The details of these and other embodiments of the present disclosure are set forth in the accompanying drawings and the description below. Other features and advantages will be apparent from the description and drawings, and from the claims.
The present disclosure is illustrated by way of example and not limited in the accompanying figures in which like reference numerals indicate similar elements and in which:
In the following description of various illustrative embodiments, reference is made to the accompanying drawings, which form a part hereof, and in which is shown, by way of illustration, various embodiments in which aspects of the disclosure may be practiced. It is to be understood that other embodiments may be utilized, and structural and functional modifications may be made, without departing from the scope of the present disclosure.
There may be one link 101 originating from the local office 103, and it may be split a number of times to distribute the signal to various homes 102 in the vicinity (which may be many miles) of the local office 103. The links 101 may include components not illustrated, such as splitters, filters, amplifiers, etc. to help convey the signal clearly, but in general each split introduces a bit of signal degradation. Portions of the links 101 may also be implemented with fiber-optic cable, while other portions may be implemented with coaxial cable, other links, or wireless communication paths.
The local office 103 may include a termination system (TS) 104, such as a cable modem termination system (CMTS) in an HFC network, which may be a computing device configured to manage communications between devices on the network of links 101 and backend devices such as servers 105-107 (to be discussed further below). The TS may be as specified in a standard, such as the Data Over Cable Service Interface Specification (DOCSIS) standard, published by Cable Television Laboratories, Inc. (a.k.a. CableLabs), or it may be a similar or modified device instead. The TS may be configured to place data on one or more downstream frequencies to be received by modems or other user devices at the various premises 102, and to receive upstream communications from those modems on one or more upstream frequencies. The local office 103 may also include one or more network interfaces 108, which can permit the local office 103 to communicate with various other external networks 109. These networks 109 may include, for example, networks of Internet devices, telephone networks, cellular telephone networks, fiber optic networks, local wireless networks (e.g., WiMAX), satellite networks, and any other desired network, and the interface 108 may include the corresponding circuitry needed to communicate on the networks 109 and with other devices on those networks, such as the cell phones of a cellular telephone network.
As noted above, the local office 103 (e.g., a data processing and/or distribution facility) may include a variety of servers 105-107 that may be configured to perform various functions. For example, the local office 103 may include a push notification server 105. The push notification server 105 may generate push notifications to deliver data and/or commands to the various homes 102 in the network (or more specifically, to the devices in the homes 102 that are configured to detect such notifications). The local office 103 may also include a content server 106. The content server 106 may be one or more computing devices that are configured to provide content to users in the homes. This content may be, for example, video on demand movies, television programs, songs, text listings, etc. The content server 106 may include software to validate user identities and entitlements, locate and retrieve requested content, encrypt the content, and initiate delivery (e.g., streaming) of the content to the requesting user and/or device.
The local office 103 may also include one or more application servers 107. An application server 107 may be a computing device configured to offer any desired service, and may run various languages and operating systems (e.g., servlets and JSP pages running on Tomcat/MySQL, OSX, BSD, Ubuntu, Redhat, HTML5, JavaScript, AJAX and COMET). For example, an application server may be responsible for collecting television program listings information and generating a data download for electronic program guide listings. Another application server may be responsible for monitoring user viewing habits and collecting that information for use in selecting advertisements. Another application server may be responsible for formatting and inserting advertisements in a video stream being transmitted to the premises 102. Another application server may be responsible for formatting and providing data for an interactive service being transmitted to the premises 102 (e.g., chat messaging service, etc.).
An example premises 102a may include an interface 120. The interface 120 may comprise a modem 110, which may include transmitters and receivers used to communicate on the links 101 and with the local office 103. The modem 110 may be, for example, a coaxial cable modem (for coaxial cable links 101), a fiber interface node (for fiber optic links 101), or any other desired device offering similar functionality. The interface 120 may also comprise a gateway interface device 111 or gateway. The modem 110 may be connected to, or be a part of, the gateway interface device 111. The gateway interface device 111 may be a computing device that communicates with the modem 110 to allow one or more other devices in the premises to communicate with the local office 103 and other devices beyond the local office. The gateway 111 may comprise a set-top box (STB), digital video recorder (DVR), computer server, or any other desired computing device. The gateway 111 may also include (not shown) local network interfaces to provide communication signals to devices in the premises, such as display devices 112 (e.g., televisions), additional STBs 113, personal computers 114, laptop computers 115, wireless devices 116 (wireless laptops and netbooks, mobile phones, mobile televisions, personal digital assistants (PDA), etc.), and any other desired devices. Examples of the local network interfaces include Multimedia Over Coax Alliance (MoCA) interfaces, Ethernet interfaces, universal serial bus (USB) interfaces, wireless interfaces (e.g., IEEE 802.11), Bluetooth interfaces, and others.
The
One or more aspects of the disclosure may be embodied in computer-usable data and/or computer-executable instructions, such as in one or more program modules, executed by one or more computers or other devices. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types when executed by a processor in a computer or other data processing device. The computer executable instructions may be stored on one or more computer readable media such as a hard disk, optical disk, removable storage media, solid state memory, RAM, etc. The functionality of the program modules may be combined or distributed as desired in various embodiments. In addition, the functionality may be embodied in whole or in part in firmware or hardware equivalents such as integrated circuits, field programmable gate arrays (FPGA), application-specific integrated circuits (ASIC), and the like. Particular data structures may be used to more effectively implement one or more aspects of the invention, and such data structures are contemplated within the scope of computer executable instructions and computer-usable data described herein.
Various aspects of this disclosure relate to providing a viewer (also referred to interchangeably herein as a user) with messages that include information describing events that have occurred in content. For example, messages may be sent for scoring plays that occur during a sporting event or for scenes involving a particular actor or character in a television program. The content may include video data and/or audio data, such as a song or audio for corresponding video content. Further, while many of the examples described herein will be discussed in terms of video, the video can be accompanied by audio. Additionally, the concepts discussed herein may be similarly applied to an audio-only arrangement or other content arrangement, instead of a video or audio-video arrangement.
Each event message may inform a user of the occurrence of an event, and may enable the user to view video of the event once the user desires to watch it. The methods and systems described herein provide for the display of event messages to a user. In some arrangements, event messages may be generated/displayed as the events occur in content. For example, when a player scores a touchdown during a football game, an event message may be generated at that time, or shortly thereafter, to inform users of this occurrence. The message may be of particular importance to fans of that player, or to fantasy football team owners who have that player on their starting roster. Alternatively, event messages may be generated at times independent of when the event occurred, such as when transmission of the content is complete or at times specified by a user or content provider. Additionally, the event messages may be stored so that a user may be able to browse through the stored event messages and decide when to view the video of the event. Upon a user selecting an event message, the video for the event may be transmitted to a device of the user so that the user may view the video.
The embodiments described herein include arrangements where event messages may be displayed on a single device (e.g., on a television via a gateway device or set top box) or on multiple devices (e.g., viewed on a television and another user device, such as a tablet, mobile device, or personal computer).
Content, such as video content, may be transmitted (e.g., streamed) from a local office 304 (e.g., a data processing and/or distribution facility) to the interfaces of various premises, such as premises 303 (see also local office 103, premises 102, and interface 120 of
Although
Further, each of the second screen devices 302 may be configured to bi-directionally communicate via a wired and/or wireless connection with a second screen experience computing device 340 via the network 330. Specifically, the second screen devices 302 may be configured to access the network 330 (e.g., the Internet or any other local or wide area network, either public or private) to obtain data and to transmit/receive the data via the network 330 to/from the second screen experience computing device 340. For example, a second screen device 302 may transmit data through a wired connection, including the local office 304 which then routes the transmission to the network 330 so that it may eventually reach the second screen experience computing device 340. That is, the second screen device 302 may connect to the interface 309 and communicate with the second screen experience computing device 340 over-the-top of the links used to transmit the content from local office 304. Alternatively, the second screen devices 302 may connect directly to the network 330 to communicate with the second screen experience computing device 340. For example, a second screen device 302 may wirelessly communicate using, for example, a WiFi connection and/or cellular backhaul, to connect to the network 330 (e.g., the Internet) and ultimately to the second screen experience computing device 340. Accordingly, although not shown, the network 330 may include cell towers and/or wireless routers for communicating with the second screen devices 302.
Although
Still referring to
The displays in
Some characteristics may be common to, and others may differ among, the different embodiments illustrated in
Information bar 406 may be configured to display the information included in an event message. Additionally, information bar 406 may be placed on the display 401 to identify which event message is currently highlighted by a user. As illustrated in
To indicate the ability of a user to scroll between the various event messages, scrollable display bar 413 may include scroll controls 418 and 419. A user may be able to press the appropriate button (e.g., a scroll left button on a remote or a scroll right button on a remote) to change which events are displayed on the scrollable display bar 413. For example, as illustrated, event message 415 is displayed as being in the middle of the display bar 413. If a user were to scroll left once, event message 415 may move to the right and event message 414 may be displayed as being in the middle of the display bar 413. As a result of this scroll, event message 416 may no longer be displayed and a different event message (e.g., a message received prior to event message 414) may be displayed to the left of event message 414. The event messages may also automatically scroll upon receipt of new event messages. For example, when a new event message is received, event messages 416 and 415 may move to the left. As a result of this scroll, event message 414 may no longer be displayed and the new event message may be displayed to the right of event message 416.
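The scrolling behavior described above can be sketched as a sliding three-slot window over the stored message history; the function and variable names here are illustrative, not part of any described implementation.

```python
# Minimal sketch of the scrollable display bar: the bar shows three
# consecutive messages from the history, centered on index `middle`.
def visible_window(history, middle):
    """Return the messages currently shown on the bar."""
    return tuple(history[middle - 1:middle + 2])

def scroll_left(middle):
    """Scrolling left centers the previous (older) message on the bar."""
    return middle - 1

def on_new_message(history, msg, middle):
    """A newly received message appears on the right; the window advances
    so the oldest visible message scrolls off the left edge."""
    history.append(msg)
    return middle + 1

history = ["m-older", "m414", "m415", "m416"]
middle = 2   # m415 starts in the middle of the bar
```

After `scroll_left`, the window becomes `("m-older", "m414", "m415")`: message 414 is in the middle, 416 is no longer displayed, and an earlier message appears on the left, mirroring the example above.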
In some embodiments, as new event messages are received, the new event message may be displayed in an order of receipt or based on some other message priority scheme. For example, event messages 414, 415 and 416 may be ordered in accordance with when they were received (e.g., message 414 was received prior to message 415, and message 415 was received prior to message 416). Alternatively, the messages could be arranged in a message priority queue, where messages with a higher priority are inserted at the front of the queue, while those with a lower priority are inserted towards the end of the queue. The user may be able to scroll through the queue of items, but display 411 may default to a view of the front of the queue or provide a view of the front of the queue every time a new event message is displayed. Additionally, as the user views or selects event messages in the queue, the viewed or selected event message may be removed from the queue and sent to a message archive that the user can view at a later time.
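The message priority queue described above can be sketched with a binary heap, using a receipt counter so that equal-priority messages keep their order of arrival; the class name, field layout, and priority values are assumptions for illustration.

```python
import heapq
import itertools

class MessageQueue:
    """Sketch of the priority queue: lower priority numbers sit at the
    front; a receipt counter breaks ties in order of arrival."""

    def __init__(self):
        self._heap = []
        self._counter = itertools.count()  # receipt-order tiebreaker

    def push(self, priority, message):
        heapq.heappush(self._heap, (priority, next(self._counter), message))

    def pop_front(self):
        """Viewing/selecting a message removes it from the queue; the
        caller could then send it to the user's message archive."""
        return heapq.heappop(self._heap)[2]

q = MessageQueue()
q.push(2, "msg-414")
q.push(1, "msg-415")   # higher priority: inserted nearer the front
q.push(2, "msg-416")
```

Popping the queue yields "msg-415" first (highest priority), then "msg-414" and "msg-416" in their order of receipt.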
Event messages may also be received for content different from the four examples listed above. Such event messages would be organized under different identifiers (not shown) and a user may be able to scroll to them using the appropriate controls. The ability for a user to scroll left or right through the stored event messages (or the control a user must highlight to be able to scroll left or right) is depicted by scroll controls 428 and 429, respectively.
What content identifiers are shown on display 421 may depend on a content priority or other preference, such as a user preference or content provider preference. For example, the content identifiers with the greatest number of new or unviewed events may be displayed on display 421 (e.g., identifier 425 is shown on display 421 because it has the greatest number of unviewed event messages, while another identifier is not shown on display 421 because there are no unviewed event messages). As another example, the user or a content provider may be able to set certain content with a high priority so that when an event message is received for that content, the identifier will be shown on the display 421 (e.g., a user may set the Bears v. Packers game as high priority so that identifier 425 may always be displayed on display 421 when an event message is received and/or if there are unviewed messages for the Bears v. Packers game).
The content identifiers 424, 425, 426 and 427 may display various information. For example, an image, text or an animation of images may be displayed for each content identifier. In some arrangements, the image, text or animation of images may be taken from the most recent event message received for the content. For example, as depicted in
When a user highlights a particular identifier, information pane 423 displays information related to one or more of the event messages associated with the identifier. For example, when a user scrolls such that identifier 425 is overlaid on information pane 423, a listing of the event messages associated with identifier 425 may be displayed in history panel 430. A short description may be included in the listing for each event message, such as a date or time the event occurred and/or a brief text string describing the event (e.g., “Johnson goes over 200 yards receiving”).
The user may be able to scroll up and down through the listing displayed on history panel 430 to highlight a particular event message. When the user highlights a particular event message in the listing, the user may select it to view additional information about the message (e.g., description panel 440 may include the entire description of the event message and any feed information included in the event message). Further, after selecting a particular event message, the user may select any of the options within options panel 435. Options panel 435 may include various selectable options, such as, for example, a "play now" option (shown) that causes video of the event to be displayed as video program 422 upon selection by a user. Options panel 435 may include a "bookmark" option (shown) that causes the event message to be added to a favorite list of the user so that the user can find the event message on the favorite list when desired. In embodiments with a favorite list, the favorite list may be viewed in various ways, such as via its own identifier that a user can scroll to while scrolling through identifiers 424, 425, 426 and 427, or by pressing an appropriate control. Options panel 435 may include a "send to a friend" option (shown) that allows a user to e-mail, send via short message service (SMS), or otherwise transmit the event message, a link to the video of the event, or the video itself, to one or more other users. In some arrangements, options panel 435 may include an "other options" option that activates a display listing all other options available to the user, such as a display that would allow the user to set up a digital video recording of the program corresponding to the event message (e.g., set up a series recording for Seinfeld if the event message is of a scene of a Seinfeld episode) or share the event message, or a link to the event message's video, using social media (e.g., share via Facebook or Twitter).
In some embodiments, a user may also be able to delete a received event message or otherwise remove an event message being listed in history panel 430.
In some instances, description panel 440 may include information other than the description of a particular event message. For example, description panel 440 may include a description of the content associated with the highlighted identifier. As one particular example, if content identifier 425 is for a Bears v. Packers game, description panel 440 may include a description of the game, including the channel on which the game is being broadcast, the time the game is played, the current score of the game, and a brief textual description of the game (e.g., "the 6-4 Bears play the 6-4 Packers in this important division game that could determine which team wins the division.").
The user may have previously set up criteria defining the content for which he or she wishes to be notified when certain events occur. As illustrated in
In one example, first content may be a sporting event (e.g., Bears v. Packers), second content 514 may be a movie (e.g., The Natural), and third content 515 may be a different sporting event (e.g., Minnesota v. Detroit). Each content may be currently being transmitted (e.g., streamed or broadcast) from a content provider. As the events occur for the two sporting events and the movie, each event message may be displayed on the first screen device and the second screen device. For example, in one or more embodiments, the first screen device may display the event message 503 as a pop-up display. The pop-up display may include various portions of images and text that are included in the event message in order to identify the event and/or content to the user. The event message may also be displayed on the second screen device as it occurs.
Continuing the example where the first content is a football game between the Bears and Packers, event message 503 may be for a passing touchdown. When the event occurs, the user may be viewing a television program of a different channel as video program 502 of the first display device (e.g., an episode of Seinfeld may be playing as video program 502). When the touchdown pass occurs, event message 503 may be displayed as a pop-up display on the first screen device (as shown in display 501). The pop-up display of the first screen device may include various images or textual data to identify event message 503 as being for video of the touchdown pass. For example, the pop-up display of the first screen device may include an image of one or more players from the Bears or Packers, or a representative frame taken from a video segment of the event (e.g., an image of the quarterback throwing the ball during the touchdown pass play, or an image of the receiver catching the ball during the touchdown pass play). A short text string may also be included on the pop-up display of the first screen device that describes the event (e.g., “Bears score six with long TD pass”). Other information may also be included on the pop-up display of the first screen device, including a logo or other image related to the content or the event.
The event message 503 for the touchdown pass may also be displayed on the second screen device using a similar pop-up display. However, the pop-up display for event message 503 on the second screen device may include different information and may be placed differently on display 511 than it was placed on display 501 of the first screen device. For example, the pop-up display of the second screen device may include an advertisement, such as a sponsor for the sporting event or the event message functionality (e.g., "sponsored by Comcast"). Additionally, the pop-up display of the second screen device, instead of being overlaid on a video program (see display 501), may be placed adjacent to an identifier for the event's content. As depicted in
When the pop-up display for the event message 503 is being displayed on either or both of the screen devices, a user may select the event message (e.g., a user may touch, via a touchscreen, the pop-up display of event message 503 on the second screen device) to cause display 511 to display information related to the event message 503. For example, video display panel 512 may display the video for the event of event message 503 as the video program (e.g., display the video of the touchdown pass).
Message panel 520 may include a listing of event messages previously received for the content corresponding to the currently selected event message (or selected identifier 513, 514, 515). Additionally, in some arrangements, message panel 520 may include an entry in the listing to view the live video of the content. Each entry included in message panel 520 may be selectable by a user so the video corresponding to the selected entry is displayed on video panel 512. For example, message panel 520 for a football game may include an entry for viewing the live football game, and entries for each event message received for that football game, such as an entry for the touchdown pass (an entry for event message 503) and, for example, an entry for a kickoff touchdown, an entry for a fumble at the goal line and an entry for a 40-yard run.
Display 511 may also display a "watch now" button 517 and an "unfollow" button 518. If a user selects the "watch now" button 517, the video currently being viewed in video panel 512 may also be displayed as the video program 502 of the first screen device. Upon selection of the "watch now" button, a command may be transmitted from the user's premises to a local office of a content provider. The command may be processed so that the same video is also transmitted to the first screen device (e.g., displayed in both video panel 512 and displayed as video program 502 of display 501). For example, after the user selects the pop-up display for event message 503 on the second screen device, video panel 512 may display the video for the touchdown pass. If the user selects the "watch now" button 517, the video for the touchdown pass may also be displayed as video program 502 on the first screen device. In some instances, this may cause a time-shift to occur in video program 502, such as when the video program goes from live video back to a past time to view the touchdown pass.
If a user selects the "unfollow" button 518, the content corresponding to the video currently being viewed in video panel 512 may be removed from the content for which the user is currently receiving event messages. For example, after the user selects the pop-up display for event message 503 on the second screen device, video panel 512 may display the video for the touchdown pass. If the user selects the "unfollow" button 518, the football game (e.g., first content) may be removed from the content for which the user wishes to receive event messages. Accordingly, identifier 513 may be removed from display 511 and the system will no longer display event messages for the football game on the first screen device or the second screen device.
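The follow/unfollow bookkeeping described above can be sketched as a simple set of followed content identifiers; the identifiers and function names here are illustrative only.

```python
# Sketch: the set of content the user currently receives event messages for.
followed = {"bears-packers", "the-natural", "minnesota-detroit"}

def should_display(content_id, followed_set):
    """Event messages are displayed only for content the user follows."""
    return content_id in followed_set

def unfollow(content_id, followed_set):
    """Selecting "unfollow" removes the content, so its identifier is
    dropped and no further event messages for it are shown on either
    screen device."""
    followed_set.discard(content_id)

unfollow("bears-packers", followed)
```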
Display 511 may include various other user selectable items. For example, each of the identifiers for the content (e.g., identifier 513, 514 and 515) may be selectable by the user to view video of the content in the video panel 512. In some instances, the live video of the content may be displayed upon the user selecting a particular identifier from identifiers 513, 514 and 515 (e.g., display the live football game upon the user selecting identifier 513). The user may also be able to create a queue of events to view (e.g., add an event message to a queue by performing a long-press via a touchscreen or a right click of a mouse when the pointer is over the event message), and display 511 may include a button for viewing the queue or a button to begin viewing video of the events in the queue (not shown) on video panel 512. Display 511 may also include a menu button or other suitable control to allow a user to adjust the settings of the application controlling display 511, or another button to adjust the content for which the user wishes to receive event messages. For example, the user may want to begin receiving event messages for another sporting event and may be able to enter criteria on a data entry display defining what event messages should be displayed (e.g., a user may enter the specific sporting event they want to follow and user-defined criteria that would cause event messages to be displayed upon, for example, scoring plays, or plays involving particular players).
Although the above description explains that the displays in
Referring now to
At step 602, a computing device may register the one or more devices from the device registration data for event messaging. In some instances, registering a device may include placing the device identifier on a listing of devices that are to receive event messages, or registering the device identifier with a computing device that executes an event messaging service. The user's log-in information may also be verified at this time. If correct, the user may be allowed to proceed with receiving event messages. However, if the information is incorrect, the user may have to reattempt the log-in.
Additionally, the computing device may determine whether the device has previously registered for event messages. If the device has already pre-registered, the computing device may activate event messages for that device, such as by setting the device's status to “on” so that event messaging will resume for that device. In some arrangements, the device status may stay “on” so long as the device stays connected to the network, or until a message is received at the computing device to disable the event messaging service (such as by a user request to disable event messages from being transmitted to the device).
When registering a device, the computing device may also determine whether a user profile exists for the user. If one is found it may be retrieved or sent to the computing device that executes the event messaging service. In some arrangements, if the user profile has not been established, the user may be prompted to enter information to create the profile. In some arrangements, the user profile may be stored at the device being registered or in a database of a content provider (e.g., a database at a local office). Further, device registration may include receiving the user profile from the device being registered and storing the user profile in a location accessible to the computing device.
Additionally, the computing device may also transmit a listing of registered devices and their addresses to the device being registered, so that the device being registered can store the listing and addresses. Such addresses may be used in some embodiments to indicate that the content corresponding to an event message should be transmitted to a different device (e.g., an event message is selected by a user on a second screen device but the content of the selected event message is to be viewed on a first screen device). If changes to the registered devices occur (e.g., a device is unregistered, or a device's address changes), the computing device may update each registered device with an updated listing and updated addresses.
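The registration flow of steps 601–602 described above can be sketched as follows. This is a minimal illustrative sketch, not the disclosed implementation; all names (`RegistrationService`, `register_device`, the status values) are assumptions, since the disclosure does not specify an API.

```python
# Hypothetical sketch of device registration for event messaging:
# new devices are added to the listing; pre-registered devices are
# simply reactivated, and the full listing of device addresses is
# returned so the registering device can store its peers' addresses.

class RegistrationService:
    def __init__(self):
        self.devices = {}  # device_id -> {"address": ..., "status": ...}

    def register_device(self, device_id, address):
        """Place the device on the listing of devices that receive event messages."""
        if device_id in self.devices:
            # Previously registered: reactivate event messaging.
            self.devices[device_id]["status"] = "on"
        else:
            self.devices[device_id] = {"address": address, "status": "on"}
        # Return the listing so the device can store peers' addresses, e.g.
        # to route a selected event's content to a first screen device.
        return {d: info["address"] for d, info in self.devices.items()}

    def disable(self, device_id):
        """Handle a user request to stop event messages for a device."""
        self.devices[device_id]["status"] = "off"

svc = RegistrationService()
listing = svc.register_device("tablet-1", "10.0.0.5")
listing = svc.register_device("settop-1", "10.0.0.2")
```

In this sketch, re-registering a disabled device flips its status back to “on”, mirroring the resume behavior described above.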
At step 603, a computing device may determine event messaging criteria. The computing device may determine event messaging criteria in various ways. For example, the computing device may receive information identifying a user's request to be notified of the occurrence of a predetermined event in a transmitted content. For instance, a user may input information—via a user device, such as a tablet computer or other computing device at a user's premises—that defines event messaging criteria desired by the user. By inputting the event messaging criteria, a user may specify the event messages that he/she will receive. For example, if a user desires only to view event messages for a football game currently being broadcast, the user may specify the football game as part of the event messaging criteria. The user may further desire to only view particular events that occur during the football game. Accordingly, the user may specify additional criteria, such as criteria specifying that he/she wants to receive only scoring play events, events for plays involving particular players, and the like. In some embodiments, a user may be allowed to specify any desired criteria (e.g., allow the user to input text that will be matched to an event). In other embodiments, the user may be limited to various criteria choices that the content provider makes available to a user (e.g., the user is limited to 10 different types of event messages for a football game, a different set of 10 types of event messages for a television program, etc.). Event message criteria may also be retrieved or determined from the user profile. Further, event message criteria may be included in the content by a content provider or content creator (e.g., “suggested event message criteria,” which may be based on a user's prior use of content or the provider's own suggestions). Such suggested event message criteria may be retrieved or determined from the content.
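One way the user-specified criteria of step 603 might be represented is sketched below. The field names (`content`, `event_types`, `players`) and the profile structure are assumptions for illustration; the disclosure leaves the data model open.

```python
# Hypothetical sketch: recording a user's request to be notified of
# predetermined events (e.g., scoring plays involving a given player)
# as entries in a user profile.

user_profile = {"user": "viewer1", "event_criteria": []}

def add_criteria(profile, content_id, event_types=None, players=None):
    """Record a request to be notified of predetermined events in a content item."""
    profile["event_criteria"].append({
        "content": content_id,             # e.g., a specific football game
        "event_types": event_types or [],  # e.g., ["scoring_play"]
        "players": players or [],          # plays involving particular players
    })

# The user asks for scoring plays and plays involving one player:
add_criteria(user_profile, "football_game_123",
             event_types=["scoring_play"], players=["Player A"])
```

The profile (and hence the criteria) could then be stored locally or in a database at the provider's local office, as described above.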
The limited criteria choices may be based on the types of information gathered through a content recognition process. Content recognition processes, such as video and audio analysis processes, may be used as part of a content segmenting process that generates content segments. In some embodiments, the content segments may be used when determining whether to generate an event message based on the event message criteria. The content recognition processes may result in data such as text or metadata that describes the content segments. For example, text or metadata may include a short descriptive title for the content segment and text describing one or more entities present in the segment (e.g., the name of a player that appeared on a display during a football play, or the name of an actor that appeared in or spoke during the content segment); a description of the music sound tracks playing during the content; a description of the geographic location where the content depicts/takes place (e.g., in a European city or in a sports stadium); a description of any landmark presented in the content (e.g., the Eiffel Tower); a description of any product or brand within the content or associated with the content (e.g., criteria defining a brand of beer); a type of scene within the content (e.g., car chase, gunfight, love scene); a type of program (e.g., news program, sports program); or the like. Further, the text or metadata may include an indication of the type of play (e.g., rushing play, passing play), whether a penalty was called/flagged during the play, or whether a score occurred during the play. Segments may also be described with other information that is specific to the type of video being processed. For example, with respect to football content, a segment may be described with such information as yards to go for a first down, the down number, and the like.
The choices available for selection may include any of the information in the text/metadata (e.g., a user could set the criteria as 3rd downs with 5 yards or less to go).
The types of information that can be generated by a content recognition process are not limited to only the above-mentioned types. Additionally, descriptive data or “tags” that result from a content recognition process, along with any data available about the segment, can be used to infer additional descriptive data. Continuing the football example above, the down number, yards to go, and the fact that the content is a football game may be some of the data available about a segment. A 3rd down and short event can be inferred from that data. The criteria for event messages can be quite extensive and include many events built from knowledge about the type of content.
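The “3rd down and short” inference described above can be sketched as a small tag-derivation step. This is an illustrative sketch only: the tag names, field names, and the 5-yard threshold (taken from the example of 3rd downs with 5 yards or less to go) are assumptions.

```python
# Hypothetical sketch: inferring additional descriptive tags from the
# data a content recognition process produced about a segment.

def infer_tags(segment):
    """Derive extra tags from existing segment data (e.g., down and distance)."""
    tags = list(segment.get("tags", []))
    if (segment.get("type") == "football"
            and segment.get("down") == 3
            and segment.get("yards_to_go", 99) <= 5):
        tags.append("3rd_and_short")
    return tags

segment = {"type": "football", "down": 3, "yards_to_go": 2,
           "tags": ["passing_play"]}
print(infer_tags(segment))  # ['passing_play', '3rd_and_short']
```

Inferred tags like these could then be offered to users as additional criteria choices.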
Additionally, the event messaging criteria may include criteria defining when an event message should be generated/sent or defining how many events should be included in an event message. For example, the user may define criteria so that an event message is generated every time an event occurs. Criteria may also specify the time when the system should generate an event message (e.g., a time of day, or only when the content has finished being transmitted). Moreover, many of the above described examples were with respect to an event message for a single event occurrence (e.g., an event message for a single play of a football game). However, an event message could be generated that includes more than one event (herein referred to as a “composite event message”) and the user could specify the criteria for generating such a composite event message. The user may define criteria so that the composite event message includes any event that occurred over a user-specified time period or for the duration of the content's transmission. Continuing the football example, users watching a football game may also be interested in one of his or her fantasy football teams. The user may be able to enter criteria defining the players on his/her team so that an event message showing all fantasy scoring plays can be generated at a specified time (e.g., touchdowns, runs, receptions, turnovers, and the like involving his/her fantasy player).
Event criteria may further include criteria defining what the user does not want to see. For example, a user may never want to view an event that includes a particular person, character, type of scene, music, etc. Such criteria could be combined with other criteria to define both characteristics that should never appear in an event and those that should occur (e.g., identifying a particular singer and additional criteria identifying that an event should never include music, so that events such as interviews about the singer would be presented to the user).
Event messaging criteria may include criteria defining additional recipients of the event message. For example, an event message could be generated for a fantasy football game (e.g., a listing of all players on each fantasy team) and criteria identifying another user that should receive the event message (e.g., another user of the content provider's network, or an e-mail address that could include a link to the video of the events for the fantasy football game). In some instances, the user may be able to specify criteria that can be used when ordering the events of a composite event message.
In addition to user-defined event messaging criteria, event messaging criteria may be determined based on data collected by the content provider. For example, viewing habit data of the user or another user may be used to define criteria for one or more event messages that will be generated and transmitted to the user. As one particular example, a user may commonly watch a television program that is of a similar genre as another television program. The event messaging criteria may include criteria so that event messages related to one or both of the television programs are generated. As another example, the content provider may define event message criteria so that whenever a particular commercial or advertisement occurs, an event message will be generated and transmitted.
At step 605, a computing device may monitor content for events. In general, the computing device may monitor the content by searching a video segment database. For example, a content segmentation process may be executing at the content provider's local office to segment content as the content is broadcast or otherwise transmitted and to generate data that describes each segment (e.g., a short descriptive title for the content segment and one or more entities present in the segment). The content segmentation process may use any number of suitable video and/or audio analysis processes and may result in data entries describing the various segments that were found in the content. In some arrangements, content is stored in a content repository and segment descriptions are stored in a segment database. The segment descriptions may include content identifiers or content links, time links or time-codes, and additional data that describe the segment and enable the content corresponding to the segment to be retrieved from the repository (e.g., the video corresponding to a segment is retrieved according to the content identifier and the beginning and ending time-codes of the segment). When monitoring for events, the computing device may be able to search the segment descriptions, or search the data for a new segment when the new segment is determined by the content segmentation process.
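The segment database described above can be sketched with a couple of entries and a simple search over their descriptions. The entry fields (`content_id`, `start`, `end`, `title`, etc.) are assumptions for illustration; the point is only that each description carries the identifier and time-codes needed to retrieve the matching video from the repository.

```python
# Hypothetical sketch of a segment database: each description carries a
# content identifier, beginning/ending time-codes, and descriptive data.

segment_db = [
    {"content_id": "game123", "start": "00:14:02", "end": "00:14:38",
     "title": "Touchdown pass", "entities": ["Player A"], "scoring": True},
    {"content_id": "game123", "start": "00:20:10", "end": "00:20:31",
     "title": "Short rushing play", "entities": ["Player B"], "scoring": False},
]

def search_segments(db, **conditions):
    """Return segment descriptions whose fields match every given condition."""
    return [s for s in db
            if all(s.get(k) == v for k, v in conditions.items())]

# Find scoring plays from the followed football game:
scoring_plays = search_segments(segment_db, content_id="game123", scoring=True)
```

The `start`/`end` time-codes of each hit could then be used to pull the corresponding video from the content repository.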
At step 607, a computing device may determine that an event defined by the event messaging criteria has occurred. In general, the computing device may determine that an event has occurred if the event messaging criteria match data describing one or more segments (e.g., the segments produced by a segmentation process running in parallel to the event messaging process). For example, if the event messaging criteria include criteria to generate event messages whenever a touchdown is scored during a football game, segments determined by the content segmentation process may be monitored for a segment that is a play from the football game in which a touchdown was scored (e.g., by searching for the word “touchdown” or another word used by the content segmentation process to describe a touchdown play). If the event has occurred, the method may proceed to step 608. Otherwise, the method may continue to wait for an event to occur by proceeding back to step 605 to repeat the steps of monitoring the content and determining whether an event has occurred.
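The determination at step 607 amounts to matching a segment's description against the criteria, e.g., a keyword search for “touchdown” within segments from the followed game. A minimal sketch, with assumed field names (`content_id`, `description`, `keywords`):

```python
# Hypothetical sketch of step 607: an event occurs when a new segment
# from the followed content matches a criteria keyword.

def event_occurred(segment, criteria):
    """True if a segment of the followed content matches any criteria keyword."""
    if segment["content_id"] != criteria["content_id"]:
        return False  # segment is from content the user is not following
    text = segment["description"].lower()
    return any(word in text for word in criteria["keywords"])

criteria = {"content_id": "game123", "keywords": ["touchdown"]}
play = {"content_id": "game123",
        "description": "Touchdown pass from the 20-yard line"}
print(event_occurred(play, criteria))  # True
```

If this returns true, the method proceeds to generate the event message; otherwise monitoring continues.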
At step 608, a computing device may generate the event message to, for example, alert the user that the event has occurred in the content. The event message may include an image and/or text that describes the event and, in some arrangements, may include multiple sets of image/text descriptors which are specific to a number of factors, such as display device, application, user rights, etc. The image/text descriptors may also include audio, descriptive icons, links to extended or external information (webpages, text, images, video, stats), and other types of content. In some arrangements, the data included in an event may be retrieved from the searchable database with the content segments. For example, each entry may include an identifier for a keyframe from the segment and a short textual description of the segment. The keyframe may be used as the image for the event message and the short textual description may be included in the event message. Other data may also be included in the event message, including feed information such as channel identifiers and the like. In general, an event message may include any of the information that was described in connection with
In instances where the event message includes one or more events, the computing device may create a composite video that includes the video of each event in the event message. Additionally, the video for the events may be ordered according to various criteria. For example, the video could be ordered based on time (e.g., events that occurred first come sooner in the composite video); according to user-defined criteria, such as by a user-defined priority for the events (e.g., touchdowns come first, followed by long runs, etc.); or ordered by some other criteria that is dependent on the type of event or event message (e.g., the criteria for ordering a composite video for an event message with events from a movie would be different than the criteria for ordering a composite video for a fantasy football game with events from various football games). The content provider may also define various criteria that can be used when ordering the composite video. For example, the content provider may define ordering rules, such as rules that are meant to increase the dramatic effect of the composite video. Continuing the above example of an event message for a fantasy football game, the content provider may mix or alternate fantasy scoring plays of each team to simulate lead changes. Additionally, the content provider may order the video based on the magnitude of the fantasy scoring involved in the play. For example, short runs or receptions may come earlier in the composite video, while touchdowns come later in the composite video. Further, the content provider could include one or more failure plays in the composite video. For example, the system, when building the composite video, may search the segment database for a play that failed to result in fantasy points (e.g., a pass that missed a wide-open receiver) to give the impression that points are about to be scored, and perhaps the impression that a team is about to increase a lead or start catching up to the opponent's score.
Inclusion of plays that fail to result in fantasy points may allow for the creation of a composite video for a fantasy game that more accurately matches the randomness of a real football game.
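One simple realization of the magnitude-based ordering rule described above is to sort events by ascending point value, so that failure plays and short gains come first and touchdowns come last. This is an illustrative sketch only; the point values and field names are assumptions, and a provider's actual ordering rules could be far richer (e.g., alternating teams to simulate lead changes).

```python
# Hypothetical sketch: ordering fantasy plays in a composite video
# for dramatic effect, with failure plays first and touchdowns last.

def order_for_drama(events):
    """Order plays by ascending fantasy-point magnitude."""
    return sorted(events, key=lambda e: e["points"])

events = [
    {"title": "Touchdown reception", "points": 6},
    {"title": "Short run", "points": 1},
    {"title": "Missed wide-open receiver", "points": 0},  # failure play
]
ordered = order_for_drama(events)
print([e["title"] for e in ordered])
# ['Missed wide-open receiver', 'Short run', 'Touchdown reception']
```

The ordered list would then drive the concatenation of the corresponding video segments into the composite video.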
Further, the computing device may insert various other data into the video or the event message. For example, an advertisement may be inserted at one or more places in the video, or a banner advertisement may be included in the event message so that whenever the event message is displayed, a banner advertisement is also displayed on the display. The computing device may also insert images or text into the event message, such as images and text of a sponsor that pays for the event message or the event message service. Interactive content (or a link to the interactive content) may also be included in an event message, such as content that allows a user to purchase access to the content prior to consumption or to purchase access to the event message service. For example, upon a user selecting an event message, the user may be directed to a page or application that facilitates the user's purchase. After validating the purchase, the user may be directly presented with the content of the event message. As another example, when a user receives his or her first event message (or first message according to a periodic schedule, such as monthly), upon the user selecting the event message, the user may be required to purchase access to the event message services before viewing any content provided by the event message. After validating the purchase, the user may be directly presented with the content of the event message.
At optional step 609, the computing device may add the event message to an event message log that includes a history of the event messages generated for a user. In some instances, a user may be able to search or view the log in order to select a previously-generated event message at a later time.
At step 610, a computing device may transmit the event message. The computing device may transmit the event message in many different ways. For example, the computing device may transmit the event message to one or more user devices, such as any of the devices described in connection with
The above steps of
At step 613, a computing device may process the user input message. This step involves performing any of the necessary steps to perform the interactions represented by the user input message that was received at step 611 (e.g., add the event message to the event message criteria so that the event message will be generated and transmitted to the user device, remove the event message from the event message criteria, change a video feed to display a requested video feed or video segment, etc.). For example, if the user input message is an event message selection, the user input message may include an identification of the selected event message and an identifier of one or more destination devices (e.g., an address of a first screen device and/or a second screen device). The event message that was selected by a user may be identified based on the received user input message, the selected event message may be retrieved from the event message log, and the content identified by the selected event message may be retrieved and transmitted to the one or more destination devices. As another example, if the user input message is for changing the event message criteria, the user profile and the change to the event message criteria may be identified based on the user input message, and the event message criteria and/or the user profile may be changed according to the specified change (e.g., add/delete event message criteria). Processing a user input message may include sending and transmitting data to any number of devices, such as, for example, devices at the content provider's local office and devices under the control of other content distribution networks, in order to complete the user interaction.
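The processing at step 613 is essentially a dispatch on the user input message type. A minimal sketch, assuming hypothetical message fields (`type`, `event_id`, `destinations`, `action`) that the disclosure does not itself specify:

```python
# Hypothetical sketch of step 613: dispatch a user input message to the
# appropriate handling (send selected content, or update criteria).

def process_user_input(msg, event_log, criteria_store):
    """Perform the interaction a user input message represents."""
    if msg["type"] == "event_selection":
        # Look up the selected event message and report where its content goes
        # (e.g., a first screen device and/or a second screen device).
        event = event_log[msg["event_id"]]
        return {"send_content": event["content_id"], "to": msg["destinations"]}
    if msg["type"] == "change_criteria":
        # Apply an addition or deletion to the stored event message criteria.
        if msg["action"] == "add":
            criteria_store.append(msg["criteria"])
        else:
            criteria_store.remove(msg["criteria"])
        return {"criteria_count": len(criteria_store)}

log = {"evt1": {"content_id": "game123"}}
store = []
result = process_user_input(
    {"type": "event_selection", "event_id": "evt1",
     "destinations": ["first-screen"]}, log, store)
```

In a full system the returned routing decision would drive retrieval of the content from the repository and transmission to the destination devices.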
Referring now to
At step 703, a computing device may receive and transmit event message criteria. In some arrangements, a user may enter the event message criteria. When the user desires to enter event message criteria, the user may view a display that allows a user to insert or choose the criteria for an event message the user wishes to receive whenever the specified event occurs. In some embodiments, the display for entering the event message criteria may include fields for the different types of criteria from which a user may choose. For example, the user may select from a drop-down menu to specify what type of content the event message is for, such as to specify that the event message is for a sporting event, a movie, a television show, a fantasy game, etc. Based on the user's selection, a new set of fields may be displayed that allow the user to enter the other criteria for the event message. For example, if the event message is for a fantasy game, the data fields may be a number of text entry fields for the players in the fantasy team's starting lineup and additional fields for inserting the scoring rules of the fantasy league. If the event message is for a sporting event, the data fields may be various fields allowing the user to choose what types of plays should cause an event message to be generated (e.g., scoring plays, turnovers, plays involving particular players, etc.). Of course, a user does not need to define additional criteria. In some instances, the user may wish to view every event that occurs for particular content. For example, the user may define event message criteria so messages are generated for every event that occurs in a football game or for every scene in a television program. In some arrangements, an event message may be generated every time the content segmentation process determines a new content segment for that football game or television program.
In other words, the system determines that an event occurs whenever a new segment for the content is determined by the content segmentation process. The criteria entered by a user may be stored as part of a user profile (either stored locally or on the network).
Criteria may also be generated by a computing device. For example, criteria may be generated based on the user's previous usage or content consumption history. In some arrangements, generated criteria (e.g., criteria not input by a user) may be presented to a user for approval or rejection. If approved, the generated criteria may be used to generate event messages and/or stored as part of the user profile.
At step 705, a computing device may display received video. The received video may include one or more video feeds. For example, a video feed may be received for a video program being viewed by the user on the screen device (e.g., video program 402, 412 and 422 of
At step 707, a computing device may determine whether an event message has been received. In general, the computing device may be iteratively checking for new event messages as data is received at the computing device. Additionally, in some arrangements, the computing device will only begin to determine whether event messages have been received when the user has properly registered the device for event messaging. For example, the computing device may receive a message from a content provider indicating that the device has completed registration for event messaging. Until the device receives the message indicating that registration is complete, the device may not check for event messages. If an event message has been received, the method may proceed to optional step 708, in some embodiments, or step 709, in others. Otherwise, the method may continue to monitor for new event messages by returning to step 707 to repeat the determination.
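The receive check at step 707, gated on registration being complete, can be sketched as below. The names (`check_for_event_message`, the `inbox` list) are illustrative assumptions; an actual device would be consuming messages from a network connection.

```python
# Hypothetical sketch of step 707: the device returns the next event
# message only after registration has completed; until then, messages
# are not checked.

def check_for_event_message(inbox, registered):
    """Return the next event message, or None if unregistered or none waiting."""
    if not registered:
        return None  # ignore messages until registration completes
    return inbox.pop(0) if inbox else None

inbox = [{"title": "Touchdown pass"}]
print(check_for_event_message(inbox, registered=False))  # None
print(check_for_event_message(inbox, registered=True))   # {'title': 'Touchdown pass'}
```

A received message would then flow to optional step 708 (storage) and step 709 (display).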
At optional step 708, a computing device may store the event message. This may include storing the event message in a local database or a network database accessible to the computing device.
At step 709, a computing device may display the event message. Displaying the event message may include displaying the event message in various ways, including the arrangements described in connection with
Upon display of the event message, the user may be able to view the message and decide whether to select the event message and view the video of the event. In addition to selecting an event message, a user may perform various other interactions to modify the event messaging service or the video being displayed.
At step 712, a computing device may determine whether user input has been received. The user may perform various interactions related to the event messages, including a command to change a video feed or to view a video segment (e.g., based upon a user's selection of an event message). In general, any of the above described user interactions (see above description of
At step 713, a computing device may process and/or transmit the user input. For example, the computing device may need to determine whether the user interaction can be processed locally or whether it must be transmitted to a computing device for processing. Some interactions, such as a command to change a video feed or, in some embodiments, to view a video of an event message, may need to be transmitted to a computing device for further processing. In such instances, the computing device may generate a user input message that includes the user input and any additional data needed to process the user input. For example, if the user input is an event message selection, a user input message may be generated to include an identification of the selected event message and an identifier of one or more destination devices (e.g., an address of a first screen device and/or a second screen device) depending on where the user is to view the content corresponding to the selected event message. As another example, if the user input is to change the event message criteria, the user input message may be generated to include the user profile or a location of the user profile, and data identifying the change to the event message criteria.
However, some interactions may not be transmitted to a computing device. For example, a user interaction to dismiss an event message or to view more detailed information about an event message may not be transmitted to the computing device in some embodiments, such as when the event message is only to be dismissed from a local repository of event messages or when the detailed information can be found in that local repository. This, however, is generally dependent on the system implementation. Notably, if the event messages were stored at the computing device or at another network device, interactions such as dismissing an event message or viewing more detailed information may need to be transmitted to that device.
Aspects of the disclosure have been described in terms of illustrative embodiments thereof. While illustrative systems and methods as described herein embodying various aspects of the present disclosure are shown, it will be understood by those skilled in the art, that the disclosure is not limited to these embodiments. Modifications may be made by those skilled in the art, particularly in light of the foregoing teachings. For example, each of the features of the aforementioned illustrative examples may be utilized alone or in combination or subcombination with elements of the other examples. For example, any of the above described systems and methods or parts thereof may be combined with the other methods and systems or parts thereof described above. For example, the steps illustrated in the illustrative figures may be performed in other than the recited order, and one or more steps illustrated may be optional in accordance with aspects of the disclosure. It will also be appreciated and understood that modifications may be made without departing from the true spirit and scope of the present disclosure. The description is thus to be regarded as illustrative instead of restrictive on the present disclosure.
7861259 | Barone, Jr. | Dec 2010 | B2 |
7913286 | Sarachik et al. | Mar 2011 | B2 |
7958528 | Moreau et al. | Jun 2011 | B2 |
7975277 | Jerding et al. | Jul 2011 | B1 |
8006262 | Rodriguez et al. | Aug 2011 | B2 |
8032914 | Rodriguez | Oct 2011 | B2 |
8156533 | Crichton | Apr 2012 | B2 |
8220018 | de Andrade et al. | Jul 2012 | B2 |
8266652 | Roberts et al. | Sep 2012 | B2 |
8296805 | Tabatabai et al. | Oct 2012 | B2 |
8365230 | Chane et al. | Jan 2013 | B2 |
8381259 | Khosla | Feb 2013 | B1 |
8434109 | Kamimaeda et al. | Apr 2013 | B2 |
8448208 | Moreau et al. | May 2013 | B2 |
8660545 | Redford et al. | Feb 2014 | B1 |
8699862 | Sharifi et al. | Apr 2014 | B1 |
8793256 | McIntire et al. | Jul 2014 | B2 |
D712132 | O'Connor | Sep 2014 | S |
8850495 | Pan | Sep 2014 | B2 |
8863196 | Patil et al. | Oct 2014 | B2 |
8938675 | Holladay et al. | Jan 2015 | B2 |
8943533 | de Andrade et al. | Jan 2015 | B2 |
8973063 | Spilo et al. | Mar 2015 | B2 |
9021528 | Moreau et al. | Apr 2015 | B2 |
9363560 | Moreau et al. | Jun 2016 | B2 |
9473548 | Chakrovorthy et al. | Oct 2016 | B1 |
9516253 | De Andrade et al. | Dec 2016 | B2 |
20010014206 | Artigalas et al. | Aug 2001 | A1 |
20010027563 | White et al. | Oct 2001 | A1 |
20010049823 | Matey | Dec 2001 | A1 |
20010056573 | Kovac et al. | Dec 2001 | A1 |
20010056577 | Gordon et al. | Dec 2001 | A1 |
20020010928 | Sahota | Jan 2002 | A1 |
20020016969 | Kimble | Feb 2002 | A1 |
20020023270 | Thomas et al. | Feb 2002 | A1 |
20020026642 | Augenbraun et al. | Feb 2002 | A1 |
20020032905 | Sherr et al. | Mar 2002 | A1 |
20020035573 | Black et al. | Mar 2002 | A1 |
20020041104 | Graf et al. | Apr 2002 | A1 |
20020042915 | Kubischta et al. | Apr 2002 | A1 |
20020042920 | Thomas et al. | Apr 2002 | A1 |
20020046099 | Frengut et al. | Apr 2002 | A1 |
20020059094 | Hosea et al. | May 2002 | A1 |
20020059586 | Carney et al. | May 2002 | A1 |
20020059629 | Markel | May 2002 | A1 |
20020067376 | Martin et al. | Jun 2002 | A1 |
20020069407 | Fagnani et al. | Jun 2002 | A1 |
20020070978 | Wishoff et al. | Jun 2002 | A1 |
20020078444 | Krewin et al. | Jun 2002 | A1 |
20020078449 | Gordon et al. | Jun 2002 | A1 |
20020083450 | Kamen et al. | Jun 2002 | A1 |
20020100041 | Rosenberg et al. | Jul 2002 | A1 |
20020107973 | Lennon et al. | Aug 2002 | A1 |
20020108121 | Alao et al. | Aug 2002 | A1 |
20020108122 | Alao et al. | Aug 2002 | A1 |
20020120609 | Lang et al. | Aug 2002 | A1 |
20020124254 | Kikinis | Sep 2002 | A1 |
20020144268 | Khoo et al. | Oct 2002 | A1 |
20020144269 | Connelly | Oct 2002 | A1 |
20020144273 | Reto | Oct 2002 | A1 |
20020147645 | Alao et al. | Oct 2002 | A1 |
20020152477 | Goodman et al. | Oct 2002 | A1 |
20020156839 | Peterson et al. | Oct 2002 | A1 |
20020156890 | Carlyle et al. | Oct 2002 | A1 |
20020162120 | Mitchell | Oct 2002 | A1 |
20020169885 | Alao et al. | Nov 2002 | A1 |
20020170059 | Hoang | Nov 2002 | A1 |
20020171691 | Currans et al. | Nov 2002 | A1 |
20020171940 | He et al. | Nov 2002 | A1 |
20020184629 | Sie et al. | Dec 2002 | A1 |
20020188944 | Noble | Dec 2002 | A1 |
20020194181 | Wachtel | Dec 2002 | A1 |
20020196268 | Wolff et al. | Dec 2002 | A1 |
20020199187 | Gissin et al. | Dec 2002 | A1 |
20020199190 | Su | Dec 2002 | A1 |
20030001880 | Holtz et al. | Jan 2003 | A1 |
20030005444 | Crinon et al. | Jan 2003 | A1 |
20030005453 | Rodriguez et al. | Jan 2003 | A1 |
20030014752 | Zaslavsky et al. | Jan 2003 | A1 |
20030014753 | Beach et al. | Jan 2003 | A1 |
20030018755 | Masterson et al. | Jan 2003 | A1 |
20030023970 | Panabaker | Jan 2003 | A1 |
20030025832 | Swart et al. | Feb 2003 | A1 |
20030028871 | Wang et al. | Feb 2003 | A1 |
20030028873 | Lemmons | Feb 2003 | A1 |
20030041104 | Wingard et al. | Feb 2003 | A1 |
20030051246 | Wilder et al. | Mar 2003 | A1 |
20030056216 | Wugofski et al. | Mar 2003 | A1 |
20030056218 | Wingard et al. | Mar 2003 | A1 |
20030058948 | Kelly et al. | Mar 2003 | A1 |
20030061028 | Dey et al. | Mar 2003 | A1 |
20030066081 | Barone et al. | Apr 2003 | A1 |
20030067554 | Klarfeld et al. | Apr 2003 | A1 |
20030068046 | Lindqvist et al. | Apr 2003 | A1 |
20030070170 | Lennon | Apr 2003 | A1 |
20030079226 | Barrett | Apr 2003 | A1 |
20030084443 | Laughlin et al. | May 2003 | A1 |
20030084444 | Ullman et al. | May 2003 | A1 |
20030084449 | Chane et al. | May 2003 | A1 |
20030086694 | Davidsson | May 2003 | A1 |
20030093790 | Logan et al. | May 2003 | A1 |
20030093792 | Labeeb et al. | May 2003 | A1 |
20030097657 | Zhou et al. | May 2003 | A1 |
20030110500 | Rodriguez | Jun 2003 | A1 |
20030110503 | Perkes | Jun 2003 | A1 |
20030115219 | Chadwick | Jun 2003 | A1 |
20030115612 | Mao et al. | Jun 2003 | A1 |
20030126601 | Roberts et al. | Jul 2003 | A1 |
20030132971 | Billmaier et al. | Jul 2003 | A1 |
20030135464 | Mourad et al. | Jul 2003 | A1 |
20030135582 | Allen et al. | Jul 2003 | A1 |
20030140097 | Schloer | Jul 2003 | A1 |
20030151621 | McEvilly et al. | Aug 2003 | A1 |
20030158777 | Schiff et al. | Aug 2003 | A1 |
20030172370 | Satuloori et al. | Sep 2003 | A1 |
20030177501 | Takahashi et al. | Sep 2003 | A1 |
20030182663 | Gudorf et al. | Sep 2003 | A1 |
20030189668 | Newnam et al. | Oct 2003 | A1 |
20030204814 | Elo et al. | Oct 2003 | A1 |
20030204846 | Breen et al. | Oct 2003 | A1 |
20030204854 | Blackketter et al. | Oct 2003 | A1 |
20030207696 | Willenegger et al. | Nov 2003 | A1 |
20030226141 | Krasnow et al. | Dec 2003 | A1 |
20030229899 | Thompson et al. | Dec 2003 | A1 |
20040003402 | McKenna | Jan 2004 | A1 |
20040003404 | Boston et al. | Jan 2004 | A1 |
20040019900 | Knightbridge et al. | Jan 2004 | A1 |
20040019908 | Williams et al. | Jan 2004 | A1 |
20040022271 | Fichet et al. | Feb 2004 | A1 |
20040024753 | Chane et al. | Feb 2004 | A1 |
20040025180 | Begeja et al. | Feb 2004 | A1 |
20040031015 | Ben-Romdhane et al. | Feb 2004 | A1 |
20040031058 | Reisman | Feb 2004 | A1 |
20040031062 | Lemmons | Feb 2004 | A1 |
20040039754 | Harple | Feb 2004 | A1 |
20040073915 | Dureau | Apr 2004 | A1 |
20040078814 | Allen | Apr 2004 | A1 |
20040107437 | Reichardt et al. | Jun 2004 | A1 |
20040107439 | Hassell et al. | Jun 2004 | A1 |
20040111465 | Chuang et al. | Jun 2004 | A1 |
20040128699 | Delpuch et al. | Jul 2004 | A1 |
20040133923 | Watson et al. | Jul 2004 | A1 |
20040136698 | Mock | Jul 2004 | A1 |
20040168186 | Rector et al. | Aug 2004 | A1 |
20040172648 | Xu et al. | Sep 2004 | A1 |
20040189658 | Dowdy | Sep 2004 | A1 |
20040194136 | Finseth et al. | Sep 2004 | A1 |
20040199578 | Kapczynski et al. | Oct 2004 | A1 |
20040221306 | Noh | Nov 2004 | A1 |
20040224723 | Farcasiu | Nov 2004 | A1 |
20040225751 | Urali | Nov 2004 | A1 |
20040226051 | Carney et al. | Nov 2004 | A1 |
20050005288 | Novak | Jan 2005 | A1 |
20050015796 | Bruckner et al. | Jan 2005 | A1 |
20050015804 | LaJoie et al. | Jan 2005 | A1 |
20050028208 | Ellis et al. | Feb 2005 | A1 |
20050086172 | Stefik | Apr 2005 | A1 |
20050125835 | Wei | Jun 2005 | A1 |
20050149972 | Knudson | Jul 2005 | A1 |
20050155063 | Bayrakeri et al. | Jul 2005 | A1 |
20050160458 | Baumgartner | Jul 2005 | A1 |
20050259147 | Nam et al. | Nov 2005 | A1 |
20050262542 | DeWeese et al. | Nov 2005 | A1 |
20050283800 | Ellis et al. | Dec 2005 | A1 |
20050287948 | Hellwagner et al. | Dec 2005 | A1 |
20060004743 | Murao et al. | Jan 2006 | A1 |
20060059525 | Jerding et al. | Mar 2006 | A1 |
20060068818 | Leitersdorf et al. | Mar 2006 | A1 |
20060080707 | Laksono | Apr 2006 | A1 |
20060080716 | Nishikawa et al. | Apr 2006 | A1 |
20060104511 | Guo et al. | May 2006 | A1 |
20060105793 | Gutowski et al. | May 2006 | A1 |
20060125962 | Shelton et al. | Jun 2006 | A1 |
20060143191 | Cho et al. | Jun 2006 | A1 |
20060156336 | Knudson et al. | Jul 2006 | A1 |
20060195865 | Fablet | Aug 2006 | A1 |
20060200842 | Chapman et al. | Sep 2006 | A1 |
20060206470 | McIntyre | Sep 2006 | A1 |
20060206912 | Klarfeld et al. | Sep 2006 | A1 |
20060233514 | Weng et al. | Oct 2006 | A1 |
20060248572 | Kitsukama et al. | Nov 2006 | A1 |
20070019001 | Ha | Jan 2007 | A1 |
20070050343 | Siddaramappa et al. | Mar 2007 | A1 |
20070064715 | Lloyd et al. | Mar 2007 | A1 |
20070083538 | Roy et al. | Apr 2007 | A1 |
20070112761 | Xu et al. | May 2007 | A1 |
20070211762 | Song et al. | Sep 2007 | A1 |
20070214123 | Messer et al. | Sep 2007 | A1 |
20070214488 | Nguyen et al. | Sep 2007 | A1 |
20070220016 | Estrada et al. | Sep 2007 | A1 |
20070239707 | Collins et al. | Oct 2007 | A1 |
20070250901 | McIntire et al. | Oct 2007 | A1 |
20070260700 | Messer | Nov 2007 | A1 |
20070261072 | Boulet et al. | Nov 2007 | A1 |
20070271587 | Rowe | Nov 2007 | A1 |
20080037722 | Klassen | Feb 2008 | A1 |
20080060011 | Kelts | Mar 2008 | A1 |
20080071770 | Schloter et al. | Mar 2008 | A1 |
20080092201 | Agarwal et al. | Apr 2008 | A1 |
20080113504 | Lee et al. | May 2008 | A1 |
20080126109 | Cragun et al. | May 2008 | A1 |
20080133504 | Messer et al. | Jun 2008 | A1 |
20080148317 | Opaluch | Jun 2008 | A1 |
20080163304 | Ellis | Jul 2008 | A1 |
20080183681 | Messer et al. | Jul 2008 | A1 |
20080183698 | Messer et al. | Jul 2008 | A1 |
20080189740 | Carpenter et al. | Aug 2008 | A1 |
20080196070 | White et al. | Aug 2008 | A1 |
20080204595 | Rathod et al. | Aug 2008 | A1 |
20080208796 | Messer et al. | Aug 2008 | A1 |
20080208839 | Sheshagiri et al. | Aug 2008 | A1 |
20080221989 | Messer et al. | Sep 2008 | A1 |
20080235209 | Rathod et al. | Sep 2008 | A1 |
20080235393 | Kunjithapatham et al. | Sep 2008 | A1 |
20080235725 | Hendricks | Sep 2008 | A1 |
20080250010 | Rathod et al. | Oct 2008 | A1 |
20080256097 | Messer et al. | Oct 2008 | A1 |
20080266449 | Rathod et al. | Oct 2008 | A1 |
20080276278 | Krieger et al. | Nov 2008 | A1 |
20080282294 | Carpenter et al. | Nov 2008 | A1 |
20080288641 | Messer et al. | Nov 2008 | A1 |
20080288644 | Gilfix et al. | Nov 2008 | A1 |
20080301320 | Morris | Dec 2008 | A1 |
20080301732 | Archer et al. | Dec 2008 | A1 |
20080317233 | Rey et al. | Dec 2008 | A1 |
20090006315 | Mukherjea et al. | Jan 2009 | A1 |
20090019485 | Ellis et al. | Jan 2009 | A1 |
20090024629 | Miyauchi | Jan 2009 | A1 |
20090025054 | Gibbs et al. | Jan 2009 | A1 |
20090083257 | Bargeron et al. | Mar 2009 | A1 |
20090094113 | Berry et al. | Apr 2009 | A1 |
20090094632 | Newnam et al. | Apr 2009 | A1 |
20090094651 | Damm et al. | Apr 2009 | A1 |
20090123021 | Jung et al. | May 2009 | A1 |
20090133025 | Malhotra et al. | May 2009 | A1 |
20090164904 | Horowitz et al. | Jun 2009 | A1 |
20090183210 | Andrade | Jul 2009 | A1 |
20090222872 | Schlack | Sep 2009 | A1 |
20090228441 | Sandvik | Sep 2009 | A1 |
20090240650 | Wang et al. | Sep 2009 | A1 |
20090249427 | Dunnigan et al. | Oct 2009 | A1 |
20090271829 | Larsson et al. | Oct 2009 | A1 |
20090288132 | Hegde | Nov 2009 | A1 |
20090292548 | Van Court | Nov 2009 | A1 |
20100023966 | Shahraray et al. | Jan 2010 | A1 |
20100077057 | Godin et al. | Mar 2010 | A1 |
20100079670 | Frazier | Apr 2010 | A1 |
20100175084 | Ellis et al. | Jul 2010 | A1 |
20100180300 | Carpenter et al. | Jul 2010 | A1 |
20100223640 | Reichardt et al. | Sep 2010 | A1 |
20100250190 | Zhang et al. | Sep 2010 | A1 |
20100251284 | Ellis et al. | Sep 2010 | A1 |
20100257548 | Lee et al. | Oct 2010 | A1 |
20110055282 | Hoving | Mar 2011 | A1 |
20110058101 | Earley et al. | Mar 2011 | A1 |
20110087348 | Wong | Apr 2011 | A1 |
20110093909 | Roberts et al. | Apr 2011 | A1 |
20110131204 | Bodin et al. | Jun 2011 | A1 |
20110176787 | DeCamp | Jul 2011 | A1 |
20110209180 | Ellis et al. | Aug 2011 | A1 |
20110211813 | Marks | Sep 2011 | A1 |
20110214143 | Rits et al. | Sep 2011 | A1 |
20110219386 | Hwang et al. | Sep 2011 | A1 |
20110219419 | Reisman | Sep 2011 | A1 |
20110225417 | Maharajh | Sep 2011 | A1 |
20110246495 | Mallinson | Oct 2011 | A1 |
20110247042 | Mallinson | Oct 2011 | A1 |
20120002111 | Sandoval et al. | Jan 2012 | A1 |
20120011550 | Holland | Jan 2012 | A1 |
20120054811 | Spears | Mar 2012 | A1 |
20120117151 | Bill | May 2012 | A1 |
20120192226 | Zimmerman et al. | Jul 2012 | A1 |
20120227073 | Hosein et al. | Sep 2012 | A1 |
20120233646 | Coniglio et al. | Sep 2012 | A1 |
20120295686 | Lockton | Nov 2012 | A1 |
20120324002 | Chen | Dec 2012 | A1 |
20120324494 | Burger et al. | Dec 2012 | A1 |
20120324495 | Matthews, III et al. | Dec 2012 | A1 |
20120324518 | Thomas et al. | Dec 2012 | A1 |
20130014155 | Clarke et al. | Jan 2013 | A1 |
20130040623 | Chun et al. | Feb 2013 | A1 |
20130051770 | Sargent | Feb 2013 | A1 |
20130103446 | Bragdon et al. | Apr 2013 | A1 |
20130110769 | Ito | May 2013 | A1 |
20130111514 | Slavin et al. | May 2013 | A1 |
20130170813 | Woods et al. | Jul 2013 | A1 |
20130176493 | Khosla | Jul 2013 | A1 |
20130198642 | Carney et al. | Aug 2013 | A1 |
20130262997 | Markworth et al. | Oct 2013 | A1 |
20130298038 | Spivack et al. | Nov 2013 | A1 |
20130316716 | Tapia et al. | Nov 2013 | A1 |
20130326570 | Cowper et al. | Dec 2013 | A1 |
20130332839 | Frazier et al. | Dec 2013 | A1 |
20130332852 | Castanho et al. | Dec 2013 | A1 |
20130347018 | Limp et al. | Dec 2013 | A1 |
20130347030 | Oh et al. | Dec 2013 | A1 |
20140006951 | Hunter | Jan 2014 | A1 |
20140009680 | Moon et al. | Jan 2014 | A1 |
20140032473 | Enoki et al. | Jan 2014 | A1 |
20140068648 | Green et al. | Mar 2014 | A1 |
20140075465 | Petrovic et al. | Mar 2014 | A1 |
20140089423 | Jackels | Mar 2014 | A1 |
20140089967 | Mandalia et al. | Mar 2014 | A1 |
20140129570 | Johnson | May 2014 | A1 |
20140149918 | Asokan et al. | May 2014 | A1 |
20140150022 | Oh et al. | May 2014 | A1 |
20140237498 | Ivins | Aug 2014 | A1 |
20140267931 | Gilson et al. | Sep 2014 | A1 |
20140279852 | Chen | Sep 2014 | A1 |
20140280695 | Sharma et al. | Sep 2014 | A1 |
20140282122 | Mathur | Sep 2014 | A1 |
20140325359 | Vehovsky et al. | Oct 2014 | A1 |
20140327677 | Walker | Nov 2014 | A1 |
20140359662 | Packard | Dec 2014 | A1 |
20140365302 | Walker | Dec 2014 | A1 |
20140373032 | Merry et al. | Dec 2014 | A1 |
20150026743 | Kim et al. | Jan 2015 | A1 |
20150263923 | Kruglick | Sep 2015 | A1 |
Number | Date | Country |
---|---|---|
0624039 | Nov 1994 | EP |
0963115 | Dec 1999 | EP |
1058999 | Dec 2000 | EP |
1080582 | Mar 2001 | EP |
2323489 | Sep 1998 | GB |
2448874 | Nov 2008 | GB |
2448875 | Nov 2008 | GB |
9963757 | Dec 1999 | WO |
0011869 | Mar 2000 | WO |
0033576 | Jun 2000 | WO |
0110115 | Feb 2001 | WO |
0182613 | Nov 2001 | WO |
02063426 | Aug 2002 | WO |
02063471 | Aug 2002 | WO |
02063851 | Aug 2002 | WO |
02063878 | Aug 2002 | WO |
03009126 | Jan 2003 | WO |
20031026275 | Mar 2003 | WO |
2007115224 | Oct 2007 | WO |
2008053132 | May 2008 | WO |
2011053271 | May 2011 | WO |
2012094105 | Jul 2012 | WO |
2012154541 | Nov 2012 | WO |
Entry |
---|
European Patent Application No. 09175979.5—Office Action dated Dec. 13, 2011. |
Canadian Patent Application No. 2,685,833—Office Action dated Jan. 20, 2012. |
Li, Y. et al., "Reliable Video Clock Time Recognition", Pattern Recognition, 2006, ICPR 2006, 18th International Conference on Pattern Recognition, 4 pages. |
European Search Report dated Mar. 1, 2010. |
Salton et al., "Computer Evaluation of Indexing and Text Processing", Journal of the Association for Computing Machinery, vol. 15, No. 1, Jan. 1968, pp. 8-36. |
Smith, J.R. et al., "An Image and Video Search Engine for the World-Wide Web", Storage and Retrieval for Image and Video Databases 5, San Jose, Feb. 13-14, 1997, Proceedings of SPIE, Bellingham, SPIE, US, vol. 3022, Feb. 13, 1997, pp. 84-95. |
Kontothanassis, Leonidas et al., "Design, Implementation, and Analysis of a Multimedia Indexing and Delivery Server", Technical Report Series, Aug. 1999, Cambridge Research Laboratory. |
Messer, Alan et al., “SeeNSearch: A context Directed Search Facilitator for Home Entertainment Devices”, Paper, Samsung Information Systems America Inc., San Jose, CA. |
Boulgouris N. V. et al., “Real-Time Compressed-Domain Spatiotemporal Segmentation and Ontologies for Video Indexing and Retrieval”, IEEE Transactions on Circuits and Systems for Video Technology, vol. 14, No. 5, pp. 606-621, May 2004. |
Changsheng Xu et al., “Using Webcast Text for Semantic Event Detection in Broadcast Sports Video”, IEEE Transactions on Multimedia, vol. 10, No. 7, pp. 1342-1355, Nov. 2008. |
Liang Bai et al., “Video Semantic Content Analysis based on Ontology”, International Machine Vision and Image Processing Conference, pp. 117-124, Sep. 2007. |
Koskela M. et al., “Measuring Concept Similarities in Multimedia Ontologies: Analysis and Evaluations”, IEEE Transactions on Multimedia, vol. 9, No. 5, pp. 912-922, Aug. 2007. |
Steffen Staab et al., "Semantic Multimedia", Reasoning Web; Lecture Notes in Computer Science, pp. 125-170, Sep. 2008. |
European Search Report for Application No. 09180776.8, dated Jun. 7, 2010, 9 pages. |
European Search Report, EP 09 18 0762, completion date Mar. 22, 2010. |
European Search Report dated Jun. 4, 2010. |
EP Application No. 09 179 987.4-1241—Office Action dated Feb. 15, 2011. |
European Application No. 09 175 979.5—Office Action dated Apr. 11, 2011. |
Fernando Pereira, “The MPEG-4 Book”, Prentice Hall, Jul. 10, 2002. |
Michael Adams, "OpenCable Architecture", Cisco Press, Dec. 3, 1999. |
Andreas Kraft and Klaus Hofrichter, "An Approach for Script-Based Broadcast Application Production", Springer-Verlag Berlin Heidelberg, pp. 74-82, 1999. |
Mark Riehl, "XML and Perl", Sams, Oct. 16, 2002. |
MetaTV, Inc., PCT/US02/29917 filed Sep. 19, 2002, International Search Report dated Apr. 14, 2003; ISA/US; 6 pages. |
Sylvain Devillers, “Bitstream Syntax Definition Language: an Input to MPEG-21 Content Representation”, Mar. 2001, ISO, ISO/IEC JTC1/SC29/WG11 MPEG01/M7053. |
Shim, et al., “A SMIL Based Graphical Interface for Interactive TV”, Internet Tech. Laboratory Dept. of Comp. Engineering, San Jose State University, pp. 257-266, 2003. |
Yoon, et al., "Video Gadget: MPEG-7 Based Audio-Visual Content Indexing and Browsing Engine", LG Electronics Institute of Technology, pp. 59-68. |
Watchwith webpage; http://www.watchwith.com/content_owners/watchwith_plalform_components.jsp (last visited Mar. 12, 2013). |
Matt Duffy; TVplus App reveals content click-through rates north of 10% across sync enabled programming; http://www.tvplus.com/blog/TVplus-App-reveals-content-click-through-rates-north-of-10-Percent-across-sync-enabled-programming (retrieved from the Wayback Machine on Mar. 12, 2013). |
"In Time for Academy Awards Telecast, Companion TV App Umami Debuts First Real-Time Sharing of a TV Program's Images"; Umami News; http://www.umami.tv/2012-02-23.html (retrieved from the Wayback Machine on Mar. 12, 2013). |
European Extended Search Report—EP 13192112.4—dated May 11, 2015. |
Boronat, F. et al., "Multimedia group and inter-stream synchronization techniques: A comparative study", Information Systems, Pergamon Press, Oxford, GB, vol. 34, No. 1, Mar. 1, 2009, pp. 108-131, XP025644936. |
Extended European Search Report—EP14159227.9—dated Sep. 3, 2014. |
Canadian Office Action—CA 2,685,833—dated Jan. 22, 2015. |
CA Response to Office Action—CA Appl. 2,685,833—dated Jul. 17, 2015. |
CA Office Action—CA App 2,685,833—dated Jan. 27, 2016. |
Response to European Office Action—European Appl. 13192112.4—dated Dec. 9, 2015. |
European Office Action—EP App 14159227.9—dated Jul. 12, 2016. |
Agnieszka Zagozdzinnska et al. “TRIDAQ Systems in HEP Experiments at LHC Accelerator” Kwartalnik Elektroniki I Telekomunikacji, vol. 59, No. 4, Jan. 1, 2013. |
CA Office Action—CA Application 2685833—dated Feb. 8, 2017. |
Nov. 29, 2017—Canadian Office Action—CA 2,685,833. |
Mar. 9, 2018—European Office Action—EP 13192112.4. |
Feb. 19, 2018—European Summons to Oral Proceedings—EP 14159227.9. |
Jul. 31, 2018—European Decision to Refuse—14159227.9. |
Sep. 5, 2019—Canadian Office Action—CA 2,685,833. |
Nov. 6, 2019—Canadian Office Action—CA 2,832,800. |
Number | Date | Country | |
---|---|---|
20140282745 A1 | Sep 2014 | US |