Dynamic media recording

Information

  • Patent Grant
  • Patent Number
    10,297,287
  • Date Filed
    Monday, October 21, 2013
  • Date Issued
    Tuesday, May 21, 2019
Abstract
A stream of media content is received in a media device. A value is determined in metadata of the media content relating to an indicia of interest in a portion of the media content. A first clip including the portion of the media content is stored based at least in part on the indicia of interest, whereby the first clip is made available for later retrieval.
Description
BACKGROUND

Digital video recorders (DVRs) and the like may be used to record presentations of media content, such as sporting events, political events, etc. However, even when the subject of an item of media content is of interest to a user, e.g., a football fan may generally be interested in a football game, some or all portions of the item of media content may not be of interest to the user. For example, a user may not be interested in seeing an entire football game between teams the user does not follow, or may not have time to watch an entire game. Unfortunately, mechanisms are lacking to allow a user to record and view only portions of items of media content of interest to the user, e.g., exciting portions of a football game or other sporting event.





DRAWINGS


FIG. 1 is a block diagram of an exemplary media content delivery system.



FIG. 2 is a block diagram of exemplary media content.



FIG. 3 is a diagram of an exemplary process for recording media content.





DETAILED DESCRIPTION
Introduction


FIG. 1 is a block diagram of an exemplary media content delivery system 100. Using certain elements of the system 100 as disclosed herein, in an exemplary implementation, a digital media processing device 140 may selectively, e.g., dynamically, record portions of a video stream determined likely to be of interest, e.g., exciting, to a viewer. Accordingly, the media processing device 140 may be used to generate and store clips 155. Further, the clips 155 may be presented to a user in a variety of ways, e.g., presented for selective retrieval by a user, one or more clips 155 may be arranged together to create a custom program for a user, etc.


Accordingly, with reference to certain of the exemplary elements shown in FIG. 1, a media source 105 in the system 100 includes media content 110, e.g., streaming content such as a video presentation which in the context of the system 100 may be an event such as a sporting event or other public event, or a presentation, e.g., a news presentation, etc. The media content 110 may be provided via a network 130 to a media device 140 that is generally located in a customer premises 135. Media content 110 may include media data 115, e.g., frames of video and associated audio, along with metadata 120 describing various attributes and/or portions of the media data 115. The media source 105 may also store, e.g., included in the metadata 120, interest data 125 related to an item of media content 110. The interest data 125 generally includes indicia of likely user interest, e.g., a tag or keyword or the like, or a numerical value indicating or rating a likely level of user interest in a portion of the media data 115, e.g., relating to an excitement level or the like for the portion of media data 115. A recording module 145 included in the media device 140 may use the metadata 120, including the interest data 125, and possibly also a set of rules 150, to generate one or more clips 155 of the content 110.


Exemplary System Elements

Media Source


In general, media source 105 may include multiple elements for processing, storing, and providing media content 110 and related data. Elements of the media source 105 may be local to one another and/or may be distributed amongst multiple locations. For example, media source 105 may include one or more computer servers (some or all of which may be referred to as “media servers”) and data storage devices, e.g., for storing and processing content 110 and other data such as discussed herein.


In general, the media source 105 may be any one or some combination of various mechanisms for delivering media content 110, e.g., one or more computing devices and storage devices, and may depend on a type of media content 110 being provided. By way of example and not limitation, media content 110 data may be provided as video-on-demand through a cable, satellite, or internet protocol television (IPTV) distribution system, as streaming Internet video data, or as some other kind of data. Accordingly, the media source 105 may include one or more of a cable or satellite television headend, a video streaming service that generally includes a multimedia web server (or some other computing device), or some other mechanism for delivering multimedia data. In general, examples of media content 110 include various types of data, including audio, video, images, etc.


Media content 110 is generally delivered via the network 130 in a digital format, e.g., as compressed audio and/or video data. The media content 110 generally includes, according to such digital format, media data 115 and media metadata 120. For example, MPEG refers to a set of standards generally promulgated by the International Organization for Standardization/International Electrotechnical Commission (ISO/IEC) Moving Picture Experts Group (MPEG). H.264 refers to a standard promulgated by the International Telecommunication Union (ITU). Accordingly, by way of example and not limitation, media content 110 may be provided in a format such as the MPEG-1, MPEG-2, or H.264/MPEG-4 Advanced Video Coding (AVC) standards (H.264 and MPEG-4 AVC at present being technically consistent), or according to some other standard or standards. For example, media content 110 could be audio data formatted according to standards such as MPEG-2 Audio Layer III (MP3), Advanced Audio Coding (AAC), etc. Further, the foregoing standards generally provide for including metadata, e.g., media metadata 120, along with media data 115 in a file of media content 110 (and moreover, as discussed elsewhere herein, the metadata 120 may include other elements such as interest data 125).


Media content 110 includes media content as it is usually provided for general distribution, e.g., a sports or news program, etc., in the form provided by a distributor of the media content 110 via a media source 105. Alternatively or additionally, media content 110 may be modified from the form provided by a general distributor of content (e.g., recompressed, re-encoded, etc.). In any case, media data 115 generally includes data by which a display, playback, representation, etc. of the media content 110 is presented by a media device 140, e.g., on a display device such as a monitor, television set, etc. For example, media data 115 generally includes units of encoded and/or compressed video data, e.g., frames of an MPEG file or stream.


Media metadata 120 may include metadata as provided by an encoding standard such as an MPEG standard. Alternatively or additionally, media metadata 120 could be stored and/or provided separately to a media device 140, apart from media data 115. In general, media metadata 120 provides general descriptive information for an item of media content 110. Examples of media metadata 120 include information such as content 110 title, chapter, actor information, Motion Picture Association of America (MPAA) rating information, reviews, and other information that describes an item of media content 110. Information for metadata 120 may be gathered from a content producer, e.g., a movie studio, media information aggregators, and other sources such as critical movie reviews.


As already mentioned, the metadata 120 may include other elements such as interest data 125. Accordingly, generally as part of metadata 120 in media content 110, interest data 125 may be provided from the media source 105 to one or more media devices 140. The interest data 125 generally includes one or more indicia of interest, e.g., a numerical excitement or interest rating, a descriptive keyword or tag, etc., relating to a portion or portions of media data 115. Interest data 125 may be provided according to a variety of mechanisms, e.g., a third party vendor may supply interest data 125 concerning an item of media content 110 in real time or near real time as the media content 110, e.g., a live sporting event, is made available from the media source 105.


In addition, to specify exciting and/or interesting portions of media data 115, interest data 125 generally further includes a pointer or pointers or the like to a location or locations in media data 115, e.g., according to timestamps or other indices. Such pointers may be used to access the portion or portions of media data 115 to which they refer. For example, FIG. 2 is a block diagram of an exemplary item of media content 110 with interest data 125 specifying various segments in the media content 110. As stated above, media data 115 is typically an encoded (e.g., MPEG) video stream or file. Metadata 120 includes indices or the like according to which interest data 125 may point to a specified segment (or segments), e.g., a specified set of frames of the media data 115 included in the content 110. Accordingly, clips 155a, 155b, 155c, etc., in a file or stream of media content 110 may be specified according to respective indicia of interest, and pointers to respective portions of media data 115, included in respective items of interest data 125.
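By way of illustration only, the following sketch shows one plausible shape for interest data 125 and its pointers into media data 115. The patent does not prescribe a data model, so all type and field names here (InterestDatum, start_ts, etc.) are hypothetical.

```python
# A minimal sketch, assuming interest data 125 carries a rating, tags, and
# timestamp pointers delimiting a segment of media data 115. Hypothetical names.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class InterestDatum:
    rating: int            # e.g., excitement on a 1-100 scale
    tags: List[str]        # e.g., ["touchdown", "kickoff-return"]
    start_ts: float        # pointer to a starting location in media data 115
    end_ts: float          # pointer to an ending location in media data 115

@dataclass
class MediaContent:
    media_uri: str                                                   # media data 115
    interest_data: List[InterestDatum] = field(default_factory=list) # part of metadata 120

def segments_of_interest(content: MediaContent, threshold: int) -> List[Tuple[float, float]]:
    """Return (start, end) pointers for every segment rated at or above threshold."""
    return [(d.start_ts, d.end_ts)
            for d in content.interest_data
            if d.rating >= threshold]

game = MediaContent("udp://example/stream",
                    [InterestDatum(82, ["touchdown"], 310.0, 355.5),
                     InterestDatum(40, ["timeout"], 600.0, 630.0)])
print(segments_of_interest(game, 75))   # -> [(310.0, 355.5)]
```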


Network


Communications to and from the media source 105, customer premises 135, and one or more remote sites 170 may occur via the network 130. In general, the network 130 represents one or more mechanisms for delivering content 110 from the media source 105 to a media device 140. Accordingly, the network 130 may be one or more of various wired or wireless communication mechanisms, including any desired combination of wired (e.g., cable and fiber) and/or wireless (e.g., cellular, wireless, satellite, microwave, and radio frequency) communication mechanisms and any desired network topology (or topologies when multiple communication mechanisms are utilized). Exemplary communication networks include wireless communication networks, local area networks (LAN) and/or wide area networks (WAN), including the Internet, etc.


Customer Premises


Turning to the customer premises 135, the media device 140 is generally a device including a computer processor and associated storage, e.g., volatile memory, nonvolatile memory, etc., and capable of communicating via the network 130. Exemplary media devices 140 include a set-top box that includes or is coupled to a digital video recorder (DVR), a personal computer such as a laptop, handheld, or tablet computer, a smart phone, etc. Accordingly, it is to be understood that a media device 140 may be a mobile device rather than being located in a physical customer premises all (or even some) of the time.


The media device 140 may include a display and/or may be connected to a display device, e.g., a television, or may incorporate a display device, e.g., a display of a smartphone, tablet or personal computer. When content 110 is referred to herein as being “displayed,” it is to be understood that such display could include any possible mode of displaying media data, such as a display of visual data, audio data, etc. For example, content 110 could be displayed by showing video or image data on a screen with or without sound, by playing audio data with or without a visual display, etc.


The media device 140 generally includes a recording module 145, e.g., a set of instructions stored on a memory of the device 140, and executable by a processor thereof. The recording module 145 is generally configured to identify a portion or portions of media data 115 in media content 110 that meets at least one pre-determined criterion of user interest. The recording module 145 is further generally configured to make this identification according to the indicia of interest included in interest data 125, as well as possibly according to one or more rules 150.


For example, the recording module 145 could be configured to identify interest data 125 included in metadata 120 for a stream of content data 110, and to determine whether the interest data 125 specifies an indicia of interest that should trigger recording of a portion of media data 115. Continuing this example, metadata 120 for each frame included in an MPEG stream of media data 115 could include an interest datum 125 providing an indicia of interest such as a numerical excitement rating, e.g., on a scale from 1 to 100. Then, when the recording module 145 encountered a frame of data 115 associated with an indicia of interest at or above a predetermined threshold, e.g., 75, the recording module 145 could record, i.e., capture for storage, that frame of data 115. Further, any immediately adjacent or contiguous frames of data 115 that met or surpassed the threshold could likewise be captured and included in a clip 155 with the first-identified frame of data 115 that met or surpassed the threshold. Similarly, an interest datum 125 could specify start and end points, e.g., using timestamps, indices, etc., for respective starting and ending locations in a stream of media data 115 between which an indicia of interest had a specified numeric value or associated keyword, a value over a certain threshold, etc.
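The thresholding behavior in this example can be sketched as follows, assuming each frame of media data 115 arrives paired with a numerical excitement rating drawn from metadata 120; the default threshold of 75 comes from the example above, while the function name and types are hypothetical.

```python
# A sketch of threshold-triggered capture: contiguous frames whose rating meets
# or exceeds the threshold are grouped into a clip 155. Not an MPEG demuxer API.
from typing import Iterable, List, Tuple

def capture_clips(frames: Iterable[Tuple[bytes, int]], threshold: int = 75) -> List[List[bytes]]:
    """Group contiguous frames rated at or above the threshold into clips."""
    clips, current = [], []
    for frame, rating in frames:
        if rating >= threshold:
            current.append(frame)      # capture this frame for storage
        elif current:
            clips.append(current)      # rating fell below threshold: close the clip
            current = []
    if current:                        # stream ended while still above threshold
        clips.append(current)
    return clips

stream = [(b"f1", 40), (b"f2", 80), (b"f3", 90), (b"f4", 30), (b"f5", 76)]
print([len(c) for c in capture_clips(stream)])   # -> [2, 1]
```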


Likewise, an interest datum 125 could specify a tag or keyword associated with one or more frames of media data 115, and the recording module 145 could capture for storage, i.e., record, frames of media data 115 in a clip 155, where the captured frames of media data 115 were specified by the tag, keyword, etc.


Rules 150 may provide further parameters, criteria, etc. for capturing a clip 155. Rules 150 may be predefined for all users, i.e., for all media devices 140, and moreover could be stored by a media source 105 and provided and/or updated in a memory of the media device 140 by the media source 105. Alternatively or additionally, rules 150 may be customized for particular users and/or media devices 140. For example, a generic rule 150, i.e., a rule 150 for all media devices 140, could specify that for events identified in metadata 120 as sporting events, an excitement rating above a value of 75 should trigger recording of a clip 155, but for all other events, e.g., news events, an excitement rating above a value of 85 is required. Further, a custom rule 150 could specify that for sporting events identified in metadata 120 as involving a user's favorite team, an excitement rating above a value of 60 is required, but for other sporting events in a specified sport, e.g., baseball, an excitement rating above a value of 90 is required. Additionally or alternatively, as discussed in more detail below, a rule 150 could be used to identify an item of media content 110 for which the recording module 145 should examine interest data 125.


As mentioned above, rules 150 could be defined by a media source 105, and moreover, rules 150 could be included in instructions comprised in the recording module 145. Additionally or alternatively, rules 150 could be specified by a user of a media device 140, e.g., using a graphical user interface (GUI) or the like provided on a display associated with the media device 140. For example, such GUI could allow a user to specify keywords, tags, etc. and/or threshold values for interest ratings, e.g., excitement ratings, to trigger recording of a clip 155. Accordingly, a rule 150 could specify both a keyword and a threshold for a numerical interest rating, wherein a combination of the presence of the keyword and meeting or exceeding the threshold triggers recording of a clip 155 according to instructions in the recording module 145.
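A hedged sketch of how such rules 150 might be represented and evaluated appears below: per-event-type thresholds, a custom per-team override, and a combined keyword-plus-threshold trigger, checked most-specific-first. The Rule schema is illustrative, not taken from the patent.

```python
# Illustrative rule model, assuming event type, teams, and tags are available
# from metadata 120 / interest data 125. All field names are hypothetical.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Rule:
    event_type: Optional[str] = None      # e.g., "sporting_event"; None matches anything
    team: Optional[str] = None            # custom attribute, e.g., a favorite team
    keywords: List[str] = field(default_factory=list)
    threshold: int = 85                   # numerical interest rating to meet or exceed

def should_record(rules, event_type, teams, tags, rating) -> bool:
    # Rules are checked in order, so list the most specific first.
    for r in rules:
        if r.event_type and r.event_type != event_type:
            continue
        if r.team and r.team not in teams:
            continue
        if r.keywords and not set(r.keywords) & set(tags):
            continue                      # keyword AND threshold must both be satisfied
        if rating >= r.threshold:
            return True
    return False

rules = [Rule("sporting_event", team="Favorite FC", threshold=60),
         Rule("sporting_event", threshold=75),
         Rule(threshold=85)]              # generic rule for all other events
print(should_record(rules, "sporting_event", ["Favorite FC"], [], 65))  # -> True
```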


Remote Sites


A remote site 170, as discussed above, may include a social media site, an e-commerce site, a news site, a site providing reference information, etc. A media device 140 could include instructions for allowing a user to specify a remote site 170 for posting a clip 155. For example, a GUI provided by the media device 140 could allow a user to specify one or more clips 155 to be posted to a user's social media account, or could allow a user to specify that a clip 155 will be automatically posted to a user's social media account when the interest indicia in interest data 125 associated with the clip 155 matches certain keywords, tags, etc. and/or meets or exceeds a specified numeric threshold. Further, a user's account on a remote site 170 could be used to provide tags or the like indicating user interest in subjects that could then be matched to one or more interest data 125.
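A minimal sketch of the automatic-posting behavior follows, assuming the user has already authorized the media device 140 against the remote site 170; the endpoint URL and payload format are invented for illustration.

```python
# Post a clip 155 to a remote site 170 when its interest indicia match the
# user's keywords or meet a numeric threshold. Endpoint and payload hypothetical.
import json
import urllib.request

def maybe_post_clip(clip_url, tags, rating, user_keywords, threshold=90,
                    endpoint="https://social.example.com/api/posts"):
    """Post the clip if its tags match user keywords or its rating meets the threshold."""
    if not (set(tags) & set(user_keywords) or rating >= threshold):
        return False
    payload = json.dumps({"clip": clip_url, "tags": tags}).encode()
    req = urllib.request.Request(endpoint, data=payload,
                                 headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req)   # assumes the user's account is already authorized
    return True
```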


Exemplary Process Flows


FIG. 3 is a diagram of an exemplary process 300 for recording media content.


The process 300 begins in a block 305, in which the media device 140 receives, and analyzes interest data 125 in, media content 110. The media content 110 may be a stream of a live event, such as a sporting event or a news event. However, it is possible that the media content 110 is a prerecorded program or the like. In one implementation, the media device 140 includes a digital video recorder (DVR), and uses operations of the DVR to receive and store media content 110. For example, the media device 140 may include instructions to receive certain programming providing certain items of media content 110, e.g., specified sporting events on specified video channels, and to analyze such items of media content 110 for possible generation of a clip or clips 155 as described herein.


In any case, a specific item of media content 110 analyzed in the block 305 may be selected or identified according to a variety of mechanisms. For example, a GUI provided by the media device 140 or some mechanism of the media source 105, e.g., a webpage or the like, could allow a user to make a selection of one or more items of media content 110 for which interest data 125 should be analyzed for possible recording of one or more clips 155. In this example, a user could be presented with a list of possible programs to select.


Alternatively or additionally, a user profile or the like could be stored in the media device 140, and/or stored at the media source 105 and provided to the media device 140, where the user profile includes information to identify items of media content 110 for review by the media device 140 for possible generation of one or more clips 155. Accordingly, such user profile information could be used in a rule 150 indicating items of media content 110 for which the recording module 145 should examine interest data 125. For example, a user profile, which could be generated according to user-provided input, information gathered from user viewing habits, etc., could identify types of programming, and attributes of types of programming, of interest to a user. These program types and/or attributes could be specified in a rule 150. For example, the media device 140 could store and/or could receive from the media source 105 data, e.g., one or more rules 150, indicating that a user of the media device 140 was very interested in the game of football and/or in a particular football team.
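The profile-driven selection described above might look like the following sketch; the profile fields and metadata keys ("event_type", "teams") are assumptions, since the patent leaves the profile format open.

```python
# Select which items of media content 110 the recording module 145 should
# examine, based on a user profile. Dict keys are hypothetical.
def items_to_examine(profile, available_items):
    """Select items whose metadata 120 matches program types or teams in the profile."""
    selected = []
    for item in available_items:
        meta = item["metadata"]
        if meta.get("event_type") in profile.get("program_types", []):
            selected.append(item)
        elif set(meta.get("teams", [])) & set(profile.get("teams", [])):
            selected.append(item)
    return selected

profile = {"program_types": ["football"], "teams": ["Favorite FC"]}
items = [{"id": 1, "metadata": {"event_type": "football", "teams": ["A", "B"]}},
         {"id": 2, "metadata": {"event_type": "news", "teams": []}}]
print([i["id"] for i in items_to_examine(profile, items)])   # -> [1]
```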


In any event, the media device 140 could be configured to analyze interest data 125 for items of media content 110 received from the media source 105 according to broadcasts received over particular viewing channels available from the media source 105 and/or at particular times, according to media content 110 posted in a particular category at a particular website, etc. Further, as discussed above, the media device 140 could use information in metadata 120, e.g., identifying a type of event, teams playing, news figures being reported on, etc., to determine whether to review interest data 125 in an item of media content 110.


Following the block 305, in a block 310, the recording module 145 checks interest data 125 in the media content 110 metadata 120. If the interest data 125 triggers a predetermined interest level, e.g., meets or exceeds a predetermined threshold and/or includes predetermined tags or keywords, then the process 300 proceeds to a block 320. Otherwise, the process 300 proceeds to a block 315.


As mentioned above, a user could use a GUI of the media device 140 to specify a predetermined threshold and/or tags, keywords, etc. used to trigger an identification of an interest level for recording a clip 155. However, the predetermined threshold could also be specified at the media source 105 and downloaded by the recording module 145. Further, it is possible that interest data 125 could indicate that an entire item of media content 110, e.g., an entire football game, has an interest level for a user such that the entire item of media content 110, e.g., a football game, political speech, etc., should be recorded, i.e., a single clip 155 that includes the entire item of media content 110 may be generated.


Further, the recording module 145 may make use of rules 150 in determining whether interest data 125 triggers recording of a clip 155. For example, as mentioned above, a rule 150 could specify an interest data 125 threshold to be met or exceeded for a particular type of media content item 110, e.g., a football game, and/or for an attribute of a type of media content item 110, e.g., the data 125 threshold could be varied according to the presence or absence of a particular football team in the media content item 110. As also mentioned above, the recording module 145 could use a rule 150 to determine whether to examine interest data 125 in an item of media content 110.


The block 320 is executed if it is determined that an interest level has been triggered in the block 310. In the block 320, the recording module 145 records a portion of media data 115 in the item of media content 110 for which it is determined that the interest level has been triggered. For example, the recording module 145 could begin recording frames in an MPEG stream of media data 115 upon reaching a frame that was indicated to have an interest level at or above the predetermined threshold, and could then complete recording upon reaching a frame that the interest data 125 indicated was associated with an interest level below the predetermined threshold. These recorded frames of media data 115 are then included in a clip 155.


Following the block 320, in a block 325, the recording module 145 stores the portion of media data 115 captured in the block 320 as a clip 155. In general, the recording module 145 includes metadata in the clip 155 that may be copied or derived from the metadata 120 included in the item of media content 110. For example, a program or event name may be provided, an interest level, e.g., an excitement level, could be specified, and other attributes of the media content item 110 from which the clip 155 was taken could be indicated, e.g., a sport or news event, teams and/or individuals featured in the clip 155, etc. Further, the clip 155 may be made available to a user in a variety of ways. For example, a GUI provided by the media device 140 could display a list of clips 155 generated within a specified period of time. The clips 155 could be displayed according to various organizational criteria, such as subject matter (e.g., football, hockey, political speeches, etc.), interest levels (e.g., more exciting clips 155 listed ahead of less exciting clips 155), etc.


Further, the module 145 could include instructions to assemble clips 155 into a composite presentation of media content 110, i.e., a presentation of media content 110 including clips 155 taken from one or more items of media content 110 received from a media source 105. For example, the module 145 could assemble clips 155 according to one or more attributes, e.g., subject of the clip 155, excitement level of the clip 155, etc., such as football plays having a high excitement level, or even a particular kind of football play, e.g., kickoff returns, having a high excitement level. Such composite presentations of clips 155 could then be made available in a GUI provided by the device 140, e.g., listed according to a description of attributes of the clips 155, e.g., excitement level, subject matter, etc.
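One way the module 145 might realize such a composite presentation is sketched below: filter stored clips 155 by an attribute such as subject, then order them by excitement level. The clip fields mirror the hypothetical metadata copied into each clip in the block 325 above.

```python
# Assemble a composite presentation from stored clips 155. Field names are
# the hypothetical per-clip metadata used in the earlier sketches.
def assemble_composite(clips, subject=None, min_rating=0):
    """Return clips matching the subject, most exciting first."""
    chosen = [c for c in clips
              if (subject is None or c["subject"] == subject)
              and c["rating"] >= min_rating]
    return sorted(chosen, key=lambda c: c["rating"], reverse=True)

clips = [{"name": "Kickoff return", "subject": "football", "rating": 92},
         {"name": "Field goal", "subject": "football", "rating": 70},
         {"name": "Debate highlight", "subject": "politics", "rating": 88}]
composite = assemble_composite(clips, subject="football", min_rating=75)
print([c["name"] for c in composite])    # -> ['Kickoff return']
```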


A block 315 may follow either the block 310 or the block 325. In the block 315, the recording module 145 determines whether content 110 continues to be received in the media device 140. If so, the process 300 returns to the block 305. Otherwise, the process 300 ends following the block 315.


Conclusion

Computing devices such as those discussed herein generally each include instructions executable by one or more computing devices such as those identified above, for carrying out blocks or steps of processes described above. For example, process blocks discussed above may be embodied as computer-executable instructions.


Computer-executable instructions may be compiled or interpreted from computer programs created using a variety of programming languages and/or technologies, including, without limitation, and either alone or in combination, Java™, C, C++, Visual Basic, JavaScript, Perl, HTML, etc. In general, a processor (e.g., a microprocessor) receives instructions, e.g., from a memory, a computer-readable medium, etc., and executes these instructions, thereby performing one or more processes, including one or more of the processes described herein. Such instructions and other data may be stored and transmitted using a variety of computer-readable media. A file in a computing device is generally a collection of data stored on a computer-readable medium, such as a storage medium, a random access memory, etc.


A computer-readable medium includes any medium that participates in providing data (e.g., instructions), which may be read by a computer. Such a medium may take many forms, including, but not limited to, non-volatile media, volatile media, etc. Non-volatile media include, for example, optical or magnetic disks and other persistent memory. Volatile media include dynamic random access memory (DRAM), which typically constitutes a main memory. Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, any other magnetic medium, a CD-ROM, DVD, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EEPROM, any other memory chip or cartridge, or any other medium from which a computer can read.


In the drawings, the same reference numbers indicate the same elements. Further, some or all of these elements could be changed. With regard to the media, processes, systems, methods, etc. described herein, it should be understood that, although the steps of such processes, etc. have been described as occurring according to a certain ordered sequence, such processes could be practiced with the described steps performed in an order other than the order described herein. It further should be understood that certain steps could be performed simultaneously, that other steps could be added, or that certain steps described herein could be omitted. In other words, the descriptions of processes herein are provided for the purpose of illustrating certain embodiments, and should in no way be construed so as to limit the claimed invention.


Accordingly, it is to be understood that the above description is intended to be illustrative and not restrictive. Many embodiments and applications other than the examples provided would be apparent to those of skill in the art upon reading the above description. The scope of the invention should be determined, not with reference to the above description, but should instead be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled. It is anticipated and intended that future developments will occur in the arts discussed herein, and that the disclosed systems and methods will be incorporated into such future embodiments. In sum, it should be understood that the invention is capable of modification and variation and is limited only by the following claims.


All terms used in the claims are intended to be given their broadest reasonable constructions and their ordinary meanings as understood by those skilled in the art unless an explicit indication to the contrary is made herein. In particular, use of the singular articles such as “a,” “the,” “said,” etc. should be read to recite one or more of the indicated elements unless a claim recites an explicit limitation to the contrary.

Claims
  • 1. A computing device that includes a processor and a memory, the device programmed to:
    identify a media content stream for examining by the computing device;
    receive the media content stream; and
    while the media content stream is being received:
      examine the media content stream based on a rule;
      determine that a first value in metadata corresponds to a parameter indicating interest specified by the rule;
      determine a threshold based on the parameter indicating interest according to the rule, wherein the parameter indicating interest includes an attribute of the media content type; and
      for each of a plurality of successively received portions of the media content stream, perform the steps of:
        determining whether a second value in metadata of the media content stream relating to an indicia of interest in the portion of the media content stream meets or exceeds the threshold; and
        responsive to a determination that the second value meets or exceeds the threshold, automatically storing a first clip including the portion of the media content stream, whereby the first clip is made available for later retrieval;
    wherein the indicia of interest comprises a numerical rating independent of preferences of any single user and made available to a plurality of users.
  • 2. The device of claim 1, wherein the indicia of interest is subjective.
  • 3. The device of claim 1, further programmed to automatically store, along with the first clip, one or more additional clips, thereby creating a composite clip.
  • 4. The device of claim 3, further configured to automatically arrange the first clip and one or more additional clips according to the indicia of interest.
  • 5. The device of claim 1, further programmed to display a graphical user interface including a link to the first clip.
  • 6. The device of claim 1, wherein the media content stream comprises a live broadcast and the indicia of interest is generated in real time or near real time.
  • 7. The device of claim 1, wherein the parameter indicating interest is based on data from an account associated with a user on a remote site.
  • 8. A computing device that includes a processor and a memory, the device programmed to:
    identify a media content stream for examining by the computing device;
    receive the media content stream; and
    while the media content stream is being received:
      examine the media content stream based on a rule;
      determine that a first value in metadata corresponds to a parameter indicating interest specified by the rule;
      establish, based on the rule, a numerical threshold value for a value representing an indicia of interest in the media content stream based on the parameter indicating interest, the indicia of interest being included in metadata for the media content stream; and
      for each of a plurality of successively received portions of the media content stream, perform the steps of:
        determining whether the indicia of interest meets or exceeds the threshold value;
        identifying a portion of the media content stream associated with the indicia of interest; and
        responsive to a determination that the indicia of interest meets or exceeds the threshold, automatically storing a first clip including the portion of the media content stream, whereby the first clip is made available for later retrieval;
    wherein:
      the indicia of interest comprises a subjective numerical rating independent of the preferences of any single user and made available to a plurality of users;
      the rule specifies a parameter indicating interest in the clip in addition to the indicia, wherein the parameter is associated with a particular user; and
      establishing the numerical threshold is based at least in part on the parameter indicating interest.
  • 9. The device of claim 8, wherein the media content stream comprises a live broadcast and the indicia of interest is generated in real time or near real time.
  • 10. The device of claim 8, further configured to automatically store, along with the first clip, one or more additional clips, thereby creating a composite clip.
  • 11. The device of claim 10, further configured to automatically arrange the first clip and one or more additional clips according to the indicia of interest.
  • 12. The device of claim 8, further configured to display a graphical user interface including a link to the first clip.
  • 13. A computing device that includes a processor and a memory, the device programmed to:
    identify a media content stream for examining by the computing device;
    receive the media content stream; and
    while the media content stream is being received:
      examine the media content stream based on a rule;
      determine that a first value in metadata corresponds to a parameter indicating interest specified by the rule;
      determine a threshold based on the parameter indicating interest according to the rule, wherein the parameter indicating interest includes an attribute of the media content type; and
      for each of a plurality of successively received portions of the media content stream, perform the steps of:
        determining whether a second value in metadata of the media content stream relating to an indicia of interest in a portion of the media content stream meets or exceeds a threshold based on the parameter of interest according to the rule;
        responsive to a determination that the second value meets or exceeds the threshold, automatically storing a first clip including the portion of the media content stream, whereby the first clip is made available for later retrieval;
    wherein:
      the indicia of interest comprises a numerical rating independent of preferences of any single user and made available to a plurality of users;
      the media content stream comprises a live broadcast; and
      the indicia of interest is generated in real time or near real time.
  • 14. The device of claim 13, wherein the indicia of interest is subjective.
  • 15. The device of claim 13, further programmed to store, along with the first clip, one or more additional clips, thereby creating a composite clip.
  • 16. The device of claim 13, further programmed to display a graphical user interface including a link to the first clip.
US Referenced Citations (337)
Number Name Date Kind
6005562 Shiga et al. Dec 1999 A
6177931 Alexander et al. Jan 2001 B1
6185527 Petkovic et al. Feb 2001 B1
6195458 Warnick et al. Feb 2001 B1
6681396 Bates et al. Jan 2004 B1
6721490 Yao et al. Apr 2004 B1
7174512 Martin et al. Feb 2007 B2
7197715 Valeria Mar 2007 B1
7386217 Zhang Jun 2008 B2
7543322 Bhogal et al. Jun 2009 B1
7633887 Panwar et al. Dec 2009 B2
7646962 Ellis et al. Jan 2010 B1
7680894 Diot et al. Mar 2010 B2
7774811 Poslinski et al. Aug 2010 B2
7818368 Yang et al. Oct 2010 B2
7825989 Greenberg Nov 2010 B1
7849487 Vosseller Dec 2010 B1
7929808 Seaman et al. Apr 2011 B2
8024753 Kummer et al. Sep 2011 B1
8046798 Schlack et al. Oct 2011 B1
8079052 Chen et al. Dec 2011 B2
8099315 Amento Jan 2012 B2
8104065 Aaby et al. Jan 2012 B2
8140570 Ingrassia et al. Mar 2012 B2
8196168 Bryan et al. Jun 2012 B1
8209713 Lai et al. Jun 2012 B1
8296797 Olstad et al. Oct 2012 B2
8296808 Hardacker et al. Oct 2012 B2
8312486 Briggs et al. Nov 2012 B1
8320674 Guillou et al. Nov 2012 B2
8424041 Candelore et al. Apr 2013 B2
8427356 Satish Apr 2013 B1
8457768 Hammer et al. Jun 2013 B2
8535131 Packard et al. Sep 2013 B2
8595763 Packard et al. Nov 2013 B1
8627349 Kirby et al. Jan 2014 B2
8688434 Birnbaum et al. Apr 2014 B1
8689258 Kemp Apr 2014 B2
8702504 Hughes et al. Apr 2014 B1
8713008 Negi Apr 2014 B2
8752084 Lai et al. Jun 2014 B1
8793579 Halliday et al. Jul 2014 B2
8973038 Gratton Mar 2015 B2
8973068 Kotecha et al. Mar 2015 B2
8990418 Bragg et al. Mar 2015 B1
9038127 Hastings et al. May 2015 B2
9060210 Packard et al. Jun 2015 B2
9066156 Kapa Jun 2015 B2
9213986 Buchheit et al. Dec 2015 B1
9253533 Morgan Feb 2016 B1
9264779 Kirby et al. Feb 2016 B2
9420333 Martch et al. Aug 2016 B2
9451202 Beals Sep 2016 B2
9565474 Petruzzelli et al. Feb 2017 B2
9648379 Howcroft May 2017 B2
9715902 Coviello et al. Jul 2017 B2
20010013123 Freeman et al. Aug 2001 A1
20010026609 Weinstein et al. Oct 2001 A1
20020041752 Abiko et al. Apr 2002 A1
20020059610 Ellis May 2002 A1
20020067376 Martin et al. Jun 2002 A1
20020075402 Robson et al. Jun 2002 A1
20020136528 Dagtas Sep 2002 A1
20020157095 Masumitsu et al. Oct 2002 A1
20020157101 Schrader et al. Oct 2002 A1
20020174430 Ellis et al. Nov 2002 A1
20020178444 Trajkovic et al. Nov 2002 A1
20020180774 Errico et al. Dec 2002 A1
20020194095 Koren Dec 2002 A1
20030012554 Zeidler et al. Jan 2003 A1
20030023742 Allen et al. Jan 2003 A1
20030056220 Thornton et al. Mar 2003 A1
20030066077 Gutta et al. Apr 2003 A1
20030118014 Iyer et al. Jun 2003 A1
20030126605 Betz et al. Jul 2003 A1
20030126606 Buczak et al. Jul 2003 A1
20030154475 Rodriguez et al. Aug 2003 A1
20030172376 Coffin, III Sep 2003 A1
20030188317 Liew et al. Oct 2003 A1
20030189674 Inoue et al. Oct 2003 A1
20030208763 McElhatten et al. Nov 2003 A1
20030229899 Thompson Dec 2003 A1
20040003403 Marsh Jan 2004 A1
20040181807 Theiste et al. Sep 2004 A1
20050005308 Logan Jan 2005 A1
20050015712 Plastina Jan 2005 A1
20050030977 Casey et al. Feb 2005 A1
20050044570 Poslinski Feb 2005 A1
20050071865 Martins Mar 2005 A1
20050071881 Deshpande Mar 2005 A1
20050091690 Delpuch et al. Apr 2005 A1
20050120368 Goronzy et al. Jun 2005 A1
20050125302 Brown et al. Jun 2005 A1
20050149965 Neogi Jul 2005 A1
20050152565 Jouppi et al. Jul 2005 A1
20050154987 Otsuka et al. Jul 2005 A1
20050166230 Gaydou et al. Jul 2005 A1
20050180568 Krause Aug 2005 A1
20050182792 Israel et al. Aug 2005 A1
20050191041 Braun et al. Sep 2005 A1
20050198570 Otsuka et al. Sep 2005 A1
20050204294 Burke Sep 2005 A1
20050240961 Jerding Oct 2005 A1
20050264705 Kitamura Dec 2005 A1
20060020962 Stark et al. Jan 2006 A1
20060085828 Dureau et al. Apr 2006 A1
20060174277 Sezan et al. Aug 2006 A1
20060190615 Panwar et al. Aug 2006 A1
20060218573 Proebstel Sep 2006 A1
20060238656 Chen et al. Oct 2006 A1
20060253581 Dixon et al. Nov 2006 A1
20060282852 Purpura et al. Dec 2006 A1
20060282869 Plourde, Jr. Dec 2006 A1
20070033616 Gutta Feb 2007 A1
20070058930 Iwamoto Mar 2007 A1
20070083901 Bond Apr 2007 A1
20070127894 Ando et al. Jun 2007 A1
20070146554 Strickland et al. Jun 2007 A1
20070154163 Cordray Jul 2007 A1
20070154169 Cordray et al. Jul 2007 A1
20070157235 Teunissen Jul 2007 A1
20070157249 Cordray et al. Jul 2007 A1
20070157253 Ellis et al. Jul 2007 A1
20070157285 Frank et al. Jul 2007 A1
20070162924 Radhakrishnan et al. Jul 2007 A1
20070169165 Crull et al. Jul 2007 A1
20070188655 Ohta Aug 2007 A1
20070199040 Kates Aug 2007 A1
20070204302 Calzone Aug 2007 A1
20070212023 Whillock Sep 2007 A1
20070226766 Poslinski et al. Sep 2007 A1
20070239856 Abadir Oct 2007 A1
20070245379 Agnihotri Oct 2007 A1
20070288951 Ray et al. Dec 2007 A1
20080022012 Wang Jan 2008 A1
20080060006 Shanks et al. Mar 2008 A1
20080064490 Ellis Mar 2008 A1
20080086743 Cheng et al. Apr 2008 A1
20080092168 Logan Apr 2008 A1
20080097949 Kelly et al. Apr 2008 A1
20080109307 Ullah May 2008 A1
20080115166 Bhogal et al. May 2008 A1
20080134043 Georgis et al. Jun 2008 A1
20080155602 Collet Jun 2008 A1
20080159708 Kazama et al. Jul 2008 A1
20080163305 Johnson et al. Jul 2008 A1
20080168503 Sparrell Jul 2008 A1
20080178219 Grannan Jul 2008 A1
20080193016 Lim et al. Aug 2008 A1
20080195457 Sherman et al. Aug 2008 A1
20080235348 Dasgupta Sep 2008 A1
20080239169 Moon et al. Oct 2008 A1
20080244666 Moon et al. Oct 2008 A1
20080270038 Partovi et al. Oct 2008 A1
20080271078 Gossweiler et al. Oct 2008 A1
20080300982 Larson et al. Dec 2008 A1
20080307485 Clement et al. Dec 2008 A1
20080320523 Morris et al. Dec 2008 A1
20090025027 Craner Jan 2009 A1
20090034932 Oisel et al. Feb 2009 A1
20090055385 Jean et al. Feb 2009 A1
20090080857 St. John-Larkin Mar 2009 A1
20090082110 Relyea et al. Mar 2009 A1
20090102984 Arlina et al. Apr 2009 A1
20090138902 Kamen May 2009 A1
20090144777 Mikami et al. Jun 2009 A1
20090158357 Miller Jun 2009 A1
20090178071 Whitehead Jul 2009 A1
20090210898 Childress et al. Aug 2009 A1
20090228911 Vriisen Sep 2009 A1
20090234828 Tu Sep 2009 A1
20090235313 Maruyama et al. Sep 2009 A1
20090249412 Bhogal et al. Oct 2009 A1
20090293093 Igarashi Nov 2009 A1
20090299824 Barnes, Jr. Dec 2009 A1
20090325523 Choi Dec 2009 A1
20100040151 Garrett Feb 2010 A1
20100064306 Tiongson et al. Mar 2010 A1
20100071007 Meijer Mar 2010 A1
20100071062 Choyi et al. Mar 2010 A1
20100086277 Craner Apr 2010 A1
20100089996 Koolar Apr 2010 A1
20100115554 Drouet et al. May 2010 A1
20100122294 Craner May 2010 A1
20100123830 Vunic May 2010 A1
20100125864 Dwyer et al. May 2010 A1
20100146560 Bonfrer Jun 2010 A1
20100153856 Russ Jun 2010 A1
20100153983 Phillmon et al. Jun 2010 A1
20100153999 Yates Jun 2010 A1
20100158479 Craner Jun 2010 A1
20100166389 Knee et al. Jul 2010 A1
20100169925 Takegoshi Jul 2010 A1
20100218214 Fan et al. Aug 2010 A1
20100251295 Amento et al. Sep 2010 A1
20100251304 Donoghue Sep 2010 A1
20100251305 Kimble et al. Sep 2010 A1
20100262986 Adimatyam et al. Oct 2010 A1
20100269144 Forsman et al. Oct 2010 A1
20100319019 Zazza Dec 2010 A1
20100322592 Casagrande Dec 2010 A1
20100333131 Parker et al. Dec 2010 A1
20110016492 Marita Jan 2011 A1
20110016493 Lee et al. Jan 2011 A1
20110019839 Nandury Jan 2011 A1
20110052156 Kuhn Mar 2011 A1
20110072448 Stiers et al. Mar 2011 A1
20110082858 Yu et al. Apr 2011 A1
20110109801 Thomas et al. May 2011 A1
20110161242 Chung Jun 2011 A1
20110173337 Walsh et al. Jul 2011 A1
20110202956 Connelly et al. Aug 2011 A1
20110206342 Thompson et al. Aug 2011 A1
20110212756 Packard et al. Sep 2011 A1
20110217024 Schlieski Sep 2011 A1
20110231887 West et al. Sep 2011 A1
20110239249 Murison et al. Sep 2011 A1
20110243533 Stern et al. Oct 2011 A1
20110252451 Turgeman et al. Oct 2011 A1
20110286721 Craner Nov 2011 A1
20110289410 Paczkowski et al. Nov 2011 A1
20110293113 McCarthy Dec 2011 A1
20120020641 Sakaniwa et al. Jan 2012 A1
20120047542 Lewis Feb 2012 A1
20120052941 Mo Mar 2012 A1
20120060178 Minakuchi et al. Mar 2012 A1
20120082431 Sengupta et al. Apr 2012 A1
20120106932 Grevers, Jr. May 2012 A1
20120110615 Kilar et al. May 2012 A1
20120110616 Kilar et al. May 2012 A1
20120124625 Foote et al. May 2012 A1
20120131613 Ellis et al. May 2012 A1
20120185895 Wong et al. Jul 2012 A1
20120204209 Kuba Aug 2012 A1
20120216118 Lin et al. Aug 2012 A1
20120230651 Chen Sep 2012 A1
20120237182 Eyer Sep 2012 A1
20120246672 Sridhar et al. Sep 2012 A1
20120260295 Rondeau Oct 2012 A1
20120263439 Lassman Oct 2012 A1
20120278834 Richardson Nov 2012 A1
20120278837 Gurtis et al. Nov 2012 A1
20120284745 Strang Nov 2012 A1
20120311633 Mandrekar et al. Dec 2012 A1
20120324491 Bathiche et al. Dec 2012 A1
20130014159 Wiser et al. Jan 2013 A1
20130042179 Cormack Feb 2013 A1
20130055304 Kirby et al. Feb 2013 A1
20130061313 Cullimore et al. Mar 2013 A1
20130073473 Heath Mar 2013 A1
20130074109 Skelton et al. Mar 2013 A1
20130114940 Merzon et al. May 2013 A1
20130128119 Madathodivil et al. May 2013 A1
20130138435 Weber May 2013 A1
20130145023 Li et al. Jun 2013 A1
20130160051 Armstrong et al. Jun 2013 A1
20130174196 Herlein Jul 2013 A1
20130194503 Yamashita Aug 2013 A1
20130226983 Beining et al. Aug 2013 A1
20130251331 Sambongi Sep 2013 A1
20130263189 Garner Oct 2013 A1
20130268620 Osminer Oct 2013 A1
20130268955 Conrad et al. Oct 2013 A1
20130283162 Aronsson et al. Oct 2013 A1
20130291037 Im et al. Oct 2013 A1
20130298146 Conrad et al. Nov 2013 A1
20130298151 Leske et al. Nov 2013 A1
20130325869 Reiley et al. Dec 2013 A1
20130326406 Reiley et al. Dec 2013 A1
20130326575 Robillard et al. Dec 2013 A1
20130332962 Moritz et al. Dec 2013 A1
20130332965 Seyller et al. Dec 2013 A1
20130346302 Purves et al. Dec 2013 A1
20140023348 O'Kelly Jan 2014 A1
20140028917 Smith et al. Jan 2014 A1
20140032709 Saussy et al. Jan 2014 A1
20140062696 Packard et al. Mar 2014 A1
20140067825 Oztaskent et al. Mar 2014 A1
20140067828 Archibong Mar 2014 A1
20140067939 Packard et al. Mar 2014 A1
20140068675 Mountain Mar 2014 A1
20140068692 Archibong et al. Mar 2014 A1
20140074866 Shah Mar 2014 A1
20140082670 Papish Mar 2014 A1
20140114647 Allen Apr 2014 A1
20140114966 Bilinski et al. Apr 2014 A1
20140123160 van Coppenolle et al. May 2014 A1
20140139555 Levy May 2014 A1
20140140680 Jo May 2014 A1
20140150009 Sharma May 2014 A1
20140153904 Adimatyam et al. Jun 2014 A1
20140157327 Roberts et al. Jun 2014 A1
20140161417 Kurupacheril et al. Jun 2014 A1
20140215539 Chen et al. Jul 2014 A1
20140223479 Krishnamoorthi Aug 2014 A1
20140282714 Hussain Sep 2014 A1
20140282741 Shoykhet Sep 2014 A1
20140282744 Hardy et al. Sep 2014 A1
20140282745 Chipman et al. Sep 2014 A1
20140282759 Harvey et al. Sep 2014 A1
20140282779 Navarra Sep 2014 A1
20140294201 Johnson et al. Oct 2014 A1
20140298378 Kelley Oct 2014 A1
20140310819 Cakarel et al. Oct 2014 A1
20140313341 Stribling Oct 2014 A1
20140321831 Olsen et al. Oct 2014 A1
20140325556 Hoang et al. Oct 2014 A1
20140331260 Gratton Nov 2014 A1
20140333841 Steck Nov 2014 A1
20140351045 Abihssira et al. Nov 2014 A1
20140373079 Friedrich et al. Dec 2014 A1
20150003814 Miller Jan 2015 A1
20150012656 Phillips et al. Jan 2015 A1
20150020097 Freed et al. Jan 2015 A1
20150040176 Hybertson et al. Feb 2015 A1
20150052568 Glennon et al. Feb 2015 A1
20150058890 Kapa Feb 2015 A1
20150082172 Shakib et al. Mar 2015 A1
20150095932 Ren Apr 2015 A1
20150118992 Wyatt et al. Apr 2015 A1
20150181132 Kummer et al. Jun 2015 A1
20150181279 Martch et al. Jun 2015 A1
20150189377 Wheatley et al. Jul 2015 A1
20150249803 Tozer et al. Sep 2015 A1
20150249864 Tang et al. Sep 2015 A1
20150281778 Xhafa et al. Oct 2015 A1
20150310725 Koskan et al. Oct 2015 A1
20150334461 Yu Nov 2015 A1
20160066020 Mountain Mar 2016 A1
20160066026 Mountain Mar 2016 A1
20160066049 Mountain Mar 2016 A1
20160066056 Mountain Mar 2016 A1
20160073172 Sharples Mar 2016 A1
20160088351 Petruzzelli et al. Mar 2016 A1
20160191147 Martch Jun 2016 A1
20160198229 Keipert Jul 2016 A1
20160309212 Martch et al. Oct 2016 A1
Foreign Referenced Citations (31)
Number Date Country
1469476 Oct 2004 EP
1 865 716 Dec 2007 EP
2107477 Oct 2009 EP
2 309 733 Apr 2011 EP
2 403 239 Jan 2012 EP
2464138 Jun 2012 EP
2 902 568 Dec 2007 FR
H10 322622 Dec 1998 JP
2005-317165 Nov 2005 JP
2006-245745 Sep 2006 JP
2012-029150 Feb 2012 JP
2013-175854 Sep 2013 JP
2014-157460 Aug 2014 JP
2004 0025073 Mar 2004 KR
2006 0128295 Dec 2006 KR
9837694 Aug 1998 WO
0243353 May 2002 WO
2005059807 Jun 2005 WO
2007064987 Jun 2007 WO
2007098067 Aug 2007 WO
2009073925 Jun 2009 WO
2011040999 Apr 2011 WO
2013016626 Jan 2013 WO
2013166456 Nov 2013 WO
2014072742 May 2014 WO
2014164782 Oct 2014 WO
2014179017 Nov 2014 WO
2016030384 Mar 2016 WO
2016030477 Mar 2016 WO
2016034899 Mar 2016 WO
2016055761 Apr 2016 WO
Non-Patent Literature Citations (94)
Entry
Boxfish|TV's API; www.boxfish.com, 5 pages.
Thuuz Sports, “Frequently Asked Questions”, www.thuuz.com/faq/, 5 pages.
International Search Report for PCT/US2014/060651 dated Jan. 19, 2015 (9 pages).
International Search Report for PCT/US2014/060649 dated Jan. 8, 2015 (9 pages).
International Preliminary Report on Patentability for PCT/US2014/060651 dated May 6, 2016 (7 pages).
International Preliminary Report on Patentability for PCT/US2014/060649 dated May 6, 2016 (6 pages).
International Search Report and Written Opinion for PCT/GB2015/052456 dated Jun. 13, 2016, all pages.
U.S. Appl. No. 13/834,916, filed Mar. 15, 2013, Non-Final Rejection dated Aug. 18, 2016, all pages.
U.S. Appl. No. 14/470,248, filed Aug. 27, 2014, Non Final Office Action dated Jul. 25, 2016, all pages.
U.S. Appl. No. 14/470,279, filed Aug. 27, 2014, Non Final Office Action dated Jul. 19, 2016, all pages.
U.S. Appl. No. 14/479,007, filed Sep. 5, 2014, Non-Final Office Action dated Jul. 27, 2016, 37 pages.
U.S. Appl. No. 13/942,451, filed Jul. 15, 2013, Final Office Action dated Jun. 22, 2016, all pages.
U.S. Appl. No. 14/139,299, filed Dec. 23, 2013, Non Final Office Action dated Jun. 20, 2016, all pages.
U.S. Appl. No. 14/470,392, filed Aug. 27, 2014, Non-Final Office Action dated Aug. 5, 2016, all pages.
U.S. Appl. No. 14/470,415, filed Aug. 27, 2014, Non Final Office Action dated Jul. 29, 2016, all pages.
U.S. Appl. No. 14/297,279, filed Jun. 5, 2014, Final Rejection dated Apr. 22, 2016, 33 pages.
U.S. Appl. No. 14/139,420, filed Dec. 23, 2013, Notice of Allowance dated Mar. 31, 2016, 37 pages.
U.S. Appl. No. 13/801,932, filed Mar. 13, 2013, Non-Final Rejection dated May 20, 2016, 28 pages.
International Preliminary Report on Patentability for PCT/US2014/023466 dated Sep. 15, 2015, 8 pages.
International Search Report and Written Opinion for PCT/EP2015/069461 dated Oct. 1, 2015, 13 pages.
International Search Report and Written Opinion for PCT/EP2015/069456 dated Oct. 5, 2015, all pages.
International Preliminary Report on Patentability for PCT/US2014/033796 dated Nov. 3, 2015, all pages.
International Search Report and Written Opinion for PCT/EP2015/069681 dated Nov. 23, 2015, 12 pages.
U.S. Appl. No. 14/297,322, filed Jun. 5, 2014, Notice of Allowance dated Nov. 5, 2015, 34 pages.
U.S. Appl. No. 14/297,279, filed Jun. 5, 2014, Non-Final Office Action dated Nov. 5, 2015, 45 pages.
U.S. Appl. No. 14/200,864, filed Mar. 7, 2014, Non-Final Office Action dated Feb. 18, 2016, 61 pages.
U.S. Appl. No. 14/479,007, filed Sep. 5, 2014, Final Office Action dated Feb. 22, 2016, 37 pages.
U.S. Appl. No. 14/095,860, filed Dec. 3, 2013, Notice of Allowance dated Oct. 19, 2015, 14 pages.
U.S. Appl. No. 14/071,613, filed Nov. 4, 2013, Final Office Action dated Oct. 8, 2015, 11 pages.
U.S. Appl. No. 14/139,299, filed Dec. 23, 2013, Final Office Action dated Feb. 25, 2016, all pages.
U.S. Appl. No. 14/470,392, filed Aug. 27, 2014, Non Final Office Action dated Nov. 5, 2015, 31 pages.
U.S. Appl. No. 14/470,392, filed Aug. 27, 2014, Final Office Action dated Mar. 4, 2016, all pages.
U.S. Appl. No. 14/470,415, filed Aug. 27, 2014, Non Final Office Action dated Nov. 18, 2015, 28 pages.
U.S. Appl. No. 14/470,415, filed Aug. 27, 2014, Final Office Action dated Mar. 3, 2016, all pages.
U.S. Appl. No. 13/801,932, filed Mar. 13, 2013, Final Office Action dated Dec. 17, 2015, 23 pages.
U.S. Appl. No. 13/834,916, filed Mar. 15, 2013, Final Office Action dated Dec. 14, 2015, 31 pages.
U.S. Appl. No. 14/470,248, filed Aug. 27, 2014, Final Office Action dated Feb. 16, 2016, 26 pages.
U.S. Appl. No. 14/470,279, filed Aug. 27, 2014, Final Office Action dated Jan. 22, 2016, 25 pages.
U.S. Appl. No. 14/591,474, filed Jan. 7, 2015, Non-Final Office Action dated Feb. 12, 2016, 32 pages.
U.S. Appl. No. 14/494,079, filed Sep. 23, 2014, Preinterview first office action dated Feb. 10, 2016, 6 pages.
Office Action for EP 14160140.1, dated Jan. 19, 2016, 5 pages.
International Search Report and Written Opinion for PCT/GB2015/052570, dated Dec. 11, 2015, 13 pages.
U.S. Appl. No. 14/470,248, filed Aug. 27, 2014, Preinterview first office action dated Sep. 4, 2015, 22 pages.
U.S. Appl. No. 14/470,279, filed Aug. 27, 2014, Preinterview first office action dated Aug. 26, 2015, 23 pages.
U.S. Appl. No. 14/479,007, filed Sep. 5, 2014, Non-Final Office Action dated Sep. 1, 2015, 44 pages.
U.S. Appl. No. 14/139,299, filed Dec. 23, 2013, Non Final Office Action dated Aug. 14, 2015, 39 pages.
Jin S H et al., “Intelligent broadcasting system and services for personalized semantic contents consumption”, Expert Systems With Applications, Oxford, GB, vol. 31, No. 1, Jul. 1, 2006, pp. 164-173, XP024962718, ISSN: 0957-4174, DOI: 10.1016/J.ESWA.2005.09.021.
Sung Ho Jin et al., “Real-time content filtering for live broadcasts in TV terminals”, Multimedia Tools and Applications, Kluwer Academic Publishers, BO, vol. 36, No. 3, Jun. 29, 2007 pp. 285-301, XP019578768, ISSN: 1573-7721.
European Search Report for EP 14197940.1 dated Apr. 28, 2015, 13 pages.
U.S. Appl. No. 13/971,579, filed Aug. 20, 2013, Notice of Allowance dated Feb. 27, 2015, 28 pages.
U.S. Appl. No. 13/801,932, filed Mar. 13, 2013, Non Final Office Action dated Jun. 24, 2015, 21 pages.
U.S. Appl. No. 13/834,916, filed Mar. 15, 2013, Non Final Office Action dated Apr. 27, 2015, 22 pages.
U.S. Appl. No. 13/942,451, filed Jul. 15, 2013, Final Office Action dated Apr. 30, 2015, 33 pages.
U.S. Appl. No. 14/071,613, filed Nov. 4, 2013, Non-Final Office Action dated May 18, 2015, 20 pages.
U.S. Appl. No. 14/095,860, filed Dec. 3, 2013, Final Office Action dated May 1, 2015, 18 pages.
U.S. Appl. No. 14/095,860, filed Dec. 3, 2013, Notice of Allowance dated Jul. 13, 2015, 31 pages.
U.S. Appl. No. 14/139,420, filed Dec. 23, 2013, Non-Final Office Action dated Apr. 30, 2015, 27 pages.
U.S. Appl. No. 14/200,864, filed Mar. 7, 2014, Final Office Action dated Jun. 18, 2015, 36 pages.
U.S. Appl. No. 13/834,916, filed Mar. 15, 2013, Final Office Action dated Jan. 12, 2015, 22 pages.
U.S. Appl. No. 13/886,873, filed May 3, 2013, Notice of Allowance dated Oct. 24, 2014, 40 pages.
U.S. Appl. No. 14/095,860, filed Dec. 3, 2013, Non-Final Office Action dated Dec. 26, 2014, 45 pages.
U.S. Appl. No. 14/200,864, filed Mar. 7, 2014, Non-Final Office Action dated Dec. 5, 2014, 35 pages.
International Search Report and Written Opinion of PCT/US2014/033796 dated Sep. 5, 2014, 12 pages.
U.S. Appl. No. 13/971,579, filed Aug. 20, 2013, Non Final Office Action dated Oct. 28, 2014, 35 pages.
Extended European Search Report for EP 14160140.1 dated Jul. 7, 2014, 7 pages.
International Search Report and Written Opinion for PCT/US2014/023466 dated Jul. 10, 2014, 15 pages.
U.S. Appl. No. 13/801,932, filed Mar. 13, 2013, Notice of Allowance dated Nov. 25, 2016, all pages.
U.S. Appl. No. 13/834,916, filed Mar. 15, 2013, Non Final Office Action dated Aug. 8, 2014, 19 pages.
U.S. Appl. No. 13/942,451, filed Jul. 15, 2013, Non Final Office Action dated Jul. 28, 2014, 27 pages.
Extended European Search Report for EP 11166892.7 dated Oct. 6, 2011, 7 pages.
U.S. Appl. No. 14/200,864, filed Mar. 7, 2014, Notice of Allowance dated Sep. 15, 2016, all pages.
U.S. Appl. No. 14/470,248, filed Aug. 27, 2014, Final Office Action dated Dec. 9, 2016, all pages.
U.S. Appl. No. 14/470,279, filed Aug. 27, 2014, Final Office Action dated Dec. 9, 2016, all pages.
U.S. Appl. No. 14/479,007, filed Sep. 5, 2014, Final Office Action dated Jan. 23, 2017, all pages.
U.S. Appl. No. 14/591,474, filed Jan. 7, 2015, Non-Final Office Action dated Dec. 16, 2016, 32 pages.
U.S. Appl. No. 15/195,527, filed Jun. 28, 2016, Non-Final Rejection dated Sep. 30, 2016, all pages.
U.S. Appl. No. 13/942,451, filed Jul. 15, 2013, Non Final Office Action dated Oct. 25, 2016, all pages.
R. Natarajan et al. “Audio-Based Event Detection in Videos—A Comprehensive Survey”, Int. Journal of Engineering and Technology, vol. 6 No. 4 Aug.-Sep. 2014.
Q. Huang et al. “Hierarchical Language Modeling for Audio Events Detection in a Sports Game”, IEEE International Conference on Acoustics, Speech and Signal Processing, 2010.
Q. Huang et al. “Inferring the Structure of a Tennis Game Using Audio Information”, IEEE Trans. on Audio Speech and Language Proc., Oct. 2011.
M. Baillie et al. “Audio-based Event Detection for Sports Video”, International Conference on Image and Video, CIVR 2003.
Y. Rui et al. “Automatically Extracting Highlights for TV Baseball Programs”, Proceedings of the eighth ACM International conference on Multimedia, 2000.
D. A. Sadlier et al. “A Combined Audio-Visual Contribution to Event Detection in Field Sports Broadcast Video. Case Study: Gaelic Football”, Proceedings of the 3rd IEEE International Symposium on Signal Processing and Information Technology, Dec. 2003.
E. Kijak et al. “Audiovisual Integration for Tennis Broadcast Structuring”, Multimedia Tools and Applications, Springer, vol. 30, Issue 3, pp. 289-311, Sep. 2006.
A. Baijal et al. “Sports Highlights Generation Based on Acoustic Events Detection: A Rugby Case Study”, IEEE International Conference on Consumer Electronics (ICCE), 2015.
J. Han et al. “A Unified and Efficient Framework for Court-Net Sports Video Analysis Using 3-D Camera Modeling”, Proceedings vol. 6506, Multimedia Content Access: Algorithms and Systems; 65060F (2007).
Huang-Chia Shih “A Survey on Content-aware Video Analysis for Sports”, IEEE Trans. on Circuits and Systems for Video Technology, vol. 99, No. 9, Jan. 2017.
A. Krizhevsky et al. “ImageNet Classification with Deep Convolutional Neural Networks”, In Proc. NIPS, pp. 1097-1105, 2012.
D. A. Sadlier et al. “Event Detection in Field Sports Video Using Audio-Visual Features and a Support Vector Machine”, IEEE Trans. on Circuits and Systems for Video Technology, vol. 15, No. 10, Oct. 2005.
P. F. Felzenszwalb et al. “Efficient Graph-Based Image Segmentation”, International Journal of Computer Vision, Sep. 2004, vol. 59, Issue 2, pp. 167-181.
C. J. C. Burges “A Tutorial on Support Vector Machines for Pattern Recognition”, Springer, Data Mining and Knowledge Discovery, Jun. 1998, vol. 2, Issue 2, pp. 121-167.
Y.A. LeCun et al. “Efficient BackProp” Neural Networks: Tricks of the Trade. Lecture Notes in Computer Science, vol. 7700, Springer, 2012.
L. Neumann, J. Matas, “Real-Time Scene Text Localization and Recognition”, 5th IEEE Conference on Computer Vision and Pattern Recognition, Jun. 2012.
R. Smith “An Overview of the Tesseract OCR Engine”, International Conference on Document Analysis and Recognition (ICDAR), 2007.
Related Publications (1)
Number Date Country
20150110461 A1 Apr 2015 US