This invention relates to determining the starting and ending times of commercial breaks, as well as the starting and ending times of commercials within those commercial breaks, in audiovisual content (e.g., a television broadcast).
There are a variety of previous approaches to detecting commercials in a television broadcast. However, previous approaches suffer from a flaw in that they act as relatively simple finite state machines with little or no error recovery. Previous approaches sometimes make an erroneous decision regarding a commercial location, an error which only becomes apparent by considering data well before or well after the commercial location in time. Since previous approaches do not consider such data, information revealing the erroneous decision is ignored and the error remains uncorrected.
Merlino et al. of the MITRE Corporation describe a multiple-cue-based method for segmenting news programs, including finding the commercial breaks. The Merlino et al. method uses black frames, audio silence and blank closed-captioning to find commercials. R. Lienhart et al. of the University of Mannheim also describe a multiple-cue-based system for detecting commercials. The Lienhart et al. method uses black frames, scene cuts and a measure of motion in a visual recording to detect commercials. The Informedia project at Carnegie-Mellon University used black frames, scene cuts and lapses in closed-captioning to detect commercials. Additionally, some VCRs come with commercial detection built into the VCR. There are also a number of patents that describe methods for commercial detection, all of which use coincidences of black frames, audio silence and/or certain closed-captioning signals to detect commercials. For example, U.S. Pat. Nos. 4,319,286, 4,750,053, 4,390,904, 4,782,401 and 4,602,297 detect commercials based on these types of coincidences.
All of the previous approaches to commercial detection select commercial start or end times as the times at which some combination of cues, such as a black frame and an audio pause, coincide, with some optional restrictions. Typically, in previous approaches the decision as to whether or not a commercial starts at a particular time “t” is independent of the analogous decision for any other time in the audiovisual broadcast or recording. Previous approaches in which such complete independence does not exist use only a very limited dependence in which the decision at time “t” may be affected by whether or not a commercial was thought to start within some window of time [t−n, . . . , t] prior to the time “t”. Thus, in previous approaches, a commercial detection decision made at any time “t” in a broadcast or recording is not affected by parts of the broadcast or recording following time “t” and only sometimes is affected by limited parts (less than one minute) of the broadcast or recording immediately prior to time “t”. Further, none of the previous approaches have any sort of double-checking or error recovery; once a decision is made for time “t”, by whatever heuristic the approach uses, the decision remains unchanged no matter what happens in the broadcast or recording after time “t” and no matter what other decisions are made before or after time “t”. In summary, previous approaches to commercial detection make decisions as to commercial locations both time-locally and sequentially, i.e., only data from within a narrow time window about a particular time t is considered in making the decision as to whether a commercial starts or ends at that time t, and the decisions are made one at a time and are never reversed.
In accordance with the invention, the starting and ending times of commercial breaks, as well as the starting and ending times of commercials within those commercial breaks, can be found in audiovisual content (e.g., a television broadcast) using a method having characteristics which overcome the above-described disadvantages of the prior art. The invention is implemented as a solution to a “batch optimization” problem in which commercial locations within a set of audiovisual content are detected as a group by choosing a set of commercial locations that optimizes a cost function that can include consideration of, for example, 1) one or more of many types of visual recording, audio recording and/or closed-captioning cues, 2) relative locations of commercials within the audiovisual content, and/or 3) probability models based on statistics obtained regarding characteristics of typical commercials and commercial breaks (e.g., commercial and commercial break duration, separation times of commercials and commercial breaks, likelihood of the presence of a commercial at any given time in a set of audiovisual content). Optimization can be done over the total set of commercial location decisions, rather than on a per-commercial basis. Additionally, the cost function can be iteratively evaluated, increasing the accuracy of commercial location decisions produced by the method. The logic for making decisions regarding detection of commercials in accordance with the invention is orders of magnitude more sophisticated than that of other approaches to commercial detection and produces correspondingly superior results. Additionally, many more types of information (cues) can be used in detection of commercials in accordance with the invention than have been used in other approaches to commercial detection.
The invention can make use of any one of a number of particular novel collections of cues to enable commercial detection. Previous approaches to commercial detection have used some of the cues that can be used in commercial detection according to the invention, but have not used all of the cues or collections of cues that can be used in accordance with the invention. Additionally, cues other than those specifically described herein can be used to enable commercial detection according to the invention.
Embodiments of the invention can have the following advantageous characteristics. The invention can be implemented so that commercial detection is performed iteratively. The invention can be implemented so that audiovisual content occurring before and/or after the time of a possible commercial is considered in deciding whether a commercial is present at that time. In particular, the invention can be implemented so that commercials are detected in a set of audiovisual content by considering all of that set of audiovisual content. The invention can be implemented so that the presence of multiple commercials in a set of audiovisual content can be detected at the same time. The invention can be implemented so that a decision regarding whether a commercial is detected at a particular location in a set of audiovisual content is dependent on the possible detection of one or more commercials at other location(s) in the set of audiovisual content. In particular, the invention can be implemented so that whether a commercial is detected at a particular location in a set of audiovisual content is dependent on all other possible detections of a commercial in the set of audiovisual content. Finally, as discussed above, the invention can be implemented so that commercial detection is based on the evaluation of a variety of novel cues or combinations of cues. Each of these aspects of the invention produces improved commercial detection results as compared to previous approaches to commercial detection.
The invention can be used for a wide variety of applications and purposes, as will be appreciated by those skilled in the art in view of the description herein. For example, commercial detection in accordance with the invention can be useful for enabling the observation of audiovisual content without commercial interruption, recording audiovisual content so that commercials are deleted from the recorded audiovisual content, searching for a particular commercial or a commercial of a particular type within audiovisual content, and customizing commercials in audiovisual content. Additionally, the invention can be used with, or implemented in, for example, conventional network television broadcasts, cable television broadcasts, television set-top boxes and digital VCRs.
Various embodiments of the invention are disclosed in the following detailed description and the accompanying drawings.
The invention can be implemented in numerous ways, including as a process, an apparatus, a system, a composition of matter, a computer readable medium such as a computer readable storage medium or a computer network wherein program instructions are sent over optical or electronic communication links. In this specification, these implementations, or any other form that the invention may take, may be referred to as techniques. A component such as a processor or a memory described as being configured to perform a task includes both a general component that is temporarily configured to perform the task at a given time and a specific component that is manufactured to perform the task. In general, the order of the steps of disclosed processes may be altered within the scope of the invention.
A detailed description of one or more embodiments of the invention is provided below along with accompanying figures that illustrate the principles of the invention. The invention is described in connection with such embodiments, but the invention is not limited to any embodiment. The scope of the invention is limited only by the claims and the invention encompasses numerous alternatives, modifications and equivalents. Numerous specific details are set forth in the following description in order to provide a thorough understanding of the invention. These details are provided for the purpose of example and the invention may be practiced according to the claims without some or all of these specific details. For the purpose of clarity, technical material that is known in the technical fields related to the invention has not been described in detail so that the invention is not unnecessarily obscured.
In step 101 of the method 100, the data representing the audiovisual content (including closed-captioning or other transcription data, if applicable) is analyzed to identify the presence of one or more predetermined types of cues. The location and duration of each cue (i.e., the beginning and ending times of each cue) can be determined. Other characteristics of the cues can be determined as well. Examples of the types of cues that can be identified are discussed in more detail below.
In step 102 of the method 100, the cues are analyzed to identify possible locations of commercial beginnings or endings (candidate times) within the audiovisual content. Examples of such analysis are discussed further below. A score is assigned to each candidate time, the score representing a probability that the candidate time is in fact a beginning or ending of a commercial.
In step 103 of the method 100, the scores associated with each candidate time are adjusted. The score for a candidate time can be adjusted, for example, based on an analysis of one or more cues proximate to the candidate time that are different from the one or more cues used to identify the candidate time. Examples of such adjustments are described in more detail below with respect to step 206 of the method 200.
The invention can be implemented so that after the adjustment of scores in step 103, scores below a specified threshold are eliminated. However, this need not necessarily be done.
In step 104 of the method 100, one or more commercial breaks are constructed based on an evaluation of the adjusted scores of the candidate times and relationships among the candidate times. In particular, commercial breaks can be constructed in the manner described in more detail below with respect to step 208 of the method 200.
The method 100 can be implemented so that step 104 is iteratively performed. This can improve the identification of commercial breaks and individual commercials in a set of audiovisual content by enabling reconsideration of high-scoring candidate times which were not chosen as a commercial start or end time in an earlier performance of the step 104.
The output of the method 200 is a list of commercial break start and end times within the audiovisual content, plus lists of the individual commercial start times (and, therefore, implicitly, the individual commercial end times) within each commercial break. The identification of the start and end times of commercial breaks and individual commercials in the audiovisual content can be used to, for example, enable editing of the audiovisual content. For instance, the detected commercials can be deleted from the audiovisual content or the commercials can be altered in a desired manner. (The modified data, e.g., data representing commercial-less audiovisual content, can be stored on a data storage medium or media). However, a method according to the invention need not be used to edit the audiovisual content within which the commercial breaks and individual commercials have been detected.
In the first step of the method 200, step 201, the input data is identified as it is input to the method 200 (e.g., read from a data storage medium or media). Apparatus for effecting such identification is known to those skilled in the art and will depend on the source of the input data. For example, readers for all useful data storage media are readily available.
The next step of the method 200, step 202, is to decompress the raw data, if necessary. The invention does not require the original input data to be in compressed format, nor does the invention require the input data to be in uncompressed format. However, if the input data is in compressed format, the implementation of the invention described here decompresses that data so that subsequent steps of the method 200 can operate on the decompressed data.
In step 203 of the method 200, the decompressed data is split into visual, audio and closed-captioning subcomponents. (As indicated above, one or two of visual, audio and closed captioning data may not be present as part of the input data.) The precise form of the mechanism for splitting apart the input data can depend on the decompression mechanism used, but such data-splitting mechanisms are readily available for any useful compression format.
In step 204, the audiovisual content is evaluated to identify the presence of one or more cues in the audiovisual content. Each data subcomponent produced by step 203 is input into one or more analyzers. The analyzer(s) identify the location(s) and duration(s) of cue(s) within the audiovisual content. In particular, the presence of cue(s) throughout an entire set of audiovisual content can be identified. The analyzers may also identify other characteristics of the cue(s). The following is an exemplary list of characteristics of a set of audiovisual content regarding which cues can be identified within the set of audiovisual content: 1) an audio pause (i.e., a period of silence or near silence) in the audio content, 2) a sequence of black frames in the visual content, 3) a scene cut or fade in the visual content, 4) a significant (i.e., greater than a specified amount) change in average volume in the audio content, 5) the presence of music in the audio content, 6) speaker identity, 7) the “density” of scene breaks (cuts) or fades in the visual content (i.e., the number of scene breaks and/or scene fades during a specified time window divided by the duration of the time window), 8) the absence of a usually present network icon (whose shape and color characteristics can be learned automatically by appropriately analyzing a region of the visual content, such as a corner or other region near the edge of the visual content, in which a network icon is expected to be present), 9) the degree of motion in a period of visual content, 10) the presence of text in the visual content, 11) the occurrence of specified closed-captioning formatting signals, and 12) the absence of closed-captioning. Suitable methods for identifying each of the above-listed cues have been published in various academic journals, industry journals and conference proceedings, and are known to those skilled in the art. As will be appreciated by those skilled in the art in view of the description herein, other cues not listed above can also be used, alone or in combination with each other and/or one or more of the above-listed cues, to enable commercial detection according to the invention. The specific methods used to identify the above-listed or other cues may affect the overall performance of a system or method according to the invention, but, in general, any such methods can be used with the invention. However, it is an advantageous aspect of the invention that the invention enables use of a combination of the above-listed cues to effect commercial detection.
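Purely as an illustration of the kind of low-level analysis performed by the analyzers of step 204, the following sketch detects two of the listed cues (sequences of black frames and audio pauses) from decoded frames and audio samples. The data formats (luminance frames in the range 0-255, mono audio samples in [-1, 1]), the thresholds and the function names are assumptions of the sketch rather than requirements of the invention.

```python
import numpy as np

def detect_black_frame_runs(frames, luma_threshold=20.0, min_run=3):
    """Return (start_frame, end_frame) index pairs for runs of nearly black frames.

    `frames` is assumed to be a numpy array of shape (num_frames, height, width)
    holding luminance values in the range 0-255.
    """
    is_black = frames.reshape(len(frames), -1).mean(axis=1) < luma_threshold
    runs, start = [], None
    for i, black in enumerate(is_black):
        if black and start is None:
            start = i
        elif not black and start is not None:
            if i - start >= min_run:
                runs.append((start, i))
            start = None
    if start is not None and len(is_black) - start >= min_run:
        runs.append((start, len(is_black)))
    return runs

def detect_audio_pauses(samples, sample_rate, rms_threshold=0.01, min_pause=0.3):
    """Return (start_time, end_time) pairs, in seconds, of near-silent intervals.

    `samples` is assumed to be a 1-D numpy array of mono audio in [-1, 1].
    """
    window = int(0.05 * sample_rate)                  # 50 ms analysis windows
    n = len(samples) // window
    rms = np.sqrt((samples[:n * window].reshape(n, window) ** 2).mean(axis=1))
    quiet = rms < rms_threshold
    to_secs = window / sample_rate
    pauses, start = [], None
    for i, q in enumerate(quiet):
        if q and start is None:
            start = i
        elif not q and start is not None:
            if (i - start) * to_secs >= min_pause:
                pauses.append((start * to_secs, i * to_secs))
            start = None
    if start is not None and (n - start) * to_secs >= min_pause:
        pauses.append((start * to_secs, n * to_secs))
    return pauses
```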
In an alternative implementation of the invention, the output from step 201 of the method 200 would be input directly to some or all of the analysis engines of step 204. That is, some of the analysis engines can be made to operate directly on the compressed data, depending on the compression format. For example, the black-frame detection, scene-cut detection, motion analysis and audio-level change detection can all operate directly on data that has been compressed in the MPEG-1 or MPEG-2 format.
In step 205 of the method 200, one or more of the cues identified in step 204 are analyzed to identify candidate times within the audiovisual content at which a commercial beginning or a commercial ending may occur. For example, an audio pause often accompanies either the beginning or the end of a commercial, so the presence of an audio pause in the audio content can be identified as a factor that militates toward establishing a candidate time at some time during or proximate to the audio pause. Similarly, a sequence of black frames often accompanies either the beginning or the end of a commercial, so the presence of a sequence of black frames in the visual content can be identified as a factor that militates toward establishing a candidate time at some time during or proximate to the sequence of black frames. A scene cut or fade also typically accompanies the beginning or the end of a commercial, so the presence of a scene break or fade in the visual content can be identified as a factor that militates toward establishing a candidate time at some time during or proximate to the scene break or fade. The beginning and end of a commercial break are often accompanied by a noticeable increase and decrease in volume, respectively, so that a significant change in average volume (measured over a specified window of time) can be identified as a factor that militates toward establishing a candidate time at some time proximate to times at which the volume is seen to change significantly. Commercials often include relatively more musical content than the rest of a set of audiovisual content, so the occurrence of a time window of specified duration (e.g., the expected duration of a typical commercial break, such as 60 seconds, or the expected duration of a typical commercial, such as 15 or 30 seconds) having relatively high musical content (e.g., relatively high density of musical content relative to the density of musical content in other, proximate time windows) can be identified as a factor that militates toward establishing candidate times at the beginning and end of such a time window. The beginning or end of a commercial is often accompanied by a change in speaker identity, so the occurrence of a change in speaker identity can be identified as a factor that militates toward establishing a candidate time at, or proximate to, the time at which such a change in speaker identity occurs. A commercial break often includes a relatively high density of scene breaks and/or fades (since a scene break or fade typically occurs at the beginning and end of a commercial break, as well as at the transition between commercials within a commercial break, and since commercials often include a relatively large number of scene breaks and/or fades per unit time within the commercial), so the occurrence of a time window of a specified duration (e.g., 60 seconds) during which the density of scene breaks and/or scene fades is relatively high (i.e., exceeds a specified threshold), or a significant change in density of scene breaks and/or scene fades over one window of time with respect to a proximate window of time, can be identified as a factor that militates toward establishing candidate times at the beginning and end of such a time window. 
A network icon is sometimes present during the noncommercial parts of a television broadcast; therefore, if a network icon is determined to be present in a set of audiovisual content, the disappearance of the network icon typically accompanies the beginning of a commercial break and the appearance of the network icon typically accompanies the end of a commercial break, so the appearance or disappearance of a network icon can be identified as a factor that militates toward establishing a candidate time at, or proximate to, a time at which the network icon appears or disappears. Since the average motion level in the visual content of a commercial is often significantly different than the average motion level of other visual content in a set of audiovisual content, significant change in the amount of motion in the visual content of a time window (e.g., about 60 seconds) relative to the amount of motion in the visual content in a proximate time window can be identified as a factor that militates toward establishing candidate times at, or proximate to, the beginning and end of such a time window. The appearance of text (other than closed-captioning) in a set of audiovisual content often accompanies the beginning of a commercial break and the disappearance of text often accompanies the end of a commercial break, so the appearance or disappearance in a set of audiovisual content of text other than closed-captioning can be identified as a factor that militates toward establishing a candidate time at, or proximate to, a time at which text appears or disappears. If closed-captioning data is present in the data representing the audiovisual content, a closed-captioning scrolling format change often occurs at the beginning or the end of a commercial break, so the occurrence of a closed-captioning scrolling format change can be identified as a factor that militates toward establishing a candidate time at, or proximate to, the time at which such a format change occurs. If closed-captioning data is present in the data representing the audiovisual content, the disappearance of closed-captioning often accompanies the beginning of a commercial break and the appearance of closed-captioning often accompanies the end of a commercial break, so the appearance or disappearance of closed-captioning can be identified as a factor that militates toward establishing a candidate time at, or proximate to, a time at which closed-captioning appears or disappears.
As indicated above, it is an advantageous aspect of the invention that the invention enables use of a combination of the cues to effect commercial detection. In particular, the invention can enable the use of cues and combinations of cues that have not previously been used for commercial detection. For example, the invention can advantageously enable any one of the following to be used alone as a commercial detection cue: detection of the absence of a network icon, analysis of musical content present in a set of audiovisual content, the density of scene breaks and/or fades, or analysis of the identity of speakers of spoken content. These cues can also be used in any combination with each other or any other cue. In particular, it is anticipated that one or more of these cues can advantageously be used in combination with one or more of the following cues: 1) the occurrence of an audio pause, 2) the occurrence of a sequence of black frames, 3) a scene cut or fade, 4) the occurrence of specified closed-captioning formatting signals, and 5) the appearance or disappearance of closed-captioning.
Step 205 outputs a list of candidate times at which commercials may be beginning or ending, together with a score or probability associated with each candidate time. In one implementation of the invention, each candidate time is assigned the same initial score. Alternatively, the scores assigned to candidate times can vary. For example, the score for a candidate time can depend on which cue(s) were used to identify the candidate time. The beginning or end of a commercial can be deduced from the presence of some cues with a greater degree of confidence than that associated with the presence of other cues. To the extent that a candidate time is identified based on a cue with which a relatively high degree of predictive confidence is associated, the score assigned to that candidate time can be relatively higher than would be the case if a relatively low degree of predictive confidence was associated with the cue. Additionally, the score for each candidate time can be dependent on how strongly the cue is present in the audiovisual content, as determined in accordance with a criterion or criteria appropriate for that cue: the more strongly a cue is present, the higher the score. For example, when one of the cues used to establish a candidate time is an audio pause, the score established for the candidate time can be dependent on the duration of the audio pause and/or the degree of silence during the audio pause (e.g., the score for the candidate time is made relatively greater the longer the audio pause or the less sound that is present during the audio pause). Or, for example, when one of the cues used to establish a candidate time is a sequence of black frames, the score established for the candidate time can be dependent on the duration of the sequence of black frames and/or the completeness of the blackness of the frames (e.g., the score for the candidate time is made relatively greater the longer or blacker the sequence of black frames). Or, for example, when one of the cues used to establish a candidate time is a scene cut, the score established for the candidate time can be dependent on the number of pixels that changed by more than a threshold amount from one frame to another (e.g., the score for the candidate time is made relatively greater as more pixels changed between scenes) and/or dependent on the total change of all the pixels from one frame to another (where the “change” for each pixel is the change in the color or other components of a pixel). Or, for example, when one of the cues used to establish a candidate time is a significant average audio volume change, the score established for the candidate time can be dependent on the degree of the volume change (e.g., the score for the candidate time is made relatively greater as the degree of the volume change increases). Those skilled in the art can readily appreciate how the score for a candidate time can be adjusted based on aspects of other cues present in the audiovisual content proximate to the candidate time. Additionally, the score for a candidate time can be dependent on the confidence level associated with identification of the cue in the audiovisual content: the greater the confidence level, the higher the score. (This confidence level is different than the confidence level associated with the predictive capability of the cue, discussed above.) For example, sound represented in audio data may be sound in the audio content or noise.
The score for a candidate time identified at least in part based on the presence of an audio pause can be increased or decreased in accordance with the extent to which the degree of noise present in the audio data increases or decreases the confidence with which an audio pause can be detected.
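Purely as an illustration of how step 205 might convert detected cues into scored candidate times, the following sketch assigns each candidate time a score that grows with the strength of the underlying cue (here, the duration and quietness of an audio pause, or the duration and blackness of a black-frame run). The Candidate structure, the weighting constants and the placement of each candidate time at the midpoint of its cue interval are assumptions of this sketch, not requirements of the method.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    time: float   # possible commercial start/end time, in seconds
    score: float  # likelihood that a commercial starts or ends at this time

def score_audio_pause(start, end, mean_level, weight=1.0):
    """Longer and quieter audio pauses yield higher scores."""
    return weight * (end - start) * max(0.0, 1.0 - mean_level)

def score_black_run(start, end, mean_luma, weight=1.5):
    """Longer and blacker black-frame runs yield higher scores."""
    return weight * (end - start) * max(0.0, 1.0 - mean_luma)

def candidates_from_cues(audio_pauses, black_runs):
    """Place one scored candidate time at the midpoint of each cue interval.

    `audio_pauses` holds (start, end, mean_level) triples with mean_level in
    [0, 1]; `black_runs` holds (start, end, mean_luma) triples with mean_luma
    in [0, 1]; all times are in seconds.
    """
    candidates = [Candidate((s + e) / 2.0, score_audio_pause(s, e, lvl))
                  for s, e, lvl in audio_pauses]
    candidates += [Candidate((s + e) / 2.0, score_black_run(s, e, luma))
                   for s, e, luma in black_runs]
    return sorted(candidates, key=lambda c: c.time)
```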
In step 206 of the method 200, the scores associated with each candidate time can be adjusted based on the presence or absence of one or more cues within some time window proximate to the candidate time. The cue(s) used to adjust the score of a candidate time in step 206 are different from the cue(s) used to establish the candidate time and an initial associated score in step 205. The duration of the time window and the location of the time window with respect to the cue are dependent on the type of cue. For instance, the score for a candidate time is increased (i.e., the likelihood that the candidate time correctly indicates the beginning or ending of a commercial is deemed to increase) in each of the following cases: 1) the candidate time is coincident with the time at which an audio pause (which is a window of audio silence or near silence) occurs, 2) the candidate time is within or sufficiently proximate to a time window in which the closed-captioning scrolling format is different from that which is typical for audiovisual content of this type, 3) the candidate time is within or sufficiently proximate to a time window during which closed-captioning is absent (for audiovisual content that is known to be closed-captioned), 4) the candidate time is within or sufficiently proximate to a time window of at least a specified duration (e.g., 60 seconds) and including high musical content, 5) the candidate time is within or sufficiently proximate to a time window during which the density of scene breaks and/or scene fades exceeds a specified threshold, 6) the candidate time is sufficiently proximate to a time window of at least a specified duration (e.g., 0.5 seconds) and in which the average motion in the visual content, measured in a specified manner, is less than a specified threshold, 7) the candidate time is within a time window during which a network icon (which has been found to be persistent through a majority of the visual content) is not present at a specified location within the visual content (e.g., a region, such as a corner, near the edge of the visual content), 8) the candidate time is very near (e.g., within about 2 seconds) a time at which the time-averaged audio volume (averaged over a time window of about 10 seconds) has changed by a magnitude of greater than a specified threshold, 9) the candidate time is sufficiently proximate to (within about 1 second) a time when text is present in the visual content, 10) the candidate time is within a specified duration of time (e.g., a few seconds) after the presence in the closed-captioning stream of certain keywords or phrases such as “commercial”, “break”, “coming up” or “after”, or within a specified duration of time (e.g., a few seconds) prior to the presence in the closed-captioning stream of certain keywords or phrases such as “welcome”, “hello” or “we're back”, 11) the candidate time is within a specified duration of time (e.g., 2 seconds) from a time at which the speaker identity has changed, and 12) the candidate time is within a specified duration of time (e.g., one to several seconds) from a time window of greater than a specified duration (e.g., 1 minute) that does not include speech from a speaker whose speech has been determined to be present in the audiovisual content with greater than a specified frequency. The amount by which a score is adjusted can be dependent on the same types of analyses done to establish an initial score for a candidate time, as described above with respect to step 205.
(However, the particular analyses done in step 206 need not be, but can be, the same as those done in step 205.) In particular, the amount of the adjustment to a score for a candidate time can be dependent on how strongly the cue is present in the audiovisual content, as determined in accordance with a criterion or criteria appropriate for that cue: in general, the more strongly a cue is present, the greater the adjustment to the score. Additionally, the amount of the adjustment to a score for a candidate time can be dependent on how high or low the score is prior to the adjustment. For example, a cue that strongly indicates the presence of a commercial beginning or ending may cause a larger adjustment in a relatively low score than in a relatively high score. The particular quantities, keywords, and other algorithm parameters given above are illustrative; they may be changed, within appropriate constraints, as can be appreciated by those skilled in the art, without adversely affecting the operation of the invention.
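As a minimal sketch of the kind of adjustment described for step 206, the following example raises a candidate's score whenever a different cue falls within a cue-specific time window of the candidate, by an amount that depends both on how strongly the cue is present and on how low the score already is. The window sizes, the boost formula and the data structures are illustrative assumptions only.

```python
def adjust_scores(candidates, supporting_cues):
    """Raise a candidate's score for each nearby supporting cue (step 206 sketch).

    `candidates` is a list of objects with `time` and `score` attributes (such
    as the Candidate objects above). `supporting_cues` is a list of
    (cue_time, cue_strength, window_seconds) triples, where `window_seconds`
    is how close the cue must be to the candidate to count (for example,
    about 2 seconds for an average-volume change, about 1 second for
    on-screen text).
    """
    for cand in candidates:
        for cue_time, strength, window in supporting_cues:
            if abs(cand.time - cue_time) <= window:
                # Stronger cues adjust the score more, and a given cue boosts
                # an already-low score proportionally more than a high one.
                cand.score += strength / (1.0 + cand.score)
    return candidates
```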
In step 207 of the method 200, the scores associated with each candidate time are further adjusted based on one or more probability models of characteristic(s) of the occurrence of commercials and/or commercial breaks within audiovisual content. For example, the scores of the candidate times can be adjusted based on a probability model of the time-separation of commercial start and end times. The scores of the candidate times can also be adjusted based on a probability model of the typical locations of commercial breaks within audiovisual content. Such a probability model can be constructed by collecting statistics regarding the relevant characteristic(s) across many sets of audiovisual content of a variety of different types, in order to produce a generic probability model that applies to all types of audiovisual content. Such a probability model can also be made specific to a particular type or types of audiovisual content (including a particular audiovisual program) by only combining statistics across audiovisual content of those type(s). This can be desirable to increase the accuracy obtained when the probability model is used to aid in detection of commercials in audiovisual content of those type(s). Finally, such a probability model can be constructed manually based on the intuition of the implementer of the model as to the characteristic being modeled (e.g., how long commercials typically last or when commercial breaks tend to occur in given audiovisual content). However constructed, the probability model(s) can be represented as functions of time, as described below.
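One simple way to construct such a probability model, assuming a collection of example programs in which the commercial intervals have already been labeled, is to average indicator functions of “a commercial is in progress” over the examples on a common time grid and then smooth the result, yielding a function such as the R(t) discussed below. The following sketch is only one possible construction; the grid spacing and smoothing window are arbitrary choices.

```python
import numpy as np

def learn_in_progress_model(labeled_examples, duration, dt=1.0, smooth_pts=30):
    """Estimate P(commercial in progress at time t) from labeled example programs.

    `labeled_examples` is a list of lists of (start, end) commercial intervals,
    one list per example program; `duration` is the common program length in
    seconds; `dt` is the grid spacing in seconds.
    """
    t = np.arange(0.0, duration, dt)
    counts = np.zeros_like(t)
    for intervals in labeled_examples:
        in_progress = np.zeros_like(t)
        for start, end in intervals:
            in_progress[(t >= start) & (t < end)] = 1.0
        counts += in_progress
    prob = counts / max(len(labeled_examples), 1)
    # Light smoothing (a moving average over `smooth_pts` grid points) so that
    # small timing differences between example programs do not yield a spiky model.
    kernel = np.ones(smooth_pts) / smooth_pts
    return t, np.convolve(prob, kernel, mode="same")
```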
In step 207, one or more probability models can be applied to the list of score-adjusted candidate times generated by step 206 to further adjust the scores of the candidate times. The list of score-adjusted candidate times generated by step 206 is first represented as a function of time, P(t), which indicates the likelihood that a commercial starts or ends at each time during the audiovisual content. For each candidate time, P(t) can be made equal to the adjusted score associated with that candidate time (perhaps normalized by the total of all the adjusted scores), while for all other times P(t) can be made equal to zero.
A function S(t) is determined, representing a probability model of the likely location, relative to a particular commercial start or end time, of other commercial start or end times. The function S(t) can be particularized to be representative of a particular type or types of audiovisual content. For example, for American television programs, the function S(t) will have peaks around +/−30 seconds, +/−60 seconds and +/−15 seconds.
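One simple way to realize such a function S(t), under the assumption about American television programs noted above, is as a sum of Gaussian peaks centered at the typical separations of plus or minus 15, 30 and 60 seconds. The peak widths and normalization in the following sketch are illustrative assumptions.

```python
import numpy as np

def build_separation_model(max_offset=120.0, dt=0.5, sigma=2.0,
                           peaks=(15.0, 30.0, 60.0)):
    """Return (offsets, S) where S(offset) peaks at typical boundary separations."""
    offsets = np.arange(-max_offset, max_offset + dt, dt)
    S = np.zeros_like(offsets)
    for p in peaks:
        for center in (p, -p):
            S += np.exp(-((offsets - center) ** 2) / (2.0 * sigma ** 2))
    return offsets, S / S.max()   # normalize so the highest peak equals 1
```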
P(t) is convolved with S(t) to produce a function C(t). The value of C(t) will be very high at times at which a high value of P(t) is separated from other times having a high value of P(t) by time separations at which S(t) has a high value. In other words, C(t) will be very large at values of t that are high-scoring candidate times and that are preceded and/or followed by other high-scoring candidate times at separations corresponding to a typical commercial duration (e.g., 30 seconds).
A function R(t) is determined, representing a probability model of the likelihood, at all times within a set of audiovisual content, that a commercial is in progress. If the probability model is restricted to be based on a particular class of well-defined sets of audiovisual content (e.g., the different versions of a particular recurring audiovisual program) and if commercials tend to be placed at approximately the same times in each such set of audiovisual content, then the probability model will have well-defined zones during which the probability of a commercial being in progress is high. On the other hand, if the audiovisual content on which the probability model is based is not restricted at all, so that the probability model is learned across all types of audiovisual content, the probability model will likely be close to a uniform distribution (a flat function), which is not very useful. Therefore, it is desirable to restrict the audiovisual content on which the probability model is based. In particular, it is desirable to base the probability model which the function R(t) represents on audiovisual content which is similar to that in which commercials are to be detected.
As indicated above, step 207 begins by convolving the function P(t) with the function S(t) to produce the function C(t). The function C(t) is then point-wise multiplied by the function R(t) to produce a function P′(t).
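In discrete form, sampling all of the functions on a common time grid, the computation of step 207 can be sketched as follows. The grid spacing, the optional normalization of P(t) and the use of a "same"-mode convolution (so that C(t) stays aligned with the grid) are implementation choices of this sketch, not requirements of the method; R is assumed to be sampled on the same grid as P.

```python
import numpy as np

def apply_probability_models(times, scores, duration, S, R, dt=0.5):
    """Compute P'(t): the convolution of P with S, point-wise multiplied by R.

    `times` and `scores` hold the candidate times (seconds) and their adjusted
    scores; `S` is the separation model sampled at spacing `dt`; `R` is the
    commercial-in-progress model sampled on the same grid as P.
    """
    grid = np.arange(0.0, duration, dt)
    P = np.zeros_like(grid)
    for t, s in zip(times, scores):
        idx = min(int(round(t / dt)), len(grid) - 1)
        P[idx] += s                        # P(t) is zero except at candidate times
    if P.sum() > 0:
        P = P / P.sum()                    # optional normalization
    C = np.convolve(P, S, mode="same")     # large where candidates are spaced by
                                           # typical commercial durations
    return grid, C * R                     # point-wise multiplication by R(t)
```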
The method 200 is described above as including both score adjustments of the type described in step 206 and score adjustments of the type described in step 207. However, the method 200 could be implemented with only one of those types of score adjustments, i.e., the method 200 could include only one of steps 206 and 207.
After the adjustment of scores in step 207, optionally, scores below a specified threshold can be eliminated.
In step 208 of the method 200, the candidate times and associated adjusted scores are evaluated, and starting and ending times for commercial breaks and individual commercials within those commercial breaks are identified based on that evaluation. Two additional probability models (which can be represented as functions of time) are used in this evaluation: 1) a function L(t), which represents a probability model of the typical duration of a commercial break, and 2) a function W(t), which represents a probability model of the typical time separation between commercial breaks. Each of these probability models, like the probability models discussed above with respect to step 207, can be constructed based on statistics collected across many types of audiovisual content or across only a particular type or types of audiovisual content, or can be constructed based on the intuition of the implementer of the model regarding the characteristic being modeled.
Step 208 begins by selecting the candidate time with the highest score to be a commercial start or end time (whether that time is a start time or end time is unknown at this point). A commercial break is then constructed based on the selected candidate time by successively evaluating candidate times in order of decreasing score and adding candidate times to the commercial break that satisfy each of the following criteria: 1) the additional candidate time is well-spaced in time, in accordance with the function S(t), from each candidate time that has already been included in the commercial break, 2) the additional candidate time does not create a commercial break which is too long, in accordance with the function L(t), and 3) the additional candidate time is not too close to other existing commercial breaks, in accordance with the function W(t), that have already been identified by the step 208. Stated another way, candidate times continue to be added to a commercial break in order of score as long as there are any candidate times for which all of the following are true: 1) the value of S(t), where “t” is the time separation between the candidate time being evaluated and a candidate time already included in the commercial break, is above a specified threshold value for each candidate time already included in the commercial break, 2) the value of L(t), where “t” is the duration of the commercial break if the candidate time is added to the commercial break, is not below a specified threshold value, and 3) the value of W(t), where “t” is the time separation between the candidate time and an existing commercial break, is not below a specified threshold value for each existing commercial break.
Once no more candidate times can be added to the commercial break currently being constructed, step 208 attempts to find a new candidate time around which to construct a new commercial break. The candidate time with the highest score above a specified threshold that is not currently part of any commercial break, and that is separated from all existing commercial breaks by a time “t” for which W(t) is above a specified threshold, is selected as a new candidate time upon which to base the construction of a new commercial break. If such a candidate time is identified, then the construction of a commercial break proceeds as described above. If no such candidate time can be identified, then step 208 terminates. The output of step 208 is a list of commercial break start and end times, and lists of start times of individual commercials within each commercial break.
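The construction performed by step 208 can be sketched as the greedy procedure below, in which S, L and W are callables returning the model values for a given separation or duration. All of the threshold values, and the use of a seed-score threshold even for the first commercial break, are illustrative assumptions of this sketch.

```python
def construct_breaks(candidates, S, L, W,
                     s_min=0.2, l_min=0.05, w_min=0.05, seed_min=0.5):
    """Greedily group candidate (time, score) pairs into commercial breaks.

    S(dt): likelihood that two commercial boundaries are separated by dt seconds.
    L(d):  likelihood of a commercial break of duration d seconds.
    W(dt): likelihood that two commercial breaks are separated by dt seconds.
    """
    remaining = sorted(candidates, key=lambda c: c[1], reverse=True)
    breaks = []

    def far_from_existing_breaks(t):
        return all(W(min(abs(t - b) for b in existing)) >= w_min
                   for existing in breaks)

    while remaining and remaining[0][1] >= seed_min:
        # Seed a new break with the highest-scoring usable candidate time.
        seed = next((c for c in remaining if far_from_existing_breaks(c[0])), None)
        if seed is None:
            break
        current = [seed[0]]
        remaining.remove(seed)
        # Add further candidate times in order of decreasing score, subject to
        # the spacing (S), break-length (L) and break-separation (W) models.
        for cand in list(remaining):
            t = cand[0]
            spaced_ok = all(S(abs(t - u)) >= s_min for u in current)
            length_ok = L(max(current + [t]) - min(current + [t])) >= l_min
            if spaced_ok and length_ok and far_from_existing_breaks(t):
                current.append(t)
                remaining.remove(cand)
        breaks.append(sorted(current))
    # The first and last times of each break are its start and end; interior
    # times are the boundaries between individual commercials within the break.
    return breaks
```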
Finally, step 209 of the method 200 causes the set of decisions made in step 208 regarding start and end times of commercial breaks and commercials to be iteratively refined and optimized. Step 209 attempts to account for other high-scoring candidate times which may occur in or near a previously identified commercial break, but which have not yet been chosen as a commercial start or end time. In step 209, the following procedure is performed for each candidate time which has not yet been selected as a commercial start or end time and for which P′(t) has a magnitude above a specified threshold value.
First, the candidate time is added to the most temporally proximate commercial break, provided that the candidate time would not cause that commercial break to become too long (i.e., the value of L(t), where “t” is the duration of the commercial break if the candidate time is added to the commercial break, is not below a specified threshold value) and would not cause that commercial break to be too close to another commercial break (i.e., the value of W(t), where “t” is the time separation between the candidate time and an existing commercial break, is not below a specified threshold value for each existing commercial break).
Next, additional candidate times that are not part of the commercial break to which a candidate time has just been added are evaluated according to the same criteria, described above, that were used to construct the commercial break in the first place. That is, the probability models S(t), L(t), and W(t) are all considered as described above.
The new commercial break may have too many candidate times within it. Thus, an attempt is made to remove candidate times that are located too closely together. For each candidate time of the commercial break, a computation is made of the average S(t) for the time separation between the candidate time and the two adjacent candidate times. If the average is below a specified threshold, then it may be that either the candidate time or the adjacent candidate times are not accurate commercial start or end times. If the P′(t) score of the candidate time is lower than the average P′(t) of the two adjacent candidate times, the candidate time is eliminated from the commercial break. Otherwise, any of the two adjacent candidate times that are within a specified small time separation (e.g., 25 seconds) of the candidate time are eliminated from the commercial break.
Finally, after the addition(s) and elimination(s) of candidate times have been made, a new average score is calculated for the candidate times of the commercial break. If the average score is higher than the average score for the candidate times of the commercial break before changes were made in step 209, the changes are kept. Otherwise, the changes are discarded and the candidate times of the commercial break revert back to the candidate times before the addition(s) and elimination(s) made in step 209.
After an attempt has been made to include each of the candidate times for which P′(t) is above a threshold, a check is made to see if any of these candidate times were in fact included in a commercial break. If none were, then step 209 terminates and outputs the final determination(s) of the method 200 regarding the start and end times of commercial breaks and individual commercials in the audiovisual content. If any of these candidate times were included in a commercial break, then step 209 is repeated: for each candidate time having a score P′(t) above a specified threshold, an attempt is made to add that candidate time to the nearest commercial break. Many iterations of step 209 may be required before a stable solution is produced (i.e., further changes are not made to the determined start and end times of commercial breaks and individual commercials in the audiovisual content).
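The refinement of step 209 can be sketched as the loop below, which repeatedly attempts to fold leftover high-scoring candidate times into the nearest commercial break, prunes boundaries that end up too close together, and keeps a change only if it raises the break's average score. The pruning rule and acceptance test here are condensed relative to the description above, and all thresholds are illustrative assumptions.

```python
def refine_breaks(breaks, leftovers, score_of, S, L, W,
                  p_min=0.3, s_min=0.2, l_min=0.05, w_min=0.05):
    """Iteratively fold leftover high-scoring candidate times into nearby breaks.

    `breaks` is a list of sorted lists of boundary times; `leftovers` holds
    candidate times not yet assigned to any break; `score_of(t)` returns the
    P'(t) score of candidate time t; S, L and W are the probability models.
    """
    if not breaks:
        return breaks

    def avg(times):
        return sum(score_of(x) for x in times) / len(times)

    changed = True
    while changed:
        changed = False
        for t in [c for c in leftovers if score_of(c) >= p_min]:
            if any(t in b for b in breaks):
                continue
            nearest = min(range(len(breaks)),
                          key=lambda i: min(abs(t - x) for x in breaks[i]))
            old = breaks[nearest]
            others = breaks[:nearest] + breaks[nearest + 1:]
            new = sorted(old + [t])
            too_long = L(max(new) - min(new)) < l_min
            too_close = any(W(min(abs(x - y) for x in new for y in o)) < w_min
                            for o in others)
            if too_long or too_close:
                continue
            # Prune boundaries that end up too close together, keeping the
            # higher-scoring member of each offending pair.
            pruned = [new[0]]
            for x in new[1:]:
                if S(x - pruned[-1]) < s_min:
                    if score_of(x) > score_of(pruned[-1]):
                        pruned[-1] = x
                else:
                    pruned.append(x)
            # Keep the change only if it raises the break's average score.
            if avg(pruned) > avg(old):
                breaks[nearest] = pruned
                leftovers = [c for c in leftovers if c not in pruned]
                changed = True
    return breaks
```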
The invention as described above can be modified slightly in order to operate on audiovisual content that is being received live via a tuner, cable or other means, and to produce commercial detection results with minimal delay relative to the current live position within the audiovisual content. To accomplish this, if the cue(s) are not transmitted with the data representing the audiovisual content, the invention is implemented to enable analysis of the audiovisual content, as it is received, to identify the presence of cues (the invention can be implemented to identify one, some or all of the cues described above with respect to step 204 of the method 200).
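One possible way (among others) to adapt the batch procedure to live content, offered purely as an illustrative assumption rather than as the specific modification contemplated above, is to keep a rolling window of recently identified cue events and to re-run the candidate scoring and commercial-break construction over that window as new content arrives. The run_batch function referred to below is a hypothetical wrapper around steps 205 through 209.

```python
from collections import deque

def live_detect(cue_stream, run_batch, window_seconds=600.0):
    """Re-run batch commercial detection over a rolling window of recent cues.

    `cue_stream` yields (time_seconds, cue) events in time order as the live
    content is analyzed; `run_batch` is assumed to be a function implementing
    the candidate scoring and break construction of steps 205-209 over a list
    of cue events. Yields the latest commercial-break decisions after each new
    cue arrives; decisions older than the window are left as they were.
    """
    recent = deque()
    for now, cue in cue_stream:
        recent.append((now, cue))
        while recent and recent[0][0] < now - window_seconds:
            recent.popleft()
        yield run_batch(list(recent))
```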
The invention can be implemented entirely in software, entirely in hardware (e.g., using DSPs and/or special purpose ASICs) or in a combination of the two. Firmware can also be used to implement some part or all of the invention.
The invention can be used for a wide variety of applications, as can be appreciated by those skilled in the art in view of the description herein. In general, the invention can be used with any broadcast or other data transmission over a network (e.g., conventional network television broadcasts, cable television broadcasts, broadcasts or transmissions over a computer network such as the Internet—and, in particular, the World Wide Web portion of the Internet). Additionally, the invention can be used generally to detect commercials in audiovisual content represented by any type of data, which data can be stored on a data storage medium or media, or provided to a system or method according to the invention in real time. Further, the invention can be implemented in a wide variety of apparatus, as can also be appreciated by those skilled in the art in view of the description herein, such as, for example, television set-top boxes, digital VCRs, computers (including desktop, portable or handheld computers) or any of a variety of other computational devices (including many which are now being, or will in the future be, developed).
Various embodiments of the invention have been described. The descriptions are intended to be illustrative, not limitative. Thus, it will be apparent to one skilled in the art that certain modifications may be made to the invention as described herein without departing from the scope of the claims set out below.
Although the foregoing embodiments have been described in some detail for purposes of clarity of understanding, the invention is not limited to the details provided. There are many alternative ways of implementing the invention. The disclosed embodiments are illustrative and not restrictive.
This application is a divisional of U.S. patent application Ser. No. 12/835,505, entitled ITERATIVE, MAXIMALLY PROBABLE, BATCH-MODE COMMERCIAL DETECTION FOR AUDIOVISUAL CONTENT filed Jul. 13, 2010 (now U.S. Pat. No. 8,630,536) which is a continuation of U.S. patent application Ser. No. 11/250,083, entitled ITERATIVE, MAXIMALLY PROBABLE, BATCH-MODE COMMERCIAL DETECTION FOR AUDIOVISUAL CONTENT filed Oct. 12, 2005 (now U.S. Pat. No. 7,778,519) which is incorporated herein by reference for all purposes, which is a continuation of U.S. patent application Ser. No. 09/715,901, entitled ITERATIVE, MAXIMALLY PROBABLE, BATCH-MODE COMMERCIAL DETECTION FOR AUDIOVISUAL CONTENT filed Nov. 18, 2000 (now U.S. Pat. No. 6,993,245) which is incorporated herein by reference for all purposes, which claims priority to U.S. Provisional Application No. 60/166,528, filed Nov. 18, 1999 which is incorporated herein by reference for all purposes.
Number | Name | Date | Kind |
---|---|---|---|
3868675 | Firmin | Feb 1975 | A |
4258386 | Cheung et al. | Mar 1981 | A |
4319286 | Hanpachern | Mar 1982 | A |
4390904 | Johnston et al. | Jun 1983 | A |
4546382 | McKenna et al. | Oct 1985 | A |
4602279 | Freeman | Jul 1986 | A |
4750052 | Poppy et al. | Jun 1988 | A |
4750053 | Allen | Jun 1988 | A |
4752834 | Koombes | Jun 1988 | A |
4782401 | Faerber et al. | Nov 1988 | A |
4841291 | Swix et al. | Jun 1989 | A |
4845658 | Gifford | Jul 1989 | A |
4858000 | Lu | Aug 1989 | A |
4907079 | Turner et al. | Mar 1990 | A |
4931865 | Scarampi | Jun 1990 | A |
5105184 | Pirani et al. | Apr 1992 | A |
5151788 | Blum | Sep 1992 | A |
5164992 | Turk et al. | Nov 1992 | A |
5223924 | Strubbe | Jun 1993 | A |
5231494 | Wachob | Jul 1993 | A |
5295064 | Malec et al. | Mar 1994 | A |
5305195 | Murphy | Apr 1994 | A |
5347632 | Filepp et al. | Sep 1994 | A |
5410344 | Graves et al. | Apr 1995 | A |
5422986 | Neely | Jun 1995 | A |
5436637 | Gayraud et al. | Jul 1995 | A |
5440337 | Henderson et al. | Aug 1995 | A |
5446919 | Wilkins | Aug 1995 | A |
5481294 | Thomas et al. | Jan 1996 | A |
5497185 | Dufresne et al. | Mar 1996 | A |
5498002 | Gechter | Mar 1996 | A |
5504518 | Ellis et al. | Apr 1996 | A |
5515098 | Carles | May 1996 | A |
5532735 | Blahut et al. | Jul 1996 | A |
5546071 | Zdunich | Aug 1996 | A |
5550735 | Slade et al. | Aug 1996 | A |
5550928 | Lu et al. | Aug 1996 | A |
5550965 | Gabbe et al. | Aug 1996 | A |
5559549 | Hendricks et al. | Sep 1996 | A |
5572643 | Judson | Nov 1996 | A |
5579055 | Hamilton et al. | Nov 1996 | A |
5583560 | Florin et al. | Dec 1996 | A |
5589892 | Knee et al. | Dec 1996 | A |
5596373 | White et al. | Jan 1997 | A |
5600364 | Hendricks et al. | Feb 1997 | A |
5600573 | Hendricks et al. | Feb 1997 | A |
5608445 | Mischler et al. | Mar 1997 | A |
5619709 | Caid et al. | Apr 1997 | A |
5659350 | Hendricks et al. | Aug 1997 | A |
5661516 | Carles | Aug 1997 | A |
5682195 | Hendricks et al. | Oct 1997 | A |
5704017 | Heckerman et al. | Dec 1997 | A |
5708478 | Tognazzini | Jan 1998 | A |
5717814 | Abecassis | Feb 1998 | A |
5717923 | Dedrick | Feb 1998 | A |
5724091 | Freeman et al. | Mar 1998 | A |
5724424 | Gifford | Mar 1998 | A |
5724472 | Abecassis | Mar 1998 | A |
5724521 | Dedrick | Mar 1998 | A |
5729279 | Fuller | Mar 1998 | A |
5734853 | Hendricks et al. | Mar 1998 | A |
5740549 | Reilly et al. | Apr 1998 | A |
5754939 | Herz et al. | May 1998 | A |
5758258 | Shoff et al. | May 1998 | A |
5761606 | Wolzien | Jun 1998 | A |
5767857 | Neely | Jun 1998 | A |
5767894 | Fuller et al. | Jun 1998 | A |
5768528 | Stumm | Jun 1998 | A |
5771307 | Lu et al. | Jun 1998 | A |
5774170 | Hite et al. | Jun 1998 | A |
5786845 | Tsuria et al. | Jul 1998 | A |
5793409 | Tetsumura et al. | Aug 1998 | A |
5794210 | Goldhaber et al. | Aug 1998 | A |
5798785 | Hendricks et al. | Aug 1998 | A |
5801747 | Bedard | Sep 1998 | A |
5805974 | Hite et al. | Sep 1998 | A |
5812647 | Beaumont et al. | Sep 1998 | A |
5812732 | Dettmer et al. | Sep 1998 | A |
5818512 | Fuller | Oct 1998 | A |
5819284 | Farber et al. | Oct 1998 | A |
5826165 | Echeita et al. | Oct 1998 | A |
5828839 | Moncreiff | Oct 1998 | A |
5835667 | Wactlar et al. | Nov 1998 | A |
5862324 | Collins | Jan 1999 | A |
5870151 | Korber | Feb 1999 | A |
5872588 | Aras et al. | Feb 1999 | A |
5872850 | Klein et al. | Feb 1999 | A |
5873068 | Beaumont et al. | Feb 1999 | A |
5874986 | Gibbon et al. | Feb 1999 | A |
5877755 | Hellhake | Mar 1999 | A |
5892535 | Allen et al. | Apr 1999 | A |
5892554 | DiCicco et al. | Apr 1999 | A |
5892691 | Fowler | Apr 1999 | A |
5900919 | Chen et al. | May 1999 | A |
5903816 | Broadwin et al. | May 1999 | A |
5907322 | Kelly et al. | May 1999 | A |
5913040 | Rakavy et al. | Jun 1999 | A |
5915243 | Smolen | Jun 1999 | A |
5917553 | Honey et al. | Jun 1999 | A |
5918014 | Robinson | Jun 1999 | A |
5926207 | Vaughan et al. | Jul 1999 | A |
5929849 | Kikinis | Jul 1999 | A |
5933150 | Ngo et al. | Aug 1999 | A |
5933811 | Angles et al. | Aug 1999 | A |
5945988 | Williams et al. | Aug 1999 | A |
5946646 | Schena et al. | Aug 1999 | A |
5948061 | Merriman et al. | Sep 1999 | A |
5953076 | Astle et al. | Sep 1999 | A |
5959623 | van Hoff et al. | Sep 1999 | A |
5966120 | Arazi et al. | Oct 1999 | A |
5974398 | Hanson et al. | Oct 1999 | A |
5977964 | Williams et al. | Nov 1999 | A |
5990927 | Hendricks et al. | Nov 1999 | A |
6002393 | Hite et al. | Dec 1999 | A |
6002831 | Tada et al. | Dec 1999 | A |
6002833 | Abecassis | Dec 1999 | A |
6005564 | Ahmad et al. | Dec 1999 | A |
6006197 | d'Eon et al. | Dec 1999 | A |
6006257 | Slezak | Dec 1999 | A |
6008802 | Iki et al. | Dec 1999 | A |
6011895 | Abecassis | Jan 2000 | A |
6012051 | Sammon, Jr. et al. | Jan 2000 | A |
6020883 | Herz et al. | Feb 2000 | A |
6020931 | Bilbrey et al. | Feb 2000 | A |
6026369 | Capek | Feb 2000 | A |
6029045 | Picco et al. | Feb 2000 | A |
6034652 | Freiberger et al. | Mar 2000 | A |
6036601 | Heckel | Mar 2000 | A |
6038367 | Abecassis | Mar 2000 | A |
6044376 | Kurtzman, II | Mar 2000 | A |
6052492 | Bruckhaus | Apr 2000 | A |
6052554 | Hendricks et al. | Apr 2000 | A |
6075551 | Berezowski et al. | Jun 2000 | A |
6100941 | Dimitrova et al. | Aug 2000 | A |
6104425 | Kanno | Aug 2000 | A |
6112192 | Capek | Aug 2000 | A |
6141010 | Hoyle | Oct 2000 | A |
6141463 | Covell et al. | Oct 2000 | A |
6160570 | Sitnik | Dec 2000 | A |
6169542 | Hooks et al. | Jan 2001 | B1 |
6177931 | Alexander et al. | Jan 2001 | B1 |
6208386 | Wilf et al. | Mar 2001 | B1 |
6237022 | Bruck et al. | May 2001 | B1 |
6240555 | Shoff et al. | May 2001 | B1 |
6243104 | Murray | Jun 2001 | B1 |
6275646 | Tada et al. | Aug 2001 | B1 |
6282713 | Kitsukawa et al. | Aug 2001 | B1 |
6286005 | Cannon | Sep 2001 | B1 |
6314569 | Chernock et al. | Nov 2001 | B1 |
6324519 | Eldering | Nov 2001 | B1 |
6351265 | Bulman | Feb 2002 | B1 |
6357043 | Ellis et al. | Mar 2002 | B1 |
6363380 | Dimitrova | Mar 2002 | B1 |
6438751 | Voyticky et al. | Aug 2002 | B1 |
6457010 | Eldering et al. | Sep 2002 | B1 |
6484148 | Boyd | Nov 2002 | B1 |
6519769 | Hopple et al. | Feb 2003 | B1 |
6526215 | Hirai et al. | Feb 2003 | B2 |
6560281 | Black et al. | May 2003 | B1 |
6570499 | Kaganer | May 2003 | B2 |
6574793 | Ngo et al. | Jun 2003 | B1 |
6597405 | Iggulden | Jul 2003 | B1 |
6615408 | Kaiser et al. | Sep 2003 | B1 |
6681393 | Bauminger et al. | Jan 2004 | B1 |
6684194 | Eldering et al. | Jan 2004 | B1 |
6698020 | Zigmond et al. | Feb 2004 | B1 |
6708335 | Ozer et al. | Mar 2004 | B1 |
6735776 | Legate | May 2004 | B1 |
6750880 | Freiberger et al. | Jun 2004 | B2 |
6788314 | Freiberger et al. | Sep 2004 | B1 |
6819863 | Dagtas et al. | Nov 2004 | B2 |
6906732 | Li et al. | Jun 2005 | B1 |
6937658 | Suito et al. | Aug 2005 | B1 |
6968565 | Slaney et al. | Nov 2005 | B1 |
6993245 | Harville | Jan 2006 | B1 |
7134130 | Thomas | Nov 2006 | B1 |
7134132 | Ngo et al. | Nov 2006 | B1 |
7209631 | Tada et al. | Apr 2007 | B2 |
7272295 | Christopher | Sep 2007 | B1 |
7319964 | Huang et al. | Jan 2008 | B1 |
7348935 | Freiberger et al. | Mar 2008 | B1 |
7409437 | Ullman et al. | Aug 2008 | B2 |
7661116 | Slaney et al. | Feb 2010 | B2 |
7720432 | Colby et al. | May 2010 | B1 |
7765574 | Maybury et al. | Jul 2010 | B1 |
7778519 | Harville | Aug 2010 | B2 |
7844985 | Hendricks et al. | Nov 2010 | B2 |
8397255 | Wachtfogel et al. | Mar 2013 | B2 |
8522274 | Slaney et al. | Aug 2013 | B2 |
20020046084 | Steele et al. | Apr 2002 | A1 |
20020062481 | Slaney et al. | May 2002 | A1 |
20030055831 | Ryan et al. | Mar 2003 | A1 |
20030110499 | Knudson et al. | Jun 2003 | A1 |
20030200128 | Doherty | Oct 2003 | A1 |
20040194131 | Ellis et al. | Sep 2004 | A1 |
20090210902 | Slaney et al. | Aug 2009 | A1 |
20100242063 | Slaney et al. | Sep 2010 | A1 |
20100281499 | Harville | Nov 2010 | A1 |
20130014157 | Harville | Jan 2013 | A1 |
20130086608 | Slaney et al. | Apr 2013 | A1 |