Aspects of the invention relate to ranking a set of items. In particular, aspects described herein relate to the ranking of content items based on relevance.
The amount of available information and content that may be accessed through user devices such as computers, set-top boxes, cell phones and the like has become staggering. To find information or content that a user is interested in, users will often submit search queries to obtain a condensed list of potentially matching or relevant results. In some instances, however, such result lists may still include a large amount of information and usually in some random order. Some systems organize search results alphabetically or according to a level of match. However, sorting by an alphabet or a level of match might not reflect a relevance to the user of the items in the list.
The following presents a simplified summary of the disclosure in order to provide a basic understanding of some aspects. It is not intended to identify key or critical elements of the invention or to delineate the scope of the invention. The following summary merely presents some concepts of the disclosure in a simplified form as a prelude to the more detailed description provided below.
One or more aspects described herein relate to the sorting of items based on relevance. Relevance may, for example, include the consideration of popularity, proximity, recency and the like. In one or more examples, items to be sorted may be assigned an entity rank indicative of each item's popularity relative to other items of the same type. Alternatively or additionally, entity ranks may be determined on a single scale for all item types. The entity rank and/or other attributes may then be used to categorize each of the items in the list into a hierarchical group. Items within each group may then be sorted according to relevance in view of a variety of factors including entity rank, proximity, recency and the like. Once the items have been sorted in each group, the groups may be combined into a single results list organized by relevance. In one or more arrangements, relevance ranks may correspond to a value that may be converted to a specified scale (e.g., 0 to 1).
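By way of illustration only, the group-then-sort-then-combine flow described above may be sketched as follows. This is an assumed Python implementation, not the claimed method; the `categorize` and `relevance` callables are hypothetical placeholders for the hierarchical categorization and relevance scoring described herein.

```python
from itertools import groupby

def rank_results(items, categorize, relevance):
    """Categorize items into hierarchical groups, sort each group by
    relevance, then concatenate the groups into one results list.

    categorize: maps an item to a group key (lower key = higher in
    the hierarchy); relevance: scores an item within its group.
    Both callables are illustrative assumptions."""
    # Order items by group so that higher-priority groups come first.
    items = sorted(items, key=categorize)
    results = []
    for _, group in groupby(items, key=categorize):
        # Within each group, most relevant items come first.
        results.extend(sorted(group, key=relevance, reverse=True))
    return results
```

Under this sketch, an item in a higher-priority group always precedes items in lower groups, regardless of their within-group relevance scores.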
According to another aspect, entity rank may be calculated for a movie based on gross earnings, release date, number of awards won or nominated for, languages included, countries in which the movie was released and a number of votes.
According to another aspect, entity ranks may be calculated for a person based on movies or other roles that the person has had. For example, a ranking system may determine a rank gained from each movie-person relationship based on the entity rank of the movie, a role prominence and a role recency. The rank gained from the top 10 (or another predefined number of) movie-person relationships may then be combined to determine an overall entity rank of the person.
According to another aspect, entity ranks may be determined at a time of indexing the entity (i.e., adding to a database). In some instances, the entity rank may be modified at a time of query. The modification of the rank may be in response to how a search result item matches a search query. In one example, if a content item such as a movie is returned as a search result because the search query matches an actor listed in the cast of the movie, the entity rank of the movie may be modified based on an entity rank of the actor. In some configurations, the modified entity rank will be lower than the original entity rank.
In other embodiments, the present invention can be partially or wholly implemented on a computer-readable medium, for example, by storing computer-executable instructions or modules, or by utilizing computer-readable data structures.
Of course, the methods and systems of the above-referenced embodiments may also include other additional elements, steps, computer-executable instructions, or computer-readable data structures. In this regard, other embodiments are disclosed and claimed herein as well.
The details of these and other embodiments of the present invention are set forth in the accompanying drawings and the description below. Other features and advantages of the invention will be apparent from the description and drawings, and from the claims.
The present disclosure is illustrated by way of example and not limited in the accompanying figures in which like reference numerals indicate similar elements and in which:
The STB 106 is generally located at the subscriber location such as a subscriber's home, a tavern, a hotel room, a business, etc., and the receiving device 108 is generally provided by the subscribing client. The receiving device 108 may include a television, high definition television (HDTV), monitor, host viewing device, MP3 player, audio receiver, radio, communication device, personal computer, media player, digital video recorder, game playing device, etc. The device 108 may be implemented as a transceiver having interactive capability in connection with the STB 106, the headend 102 or both the STB 106 and the headend 102.
The headend 102 is generally electrically coupled to the network 104, the network 104 is generally electrically coupled to the STB 106, and each STB 106 is generally electrically coupled to the respective device 108. The electrical coupling may be implemented as any appropriate hard-wired (e.g., twisted pair, untwisted conductors, coaxial cable, fiber optic cable, hybrid fiber cable, etc.) or wireless (e.g., radio frequency, microwave, infrared, etc.) coupling and protocol (e.g., Home Plug, HomePNA, IEEE 802.11(a-b), Bluetooth, HomeRF, etc.) to meet the design criteria of a particular application. While the distribution system 100 is illustrated showing one STB 106 coupled to one respective receiving device 108, each STB 106 may be configured with the capability of being coupled to more than one device 108.
The headend 102 may include a plurality of devices 110 (e.g., devices 110a-110n) such as data servers, computers, processors, security encryption and decryption apparatuses or systems, and the like configured to provide video and audio data (e.g., movies, music, television programming, games, and the like), processing equipment (e.g., provider operated subscriber account processing servers), television service transceivers (e.g., transceivers for standard broadcast television and radio, digital television, HDTV, audio, MP3, text messaging, gaming, etc.), and the like. In one example, the headend 102 may generate and present (i.e., transmit, provide, pass, broadcast, send, etc.) the stream VIDIN. At least one of the devices 110 (e.g., a sender security device 110x), may include a security system.
In media distribution networks such as network 104 of
As illustrated in
Referring again to
Evaluating proximity includes the analysis of how well a search query matches a title or other attribute of an entity. For example, if a query is “JU DI,” a movie named “Juan Digs a Hole” may be ranked higher in terms of proximity than a movie named “Juan loves Diane.” Additionally or alternatively, matches occurring earlier in an attribute field may be ranked above matches occurring later. In the above example query of “JU DI,” a first movie named “Juan Loves Diane With All His Heart” may rank above a second movie entitled “With All His Heart, Juan Loves Diane,” because the query string matches earlier in the title in the first movie. According to another aspect, if a query consists of two strings, entities having fewer words between the two matching strings may be ranked higher. For example, if a query consists of “Dig Hole,” a first book entitled “Dig Me a Hole” may be ranked higher in proximity than a second book named “Dig Me a Really Really Really Big Hole.” Another proximity consideration may include whether matching strings occur in reverse order of the order specified in the query.
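For illustration, the three proximity considerations above (earlier matches rank higher, fewer intervening words rank higher, reverse-order matches are penalized) may be sketched as a simple heuristic. This is an assumed scoring function, not the disclosed algorithm; the particular weights and the prefix-matching rule are illustrative choices.

```python
def proximity_score(query, title):
    """Illustrative proximity heuristic for a two-term query: reward
    matches that start earlier in the title, penalize intervening
    words between the matched terms, and penalize reverse order."""
    q_terms = query.lower().split()
    t_words = title.lower().split()
    positions = []
    for term in q_terms:
        # First title word that begins with the query term, if any.
        pos = next((i for i, w in enumerate(t_words) if w.startswith(term)), None)
        if pos is None:
            return 0.0  # term not matched at all
        positions.append(pos)
    score = 1.0 / (1 + positions[0])           # earlier first match is better
    if len(positions) > 1:
        gap = abs(positions[1] - positions[0]) - 1
        score /= (1 + max(gap, 0))             # fewer intervening words is better
        if positions[1] < positions[0]:
            score *= 0.5                       # reverse-order penalty
    return score
```

With this sketch, "JU DI" scores "Juan Digs a Hole" above "Juan Loves Diane," and "Dig Hole" scores "Dig Me a Hole" above "Dig Me a Really Really Really Big Hole," consistent with the examples above.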
In addition to proximity, recency and content format, other factors that may be taken into consideration for relevance ranking include expiration time (i.e., when programs are going to end), genre of the content item, user entitlements (i.e., whether a user has access to a content item based on their level of subscription and the like), field or attribute matching the user query, price of a content item, recording capability of a viewing device, search history of the user, viewing history of the user, program freshness (i.e., whether the content item is a first airing, newly released movie, season premiere or repeat) and user-designated favorites. Expiration time may be taken into consideration if some content items or types of content items such as sports programs become more exciting or interesting toward the end. Accordingly, programs that have a more recent expiration time may be ranked ahead of a later expiring program. In some arrangements, the field or attribute of a content item that matches the user query may be a ranking factor. In such instances, a content item having a title matching the user query may be ranked ahead of a content item having a director or genre matching the user query under the assumption that the title is more indicative of interest or relevance than the director or genre associated with the content.
Additionally or alternatively, the price of a content item may be used as a discriminating factor. In one example, cheaper programs may be ranked more highly than more expensive programs. Alternatively, more expensive programs may be ranked as more relevant than cheaper programs. Further and as discussed herein, programs having future air times may have a lower relevance ranking than programs that are currently airing or are soon to air. However, in some arrangements, if a viewing or receiving device has recording capability, the lower relevance ranking of a future airing program may be mitigated since the user is able to record the future program. Further, for each relevance factor, a user may specify how content items are to be ranked (e.g., whether a cheaper program is to be ranked above a more expensive program) or such rankings may be determined automatically based on a default setting.
In step 410, a language factor may be determined based on the language or languages included in the movie. For example, if the movie includes English as the only language, the language factor may be 1.0. Alternatively, if the movie includes English, but not as the only language, the language factor may equal 0.75. In step 415, a country factor may be determined based on the number of countries in which the movie was released. The country factor may be dependent on whether a specific country is listed (e.g., the United States), whether countries speaking a specified language (e.g., English) are listed and the like. In one example, if the United States is the only country listed, the country factor may equal 1.0, while if only English-speaking countries are listed, the country factor may equal 0.95. In step 420, a release date weight may be determined based on a date on which the movie was released.
award weight=(number of awards nominated for or received)/5
In step 430, an overall entity rank for the movie may be determined based on one or more of the gross weight, award weight, language factor, country factor, release date weight and the vote weight. For example, the entity rank may be determined according to the formula:
Entity Rank=((gross weight+award weight)*language factor*country factor*release date weight)+vote weight
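The formula above may be expressed directly in code. In the following sketch, the gross, release date and vote weights are assumed inputs computed in the earlier steps, and the award weight is derived from the award count as described above.

```python
def movie_entity_rank(gross_weight, award_count, language_factor,
                      country_factor, release_date_weight, vote_weight):
    """Combine the per-step weights into an overall movie entity rank
    using the formula from the text. All inputs except award_count are
    assumed to have been computed in steps 400-425."""
    # award weight = (number of awards nominated for or received) / 5
    award_weight = award_count / 5
    return ((gross_weight + award_weight) * language_factor
            * country_factor * release_date_weight) + vote_weight
```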
Optionally, in step 435, the entity rank may be normalized to a predefined scale such as 0 to 1. Entity ranks may be scaled according to a type of entity. Thus, in one configuration, movie entity ranks may be comparable with other movie entity ranks but not with person entity ranks or book entity ranks (i.e., different scales are used). Alternatively, a global or universal scale may be used in which entity ranks for different types of entities are converted to a universal scale. For example, a linear conversion may be used to translate entity-type specific ranks to a universal rank.
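A linear conversion from an entity-type-specific scale to a universal scale may be sketched as follows; the assumption here is that the type-specific ranks lie in a known range for each entity type.

```python
def to_universal_scale(rank, type_min, type_max, lo=0.0, hi=1.0):
    """Linearly map a type-specific entity rank, assumed to lie in
    [type_min, type_max], onto a universal scale [lo, hi]."""
    return lo + (rank - type_min) * (hi - lo) / (type_max - type_min)
```

For example, a movie rank of 5 on a 0-10 movie scale and a person rank of 50 on a 0-100 person scale would both convert to 0.5 on the universal 0-1 scale, making them comparable.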
The above described method for determining entity rank for movies may be applied in similar fashion to other types of content items including books, television shows, music and the like. One or more factors may be eliminated or replaced according to the type of content item. For example, instead of using a gross earnings weight for television shows, a Nielsen rating weight may be used instead. In one embodiment, television shows may be assigned an entity rank based on the formula:
Entity Rank=log(1.0+number of query matches),
where the number of query matches may correspond to a number of television programs matching a user's search query in a television guide database or a TV planner search log. A TV planner may include an interface such as a website that presents television schedules in a specified format (e.g., a grid). The planner may further include a search bar that allows users to search for a program by entering various search criteria. Accordingly, the planner search log may store the queries that are entered into the search interface. In one or more arrangements, a planner search log may further store or record user interactions with search results or items in the TV planner. For example, the TV planner search log may store a number of clicks or selections of programs. Such information may be useful for disambiguating between programs that share a title. For example, if a user searches for “MOVIE 123” and two movies exist with the name “MOVIE 123,” the system may look to a number of clicks to determine which of the two movies is a closer match. This technique assumes that given two different programs that share a name, the one more people click on should be considered to be the program people are looking for.
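The television-show formula and the click-based disambiguation described above may be sketched as follows. The base of the logarithm is not specified in the text; base 10 is assumed here, and the `clicks` mapping is a hypothetical stand-in for the TV planner search log.

```python
import math

def tv_entity_rank(num_query_matches):
    """Entity rank for a television show: log(1.0 + number of query
    matches). Base 10 is an assumption; the text does not specify."""
    return math.log10(1.0 + num_query_matches)

def disambiguate(candidates, clicks):
    """Among programs sharing a title, prefer the one with more clicks
    recorded in the planner search log (clicks: program id -> count,
    an illustrative data structure)."""
    return max(candidates, key=lambda pid: clicks.get(pid, 0))
```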
In one or more arrangements, relevance ranks may correspond to a numerical value. For example, all content items categorized to a first category may be given an initial value of 0.8 while all content items categorized to a second group or category may be given an initial value of 0.7. The values may be incremented or decremented based on the factors described herein such as entity rank, recency, proximity and the like. In one or more configurations, a value of a content item might not be incremented above or decremented below a threshold value so that content items in the first group remain above (or below) the second group according to the specified hierarchy.
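The thresholded increment/decrement behavior described above may be sketched as a simple clamp; the per-category floor and ceiling values here are assumed parameters that keep each group's items within its band of the hierarchy.

```python
def clamp_relevance(base, adjustment, floor, ceiling):
    """Apply a factor-based increment or decrement to a category's
    initial relevance value while keeping the result inside the
    category's band, so items never cross into an adjacent group."""
    return min(max(base + adjustment, floor), ceiling)
```

For instance, with a first category starting at 0.8 and bounded to [0.75, 0.85], even a large positive adjustment cannot lift an item above 0.85 into territory reserved for a higher group.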
In addition to movies and other content items, individuals may also be given an entity rank. In particular, individuals associated with movies and other content items may be assigned an entity rank that is determined relative to the content item to which they are associated. For example, an actor may be assigned an entity rank that is derived from an entity rank of a movie in which he or she had a role, the role's recency and the role's prominence (e.g., leading actor vs. an extra). Alternatively, an entity rank for a person may be determined independently of movie or television roles.
Acting roles, on the other hand, may be assigned a factor value based on the role's casting order in the movie. For example, if the cast order is between 1 and 3, inclusive, the prominence factor may be assigned a value of 1. Any roles having a cast order over 3 may be assigned a value of 0.1.
Referring again to
The above process (i.e. steps 505-520) may be repeated for each movie or other content item identified in step 500. In step 525, a predefined number of highest rated or ranking roles (i.e., person-movie relationships) may be selected and combined to generate an entity rank for the individual. In one example, the rank gained for each of the highest person-movie relationships may be added together to derive the entity rank of the person.
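The prominence factor from the cast order and the combination of the top-rated roles may be sketched together as follows. Multiplying the movie rank, prominence and recency to obtain the rank gained is an assumption; the text only says the rank gained is based on those values.

```python
def prominence_factor(cast_order):
    """Per the text: cast order 1-3 (inclusive) -> 1.0, above 3 -> 0.1
    (acting roles)."""
    return 1.0 if 1 <= cast_order <= 3 else 0.1

def person_entity_rank(roles, top_n=10):
    """Combine the rank gained from a person's top-N person-movie
    relationships. Each role is a (movie_entity_rank, prominence,
    recency) tuple; the product used for 'rank gained' is an
    illustrative assumption."""
    gains = sorted((rank * prom * rec for rank, prom, rec in roles),
                   reverse=True)
    return sum(gains[:top_n])
```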
In various circumstances, an entity rank for a content item may be modified in response to a particular search query. In one example, if a user searches for “Actor One” and a movie matches the search because actor Actor One has a role in the movie, the rank for the movie may be modified because the match was not a match of the movie's title, but rather a cast or person attribute. Accordingly, the movie's entity rank may be modified based on an entity adjustment coefficient and a cast adjustment coefficient. The entity adjustment coefficient may be defined as: the movie's original entity rank/(the original entity rank+30). The cast adjustment coefficient, on the other hand, may be computed based on a conversion table. For example, a first or second cast order may correspond to a cast adjustment coefficient of 1.0, a third cast order may correspond to a coefficient of 0.33, a fourth cast order may correspond to a coefficient of 0.1 and cast order above 4 may correspond to the formula: 1/(2*(N−1)^2), where N is the cast order. Once the cast adjustment and entity adjustment coefficients have been determined, the adjusted entity rank may be calculated based on a formula such as original entity rank*cast adjustment coefficient*entity adjustment coefficient. Various other algorithms or formulas may also be used.
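The query-time adjustment above may be expressed in code as follows, using the entity adjustment coefficient and the cast adjustment conversion table given in the text.

```python
def adjusted_entity_rank(original_rank, cast_order):
    """Adjust a movie's entity rank when it matched the query via a
    cast member rather than its title, per the formulas in the text."""
    # Entity adjustment coefficient: original rank / (original rank + 30).
    entity_adj = original_rank / (original_rank + 30)
    # Cast adjustment coefficient from the conversion table.
    if cast_order <= 2:
        cast_adj = 1.0
    elif cast_order == 3:
        cast_adj = 0.33
    elif cast_order == 4:
        cast_adj = 0.1
    else:
        cast_adj = 1 / (2 * (cast_order - 1) ** 2)
    return original_rank * cast_adj * entity_adj
```

Because both coefficients are at most 1.0, the adjusted rank is never higher than the original, consistent with the observation that the modified entity rank is typically lower.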
According to one or more aspects, a query might not be spell checked if it is assumed that users will be copying/pasting program names directly into a search query field. Thus, some processing power and time may be saved by removing the need for spell checking. Alternatively or additionally, a search and rank system might only look for exact matches between queries and program names.
The methods and features recited herein may further be implemented through any number of computer readable media that are able to store computer readable instructions. Examples of computer readable media that may be used include RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, DVD or other optical disk storage, magnetic cassettes, magnetic tape, magnetic storage and the like.
Additionally or alternatively, in at least some embodiments, the methods and features recited herein may be implemented through one or more integrated circuits (ICs). An integrated circuit may, for example, be a microprocessor that accesses programming instructions or other data stored in a read only memory (ROM). In some such embodiments, the ROM stores programming instructions that cause the IC to perform operations according to one or more of the methods described herein. In at least some other embodiments, one or more of the methods described herein are hardwired into an IC. In other words, the IC is in such cases an application specific integrated circuit (ASIC) having gates and other logic dedicated to the calculations and other operations described herein. In still other embodiments, the IC may perform some operations based on execution of programming instructions read from ROM or RAM, with other operations hardwired into gates and other logic of the IC. Further, the IC may output image data to a display buffer.
Although specific examples of carrying out the invention have been described, those skilled in the art will appreciate that there are numerous variations and permutations of the above-described systems and methods that are contained within the spirit and scope of the invention as set forth in the appended claims. Additionally, numerous other embodiments, modifications and variations within the scope and spirit of the appended claims will occur to persons of ordinary skill in the art from a review of this disclosure.
The application is a continuation of U.S. application Ser. No. 13/464,186 entitled “RANKING SEARCH RESULTS” and filed on May 4, 2012 (now U.S. Pat. No. 9,348,915), which is a continuation of U.S. application Ser. No. 12/402,897 entitled “RANKING SEARCH RESULTS” and filed on Mar. 12, 2009 (now U.S. Pat. No. 8,176,043). The contents of the prior applications are incorporated herein by reference in their entirety.
Number | Name | Date | Kind |
---|---|---|---|
4227177 | Moshier | Oct 1980 | A |
5493677 | Balogh et al. | Feb 1996 | A |
5521841 | Arman et al. | May 1996 | A |
5530859 | Tobias, II et al. | Jun 1996 | A |
5535063 | Lamming | Jul 1996 | A |
5553281 | Brown et al. | Sep 1996 | A |
5576755 | Davis et al. | Nov 1996 | A |
5594897 | Goffman | Jan 1997 | A |
5640553 | Schultz | Jun 1997 | A |
5649182 | Reitz | Jul 1997 | A |
5666528 | Thai | Sep 1997 | A |
5682326 | Klingler et al. | Oct 1997 | A |
5717914 | Husick et al. | Feb 1998 | A |
5729741 | Liaguno et al. | Mar 1998 | A |
5737495 | Adams et al. | Apr 1998 | A |
5737734 | Schultz | Apr 1998 | A |
5742816 | Barr et al. | Apr 1998 | A |
5761655 | Hoffman | Jun 1998 | A |
5765150 | Burrows | Jun 1998 | A |
5799315 | Rainey et al. | Aug 1998 | A |
5819292 | Hitz et al. | Oct 1998 | A |
5845279 | Garofalakis et al. | Dec 1998 | A |
5857200 | Togawa | Jan 1999 | A |
5924090 | Krellenstein | Jul 1999 | A |
5928330 | Goetz et al. | Jul 1999 | A |
5937422 | Nelson et al. | Aug 1999 | A |
5956729 | Goetz et al. | Sep 1999 | A |
5982369 | Sciammarella et al. | Nov 1999 | A |
6038560 | Wical | Mar 2000 | A |
6052657 | Yamron et al. | Apr 2000 | A |
6055543 | Christensen et al. | Apr 2000 | A |
6058392 | Sampson et al. | May 2000 | A |
6167377 | Gillick et al. | Dec 2000 | A |
6188976 | Ramaswamy et al. | Feb 2001 | B1 |
6278992 | Curtis et al. | Aug 2001 | B1 |
6320588 | Palmer et al. | Nov 2001 | B1 |
6343294 | Hawley | Jan 2002 | B1 |
6345253 | Viswanathan | Feb 2002 | B1 |
6363380 | Dimitrova | Mar 2002 | B1 |
6366296 | Boreczky et al. | Apr 2002 | B1 |
6374260 | Hoffert et al. | Apr 2002 | B1 |
6415434 | Kind | Jul 2002 | B1 |
6418431 | Mahajan et al. | Jul 2002 | B1 |
6463444 | Jain et al. | Oct 2002 | B1 |
6545209 | Flannery et al. | Apr 2003 | B1 |
6546385 | Mao et al. | Apr 2003 | B1 |
6567980 | Jain et al. | May 2003 | B1 |
6580437 | Liou et al. | Jun 2003 | B1 |
6675174 | Bolle et al. | Jan 2004 | B1 |
6698020 | Zigmond et al. | Feb 2004 | B1 |
6771875 | Kunieda et al. | Aug 2004 | B1 |
6789088 | Lee et al. | Sep 2004 | B1 |
6792426 | Baumeister et al. | Sep 2004 | B2 |
6877134 | Fuller et al. | Apr 2005 | B1 |
6882793 | Fu et al. | Apr 2005 | B1 |
6901364 | Nguyen et al. | May 2005 | B2 |
6937766 | Wilf et al. | Aug 2005 | B1 |
6970639 | McGrath et al. | Nov 2005 | B1 |
7155392 | Schmid et al. | Dec 2006 | B2 |
7177861 | Tovinkere et al. | Feb 2007 | B2 |
7206303 | Karas et al. | Apr 2007 | B2 |
7272558 | Soucy et al. | Sep 2007 | B1 |
7376642 | Nayak et al. | May 2008 | B2 |
7472137 | Edelstein et al. | Dec 2008 | B2 |
7490092 | Sibley et al. | Feb 2009 | B2 |
7548934 | Platt et al. | Jun 2009 | B1 |
7584102 | Hwang et al. | Sep 2009 | B2 |
7596549 | Issa et al. | Sep 2009 | B1 |
7739286 | Sethy et al. | Jun 2010 | B2 |
7788266 | Venkataraman et al. | Aug 2010 | B2 |
7792812 | Carr | Sep 2010 | B1 |
7814267 | Iyengar et al. | Oct 2010 | B1 |
7921116 | Finkelstein et al. | Apr 2011 | B2 |
7925506 | Farmaner et al. | Apr 2011 | B2 |
7958119 | Eggink et al. | Jun 2011 | B2 |
7983902 | Wu et al. | Jul 2011 | B2 |
8041566 | Peters et al. | Oct 2011 | B2 |
8078467 | Wu et al. | Dec 2011 | B2 |
8117206 | Sibley et al. | Feb 2012 | B2 |
8265933 | Bates et al. | Sep 2012 | B2 |
8468083 | Szulczewski | Jun 2013 | B1 |
8527520 | Morton et al. | Sep 2013 | B2 |
8572087 | Yagnik | Oct 2013 | B1 |
8909655 | McDonnell | Dec 2014 | B1 |
20010014891 | Hoffert et al. | Aug 2001 | A1 |
20020035573 | Black et al. | Mar 2002 | A1 |
20020087315 | Lee et al. | Jul 2002 | A1 |
20020091837 | Baumeister et al. | Jul 2002 | A1 |
20020143774 | Vandersluis | Oct 2002 | A1 |
20020194181 | Wachtel | Dec 2002 | A1 |
20030014758 | Kim | Jan 2003 | A1 |
20030033297 | Ogawa | Feb 2003 | A1 |
20030050778 | Nguyen et al. | Mar 2003 | A1 |
20030061028 | Dey et al. | Mar 2003 | A1 |
20030093790 | Logan et al. | May 2003 | A1 |
20030135582 | Allen et al. | Jul 2003 | A1 |
20030163443 | Wang | Aug 2003 | A1 |
20030163815 | Begeja et al. | Aug 2003 | A1 |
20030195877 | Ford et al. | Oct 2003 | A1 |
20030204513 | Bumbulis | Oct 2003 | A1 |
20040111465 | Chuang et al. | Jun 2004 | A1 |
20040117831 | Ellis et al. | Jun 2004 | A1 |
20040139091 | Shin | Jul 2004 | A1 |
20040215634 | Wakefield et al. | Oct 2004 | A1 |
20040225667 | Hu et al. | Nov 2004 | A1 |
20040243539 | Skurtovich et al. | Dec 2004 | A1 |
20040254795 | Fujii et al. | Dec 2004 | A1 |
20040267700 | Dumais et al. | Dec 2004 | A1 |
20050044105 | Terrell | Feb 2005 | A1 |
20050060647 | Doan et al. | Mar 2005 | A1 |
20050091443 | Hershkovich et al. | Apr 2005 | A1 |
20050097138 | Kaiser et al. | May 2005 | A1 |
20050114130 | Java et al. | May 2005 | A1 |
20050152362 | Wu | Jul 2005 | A1 |
20050193005 | Gates et al. | Sep 2005 | A1 |
20050222975 | Nayak et al. | Oct 2005 | A1 |
20060004738 | Blackwell et al. | Jan 2006 | A1 |
20060037046 | Simms | Feb 2006 | A1 |
20060074671 | Farmaner et al. | Apr 2006 | A1 |
20060088276 | Cho et al. | Apr 2006 | A1 |
20060100898 | Pearce et al. | May 2006 | A1 |
20060112097 | Callaghan et al. | May 2006 | A1 |
20060156399 | Parmar et al. | Jul 2006 | A1 |
20060161546 | Callaghan et al. | Jul 2006 | A1 |
20060167859 | Verbeck Sibley et al. | Jul 2006 | A1 |
20060184495 | Crosby | Aug 2006 | A1 |
20060212288 | Sethy et al. | Sep 2006 | A1 |
20060235843 | Musgrove et al. | Oct 2006 | A1 |
20060253780 | Munetsugu et al. | Nov 2006 | A1 |
20060256739 | Seier et al. | Nov 2006 | A1 |
20070011133 | Chang | Jan 2007 | A1 |
20070050343 | Siddaramappa et al. | Mar 2007 | A1 |
20070050366 | Bugir et al. | Mar 2007 | A1 |
20070067285 | Blume et al. | Mar 2007 | A1 |
20070078708 | Yu et al. | Apr 2007 | A1 |
20070083374 | Bates et al. | Apr 2007 | A1 |
20070156677 | Szabo | Jul 2007 | A1 |
20070208567 | Amento et al. | Sep 2007 | A1 |
20070211762 | Song et al. | Sep 2007 | A1 |
20070214123 | Messer et al. | Sep 2007 | A1 |
20070214488 | Nguyen et al. | Sep 2007 | A1 |
20070233487 | Cohen et al. | Oct 2007 | A1 |
20070233656 | Bunescu et al. | Oct 2007 | A1 |
20070233671 | Oztekin | Oct 2007 | A1 |
20070239707 | Collins et al. | Oct 2007 | A1 |
20070250901 | McIntire et al. | Oct 2007 | A1 |
20070260700 | Messer | Nov 2007 | A1 |
20070271086 | Peters et al. | Nov 2007 | A1 |
20080033915 | Chen et al. | Feb 2008 | A1 |
20080046929 | Cho et al. | Feb 2008 | A1 |
20080059418 | Barsness et al. | Mar 2008 | A1 |
20080091633 | Rappaport et al. | Apr 2008 | A1 |
20080118153 | Wu et al. | May 2008 | A1 |
20080133504 | Messer et al. | Jun 2008 | A1 |
20080162533 | Mount et al. | Jul 2008 | A1 |
20080163328 | Philbin et al. | Jul 2008 | A1 |
20080168045 | Suponau et al. | Jul 2008 | A1 |
20080183681 | Messer et al. | Jul 2008 | A1 |
20080183698 | Messer et al. | Jul 2008 | A1 |
20080189110 | Freeman et al. | Aug 2008 | A1 |
20080204595 | Rathod et al. | Aug 2008 | A1 |
20080208796 | Messer et al. | Aug 2008 | A1 |
20080208839 | Sheshagiri et al. | Aug 2008 | A1 |
20080208864 | Cucerzan et al. | Aug 2008 | A1 |
20080221989 | Messer et al. | Sep 2008 | A1 |
20080222105 | Matheny | Sep 2008 | A1 |
20080222106 | Rao et al. | Sep 2008 | A1 |
20080222142 | O'Donnell | Sep 2008 | A1 |
20080235209 | Rathod et al. | Sep 2008 | A1 |
20080235393 | Kunjithapatham et al. | Sep 2008 | A1 |
20080250010 | Rathod et al. | Oct 2008 | A1 |
20080256097 | Messer et al. | Oct 2008 | A1 |
20080266449 | Rathod et al. | Oct 2008 | A1 |
20080281801 | Larson et al. | Nov 2008 | A1 |
20080288641 | Messer et al. | Nov 2008 | A1 |
20080319962 | Riezler et al. | Dec 2008 | A1 |
20090006315 | Mukherjea et al. | Jan 2009 | A1 |
20090006391 | Ram | Jan 2009 | A1 |
20090013002 | Eggink et al. | Jan 2009 | A1 |
20090025054 | Gibbs et al. | Jan 2009 | A1 |
20090055381 | Wu et al. | Feb 2009 | A1 |
20090077078 | Uppala et al. | Mar 2009 | A1 |
20090083257 | Bargeron et al. | Mar 2009 | A1 |
20090094113 | Berry et al. | Apr 2009 | A1 |
20090123021 | Jung et al. | May 2009 | A1 |
20090131028 | Horodezky et al. | May 2009 | A1 |
20090144260 | Bennett et al. | Jun 2009 | A1 |
20090144609 | Liang et al. | Jun 2009 | A1 |
20090157680 | Crossley et al. | Jun 2009 | A1 |
20090172544 | Tsui et al. | Jul 2009 | A1 |
20090198686 | Cushman, II et al. | Aug 2009 | A1 |
20090204599 | Morris et al. | Aug 2009 | A1 |
20090205018 | Ferraiolo et al. | Aug 2009 | A1 |
20090240650 | Wang et al. | Sep 2009 | A1 |
20090240674 | Wilde et al. | Sep 2009 | A1 |
20090271195 | Kitade et al. | Oct 2009 | A1 |
20090282069 | Callaghan et al. | Nov 2009 | A1 |
20090326947 | Arnold et al. | Dec 2009 | A1 |
20100042602 | Smyros et al. | Feb 2010 | A1 |
20100063886 | Stratton et al. | Mar 2010 | A1 |
20100070507 | Mori | Mar 2010 | A1 |
20100094845 | Moon et al. | Apr 2010 | A1 |
20100138653 | Spencer et al. | Jun 2010 | A1 |
20100250598 | Brauer et al. | Sep 2010 | A1 |
20110004462 | Houghton et al. | Jan 2011 | A1 |
20110016106 | Xia | Jan 2011 | A1 |
20110077943 | Miki et al. | Mar 2011 | A1 |
20110125728 | Smyros et al. | May 2011 | A1 |
20110191099 | Farmaner et al. | Aug 2011 | A1 |
20110246503 | Bender et al. | Oct 2011 | A1 |
20120036119 | Zwicky et al. | Feb 2012 | A1 |
20120078932 | Skurtovich, Jr. et al. | Mar 2012 | A1 |
20120150636 | Freeman et al. | Jun 2012 | A1 |
20120191695 | Xia | Jul 2012 | A1 |
20130054589 | Cheslow | Feb 2013 | A1 |
20130216207 | Berry et al. | Aug 2013 | A1 |
Number | Date | Country |
---|---|---|
2685833 | May 2010 | CA |
1241587 | Sep 2002 | EP |
1462950 | Sep 2004 | EP |
1501305 | Jan 2005 | EP |
2448874 | Nov 2008 | GB |
2448875 | Nov 2008 | GB |
9950830 | Oct 1999 | WO |
0205135 | Jan 2002 | WO |
2005050621 | Jun 2005 | WO |
2006099621 | Sep 2006 | WO |
2007115224 | Oct 2007 | WO |
2008053132 | May 2008 | WO |
2009052277 | Apr 2009 | WO |
Entry |
---|
U.S. Appl. No. 12/344,713, Merging of Multiple Data Sets, filed Dec. 29, 2008. |
U.S. Appl. No. 12/415,734, An Efficient Scheme for Storing and Reading Encoded Data for Prefix-Based Keyword Search in an Object-Oriented Implementation, filed Mar. 31, 2009. |
U.S. Appl. No. 12/496,081, Generating Topic-Specific Language Models, filed Jul. 1, 2009. |
U.S. Appl. No. 12/343,779, Identification of Segments within Audio, Video, and Multimedia Items, filed Dec. 24, 2008. |
U.S. Appl. No. 13/464,186, Ranking Search Results, filed May 4, 2012. |
U.S. Appl. No. 14/012,289, Disambiguation and Tagging of Entities, filed Aug. 28, 2013. |
U.S. Appl. No. 14/199,063, Method and Apparatus for Organizing Segments of Media Assets and Determining Relevance of Segments to a Query, filed Mar. 6, 2014. |
U.S. Appl. No. 14/843,912, Method and System for Indexing and Searching Timed Media Information Based Upon Relevance Intervals, filed Sep. 9, 2015. |
Mar. 21, 2017—Canadian Office Action—CA App. 2,694,943. |
Shahraray: “Impact and Applications of Video Content Analysis and Coding in the internet and Telecommunications”, AT&T Labs Research, A Position Statement for Panel 4: Applications the 1998 International Workshop on Very Low Bitrate Video Coding, 3 pages. |
Kalina Bontcheva et al “Shallow Methods for Named Entity Coreference Resolution”, Proc. of Taln 2002, Jan. 1, 2002. |
Raphael Volz et al., “Towards ontology-based disambiguation of geographical identifiers”, Proceedings of the WWW2007 Workship I3: Identity, Identifiers, Identification, Entity-Centric Approaches to Information and Knowledge Management on the Web, Jan. 1, 2007. |
Wacholder N et al., “Disambiguation of Proper Names in Text”, Proceedings of the Conference on Applied Natural Language Processing, Association Computer Linguistics, Morrisontown, NJ, Mar. 1, 2007. |
Boulgouris N. V. et al., "Real-Time Compressed-Domain Spatiotemporal Segmentation and Ontologies for Video Indexing and Retrieval", IEEE Transactions on Circuits and Systems for Video Technology, vol. 14, No. 5, pp. 606-621, May 2004.
Changsheng Xu et al., "Using Webcast Text for Semantic Event Detection in Broadcast Sports Video", IEEE Transactions on Multimedia, vol. 10, No. 7, pp. 1342-1355, Nov. 2008.
Liang Bai et al., "Video Semantic Content Analysis based on Ontology", International Machine Vision and Image Processing Conference, pp. 117-124, Sep. 2007.
Koskela M. et al., "Measuring Concept Similarities in Multimedia Ontologies: Analysis and Evaluations", IEEE Transactions on Multimedia, vol. 9, No. 5, pp. 912-922, Aug. 2007.
Steffen Staab et al., "Semantic Multimedia", Reasoning Web; Lecture Notes in Computer Science, pp. 125-170, Sep. 2008.
European Search Report EP09179987.4, dated Jun. 4, 2010.
Li, Y. et al., "Reliable Video Clock Time Recognition," Pattern Recognition, 2006, ICPR 2006, 18th International Conference on Pattern Recognition, 4 pages.
Salton et al., "Computer Evaluation of Indexing and Text Processing," Journal of the Association for Computing Machinery, vol. 15, No. 1, Jan. 1968, pp. 8-36.
European Search Report for Application No. 09180776.8, dated Jun. 7, 2010, 9 pages.
European Search Report EP 09180762, dated Mar. 22, 2010.
European Application No. 09175979.5—Office Action dated Mar. 15, 2010.
EP Application No. 09 175 979.5—Office Action dated Apr. 11, 2011.
Smith, J.R. et al., "An Image and Video Search Engine for the World-Wide Web," Storage and Retrieval for Image and Video Databases V, San Jose, Feb. 13-14, 1997, Proceedings of SPIE, Bellingham, SPIE, US, vol. 3022, Feb. 13, 1997, pp. 84-95.
Kontothanassis, Leonidas et al., "Design, Implementation, and Analysis of a Multimedia Indexing and Delivery Server", Technical Report Series, Aug. 1999, Cambridge Research Laboratory.
European Patent Application No. 09175979.5—Office Action dated Dec. 13, 2011.
International Preliminary Examination Report for PCT/US01/20894, dated Feb. 4, 2002.
Sougata Mukherjea, Kyoji Hirata, and Yoshinori Hara, "Towards a Multimedia World-Wide Web Information Retrieval Engine", Computer Networks and ISDN Systems 29 (1997) 1181-1191.
M.A. Siegler, M.J. Witbrock, S.T. Slattery, K. Seymore, R.E. Jones, and A.G. Hauptmann, "Experiments in Spoken Document Retrieval at CMU", School of Computer Science, Carnegie Mellon University, Pittsburgh, PA 15213-3890; Justsystem Pittsburgh Research Center, 4616 Henry Street, Pittsburgh, PA 15213.
Eberman, et al., "Indexing Multimedia for the Internet", Compaq, Cambridge Research Laboratory, Mar. 1999, pp. 1-8 and Abstract.
Ishitani, et al., "Logical Structure Analysis of Document Images Based on Emergent Computation", IEEE Publication, pp. 189-192, Jul. 1999.
First Office Action in EP01950739.1-1244 dated Mar. 27, 2009.
Chen, "Extraction of Indicative Summary Sentences from Imaged Documents", IEEE publication, 1997, pp. 227-232.
Messer, Alan et al., "SeeNSearch: A Context-Directed Search Facilitator for Home Entertainment Devices", Paper, Samsung Information Systems America Inc., San Jose, CA, Sep. 17, 2008.
Hsin-Min Wang and Berlin Chen, "Content-based Language Models for Spoken Document Retrieval", ACM, 2000, pp. 149-155.
Marin, Feldman, Ostendorf and Gupta, "Filtering Web Text to Match Target Genres", International Conference on Acoustics, Speech and Signal Processing, 2009, Piscataway, NJ, Apr. 19, 2009, pp. 3705-3708.
European Search Report for application No. 10167947.0, dated Sep. 28, 2010.
Ying Zhang and Phil Vines, 2004. Using the web for automated translation extraction in cross-language information retrieval. In Proceedings of the 27th annual international ACM SIGIR conference on Research and development in information retrieval (SIGIR '04). ACM, New York, NY, USA, 162-169.
IPRP PCT/US2009/069644—dated Jun. 29, 2011.
ISR PCT/US2009/069644—dated Mar. 4, 2010.
ESR—EP10154725.5—dated Nov. 2, 2010.
ESR—EP10155340.2—dated Nov. 25, 2010.
Partial ESR—EP10155340.2—dated Jul. 12, 2010.
ESR—EP10162666.1—dated Aug. 4, 2011.
ESR—EP10167947.0—dated Sep. 28, 2010.
ISR PCT/US2001/020894—dated Nov. 25, 2003.
Extended European Search Report—EP 09815446.1—dated May 7, 2013.
Behrang Mohit and Rebecca Hwa, 2005. Syntax-based Semi-Supervised Named Entity Tagging. In Proceedings of the ACL Interactive Poster and Demonstration Sessions, pp. 57-60.
Shumeet Baluja, Vibhu Mittal and Rahul Sukthankar, 1999. Applying machine learning for high performance named-entity extraction. In Proceedings of Pacific Association for Computational Linguistics.
R. Bunescu and M. Pasca, 2006. Using encyclopedic knowledge for named entity disambiguation. In Proceedings of EACL-2006, pp. 9-16.
S. Cucerzan, 2007. Large-Scale Named Entity Disambiguation Based on Wikipedia Data. In Proceedings of EMNLP-CoNLL 2007, pp. 708-716.
Radu Florian, 2002. Named entity recognition as a house of cards: Classifier stacking. In Proceedings of CoNLL-2002, pp. 175-178.
Martin Jansche, 2002. Named Entity Extraction with Conditional Markov Models and Classifiers. In Proceedings of CoNLL-2002.
Thamar Solorio, 2004. Improvement of Named Entity Tagging by Machine Learning. Reporte Tecnico No. CCC-04-004. INAOE.
European Examination—EP Appl. 09180762.8—dated Jan. 19, 2015.
Canadian Office Action Response—CA App 2,694,943—filed Apr. 24, 2015.
European Office Action—EP 10154725.5—dated Apr. 24, 2015.
Canadian Office Action—CA App 2,697,565—dated Dec. 28, 2016.
Canadian Office Action—CA Appl. 2,703,569—dated Feb. 8, 2017.
Canadian Office Action—CA Appl. 2,708,842—dated Apr. 12, 2017.
Oct. 25, 2017—European Decision to Refuse—EP 09815446.1.
Arthur De Vany, W. David Walls, "Uncertainty in the Movie Industry: Does Star Power Reduce the Terror of the Box Office?," Journal of Cultural Economics, 1999, pp. 285-318, Issue 23, Kluwer Academic Publishers, Netherlands.
Oct. 6, 2017—European Decision to Refuse—EP 09180762.8.
Dec. 15, 2017—Canadian Office Action—CA 2,689,376.
Nov. 28, 2017—European Decision to Refuse—EP 10162666.1.
Feb. 2, 2018—Canadian Office Action—CA 2,708,842.
Feb. 15, 2018—Canadian Office Action—CA 2,697,565.
Chen, Langzhou, et al. "Using information retrieval methods for language model adaptation." INTERSPEECH. 2001.
Sethy, Abhinav, Panayiotis G. Georgiou, and Shrikanth Narayanan. "Building topic specific language models from webdata using competitive models." INTERSPEECH. 2005.
Response to European Office Action—EP Appl. 09180762.8—dated Jul. 29, 2015.
European Office Action—EP Appl. 10162666.1—dated Jul. 10, 2015.
Response to European Office Action—EP 10162666.1—dated Oct. 14, 2015.
Response to European Office Action—EP Appl. 10154725.5—dated Oct. 14, 2015.
Canadian Office Action—CA Application 2,697,565—dated Dec. 15, 2015.
European Office Action—EP Appl. 09815446.1—dated Feb. 17, 2016.
Canadian Office Action—CA Appl. 2,688,921—dated Feb. 16, 2016.
Canadian Office Action—CA Appl. 2,689,376—dated Feb. 23, 2016.
Canadian Office Action—CA Appl. 2,703,569—dated Apr. 19, 2016.
Canadian Office Action—CA Appl. 2,708,842—dated May 9, 2016.
Canadian Office Action—CA Appl. 2,694,943—dated Jun. 1, 2016.
Canadian Office Action—CA App 2,695,709—dated Jun. 20, 2016.
Feb. 28, 2018—Canadian Office Action—CA 2,703,569.
Mar. 21, 2018—Canadian Office Action—CA 2,694,943.
Number | Date | Country
---|---|---
20160378768 A1 | Dec 2016 | US
Relation | Number | Date | Country
---|---|---|---
Parent | 13464186 | May 2012 | US
Child | 15137743 | | US
Parent | 12402897 | Mar 2009 | US
Child | 13464186 | | US