The present invention relates to a method and apparatus for selecting at least one item. In particular, but not exclusively, it relates to automatically selecting content items or content assets.
The number of available content assets, such as digital audiovisual assets that can be consumed at any point in time, is exploding. As a result, instant, interest-driven access to that expanding content universe is becoming more and more important. Digitalization of audiovisual materials, together with the broad availability of high-speed data transfer, has resulted in an ever-increasing amount of content that any consumer has available for consumption at any point in time. Examples of large repositories of audiovisual assets on the internet are YouTube, Netflix, and Hulu. They are competing with digital live TV channels and VoD libraries offered by TV service providers, e.g. Comcast.
As the consumer's time to select and enjoy the available content assets does not increase, conveniently controllable guidance towards the most interesting and appropriate audiovisual content gains in importance.
Typically, internet sites, service providers, and device manufacturers offer solutions that allow users to search and navigate their content offerings. The first of these is a simple input means for search using text entry, for example, the search bar from Google or YouTube. Such searching is essentially easy to use (at least on a PC), but returns many results that are irrelevant. Attempting to improve the detail of the search request requires the user to go back one step in the search process, which is cumbersome and may take many attempts to obtain good results, which is time consuming. Some search specifications (e.g. “site:www.philips.com” or “include entire text”) require expert skills, which makes the solution unsuitable for the general public.
Another known technique is implicit input means, e.g. tracking what the user is browsing via cookies, making use of his purchase history, etc. This takes significant time before providing useful recommendations. Furthermore, this technique does not allow the user to specifically express his interests.
A further known technique is the use of comparative input means, for example, rating a movie as “liked” or “disliked”. This is as effective as a simple input means, allowing the user to refine the result on the go without going back. However, as above, it returns, at least initially, many irrelevant results and can be time consuming to refine in order to obtain useful results.
Another known technique uses complex input means, for example, allowing the user to explicitly state his preferences, such as “I love Alfred Hitchcock” but “slightly dislike Horror Movies”. This allows more control of search results and navigation, but is too overwhelming to be useful for the general public, especially as the user first needs to understand which criteria can be used to constrain the result set, and how they need to be applied. Also, these techniques are not suited to refining on the go.
An example of such a complex input means is disclosed by U.S. Pat. No. 7,617,511, in which, whilst browsing an electronic programming guide (EPG), the user can enter new preference ratings or modify previously saved ratings for program attributes. This results in recommendations being made based on the user's preferences. However, as mentioned above, the user first needs to understand how the criteria are used and how to apply them. Furthermore, any further browsing of the EPG is limited to these preferences.
The present invention seeks to provide improved browsing of items allowing easy refinement of selections.
This is achieved according to a first aspect of the present invention by a method for selecting at least one item, the method comprising the steps of: selecting at least one first item, each of the selected at least one first item having a plurality of attributes associated therewith, each of the associated attributes having a rating, the rating indicating the importance of the attribute to the user; displaying each of the selected at least one first item and its associated attributes and corresponding ratings; allowing adjustment of the ratings of any one of the displayed associated attributes; and selecting at least one item similar to one of the selected at least one first item based on the rating of at least one of the displayed attributes.
This is also achieved according to a second aspect of the present invention by apparatus for selecting at least one item, the apparatus comprising: a processor for selecting at least one first item, each of the selected at least one first item having a plurality of attributes associated therewith, each of the associated attributes having a rating, the rating indicating the importance of the attribute to the user; a controller for controlling a display for displaying each of the selected at least one first item and its associated attributes and corresponding ratings; and an interface for communicating with a user interface device for allowing adjustment of the ratings of any one of the displayed associated attributes; the processor further selecting at least one item similar to one of the selected at least one first item based on the rating of at least one of the displayed attributes.
As a result, a first selection is made. This may comprise a search of items, such as content assets, to retrieve the selection based on criteria, for example, a plurality of search terms. This first selection is displayed and may be browsed by the user if desired. For each displayed item, attributes are identified. These may be, for example, the criteria which were used to select the items, i.e. the search terms. The user can then provide a rating for any of these attributes, indicating the importance of the attribute to the user in the selection. The user thus has a clear indication of how the selection of similar items is made, i.e. how the attributes will be used to refine the search. Furthermore, selection is not limited to the user's preferences.
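By way of illustration only, and not as part of the claimed subject matter, the flow described above might be sketched as follows; the item structure, the attribute names and the scoring rule are hypothetical assumptions chosen merely to show how rated attributes could drive the selection of similar items.

```python
from dataclasses import dataclass, field

@dataclass
class Item:
    title: str
    # Attribute name -> value, e.g. {"genre": "documentary", "source": "BBC"}.
    attributes: dict = field(default_factory=dict)

def select_first_items(catalogue, search_terms):
    """Step 1: make a first selection, here by naive keyword matching."""
    terms = {t.lower() for t in search_terms}
    return [item for item in catalogue
            if terms & {str(v).lower() for v in item.attributes.values()}]

def select_similar(catalogue, reference, ratings):
    """Step 4: select items similar to a chosen first item, weighting every
    shared attribute value by the user-adjusted rating (0 = unimportant)."""
    def score(item):
        return sum(weight for attr, weight in ratings.items()
                   if item.attributes.get(attr) == reference.attributes.get(attr))
    candidates = [item for item in catalogue if item is not reference]
    return sorted((i for i in candidates if score(i) > 0), key=score, reverse=True)

if __name__ == "__main__":
    catalogue = [
        Item("Wildlife special", {"genre": "documentary", "source": "BBC", "quality": "HD"}),
        Item("Cat video", {"genre": "comedy", "source": "YouTube", "quality": "SD"}),
        Item("Ocean life", {"genre": "documentary", "source": "YouTube", "quality": "HD"}),
    ]
    first = select_first_items(catalogue, ["documentary"])      # steps 1 and 2
    ratings = {"genre": 1.0, "quality": 0.8, "source": 0.2}     # step 3: user adjusts ratings
    similar = select_similar(catalogue, first[0], ratings)      # step 4
    print([item.title for item in similar])                     # -> ['Ocean life']
```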
Optionally a selection of the most relevant attributes is made per item so as to reduce the burden on the user in rating attributes. The most relevant may be determined from the user's profile or currently indicated preferences. Any adjustment the user performs on a rating may also be used to update the user's profile to improve the selection of items and/or of attributes to be rated.
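A minimal sketch of this optional step is given below; the profile representation (a simple mapping of attribute names to weights) and the blending factor used to fold rating adjustments back into the profile are assumptions for illustration only.

```python
def most_relevant_attributes(item_attributes, profile, limit=3):
    """Offer only the few attributes the user is most likely to care about,
    based on a stored profile mapping attribute names to learned weights."""
    return sorted(item_attributes,
                  key=lambda attr: profile.get(attr, 0.0),
                  reverse=True)[:limit]

def on_rating_adjusted(profile, attribute, new_rating, learning_rate=0.3):
    """Blend an explicit rating adjustment back into the user's profile."""
    previous = profile.get(attribute, 0.5)
    profile[attribute] = (1 - learning_rate) * previous + learning_rate * new_rating
    return profile

profile = {"genre": 0.9, "quality": 0.7, "source": 0.4, "director": 0.1}
print(most_relevant_attributes(["genre", "source", "quality", "director"], profile))
# -> ['genre', 'quality', 'source']
on_rating_adjusted(profile, "source", 0.0)
print(profile["source"])  # roughly 0.28: the learned weight moves toward the new rating
```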
The interface for input of ratings may be placed in the user interface directly, for example, included in a display of the interface device and/or the display of the selected items, making the user aware of the configurable options and allowing the user to instantly provide input in the context of the result page.
This may allow the user to refine his profile, search query, or preferences. As a result, there is no need to leave the navigation of items, such as content assets, and go to a separate profile, search query, or preference input screen; the adjustment takes place directly in the appropriate context, allowing the user to conveniently make confident decisions while developing trust in the solution as a whole.
For a more complete understanding of the present invention, reference is made to the following description in conjunction with the accompanying drawings, in which:
FIG. 1 shows apparatus for selecting at least one item according to an embodiment of the present invention;
FIG. 2 shows a flow chart of the operation of the apparatus of FIG. 1; and
FIGS. 3 and 4 show examples of screen shots of displays according to embodiments of the present invention.
With reference to FIG. 1, the apparatus 100 comprises a processor, a storage means 103, a display means 105 and an interface for communicating with a user interface device.
Operation of the apparatus 100 will now be described with reference to FIG. 2.
A first selection of items is made, step 201. This may involve the use of a search engine such as Google to input search terms, or browsing recommendations of content assets on a television screen. These items are retrieved from the storage means 103 and displayed on the display means 105. Examples of screen shots of such a display are shown in FIGS. 3 and 4.
In FIG. 3, a first example of such a screen shot is shown.
In an alternative embodiment, in FIG. 4, a further example of such a screen shot is shown.
In a specific example, with reference to FIG. 3, a list 303 of a first selection is shown. For each item, an attribute, for example, “HD material only”, or a particular source, e.g. “YouTube”, is highlighted, and a rating can be provided so that the user can indicate the importance of any one attribute, giving preference input related to the search results in order to improve the relevance of the listed items. For example, the attributes may include any one of the search terms, and the user can indicate how important they are for the selection of items. The attributes, in this specific example, may include the preference on the media source of the selected item (video). Other attributes may be implied from ontologies, e.g. if a search is performed for the term “documentaries”, then for the attribute media source, “BBC” is a relevant value that should be displayed for rating. The ratings can then be used to de-prioritize search terms in order to refine the search, or to select items that have similar attributes that are important to the user.
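Purely as an illustrative sketch, the following shows how such an ontology and the ratings might be used to propose attributes for rating and to de-prioritize search terms; the ontology content and the tuple-based attribute keys are hypothetical.

```python
# Hypothetical ontology: a search term implies additional attribute values that
# are worth offering for rating (e.g. "documentaries" suggests the source "BBC").
ONTOLOGY = {
    "documentaries": [("source", "BBC")],
}

def attributes_to_rate(search_terms):
    """Collect the search terms themselves plus any ontology-implied attributes."""
    candidates = [("term", term) for term in search_terms]
    for term in search_terms:
        candidates.extend(ONTOLOGY.get(term, []))
    return candidates

def reweight_query(search_terms, ratings):
    """De-prioritize search terms according to the ratings: a rating of 0 drops
    the term, lower ratings reduce its weight in the refined search."""
    weights = {term: ratings.get(("term", term), 1.0) for term in search_terms}
    return {term: weight for term, weight in weights.items() if weight > 0.0}

terms = ["documentaries", "YouTube"]
print(attributes_to_rate(terms))
# -> [('term', 'documentaries'), ('term', 'YouTube'), ('source', 'BBC')]
print(reweight_query(terms, {("term", "documentaries"): 1.0, ("term", "YouTube"): 0.0}))
# -> {'documentaries': 1.0}
```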
The user can now directly give feedback, specifying that “documentaries” and the “BBC” are important for him, but “YouTube” as a source is not. In an embodiment, as soon as one of the sliders 305 is adjusted, a control button appears, e.g. marked with the text “Search Again”, next to the display of the corresponding item. In this way it is very easy to quickly refine the search, taking full advantage of the knowledge available in the search/recommendation engine, but without overloading the user with options.
As shown in FIG. 4, in an embodiment, all relevant attributes 411 are directly shown on the details screen for a selected item, allowing the user to give feedback. Such feedback will instantly change the list of recommendations on the left lower part 407 of the screen, and also the corresponding ratings 413.
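To illustrate how such feedback could instantly refresh the displayed recommendations, the following sketch re-ranks the list whenever a rating slider is moved; the view-model structure and the ranking function are assumptions rather than a description of the disclosed embodiments.

```python
class ResultView:
    """Hypothetical view model for the details screen: whenever a rating slider
    is moved, the recommendation list is re-ranked and redisplayed at once."""
    def __init__(self, items, ratings, rank):
        self.items, self.ratings, self.rank = items, ratings, rank
        self.recommendations = rank(items, ratings)

    def on_slider_moved(self, attribute, value):
        self.ratings[attribute] = value
        self.recommendations = self.rank(self.items, self.ratings)  # instant refresh

def rank_by_ratings(items, ratings):
    """Order items by the summed ratings of the attributes they carry."""
    return sorted(items,
                  key=lambda item: sum(ratings.get(a, 0.0) for a in item["attributes"]),
                  reverse=True)

view = ResultView(
    items=[{"title": "Ocean life", "attributes": ["documentary", "HD"]},
           {"title": "Cat video", "attributes": ["comedy", "SD"]}],
    ratings={"documentary": 1.0},
    rank=rank_by_ratings,
)
view.on_slider_moved("comedy", 2.0)
print([item["title"] for item in view.recommendations])
# -> ['Cat video', 'Ocean life']
```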
Although embodiments of the present invention have been illustrated in the accompanying drawings and described in the foregoing detailed description, it will be understood that the invention is not limited to the embodiments disclosed, but is capable of numerous modifications without departing from the scope of the invention as set out in the following claims.
As will be apparent to a person skilled in the art, the elements listed in the apparatus claims are meant to include any hardware (such as separate or integrated circuits or electronic elements) or software (such as programs or parts of programs) which reproduce in operation or are designed to reproduce a specified function, be it solely or in conjunction with other functions, be it in isolation or in co-operation with other elements. The invention can be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the apparatus claim enumerating several means, several of these means can be embodied by one and the same item of hardware.
‘Computer program product’ is to be understood to mean any software product stored on a computer-readable medium, such as a floppy disk, downloadable via a network, such as the Internet, or marketable in any other manner.
In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word “comprising” does not exclude the presence of elements or steps other than those listed in a claim. The word “a” or “an” preceding an element does not exclude the presence of a plurality of such elements. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
Number | Date | Country | Kind |
---|---|---|---
10172856 | Aug 2010 | EP | regional |
Filing Document | Filing Date | Country | Kind | 371(c) Date
---|---|---|---|---
PCT/IB2011/053582 | 8/11/2011 | WO | 00 | 2/14/2013 |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---
WO2012/023091 | 2/23/2012 | WO | A |
Number | Name | Date | Kind |
---|---|---|---
5410344 | Graves et al. | Apr 1995 | A |
5880768 | Lemmons et al. | Mar 1999 | A |
5890152 | Rapaport | Mar 1999 | A |
6006225 | Bowman | Dec 1999 | A |
6029195 | Herz | Feb 2000 | A |
6269395 | Blatherwick et al. | Jul 2001 | B1 |
6327593 | Goiffon | Dec 2001 | B1 |
7072888 | Perkins | Jul 2006 | B1 |
7082428 | Denny | Jul 2006 | B1 |
7117207 | Kerschberg | Oct 2006 | B1 |
7571155 | Choi | Aug 2009 | B2 |
7617511 | Marsh | Nov 2009 | B2 |
8286206 | Aaron et al. | Oct 2012 | B1 |
8316037 | Garg | Nov 2012 | B1 |
8719251 | English | May 2014 | B1 |
8898713 | Price | Nov 2014 | B1 |
20010042060 | Rouse | Nov 2001 | A1 |
20020002438 | Ohmura | Jan 2002 | A1 |
20020065677 | Grainger et al. | May 2002 | A1 |
20020087526 | Rao | Jul 2002 | A1 |
20020198866 | Kraft | Dec 2002 | A1 |
20030126227 | Zimmerman | Jul 2003 | A1 |
20030195872 | Senn | Oct 2003 | A1 |
20030215144 | Kito et al. | Nov 2003 | A1 |
20030225777 | Marsh | Dec 2003 | A1 |
20030226145 | Marsh | Dec 2003 | A1 |
20040117831 | Ellis et al. | Jun 2004 | A1 |
20040221310 | Herrington | Nov 2004 | A1 |
20050065811 | Chu | Mar 2005 | A1 |
20050131762 | Bharat | Jun 2005 | A1 |
20050222981 | Lawrence | Oct 2005 | A1 |
20060020962 | Stark | Jan 2006 | A1 |
20060041548 | Parsons | Feb 2006 | A1 |
20060168622 | Poll et al. | Jul 2006 | A1 |
20060212900 | Ismail et al. | Sep 2006 | A1 |
20060242135 | Weare | Oct 2006 | A1 |
20060253421 | Chen | Nov 2006 | A1 |
20060265283 | Gorodyansky | Nov 2006 | A1 |
20070022440 | Gutta | Jan 2007 | A1 |
20070022442 | Gil | Jan 2007 | A1 |
20070130605 | Chung | Jun 2007 | A1 |
20070185830 | Rubel | Aug 2007 | A1 |
20070276733 | Geshwind | Nov 2007 | A1 |
20070276811 | Rosen | Nov 2007 | A1 |
20070297758 | Seo | Dec 2007 | A1 |
20080072180 | Chevalier | Mar 2008 | A1 |
20080097863 | Spiegelman | Apr 2008 | A1 |
20080120289 | Golan | May 2008 | A1 |
20080216106 | Maxwell et al. | Sep 2008 | A1 |
20080256579 | Verhaegh et al. | Oct 2008 | A1 |
20080270389 | Jones | Oct 2008 | A1 |
20080276278 | Krieger et al. | Nov 2008 | A1 |
20080320531 | Kim et al. | Dec 2008 | A1 |
20090006216 | Blumenthal | Jan 2009 | A1 |
20090012799 | Hornthal | Jan 2009 | A1 |
20090055338 | Kellogg | Feb 2009 | A1 |
20090077033 | McGary | Mar 2009 | A1 |
20090119263 | Jones | May 2009 | A1 |
20090157523 | Jones | Jun 2009 | A1 |
20090234834 | Cozzi | Sep 2009 | A1 |
20090300476 | Vogel | Dec 2009 | A1 |
20090320070 | Inoguchi | Dec 2009 | A1 |
20100088307 | Watanabe | Apr 2010 | A1 |
20100153324 | Downs | Jun 2010 | A1 |
20100251162 | Stallings et al. | Sep 2010 | A1 |
20100306805 | Neumeier et al. | Dec 2010 | A1 |
20110029514 | Kerschberg | Feb 2011 | A1 |
20110099164 | Melman | Apr 2011 | A1 |
20110145822 | Rowe | Jun 2011 | A1 |
20110225156 | Pavlik | Sep 2011 | A1 |
20110252031 | Blumenthal | Oct 2011 | A1 |
20120060186 | Ueno | Mar 2012 | A1 |
20120089996 | Ramer et al. | Apr 2012 | A1 |
20120173502 | Kumar | Jul 2012 | A1 |
Number | Date | Country |
---|---|---
2000285141 | Oct 2000 | JP |
2003157285 | May 2005 | JP |
WO0045319 | Aug 2000 | WO |
Number | Date | Country
---|---|---
20130152114 A1 | Jun 2013 | US |