This application claims the benefit, under 35 U.S.C. §119, of EP Patent Application 10306489.5, filed 22 Dec. 2010.
The present invention relates to a method for filtering a content menu of a receiver for audio/video content.
In current applications, i.e. in a receiver for digital audio/video content, a feature setup is done in a page of the user menu, wherein that page represents the functionality of the feature. Feature setup means, for example, the definition of user profiles, the definition of settings of a set-top box, or the definition of an electronic program guide filter function or a search function. In known devices, a pop-up contextual menu or a dedicated area on a screen is used for this feature setup. Typically, on a page from a video on demand (VoD) catalogue, filtering options are available in the catalogue itself. Thus, a user first has to navigate to the VoD page and select this page, and afterwards a dedicated menu is presented for selecting and buying a video. The menu is dedicated and opened within the video on demand content.

In a remote control as known in the art, specific control buttons exist for navigating these menus. For a video on demand menu, a “buy” button may exist. For a teletext (videotext) menu, specific buttons exist for fast navigation (usually a red, a green, a yellow and a blue button), a button for switching the teletext on and off, as well as a button for displaying the teletext in a transparent manner in front of the respective program. These dedicated control buttons make a remote control complicated, and with increasing functionality of a receiver device, the number of buttons needed on the remote control also increases. In addition, a gesture control device or gyroscopic device generally provides no or only a few buttons.
On the other hand, in some devices the dedicated menu buttons are displayed on the screen when a user has activated the corresponding menu and the user navigates using the arrow keys of his remote control (“→”, “⇑”, “⇓”, “←”) and the “ok” button for virtually pressing the keys presented on the screen.
Displaying dedicated buttons on the screen is uncomfortable for the user in case a gesture control or gyroscopic control is used, because pointing at a specific area on the screen is not possible in this case. In a gesture control or gyroscopic control, the controlling and command functionality is realized by movements. These movements are transformed into specific commands. Thus, a menu is needed which can be operated with such a control.
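As a minimal sketch of how such movements could be transformed into commands (the threshold value and the function name are illustrative assumptions, not taken from the description), a directional interpreter might look as follows:

```python
from enum import Enum

class Direction(Enum):
    LEFT = "left"
    RIGHT = "right"
    UP = "up"
    DOWN = "down"

def interpret_movement(dx: float, dy: float, threshold: float = 0.5):
    """Map a relative movement (dx, dy) of the hand or control device
    to one of four directional commands, or None if the movement is
    too small to be intentional. Threshold is an assumed value."""
    if abs(dx) < threshold and abs(dy) < threshold:
        return None
    if abs(dx) >= abs(dy):
        return Direction.RIGHT if dx > 0 else Direction.LEFT
    return Direction.UP if dy > 0 else Direction.DOWN

print(interpret_movement(0.8, 0.1))  # Direction.RIGHT
```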
One problem solved by the invention is to find a way of remotely controlling a multimedia receiver having increased functionality, such as filtering functionality for electronic program guides (EPGs), without the need for many buttons on the remote control.
According to the invention, a method for filtering a content menu containing a list of items is proposed. The method comprises the steps of:
This has the advantage that it allows users to use new means of interaction with the user interface, such as a gyroscopic control or a gesture control, without sacrificing ergonomics, and to have a very intuitive application setting that is always present at the same place. The same type of movement is used for control, independently of the type of menu which is controlled. Another major advantage of the inventive method is that it enables the feature itself to use any visible space available. Thus, the method reduces the space needed to display contents and avoids introducing new navigation concepts which may not be consistent with the global application concepts. Further, the global navigation consistency is improved, as navigation is performed at the hierarchy level of the main menu.
According to the invention, the method is applied in the case of a device which is controlled by gesture control. Examples of gesture control are gyroscopic control and tracking of the movements of the user's hands or fingers, e.g. by a camera or by a touch screen on which the user moves his fingers. In the case of gesture control, only a few buttons are available. Unlike on a traditional remote control, no buttons are provided for dedicated menus, e.g. a separate teletext button or navigation buttons (red, green, blue, yellow) for the teletext menu. Thus, a user interface must be operable using only standard functions which are available in the main menu. The invention has the advantage that for an electronic program guide or for a library of video on demand content no dedicated menus are needed, because the whole control is exported to the main menu and the user interface is consistent with the main menu functions. In addition, the invention has the advantage that there is no need to present these specific buttons virtually on the screen. This is advantageous because, in case a gesture control is used, it is almost impossible for a user to press a specific button displayed on the screen using his gesture control device. With a gesture control, the direction in which a user points can be easily recognized, but no specific point can be pointed at. The invention provides an interface which can be operated just by interpreting the directions in which a user moves his hand or control device.
Advantageously, the method is used for filtering the content menu in an electronic program guide or video on demand library.
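As a minimal sketch of such filtering over a list of program guide items (the item fields and filter parameters are assumptions chosen for illustration; the description does not prescribe a data layout):

```python
from datetime import date

# Illustrative EPG items; field names are assumptions, not from the description.
items = [
    {"title": "Evening News",  "genre": "News",  "start": date(2010, 12, 22)},
    {"title": "Space Odyssey", "genre": "Movie", "start": date(2010, 12, 23)},
]

def apply_filters(items, genre=None, day=None, keyword=None):
    """Return only the items that match every active filter."""
    result = items
    if genre is not None:
        result = [i for i in result if i["genre"] == genre]
    if day is not None:
        result = [i for i in result if i["start"] == day]
    if keyword is not None:
        result = [i for i in result if keyword.lower() in i["title"].lower()]
    return result

print(apply_filters(items, genre="Movie"))  # [{'title': 'Space Odyssey', ...}]
```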
The invention further concerns a control device for filtering a multimedia content menu. The content menu contains a list of items to be filtered. The control device comprises:
Advantageously, the filters are displayed using a 3D representation on different levels of depth for including and excluding filters and for activating or deactivating settings.
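One possible reading of this 3D representation, sketched under the assumption that each filter carries a state of “include”, “exclude” or “inactive” (the concrete depth values are arbitrary illustrative choices; only the ordering matters for the idea):

```python
# Hypothetical depth levels for the 3D representation; values are assumptions.
DEPTH = {"include": 0.0, "exclude": -1.0, "inactive": -2.0}

def depth_for(filter_state: str) -> float:
    """Return the z-depth at which the OSD could render a filter element."""
    return DEPTH.get(filter_state, DEPTH["inactive"])

print(depth_for("exclude"))  # -1.0
```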
The invention is described below using an electronic program guide received by a broadcasting receiver as an example. It is apparent to a person skilled in the art that the invention is applicable within program guides for other applications.
For a better understanding, the invention shall now be explained in more detail in the following description with reference to the figures. It is understood that the invention is not limited to this exemplary embodiment and that specific features can also expediently be combined and/or modified without departing from the scope of the present invention.
The receiver 1 also comprises a circuit 11 for displaying data on the screen, often called the OSD circuit, the initials standing for “On Screen Display”. The OSD circuit 11 is a text and graphics generator which enables menus and pictograms (for example, a number corresponding to the station displayed) to be displayed on the screen of the display device 2 and which enables the navigation menus in accordance with the present invention to be displayed. The OSD circuit 11 is controlled by the Central Unit 3 and a navigator 12′ present in the program memory 12. The navigator 12′ is advantageously made in the form of a program module recorded in a read only memory. It may also be embodied in the form of a custom circuit of ASIC type for example.
Via the digital bus 6 and/or the broadcasting network, the receiver 1 receives data comprising multimedia documents and descriptive data pertaining to these documents. These data originate either from a broadcasting network or from the digital network 6. The descriptive data comprise classification elements, also called “attributes”, for the accessible multimedia documents. The descriptive data are, for example, contained in the service information specified in the DVB-SI Standard. These data are stored in the database of the memory 9 of the receiver 1 and are continuously updated. The navigator 12′ thereafter extracts the information from this database and processes it to produce the navigation menus displayed on the screen of the display device 2.
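A minimal sketch of such a database of descriptive data, assuming a simple in-memory dictionary (the layout and the attribute names are assumptions for illustration; the description only states that the service information is kept in the memory 9):

```python
# In-memory stand-in for the database held in memory 9.
database = {}

def store_descriptive_data(document_id: str, attributes: dict) -> None:
    """Insert or update the attributes of a multimedia document."""
    database.setdefault(document_id, {}).update(attributes)

def documents_with(attribute: str, value) -> list:
    """Return the ids of all documents whose attribute matches the value,
    e.g. for the navigator to build a filtered navigation menu from."""
    return [doc for doc, attrs in database.items() if attrs.get(attribute) == value]

store_descriptive_data("event-42", {"genre": "Sports", "channel": "Channel One"})
print(documents_with("genre", "Sports"))  # ['event-42']
```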
In a next step, the user selects a further filter, e.g. “Filter by Date”, “Filter by Genre” or “Filter by Keyword”. This is again done by gesture control or by a traditional remote control. In the case of a gesture control, a movement in the left direction selects “Filter by Date”, a movement in the up direction selects “Filter by Genre”, and a movement in the right direction selects “Filter by Keyword”. If the filter by genre category is selected, the user selects the genre accordingly. In
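A minimal sketch of this mapping from directional gestures to filter categories (the mapping itself follows the description above; the direction identifiers are assumptions for illustration):

```python
# Gesture directions mapped to filter categories as described above.
FILTER_BY_DIRECTION = {
    "left":  "Filter by Date",
    "up":    "Filter by Genre",
    "right": "Filter by Keyword",
}

def select_filter(direction: str):
    """Return the filter category chosen by a directional gesture, if any."""
    return FILTER_BY_DIRECTION.get(direction)

print(select_filter("up"))  # Filter by Genre
```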
Thus,
In
In other words, according to the invention, when the user requests to see the menu, there is not a menu structure tree through which the user has to navigate before he can actually see and access the items to be selected; instead, he immediately sees the current settings of the device and can then navigate starting from the current point in the navigation tree. He does not have to start at the root of the navigation tree. When the user activates the menu, the displayed menu shows the actual branch of the tree, together with its leaves, in which the device currently is, namely the current settings. If the user wants to access another branch, he can do this by ‘climbing down’ and then ‘climbing up’ to another branch. In
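A minimal sketch of opening the menu at the node that holds the current settings rather than at the root (the node class and the example branch are assumptions for illustration):

```python
class MenuNode:
    """Hypothetical node of the navigation tree; names are illustrative."""
    def __init__(self, label, parent=None):
        self.label = label
        self.parent = parent
        self.children = []
        if parent is not None:
            parent.children.append(self)

root   = MenuNode("Main Menu")
genre  = MenuNode("Filter by Genre", root)
sports = MenuNode("Sports", genre)   # current setting of the device

current = sports                     # the menu opens here, not at the root

def climb_down(node):
    """Move one level towards the root ('climbing down')."""
    return node.parent if node.parent is not None else node

def climb_up(node, index=0):
    """Enter a child branch ('climbing up'), if one exists."""
    return node.children[index] if node.children else node

current = climb_down(current)        # back to "Filter by Genre"
current = climb_up(current)          # into "Sports" again
print(current.label)                 # Sports
```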
In
When the user navigates downwards, the upper branches preferably also move upwards and are reduced in size. Advantageously, the user can ‘freeze’ a complete view, which means that the whole tree is displayed, as depicted in
Number | Date | Country | Kind |
---|---|---|---|
10306489 | Dec 2010 | EP | regional |
Number | Name | Date | Kind |
---|---|---|---|
5594469 | Freeman | Jan 1997 | A |
5621456 | Florin | Apr 1997 | A |
6910191 | Segerberg | Jun 2005 | B2 |
7580932 | Plastina et al. | Aug 2009 | B2 |
7752564 | Billmaier et al. | Jul 2010 | B2 |
20020194591 | Gargi | Dec 2002 | A1 |
20030052905 | Gordon et al. | Mar 2003 | A1 |
20030090524 | Segerberg | May 2003 | A1 |
20040218104 | Smith et al. | Nov 2004 | A1 |
20050108748 | Nishikawa | May 2005 | A1 |
20070061023 | Hoffberg et al. | Mar 2007 | A1 |
20070277201 | Wong et al. | Nov 2007 | A1 |
20080063381 | Conroy | Mar 2008 | A1 |
20090322676 | Kerr | Dec 2009 | A1 |
20100023862 | Tai | Jan 2010 | A1 |
20100077334 | Yang et al. | Mar 2010 | A1 |
20100225668 | Tatke | Sep 2010 | A1 |
20100229119 | Sawyer | Sep 2010 | A1 |
Number | Date | Country |
---|---|---|
1667444 | Jun 2006 | EP |
WO2009032998 | Mar 2009 | WO |
Entry |
---|
Betke et al., “Visual Tracking of Body Features to Provide Computer Access for People With Severe Disabilities”, IEEE, vol. 10, No. 1, Mar. 2002, pp. 1-10. |
EP Search Report dated Jun. 7, 2011. |
Margrit Betke et al. “The Camera Mouse: Visual Tracking of Body Features to Provide Computer Access for People with Severe Disabilities”, IEEE Transactions on Neural Systems and Rehabilitation Engineering vol. 10, No. 1, Mar. 2002, pp. 1-10, 1534-4320/02, 2002 IEEE. |
Number | Date | Country
---|---|---|
20120167143 A1 | Jun 2012 | US |