This disclosure is related to media processing systems and methods.
Media devices, such as digital video and audio players, can include multiple functions and capabilities, such as playing stored content, browsing and selecting from recorded content, storing and/or receiving content selected by a user, and the like. These various functions can often be grouped according to content types, e.g., movies, music, television programs, photos, etc., and accessed through a user interface. The user interface can include both graphical and textual features. It is desirable that the user interface convey information to the user in an intuitive manner. However, current media devices often present information in a poorly organized manner and do not provide the capability to reorganize the information according to one or more aspects related to that information.
Disclosed herein are systems and methods for organizing menu items. In one implementation, an interface environment includes a menu arranged in the interface environment, the menu including a list of menu items associated with episodic content. The interface environment further includes a sort interface arranged in the interface environment, the sort interface being configured to receive a selection and to sort the list of menu items based upon the selection. The selection is operable to request sorting based upon at least an episode order.
In another implementation, instructions provided on one or more computer readable media are used to cause a processor to perform the operations comprising: generating a choice of sort options for a plurality of menu items associated with episodic content, the sort options comprising at least an episode order sort option; receiving a selection of the episode order sort option; sorting the menu items based upon an episode order in response to receipt of the selection of the episode order sort option; generating a menu comprising a list of menu items based upon the sorting of the menu items; and, presenting the menu arranged within an interface environment.
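As an illustration of the sequence of operations described above, the following minimal sketch (in Python, using hypothetical names such as `MenuItem`, `generate_sort_options`, and `build_menu` that do not appear in the disclosure) shows one way a choice of sort options could be generated, an episode order selection received, and a sorted menu produced; it is a sketch under those assumptions, not the implementation of the described system.

```python
from dataclasses import dataclass
from datetime import date
from typing import List, Optional

@dataclass
class MenuItem:
    """Hypothetical menu item representing a piece of episodic content."""
    title: str
    episode_number: Optional[int] = None
    original_air_date: Optional[date] = None

# Hypothetical sort options; "episode order" is the option emphasized above.
SORT_KEYS = {
    "episode order": lambda i: i.episode_number if i.episode_number is not None else float("inf"),
    "title": lambda i: i.title.lower(),
}

def generate_sort_options() -> List[str]:
    """Generate the choice of sort options offered for the menu items."""
    return list(SORT_KEYS)

def build_menu(items: List[MenuItem], selection: str) -> List[MenuItem]:
    """Sort the menu items based upon the received selection and return the menu list."""
    return sorted(items, key=SORT_KEYS[selection])

if __name__ == "__main__":
    items = [
        MenuItem("Finale", episode_number=10, original_air_date=date(2006, 11, 3)),
        MenuItem("Pilot", episode_number=1, original_air_date=date(2006, 9, 1)),
        MenuItem("Midseason", episode_number=5, original_air_date=date(2006, 10, 6)),
    ]
    print(generate_sort_options())                    # presented sort options
    for item in build_menu(items, "episode order"):   # episode order selected
        print(item.episode_number, item.title)
```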
The media data and related metadata may be provided by a single provider, or may be provided by separate providers. In one implementation, the media processing system 100 can be configured to receive media data from a first provider over a first network, such as a cable network, and receive metadata related to the video data from a second provider over a second network, such as a wide area network (WAN). Example media data include video data, audio data, content payload data, or other data conveying audio, textual and/or video data.
In another implementation, the media processing system 100 can be configured to receive media data and metadata from a computing device, such as a personal computer. In one example of this implementation, a user manages one or more media access accounts with one or more content providers through the personal computer. For example, a user may manage a personal iTunes® account with iTunes® software, available from Apple Computer, Inc. Media data, such as audio and video media data, can be purchased by the user and stored on the user's personal computer and/or one or more data stores. The media data and metadata stored on the personal computer and/or the one or more data stores can be selectively pushed and/or pulled for storage in the data store 102 of the media processing system 100.
In another implementation, the media processing system 100 can be used to process media data stored in several data stores in communication with a network, such as a wired and/or wireless local area network (LAN). In one implementation, the media processing system 100 can pull and/or receive pushed media data and metadata from the data stores over the network for presentation to a user. For example, the media processing system 100 may be implemented as part of an audio and video entertainment center having a video display device and an audio output device, and can pull media data and receive pushed media data from one or more data stores for storage and processing. At the entertainment center, a user can, for example, view photographs that are stored on a first computer while listening to music files that are stored on a second computer.
In one implementation, the media processing system 100 includes a remote control device 108. The remote control device 108 can include a rotational input device 110 configured to sense touch actuations and generate remote control signals therefrom. The touch actuations can include rotational actuations, such as when a user touches the rotational input device 110 with a digit and rotates the digit on the surface of the rotational input device 110. The touch actuations can also include click actuations, such as when a user presses on the rotational input device 110 with enough pressure to cause the remote control device 108 to sense a click actuation.
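The following is a minimal, hypothetical sketch of how touch samples from such a rotational input surface might be classified into rotational and click actuations; the sample fields and the pressure threshold are assumptions made for illustration and are not specified by the disclosure.

```python
from dataclasses import dataclass

# Assumed, normalized pressure threshold at which a press registers as a click.
CLICK_PRESSURE_THRESHOLD = 0.8

@dataclass
class TouchSample:
    """One hypothetical sample from the rotational input surface."""
    angle_degrees: float  # angular position of the digit on the surface
    pressure: float       # 0.0 (light touch) .. 1.0 (full press)

def classify_actuation(previous: TouchSample, current: TouchSample) -> str:
    """Classify consecutive samples as a click actuation, a rotational actuation, or neither."""
    if current.pressure >= CLICK_PRESSURE_THRESHOLD:
        return "click"
    if current.angle_degrees != previous.angle_degrees:
        return "rotate"
    return "none"

# A digit rotating on the surface, then pressing down.
print(classify_actuation(TouchSample(10.0, 0.2), TouchSample(25.0, 0.2)))  # rotate
print(classify_actuation(TouchSample(25.0, 0.2), TouchSample(25.0, 0.9)))  # click
```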
In one implementation, the functionality of the media processing system 100 is distributed across several engines. For example, the media processing system 100 may include a control engine 112, a user interface (UI) engine 114, and one or more media engines 116-1, 116-2, and 116-n. The engines may be implemented in software as software modules or instructions, in hardware, or in a combination of software and hardware.
The control engine 112 is configured to communicate with the remote control device 108 over a link, such as a wireless infrared signal or radio frequency signal. The remote control device 108 can transmit remote control signals generated, for example, from touch actuations of the rotational input device 110 to the control engine 112 over the link. The control engine 112 is configured to receive the remote control signals and generate control signals in response. The control signals are provided to the processing device 104 for processing.
The control signals generated by the control engine 112 and processed by the processing device 104 can invoke one or more of the UI engine 114 and media engines 116-1-116-n. In one implementation, the UI engine 114 manages a user interface to facilitate data presentation for the media engines 116-1-116-n and functional processing in response to user inputs.
In one implementation, the media engines 116 can include one or more content-specific engines, such as a movies engine, television program engine, music engine, and the like. Each engine 116 can be instantiated to support content-specific functional processing. For example, a movie engine to support movie-related functions can be instantiated by selecting a “Movies” menu item. Example movie-related functions include purchasing movies, viewing movie previews, viewing movies stored in a user library, and the like. Likewise, a music engine to support music-related functions can be instantiated by selecting a “Music” menu item. Example music-related functions include purchasing music, viewing music playlists, playing music stored in a user library, and the like.
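One way to realize such content-specific engine instantiation is a registry keyed by the selected menu item, as in the following sketch; the class and registry names are hypothetical and stand in for the media engines 116-1 through 116-n.

```python
# Hypothetical content-specific engines, instantiated when the corresponding
# top-level menu item is selected.

class MediaEngine:
    def functions(self) -> list:
        raise NotImplementedError

class MoviesEngine(MediaEngine):
    def functions(self) -> list:
        return ["purchase movies", "view movie previews", "view movies in library"]

class MusicEngine(MediaEngine):
    def functions(self) -> list:
        return ["purchase music", "view music playlists", "play music in library"]

# Registry keyed by the menu item whose selection instantiates the engine.
ENGINE_REGISTRY = {"Movies": MoviesEngine, "Music": MusicEngine}

def instantiate_engine(menu_item: str) -> MediaEngine:
    """Instantiate the content-specific engine for the selected menu item."""
    return ENGINE_REGISTRY[menu_item]()

print(instantiate_engine("Movies").functions())
print(instantiate_engine("Music").functions())
```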
The rotational input device areas 160, 162, 164, 166 and 168 are receptive to press actuations. In one implementation, the areas include a menu area 160, a reverse/previous area 162, a play/pause area 164, a forward/next area 166, and a select area 168. The areas 160-168, in addition to generating signals related to their descriptive functionalities, can also generate signals for context-dependent functionality. For example, the menu area 160 can generate signals to support the functionality of dismissing an onscreen user interface, and the play/pause area 164 can generate signals to support the function of drilling down into a hierarchical user interface. In one implementation, the areas 160-168 comprise buttons disposed beneath the surface of the rotational input device 110. In another implementation, the areas 160-168 comprise pressure sensitive actuators disposed beneath the surface of the rotational input device 110.
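As an illustration of the context-dependent functionality described above, the sketch below maps a pressed area and a hypothetical user-interface context to an action; the context strings and the default actions are assumptions made for the example.

```python
def handle_area_press(area: str, context: str) -> str:
    """Return the action for a press actuation of an area, given the current UI context."""
    # Context-dependent behaviors described above.
    if area == "menu" and context == "onscreen_ui_visible":
        return "dismiss onscreen user interface"
    if area == "play_pause" and context == "hierarchical_menu":
        return "drill down into hierarchical user interface"
    # Descriptive (default) behaviors of the areas 160-168.
    defaults = {
        "menu": "display menu",
        "reverse_previous": "reverse / previous",
        "play_pause": "play / pause",
        "forward_next": "forward / next",
        "select": "select highlighted item",
    }
    return defaults[area]

print(handle_area_press("play_pause", "hierarchical_menu"))  # drill down
print(handle_area_press("play_pause", "playback"))           # play / pause
```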
The processing device 150 is configured to receive the signals generated by the rotational input device 110 and generate corresponding remote control signals in response. The remote control signals can be provided to the communication subsystem 152, which can wirelessly transmit the remote control signals to the media processing system 100.
Although shown as comprising a circular surface, in another implementation, the rotational input device 110 can comprise a rectangular surface, a square surface, or some other shaped surface. Other surface geometries that accommodate pressure sensitive areas and that can sense touch actuations may also be used, e.g., an oblong area, an octagonal area, etc.
Other actuation area configurations may also be used. For example, in another implementation, the remote control device 108 can also include a separate actuation button 170. In this implementation, the areas comprise a “+” or increase area 160, a reverse/previous area 162, a “−” or decrease area 164, a forward/next area 166, a play/pause area 168, and a menu area 170.
The media data can be received through the network 212 by one of the computing devices, such as computing device 208. The network 212 can include one or more wired and wireless networks, such as the Internet. The media data is provided by one or more content providers 214. For example, the content provider 214-1 may provide media data that is processed by the media processing system 100 and output through the output devices 206, and the content provider 214-2 may provide metadata related to the media data for processing by the media processing system 100. Such metadata may include episodic content, artist information, and the like. A content provider 214 can also provide both media data and related metadata.
In one implementation, the media processing system 100 can also communicate with one or more content providers 214 directly. For example, the media processing system 100 can communicate with the content providers 214 through the wireless network 202, the I/O device 203, and the network 212. The media processing system 100 can also communicate with the content providers 214 through other network configurations, e.g., through a direct connection to a cable modem, through a router, or through one or more other communication devices. Example communications can include receiving sales information, preview information, or communications related to commercial transactions, such as purchasing audio files and video files.
In another implementation, the media processing system 100 can receive content from any of the computing devices 206 and 208, and other such computing devices or data stores 210 available on the network 202 through sharing. Thus, if any one or more of the computing devices or data stores are unavailable, media data and/or metadata on the remaining computing devices or other such computing devices or data stores can still be accessed.
In one implementation, the interface environment 300 includes a menu 302 and a menu title 304, e.g., “TV Shows.” The menu 302 includes menu items 312, 314, 316, 318, 320, 322, 324. The menu 302 can also include the highlight indicator 326 that highlights a menu item. The menu items can, for example, correspond to television shows that have either been recorded from a broadcast or purchased from a content provider.
In one implementation, the menu 302 also includes a sort field 306 that includes a first sort option 308 and a second sort option 310. Selection of the first sort option 308 can, for example, sort the content menu items by a program category, e.g., a program title. In one implementation, multiple instances of the same program title are grouped into folders, as indicated by the child indicators 328.
Selection of the second sort option 310 sorts the content menu items according to a date, as indicated by the date fields 315, 317, 319, 321, 323, and 325.
In the case where the menu items are to be sorted in accordance with an episode order, metadata including an original air date can be retrieved. As an example, metadata including an original air date can be stored in a data store (e.g., the data store 102).
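For illustration, the following sketch sorts menu items into episode order by original air date, retrieving a missing air date from a hypothetical metadata store; the `METADATA_STORE` mapping and `lookup_air_date` function are assumptions standing in for the data store or a content provider.

```python
from dataclasses import dataclass
from datetime import date
from typing import List, Optional

@dataclass
class Episode:
    title: str
    original_air_date: Optional[date] = None

# Hypothetical metadata store keyed by episode title (stands in for the data store 102).
METADATA_STORE = {"Pilot": date(2006, 9, 1), "Midseason": date(2006, 10, 6)}

def lookup_air_date(title: str) -> Optional[date]:
    """Retrieve an original air date from the hypothetical metadata store."""
    return METADATA_STORE.get(title)

def sort_by_episode_order(episodes: List[Episode]) -> List[Episode]:
    """Fill in missing air dates where possible, then sort by original air date."""
    for ep in episodes:
        if ep.original_air_date is None:
            ep.original_air_date = lookup_air_date(ep.title)
    # Episodes with no recoverable air date sort to the end of the menu.
    return sorted(episodes, key=lambda e: (e.original_air_date is None, e.original_air_date or date.min))

episodes = [Episode("Midseason"), Episode("Pilot", date(2006, 9, 1)), Episode("Unknown")]
print([e.title for e in sort_by_episode_order(episodes)])  # Pilot, Midseason, Unknown
```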
In one implementation, the first content menu item 312 is a sales content menu associated with content offered for sale. For example, the content menu item 312 is entitled “iTunes Store Presents,” and includes a child indicator 328. Selecting the iTunes Store Presents menu item 312 can, for example, cause the interface environment to transition to include another menu that lists one or more menu items associated with content available for purchase by, for example, download. In one implementation, the content items listed for sale correspond to the content type of the content menu 302. For example, in the interface environment 300, the content items listed for sale can include television programs.
The interface environment 300 can also include menu item abstractions that correspond to a menu item, which in turn corresponds to associated content. For example, the menu item abstraction 334 corresponds to one of the menu items 314, 316, 318, 320, 322, 324.
In one implementation, a set of menu item abstractions can be associated with a single menu item, or can be associated with a plurality of menu items.
In one implementation, the interface environment 400 includes a menu 302 and a menu title 304, e.g., “TV Shows.” The menu 302 includes menu items 402, 404, 406, 408, 410, 412. The menu 302 can also include the highlight indicator 326 that can highlight a sort order option 414, 416 and/or a menu item 402-412. The menu items can, for example, correspond to television shows that have either been recorded from a broadcast or purchased from a content provider.
In one implementation, the menu 302 also includes a sort field 306 that includes a first sort option 414 and a second sort option 416. Selection of the first sort option 414 can, for example, sort the content menu items by an episode title. In various implementations, menu items corresponding to multiple seasons of the same series title can be grouped into folders. Instances of menu items can, for example, be displayed according to associated episode titles (e.g., in alphabetical order).
Selection of the second sort option 416 sorts the content menu items according to an air date, as indicated by the date fields 403, 405, 407, 409, 411 and 413.
In one implementation, the interface environment 400 includes a menu 302 and a menu title 304, e.g., “TV Shows.” The menu 302 includes menu items 402, 404, 406, 408, 410, 412. The menu 302 can also include the highlight indicator 326 that can highlight a sort order option 414, 416 and/or a menu item 402-412. The menu items can, for example, correspond to television shows that have been recorded from a broadcast or purchased from a content provider.
As described previously, the menu 302 also includes a sort field 306 that includes a first sort option 414 and a second sort option 416. Selection of the first sort option 414 can, for example, sort the content menu items by an episode title.
In one implementation, the interface environment 500 includes the menu 302 and a menu title 304, e.g., “TV Shows.” The menu 302 includes menu items 402, 404, 406, 408, 410, 412. The menu 302 can also include the highlight indicator 326 that can highlight a sort order option 508, 510 and/or a menu item 402-412. The menu items 402-412 can, for example, correspond to television shows that have been recorded from a broadcast or purchased from a content provider. In some implementations, the menu items 402-412 can include television shows that are available for purchase from a content provider.
In some implementations, the menu 302 also includes a sort field 306 that includes a first sort option 508 and a second sort option 510. Selection of the first sort option 508 can, for example, sort the content menu items by a record date (e.g., an acquisition date). Alternatively, selection of the second sort option 510 sorts the content menu items according to an episode number, as indicated by the episode number fields 503, 505, 507, 509, 511 and 513. When menu items are to be sorted in accordance with an episode number, metadata including an episode number can be retrieved. As an example, metadata including an episode number can be stored in a data store (e.g., the data store 102).
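A brief sketch of the two sort keys described above — a record (acquisition) date and an episode number — applied to hypothetical recorded items follows; the item fields are assumptions made for illustration.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class RecordedItem:
    title: str
    episode_number: int
    record_date: date  # date the item was recorded or acquired

items = [
    RecordedItem("Episode Three", 3, record_date=date(2007, 1, 12)),
    RecordedItem("Episode One", 1, record_date=date(2007, 2, 2)),   # acquired out of order
    RecordedItem("Episode Two", 2, record_date=date(2007, 1, 19)),
]

by_record_date = sorted(items, key=lambda i: i.record_date)        # first sort option
by_episode_number = sorted(items, key=lambda i: i.episode_number)  # second sort option

print([i.title for i in by_record_date])      # acquisition order
print([i.title for i in by_episode_number])   # episode order
```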
In one implementation, the interface environment 400 includes the menu 302 and a menu title 304, e.g., “TV Shows.” The menu 302 includes menu items 402, 404, 406, 408, 410, 412. The menu 302 can also include the highlight indicator 326 that can highlight a menu item 402-412 for selection. The menu items 402-412 can, for example, correspond to television shows that have been recorded from a broadcast or purchased from a content provider. In some implementations, the menu items 402-412 can include television shows that are available for purchase from a content provider.
As described above, upon selection of the “Show” sort option from the interface environment 300, menu items associated with a common series title can be grouped together, and the episodes of a selected series can be presented as a list, e.g., the menu items 402-412.
The interface environment 400 can optionally include menu item abstractions that correspond to a menu item, which in turn corresponds to associated content. For example, the menu item abstractions 420, 422 correspond to one of the menu items 402-412. In one implementation, the menu item abstractions 420, 422 can be a graphical representation of the content corresponding to the highlighted menu item or metadata associated with the content corresponding to the highlighted menu item. For example, the menu item abstraction 420, which corresponds to the sorted menu items, can comprise digital representations of television program art or television program stills for television programs that are stored in a library (e.g., a data store 102, 210, content provider 214, etc.). Further, the menu item abstraction 422 can comprise, for example, metadata associated with the menu item highlighted by the highlight indicator 326.
In step 604, the menu items are sorted in at least an episodic order based upon a prompt. As an example, step 604 can be performed by one or more corresponding media engines 116. In some examples, the prompt is user input received through a user interface engine 114. The menu items 402-412 can be ordered, for example, based upon an original air date or an episode number. If the original air date or episode number associated with a menu item 402-412 is not available within associated metadata, the original air date or episode number can be retrieved using a corresponding media engine 116.
In step 704, a selection of an episodic order sort option is received. As an example, step 704 can be provided by one or more corresponding media engines 116 and a user interface engine 114. In some implementations, a user is able to use a cursor or a highlight to navigate a menu structure 302 and to select the episodic order sort option 416.
In step 706, the menu items are sorted. Sorting of the menu items in step 706 can be performed by one or more corresponding media engines 116. The menu items 402-412 can, for example, be sorted based upon the sort option selection received in step 704. In some examples, the sort option is an episodic order sort option 416. Upon receipt of the episodic order sort option 416, the menu items 402-412 can be ordered based upon a sort order (e.g., an original air date or an episode number). If the original air date or episode number associated with a menu item 402-412 is not available within associated metadata, the original air date or episode number can be retrieved using a corresponding media engine 116.
In step 708, a menu comprising the list of menu items is generated. As an example, step 708 can be provided by one or more corresponding media engines 116. The menu 302 can be arranged within an interface environment (e.g., interface environment 300, 400, 500). Moreover, one or more menu item abstractions 420, 422 can be arranged proximate to the menu 302 within the interface environment 300, 400, 500. The menu item abstractions 420, 422 can be associated with a highlighted menu item 402-412. The menu item abstractions 420, 422 can include, for example, promotional media (e.g., series art, series poster(s), production stills, etc.) and/or metadata (e.g., series title, episode title, summary description, air date, actor(s), director(s), etc.) associated with content identified by the menu items 402-412.
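For illustration, the sketch below selects the menu item abstraction to arrange proximate to the menu for the currently highlighted item; the `MenuItemAbstraction` fields and the lookup table are hypothetical.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MenuItemAbstraction:
    artwork_path: str      # e.g., series art, a poster, or a production still
    metadata_summary: str  # e.g., series title, episode title, air date

# Hypothetical library of abstractions keyed by menu item title.
ABSTRACTIONS = {
    "Pilot": MenuItemAbstraction("art/pilot.png", "Series X | Pilot | aired 2006-09-01"),
}

def abstraction_for(highlighted_title: str) -> Optional[MenuItemAbstraction]:
    """Return the abstraction to display proximate to the menu for the highlighted item."""
    return ABSTRACTIONS.get(highlighted_title)

print(abstraction_for("Pilot"))
```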
In step 804, a determination is made whether to organize the menu items by show. As an example, the determination can be made based upon a user selection received through a user interface engine 114. Alternatively, the determination can be made based upon predefined parameters interpreted by one or more corresponding media engines 116. If the determination is that the menu items 402-412 are not to be organized by show, the process can return to step 802.
However, if the determination is that the menu items are to be organized by show, then unique titles are extracted from available content in step 806. Available content can include, for example, content captured from a broadcast or content that has been downloaded (e.g., purchased, subscription-based, free, etc.). As an example, step 806 can be performed by one or more corresponding media engines 116.
In step 808, a list is generated based upon the extracted unique titles. As an example, step 808 can be performed by one or more corresponding media engines 116. In some examples, the list is intended to provide the appearance of organization of related content into folders. For example, all menu items associated with the series entitled “The Simpsons” can be abstracted out of the list of menu items and replaced by a single abstracted menu item (e.g., “The Simpsons”).
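The following sketch illustrates steps 806 and 808 — extracting unique series titles from the available content and generating a folder-like list; the item dictionaries and function names are hypothetical.

```python
from typing import Dict, List

# Hypothetical available content: each entry carries a series title and an episode title.
available_content = [
    {"series": "The Simpsons", "episode": "Episode A"},
    {"series": "The Simpsons", "episode": "Episode B"},
    {"series": "Nature Hour",  "episode": "Episode A"},
]

def extract_unique_titles(content: List[Dict[str, str]]) -> List[str]:
    """Extract unique series titles, preserving first-seen order (step 806)."""
    return list(dict.fromkeys(item["series"] for item in content))

def generate_folder_list(content: List[Dict[str, str]]) -> Dict[str, List[str]]:
    """Group episodes under their series title to give the appearance of folders (step 808)."""
    folders: Dict[str, List[str]] = {title: [] for title in extract_unique_titles(content)}
    for item in content:
        folders[item["series"]].append(item["episode"])
    return folders

print(extract_unique_titles(available_content))  # ['The Simpsons', 'Nature Hour']
print(generate_folder_list(available_content))
```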
In step 810, a determination is made whether a title request has been received. Step 810 can be performed, for example, by one or more corresponding media engines 116 or a user interface engine 114. If the determination is that no title request has been received, the process returns to step 806.
However, if the determination is that a title request has been received, a menu based upon the title request is generated in step 812. As an example, step 812 can be performed by one or more corresponding media engines 116. The menu 302 can include any menu items 402-412 that are associated with the requested title. The menu items 402-412 can be selectable by a user, for example, by using a selector (e.g., a cursor or the highlight indicator 326).
In step 814, sort options are generated. As an example, the sort options can be generated by one or more corresponding media engines 116. The generation of the sort options 414, 416, 508, 510 can include at least an option for sorting based upon episode order (e.g., original air date, episode number, etc.). The generated sort options can also include, for example, an acquisition date sort, a title sort, etc. In various examples, the sort options are presented to the user as selectable items (e.g., the sort options 414, 416, 508, 510).
In step 816, a selection of an episode sort option is received. As an example, step 816 can be provided by one or more corresponding media engines 116 and a user interface engine 114. In some implementations, a user is able to use a cursor or a highlight to navigate a portion of the menu structure 302 and to select the episodic order sort option 310, 416.
In step 818, the menu items are sorted. Sorting of the menu items in step 818 can be performed by one or more corresponding media engines 116. The menu items 402-412 can be sorted based upon the sort option selection received in step 816. In some examples, the sort option is an episode sort option. Upon receipt of the episode sort option, the menu items 402-412 can be ordered based upon a sort order (e.g., an original air date or an episode number). If the original air date or episode number associated with a menu item 402-412 is not available within associated metadata, the original air date or episode number can be retrieved using a corresponding media engine 116.
In step 820, a menu comprising the list of menu items is generated. As an example, step 820 can be performed by one or more corresponding media engines 116. The menu 302 can be arranged within an interface environment (e.g., the interface environment 400).
Other program content can also be sorted and presented in addition to episodic content. For example, a sports team, such as a baseball, football or basketball team, may have a periodic broadcast schedule. The systems and methods described herein can likewise be used to sort recorded periodic events, whether purchased or recorded from broadcasts, according to original air dates or a scheduling number.
The apparatus, methods, flow diagrams, and structure block diagrams described in this patent document can be implemented in computer processing systems including program code comprising program instructions that are executable by the computer processing system. Other implementations can also be used. Additionally, the flow diagrams and structure block diagrams described in this patent document, which describe particular methods and/or corresponding acts in support of steps and corresponding functions in support of disclosed structural means, can also be utilized to implement corresponding software structures and algorithms, and equivalents thereof.
The methods and systems described herein may be implemented on many different types of processing devices by program code comprising program instructions that are executable by one or more processors. The software program instructions may include source code, object code, machine code, or any other stored data that is operable to cause a processing system to perform methods described herein.
The systems and methods may be provided on many different types of computer-readable media including computer storage mechanisms (e.g., CD-ROM, diskette, RAM, flash memory, computer's hard drive, etc.) that contain instructions for use in execution by a processor to perform the methods' operations and implement the systems described herein.
The computer components, software modules, functions and data structures described herein may be connected directly or indirectly to each other in order to allow the flow of data needed for their operations. It is also noted that software instructions or a module can be implemented for example as a subroutine unit of code, or as a software function unit of code, or as an object (as in an object-oriented paradigm), or as an applet, or in a computer script language, or as another type of computer code or firmware. The software components and/or functionality may be located on a single device or distributed across multiple devices depending upon the situation at hand.
This written description sets forth the best mode of the invention and provides examples to describe the invention and to enable a person of ordinary skill in the art to make and use the invention. This written description does not limit the invention to the precise terms set forth. Thus, while the invention has been described in detail with reference to the examples set forth above, those of ordinary skill in the art can effect alterations, modifications and variations to the examples without departing from the scope of the invention.