1. Field
Example aspects of the present disclosure generally relate to browsing content stored in a content source.
2. Related Applications
The present patent application is related to the following patent applications each assigned to a common assignee:
Attorney Docket Number 2147.042US1, filed on Sep. 3, 2010 entitled, “A USER INTERFACE FOR CONTENT BROWSING AND SELECTION IN A CONTENT SYSTEM”, U.S. patent application Ser. No. 12/875,245, which is hereby incorporated by reference in its entirety.
Attorney Docket Number 03449.000029, filed on Sep. 3, 2010 entitled, “GUIDED NAVIGATION”, U.S. patent application Ser. No. 12/875,457, which is hereby incorporated by reference in its entirety.
Attorney Docket Number 03449.000037, filed on Sep. 3, 2010 entitled, “GENERATING BROWSING HIERARCHIES”, U.S. patent application Ser. No. 12/875,491, which is hereby incorporated by reference in its entirety.
3. Related Art
Media servers have changed the way consumers store and view media content on televisions and/or other consumer electronic (“CE”) devices. Home entertainment networks further allow media stored on or accessible by a media server at a central location to be presented at multiple endpoints. A media server can be combined with or incorporated into a digital video recorder (DVR), a game console, or a set top box, or can be implemented as a media server application running, for example, on a PC. A media server also can be configured to automatically record media content, such as a television program, that is scheduled for broadcast at some time in the future.
Similarly, a media server can be configured to download or stream media content from the Internet, or from devices coupled either directly or through a network to the media server. Common devices used in conjunction with media servers include flash drives, hard drives, digital cameras, PCs, mobile telephones, personal digital assistants, and music players. The consumer controls the media server to view photos or video, play music, or present online content on a television or other CE device.
In an example embodiment provided herein, content stored in a content source is browsed. A hierarchical tree structure is accessed. The hierarchical tree structure has nodes that correspond to at least one query. At least one static visual representation of a node that is in a top level of the hierarchical tree structure is displayed such that the at least one static visual representation is selectable by a user. In response to user selection of the at least one static visual representation, a corresponding static query is executed to receive visual representations of content stored in the content source, and the received visual representations are displayed such that they are selectable by the user. In response to user selection of a received visual representation, a corresponding dynamic query is executed to receive visual representations of content stored in the content source, and the visual representations received from the dynamic query are displayed such that they are selectable by the user. The dynamic query corresponds to a node that is a child of a node that corresponds to a previously executed query. The visual representations received from the dynamic query match the corresponding selected visual representation.
In another aspect, visual representations include at least one of display names, icons and thumbnails.
In another aspect, the queries corresponding to the nodes of the hierarchical tree structure are executed by using a search functionality of the content source, and the search functionality includes at least one of Universal Plug and Play (UPnP) search and Digital Living Network Alliance (DLNA) type search.
In another aspect, the queries corresponding to the nodes of the hierarchical tree structure include queries for at least one of music content, photographic content, and video content.
In another aspect, the visual representations are received asynchronously.
In another aspect, the content source includes at least one of a Universal Plug and Play Content Directory Service, a local content library, a mini media server content library, an external content provider, and an aggregated external content provider.
Further features and advantages, as well as the structure and operation, of various example embodiments of the present disclosure are described in detail below with reference to the accompanying drawings.
The features and advantages of the example embodiments presented herein will become more apparent from the detailed description set forth below when taken in conjunction with the drawings in which like reference numbers indicate identical or functionally similar elements.
Example aspects and embodiments are now described in more detail herein. This is for convenience only and is not intended to limit the application of the present description. In fact, after reading the following description, it will be apparent to one skilled in the relevant art(s) how to implement alternative embodiments.
The following terms are defined below for reference. These terms are not rigidly restricted to these definitions. A term may be further defined by its use in other sections of this description.
“Album” means a collection of tracks. An album is typically originally published by an established entity, such as a record label (for example, a recording company such as Warner Brothers or Universal Music).
The terms “program,” “multimedia program,” “show,” and the like include video content, audio content, applications, animations, and the like. Applications include code, scripts, widgets, games and the like. Video content includes television programs, movies, video recordings, and the like. Audio content includes music, audio recordings, podcasts, radio programs, spoken audio, and the like. The terms “program,” “multimedia program,” and “show,” include scheduled content and unscheduled content. Scheduled content includes, for example, broadcast content and multicast content. Unscheduled content includes, for example, on-demand content, pay-per-access content, downloaded content, streamed content, and stored content.
The terms “content,” “media content,” “multimedia content,” and the like include video content, audio content, still imagery, applications, animations, and the like. Applications include code, scripts, widgets, games and the like. Video content includes television programs, movies, video recordings, and the like. Audio content includes music, audio recordings, podcasts, radio programs, spoken audio, and the like. Still imagery includes photos, graphics, and the like. The terms “content,” “media content,” and “multimedia content” include scheduled content and unscheduled content. Scheduled content includes, for example, broadcast content and multicast content. Unscheduled content includes, for example, on-demand content, pay-per-access content, downloaded content, streamed content, and stored content.
“Electronic program guide” or “EPG” data are typically displayed on-screen and can be used to allow a viewer to navigate, select, and discover content by time, title, channel, genre, etc. by use of a remote control, a keyboard, a mouse, a trackball, a touchpad, a stylus, or other similar input devices. In addition, EPG data can be used to schedule future recording by a digital video recorder (DVR) or personal video recorder (PVR).
“Song” means a musical composition. A song is typically recorded onto a track by a record label (such as a recording company). A song may have many different versions, for example, a radio version and an extended version.
“Track” means an audio and/or video data block. A track may be on a disc, such as, for example, a Blu-ray Disc, a CD or a DVD.
“User” means a consumer, client, and/or client device in a marketplace of products and/or services.
“User device” (such as “client”, “client device”, “user computer”) is a hardware system, a software operating system and/or one or more software application programs. A user device may refer to a single computer or to a network of interacting computers. A user device may be the client part of a client-server architecture. A user device typically relies on a server to perform some operations. Examples of a user device include without limitation a television, a CD player, a DVD player, a Blu-ray Disc player, a personal media device, a portable media player, an iPod™, a Zoom Player, a laptop computer, a palmtop computer, a smart phone, a cell phone, a mobile phone, an MP3 player, a digital audio recorder, a digital video recorder, an IBM-type personal computer (PC) having an operating system such as Microsoft Windows™, an Apple™ computer having an operating system such as MAC-OS, hardware having a JAVA-OS operating system, and a Sun Microsystems Workstation having a UNIX operating system.
“Web browser” means any software program which can display text, graphics, or both, from Web pages on Web sites. Examples of a Web browser include without limitation Mozilla Firefox™ and Microsoft Internet Explorer™.
“Web page” means any document written in a mark-up language including without limitation HTML (hypertext mark-up language) or VRML (virtual reality modeling language), dynamic HTML, XML (extended mark-up language) or related computer languages thereof, as well as any collection of such documents reachable through one specific Internet address or at one specific Web site, or any document obtainable through a particular URL (Uniform Resource Locator).
Multimedia content includes video content, audio content, still imagery, applications, animations, and the like. Applications include code, scripts, widgets, games and the like. Video content includes television programs, movies, video recordings, and the like. Audio content includes music, audio recordings, podcasts, radio programs, spoken audio, and the like. Still imagery includes photos, graphics, and the like. The terms “content,” “media content,” and “multimedia content” include scheduled content and unscheduled content. Scheduled content includes, for example, broadcast content and multicast content. Unscheduled content includes, for example, on-demand content, pay-per-access content, downloaded content, streamed content, and stored content.
In one embodiment, the media server 104 is a personal computer (PC) running a media server application such as Windows Media Center, or the like. Content from the content source 102 may be delivered through different types of transmission paths. Example transmission paths include a variety and/or combination of wired and/or wireless audio, video and/or television content distribution and/or delivery networks such as, for example, cable, satellite, terrestrial, analog, digital, standard definition, high definition, RF (UHF, VHF) and/or broadcast networks. Example transmission paths also include a variety and/or combination of wired and/or wireless wide-area data networks, such as, for example, the Internet, an intranet, and the like.
The media server 104 records multimedia content in a selected format to a disk drive or to another suitable storage device. The media server 104 is communicatively coupled to a user device 106, such as a television, an audio device, a video device, and/or another type of user and/or CE device. The media server 104 delivers the multimedia content to the user device 106 upon receiving the appropriate instructions from a suitable user input device, such as a remote control, a keyboard, a mouse, a trackball, a touchpad, a stylus, buttons located on the media server 104, itself, or other similar input devices. In turn, the user device 106 presents the multimedia content to a user. In some cases the user device 106 is part of a network, as further described below in relation to
A user can control the operation of the user device 106 via a suitable user input means, such as buttons located on the user device 106, itself, or a remote control device, a keyboard, a mouse, a trackball, a touchpad, a stylus, or other similar input devices. In one embodiment, a single remote control device can be used to control both the user device 106 and the media server 104. The multimedia content recorded onto the media server 104 is viewed and/or heard by the user at a time chosen by the user.
The media server 104 may be located in close proximity to a user device 106, or may exist in a remote location, such as in another room of a household, or on a server of a multimedia content provider.
The media server 104 periodically receives scheduled listings data 110 via a traditional scheduled listings data path 114 through a network, such as a proprietary network or the Internet. The media server 104 stores the received scheduled listings data 110 in a suitable storage device.
The scheduled listings data 110 are typically provided by a content provider, and include schedule information corresponding to specific multimedia programs. The scheduled listings data 110 typically are used in conjunction with EPG data, which, as described above, are used to provide media guidance for content including scheduled and unscheduled television content as well as other forms of content. The media guidance is provided by, for example, a media guidance module. The media guidance allows a user to navigate, select, discover, search, browse, view, “consume,” schedule, record, and/or playback recordings of content by time, title, channel, genre, etc., by use of a user input device, such as a remote control device, a keyboard, a mouse, a trackball, a touchpad, a stylus, buttons located on the media server, itself, or other similar input devices.
As shown in
In one embodiment, an external database 116 is located on a server remote from the media server 104, and communicates with the media server 104 via a network 112, such as a proprietary network or the Internet. As new theme song data is generated and/or discovered, updates can be requested by the internal database 108, or automatically pushed to the internal database 108 from the external database 116 over the network 112. For example, if a new multimedia program is scheduled to appear in an upcoming season, new corresponding theme song data can be generated, stored in the external database 116, and downloaded to the internal database 108 before the new program is broadcast.
Internal database 108 and/or the external database 116 may also be divided into multiple distinct databases. For example, the internal database 108 may be divided based on the type of data being stored by generating a database configured for storing photos, video, music, etc.
Upon scheduling a multimedia program, the media server 104 tunes to the channel based on received scheduled listings data 110 at a predetermined amount of time prior to the scheduled program start time. Once tuned to the channel, the media server 104 captures a portion of audio content received from the content source 102.
The media server 104 accesses content source(s) 102 and retrieves content in a form such as audio and video streams from the content source(s) 102 via multimedia signal lines 330 of
The media server 104 also includes a main memory 214. In one example embodiment, the main memory 214 is random access memory (RAM). The media server 104 also includes a storage device 216. In one example embodiment, the database 108, which, as described above, stores theme song data, is included in the storage device 216. The storage device 216 (also sometimes referred to as “secondary memory”) may also include, for example, a hard disk drive and/or a removable storage drive, representing a disk drive, a magnetic tape drive, an optical disk drive, etc. As will be appreciated, the storage device 216 may include a computer-readable storage medium having stored thereon computer software and/or data.
In alternative embodiments, the storage device 216 may include other similar devices for allowing computer programs or other instructions to be loaded into the media server 104. Such devices may include, for example, a removable storage unit and an interface, a program cartridge and cartridge interface such as that found in video game devices, a removable memory chip such as an erasable programmable read only memory (EPROM), or programmable read only memory (PROM) and associated socket, and other removable storage units and interfaces, which allow software and data to be transferred from the removable storage unit to the media server 104.
The communications interface 210 provides connectivity to a network 112, such as a proprietary network or the Internet. The communications interface 210 also allows software and data to be transferred between the media server 104 and external devices. Examples of the communications interface 210 may include a modem, a network interface such as an Ethernet card, a communications port, a Personal Computer Memory Card International Association (PCMCIA) slot and card, and the like. In one example embodiment, communications interface 210 is an electronic communications interface, but in other embodiments, communications interface 210 can be an electromagnetic, optical, or other suitable type of communications interface 210. The transferred software and data are provided to and/or from the communications interface 210 via a communications path. This communication path may be implemented by using wire, cable, fiber optics, a telephone line, a cellular link, an RF link, and/or other suitable communication path.
In one embodiment, the communications interface 210 provides connectivity between the media server 104 and the external database 116 via the network 112. The communications interface 210 also provides connectivity between the media server 104 and the scheduled listings data 110 via the traditional scheduled listings data path 114. The network 112 preferably includes a proprietary network and/or the Internet.
A remote control interface 218 decodes signals received from a remote control 204, such as a television remote control or other user input device, and communicates the decoded signals to the processor 212. The decoded signals, in turn, are translated and processed by the processor 212.
A processor 212 first loads the computer-executable process steps (encoded in machine-executable instructions) from the storage device 216, or another storage device, into a region of a memory 214. Once loaded, the processor 212 executes the process steps stored in the memory 214.
As shown in
As will be described below in more detail, the presentation layer module 401 accesses the guided browse function 404, which includes a hierarchical tree structure having nodes that correspond to at least one query. The presentation layer module 401 sends the guided browse function 404 a request to receive at least one static visual representation of a node that is in a top level of the hierarchical tree structure. The presentation layer module 401 displays the received static visual representation such that it is selectable by a user. In response to user selection of the static visual representation, the presentation layer module 401 sends the guided browse function 404 a request to execute a corresponding static query to receive visual representations of content stored in the content source, and displays the received visual representations such that they are selectable by the user. In response to user selection of a received visual representation, the presentation layer module 401 sends the guided browse function 404 a request to execute a corresponding dynamic query to receive visual representations of content stored in the content source, and displays the visual representations received from the dynamic query such that they are selectable by the user. The dynamic query corresponds to a node that is a child of a node that corresponds to a previously executed query. The visual representations received from the dynamic query match the corresponding selected visual representation.
In the example embodiment, the presentation layer module 401 is stored as computer-executable process steps encoded in machine-executable instructions. The computer-executable process steps are for browsing content stored in the content source. The computer-executable process steps of the presentation layer module 401 are stored in storage device 216 of the media server 104 of
In other embodiments, the presentation layer module 401 of
The guided browse function 404 is constructed from a content source identifier. The content source identifier identifies a content source that is searched by the guided browse function 404. In response to receiving a request to browse content, the guided browse function 404 is constructed to search the content stored in the identified content source.
In the example embodiment, the guided browse function 404 is stored as computer-executable process steps encoded in machine-executable instructions. The computer-executable process steps are for searching the content stored in the content source. The computer-executable process steps of the guided browse function 404 are stored in storage device 216 of the media server 104 of
In other embodiments, the guided browse function 404 of
The guided browse function 404 of
In a case where the guided browse function 404 is in the non-native browse mode, the guided browse function 404 includes a hierarchical structure that defines a hierarchy of content stored in the content source that is independent of the file structure of the content stored in the content source. The hierarchical structure includes nodes that represent search queries. In response to receiving a request to browse content corresponding to a selected node in the hierarchical tree structure, the guided browse function 404, when in the non-native browse mode, searches the content stored in the content source by using a search query corresponding to the selected node in the hierarchical structure. Thus, the search query used by the guided browse function 404 in the non-native browse mode is determined in accordance with the hierarchical structure that defines the hierarchy of content stored in the content source. In this manner, the guided browse function 404 in the non-native browse mode browses content stored in the content source by sequentially executing queries corresponding to nodes of the hierarchical tree structure, in accordance with a hierarchy of the hierarchical tree structure. In the embodiments described above in which the guided browse function 404 of
In the example embodiment, and as described above with respect to the presentation layer module 401, the hierarchical structure is a tree structure that contains tree nodes. The tree nodes are divided into two groups, “static nodes” and “dynamic nodes”.
A “static node” corresponds to a static query for content stored in the content source. An example static query for music content is a query to search for all “Artists” represented by the content stored in the content source. A “dynamic node” represents the result set of a search operation. Queries corresponding to dynamic nodes are dynamic queries, meaning that they are based on a selected search result of a previously executed query. An example dynamic query for music content is a query for all “Albums” of a selected artist that is identified by performing a static query for all “Artists”. Example hierarchical structures are described in more detail below with respect to
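Purely for illustration, the two node groups might be modeled as in the following TypeScript sketch. The type names (StaticNode, DynamicNodeTemplate) and the query strings are hypothetical and are not part of this description; the sketch merely shows a static node holding a fixed query and a dynamic node deriving its query from a previously selected result.

```typescript
// Hypothetical sketch of static vs. dynamic tree nodes; not the disclosed data model.
interface DynamicNodeTemplate {
  label: string;                               // e.g. "Albums"
  buildQuery: (selection: string) => string;   // query depends on a previously selected result
}

interface StaticNode {
  kind: "static";
  label: string;                               // e.g. "Artists"
  query: string;                               // fixed query, e.g. "all artists"
  childTemplate?: DynamicNodeTemplate;         // dynamic children, if any
}

// Example: a static "Artists" node whose children are dynamic "albums of X" queries.
const artists: StaticNode = {
  kind: "static",
  label: "Artists",
  query: "all artists",
  childTemplate: {
    label: "Albums",
    buildQuery: (artist) => `all albums where artist = "${artist}"`,
  },
};

// Resolving a dynamic node's query from a selection made in the static node's result set.
console.log(artists.childTemplate?.buildQuery("Artist A")); // all albums where artist = "Artist A"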
The data returned by the guided browse function 404 includes content objects and container objects. A container object represents a collection of related content objects. A content object represents media content that is presented by the presentation layer module 401. As described above, media content includes video content, audio content, still imagery, applications, animations, and the like. Applications include code, scripts, widgets, games and the like. Video content includes television programs, movies, video recordings, and the like. Audio content includes music, audio recordings, podcasts, radio programs, spoken audio, and the like. Still imagery includes photos, graphics, and the like. The terms “content,” “media content,” “multimedia content” include scheduled content and unscheduled content. Scheduled content includes, for example, broadcast content and multicast content. Unscheduled content includes, for example, on-demand content, pay-per-access content, downloaded content, streamed content, and stored content.
A content object includes an Application Programming Interface (API) that exposes a getName( ) module, which returns the display name, or other visual representation such as, for example, an icon or thumbnail, of the content object, and a module that is called by the presentation layer module 401 to present the media content that is represented by the content object. The content object's interface or API also exposes a getInterface( ) module that is used to determine that the content object is a content object, as distinguished from a container object.
A container object includes an API that exposes a displayName( ) module that returns the display name or other visual representation, such as, for example, an icon or thumbnail of the container object. The container object's interface or API also exposes a getInterface( ) module that is used to determine that the container object is a container object, as distinguished from a content object.
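The following TypeScript sketch is a minimal, illustrative model of the two object interfaces. The method names getName( ), displayName( ), and getInterface( ) are taken from the description above; the InterfaceKind union, the present( ) method name, and the sample objects are assumptions made only for the sketch.

```typescript
// Illustrative model of the content object and container object interfaces described above.
type InterfaceKind = "content" | "container";

interface ContentObject {
  getName(): string;             // display name, icon, or thumbnail reference
  getInterface(): InterfaceKind; // identifies this object as a content object
  present(): void;               // hypothetical name for the module that presents the media
}

interface ContainerObject {
  displayName(): string;         // display name, icon, or thumbnail reference
  getInterface(): InterfaceKind; // identifies this object as a container object
}

// Hypothetical sample objects.
const song: ContentObject = {
  getName: () => "Track 01",
  getInterface: () => "content",
  present: () => console.log("playing Track 01"),
};

const album: ContainerObject = {
  displayName: () => "Album 1",
  getInterface: () => "container",
};

// The presentation layer can distinguish the two kinds of object via getInterface( ).
for (const obj of [song, album] as Array<ContentObject | ContainerObject>) {
  console.log(obj.getInterface());
}
```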
In the example embodiment, the content object's getName( ) module, the content object's getInterface( ) module, the container object's displayName( ) module, and the container object's getInterface( ) module are each stored as computer-executable process steps encoded in machine-executable instructions. The computer-executable process steps of the modules are stored in storage device 216 of the media server 104 of
In other embodiments, one or more of the content object's getName( ) module, the content object's getInterface( ) module, the container object's displayName( ) module, and the container object's getInterface( ) module are hardware devices that include electronic circuitry constructed to perform the respective process. In an example embodiment, the electronic circuitry includes special purpose processing circuitry. In other example embodiments, the electronic circuitry includes at least one general purpose processor that is constructed to execute computer-executable process steps encoded in machine-executable instructions that are stored on a computer-readable storage medium of the hardware device.
In the case where the guided browse function 404 of
In the case where the guided browse function 404 is in the native browse mode, each container object corresponds to a container in the native tree hierarchy of the content source.
Generally, a user controls the media server application 400 to browse and play media content. By using an input device, the user interacts with a user interface module 402 to select a displayed item, for example, an item that is displayed on a display or user device 106. The displayed items include display names, or other visual representations, such as, for example, icons or thumbnails of content objects and container objects.
In response to the user's selection of the displayed item, the presentation layer module 401 determines whether the item corresponds to a content object or a container object. If the selected item corresponds to a content object, then the presentation layer module 401 presents the content represented by the content object, for example, by playing audio, video, or an animation, by running an application, or by displaying still imagery.
If the selected item is a container object, then the user interface module 402 asks the guided browse function 404 for objects, such as container objects or content objects, that are contained within the selected container object. In a case where the guided browse function 404 is in the non-native browse mode, the objects contained in the selected container object are defined according to the hierarchical structure used by the guided browse function 404. In a case where the guided browse function 404 is in the native browse mode, the objects contained in the selected container object are defined according to the native tree hierarchy of the content source corresponding to the container object. The user interface module 402 asks the guided browse function 404 for objects contained in the selected container object by invoking or calling a getChildren( ) module that is exposed by the interface or API of the guided browse function 404. The getChildren( ) module provides objects contained in a selected container object.
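One possible shape of this selection-handling flow is sketched below in TypeScript. The getChildren( ), getName( ), and getInterface( ) names come from the description; the Item and GuidedBrowse types, the present( ) method, and the Promise-based plumbing are assumptions.

```typescript
// Hypothetical sketch of handling a user's selection of a displayed item.
type Kind = "content" | "container";

interface Item {
  getName(): string;
  getInterface(): Kind;
  present?(): void; // only meaningful for content objects
}

interface GuidedBrowse {
  getChildren(container: Item): Promise<Item[]>; // results are delivered asynchronously
}

async function onSelect(
  item: Item,
  browse: GuidedBrowse,
  display: (names: string[]) => void,
): Promise<void> {
  if (item.getInterface() === "content") {
    // Content object: present the media (play audio/video, run an application, show an image).
    item.present?.();
    return;
  }
  // Container object: ask the guided browse function for the children of the container,
  // then display their names so that they are selectable by the user.
  const children = await browse.getChildren(item);
  display(children.map((c) => c.getName()));
}
```

Here the Promise merely stands in for the asynchronous delivery of results described herein; any other asynchronous mechanism could serve equally well.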
In the example embodiment, the guided browse function 404's getChildren( ) module is stored as computer-executable process steps encoded in machine-executable instructions. The computer-executable process steps of the getChildren( ) module are stored in storage device 216 of the media server 104 of
In other embodiments, the getChildren( ) module is a hardware device that includes electronic circuitry constructed to provide objects contained in a selected container object. In an example embodiment in which the guided browse function 404 is a hardware device, the getChildren( ) module is electronic circuitry that is included in the guided browse function 404 hardware device. However, in other embodiments, the guided browse function 404 and the getChildren( ) module are separate hardware devices. In an example embodiment, the electronic circuitry includes special purpose processing circuitry. In other example embodiments, the electronic circuitry includes at least one general purpose processor that is constructed to execute computer-executable process steps encoded in machine-executable instructions that are stored on a computer-readable storage medium of the hardware device.
It should be understood that in various embodiments both the guided browse function 404 and the getChildren( ) module are hardware devices. In other embodiments, the guided browse function 404 is a hardware device and the getChildren( ) module is computer-executable process steps stored on a computer-readable storage medium. In other embodiments, the guided browse function 404 is computer-executable process steps stored on a computer-readable storage medium, and the getChildren( ) module is a hardware device. In other embodiments, both the guided browse function 404 and the getChildren( ) module are computer-executable process steps stored on at least one computer-readable storage medium.
Reverting to the discussion of user selection of a displayed item, in a case where the presentation layer module 401 determines that a user has selected a display item that corresponds to a container object, and the guided browse function 404 of
In a case where the presentation layer module 401 determines that a user has selected a display item that corresponds to a container object, and the guided browse function 404 is in the native browse mode, in response to the selection of the container object, the guided browse function 404 browses the file structure of the content source, and returns the content stored in the content source to the presentation layer module 401, asynchronously, via the control module 403. The presentation layer module 401 presents received data to the user by, for example, displaying the results data on a display of the device 106. Thus, in the native browse mode, the guided browse function 404 returns data, such as the objects contained in the selected container object, in response to the user's selection according to the file structure of the content stored in the content source.
In the example embodiment, the modules provided by the guided browse function 404 are stored as computer-executable process steps encoded in machine-executable instructions. The computer-executable process steps of the modules are stored in storage device 216 of the media server 104 of
The presentation layer module 401 of
For instance, as shown in
The presentation layer module 401 also uses the content object interface 502 to get data for a selected content object and uses the playlist interface 501, of a playlist object, to get data for a selected playlist. In response to the user selection, for each content object included in the selected playlist, the playlist object uses the content object interface 502 to get the corresponding name of the content object that is to be displayed by the presentation layer module 401.
The presentation layer module 401 uses the media player interface 503, of a media player, to play, run or display either a selected playlist or a selected content object. In the case where a selected playlist is to be played, the media player uses the playlist interface 501 to get data for the selected playlist that is to be played. In turn, the playlist object uses the content object interface 502 to get the data for each content object included in the selected playlist to be played, run, or displayed by the media player. In the example embodiment, the media player is a software media player application that is stored in the storage device 216 of the media server 104 of
In other example embodiments, the media player is stored and executed by an external hardware device, such as, for example, the device 106.
In the case where a selected content object is to be played, run, or displayed, the media player uses the content object interface 502, of the selected content object, to get the corresponding data to be played, run or displayed by the media player.
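As an informal illustration, the playlist and media player composition described above might look as follows in TypeScript; the getData( ) and getItems( ) names and the object shapes are hypothetical stand-ins for the playlist interface 501, the content object interface 502, and the media player interface 503.

```typescript
// Hypothetical sketch of the playlist/media player composition; getData( ) and getItems( )
// are assumed names standing in for the interfaces described above.
interface ContentObject {
  getName(): string;
  getData(): string; // stand-in for the data to be played, run, or displayed
}

interface Playlist {
  getItems(): ContentObject[];
}

const mediaPlayer = {
  // Play a single selected content object.
  playContent(obj: ContentObject): void {
    console.log(`playing ${obj.getName()}:`, obj.getData());
  },
  // Play a selected playlist: the playlist resolves each contained content object in turn.
  playPlaylist(list: Playlist): void {
    for (const item of list.getItems()) this.playContent(item);
  },
};

const track = (name: string): ContentObject => ({
  getName: () => name,
  getData: () => `<stream for ${name}>`,
});

const playlist: Playlist = { getItems: () => [track("Track 01"), track("Track 02")] };
mediaPlayer.playPlaylist(playlist);
```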
Generally, the guided browse module 604 generates and manages guided browse functions for content sources. As shown in
The minims content library (“mini media server content library”) 601 provides content stored on a mass storage device, such as, for example, a USB memory stick, or the like. The active search module 603 provides content by communicating with a search service via a network. The content messaging module 605 provides content by communicating with a messaging service via a network. The Mediaspace module 602 provides content from content servers via a network. The mlight_cds (“Mediabolic lightweight content directory service”) content source 606 is a Universal Plug and Play Content Directory Service. The MPV (“Music/Photo/Video”) content library 607 is a content source for audio, still imagery, and video contents. The IMDiscovery module 608 discovers Universal Plug and Play servers on a network.
The presentation layer module 401 communicates with guided browse module 604 in an asynchronous manner. The guided browse module 604 includes a function generation module 612 and one or more guided browse functions 404 that are generated by the function generation module 612. The guided browse module 604 communicates with a plurality of content sources, such as minims content library module 601, Mediaspace module 602, Active Search module 603, and Content Messaging module 605.
The guided browse module 604 communicates with minims content library module 601 and Active Search module 603 in a synchronous manner, and communicates with Mediaspace module 602 and Content Messaging module 605 in an asynchronous manner.
Mediaspace module 602 communicates with mlight_cds module 606 and MPV content library 607 in a synchronous manner, and communicates with IMDiscovery module 608 in an asynchronous manner.
The presentation layer module 401 communicates with playlist module 609 in an asynchronous manner. The playlist module 609 corresponds to playlist interface 501 described in relation to
The presentation layer module 401 communicates with media player module 610 in an asynchronous manner. The media player module 610 corresponds to the media player interface 503 of
The media player module 610 provides media playback. For example, the media player module 610 determines what media format is preferred, for example, according to the media player device's compatibility. The media player module 610 switches to a next song in a playlist, handles transition effects, and the like. The playback manager module 611 provides media playback capability such as, for example, decoding video and/or audio codecs, trick mode, controlling the video and/or audio hardware, and the like.
As will be described in more detail below, the function generation module 612 of
As described above, a hierarchical structure defines a hierarchy of content stored in the content source that is independent from the file structure of the content stored in the content source.
The trees returned from any top level container are known as the result level. As shown in
As shown in
As shown in
The data returned by browsing the “album” top level container node are container nodes for letters corresponding to album names represented in the content source. The data returned by browsing an individual letter container for the album top level container are album container nodes. Each individual album letter container node represents a search query for all albums in the content source whose names start with the respective letter. The data returned by browsing an individual album container are song content objects. Each individual album container node represents a search query for all songs in the content source that are contained in the respective album.
The data returned by browsing the “artist” top level container node are container nodes for letters corresponding to artist names represented in the content source. The data returned by browsing an individual letter container for the artist top level container are artist container nodes. Each individual artist letter container node represents a search query for all artists in the content source whose names start with the respective letter. The data returned by browsing an individual artist container are song content objects. Each individual artist container node represents a search query for all songs in the content source that are related to the respective artist.
The data returned by browsing the “all tracks” top level container node are container nodes for letters corresponding to the song content objects contained in the content source. The data returned by browsing an individual letter container for the “all tracks” top level container are song content objects. Each individual song letter container node represents a search query for all songs in the content source whose names start with the respective letter.
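For illustration, a music tree of this shape (top level “album”, “artist”, and “all tracks” branches, each with per-letter container nodes carrying a “starts with” query) might be constructed as in the TypeScript sketch below; the TreeNode shape and the query strings are assumptions, not the disclosed format.

```typescript
// Illustrative construction of the music tree described above; the TreeNode shape and the
// query strings are assumptions, not the disclosed format.
interface TreeNode {
  name: string;
  query: string;
  children: TreeNode[];
}

const letters = Array.from({ length: 26 }, (_, i) => String.fromCharCode(65 + i)); // "A".."Z"

// Each letter container under a top level branch represents a query for items of that kind
// whose names start with the letter.
function branch(name: string, itemKind: string): TreeNode {
  return {
    name,
    query: `all ${itemKind}`,
    children: letters.map((letter) => ({
      name: letter,
      query: `${itemKind} whose names start with "${letter}"`,
      children: [],
    })),
  };
}

const musicTree: TreeNode = {
  name: "music",
  query: "",
  children: [branch("album", "albums"), branch("artist", "artists"), branch("all tracks", "tracks")],
};

console.log(musicTree.children.map((c) => `${c.name}: ${c.children.length} letter containers`));
```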
UPnP is a set of networking protocols promulgated by the UPnP Forum. The goals of UPnP are to allow devices to couple seamlessly and to simplify the implementation of networks for data sharing, communications, and entertainment, and in corporate environments for simplified installation of computer components. UPnP achieves this by defining and publishing UPnP device control protocols (DCP) built upon open, Internet-based communication standards. The term UPnP is derived from plug-and-play, a technology for dynamically attaching devices to a computer, although UPnP is not directly related to the earlier plug-and-play technology. UPnP devices are “plug-and-play” in that when coupled to a network they automatically announce their network address and supported device and services types, enabling clients that recognize those types to use the device. See <http://en.wikipedia.org/wiki/Upnp>, the entire contents of which are incorporated by reference as if set forth in full herein.
DLNA (Digital Living Network Alliance) is a standard used by manufacturers of consumer electronics to allow entertainment devices to share their content with each other across a home network. DLNA provides for the use of digital media between different consumer electronic devices. For example, a DLNA compliant TV will interoperate with a DLNA compliant PC to play music, photos or videos. The specification also includes DRM (digital rights management). See <http://en.wikipedia.org/wiki/Dlna>, the entire contents of which are incorporated by reference as if set forth in full herein.
Regardless of the particular protocol used, at step 902 of
Example content sources include a Universal Plug and Play Content Directory Service (“UPnP CDS”), a local content library, a minims content library, an external content provider, and an aggregated external content provider. External content providers include, for example, Internet content providers such as www.Youtube.com and the like, and television content providers such as CBS and the like. Aggregated external content providers include external content providers that aggregate information from different content providers. For example, an aggregated external content provider can provide content from different external content providers, such as, for example, content from www.Netflix.com and content from www.Blockbuster.com.
As shown at step 903, the presentation layer module 401 selects a content source and a content type, and asks the function generation module 612 to determine whether the selected content source supports search functionality for the selected content type. Example search functionality includes UPnP Search, DLNA type search, or another type of search functionality. In other words, the presentation layer module 401 asks the function generation module 612 to determine whether the selected content source supports a guided browse function of the selected content type, such that the guided browse function provides browsing of the selected content type in accordance with a hierarchical structure of content stored in the content source, the hierarchical structure being independent from the file structure of the content stored in the content source.
As shown at step 904, the presentation layer module 401 receives a response from the function generation module 612 which indicates that the selected content source supports search functionality for the selected content type, and thus supports a guided browse function that provides browsing in accordance with the hierarchical structure.
As shown at step 905, the presentation layer module 401 asks the function generation module 612 to generate the hierarchical structure to be used by the guided browse function to browse content stored in the content source. In the example embodiment illustrated in
As shown at step 906, the presentation layer module 401 invokes a generateFunction( ) module provided by the function generation module 612 to generate the guided browse function 404. The generateFunction( ) module takes as inputs a content source identifier for the selected content source, a content type, and a hierarchical structure.
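A rough TypeScript rendering of this call is shown below, purely for illustration. The generateFunction( ) name and its three inputs come from the description; the parameter types, the GuidedBrowseFunction shape, and the "native" content type value are assumptions.

```typescript
// Hypothetical rendering of the generateFunction( ) call; the parameter types, the
// GuidedBrowseFunction shape, and the "native" content type value are assumptions.
type ContentType = "music" | "photo" | "video" | "native";

interface HierarchicalStructure {
  rootNodes: string[]; // assumed shape: names of the top level nodes
}

interface GuidedBrowseFunction {
  sourceId: string;
  contentType: ContentType;
  tree?: HierarchicalStructure;
}

// generateFunction takes a content source identifier, a content type, and a hierarchical structure.
function generateFunction(
  sourceId: string,
  contentType: ContentType,
  tree?: HierarchicalStructure,
): GuidedBrowseFunction {
  // For the native browse content type, any hierarchical structure input is ignored.
  return contentType === "native" ? { sourceId, contentType } : { sourceId, contentType, tree };
}

// Example invocation for a music guided browse function over a hypothetical content source.
const fn = generateFunction("source-1", "music", { rootNodes: ["album", "artist", "all tracks"] });
console.log(fn);
```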
In the example embodiment, the function generation module 612's generateFunction( ) module is stored as computer-executable process steps encoded in machine-executable instructions. The computer-executable process steps are for generating the guided browse function 404. The computer-executable process steps of the generateFunction( ) module are stored in storage device 216 of the media server 104 of
In other embodiments, the generateFunction( ) module is a hardware device that includes electronic circuitry constructed to generate the guided browse function 404. In an example embodiment in which the function generation module 612 is a hardware device, the generateFunction( ) module is electronic circuitry that is included in the function generation module 612 hardware device. However, in other embodiments, the function generation module 612 and the generateFunction( ) module are separate hardware devices. In an example embodiment, the electronic circuitry includes special purpose processing circuitry. In other example embodiments, the electronic circuitry includes at least one general purpose processor that is constructed to execute computer-executable process steps encoded in machine-executable instructions that are stored on a computer-readable storage medium of the hardware device.
It should be understood that in various embodiments both the function generation module 612 and the generateFunction( ) module are hardware devices. In other embodiments, the function generation module 612 is a hardware device and the generateFunction( ) module is computer-executable process steps stored on a computer-readable storage medium. In other embodiments, the function generation module 612 is computer-executable process steps stored on a computer-readable storage medium, and the generateFunction( ) module is a hardware device. In other embodiments, both the function generation module 612 and the generateFunction( ) module are computer-executable process steps stored on at least one computer-readable storage medium.
As shown in the example embodiment illustrated in
In other embodiments, the hierarchical structure can be a video content tree structure, audio content tree structure, still imagery tree structure, applications tree structure, animations tree structure, television programs tree structure, movies tree structure, video recordings tree structure, music tree structure, audio recordings tree structure, podcasts tree structure, radio programs tree structure, spoken audio tree structure, photos tree structure, or graphics tree structure.
After the guided browse function 404 has been generated, event notifications are sent to the presentation layer 401. The event notifications comply with one or more protocols such as UPnP, DLNA, and/or another protocol. The event notifications contain the root container object of the guided browse function 404. The root container object includes the top level contents of the content source represented by the guided browse function 404. In particular, the root container object contains the top level container objects such as top level nodes in the hierarchical structure. In the example embodiment of
As shown at step 907, the presentation layer 401 detects user selection of a top level container object, and invokes the getChildren( ) module provided by the guided browse interface 504 to ask the guided browse function 404 for the list of children, or contents, of the selected top level container object, such as, for example, top level nodes in the hierarchical structure. As shown at step 908, the presentation layer 401 asynchronously receives the list of child objects 921. As shown at step 909, for each received child object, the presentation layer 401 invokes the getName( ) module of the child object to get the name of the child object 921.
As shown at step 910, for each child object 921, the presentation layer 401 invokes the getInterface( ) module of the child object to determine whether the child object is a container object or a content object. If the getInterface( ) module returns a container object interface, then the child is a container object. If the getInterface( ) module returns a content object interface, then the child is a content object.
As shown at step 911, the presentation layer 401 displays the names of the child objects in a manner such that they are selectable by a user. In a case where a displayed name of an item is selected, the presentation layer 401 determines whether the object corresponding to the selected item is a container object or a content object, by using the getInterface( ) module.
In a case where the item corresponds to a container object, the presentation layer 401 invokes the getChildren( ) module of the guided browse interface 504 to ask the guided browse function 404 for the list of children, or contents, of the selected container object. For each child object, the presentation layer 401 invokes the getName( ) module of the child object's interface to get the name of the child object 921, and displays the names of the child objects in a manner such that they are selectable by a user.
In a case where the item corresponds to a content object, the presentation layer 401 determines the type of the content object, such as video content, audio content, still imagery, applications, animations, etc., and generates the appropriate type of media player for the type of content, then enqueues the item for playback by the media player. When the media player is playing, running, or displaying items, it sends playback status events to the presentation layer 401, which displays the status to the user.
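The type-based dispatch described in this step might be sketched as follows; the MediaType values mirror the content categories named above, while getType( ), createPlayerFor( ), and the queueing details are hypothetical.

```typescript
// Hypothetical sketch of dispatching a selected content object to a media player by type;
// getType( ), createPlayerFor( ), and the queueing details are assumed.
type MediaType = "video" | "audio" | "image" | "application" | "animation";

interface ContentObject {
  getName(): string;
  getType(): MediaType;
}

interface MediaPlayer {
  enqueue(item: ContentObject): void;
}

function createPlayerFor(type: MediaType): MediaPlayer {
  const queue: ContentObject[] = [];
  return {
    enqueue(item) {
      queue.push(item);
      console.log(`[${type} player] enqueued ${item.getName()} (${queue.length} in queue)`);
    },
  };
}

// On selection of a content object: generate the appropriate kind of player, then enqueue the item.
function playSelected(item: ContentObject): void {
  const player = createPlayerFor(item.getType());
  player.enqueue(item);
}

playSelected({ getName: () => "Track 01", getType: () => "audio" });
```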
If the presentation layer module 401 receives a response from the function generation module 612 which indicates that the selected content source does not support search for the selected content type (“No” at block 1003), processing proceeds to block 1004. In this case, the content source does not support a guided browse function that provides browsing in accordance with the hierarchical structure. Accordingly, at block 1004, the presentation layer module 401 invokes the generateFunction( ) module provided by the function generation module 612 to generate the guided browse function. In this case, the generateFunction( ) module takes as inputs a content source identifier for the selected content source, and a native browse content type. Because the guided browse function has the native browse content type, any hierarchical structure input is ignored. The hierarchical structure is not used in the case of a guided browse function having the native browse content type because such a guided browse function returns the content stored in the content source according to the file structure of the content stored in the content source. As with other types of guided browse functions, the guided browse function having the native browse content type returns content to the presentation layer module 401 asynchronously.
If the presentation layer module 401 receives a response from function generation module 612 which indicates that the selected content source does support search for the selected content type (“Yes” at block 1003), processing proceeds to block 1005. In this case, the guided browse function is generated as described above with respect to
At block 1006, the guided browse function sends notification events to the presentation layer 401. The notification events contain the root container object of the guided browse function.
At block 1007, the presentation layer 401 detects user selection of a top level container object, and invokes the getChildren( ) module of the guided browse interface to ask the guided browse function for the list of children, or contents, of the selected top level container object. In response to receiving the call to the getChildren( ) module, at block 1008, the guided browse function determines whether the guided browse function has a native browse type, meaning that it is in the native browse mode. In other words, the guided browse function determines whether a hierarchical tree structure is available (the native browse type indicating that no such structure is used).
If the guided browse function determines that the guided browse function has a native browse type (“No” at block 1008), then at block 1009, the guided browse function uses a browse functionality of the content source to generate the child nodes which are the results to be returned to the presentation layer module 401. In the example embodiment described with respect to
If the guided browse function determines that the guided browse function does not have a native browse type (“Yes” at block 1008), then at block 1010, the guided browse function uses a search functionality of the content source to generate the child nodes, which are the results to be returned to the presentation layer module 401. The child nodes are generated by searching the content source according to the hierarchical tree structure of the guided browse function. In particular, the guided browse function searches the content stored in the content source by using a search query corresponding to the selected top level container object. The search query is defined by the hierarchical tree structure of the guided browse function. In the example embodiment described with respect to
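The exact criteria strings used by such a search are not given here; purely as an illustration, a selected node might be rendered into a UPnP ContentDirectory search criteria string along the following lines. The node shape is assumed, and the property and operator names (upnp:class, dc:title, upnp:artist, derivedfrom, contains) follow common ContentDirectory usage rather than anything specific to this description; contains is used only as an approximation of a "starts with" match.

```typescript
// Illustrative mapping from a selected tree node to a UPnP ContentDirectory search criteria
// string. The node shape is assumed; the property and operator names follow common
// ContentDirectory usage, and "contains" is only an approximation of a starts-with match.
interface SearchNode {
  upnpClass: string;     // e.g. "object.container.person.musicArtist"
  startsWith?: string;   // set for letter container nodes
  matchArtist?: string;  // set for dynamic nodes bound to a selected artist
}

function toSearchCriteria(node: SearchNode): string {
  const clauses = [`upnp:class derivedfrom "${node.upnpClass}"`];
  if (node.startsWith) clauses.push(`dc:title contains "${node.startsWith}"`);
  if (node.matchArtist) clauses.push(`upnp:artist = "${node.matchArtist}"`);
  return clauses.join(" and ");
}

// Letter container "A" under the "artist" branch:
console.log(toSearchCriteria({ upnpClass: "object.container.person.musicArtist", startsWith: "A" }));
// Dynamic "albums of a selected artist" node:
console.log(toSearchCriteria({ upnpClass: "object.container.album.musicAlbum", matchArtist: "Artist A" }));
```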
At block 1011, the guided browse function sends notification events to the presentation layer module 401. The notification events contain the generated child nodes, which can be either container objects or content objects. The generated child nodes, which are the result of the browse or search operation, are sent to the presentation layer module 401 in an asynchronous manner. The presentation layer module 401 displays the names of received child nodes, or items, as described above with respect to
At block 1012, the presentation layer module 401 detects user selection of a displayed child node. In response to detection of user selection of a displayed child node, (“Yes” at block 1012), processing proceeds to block 1013. At block 1013, the presentation layer 401 determines whether a selected child node is a container object or a content object, by using the getInterface( ) module.
In a case where the selected child node is a content object (“No” at block 1013), processing proceeds to block 1014, where the presentation layer 401 determines the type of the content object, such as video content, audio content, still imagery, applications, animations, etc., and generates the appropriate type of media player for the type of content, then enqueues the item for playback by the media player.
In a case where the selected child node is a container object (“Yes” at block 1013), processing returns to block 1007, where the presentation layer 401 invokes the getChildren( ) module of the guided browse interface to ask the guided browse function for the list of children, or contents, of the selected container object. If the content type of the guided browse function is native browse and the content source is UPnP CDS, the guided browse function sends the presentation layer module 401 asynchronous updates for each UPnP container object referenced by the presentation layer module 401. UPnP content directory services are discussed above in relation to
The nodes correspond to at least one query. In an example embodiment, queries corresponding to the nodes of the hierarchical tree structure include the following: a query for all music artists represented by the content stored in the content source; a query for all music albums represented by the content stored in the content source; a query for all music genres represented by the content stored in the content source; a query for all music playlists represented by the content stored in the content source; a query for all music tracks represented by the content stored in the content source; a query for all photo albums represented by the content stored in the content source; a query for all photo slideshows represented by the content stored in the content source; a query for all photos represented by the content stored in the content source; a query for all video playlists represented by the content stored in the content source; a query for all video clips represented by the content stored in the content source; a query for content matching a selected music artist; a query for content matching a selected music album; a query for content matching a selected music genre; a query for content matching a selected music playlist; a query for content matching a selected music track; a query for content matching a selected photo album; a query for content matching a selected photo slideshow; a query for content matching a selected photo; a query for content matching a selected video playlist; a query for content matching a selected video clip; a query for all video content represented by the content stored in the content source; a query for all audio content represented by the content stored in the content source; a query for all still imagery represented by the content stored in the content source; a query for all applications represented by the content stored in the content source; a query for all animations represented by the content stored in the content source; a query for all games represented by the content stored in the content source; a query for all television programs represented by the content stored in the content source; a query for all movies represented by the content stored in the content source; a query for all video recordings represented by the content stored in the content source; a query for all music represented by the content stored in the content source; a query for all audio recordings represented by the content stored in the content source; a query for all podcasts represented by the content stored in the content source; a query for all radio programs represented by the content stored in the content source; a query for all spoken audio represented by the content stored in the content source; a query for all graphics represented by the content stored in the content source; a query for all meta tags represented by the content stored in the content source; a query for all dates represented by the content stored in the content source; a query for content matching a selected meta tag; a query for content matching a selected date; a query for content matching a selected movie; a query for content matching a selected television program; a query for content matching a selected video content; a query for content matching a selected audio content; a query for content matching a selected still image; a query for content matching a selected application; a query for content matching a selected animation; a query for content matching a selected video recording; a query for content matching a selected audio recording; a query for content matching a selected podcast; a query for content matching a selected radio program; a query for content matching a selected spoken audio; a query for content matching a selected game; a query for content matching a selected graphic; a query for all actors represented by the content stored in the content source; a query for all directors represented by the content stored in the content source; a query for all genres represented by the content stored in the content source; a query for content stored in the content source that matches a current user; a query for all new content stored in the content source; a query for all high definition content stored in the content source; a query for favorite content stored in the content source; a query for content matching a selected actor; a query for content matching a selected director; a query for content matching a selected run time; a query for content matching a selected MPAA (Motion Picture Association of America) rating; a query for content matching a selected review rating; a query for television episodes matching a selected television program; a query for content matching a selected television episode; a query for photos matching a selected content; a query for video clips matching a selected content; a query for audio clips matching a selected content; a query for content matching a selected content; a query for video content matching a selected content; a query for audio content matching a selected content; a query for still imagery matching a selected content; a query for applications matching a selected content; a query for animations matching a selected content; a query for games matching a selected content; a query for television programs matching a selected content; a query for movies matching a selected content; a query for video recordings matching a selected content; a query for music matching a selected content; a query for audio recordings matching a selected content; a query for podcasts matching a selected content; a query for radio programs matching a selected content; a query for spoken audio matching a selected content; a query for graphics matching a selected content; a query for awards matching a selected content; a query for cast and crew matching a selected content; a query for actors matching a selected content; a query for directors matching a selected content; a query for a synopsis matching a selected content; a query for biographies matching a selected content; a query for credits matching a selected content; a query for meta tags matching a selected content; and a query for all container objects matching a selected content.
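To make the pattern in the foregoing enumeration concrete, the following is a minimal sketch, assuming a Python representation, of how the paired "all X" (static) and "content matching a selected X" (dynamic) query forms might be templated. The QueryTemplate class and the make_templates helper are illustrative names introduced here for exposition and are not part of the disclosure.

```python
# Illustrative sketch only: the class and field names below are assumptions made
# for exposition; the disclosure does not prescribe a particular data model.
from dataclasses import dataclass

@dataclass(frozen=True)
class QueryTemplate:
    """Pairs a browse category with its 'all X' and 'matching a selected X' forms."""
    category: str     # e.g. "music artist", "photo album", "podcast"
    all_query: str    # static form: enumerate every X in the content source
    match_query: str  # dynamic form: content matching a selected X

def make_templates(category: str) -> QueryTemplate:
    return QueryTemplate(
        category=category,
        all_query=f"all {category}s represented by the content stored in the content source",
        match_query=f"content matching a selected {category}",
    )

# A few of the categories enumerated above.
for t in (make_templates(c) for c in ("music artist", "music album", "photo album", "podcast")):
    print(t.all_query, "|", t.match_query)
```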
A guided navigation feature for an electronic and/or interactive program guide uses the hierarchical node structure to keep track of the user's footprints in the tree. The basic unit of the hierarchical tree structure is a tree node. Tree nodes are application specific and are used as building blocks to construct a tree structure.
The tree nodes of the hierarchical tree structure include nodes for at least one of video content, audio content, still imagery, applications, and animations. Thus, the queries corresponding to the nodes of the hierarchical tree structure include queries for at least one of video content, audio content, still imagery, applications, animations, and the like. The following table lists the possible node types for an example embodiment.
It should be understood that the node types listed in Table 1 are presented by way of example, and not limitation, and that other embodiments can include different node types that correspond to any category of content. In particular, other embodiments include, for example, node types corresponding to any one of video content, audio content, still imagery, applications, animations, games, television programs, movies, video recordings, music, audio recordings, podcasts, radio programs, spoken audio, photos, graphics, directors, actors, genres, new content, high definition content, favorite content, content for a particular user, run times, MPAA ratings, review ratings, television episodes, awards, cast and crew, synopses, biographies, credits, meta tags, and the like.
The tree nodes fall into two groups: “static nodes” and “dynamic nodes”. A static node in the tree structure is a virtual node in the media server application; it does not refer to any existing entity on the content source. A static node is usually a top-level node in a content tree and is used as a parent container for a specific content type. For example, MUSIC_ARTIST_STATIC is displayed as “Artists” and its children are the music artist content containers. A dynamic node in the tree structure represents the result set of a search operation. A dynamic node represents at least one of content objects and container objects of the content source.
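The following is a minimal sketch, assuming a Python data model, of how static and dynamic tree nodes might be represented. Aside from the node names MUSIC_ARTIST_STATIC and MUSIC_ARTISTS_DYNAMIC and the display name “Artists” taken from this description, the class and field names are assumptions made for illustration only.

```python
from dataclasses import dataclass, field
from enum import Enum, auto
from typing import List, Optional

class NodeKind(Enum):
    STATIC = auto()   # virtual node; does not refer to an entity on the content source
    DYNAMIC = auto()  # represents the result set of a search operation

@dataclass
class TreeNode:
    name: str                       # e.g. "MUSIC_ARTIST_STATIC", displayed as "Artists"
    kind: NodeKind
    display_name: Optional[str] = None
    query: Optional[str] = None     # search criteria executed when the node is selected
    children: List["TreeNode"] = field(default_factory=list)

    def add_child(self, child: "TreeNode") -> "TreeNode":
        # The hierarchical tree structure is built up by adding nodes.
        self.children.append(child)
        return child

# Top-level static node "Artists" with a dynamic child node for per-artist results.
artists = TreeNode("MUSIC_ARTIST_STATIC", NodeKind.STATIC, display_name="Artists",
                   query='upnp:class derivedfrom "object.container.person.musicArtist"')
artists.add_child(TreeNode("MUSIC_ARTISTS_DYNAMIC", NodeKind.DYNAMIC))
```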
Queries corresponding to static nodes are static queries, meaning that they are not based on a previously executed query. Queries corresponding to dynamic nodes are dynamic queries, meaning that they are based on a selected search result of a previously executed query. For example, when the user navigates to the static node “Artists”, a static query for all “Artists” is executed. The visual representations of matching artists (such as “Bon Jovi”, “Nina Simone” and “Patti Austin”) are displayed as the results of the static query, and these results correspond to a dynamic node. The dynamic node is associated with a dynamic query that is based on selected search results that correspond to the dynamic node.
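The two-step pattern just described might be sketched as follows. Here execute_query is a hypothetical stand-in for whatever search functionality the content source exposes, and the helper name and signature are assumptions made for illustration only.

```python
from typing import Callable, List

def guided_browse_step(execute_query: Callable[[str], List[str]],
                       static_query: str,
                       dynamic_query_for: Callable[[str], str]) -> List[str]:
    """Sketch of the static-then-dynamic pattern: the static query is fixed in the
    tree, while the dynamic query is built from whichever result the user selects."""
    results = execute_query(static_query)   # e.g. all "Artists" on the content source
    if not results:
        return []
    selected = results[0]                   # stand-in for a user selection in the UI
    # The dynamic query is based on the selected result of the previous query.
    return execute_query(dynamic_query_for(selected))
```

For instance, dynamic_query_for might turn a selected artist name into a per-artist search, mirroring the “Artists” example above.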
In the example shown in
A tree node also supports sorting, and different sort criteria can be specified for each node. For example, objects represented by a tree node can be sorted by name, by date, or left in their original order. The hierarchical tree structure is generated by adding nodes. Thus, sort criteria can be specified for at least one query in the hierarchical tree structure, such that for each query having specified sort criteria, the search results obtained by executing the query are sorted in accordance with the respective criteria. An existing hierarchical tree structure is configurable by adding, removing, or replacing nodes.
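As one way to picture per-node sort criteria, the sketch below assumes, for illustration only, that each search result carries a name and a date; SortBy and sort_results are illustrative names and not part of the disclosure.

```python
from enum import Enum, auto

class SortBy(Enum):
    NAME = auto()      # sort results by object name
    DATE = auto()      # sort results by object date
    ORIGINAL = auto()  # keep the original order returned by the content source

def sort_results(results, sort_by=SortBy.ORIGINAL):
    """Each result is assumed (for illustration) to be a dict with 'name' and
    'date' keys; when no criterion applies, the original order is preserved."""
    if sort_by is SortBy.NAME:
        return sorted(results, key=lambda r: r["name"])
    if sort_by is SortBy.DATE:
        return sorted(results, key=lambda r: r["date"])
    return list(results)
```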
The static node “Artists” represents a container object. If the user selects the visual representation for the static node “Artists” via the user interface presented by the presentation layer module 401, the guided browse function 404 executes the following static query to search for all “Artists” of the content source: upnp:class derivedfrom "object.container.person.musicArtist". As indicated in this example, the guided browse function 404 searches for a class derived from an object container for music artists. One of ordinary skill will recognize other searches, such as searches by genre or album. As mentioned above, the search may use the UPnP and/or DLNA protocol, or another type of protocol. The guided browse function 404 returns visual representations for the artists “Bon Jovi”, “Nina Simone”, “Patti Austin” and “[Unknown Artist]” as results to the presentation layer module 401. The results “Bon Jovi”, “Nina Simone”, “Patti Austin” and “[Unknown Artist]” correspond to the dynamic node MUSIC_ARTISTS_DYNAMIC. In the example embodiment, each of these results corresponds to a container object. The dynamic node MUSIC_ARTISTS_DYNAMIC is associated with a dynamic query that is based on selected search results that correspond to the dynamic node MUSIC_ARTISTS_DYNAMIC. In the example depicted in
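The sketch below is a hedged illustration of how the queries in this example might be wired together. The GuidedBrowse class, the fake_search stand-in, and the per-artist criteria are assumptions made for illustration, not the disclosure's implementation, and the exact search-criteria syntax accepted will vary by content source.

```python
class GuidedBrowse:
    """Illustrative stand-in for the guided browse function 404."""

    def __init__(self, search):
        # 'search' stands in for the content source's search functionality,
        # e.g. a UPnP ContentDirectory search or a DLNA-type search.
        self.search = search

    def all_artists(self):
        # Static query associated with the "Artists" static node.
        return self.search('upnp:class derivedfrom "object.container.person.musicArtist"')

    def content_for_artist(self, artist_name):
        # Dynamic query built from a selected result of the static query.
        # The property test below is an assumed example of per-artist criteria.
        return self.search(f'upnp:artist = "{artist_name}"')


def fake_search(criteria):
    # Canned backend used only for this sketch; a real backend would query the content source.
    if "musicArtist" in criteria:
        return ["Bon Jovi", "Nina Simone", "Patti Austin", "[Unknown Artist]"]
    return []


browse = GuidedBrowse(fake_search)
print(browse.all_artists())                      # results for the MUSIC_ARTISTS_DYNAMIC node
print(browse.content_for_artist("Nina Simone"))  # dynamic query for a selected artist (empty in this sketch)
```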
The example embodiments described above such as, for example, the systems 100, 200, and network 101, or any part(s) or function(s) thereof, may be implemented in one or more computer systems or other processing systems. Useful machines for performing the operation of the example embodiments presented herein include general purpose digital computers or similar devices.
The computer system 1400 preferably includes without limitation a processor device 1410, a main memory 1425, and an interconnect bus 1405. The processor device 1410 may include without limitation a single microprocessor, or may include a plurality of microprocessors for configuring the computer system 1400 as a multi-processor system. The main memory 1425 stores, among other things, instructions and/or data for execution by the processor device 1410. The main memory 1425 may include banks of dynamic random access memory (DRAM), as well as cache memory.
The computer system 1400 may further include a mass storage device 1430, peripheral device(s) 1440, portable storage medium device(s) 1450, input control device(s) 1480, a graphics subsystem 1460, and/or an output display 1470. For explanatory purposes, all components in the computer system 1400 are shown in
The portable storage medium device 1450 operates in conjunction with a nonvolatile portable storage medium, such as, for example, a compact disc read only memory (CD-ROM), to input and output data and code to and from the computer system 1400. In some embodiments, the media server application may be stored on a portable storage medium, and may be inputted into the computer system 1400 via the portable storage medium device 1450. The peripheral device(s) 1440 may include any type of computer support device, such as, for example, an input/output (I/O) interface configured to add additional functionality to the computer system 1400. For example, the peripheral device(s) 1440 may include a network interface card for interfacing the computer system 1400 with a network 1420.
The input control device(s) 1480 provide a portion of the user interface for a user of the computer system 1400. The input control device(s) 1480 may include a keypad and/or a cursor control device. The keypad may be configured for inputting alphanumeric and/or other key information. The cursor control device may include, for example, a mouse, a trackball, a stylus, and/or cursor direction keys. In order to display textual and graphical information, the computer system 1400 preferably includes the graphics subsystem 1460 and the output display 1470. The output display 1470 may include a cathode ray tube (CRT) display and/or a liquid crystal display (LCD). The graphics subsystem 1460 receives textual and graphical information, and processes the information for output to the output display 1470.
Each component of the computer system 1400 may represent a broad category of a computer component of a general and/or special purpose computer. Components of the computer system 1400 are not limited to the specific implementations provided here.
Portions of the disclosure may be conveniently implemented by using a conventional general purpose computer, a specialized digital computer and/or a microprocessor programmed according to the teachings of the present disclosure, as will be apparent to those skilled in the computer art. Appropriate software coding may readily be prepared by skilled programmers based on the teachings of the present disclosure.
Some embodiments may also be implemented by the preparation of application-specific integrated circuits, field programmable gate arrays, or by interconnecting an appropriate network of conventional component circuits.
Some embodiments include a computer program product. The computer program product may be a computer-readable storage medium or media having instructions stored thereon or therein which can be used to control, or cause, a computer to perform any of the processes of the disclosure. The computer-readable storage medium may include without limitation a floppy disk, a mini disk, an optical disc, a Blu-ray Disc, a DVD, a CD-ROM, a micro-drive, a magneto-optical disk, a ROM, a RAM, an EPROM, an EEPROM, a DRAM, a VRAM, a flash memory, a flash card, a magnetic card, an optical card, nanosystems, a molecular memory integrated circuit, a RAID, remote data storage/archive/warehousing, and/or any other type of device suitable for storing instructions and/or data.
Stored on any one of the computer-readable storage medium or media, some implementations include software for controlling both the hardware of the general and/or special purpose computer or microprocessor, and for enabling the computer or microprocessor to interact with a human user or other mechanism utilizing the results of the disclosure. Such software may include without limitation device drivers, operating systems, and user applications. Ultimately, such computer-readable storage media further include software for performing aspects of the disclosure, as described above.
Included in the programming and/or software of the general and/or special purpose computer or microprocessor are software modules for implementing the processes described above.
While various example embodiments of the present disclosure have been described above, it should be understood that they have been presented by way of example, and not limitation. It will be apparent to persons skilled in the relevant art(s) that various changes in form and detail can be made therein. Thus, the present disclosure should not be limited by any of the above described example embodiments, but should be defined only in accordance with the following claims and their equivalents.
In addition, it should be understood that the figures are presented for example purposes only. The architecture of the example embodiments presented herein is sufficiently flexible and configurable, such that it may be utilized and navigated in ways other than that shown in the accompanying figures.
Further, the purpose of the Abstract is to enable the U.S. Patent and Trademark Office and the public generally, and especially the scientists, engineers and practitioners in the art who are not familiar with patent or legal terms or phraseology, to determine quickly from a cursory inspection the nature and essence of the technical disclosure of the application. The Abstract is not intended to be limiting as to the scope of the example embodiments presented herein in any way. It is also to be understood that the procedures recited in the claims need not be performed in the order presented.
This application claims the benefit of U.S. Provisional Patent Application Nos. 61/345,877, 61/345,813, and 61/346,030, all filed on May 18, 2010, the disclosure of each of which is incorporated by reference herein in its entirety.
Number | Date | Country
---|---|---
61345813 | May 2010 | US
61345877 | May 2010 | US
61346030 | May 2010 | US