The present disclosure generally relates to a user interface used for displaying media collections. More particularly, the present disclosure relates to a method for generating media collections that graphically correspond to the time of the media assets within such media collections.
When using a media device such as a media player or a tablet, it is likely that a user will have a variety of media assets (such as video, audio, pictures, and the like) that they will want to organize and play back. A user can, for example, use a playlist which provides a listing of media assets and a play order for the playback of such media assets. A playlist, however, can be cumbersome to use because most playlists exist in the form of text and do not provide a user an easy way to determine how selected media assets correlate to other media assets.
A method and apparatus are presented in which media assets are organized graphically in the form of media collection shelves. Such shelves can be used to specify a total amount of time a user wants media assets to be played and the play order of media assets. Selected media assets are modified when placed within a media collection shelf to reflect the amount of time the media asset will occupy in relation to the total amount of time designated for a media collection shelf.
These, and other aspects, features and advantages of the present disclosure will be described or become apparent from the following detailed description of the preferred embodiments, which is to be read in connection with the accompanying drawings.
In the drawings, wherein like reference numerals denote similar elements throughout the views:
The present disclosure provides several different embodiments of a user interface that is used for receiving, recording, playing back, purchasing, and the like media such as videos, television shows, movies, audio, music, video games, and the like. Such a user interface can be implemented on devices such as a computer, set top box, media server, tablet, mobile phone, personal media device, portable video game system, video game system, and so forth.
Turning now to
A second form of content is referred to as special content. Special content may include content delivered as premium viewing, pay-per-view, or other content otherwise not provided to the broadcast affiliate manager, e.g., movies, video games or other video elements. In many cases, the special content may be content requested by the user. The special content may be delivered to a content manager 110. The content manager 110 may be a service provider, such as an Internet website, affiliated, for instance, with a content provider, broadcast service, or delivery network service. The content manager 110 may also incorporate Internet content into the delivery system. The content manager 110 may deliver the content to the user's media device 108 over a separate delivery network, delivery network 2 (112). Delivery network 2 (112) may include high-speed broadband Internet type communications systems. It is important to note that the content from the broadcast affiliate manager 104 may also be delivered using all or parts of delivery network 2 (112) and content from the content manager 110 may be delivered using all or parts of delivery network 1 (106). In addition, the user may also obtain content directly from the Internet via delivery network 2 (112) without necessarily having the content managed by the content manager 110.
Several adaptations for utilizing the separately delivered content may be possible. In one possible approach, the special content is provided as an augmentation to the broadcast content, providing alternative displays, purchase and merchandising options, enhancement material, etc. In another embodiment, the special content may completely replace some programming content provided as broadcast content. Finally, the special content may be completely separate from the broadcast content, and may simply be a media alternative that the user may choose to utilize. For instance, the special content may be a library of movies that are not yet available as broadcast content.
The media device 108 may receive different types of content from one or both of delivery network 1 and delivery network 2. The media device 108 processes the content, and provides a separation of the content based on user preferences and commands. The media device 108 may also include a storage device, such as a hard drive or optical disk drive, for recording and playing back audio and video content. Further details of the operation of the media device 108 and features associated with playing back stored content will be described below in relation to
The media device 108 may also be interfaced to a second screen such as a touch screen control device 116. The touch screen control device 116 may be adapted to provide user control for the media device 108 and/or the display device 114. The touch screen device 116 may also be capable of displaying video content. The video content may be graphics entries, such as user interface entries, or may be a portion of the video content that is delivered to the display device 114. The touch screen control device 116 may interface to media device 108 using any well known signal transmission system, such as infra-red (IR) or radio frequency (RF) communications and may include standard protocols such as the infra-red data association (IRDA) standard, Wi-Fi, Bluetooth and the like, or any other proprietary protocols. Operations of touch screen control device 116 will be described in further detail below.
Optionally, media device 108 and touch screen control device 116 can be integrated into the same device. Examples of these media devices with a touch screen include computers, laptops, cell phones, personal media players, MP3 players, personal desk assistants, tablet devices, digital video recorders, and the like. For purposes of this specification, the term media device 108 can encompass all of these types of devices, along with set top boxes, digital video recorders, gateway devices, and the like.
In the example of
Turning now to
In the device 200 shown in
The decoded output signal is provided to an input stream processor 204. The input stream processor 204 performs the final signal selection and processing, and includes separation of video content from audio content for the content stream. The audio content is provided to an audio processor 206 for conversion from the received format, such as a compressed digital signal, to an analog waveform signal. The analog waveform signal is provided to an audio interface 208 and further to the display device or audio amplifier. Alternatively, the audio interface 208 may provide a digital signal to an audio output device or display device using a High-Definition Multimedia Interface (HDMI) cable or alternate audio interface such as via a Sony/Philips Digital Interconnect Format (SPDIF). The audio interface may also include amplifiers for driving one or more sets of speakers. The audio processor 206 also performs any necessary conversion for the storage of the audio signals.
The video output from the input stream processor 204 is provided to a video processor 210. The video signal may be one of several formats. The video processor 210 provides, as necessary, a conversion of the video content based on the input signal format. The video processor 210 also performs any necessary conversion for the storage of the video signals.
A storage device 212 stores audio and video content received at the input. The storage device 212 allows later retrieval and playback of the content under the control of a controller 214 and also based on commands, e.g., navigation instructions such as fast-forward (FF) and rewind (Rew), received from a user interface 216 and/or touch panel interface 222. The storage device 212 may be a hard disk drive, one or more large capacity integrated electronic memories, such as static RAM (SRAM), or dynamic RAM (DRAM), or may be an interchangeable optical disk storage system such as a compact disk (CD) drive or digital video disk (DVD) drive.
The converted video signal, from the video processor 210, either originating from the input or from the storage device 212, is provided to the display interface 218. The display interface 218 further provides the display signal to a display device of the type described above. The display interface 218 may be an analog signal interface such as red-green-blue (RGB) or may be a digital interface such as HDMI. It is to be appreciated that the display interface 218 will generate the various screens for presenting the search results in a three dimensional grid as will be described in more detail below.
The controller 214 is interconnected via a bus to several of the components of the device 200, including the input stream processor 204, audio processor 206, video processor 210, storage device 212, and a user interface 216. The controller 214 manages the conversion process for converting the input stream signal into a signal for storage on the storage device or for display. The controller 214 also manages the retrieval and playback of stored content. Furthermore, as will be described below, the controller 214 performs searching of content and the creation and adjusting of the grid display representing the content, either stored or to be delivered via the delivery networks described above.
The controller 214 is further coupled to control memory 220 (e.g., volatile or nonvolatile memory, including RAM, SRAM, DRAM, ROM, programmable ROM (PROM), flash memory, electronically programmable ROM (EPROM), electronically erasable programmable ROM (EEPROM), etc.) for storing information and instruction code for controller 214. Control memory 220 may store instructions for controller 214. Control memory may also store a database of elements, such as graphic elements containing content, various graphic elements used for generating a displayed user interface for display interface 218, and the like. Alternatively, the memory may store the graphic elements in identified or grouped memory locations and use an access or location table to identify the memory locations for the various portions of information related to the graphic elements. In addition, various graphic elements can be generated in response to computer instructions interpreted by controller 214 for output to display interface 218. Additional details related to the storage of the graphic elements will be described below. Further, the implementation of the control memory 220 may include several possible embodiments, such as a single memory device or, alternatively, more than one memory circuit communicatively connected or coupled together to form a shared or common memory. Still further, the memory may be included with other circuitry, such as portions of bus communications circuitry, in a larger circuit.
Optionally, controller 214 can be adapted to extract metadata from audio and video media by using audio processor 206 and video processor 210, respectively. That is, metadata that is contained in a video signal in the vertical blanking interval, in auxiliary data fields associated with video, or in other areas of the video signal can be harvested by using the video processor 210 with controller 214 to generate metadata that can be used for functions such as generating an electronic program guide, providing descriptive information about received video, supporting an auxiliary information service, and the like. Similarly, the audio processor 206 working with controller 214 can be adapted to recognize audio watermarks that may be in an audio signal. Such audio watermarks can then be used to perform some action such as the recognition of the audio signal, a security function which identifies the source of an audio signal, or some other service. Furthermore, metadata to support the actions listed above can come from a network source which is processed by controller 214.
Turning now to
Turning now to
Bumping 420 is defined by a two-stroke drawing indicating pointing in one direction, either up, down, left or right. The bumping gesture is associated with specific commands in context. For example, in a TimeShifting mode, a left-bump gesture 420 indicates rewinding, and a right-bump gesture indicates fast-forwarding. In other contexts, a bump gesture 420 is interpreted to increment a particular value in the direction designated by the bump. Checking 430 is defined as in drawing a checkmark. It is similar to a downward bump gesture 420. Checking is identified in context to designate a reminder, user tag or to select an item or element. Circling 440 is defined as drawing a circle in either direction. It is possible that both directions could be distinguished. However, to avoid confusion, a circle is identified as a single command regardless of direction. Dragging 450 is defined as an angular movement of the controller (a change in pitch and/or yaw) while pressing a button (virtual or physical) on the tablet 300 (i.e., a “trigger drag”). The dragging gesture 450 may be used for navigation, speed, distance, time-shifting, rewinding, and forwarding. Dragging 450 can be used to move a cursor, a virtual cursor, or a change of state, such as highlighting, outlining, or selecting on the display. Dragging 450 can be in any direction and is generally used to navigate in two dimensions. However, in certain interfaces, it is preferred to modify the response to the dragging command. For example, in some interfaces, operation in one dimension or direction is favored with respect to other dimensions or directions depending upon the position of the virtual cursor or the direction of movement. Nodding 460 is defined by two fast trigger-drag up-and-down vertical movements. Nodding 460 is used to indicate “Yes” or “Accept.” X-ing 470 is defined as in drawing the letter “X.” X-ing 470 is used for “Delete” or “Block” commands.
Wagging 480 is defined by two trigger-drag fast back-and-forth horizontal movements. The wagging gesture 480 is used to indicate “No” or “Cancel.”
Depending on the complexity of the sensor system, only simple one dimensional motions or gestures may be allowed. For instance, a simple right or left movement on the sensor as shown here may produce a fast forward or rewind function. In addition, multiple sensors could be included and placed at different locations on the touch screen. For instance, a horizontal sensor for left and right movement may be placed in one spot and used for volume up and down, while a vertical sensor for up and down movement may be placed in a different spot and used for channel up and down. In this way specific gesture mappings may be used. As discussed in further detail below, a two finger swipe gesture may be utilized to initiate the throwing or moving of content from the tablet 300 to the main screen or display device 114.
Label 1060 indicates that user interface 1010 is related to “collections”. Menu 1070 can be used to return to previous collection views as shown for user interfaces 500, 600, 700, 800, and 900. The activation of add collection button 1085 creates a new media collection shelf to be populated, where the shelf can be named with a user designated label. Edit collection button 1080 lets a user change the order of media collection shelves, whereby a user can select and drag a media collection shelf to a new position.
Activation of an edit button such as 1025 or 1035 brings up an interface as shown in
The activation of menu 1070 for user interface 1100 will let a user select media assets from different media collections as shown in user interfaces 500, 600, 700, 800, and 900 when constructing a media collection shelf. The activation of 1090 indicates that a user is done constructing a media collection shelf, while the activation of cancellation button 1095 cancels any of the modifications made to a media collection shelf. Label 1060 indicates that the presently created media collection shelf is called “trip”.
In the description of TABLE I, number of asset represents the number shown in the figures of a described media asset. Type of asset corresponds to the category of a media asset being audio, video, program, and the like. Title represents the name of a media asset. Location is the physical location of a media asset which can be local memory storage, remote memory storage, a server, a video on demand service, a cloud storage service, a streaming media service, and the like. Notably, TABLE I accommodates media in different locations as long as there is information that references the location of a media asset. Length is the length of time in minutes and seconds of a media asset. Such a value is known as the time length of a media asset as denoted by a value Ta. The actor field lists the actors performing in a media asset. Optionally, when a media asset is audio, the actor field represents the name of the band or performer responsible for the audio. The director field represents the director of a media asset. The ratings field represents a critical rating (0-5) that a media asset has received from a critic or is an averaged rating score from a source such as RottenTomatoes.com. The information of TABLE I can be generated from metadata that comes with a media asset, from electronic program guide information, from a local database, from a remote database, a metadata descriptive service, a combination of these sources, and the like.
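The fields described for TABLE I can be modeled as a simple record. The following is an illustrative sketch only; the field names, types, and example values are assumptions drawn from the field descriptions above, not part of the disclosed implementation.

```python
from dataclasses import dataclass, field

@dataclass
class MediaAsset:
    """One entry of TABLE I describing a media asset (field names are illustrative)."""
    number: int            # number of the asset as shown in the figures
    asset_type: str        # category: "audio", "video", "program", etc.
    title: str             # name of the media asset
    location: str          # local storage, remote server, VOD or streaming reference
    length_seconds: int    # time length of the asset, Ta, kept here in seconds
    actors: list = field(default_factory=list)  # performers, or band name for audio
    director: str = ""     # director of the media asset
    rating: float = 0.0    # critical rating on a 0-5 scale

# Hypothetical example entry: a 90-minute video asset.
asset = MediaAsset(1, "video", "Example Movie", "local:/media/example.mp4",
                   90 * 60, ["Actor A"], "Director B", 4.0)
```

Because the location field is only a reference, entries for local files, cloud storage, or streaming services can coexist in the same table, matching the note above that TABLE I accommodates media in different locations.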
In
Step 2610 represents the generation of a media collection shelf where the horizontal length of the shelf is proportional to a time duration value (Ts) selected by a user. As described previously, a user can specify that a media collection shelf have a time duration value that is the total amount of time that is afforded to assets that will eventually occupy the media collection shelf. For example, a user can specify that a media collection shelf be an hour, whereby media assets that are eventually selected for the media selection shelf should not exceed an hour in cumulative time duration (Tc), where Tc is equal to the total time value of all of the previously selected media assets [Tc=Ta(1)+Ta(2)+Ta(3)+ . . . +Ta(n), n=total number of assets selected].
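The cumulative-time bookkeeping of step 2610 can be sketched as follows. This is a minimal illustration of the Tc and Ts relationship described above; the function names are assumptions.

```python
def cumulative_time(asset_lengths):
    """Tc = Ta(1) + Ta(2) + ... + Ta(n): total time of all selected assets."""
    return sum(asset_lengths)

def fits_on_shelf(asset_lengths, shelf_duration):
    """True when the cumulative time Tc does not exceed the shelf duration Ts."""
    return cumulative_time(asset_lengths) <= shelf_duration

# A one-hour shelf (Ts = 3600 s) holding assets of 20, 25, and 10 minutes.
lengths = [20 * 60, 25 * 60, 10 * 60]
print(cumulative_time(lengths))       # 3300 (55 minutes, Tc)
print(fits_on_shelf(lengths, 3600))   # True: Tc <= Ts
```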
Step 2615 will populate a media collection shelf with a graphical element representing a selected media asset, when such an asset is selected by a user. A graphical element representing an audio, video, application, or other type of media asset can be dragged and dropped into the area afforded to a media collection shelf. The positioning of the graphical element in the media collection shelf represents that a media asset can be associated with the media collection shelf and/or the media asset will be played back when the media collection shelf is activated for playback.
Determining whether a time duration of a selected media asset exceeds the total time duration left for a media collection shelf is calculated in step 2620. As explained above for step 2610, a time duration value is associated with a media collection shelf. When media assets are added to a media collection shelf, a device can calculate whether the time duration of a media asset plus the cumulative time of previously selected media assets exceeds the time duration of a media collection shelf (Ta+Tc>Ts). If this statement is true, a message can be generated to indicate that the time of the selected media asset exceeds the remaining time left for a media collection shelf (as shown in
If this calculation is not true, then step 2625 takes place where the corresponding graphical element is scaled in the horizontal direction in proportion to the time duration associated with the media collection shelf. This scaling operation can take the graphical element associated with a media asset and cause the element to occupy (lengthwise) a certain proportion of the media collection shelf. One approach for calculating this value takes Ta/Ts and multiplies this value by the length of the media collection shelf. The selected graphical element is then scaled to this calculated length in accordance with the principles listed herein. Other approaches are also implementable in accordance with the principles described herein.
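The Ta/Ts scaling approach named in step 2625 can be sketched as below; the pixel width and function name are hypothetical, and as the text notes, other scaling approaches are equally possible.

```python
def scaled_width(asset_length, shelf_duration, shelf_pixels):
    """Scale a graphical element to (Ta / Ts) of the shelf's on-screen length."""
    return round((asset_length / shelf_duration) * shelf_pixels)

# A 15-minute asset on a one-hour shelf drawn 800 pixels wide
# occupies one quarter of the shelf: 200 pixels.
print(scaled_width(15 * 60, 60 * 60, 800))  # 200
```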
Step 2630 will determine if the media collection shelf can accommodate an additional selected media asset because the duration value of the shelf is not exceeded by the cumulative duration of time of previously selected media assets. This is similar to the calculation performed in step 2620, where Tc is updated with the Ta of a selected media asset (Tc=Tc (previous time value)+Ta). If Tc<Ts, then steps 2620-2630 can be repeated until Tc=Ts. Of course this repetition in steps 2620-2630 does not need to take place, where a user can opt to have step 2635 performed directly. Optionally, a user can request that the user interface display all of the media assets with a time Ta that is less than Ts−Tc. The result of this request is shown in
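The optional filtering of step 2630, which shows only assets whose time Ta is less than the remaining shelf time Ts−Tc, can be sketched as follows. The function name and the candidate list are illustrative assumptions.

```python
def assets_that_fit(candidates, shelf_duration, selected_lengths):
    """Return (title, Ta) pairs whose Ta is less than the remaining time Ts - Tc."""
    remaining = shelf_duration - sum(selected_lengths)  # Ts - Tc
    return [(title, ta) for title, ta in candidates if ta < remaining]

# A one-hour shelf with 50 minutes already selected leaves 10 minutes of room,
# so only the two short assets qualify.
candidates = [("Short Clip", 8 * 60), ("Feature", 90 * 60), ("Song", 4 * 60)]
print(assets_that_fit(candidates, 3600, [50 * 60]))
# [('Short Clip', 480), ('Song', 240)]
```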
If Tc=Ts or a user opts to skip the repetition of steps 2620-2630, step 2635 can take place where all of the selected media assets whose graphical elements populate a media collection shelf can be played back. The playback of media assets can be performed in accordance with the principles described throughout this specification, where such media assets can be played from remote or local sources using the components as represented in
It should be understood that the elements shown in the FIGS. may be implemented in various forms of hardware, software or combinations thereof. Preferably, these elements are implemented in a combination of hardware and software on one or more appropriately programmed general-purpose devices, which may include a processor, memory and input/output interfaces.
The present description illustrates the principles of the present disclosure. It will thus be appreciated that those skilled in the art will be able to devise various arrangements that, although not explicitly described or shown herein, embody the principles of the disclosure and are included within its scope.
All examples and conditional language recited herein are intended for informational purposes to aid the reader in understanding the principles of the disclosure and the concepts contributed by the inventor to furthering the art, and are to be construed as being without limitation to such specifically recited examples and conditions.
Moreover, all statements herein reciting principles, aspects, and embodiments of the disclosure, as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, it is intended that such equivalents include both currently known equivalents as well as equivalents developed in the future, i.e., any elements developed that perform the same function, regardless of structure.
Thus, for example, it will be appreciated by those skilled in the art that the block diagrams presented herein represent conceptual views of illustrative circuitry embodying the principles of the disclosure. Similarly, it will be appreciated that any flow charts, flow diagrams, state transition diagrams, pseudocode, and the like represent various processes which may be substantially represented in computer readable media and so executed by a computer or processor, whether or not such computer or processor is explicitly shown.
The functions of the various elements shown in the figures may be provided through the use of dedicated hardware as well as hardware capable of executing software in association with appropriate software. When provided by a processor, the functions may be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which may be shared. Moreover, explicit use of the term “processor” or “controller” should not be construed to refer exclusively to hardware capable of executing software, and may implicitly include, without limitation, digital signal processor (“DSP”) hardware, read only memory (“ROM”) for storing software, random access memory (“RAM”), and nonvolatile storage.
Other hardware, conventional and/or custom, may also be included. Similarly, any switches shown in the figures are conceptual only. Their function may be carried out through the operation of program logic, through dedicated logic, through the interaction of program control and dedicated logic, or even manually, the particular technique being selectable by the implementer as more specifically understood from the context.
Although embodiments which incorporate the teachings of the present disclosure have been shown and described in detail herein, those skilled in the art can readily devise many other varied embodiments that still incorporate these teachings. It is noted that modifications and variations can be made by persons skilled in the art in light of the above teachings.
This application claims the benefit of U.S. Provisional Application Ser. No. 61/426,509 filed Dec. 22, 2010 and U.S. Provisional Application Ser. No. 61/429,741 filed on Jan. 4, 2011, which are incorporated by reference herein in their entirety.
Filing Document | Filing Date | Country | Kind | 371c Date |
---|---|---|---|---|
PCT/US11/65999 | 12/20/2011 | WO | 00 | 2/18/2014 |
Number | Date | Country | |
---|---|---|---|
61426509 | Dec 2010 | US | |
61429741 | Jan 2011 | US |