The media processing device 100 also includes a storage device 110 that can be configured to store information including media, configuration data, and operating instructions. The storage device 110 can be any type of non-volatile storage, including a hard disk device or a solid-state drive. For example, media received from an external media server can be stored on the storage device 110. The received media thus can be locally accessed and processed. Further, configuration information, such as the resolution of a coupled display device or information identifying an associated media server, can be stored on the storage device 110. Additionally, the storage device 110 can include operating instructions executed by the processor 105 to control operation of the media processing device 100. In one implementation, the storage device 110 can be divided into a plurality of partitions, wherein each partition can be utilized to store one or more types of information and can have custom access control provisions.
A communication bus 115 couples the processor 105 to the other components and interfaces included in the media processing device 100. The communication bus 115 can be configured to permit unidirectional and/or bidirectional communication between the components and interfaces. For example, the processor 105 can retrieve information from and transmit information to the storage device 110 over the communication bus 115. In an implementation, the communication bus 115 may comprise a plurality of busses, each of which couples at least one component or interface of the media processing device 100 with another component or interface.
The media processing device 100 also includes a plurality of input and output interfaces for communicating with other devices, including media servers and presentation devices. A wired network interface 120 and a wireless network interface 125 each can be configured to permit the media processing device 100 to transmit and receive information over a network, such as a local area network (LAN) or the Internet. Additionally, an input interface 130 can be configured to receive input from another device through a direct connection, such as a USB or an IEEE 1394 connection. Other types of input interfaces may also be implemented to receive a user input. For example, an input interface may use touch-based operations, near-contact operations, or combinations thereof to receive input. For example, an input interface (e.g., a remote control device) may include a proximity detection mechanism that can sense the presence of an input (e.g., a user's finger). As such, a remote control device may sense an input absent user contact with a surface of the remote control device. In some implementations, a user may use a keyboard and virtually any suitable pointing device (e.g., mouse, trackball, stylus, touch screen, etc.) for interaction. The pointing device can also be operated through a near-contact screen that employs a regional sensing field to detect objects in proximity.
Further, an output interface 135 can be configured to couple the media processing device 100 to one or more external devices, including a television, a monitor, an audio receiver, and one or more speakers. For example, the output interface 135 can include one or more of an optical audio interface, an RCA connector interface, a component video interface, and a High-Definition Multimedia Interface (HDMI). The output interface 135 also can be configured to provide one signal, such as an audio stream, to a first device and another signal, such as a video stream, to a second device. Further, a memory 140, such as a random access memory (RAM) and/or a read-only memory (ROM) also can be included in the media processing device 100. As with the storage device 110, a plurality of types of information, including configuration data and operating instructions, can be stored in the memory 140.
Additionally, the media processing device 100 can include a remote control interface 145 that can be configured to receive commands from one or more remote control devices (not pictured). The remote control interface 145 can receive the commands through wireless signals, such as infrared and radio frequency signals. The received commands can be utilized, such as by the processor 105, to control media playback or to configure the media processing device 100. Similar to the input interface mentioned above, the remote control interface may receive commands from remote control devices that implement touch-based operations, near-contact operations or combinations thereof.
Further, the media processing device 205 and the local media server 215 can include network connections 235 and 240 respectively, which provide access to a network 245, such as the Internet. In one implementation, the media processing device 205 can communicate with a remote media server 250 and/or a media store 255 over the network 245. For example, a connection can be established between the media processing device 205 and the remote media server 250. The connection can be secure or un-secure. Thereafter, the media processing device 205 can receive media content from the remote media server 250, such as by streaming or downloading.
Similarly, the media processing device 205 can be configured to receive media content from a media store 255. For example, upon establishing a connection, the media processing device 205 can request a list of available media content from the media store 255. The list of available media content can include free content, such as trailers and podcasts, and for-purchase content, such as movies, television programs, and music. Additionally, the media processing device 205 can be configured to communicate with the media store 255 to validate media content, such as by verifying digital rights management information. Other types of media devices and systems may also be used.
The media data and related metadata may be provided by a single provider, or may be provided by separate providers. In one implementation, the media processing system 300 can be configured to receive media data from a first provider over a first network, such as a cable network, and receive metadata related to the video data from a second provider over a second network, such as a wide area network (WAN). Example media data include video data, audio data, content payload data, or other data conveying audio, textual and/or video data.
In another implementation, the media processing system 300 can be configured to receive media data and metadata from a computing device, such as a personal computer. In one example of this implementation, a user manages one or more media access accounts with one or more content providers through the personal computer. For example, a user may manage a personal iTunes® account with iTunes® software, available from Apple Computer, Inc. Media data, such as audio and video media data, can be purchased by the user and stored on the user's personal computer and/or one or more data stores. The media data and metadata stored on the personal computer and/or the one or more data stores can be selectively pushed and/or pulled for storage in the data store 302 of the media processing system 300.
In another implementation, the media processing system 300 can be used to process media data stored in several data stores in communication with a network, such as wired and/or wireless local area network (LAN), for example. In one implementation, the media processing system 300 can pull and/or receive pushed media data and metadata from the data stores over the network for presentation to a user. For example, the media processing system 300 may be implemented as part of an audio and video entertainment center having a video display device and an audio output device, and can pull media data and receive pushed media data from one or more data stores for storage and processing. At the entertainment center, a user can, for example, view photographs that are stored on a first computer while listening to music files that are stored on a second computer.
In one implementation, the media processing system 300 includes a remote control device 308. The remote control device 308 can include a rotational input device 310 configured to sense touch actuations and generate remote control signals therefrom. The touch actuations can include rotational actuations, such as when a user touches the rotational input device 310 with a digit and rotates the digit on the surface of the rotational input device 310. The touch actuations can also include click actuations, such as when a user presses on the rotational input device 310 with enough pressure to cause the remote control device 308 to sense a click actuation.
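The actuation sensing described above can be sketched as a simple classifier that distinguishes click actuations from rotational actuations. This is an illustrative sketch only: the (x, y, pressure) sample format, the pressure threshold, and the function name are assumptions, not details of the described device, which would report hardware-specific signals.

```python
import math

def classify_actuation(events, click_pressure=0.8):
    """Classify touch samples from a circular input surface.

    Each event is (x, y, pressure) relative to the pad center.
    Returns ("click", None) or ("rotate", total_angle_radians).
    """
    # A press hard enough to exceed the threshold is a click actuation.
    if any(p >= click_pressure for _, _, p in events):
        return ("click", None)
    # Otherwise accumulate angular travel of the digit around the center.
    angles = [math.atan2(y, x) for x, y, _ in events]
    total = 0.0
    for a0, a1 in zip(angles, angles[1:]):
        d = a1 - a0
        # Unwrap across the -pi/pi boundary so full turns accumulate.
        if d > math.pi:
            d -= 2 * math.pi
        elif d < -math.pi:
            d += 2 * math.pi
        total += d
    return ("rotate", total)
```

A quarter-turn of the digit yields a rotation of about π/2 radians, which the remote control device could then encode as a remote control signal.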
In one implementation, the functionality of the media processing system 300 is distributed across several engines. For example, the media processing system 300 may include a controller engine 312, a user interface (UI) engine 314, and one or more media engines 316-1, 316-2, and 316-n. The engines may be implemented in software as software modules or instructions, or may be implemented in hardware, or in a combination of software and hardware.
The controller engine 312 is configured to communicate with the remote control device 308 by a link, such as a wireless infrared signal or radio frequency signal. The remote control device 308 can transmit remote control signals generated, for example, from touch actuations of the rotational input device 310 to the controller engine 312 over the link. The controller engine 312 is configured to receive the remote control signals and generate control signals in response. The control signals are provided to the processing device 304 for processing.
The control signals generated by the controller engine 312 and processed by the processing device 304 can invoke one or more of the UI engine 314 and the media engines 316-1-316-n. In one implementation, the UI engine 314 manages a user interface to facilitate data presentation for the media engines 316-1-316-n and functional processing in response to user inputs.
In one implementation, the media engines 316 can include one or more content-specific engines, such as a movies engine, television program engine, music engine, and the like. Each engine 316 can be instantiated to support content-specific functional processing. For example, a movie engine to support movie-related functions can be instantiated by selecting a “Movies” menu item. Example movie-related functions include purchasing movies, viewing movie previews, viewing movies stored in a user library, and the like. Likewise, a music engine to support music-related functions can be instantiated by selecting a “Music” menu item. Example music-related functions include purchasing music, viewing music playlists, playing music stored in a user library, and the like.
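The menu-driven instantiation of content-specific engines described above can be sketched as a small registry that maps a menu item to an engine class. The class names, registry, and function lists are illustrative assumptions; the actual engines 316 could be implemented in hardware, software, or both.

```python
class MediaEngine:
    """Base for content-specific engines; names are illustrative."""
    def __init__(self, name):
        self.name = name

    def functions(self):
        return []

class MoviesEngine(MediaEngine):
    """Supports movie-related functional processing."""
    def functions(self):
        return ["purchase", "preview", "view library"]

class MusicEngine(MediaEngine):
    """Supports music-related functional processing."""
    def functions(self):
        return ["purchase", "playlists", "play library"]

# Selecting a menu item instantiates the corresponding engine.
ENGINE_REGISTRY = {"Movies": MoviesEngine, "Music": MusicEngine}

def instantiate_engine(menu_item):
    return ENGINE_REGISTRY[menu_item](menu_item)
```

Selecting the "Movies" menu item, for example, would instantiate a `MoviesEngine` whose functions include previewing and library viewing.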
The media processing system 300 of
The rotational input device areas 360, 362, 364, 366 and 368 are receptive to press actuations. In one implementation, the areas include a menu area 360, a reverse/previous area 362, a play/pause area 364, a forward/next area 366, and a select area 368. The areas 360-368, in addition to generating signals related to their descriptive functionalities, can also generate signals for context-dependent functionality. For example, the menu area 360 can generate signals to support the functionality of dismissing an onscreen user interface, and the play/pause area 364 can generate signals to support the function of drilling down into a hierarchical user interface. In one implementation, the areas 360-368 comprise buttons disposed beneath the surface of the rotational input device 310. In another implementation, the areas 360-368 comprise pressure sensitive actuators disposed beneath the surface of the rotational input device 310.
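The context-dependent behavior of the press areas can be sketched as a lookup with per-context overrides. The area keys, action names, and context label are illustrative assumptions used to show the dispatch pattern, not identifiers from the described device.

```python
# Default actions for each press area of the rotational input device.
DEFAULT_ACTIONS = {
    "menu": "show_menu",
    "reverse": "previous",
    "play_pause": "toggle_playback",
    "forward": "next",
    "select": "select",
}

# While an onscreen user interface is shown, the menu area dismisses it
# and the play/pause area drills down into the hierarchical interface.
CONTEXT_OVERRIDES = {
    "onscreen_ui": {"menu": "dismiss_ui", "play_pause": "drill_down"},
}

def resolve_action(area, context=None):
    """Return the action for a press area, honoring context overrides."""
    overrides = CONTEXT_OVERRIDES.get(context, {})
    return overrides.get(area, DEFAULT_ACTIONS[area])
```

Areas without an override keep their descriptive function in every context, which matches the behavior the passage describes.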
The processing device 350 is configured to receive the signals generated by the rotational input device 310 and generate corresponding remote control signals in response. The remote control signals can be provided to the communication subsystem 352, which can wirelessly transmit the remote control signals to the media processing system 300.
Although shown as comprising a circular surface, in another implementation, the rotational input device 310 can comprise a rectangular surface, a square surface, or some other shaped surface. Other surface geometries that accommodate pressure sensitive areas and that can sense touch actuations may also be used, e.g., an oblong area, an octagonal area, etc.
Other actuation area configurations may also be used. For example, in another implementation, the remote control device 308 can also include a separate actuation button 370. In this implementation, the areas comprise a “+” or increase area 360, a reverse/previous area 362, a “−” or decrease area 364, a forward/next area 366, a play/pause area 368, and a menu area 370.
The media data can be received through the network 412 by one of the computing devices, such as computing device 408. The network 412 can include one or more wired and wireless networks, such as the Internet. The media data is provided by one or more content providers 414. For example, the content provider 414-1 may provide media data that is processed by the media processing system 300 and output through the output devices 404, and the content provider 414-2 may provide metadata related to the media data for processing by the media processing system 300. Such metadata may include episodic content, artist information, and the like. A content provider 414 can also provide both media data and related metadata.
In one implementation, the media processing system 300 can also communicate with one or more content providers 414 directly. For example, the media processing system 300 can communicate with the content providers 414 through the wireless network 402, the I/O device 403, and the network 412. The media processing system 300 can also communicate with the content providers 414 through other network configurations, e.g., through a direct connection to a cable modem, through a router, or through one or more other communication devices. Example communications can include receiving sales information, preview information, or communications related to commercial transactions, such as purchasing audio files and video files.
In another implementation, the media processing system 300 can receive content from any of the computing devices 406 and 408, and other such computing devices or data stores 410 available on the network 402 through sharing. Thus, if any one or more of the computing devices or data stores are unavailable, media data and/or metadata on the remaining computing devices or other such computing devices or data stores can still be accessed.
In some implementations, the media items 502-512 can include digital representations of photographs, video clips, movies, promotional media, or combinations thereof. In some implementations, the media items 502-512 can be retrieved from among media items stored in the data store 302 of
Moreover, once an instance of a media item 502-512 transitions through and exits from the display environment 500, a new media item can be retrieved from the data store 302 to replace the exiting media item 502-512. An instance for the new media item can be generated and transitioned on and through the display environment 500. In some example implementations, the number of instances of media items 502-512 in the display environment 500 can be variable. For example, the number of instances of media items 502-512 can vary based upon user preferences (e.g., input through a user interface engine). Alternatively, the number of instances of media items 502-512 can vary quasi-randomly. Furthermore, in some examples, it is not necessary that an instance exit the display environment 500 before another instance of a media item enters the display environment 500. In these display environments, the instances of media items can be randomly transitioned into the display environment 500. A pre-set limit (e.g., user defined, program defined, etc.) can define an upper limit to the number of instances displayed within the display environment at a single time, the number of instances that appear on a single path, etc.
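The replace-on-exit behavior with an upper instance limit can be sketched as a per-frame step. The (item, progress) pair model, where progress of 1.0 or more means the instance has transitioned out, is an illustrative assumption, as are the function and parameter names.

```python
import random

def step_display(active, data_store, max_instances=8, rng=random):
    """Advance one frame: drop instances that exited, refill to the limit.

    `active` holds (item, progress) pairs; progress >= 1.0 means the
    instance has transitioned through and out of the display environment.
    """
    # Remove instances that have exited the display environment.
    remaining = [(item, p) for item, p in active if p < 1.0]
    # Quasi-randomly retrieve replacements, up to the pre-set limit.
    while len(remaining) < max_instances and data_store:
        new_item = rng.choice(data_store)
        remaining.append((new_item, 0.0))  # enters at the start of a path
    return remaining
```

Calling this once per frame keeps the display environment populated at the pre-set limit without requiring one instance to exit before another enters.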
In some implementations, the instances of the media items can be quasi-randomly scaled to provide the appearance of depth to the display environment. The scaling of the instances of the media items can be maintained while the instances of media items are displayed in the display environment. Instances of new media items can be quasi-randomly scaled prior to being transitioned into the display environment 500. Further, in some implementations, media items can be scaled as they transition along a path, e.g., changing scale as they change position along the path.
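The quasi-random, persistent scaling described above can be sketched as a single draw made before an instance enters the display environment. The near/far scale bounds and the function name are illustrative assumptions.

```python
import random

def assign_scale(rng=None, near=1.0, far=0.4):
    """Pick a persistent quasi-random scale for a new media item instance.

    Smaller scales read as farther away, giving the display environment
    an appearance of depth. The scale is chosen once, before the instance
    transitions in, and is then maintained while the instance is shown.
    """
    rng = rng or random.Random()
    return far + (near - far) * rng.random()
```

A variant could instead re-evaluate the scale as the instance moves along its path, matching the implementations where items change scale with position.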
In further implementations, the instances of media items may be selectable by the user. For example, the user can select media item instance 510 by clicking on the instance using an input device (e.g., a mouse pointer representation). Media item instance 510 can then be enlarged, thereby providing a better view of the selected media item, while maintaining the transitioning of the media item instances through the display environment.
As discussed above, instances of the media items 602-612 can be generated and scaled for display in a display environment 600. In some implementations, the scaling of the instances can be quasi-random, to provide the appearance of depth to the display environment 600. Moreover, as various instances of the media items 602-612 transition out of the display environment 600, replacement media items can be selected to replace any of the media items transitioning out of the display environment 600. In some example implementations, the number of instances of media items 602-612 in the display environment 600 can be variable. For example, the number of instances of media items 602-612 can vary based upon user preferences (e.g., input through a user interface engine). Alternatively, the number of instances of media items 602-612 can vary quasi-randomly. Furthermore, in some examples, it is not necessary that an instance exit the display environment 600 before another instance of a media item enters the display environment 600. In these display environments, the instances of media items can be randomly transitioned into the display environment 600. A pre-set limit (e.g., user defined, program defined, etc.) can define an upper limit to the number of instances displayed within the display environment at a single time.
As discussed above, instances of the media items 702-710 can be generated and scaled for display in a display environment 700. Moreover, as various instances of the media items 702-710 transition out of the display environment 700, replacement media items can be selected to replace any of the media items transitioning out of the display environment 700.
In some example implementations, transitioning the instances of media items forward in the display environment can include intermittent pauses. The intermittent pauses can allow the user to view the instances of media items in a foreground position for a period of time before continuing to transition a next layer of media item instances forward to the foreground position.
For example, in some implementations there can be several layers of instances of digital photos. The first layer can correspond, for example, to a foreground position, while the second and third through n-th layers can correspond, for example, to increasingly smaller scale photographs. The transitioning would pause for a period of time having the first layer in the foreground position. After expiration of a period of time, the first layer can transition out of the display environment and the second layer can transition into a foreground position, while the third through n-th layers are transitioned to the next larger scale (e.g., the original scale of the second layer digital photos). The display environment can display the second layer in the foreground position for a period of time, and then continue to successively transition the third through n-th layers through the foreground position, with foreground pauses for each successive layer. Moreover, in implementations where there are numerous layers, for example, the n-th layer might not initially appear in the display environment. In various implementations, the display environment can be limited by the programmer or the user to displaying four levels of images. In these implementations, each successive transition of a layer out of the display environment can be complemented by a new layer transitioning into the display environment.
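The layered transition described above, where the front layer exits and each remaining layer steps up one position with a new layer entering at the back, can be sketched as follows. The layer labels, the placeholder for a newly selected layer, and the four-layer cap are illustrative assumptions; the foreground pause is left to the caller's timer.

```python
def advance_layers(layers, max_visible=4):
    """One foreground transition: the front layer exits, the rest move up.

    `layers` is ordered front (largest scale) to back. A new layer enters
    at the back when one leaves, keeping at most `max_visible` on screen.
    The dwell time in the foreground between calls is the caller's timer.
    """
    front, rest = layers[0], layers[1:]
    new_layer = f"new-after-{front}"  # placeholder for a freshly selected layer
    # Remaining layers each take the next larger scale position; a new
    # layer transitions in at the back, capped at max_visible levels.
    return (rest + [new_layer])[:max_visible]
```

Repeated calls cycle the second through n-th layers through the foreground position, with each exit complemented by a new layer transitioning in.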
As discussed above, instances of the media items 802-812 can be generated and scaled for display in a display environment 800. In some implementations, the scaling of the instances can be quasi-random, to provide the appearance of depth to the display environment 800. Moreover, the height or amplitude and/or the length associated with the bouncing movement can be quasi-random, or controlled by input received from the user. Further, as various instances of the media items 802-812 transition out of the display environment 800, replacement media items can be selected to replace any of the media items transitioning out of the display environment 800. In various other examples, the paths of movement can be a quasi-random linear path or any other type of path.
In the previous example, the media items transition along one or more paths through a display environment; however, the media items may also exhibit other types of movement, such as rotational motion. These different types of movement may be exhibited independently of, or in combination with, transitions along one or more paths.
Rotational axes may also be oriented at other angular positions, such as angles between a vertical orientation (θ=0°) and a horizontal orientation (θ=90°), or other angles. Media items may also overlap, as illustrated by media items 902 and 904 (shown in
Rotational axis orientation may also change over a period of time. For example, over the course of a predefined time period, one or more of the axes 902-1-912-1 may change from one orientation (e.g., vertical orientation shown in
As discussed above, instances of the media items 902-912 (and 1002-1012) can be generated and scaled for display in the respective display environment 900 (and display environment 1000). In some implementations, the scaling of the instances can be quasi-random, to provide the appearance of depth to the respective display environments 900, 1000. Moreover, the rotational position and/or rotational velocity associated with the rotational motion can be quasi-random, or controlled by input received from the user. Further, as various instances of the media items 902-912 rotate within the display environment 900 (or media items 1002-1012 rotate within display environment 1000), replacement media items can be selected to replace any of the media items rotating within the respective display environments 900, 1000. In various other examples, the rotational velocity associated with one or more media items 902-912 (or 1002-1012) may vary based upon a deterministic or quasi-random angular acceleration, or in another manner.
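The quasi-random angular acceleration mentioned above can be sketched as a per-step update of each item's rotation about its axis. The time-step model, the acceleration bound, and the degree-based angle are illustrative assumptions.

```python
import random

def update_rotation(angle, velocity, dt, rng=None, max_accel=0.5):
    """Advance a media item's rotation about its axis by one time step.

    A quasi-random angular acceleration, bounded by `max_accel` (an
    illustrative parameter, in degrees/s^2), perturbs the rotational
    velocity so items spin at varying, non-uniform rates.
    """
    rng = rng or random.Random()
    accel = rng.uniform(-max_accel, max_accel)
    velocity += accel * dt
    angle = (angle + velocity * dt) % 360.0  # keep the angle in [0, 360)
    return angle, velocity
```

Setting `max_accel` to zero recovers a deterministic, constant rotational velocity, the other variant the passage allows.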
In this example, one rotational direction (e.g., clockwise) has been selected, however, in other implementations another rotational direction (e.g., counterclockwise) may be selected. Also, in this illustration, all of the media items 1106-1116 are grouped onto a single plane (i.e., common plane 1102). However, in other scenarios, the media items 1106-1116 may be grouped onto multiple planes. For example, referring to
Furthermore, in some implementations, the display environment can be configured to use any of the aforementioned transitions in combination with any other rotations and transitions. For example, the upward/downward transition shown in
In step 1204, instances for each of the selected media items are generated. For example, the instances of the selected media items can be generated by a corresponding media engine 316-1, 316-2, 316-n (e.g., a presentation engine configured to receive data and render graphics to a display device).
Optionally, in step 1206, the instances of the media items are scaled. For example, the instances of the media items can be scaled by a corresponding media engine 316-1, 316-2, 316-n (e.g., a presentation engine). The instances of the media items can be scaled such that the media items fit within a display environment. In some examples, the display environment can include, among others, a screen saver, a media presentation, a slideshow viewer, a library player, or an audio player. Scaling can also be used to give the appearance of depth or emphasis as desired. Scaling can occur prior to rendering the media item a first time or as the item is transitioned along a path in the display environment.
In step 1208, the instances of the media items can be rotated within the display environment. For example, the instances of the selected media items can be rotated by a corresponding media engine 316-1, 316-2, 316-n (e.g., a presentation engine). In some implementations, the instances of the media items may also be transitioned, for example, using sequential refreshing of the instances in slightly different locations, thereby providing the appearance to a user of movement (e.g., linear movement) along a path through the display environment.
Moreover, the instances of media items can be rotated and transitioned at different rates. Rotating and transitioning the instances of media items at different rates can add to the appearance of depth in the display environment. For example, items that are scaled larger can be rotated or transitioned at a faster rate than items that are scaled smaller. The rate of rotation and transition for a media item can be linearly proportional to the scaled size of the media item. Thus, small items can have a slower rotation or transition rate, while larger items can have a faster rotation or transition rate. Rotating and transitioning the media items at different rates can also help to prevent a larger item from covering a smaller item as both of the items appear in the display environment.
Further, paths that different media items use to transition through the display environment can intersect with each other in some examples. In some implementations, when paths of the media item instances intersect, an instance that is scaled larger can take precedence (e.g., the larger instance can be drawn on top of the smaller instance).
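The scale-proportional rates and the draw precedence at path intersections can be sketched as two small helpers. The rate constant, the (item, scale) pair representation, and the function names are illustrative assumptions.

```python
def transition_rate(scale, base_rate=2.0):
    """Per-second transition/rotation rate, linearly proportional to scale.

    Larger (nearer-appearing) items move faster; `base_rate` is an
    illustrative units-per-second constant.
    """
    return base_rate * scale

def draw_order(instances):
    """Sort instances back-to-front so larger-scaled items draw on top.

    Each instance is an (item, scale) pair; painting in ascending scale
    order gives larger instances precedence where paths intersect.
    """
    return sorted(instances, key=lambda inst: inst[1])
```

Because rate is proportional to scale, a larger item tends to pass a smaller one rather than linger in front of it, which is the covering-avoidance effect described above.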
In step 1304, instances for each of the selected media items are generated. For example, the instances of the selected media items can be generated by a corresponding media engine 316-1, 316-2, 316-n (e.g., a presentation engine).
Optionally, in step 1306, the instances of the media items are quasi-randomly scaled. For example, the instances of the media items can be quasi-randomly scaled by a corresponding media engine 316-1, 316-2, 316-n (e.g., a presentation engine). The instances of the media items can be scaled such that the media items fit within a display environment. In some examples, the display environment can include, among others, a screen saver, a media presentation, a slideshow viewer, a library player, or an audio player.
In step 1308, the media viewer rotates the instances of the media items within the display environment. For example, the instances of the selected media items can be rotated within the display environment by a corresponding media engine 316-1, 316-2, 316-n (e.g., a presentation engine). In some implementations, the instances of the media items may also be transitioned, for example, using sequential refreshing of the instances in slightly different locations, thereby providing the appearance to a user of movement (e.g., linear movement) along a path through the display environment.
In step 1310, an audio item is selected. The audio item can be selected, for example, based upon user input received through the user interface 314 of
In step 1312, the audio item is presented. For example, the audio item may be presented using an audio interface engine selected from among the media engines 316-1, 316-2, 316-n, the audio interface engine being configured to generate audio signals suitable for output to speakers based upon received data. In some implementations, the audio item can correspond to at least one of the media items from which a selection is made. For example, promotional media (e.g., album art) associated with the audio item can be mixed among the media items for selection in step 1302. Thus, the display environment can present the promotional material, alerting the user to the audio that is playing.
In step 1404, the instances of each of the selected media items are generated. For example, the instances of the selected media items can be generated by a corresponding media engine 316-1, 316-2, 316-n (e.g., a presentation engine).
Optionally, in step 1406, the instances of the media items are quasi-randomly scaled. For example, the instances of the media items can be quasi-randomly scaled by a corresponding media engine 316-1, 316-2, 316-n (e.g., a presentation engine). The instances of the media items can be quasi-randomly scaled such that the media items fit within a display environment. In some examples, the display environment can include, among others, a screen saver, a media presentation, a slideshow viewer, a library player, or an audio player.
In step 1408, the instances of the media items are rotated within the display environment. For example, the instances of the selected media items can be rotated by a corresponding media engine 316-1, 316-2, 316-n (e.g., a presentation engine). In some implementations, the instances of the media items may also be transitioned, for example, using sequential refreshing of the instances in slightly different locations, thereby providing the appearance to a user of movement (e.g., linear movement) along a path through the display environment.
Step 1410 determines whether any of the instances of the media items are terminating from the display environment. As an example, step 1410 can be performed by a corresponding media engine 316-1, 316-2, 316-n (e.g., a presentation engine). When there are no instances of the media items that are terminating from the display environment, the instances of the media items continue to be rotated within the display environment. When any of the instances of the media items are terminating from the display environment, replacement media items are selected in step 1412. Replacement media items can be quasi-randomly selected by a media selection engine from a data store 302. Alternatively, the media items can be selected based upon input received from the user through a user interface 314 of
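Steps 1408 through 1416 can be sketched as one loop: rotate the instances, detect terminating instances, and select, generate, and quasi-randomly scale replacements. The dict-based instance representation, the frame-based "life" countdown, and the numeric constants are all illustrative assumptions about details the flow diagram leaves open.

```python
import random

def run_display_cycle(instances, data_store, frames=3, rng=None):
    """Illustrative sketch of steps 1408-1416.

    Each instance is a dict with 'item', 'angle', and 'life' (frames
    remaining before the instance terminates from the display).
    """
    rng = rng or random.Random()
    for _ in range(frames):
        for inst in instances:            # step 1408: rotate instances
            inst["angle"] = (inst["angle"] + 15.0) % 360.0
            inst["life"] -= 1
        # Step 1410: keep only instances that are not terminating.
        instances = [i for i in instances if i["life"] > 0]
        # Steps 1412-1416: select, generate, and scale replacements.
        while len(instances) < 2 and data_store:
            item = rng.choice(data_store)   # quasi-random selection
            instances.append({"item": item,
                              "angle": 0.0,
                              "life": 5,
                              "scale": 0.4 + 0.6 * rng.random()})
    return instances
```

Per step 1408, the scaled replacement instances are then rotated and transitioned together with the other instances in the display environment.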
In step 1414 instances for any replacement media items are generated. For example, the instances of the replacement media items can be generated by a corresponding media engine 316-1, 316-2, 316-n (e.g., a presentation engine).
Optionally, in step 1416, the instances of any replacement media items are quasi-randomly scaled. For example, the instances of the replacement media items can be quasi-randomly scaled by a corresponding media engine 316-1, 316-2, 316-n (e.g., a presentation engine). Any instances of the replacement media items can be quasi-randomly scaled such that the media items fit within the display environment. The scaled instances of the replacement media items can be transitioned with other instances in step 1408.
The apparatus, methods, flow diagrams, and structure block diagrams described in this patent document can be implemented in computer processing systems including program code comprising program instructions that are executable by the computer processing system. Other implementations can also be used. Additionally, the flow diagrams and structure block diagrams described in this patent document, which describe particular methods and/or corresponding acts in support of steps and corresponding functions in support of disclosed structural means, can also be utilized to implement corresponding software structures and algorithms, and equivalents thereof.
The methods and systems described herein may be implemented on many different types of processing devices by program code comprising program instructions that are executable by one or more processors. The software program instructions may include source code, object code, machine code, or any other stored data that is operable to cause a processing system to perform methods described herein.
The systems and methods may be provided on many different types of computer-readable media including computer storage mechanisms (e.g., CD-ROM, diskette, RAM, flash memory, computer's hard drive, etc.) that contain instructions for use in execution by a processor to perform the methods' operations and implement the systems described herein.
The computer components, software modules, functions and data structures described herein may be connected directly or indirectly to each other in order to allow the flow of data needed for their operations. It is also noted that software instructions or a module can be implemented for example as a subroutine unit of code, or as a software function unit of code, or as an object (as in an object-oriented paradigm), or as an applet, or in a computer script language, or as another type of computer code or firmware. The software components and/or functionality may be located on a single device or distributed across multiple devices depending upon the situation at hand.
This written description sets forth the best mode of the invention and provides examples to describe the invention and to enable a person of ordinary skill in the art to make and use the invention. This written description does not limit the invention to the precise terms set forth. Thus, while the invention has been described in detail with reference to the examples set forth above, those of ordinary skill in the art can effect alterations, modifications and variations to the examples without departing from the scope of the invention.
This application is a continuation-in-part of and claims priority to U.S. patent application Ser. No. 11/530,665, filed on Sep. 11, 2006, the entire contents of which are hereby incorporated by reference.
| Number | Date | Country
---|---|---|---
Parent | 11530665 | Sep 2006 | US
Child | 11620876 | | US