This disclosure is related to media processing systems and methods.
Media devices, such as digital video and audio players, can include multiple functions and capabilities, such as playing stored content, browsing and selecting from recorded content, storing and/or receiving content selected by a user, and the like. These various functions can often be grouped according to content types, e.g., movies, music, television programs, photos, etc. The functions can then be accessed through various user interfaces that are typically arranged in a hierarchical manner, having a “root” or “home” user interface at the top of the hierarchy, from which the various context-dependent user interfaces are accessible. The user interfaces can include both graphical and textual features. It is desirable that the user interface convey information to the user in an intuitive manner and readily provide access to various functions.
Disclosed herein are systems and methods for processing a media user interface. In one example implementation, a media menu interface comprising a plurality of media menu items is generated. One or more of the media menu items can be highlighted. A plurality of media menu item abstractions corresponding to the media menu items is generated, and the media menu item abstractions are arranged so that a media menu item abstraction corresponding to a highlighted media menu item is displayed in a foreground position. The media menu item abstractions that do not correspond to the highlighted media menu item are arranged in background positions.
In another example implementation, a computer readable medium stores instructions that are executable by a processing device. Upon execution of the instructions, the processing device generates a home interface environment, and generates a home menu within the home interface environment. The home menu comprises a plurality of home menu items. The processing device also generates a plurality of home menu item abstractions, and one of the home menu item abstractions is arranged in a foreground position while the remaining home menu item abstractions are arranged in background positions in the home interface environment. The foreground positions and background positions define a multidimensional path extending from a terminus.
The media data and related metadata may be provided by a single provider, or may be provided by separate providers. In one implementation, the media processing system 100 can be configured to receive media data from a first provider over a first network, such as a cable network, and receive metadata related to the video data from a second provider over a second network, such as a wide area network (WAN). Example media data include video data, audio data, content payload data, or other data conveying audio, textual and/or video data.
In another implementation, the media processing system 100 can be configured to receive media data and metadata from a computing device, such as a personal computer. In one example of this implementation, a user manages one or more media access accounts with one or more content providers through the personal computer. For example, a user may manage a personal iTunes® account with iTunes® software, available from Apple Computer, Inc. Media data, such as audio and video media data, can be purchased by the user and stored on the user's personal computer and/or one or more data stores. The media data and metadata stored on the personal computer and/or the one or more data stores can be selectively pushed and/or pulled for storage in the data store 102 of the media processing system 100.
In another implementation, the media processing system 100 can be used to process media data stored in several data stores in communication with a network, such as wired and/or wireless local area network (LAN), for example. In one implementation, the media processing system 100 can pull and/or receive pushed media data and metadata from the data stores over the network for presentation to a user. For example, the media processing system 100 may be implemented as part of an audio and video entertainment center having a video display device and an audio output device, and can pull media data and receive pushed media data from one or more data stores for storage and processing. At the entertainment center, a user can, for example, view photographs that are stored on a first computer while listening to music files that are stored on a second computer.
In one implementation, the media processing system 100 includes a remote control device 108. The remote control device 108 can include a rotational input device 110 configured to sense touch actuations and generate remote control signals therefrom. The touch actuations can include rotational actuations, such as when a user touches the rotational input device 110 with a digit and rotates the digit on the surface of the rotational input device 110. The touch actuations can also include click actuations, such as when a user presses on the rotational input device 110 with enough pressure to cause the remote control device 108 to sense a click actuation.
In one implementation, the functionality of the media processing system 100 is distributed across several engines. For example, the media processing system 100 may include a control engine 112, a user interface (UI) engine 114, and one or more media engines 116-1, 116-2, and 116-n. The engines may be implemented in software as software modules or instructions, or may be implemented in hardware, or in a combination of software and hardware.
The control engine 112 is configured to communicate with the remote control device 108 over a link, such as a wireless infrared signal or radio frequency signal. The remote control device 108 can transmit remote control signals generated, for example, from touch actuations of the rotational input device 110 to the control engine 112 over the link. The control engine 112 is configured to receive the remote control signals and generate control signals in response. The control signals are provided to the processing device 104 for processing.
The control signals generated by the control engine 112 and processed by the processing device 104 can invoke one or more of the UI engine 114 and media engines 116-1-116-n. In one implementation, the UI engine 114 manages a user interface to facilitate data presentation for the media engines 116-1-116-n and functional processing in response to user inputs.
In one implementation, the media engines 116 can include one or more content-specific engines, such as a movies engine, television program engine, music engine, and the like. Each engine 116 can be instantiated to support content-specific functional processing. For example, a movie engine to support movie-related functions can be instantiated by selecting a “Movies” menu item. Example movie-related functions include purchasing movies, viewing movie previews, viewing movies stored in a user library, and the like. Likewise, a music engine to support music-related functions can be instantiated by selecting a “Music” menu item. Example music-related functions include purchasing music, viewing music playlists, playing music stored in a user library, and the like.
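The instantiation of content-specific engines on menu-item selection can be sketched as a registry lookup; the class shape, registry structure, and function lists below are illustrative assumptions, not taken from the disclosure:

```python
# Hypothetical sketch: instantiate a content-specific media engine
# when its menu item is selected. Class and registry names are
# illustrative, not part of the disclosure.

class MediaEngine:
    def __init__(self, content_type, functions):
        self.content_type = content_type
        self.functions = functions

# Maps a menu item title to a factory for its content-specific engine.
ENGINE_REGISTRY = {
    "Movies": lambda: MediaEngine(
        "movies",
        ["purchase movies", "view movie previews", "view stored movies"]),
    "Music": lambda: MediaEngine(
        "music",
        ["purchase music", "view music playlists", "play stored music"]),
}

def instantiate_engine(menu_item: str) -> MediaEngine:
    """Instantiate the engine registered for the selected menu item."""
    return ENGINE_REGISTRY[menu_item]()
```

A real implementation would presumably dispatch through the control signals processed by the processing device 104, but the registry pattern captures the menu-item-to-engine mapping described above.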
The media processing system 100 of
The rotational input device areas 160, 162, 164, 166 and 168 are receptive to press actuations. In one implementation, the areas include a menu area 160, a reverse/previous area 162, a play/pause area 164, a forward/next area 166, and a select area 168. The areas 160-168, in addition to generating signals related to their descriptive functionalities, can also generate signals for context-dependent functionality. For example, the menu area 160 can generate signals to support the functionality of dismissing an onscreen user interface, and the play/pause area 164 can generate signals to support the function of drilling down into a hierarchical user interface. In one implementation, the areas 160-168 comprise buttons disposed beneath the surface of the rotational input device 110. In another implementation, the areas 160-168 comprise pressure sensitive actuators disposed beneath the surface of the rotational input device 110.
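The context-dependent signal generation of the areas 160-168 can be sketched as a per-area mapping with a default (descriptive) function and context-specific overrides; the area names, context names, and signal strings are illustrative assumptions:

```python
# Hedged sketch of context-dependent press-actuation signals.
# Area/context/signal names are illustrative, not from the disclosure.

AREA_SIGNALS = {
    "menu":       {"default": "MENU",       "onscreen_ui": "DISMISS_UI"},
    "reverse":    {"default": "PREVIOUS"},
    "play_pause": {"default": "PLAY_PAUSE", "hierarchical_ui": "DRILL_DOWN"},
    "forward":    {"default": "NEXT"},
    "select":     {"default": "SELECT"},
}

def signal_for(area: str, context: str = "default") -> str:
    """Return the remote control signal for a press actuation, falling
    back to the area's descriptive function when no context-dependent
    mapping exists."""
    mapping = AREA_SIGNALS[area]
    return mapping.get(context, mapping["default"])
```

For example, a press on the menu area would yield its descriptive signal by default, but the UI-dismissal signal while an onscreen user interface is active.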
The processing device 150 is configured to receive the signals generated by the rotational input device 110 and generate corresponding remote control signals in response. The remote control signals can be provided to the communication subsystem 152, which can wirelessly transmit the remote control signals to the media processing system 100.
Although shown as comprising a circular surface, in another implementation, the rotational input device 110 can comprise a rectangular surface, a square surface, or some other shaped surface. Other surface geometries that accommodate pressure sensitive areas and that can sense touch actuations may also be used, e.g., an oblong area, an octagonal area, etc.
Other actuation area configurations may also be used. For example, in another implementation, the remote control device 108 can also include a separate actuation button 170. In this implementation, the areas comprise a “+” or increase area 160, a reverse/previous area 162, a “−” or decrease area 164, a forward/next area 166, a play/pause area 168, and a menu area 170.
The media data can be received through the network 212 by one of the computing devices, such as computing device 208. The network 212 can include one or more wired and wireless networks, such as the Internet. The media data is provided by one or more content providers 214. For example, the content provider 214-1 may provide media data that is processed by the media processing system 100 and output through the output devices 206, and the content provider 214-2 may provide metadata related to the media data for processing by the media processing system 100. Such metadata may include episodic content, artist information, and the like. A content provider 214 can also provide both media data and related metadata.
In one implementation, the media processing system 100 can also communicate with one or more content providers 214 directly. For example, the media processing system 100 can communicate with the content providers 214 through the wireless network 202, the I/O device 203, and the network 212. The media processing system 100 can also communicate with the content providers 214 through other network configurations, e.g., through a direct connection to a cable modem, through a router, or through one or more other communication devices. Example communications can include receiving sales information, preview information, or communications related to commercial transactions, such as purchasing audio files and video files.
In another implementation, the media processing system 100 can receive content from any of the computing devices 206 and 208, and other such computing devices or data stores 210 available on the network 202 through sharing. Thus, if any one or more of the computing devices or data stores are unavailable, media data and/or metadata on the remaining computing devices or other such computing devices or data stores can still be accessed.
In one implementation, the media menu interface environment 300 includes a media menu 302 identified in part by an icon 304 in a title location and a title 306, e.g., “iTv.” The media menu 302 includes media menu items 310, 312, 314, 316, 318, 320 and 322, respectively entitled “Movies,” “TV Shows,” “Music,” “Podcasts,” “Photos,” “Settings,” and “Streaming.” The media menu 302 can also include a highlight indicator 324 that highlights a media menu item. In one implementation, the highlight indicator 324 is a graphical indicator that provides the effect of a diffused backlighting, e.g., a glow highlight that provides the appearance of a backlit surface beneath the highlighted menu item.
A highlight selection of a menu item by the highlight indicator 324 indicates that the menu item is eligible for a further selection action, e.g., eligible to be selected by actuating the select area 168 on the rotational input device 110. The highlight indicator 324 can be moved vertically, for example, by actuating menu area 160 and the play/pause area 164 on the rotational input device 110.
Upon the further selection, a process associated with the highlighted menu item is performed. In one implementation, selection of the media menu item 310 when highlighted generates a movie content menu environment for processing media data related to movies, such as movie previews and full-length movies. Selection of the media menu item 312 when highlighted generates a TV Shows content menu environment for processing media data related to television programs, such as program episodes. Selection of the media menu item 314 when highlighted generates a Music content menu environment for processing media data related to music, such as audio files and music video files. Selection of the media menu item 316 when highlighted generates a Podcasts content menu environment for processing media data related to podcasts. Selection of the media menu item 318 when highlighted generates a Photos content menu environment for processing media data related to photos, such as photographs and videos. Selection of the media menu item 320 when highlighted generates a settings menu environment for changing settings of the media system, such as setting restrictions and shared files. Selection of the media menu item 322 when highlighted generates a Streaming menu environment for identifying and selecting media data stored on data stores or computer devices accessible through a network, such as media data stored on computing devices 206 and 208 and data store 210 and accessible over the network 202 of
The media menu 302 can also include a child indicator 326 associated with a media menu item. The child indicator 326 indicates that one or more sub-menus or sub-items, e.g., folders, will become available or will be accessed upon selection of the corresponding media menu item.
The media menu interface environment 300 also includes media menu item abstractions that correspond to one or more of the media menu items. For example, the media menu item abstractions 330, 332, 334 and 336 correspond to media menu items 310, 312, 314 and 316, respectively. In one implementation, the media menu item abstractions are graphical representations of the content of corresponding media menu items. For example, the media menu item abstraction 330, which corresponds to the Movies media menu item 310, can be a movie icon. Likewise, the media menu item abstraction 332, which corresponds to the TV Shows media menu item 312, can be a television icon; the media menu item abstraction 334, which corresponds to the Music media menu item 314, can be a music icon, etc.
In one implementation, the media menu item abstractions 330-336 are arranged such that at least one of the media menu item abstractions, e.g., icon 330, is in a foreground position, and the remaining media menu item abstractions, e.g., icons 332-336, are in one or more background positions. The foreground and background positions define a multidimensional path 350, and the media menu item abstraction 336 is in a background position that defines a terminus of the multidimensional path. In one implementation, the terminus is at the edge 352 of the media menu interface environment 300.
In one implementation, the media menu item abstraction corresponding to a highlighted menu is displayed in the foreground position. For example, in
In another implementation, one or more of the media menu item abstractions in the background positions may be displayed with an effect, e.g., a blurring effect. The blurring effect can be used to further deemphasize the media menu item abstractions. For example, in
In another implementation, media menu item abstractions are scaled in size, for example, substantially proportionally to the proximity of the media menu item abstraction to the foreground position. For example, the media menu item abstraction 336 can be displayed at approximately 20% of full scale, and the media menu item abstraction 330 can be displayed at 100% of full scale.
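A minimal sketch of this proximity-based scaling, assuming a linear interpolation between the 100% foreground scale and the roughly 20% terminus scale given in the example (the linear form itself is an assumption; the disclosure says only "substantially proportionally"):

```python
def abstraction_scale(position: int, num_positions: int,
                      foreground_scale: float = 1.0,
                      terminus_scale: float = 0.2) -> float:
    """Scale an abstraction from 100% of full scale in the foreground
    position (position 0) down to ~20% at the terminus (the deepest
    background position), interpolating linearly in between."""
    if num_positions < 2:
        return foreground_scale
    t = position / (num_positions - 1)  # 0.0 at foreground, 1.0 at terminus
    return foreground_scale + t * (terminus_scale - foreground_scale)
```

With four positions, the two intermediate background abstractions would be shown at roughly 73% and 47% of full scale under this assumption.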
In one implementation, changing the position of the highlight indicator 324 causes the highlight indicator to transition from a highlighted media menu item to a media menu item to be highlighted, e.g., an adjacent media menu item. The transition of the highlight indicator 324 likewise causes the media menu item abstractions to transition between the foreground and background positions along the multidimensional path 350 until the media menu item abstraction corresponding to the newly-highlighted media menu item is in the foreground position.
Because the media menu item abstraction 332 is transitioning from a background position into the foreground position previously occupied by media menu item abstraction 330, the media menu item abstraction 330 transitions out of the media menu interface environment 300, as indicated by directional arrow 364. In the example implementation shown in
In one implementation, the scale of the media menu item abstraction transitioning off the edge 352 of the media menu interface environment 300 can be increased to simulate a “fly by” effect. For example, the scale of the media menu item abstraction 330 can be proportionally increased from 100% to 150% of full scale as the media menu item abstraction 330 transitions from the foreground position to the edge 352 of the media menu interface environment 300.
The media menu item abstraction 330, which corresponds to the Movies menu item 310 which is to be highlighted by the highlight indicator 324, emerges from the edge 352 and back into the foreground position, as indicated by the directional arrow 374. In the example implementation shown in
Once the transitions indicated by the directional arrows 370, 372 and 374 are complete, the media menu interface environment 300 returns to the state as depicted in
Likewise, transitioning the highlight indicator 324 to the photos media menu item 318 will cause the media menu item abstraction 340 to transition into a background position and the media menu item abstraction 342 to transition further into the background positions, and will also cause the media menu item abstraction 338 to emerge into the foreground position.
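The transitions described in the preceding paragraphs can be modeled as a sliding window over the ordered menu items, with the highlighted item's abstraction in the foreground and the following items' abstractions receding toward the terminus; the window model and the function name are illustrative assumptions:

```python
# Hedged sketch: the visible abstractions as a window over the menu,
# foreground first. Window size and menu ordering follow the example
# media menu; the model itself is an assumption.

MENU = ["Movies", "TV Shows", "Music", "Podcasts",
        "Photos", "Settings", "Streaming"]

def visible_abstractions(items, highlighted_index, num_positions=4):
    """Return the abstractions currently displayed, with the
    highlighted item's abstraction first (foreground position) and
    later items' abstractions in background positions."""
    return items[highlighted_index:highlighted_index + num_positions]
```

Moving the highlight from "Movies" to "TV Shows" drops the "Movies" abstraction from the window (it transitions off the edge of the environment) and admits the "Photos" abstraction at the terminus, consistent with the transitions described above; moving the highlight back reverses the flow.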
Other processes can be implemented to illustrate a transition of a media menu item abstraction out of the media menu interface environment. For example, in one implementation, a media menu item abstraction is increased in size and fades out, simulating an effect of vertically launching from the media menu interface environment. In another implementation, a media menu item abstraction follows a straight, linear path from the foreground position out of the media menu interface environment. Other visual effects can also be used.
In another implementation, the media menu item abstractions can include a reflection effect. For example, the media menu item abstractions 332, 334, 336 and 338 include reflections 333, 335, 337 and 339. The reflection effect further emphasizes a multidimensional visual effect, and can be implemented in any of the implementations described herein.
In another implementation, the number of media menu item abstractions displayed along the multidimensional path 350 can vary depending on the size of the media menu item abstractions. For example, the media processing system 100 may normally display four media menu item abstractions; however, if the media menu item abstractions are increased or decreased in display size, the number of media menu item abstractions to be displayed can be decreased or increased, respectively.
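The inverse relationship between abstraction display size and the number of abstractions displayed can be sketched as follows; the path length, size units, and the simple division rule are hypothetical:

```python
def num_visible(path_length: float, abstraction_size: float) -> int:
    """Number of abstractions that fit along the multidimensional
    path: larger abstractions -> fewer displayed, smaller -> more.
    Units and the fitting rule are illustrative assumptions."""
    return max(1, int(path_length // abstraction_size))
```

Under these assumptions, doubling the display size of the abstractions roughly halves how many are shown along the path.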
In the example implementations described above, the transitioning of the media menu item abstractions corresponds to the transitioning of the highlight indicator 324, e.g., as the highlight indicator 324 transitions from one media menu item to the next, the media menu item abstractions likewise transition through the multidimensional path 350 in a substantially synchronous manner.
In another implementation, the media menu item abstractions do not transition until the highlight indicator 324 has settled on a media menu item and no further commands to transition the highlight indicator 324 are received. In yet another implementation, the media menu item abstractions and the highlight indicator 324 transition substantially instantaneously, e.g., within several video frames or within one video frame. Other transition animations may also be used.
Stage 402 arranges a foreground position and background positions in a display area according to a multidimensional path extending from a terminus. For example, the processing device 104 and/or UI engine 114 can arrange the media menu item abstractions according to the multidimensional path 350 of
Stage 404 displays one or more icons in the one or more background positions in the display area. For example, the processing device 104 and/or UI engine 114 can display one or more media menu item abstractions in one or more background positions.
Stage 406 displays an icon in the foreground position in the display area. For example, the processing device 104 and/or UI engine 114 can display a media menu item abstraction, e.g., a media icon, in the foreground position.
Stage 408 transitions icons from the foreground and background positions along the multidimensional path. For example, the processing device 104 and/or UI engine 114 can transition the media menu item abstractions as described with reference to
Stage 422 transitions an icon in the foreground position into a background position. For example, the processing device 104 and/or UI engine 114 can transition a media menu item abstraction from a foreground position to a background position.
Stage 424 transitions an icon in the background position nearest the terminus of the multidimensional path out of the display environment. For example, the processing device 104 and/or UI engine 114 can transition media menu item abstractions in the terminus position out of the display environment.
Stage 442 generates a selection menu defining a plurality of menu items. For example, the processing device 104 and/or UI engine 114 can generate the media menu 302 and media menu items 310-322.
Stage 444 displays the selection menu proximate to corresponding icons. For example, the processing device 104 and/or UI engine 114 can display the media menu 302 proximate to the media menu item abstractions 330-342.
Stage 446 highlights a menu item. For example, the processing device 104 and/or UI engine 114 can generate the highlight indicator 324 to highlight a menu item.
Stage 448 transitions an icon that corresponds to the highlighted menu item into the foreground position. For example, the processing device 104 and/or UI engine 114 can transition a media menu item abstraction into the foreground position.
Stage 462 receives a command to change the highlight indicator from a highlighted menu item to a menu item to be highlighted. For example, the processing device 104 and/or UI engine 114 can receive a command to change the highlight indicator 324 from a highlighted media menu item to a media menu item to be highlighted.
Stage 464 determines if the direction of the highlight indicator during transition is in a first direction or a second direction. For example, the processing device 104 and/or UI engine 114 can determine if the highlight indicator 324 is transitioning in an up direction or a down direction. While an example two direction method is described, other multi-directional methods for traversing more or less complicated paths can be used.
If the direction is in the first direction, stage 466 transitions the icon corresponding to the media menu item to be highlighted from a background position to the foreground position. For example, the processing device 104 and/or UI engine 114 can transition a media menu item abstraction from a background position to a foreground position.
If the direction is in the first direction, stage 468 transitions the highlighted icon in the foreground position off the display area. For example, the processing device 104 and/or UI engine 114 can transition a highlighted media menu item abstraction off the media menu interface environment 300.
If the direction is in the second direction, stage 470 causes the icon corresponding to the media menu item to be highlighted to emerge into the foreground position. For example, the processing device 104 and/or UI engine 114 can cause a media menu item abstraction to emerge into the foreground position.
If the direction is in the second direction, stage 472 transitions an icon in the background position off the display area. For example, the processing device 104 and/or UI engine 114 can transition a media menu item abstraction in the terminus position off the media menu interface environment 300.
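Stages 462-472 can be summarized as a direction-dependent pair of icon transitions; the "down"/"up" labels follow the two-direction example given above, and the transition names are illustrative:

```python
def highlight_transition(direction: str):
    """Return the pair of icon transitions triggered by moving the
    highlight indicator (per stages 462-472). Direction and transition
    labels are illustrative assumptions."""
    if direction == "down":   # first direction: stages 466 and 468
        return ("background->foreground", "foreground->off_display")
    if direction == "up":     # second direction: stages 470 and 472
        return ("emerge->foreground", "terminus->off_display")
    raise ValueError(f"unknown direction: {direction}")
```

More complicated paths would extend this dispatch with additional directions, as the disclosure notes.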
The screenshots 502 and 504 of
The screenshot 506 of
In another implementation, only one media menu item abstraction is shown in the foreground position, and additional media menu item abstractions are not shown in the background position. In this implementation, the media menu item abstractions that do not correspond to the highlighted menu item transition off the display area through the multidimensional path as described above, e.g., through the terminus position if transitioning into the background position, or by being displaced by a media menu item abstraction emerging into the terminus position and transitioning from the background into the foreground position. Accordingly, only the media menu item abstraction corresponding to the highlighted menu item is shown.
In one implementation, selection of a media menu item when highlighted generates a content menu interface environment for processing media data related to such content, e.g., Movies, TV Shows, Music, etc. Upon selection of a highlighted media menu item, e.g., media menu item 310, the corresponding media menu item abstraction, e.g., media menu item abstraction 330, transitions to the title location occupied by the icon 304. Likewise, the title 306 is replaced by the context title of the media menu item, e.g., “Movies” for media menu item 310.
In one implementation, the size of the media menu item abstraction is scaled from a first display size in the foreground position to a smaller display size as the media menu item abstraction transitions from the foreground position to the title position, as indicated by the directional arrow 522 and example size indicators 606 and 608. The size can be proportioned, for example, according to a linear function of the distance of the media menu item abstraction from the title position, or proportioned according to a nonlinear function of the distance of the media menu item abstraction from the title position.
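A sketch of the scaling during the transition to the title position; exponent 1.0 gives the linear proportioning, while other exponents give a nonlinear function of distance (the 25% title scale and the power-law form are assumptions, not values from the disclosure):

```python
def title_transition_scale(distance: float, max_distance: float,
                           foreground_scale: float = 1.0,
                           title_scale: float = 0.25,
                           exponent: float = 1.0) -> float:
    """Scale the abstraction as it moves from the foreground position
    (distance == max_distance) to the title position (distance == 0).
    exponent == 1.0 is linear in distance; other exponents model the
    nonlinear proportioning mentioned in the disclosure."""
    t = (distance / max_distance) ** exponent
    return title_scale + t * (foreground_scale - title_scale)
```

For example, halfway along the path a quadratic profile (exponent 2.0) yields a smaller abstraction than the linear profile, shrinking the icon more aggressively near the title position.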
In one implementation, the media menu item abstractions in the background positions transition out of the display area through the terminus position. For example, as shown in
In another implementation, the media menu 302 fades out of view during the transition from the media menu interface environment 300 to the content menu interface environment, as depicted in
In one implementation, the content menu interface environment 600 includes a content menu 602 identified in part by a media menu item abstraction, such as the media menu item abstraction 330, in a title location and a title 606, e.g., “Movies.” The content menu 602 includes content menu items 610, 612, 614, 616, 618, 620 and 622. The content menu 602 can also include the highlight indicator 324 that highlights a content menu item. A highlight selection of a menu item by the highlight indicator 324 indicates that the menu item is eligible for a further selection action, e.g., eligible to be selected by actuating the select area 168 on the rotational input device 110.
In one implementation, the first content menu item 610 is a sales content menu associated with content offered for sale. For example, the content menu item 610 is entitled “iTunes Store Presents,” and includes a child indicator 326. Selecting the iTunes Store Presents content menu item 610 can, for example, transition to another content menu that lists one or more content items available for purchase by download. In one implementation, the content items listed for sale correspond to the content type of the content menu 602. For example, the content menu interface environment 600 of
In another implementation, another content menu item 612 is a preview content menu item. For example, the content menu item 612 is entitled “Theatrical Trailers” and includes a child indicator 326. Selecting the Theatrical Trailers content menu item 612 can, for example, transition to another content menu that lists one or more theatrical trailers that may be streamed to the media processing system 100. Other preview content menus may also be used, such as a “Previews” content menu, for example, that provides previews of movies that are currently available for purchase by download, or song clips for songs that are currently available for purchase by download, etc. In one implementation, the content items available for preview correspond to the content type of the content menu 602.
The content menu interface environment 600 also includes content abstractions that correspond to one or more content menu items. For example, the content abstractions 630, 632 and 634 correspond to the content menu item 610. In one implementation, the content abstractions are graphical representations of the content corresponding to the highlighted content menu item. For example, the content abstractions 630, 632 and 634, which correspond to the iTunes Store Presents content menu item 610, can comprise digital representations of movie posters for movies that are presently offered for sale at iTunes. Alternatively, digital representations of movie stills can be used, or video clips of the movies can be used, or some other content abstraction.
In one implementation, the content abstractions can include a reflection effect. For example, the content abstractions 630, 632 and 634 can include reflections 631, 633, and 635.
In one implementation, a set of content abstractions can be associated with a single content menu item, or can be associated with a plurality of content menu items. In the example content menu interface environment 600 of
The first set of content abstractions, for example, can comprise digital representations of movie posters for movies that are offered for sale through the sales content menu item 610. The second set of content abstractions, for example, can comprise movie clips for movies that are available for preview through the preview content menu item 612. Thus, changing the highlight indicator from the first content menu item 610 to the second content menu item 612 will likewise cause the content abstractions displayed to change from the first set of content abstractions to the second set of content abstractions. The remaining content menu items 614-622 correspond to content stored in a user library, and thus the third set of content abstractions, for example, can comprise digital representations of movie posters or movie stills of the corresponding movies listed in the library content menu items 614-622. Thus, changing the highlight indicator from the second content menu item 612 to any of the library content menu items 614-622 will likewise cause the content abstractions displayed to change from the second set of content abstractions to the third set of content abstractions.
In one implementation, the content abstractions, e.g., content abstractions 630, 632, and 634, transition along a multidimensional path 650 having an ingress terminus 651 and an egress terminus 653 in a manner that provides the effect of transitioning in depth. For example, in
To further emphasize the multidimensional aspect, the content abstractions may rotate about an axis during the transition from the ingress terminus 651 to the egress terminus 653. For example, upon elimination at the egress terminus 653, the content abstractions may rotate about an axis 654 in the direction indicated by the rotational arrow 655. Likewise, upon entering at the ingress terminus 651, the content abstractions may begin a slight rotation about the axis 656 as indicated by the rotational arrow 657. In one implementation, the rotation begins at a relatively low angular rate and increases as the content abstraction nears the egress terminus 653. In one implementation, the rate of rotation increases nonlinearly to simulate an effect that the content abstraction is “flipped” out of the content menu interface environment 600.
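The nonlinearly increasing rotation rate can be sketched with, for example, a quadratic profile; the quadratic form and the specific rates are assumptions, since the disclosure states only that the rate starts relatively low and increases nonlinearly toward the egress terminus:

```python
def angular_rate(progress: float, base_rate: float = 10.0,
                 peak_rate: float = 360.0) -> float:
    """Angular rate (degrees/sec, illustrative) of a content abstraction
    as a quadratic function of its progress from the ingress terminus
    (progress 0.0) to the egress terminus (progress 1.0), simulating
    the abstraction being 'flipped' out of the environment."""
    return base_rate + (peak_rate - base_rate) * progress ** 2
```

Under this profile the rotation accelerates far more in the final tenth of the path than in the first tenth, producing the flip effect near the egress terminus.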
In one implementation, the content abstractions are cycled sequentially through the multidimensional path 650, e.g., a set of twenty content abstractions cycle through the multidimensional path in a given order. In another implementation, the content abstractions are cycled randomly through the multidimensional path 650, e.g., a set of twenty content abstractions cycle through the multidimensional path in a random order.
In one implementation, content abstractions can repetitively emerge into the multidimensional path 650 at the ingress terminus 651. Thus, the content abstractions appear to cycle through the multidimensional path 650. The number of content abstractions that may cycle through the multidimensional path can, for example, depend on the amount of content associated with each content menu item or set of content menu items. For example, the content menu item 610 may provide access to a list of twenty titles available for purchase by download, and thus the first set of content abstractions associated with the content menu item 610 may comprise twenty digital representations of movie posters. Likewise, the content menu item 612 may provide access to a list of fifteen titles available for preview, and thus the second set of content abstractions associated with the content menu item 612 may comprise fifteen digital representations of movie posters. Similarly, if the library content menu items 614, 616, 618, 620 and 622 comprise the entire list of content titles in a user's library, then the content abstractions associated with the library content menu items 614, 616, 618, 620 and 622 may comprise five digital representations of movie posters.
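The repetitive emergence at the ingress terminus described above, in either a given order or a random order, can be sketched as a generator that cycles an abstraction set indefinitely. The function name and the optional per-pass reshuffle are illustrative assumptions.

```python
import random
from typing import Iterator, List

def cycle_abstractions(abstractions: List[str], shuffle: bool = False) -> Iterator[str]:
    """Yield content abstractions for repeated emergence at the ingress
    terminus: in a fixed, given order, or reshuffled on each pass when
    shuffle=True (the random-order implementation described above)."""
    order = list(abstractions)
    while True:
        if shuffle:
            random.shuffle(order)
        for item in order:
            yield item

# A set of five movie-poster abstractions cycles in a given order.
posters = ["poster_%d" % i for i in range(5)]
it = cycle_abstractions(posters)
first_pass = [next(it) for _ in range(5)]
second_pass = [next(it) for _ in range(5)]
assert first_pass == second_pass == posters
```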
In another implementation, content abstractions are scaled in size, for example, proportionally to the proximity of the content abstraction from the egress terminus. For example, in
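The proximity-based scaling described above can be sketched as a linear interpolation between a minimum and maximum scale factor. The specific scale range, and the assumption that abstractions are largest at the egress terminus, are illustrative rather than taken from the disclosure.

```python
def scale_for_position(distance_to_egress: float, path_length: float,
                       min_scale: float = 0.4, max_scale: float = 1.0) -> float:
    """Scale factor proportional to proximity to the egress terminus:
    full size at the egress, smallest at the ingress.  The 0.4-1.0
    range is a hypothetical choice."""
    if path_length <= 0:
        raise ValueError("path_length must be positive")
    proximity = 1.0 - min(max(distance_to_egress / path_length, 0.0), 1.0)
    return min_scale + (max_scale - min_scale) * proximity

assert scale_for_position(0.0, 100.0) == 1.0    # at the egress terminus
assert scale_for_position(100.0, 100.0) == 0.4  # at the ingress terminus
assert scale_for_position(50.0, 100.0) == 0.7   # midway along the path
```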
The ingress terminus 651 and egress terminus 653 can be positioned elsewhere in the content menu interface environment 600. For example, the ingress terminus 651 and egress terminus 653 of
In another implementation, content abstractions associated with multiple content menu items may not constantly cycle; instead, a content abstraction can be transitioned to a foreground position when a corresponding content menu item is highlighted. For example, a highlighting of any one of the content menu items 614-622 may cycle a content abstraction corresponding to the highlighted content menu item to a foreground position, e.g., the position occupied by content abstraction 630 in
In another implementation, changing a set of content abstractions is facilitated by introducing the new set of abstractions through the ingress terminus 651 and eliminating the previous set through the egress terminus 653. For example, changing the highlight indicator 324 from the content menu item 610 to the content menu item 612 causes the content abstractions associated with the content menu item 612 to emerge from the ingress terminus 651, and precludes the content abstractions associated with the content menu item 610 from reemerging after elimination through the egress terminus 653.
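The set-change behavior described above, where the new set emerges at the ingress terminus and an eliminated abstraction reemerges only if it still belongs to the active set, can be sketched as follows. The class name and data structures are illustrative assumptions.

```python
from collections import deque

class AbstractionPath:
    """Minimal sketch of set replacement on the multidimensional path."""

    def __init__(self, active_set):
        self.members = set(active_set)    # set for the highlighted menu item
        self.pending = deque(active_set)  # waiting at the ingress terminus
        self.on_path = deque()            # transitioning toward the egress

    def change_set(self, new_set):
        """Highlighting a different content menu item swaps the active set;
        old abstractions still on the path are not requeued after egress."""
        self.members = set(new_set)
        self.pending = deque(new_set)

    def emerge(self):
        """Introduce the next pending abstraction at the ingress terminus."""
        if self.pending:
            self.on_path.append(self.pending.popleft())

    def eliminate(self):
        """Eliminate the oldest abstraction at the egress terminus; it is
        precluded from reemerging unless it is still in the active set."""
        if self.on_path:
            item = self.on_path.popleft()
            if item in self.members:
                self.pending.append(item)

path = AbstractionPath(["sale_a", "sale_b"])
path.emerge(); path.emerge()
path.change_set(["preview_a"])       # highlight moves from item 610 to 612
path.eliminate(); path.eliminate()   # old abstractions leave for good
path.emerge()
assert list(path.on_path) == ["preview_a"]
```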
In one implementation, the sales content menu interface environment 700 includes a sale content menu 702 identified in part by a media menu item abstraction, such as the media menu item abstraction 330, in a title location and a title 706, e.g., “iTunes Store Presents.” The sales content menu 702 includes sales content menu items 710, 712, 714, 716, 718, 720, 722 and 724. The bottom sales content menu item 724 is partially faded, indicating that the list of sales content menu items continues beyond the sales content menu interface environment 700. Scrolling down to the sales content menu item 724 can, for example, cause additional sales content menu items to scroll into the sales content menu interface environment 750. The sales content menu 702 can also include the highlight indicator 324 that highlights a sales content menu item.
A sales content menu item abstraction 730 can be generated proximate to the sales content menu 702. In one implementation, the sales content menu item abstraction 730 includes a reflection 731. In the example implementation of
In one implementation, the sales content menu items 710-724 can be generated according to sales metrics of the provider, e.g., the most popular selling content titles. In another implementation, the sales content menu items 710-724 can be generated using a collaborative filter and based on a user's library titles and/or sales history. Other methods of identifying sales content can also be used.
In one implementation, the example purchase content menu interface environment 740 includes a purchase content menu 742 identified in part by a media menu item abstraction, such as the media menu item abstraction 330, in a title location and a title 746. In one implementation, the title corresponds to the content title available for purchase. For example, selecting the content menu item 710 of
The purchase content menu 742 can include purchase content menu items 750, 752 and 754. Selection of the purchase content menu item 750, entitled “Preview,” generates a preview of the content available for purchase. In one implementation, the preview can, for example, be streamed from the content provider.
Selecting the purchase content menu item 752, entitled “Purchase,” debits a user's account for the purchase price of the content title and downloads the content title. In one implementation, the content title is downloaded to a user's computing device, such as computing device 208, for long term storage. The content title may later be copied to a data store on the media processing device 100 for viewing, or may be streamed from the computing device 208 to the media processing device 100 for viewing.
Selecting the purchase menu item 754, entitled “Other Option,” invokes one or more functions related to other purchase options. For example, reviews of the content title can be provided, or a synopsis of the content title can be provided, or other options can be provided.
Selection of a highlighted content abstraction can generate a preview of the content title. In one implementation, the preview is streamed from the content provider. For example, selection of the content abstraction 764 can generate a preview for the movie entitled “Cars.” In this implementation, the preview comprises a theatrical trailer. In other implementations, other previews can be shown, e.g., a short documentary, such as “The Making of Cars” can be shown.
The type of content abstraction displayed can depend on the type of content to be previewed. For example, if the content is movies, then the content abstractions can be digital representations of movie posters or movie stills. Likewise, if the content is audio books, then the content abstractions can be digital representations of book jackets. Other content abstractions can also be displayed.
In one implementation, the content menu interface environment 800 includes a content menu 802 identified in part by a media menu item abstraction, such as the media menu item abstraction 332, in a title location and a title 804, e.g., “TV Shows.” The content menu 802 includes content menu items 812, 814, 816, 818, 820, 822 and 824. The content menu 802 can also include the highlight indicator 324 that highlights a content menu item. The content menu items, can, for example, correspond to television shows that have either been recorded from a broadcast or purchased from a content provider.
In one implementation, the content menu 802 also includes a sort field 806 that includes a first sort option 808 and a second sort option 810. Selection of the first sort option 808 can, for example, sort the content menu items by a program category, e.g., a program title. In one implementation, multiple instances of the same program title are grouped into folders, as indicated by the child indicators 326 of
Selection of the second sort option 810 sorts the content menu items according to a date, as indicated by the date fields 815, 817, 819, 821, 823, and 825 of
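The two sort options described above can be sketched as follows, assuming each recording is a (title, date) pair: grouping multiple instances of the same program title into a folder corresponds to the first sort option, and date ordering to the second. The function names and data shapes are illustrative assumptions.

```python
from itertools import groupby
from operator import itemgetter

shows = [
    ("Show B", "2006-09-01"),
    ("Show A", "2006-09-03"),
    ("Show B", "2006-09-02"),
]

def sort_by_date(items):
    """Second sort option: order recordings by date."""
    return sorted(items, key=itemgetter(1))

def sort_by_title(items):
    """First sort option: group recordings of the same program title
    into folders, one folder per title (groupby requires sorted input)."""
    by_title = sorted(items, key=itemgetter(0))
    return {title: [date for _, date in grp]
            for title, grp in groupby(by_title, key=itemgetter(0))}

assert [t for t, _ in sort_by_date(shows)] == ["Show B", "Show B", "Show A"]
folders = sort_by_title(shows)
assert list(folders) == ["Show A", "Show B"]
assert len(folders["Show B"]) == 2  # two instances grouped into one folder
```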
In one implementation, the first content menu item 812 is a sales content menu associated with content offered for sale. For example, the content menu item 812 is entitled “iTunes Store Presents,” and includes a child indicator 326. Selecting the iTunes Store Presents content menu item 812 can, for example, transition to another content menu that lists one or more content items available for purchase by download. In one implementation, the content items listed for sale correspond to the content type of the content menu 802. For example, the content menu interface environment 800 of
The content menu interface environment 800 also includes content abstractions that correspond to one or more content menu items. For example, the content abstractions 830, 832 and 834 correspond to the content menu items 814-824 in
In one implementation, the content abstractions can include a reflection effect. For example, the content abstractions 830, 832 and 834 can include reflections 831, 833, and 835.
In one implementation, a set of content abstractions can be associated with a single content menu item, or can be associated with a plurality of content menu items. In the example content menu interface environment 800 of
In one implementation, the content abstractions, e.g., content abstractions 830, 832, and 834, transition along a multidimensional path 850 having an ingress terminus 851 and an egress terminus 853. In one implementation, the ingress terminus 851 is within the content menu interface environment 800, e.g., beneath the content abstraction 834 of
In one implementation, the content menu interface environment 900 includes a content menu 902 identified in part by a media menu item abstraction, such as the media menu item abstraction 334, in a title location and a title 906, e.g., “Music.” The content menu 902 can include, for example, content menu items 910, 912, 914, 916, 918, 920, 922, and 924. The content menu 902 can also include the highlight indicator 324 that highlights a content menu item.
In one implementation, the first content menu item 910 is a sales content menu associated with content offered for sale. For example, the content menu item 910 is entitled “iTunes Store Presents,” and includes a child indicator 326. Selecting the iTunes Store Presents content menu item 910 can, for example, transition to another content menu that lists one or more content items available for purchase by download. In one implementation, the content items listed for sale correspond to the content type of the content menu 902. For example, the content menu interface environment 900 of
In one implementation, selection of the content menu item 912, entitled “Shuffle Play,” initiates a shuffle play of content titles, as indicated by the shuffle indicator 913. Selection of the content menu item 914, entitled “Music Videos,” lists music videos according to one or more music video hierarchal categories. Selection of the content menu item 916, entitled “Playlists,” lists playlists according to one or more playlist hierarchal categories. Selection of the content menu item 918, entitled “Artists,” lists audio content according to one or more artists hierarchal categories. Selection of the content menu item 920, entitled “Albums,” lists audio content according to one or more hierarchal album categories. Selection of the content menu item 922, entitled “Songs,” lists audio content according to one or more songs hierarchal categories. Selection of the content menu item 924, entitled “Audio Books,” lists audio books according to one or more audio book hierarchal categories. Other content menu items can also be used.
The content menu interface environment 900 also includes content abstractions that correspond to one or more content menu items. For example, the content abstractions 930, 932 and 934 correspond to the content menu item 910. In one implementation, the content abstractions are graphical representations of the content corresponding to the highlighted content menu item. For example, the content abstractions 930, 932, 934 and 936, which correspond to the iTunes Presents content menu item 910, can comprise digital representations of album art for songs that are presently offered for sale at iTunes.
In one implementation, the content abstractions can include a reflection effect. For example, the content abstractions 930, 932, 934 and 936 can include reflections 931, 933, 935, and 937.
In one implementation, a set of content abstractions can be associated with a single content menu item, or can be associated with a plurality of content menu items. In the example content menu interface environment 900 of
The first set of content abstractions can, for example, comprise digital representations of album art for songs that are offered for sale through the sales content menu item 910. The second set of content abstractions can, for example, comprise digital representations of content titles that are eligible for selection through a shuffle play. The third set of content abstractions can, for example, comprise digital representations of music videos, e.g., video stills or video clips, that are categorized under the content menu item 914. The fourth set of content abstractions can, for example, comprise digital representations of content titles categorized under the content menu items 916, 918, 920 and 922. The fifth set of content abstractions can, for example, comprise digital representations of book jacket art for audio books that are categorized under the content menu item 924.
In one implementation, the content abstractions, e.g., content abstractions 930, 932, and 934, transition along a multidimensional path 950 having an ingress terminus 951 and an egress terminus 953. In one implementation, the ingress terminus 951 is within the content menu interface environment 900, e.g., beneath the content abstraction 934 of
Additionally, the content abstractions can initially rotate in a clockwise direction, as indicated by the rotational arrow 957, about an axis 956. In one implementation, the content abstractions enter the ingress terminus 951 at approximately a normal disposition, e.g., approximately 90 degrees, and rotate in the clockwise direction during the transition through the multidimensional path 950.
In one implementation, the content menu interface environment 1000 includes a content menu 1002 identified in part by a media menu item abstraction, such as the media menu item abstraction 338, in a title location and a title 1006, e.g., “Photos.” The content menu 1002 can include, for example, content menu items 1010, 1012, 1014, 1016, 1018, 1020, 1022, and 1024. The content menu 1002 can also include a highlight indicator 325 that highlights a content menu item. In this implementation, the highlight indicator 325 is rectangular, and includes a backlit-style glow highlight.
In one implementation, the first content menu item 1010 is an “All” content menu associated with all photographic content stored in a user library. Selection of the content menu item 1010 can, for example, list all photographs stored in a user library. Selection of the content menu item 1012, entitled “Shuffle,” initiates a shuffle presentation of photos, as indicated by the shuffle indicator 1013. Selection of the content menu item 1014, entitled “Last Roll,” lists photographs collected during a most recent photographic session. Selection of the content menu items 1016-1024 lists photographs categorized under each respective content menu item. Content menu items that include a child indicator 326, e.g., content menu items 1020, 1022, and 1024, can include one or more sub-folder categories. Other content menu items can also be used.
The content menu interface environment 1000 also includes content abstractions that correspond to one or more content menu items. For example, the content abstractions 1030, 1032, 1034 and 1036 correspond to the content menu item 1010. In one implementation, the content abstractions are the photographs associated with each content menu item. In one implementation, a set of content abstractions can be associated with a single content menu item, or can be associated with a plurality of content menu items. In the example content menu interface environment 1000 of
In one implementation, the content abstractions can include a reflection effect. For example, the content abstractions 1030, 1032, 1034 and 1036 can include reflections 1031, 1033, 1035 and 1037.
In one implementation, the content abstractions, e.g., content abstractions 1030, 1032, 1034 and 1036, transition along a multidimensional path 1050 having an ingress terminus 1051 and an egress terminus 1053. In one implementation, the ingress terminus 1051 is within the content menu interface environment 1000, e.g., beneath the content abstraction 1034 of
A plurality of content abstractions 1110, 1112, 1114 and 1116 transition along a multidimensional path defined by positions 1120, 1122, 1124 and 1126. Each of the content abstractions 1110, 1112, 1114 and 1116 has a corresponding front surface depiction 1111, 1113, 1115 and 1117 on which content may be displayed, e.g., movie poster art, album art, photos, video clips, text, or other content types.
The example rendering of the multidimensional environment 1100 shows a top view of a frame during which the content abstractions 1110, 1112, 1114 and 1116 are coincident with positions 1120, 1122, 1124 and 1126, respectively. The angle of the front surface of each content abstraction relative to the x-axis is provided in Table 1 below:
Position | Angle of front surface relative to x-axis |
---|---|
1120 | 90° |
1122 | 0° |
1124 | 30° |
1126 | 90° |
In the example implementation of
Also in the example implementation of
Also in the example implementation of
As the content abstractions 1110, 1112, 1114 and 1116 transition to adjacent positions, the respective angle of each abstraction is rotated as indicated by rotational direction arrows 1130, 1132 and 1134. Thus, during the transition from a first position, e.g., position 1120, to a second position, e.g., position 1122, a content abstraction, e.g., content abstraction 1110, rotates in a direction, e.g., clockwise, from 90 degrees to 0 degrees. Similarly, during the transition from position 1122 to position 1124, the content abstraction rotates from 0 degrees to 30 degrees in a direction, e.g., counterclockwise; and during the transition from position 1124 to position 1126, the content abstraction rotates from 30 degrees to 90 degrees in a direction, e.g., counterclockwise. Other rotational ranges and rotational directions can also be selected. Additionally, the number of positions can be increased, e.g., to five, six, etc., or decreased, e.g., to three or even two.
In one implementation, the rate of rotation between positions is substantially linear. For example, if a content abstraction moves one unit in the x-direction and approximately 14.5 units in the z-direction during each video frame, and each location is separated by 69 units along the x-axis and 1000 units along the z-axis, then approximately 69 video frames are generated during a transition of a content abstraction from any position to an adjacent position. Accordingly, during the transition from position 1120 to 1122, a content abstraction will rotate approximately 90/69 degrees for each video frame, or about 1.3 degrees for each video frame. Likewise, during the transition from position 1122 to 1124, a content abstraction will rotate approximately 30/69 degrees, or about 0.43 degrees for each video frame; and during the transition from position 1124 to 1126, a content abstraction will rotate approximately 60/69 degrees, or about 0.87 degrees for each video frame.
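The substantially linear rotation arithmetic above can be expressed directly. The sketch below reproduces the example's numbers: 69 frames per transition, and total rotations of 90, 30 and 60 degrees between adjacent positions (the 30-to-90-degree transition spans 60 degrees); the function names are illustrative.

```python
def frames_per_transition(x_step: float, x_separation: float) -> float:
    """Number of video frames for a transition between adjacent positions,
    assuming a constant per-frame step along the x-axis."""
    return x_separation / x_step

def degrees_per_frame(total_rotation: float, frames: float) -> float:
    """Substantially linear rotation: the total rotation is spread evenly
    over the frames of one transition."""
    return total_rotation / frames

frames = frames_per_transition(x_step=1.0, x_separation=69.0)
assert frames == 69.0

# Per-frame rotation for each transition in the example above.
assert round(degrees_per_frame(90.0, frames), 1) == 1.3   # 1120 -> 1122
assert round(degrees_per_frame(30.0, frames), 2) == 0.43  # 1122 -> 1124
assert round(degrees_per_frame(60.0, frames), 2) == 0.87  # 1124 -> 1126
```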
In another implementation, the rotational rate between positions can be substantially non-linear. For example, the rotational rate during a transition from a first position to a second position may exponentially decrease (or increase), thus providing the appearance of an initially rapid but overdamped rotation beginning at each position (e.g., positions 1120, 1122, 1124 and 1126). Other rotational rate processes can also be used.
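An exponentially decreasing rotational rate such as the overdamped response described above can be modeled with an exponential blend toward the target angle. The decay constant and normalization below are illustrative assumptions.

```python
import math

def overdamped_angle(progress: float, start_angle: float, end_angle: float,
                     decay: float = 5.0) -> float:
    """Angle during a transition in which the rotational rate decreases
    exponentially: rapid initial rotation that settles toward the end
    angle without overshoot (an overdamped response).  Hypothetical model."""
    blend = 1.0 - math.exp(-decay * progress)
    blend /= 1.0 - math.exp(-decay)  # normalize so progress=1 hits end_angle
    return start_angle + (end_angle - start_angle) * blend

assert overdamped_angle(0.0, 90.0, 0.0) == 90.0
assert abs(overdamped_angle(1.0, 90.0, 0.0)) < 1e-9
# More than half the rotation happens in the first quarter of the transition.
assert abs(overdamped_angle(0.25, 90.0, 0.0) - 90.0) > 45.0
```

Increasing `decay` makes the initial rotation faster; an increasing rate could be modeled symmetrically by blending with `math.exp(decay * progress)` terms instead.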
In another implementation, the content abstraction can obtain a final angle associated with an adjacent position before transitioning to the adjacent position. For example, during a transition from a first position to a second position, a content abstraction can obtain the angle associated with the second position mid-way through the transition and cease rotating. The content abstraction can again rotate upon reaching the second position, at which time a transition to a third position associated with another angle begins.
In another implementation, a content abstraction may also “fade-in” upon emergence into the multidimensional path defined by the positions 1120, 1122, 1124 and 1126. For example, the content abstraction 1110 may fade into view during the transition from the position 1120 to position 1122.
In one implementation, the multidimensional environment 1100 is rendered utilizing a rendering thread and a queuing thread. The rendering thread updates the positions of the content abstractions in the multidimensional environment 1100, and frees memory and processing resources associated with content abstractions that have transitioned beyond position 1126. The queuing thread loads image data from media assets, e.g., image files of movie posters. When the rendering thread pulls a content abstraction from the queue, the queuing thread loads another image to generate another content abstraction. In one implementation, the queuing thread maintains at least two content abstractions in a queue for the rendering thread. Other processing methods and allocations of resources can also be used.
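The two-thread arrangement described above can be sketched with a bounded queue: the queuing thread blocks once the minimum number of abstractions is staged, and each pull by the rendering thread frees a slot so the next image is loaded. The thread bodies below are stand-ins for actual image loading and rendering; the names and the sentinel protocol are illustrative assumptions.

```python
import queue
import threading

QUEUE_DEPTH = 2  # queuing thread keeps at least two abstractions ready

def queuing_thread(assets, out_q):
    """Loads image data from media assets into the queue; put() blocks
    whenever QUEUE_DEPTH abstractions are already waiting."""
    for asset in assets:
        out_q.put("abstraction(%s)" % asset)  # stand-in for image loading
    out_q.put(None)  # sentinel: no assets remain

def rendering_thread(out_q, rendered):
    """Pulls abstractions for display; each pull frees a queue slot,
    prompting the queuing thread to load the next image."""
    while True:
        item = out_q.get()
        if item is None:
            break
        rendered.append(item)  # stand-in for positioning and rendering

assets = ["poster_%d.png" % i for i in range(5)]
q = queue.Queue(maxsize=QUEUE_DEPTH)
rendered = []
loader = threading.Thread(target=queuing_thread, args=(assets, q))
renderer = threading.Thread(target=rendering_thread, args=(q, rendered))
loader.start(); renderer.start()
loader.join(); renderer.join()
assert rendered == ["abstraction(poster_%d.png)" % i for i in range(5)]
```

The bounded `maxsize` is what enforces the "at least two in the queue" behavior: the loader runs ahead of the renderer by exactly the queue depth.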
In one implementation, the content menu interface environment 1200 includes a content menu 1202 identified in part by a media menu item abstraction, such as the media menu item abstraction 342, in a title location and a title 1206, e.g., “Streaming.” The content menu 1202 can include, for example, content menu items 1210, 1212, 1214, and 1216. The content menu 1202 can also include the highlight indicator 324 that highlights a content menu item.
Each of the content menu items 1210, 1212, 1214 and 1216 can, for example, correspond to a data store accessible over a local area network, e.g., one or more computers accessible over a wireless or wired network. Each of the corresponding data stores can store content that can, for example, be pushed to or pulled by the media processing system 100. For example, the data store corresponding to the content menu item 1210 may store photographic files; the data store corresponding to the content menu item 1212 may store movie and video files; the data store corresponding to content menu item 1214 may store music files; and the data store corresponding to content menu item 1216 may store all of the data files stored in the data stores corresponding to content menu items 1210, 1212, and 1214. The files may be downloaded to the media processing system 100 or streamed to the media processing system 100 for processing.
Stage 1302 generates a media menu interface comprising a plurality of media menu items. For example, the processing device 104 and/or UI engine 114 can display one or more media menu items 310-322 in the media menu interface environment 330.
Stage 1304 generates a plurality of media menu item abstractions corresponding to the media menu items. For example, the processing device 104 and/or UI engine 114 can generate one or more media menu item abstractions 330-342.
Stage 1306 arranges the media menu item abstractions so that a media menu item abstraction corresponding to a highlighted media menu item is displayed in a foreground position. For example, the processing device 104 and/or UI engine 114 can display a media menu item abstraction in a foreground position, such as media menu item abstraction 330 in
Stage 1308 arranges the media menu item abstractions so that the media menu item abstractions corresponding to media menu items that are not highlighted are displayed in background positions. For example, the processing device 104 and/or UI engine 114 can display media menu item abstractions in the background positions, such as the media menu item abstractions 332-336 in
Stage 1322 receives a selection of a highlighted media menu item, and stage 1324 transitions the media menu item abstraction corresponding to the highlighted media menu item from a present position to a title position. For example, the processing device 104 and/or UI engine 114 can process a selection and perform the transition such as the transition depicted in
Stage 1326 generates a content menu in proximity to the title position. For example, the processing device 104 and/or UI engine 114 and corresponding media engine 116 can generate a corresponding content menu in proximity to the title position, such as the content menu 602 shown in
Stage 1342 generates content abstractions. In one implementation, the content abstractions can correspond to content items or content menu items. For example, the processing device 104 and/or UI engine 114 and corresponding media engine 116 can generate the content abstractions, such as the content abstractions 630-634 shown in
Stage 1344 defines a multidimensional path having an ingress terminus and an egress terminus. For example, the processing device 104 and/or UI engine 114 and corresponding media engine 116 can define a multidimensional path having an ingress terminus and an egress terminus, such as the multidimensional path 650 shown in
Stage 1346 emerges the content abstractions into the multidimensional path at the ingress terminus. For example, the processing device 104 and/or UI engine 114 and corresponding media engine 116 can emerge the content abstractions into the multidimensional path at the ingress terminus, such as the ingress terminus 651 shown in
Stage 1348 eliminates the content abstractions from the multidimensional path at the egress terminus. For example, the processing device 104 and/or UI engine 114 and corresponding media engine 116 can eliminate the content abstractions from the multidimensional path at the egress terminus, such as the egress terminus 653 shown in
Stage 1350 generates depth transitions of the content abstractions through the multidimensional path from the ingress terminus to the egress terminus. For example, the processing device 104 and/or UI engine 114 and corresponding media engine 116 can generate depth transitions of the content abstractions through the multidimensional path from the ingress terminus to the egress terminus, such as the depth transitions shown in
Stage 1352 reemerges the content abstractions eliminated at the egress terminus into the multidimensional path at the ingress terminus. For example, the processing device 104 and/or UI engine 114 and corresponding media engine 116 can reemerge the content abstractions eliminated at the egress terminus into the multidimensional path at the ingress terminus, such as the ingress terminus 651.
Stage 1362 generates a content menu in proximity to a title position. For example, the processing device 104 and/or UI engine 114 and corresponding media engine 116 can generate a content menu in proximity to a title position, such as the content menu 602 shown in
Stage 1364 generates a sales content menu item. For example, the processing device 104 and/or UI engine 114 and corresponding media engine 116 can generate a sales content menu item, such as the “iTunes Store Presents” content menu item 610 shown in
Stage 1366 generates library content menu items. For example, the processing device 104 and/or UI engine 114 and corresponding media engine 116 can generate the content library menu items, such as the content menu items 614-622 shown in
Stage 1368 generates content abstractions corresponding to the sales content menu item. For example, the processing device 104 and/or UI engine 114 and corresponding media engine 116 can generate content abstractions for the corresponding sale content menu item by receiving content information from a content provider, such as the content abstractions 630, 632 and 634 shown in
Stage 1370 generates content abstractions corresponding to a group of library content menu items. For example, the processing device 104 and/or UI engine 114 and corresponding media engine 116 can generate content abstractions corresponding to a group of library content menu items from content data stored in one or more user libraries, such as content abstractions corresponding to library content menu items 614-622.
Stage 1382 generates a content menu in proximity to a title position. For example, the processing device 104 and/or UI engine 114 and corresponding media engine 116 can generate a content menu in proximity to a title position, such as the content menu 802 shown in
Stage 1384 generates a title sort menu item. For example, the processing device 104 and/or UI engine 114 and corresponding media engine 116 can generate a title sort menu item, such as the sort option 808 shown in
Stage 1386 generates a date sort menu item. For example, the processing device 104 and/or UI engine 114 and corresponding media engine 116 can generate a date sort menu item, such as the sort option 810 shown in
Stage 1388 generates one or more library content menu items. For example, the processing device 104 and/or UI engine 114 and corresponding media engine 116 can generate the library content menu items, such as the library content menu items 814-824 shown in
Stage 1390 sorts the library content menu items according to a sort selection. For example, the processing device 104 and/or UI engine 114 and corresponding media engine 116 can sort the library content menu items according to a selection of either the title sort menu item or the date sort menu item, such as the sorted library content menu items 814-824 as shown in
The stages recited in the example processes of
The example implementations described herein can be implemented for various other media types and content. For example, access to and management of satellite radio programs, web blogs, syndicated media content, or other media types and content can be realized by the example implementations described herein.
The apparatus, methods, flow diagrams, and structure block diagrams described in this patent document may be implemented in computer processing systems including program code comprising program instructions that are executable by the computer processing system. Other implementations may also be used. Additionally, the flow diagrams and structure block diagrams described herein, which describe particular methods and/or corresponding acts in support of steps and corresponding functions in support of disclosed structural means, may also be utilized to implement corresponding software structures and algorithms, and equivalents thereof.
This written description sets forth the best mode of the invention and provides examples to describe the invention and to enable a person of ordinary skill in the art to make and use the invention. This written description does not limit the invention to the precise terms set forth. Thus, while the invention has been described in detail with reference to the examples set forth above, those of ordinary skill in the art may effect alterations, modifications and variations to the examples without departing from the scope of the invention.
Notice: More than one reissue application has been filed for the reissue of U.S. Pat. No. 8,656,309. The reissue applications are U.S. patent application Ser. No. 15/046,416 (the present reissue application), filed Feb. 17, 2016, and U.S. patent application Ser. No. 15/908,514 (a divisional reissue application of the present reissue application), filed Feb. 28, 2018, both of which are reissue applications of U.S. Pat. No. 8,656,309, the entire disclosures of which are incorporated by reference for all purposes. This application is a continuation (and claims the benefit of priority under 35 USC 120) of U.S. patent application Ser. No. 11/530,834, entitled “User Interface With Menu Abstractions And Content Abstractions,” which was filed on Sep. 11, 2006. The disclosure of this application is incorporated herein by reference in its entirety.
U.S. Patent Documents

Number | Name | Date | Kind |
---|---|---|---|
5479602 | Baecker et al. | Dec 1995 | A |
5483261 | Yasutake | Jan 1996 | A |
5488204 | Mead et al. | Jan 1996 | A |
5499330 | Lucas et al. | Mar 1996 | A |
5619249 | Billock et al. | Apr 1997 | A |
5717879 | Moran et al. | Feb 1998 | A |
5822123 | Davis et al. | Oct 1998 | A |
5825352 | Bisset et al. | Oct 1998 | A |
5835079 | Shieh | Nov 1998 | A |
5880411 | Gillespie et al. | Mar 1999 | A |
5880768 | Lemmons et al. | Mar 1999 | A |
6006227 | Freeman et al. | Dec 1999 | A |
6188391 | Seely et al. | Feb 2001 | B1 |
6229542 | Miller | May 2001 | B1 |
6243724 | Mander et al. | Jun 2001 | B1 |
6310610 | Beaton et al. | Oct 2001 | B1 |
6323846 | Westerman et al. | Nov 2001 | B1 |
6335737 | Grossman et al. | Jan 2002 | B1 |
6448987 | Easty et al. | Sep 2002 | B1 |
6466237 | Miyao et al. | Oct 2002 | B1 |
6638313 | Freeman et al. | Oct 2003 | B1 |
6678891 | Wilcox et al. | Jan 2004 | B1 |
6690387 | Zimmerman et al. | Feb 2004 | B2 |
6725427 | Freeman et al. | Apr 2004 | B2 |
6768999 | Prager et al. | Jul 2004 | B2 |
6944632 | Stern | Sep 2005 | B2 |
7015894 | Morohoshi | Mar 2006 | B2 |
7096431 | Tambata | Aug 2006 | B2 |
7184064 | Zimmerman et al. | Feb 2007 | B2 |
7249327 | Nelson et al. | Jul 2007 | B2 |
7292243 | Burke | Nov 2007 | B1 |
7350157 | Billmaier | Mar 2008 | B1 |
7362331 | Ording | Apr 2008 | B2 |
7363591 | Goldthwaite et al. | Apr 2008 | B2 |
7418674 | Robbins | Aug 2008 | B2 |
7433885 | Jones | Oct 2008 | B2 |
7581195 | Sciammarella | Aug 2009 | B2 |
7587683 | Ito | Sep 2009 | B2 |
7663607 | Hotelling et al. | Feb 2010 | B2 |
7743338 | Madden | Jun 2010 | B2 |
7747968 | Brodersen et al. | Jun 2010 | B2 |
8112718 | Nezu | Feb 2012 | B2 |
8296677 | Brodersen et al. | Oct 2012 | B2 |
8479122 | Hotelling et al. | Jul 2013 | B2 |
8694917 | Yasui | Apr 2014 | B2 |
20020033848 | Sciammarella et al. | Mar 2002 | A1 |
20020083469 | Jeannin et al. | Jun 2002 | A1 |
20020175931 | Holtz et al. | Nov 2002 | A1 |
20030110450 | Sakai | Jun 2003 | A1 |
20030117425 | O'Leary et al. | Jun 2003 | A1 |
20030142751 | Hannuksela | Jul 2003 | A1 |
20030174160 | Deutscher et al. | Sep 2003 | A1 |
20040008211 | Soden et al. | Jan 2004 | A1 |
20040100479 | Nakano et al. | May 2004 | A1 |
20040140995 | Goldthwaite et al. | Jul 2004 | A1 |
20040150657 | Wittenburg et al. | Aug 2004 | A1 |
20040205176 | Ting et al. | Oct 2004 | A1 |
20040221243 | Twerdahl et al. | Nov 2004 | A1 |
20040250217 | Tojo et al. | Dec 2004 | A1 |
20040261031 | Tuomainen et al. | Dec 2004 | A1 |
20050041033 | Hilts et al. | Feb 2005 | A1 |
20050044499 | Allen et al. | Feb 2005 | A1 |
20050060741 | Tsutsui et al. | Mar 2005 | A1 |
20050086611 | Takabe | Apr 2005 | A1 |
20050091597 | Ackley | Apr 2005 | A1 |
20050154988 | Proehl et al. | Jul 2005 | A1 |
20050160375 | Sciammarella et al. | Jul 2005 | A1 |
20050246654 | Hally et al. | Nov 2005 | A1 |
20050278656 | Goldthwaite et al. | Dec 2005 | A1 |
20060020900 | Kumagai et al. | Jan 2006 | A1 |
20060020962 | Stark et al. | Jan 2006 | A1 |
20060031776 | Glein et al. | Feb 2006 | A1 |
20060143574 | Ito | Jun 2006 | A1 |
20060197753 | Hotelling | Sep 2006 | A1 |
20060209062 | Drucker et al. | Sep 2006 | A1 |
20060265409 | Neumann et al. | Nov 2006 | A1 |
20070083911 | Madden et al. | Apr 2007 | A1 |
20070162853 | Weber et al. | Jul 2007 | A1 |
20070288863 | Ording et al. | Dec 2007 | A1 |
20080062894 | Ma et al. | Mar 2008 | A1 |
20080065638 | Brodersen et al. | Mar 2008 | A1 |
20080065720 | Brodersen et al. | Mar 2008 | A1 |
20080066010 | Brodersen et al. | Mar 2008 | A1 |
20080066013 | Brodersen et al. | Mar 2008 | A1 |
20080066110 | Brodersen et al. | Mar 2008 | A1 |
20080092168 | Longan et al. | Apr 2008 | A1 |
20080122870 | Brodersen et al. | May 2008 | A1 |
20100235792 | Brodersen et al. | Sep 2010 | A1 |
Foreign Patent Documents

Number | Date | Country |
---|---|---|
1 289 287 | Mar 2003 | EP |
1 469 375 | Oct 2004 | EP |
1 510 911 | Mar 2005 | EP |
2000-163031 | Jun 2000 | JP |
2002-342033 | Nov 2002 | JP |
WO-0033573 | Jun 2000 | WO |
WO-2008033835 | Mar 2008 | WO |
Other Publications

Entry |
---|
“Animated Image Blur,” http://web.archive.org/web/20060430062528/http://www.tutrio.com/tutorial/animated-image-blur, Apr. 30, 2006, 2 pages. |
“Fading Image Rollovers,” http://web.archive.org/web/20060111080357/http://www.javascript-fx.com/fade_rollovers/general_help/help.html, Jan. 11, 2006, 1 page. |
Final Office Action dated Feb. 3, 2009, for U.S. Appl. No. 11/530,834, filed Sep. 11, 2006, 17 pages. |
Final Office Action dated May 8, 2009, for U.S. Appl. No. 11/530,824, filed Sep. 11, 2006, 13 pages. |
Final Office Action dated May 13, 2009, for U.S. Appl. No. 11/530,808, filed Sep. 11, 2006, 11 pages. |
Final Office Action dated Apr. 14, 2010, for U.S. Appl. No. 11/530,834, filed Sep. 11, 2006, 15 pages. |
International Search Report dated Aug. 11, 2008, for PCT/US2007/078156, filed Mar. 20, 2008, 3 pages. |
Lee, S.K. et al. (Apr. 1985). “A Multi-Touch Three Dimensional Touch-Sensitive Tablet,” Proceedings of CHI: ACM Conference on Human Factors in Computing Systems, pp. 21-25. |
Non-Final Office Action dated Aug. 19, 2008, for U.S. Appl. No. 11/530,834, filed Sep. 11, 2006, 15 pages. |
Non-Final Office Action dated Nov. 26, 2008, for U.S. Appl. No. 11/530,808, filed Sep. 11, 2006, 15 pages. |
Non-Final Office Action dated Jul. 9, 2009, for U.S. Appl. No. 11/530,834, filed Sep. 11, 2006, 13 pages. |
Non-Final Office Action dated Dec. 1, 2011, for U.S. Appl. No. 12/786,843, filed May 25, 2010, 10 pages. |
Non-Final Office Action dated Apr. 29, 2013, for U.S. Appl. No. 13/038,868, filed Mar. 2, 2011, 17 pages. |
Notice of Allowance dated Feb. 22, 2010, for U.S. Appl. No. 11/530,808, filed Sep. 11, 2006, 7 pages. |
Notice of Allowance dated Jan. 24, 2011, for U.S. Appl. No. 11/530,834, filed Sep. 11, 2006, 10 pages. |
Notice of Allowance dated Jun. 22, 2012, for U.S. Appl. No. 12/786,843, filed May 25, 2010, 5 pages. |
Notice of Allowance dated Oct. 8, 2013, for U.S. Appl. No. 13/038,868, filed Mar. 2, 2011, 9 pages. |
Rubine, D.H. (Dec. 1991). “The Automatic Recognition of Gestures,” CMU-CS-91-202, Submitted in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy in Computer Science at Carnegie Mellon University, 285 pages. |
Rubine, D.H. (May 1992). “Combining Gestures and Direct Manipulation,” CHI '92, pp. 659-660. |
Westerman, W. (Spring 1999). “Hand Tracking, Finger Identification, and Chordic Manipulation on a Multi-Touch Surface,” A Dissertation Submitted to the Faculty of the University of Delaware in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy in Electrical Engineering, 364 pages. |
Related U.S. Application Data

Relation | Number | Date | Country |
---|---|---|---|
Parent | 11530834 | Sep 2006 | US |
Child | 13038868 | | US |
Relation | Number | Date | Country |
---|---|---|---|
Parent | 13038868 | Mar 2011 | US |
Child | 15046416 | | US |