Graphical user interfaces (GUIs) simplify user interaction with computer programs and are designed so that knowledge of specific commands or keystroke combinations is not required to use a program efficiently and effectively. Thus, a function can be carried out by the computer application that owns the GUI simply by selecting or clicking a particular on-screen option with a mouse.
In a typical window-based GUI system, a group of visually distinct display objects, commonly referred to as “icons”, is provided on the display to form a menu screen. Each icon represents a function or object, and may be configured as a pointer or symbol connecting the function or object to a file or contents. Presenting a file or its contents to a user in multiple windows on a display device with a GUI is known in the art. The technique of using a pointing device, such as a mouse or a trackball, to select data within the file before a function is applied to that data is also known in the art. Further, the method of using the pointing device to provide graphical input to a window is also well known.
However, these techniques present various difficulties when the GUI menu screen is invoked while the selected medium is playing or executing on the display. Further, the difficulties may vary according to the type of selected media.
The present invention includes systems and methods that provide control of multimedia contents displayed in the background when the foreground graphical user interface (GUI) is invoked. In one implementation, a contents display control system includes: a media type determination unit configured to determine a media type of selected media contents; a contents display parameter adjustment unit operating to generate at least one adjustment value for at least one display parameter of the selected media contents; and a contents display processor configured to control display of the selected media contents in the background when graphical user interface is present in the foreground, the contents display processor controlling the display by appropriately adjusting the at least one display parameter of the selected media contents with the at least one adjustment value for the determined media type.
In another implementation, a contents display control method includes: determining a media type of selected media contents; generating at least one adjustment value for at least one display parameter of the selected media contents; and controlling display of the selected media contents in the background when graphical user interface is present in the foreground, by appropriately adjusting the at least one display parameter of the selected media contents with the at least one adjustment value for the determined media type.
In another implementation, a computer program, stored in a tangible storage medium, for use in controlling contents display includes executable instructions that cause a computer to: determine a media type of selected media contents; generate at least one adjustment value for at least one display parameter of the selected media contents; and control display of the selected media contents in the background when graphical user interface is present in the foreground, by appropriately adjusting the at least one display parameter of the selected media contents with the at least one adjustment value for the determined media type.
In another implementation, a contents display control apparatus includes: means for determining a media type of selected media contents; means for generating at least one adjustment value for at least one display parameter of the selected media contents; and means for controlling display of the selected media contents in the background when graphical user interface is present in the foreground, by appropriately adjusting the at least one display parameter of the selected media contents with the at least one adjustment value for the determined media type.
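The three recited elements (determining the media type, generating an adjustment value, and controlling the background display) can be sketched as a short pipeline. This is a minimal sketch: the function names and the example adjustment values (display speed only) are illustrative assumptions, not part of the disclosure.

```python
# Illustrative adjustment values per media type. Display speed is used as
# the single adjusted parameter here; the disclosure also names color,
# size, brightness, saturation, and hue.
ADJUSTMENTS = {
    "broadcast_video": {"speed": 1.0},   # keep full motion
    "stored_video":    {"speed": 0.0},   # freeze frame
    "music_animation": {"speed": 0.25},  # slow motion
    "slide_show":      {"speed": 0.0},   # hold the last slide
}

def determine_media_type(contents):
    # In practice derived from the media source; assumed here to be a field.
    return contents["media_type"]

def generate_adjustment(media_type):
    # At least one adjustment value for at least one display parameter.
    return ADJUSTMENTS.get(media_type, {})

def control_contents_display(contents, gui_in_foreground):
    """Adjust the display parameters of the selected contents when the
    GUI menu is present in the foreground; otherwise leave them as-is."""
    adjusted = dict(contents)
    if gui_in_foreground:
        adjusted.update(generate_adjustment(determine_media_type(contents)))
    return adjusted
```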
As used in this disclosure, the term “contents” can refer to multimedia contents including moving images, audio, and text. The term “media” can refer to means for carrying or transmitting information, such as hard disks, optical disks, memory cards, and broadcast lines, and can represent data formats such as compression formats.
This disclosure describes systems and methods that provide control of multimedia contents displayed in the background when the graphical user interface (GUI) is invoked in the foreground. The disclosure provides a plurality of implementations to enable context-sensitive media display under the activated GUI menu screen. Furthermore, when the GUI menu screen is closed, the media display is allowed to continue from the previously “paused” state. In particular, various implementations of the contents display control systems and methods are described for multimedia apparatuses, such as game consoles and media controllers.
The multimedia processing apparatus 102 receives multimedia contents from various media sources, such as broadcast media, the Internet media, an optical disk 110, and a memory card 112. Contents from the broadcast media can be received through line 106, while contents from the Internet media can be received through line 108. The contents from the broadcast media and the Internet media can be recorded and stored by the multimedia processing apparatus 102. The received contents can also be used by various functions (e.g., a game) of the multimedia processing apparatus 102.
The received multimedia contents are displayed on the display 104. The controller 114 allows the user to input various instructions related to multimedia processing, and to control functions of the multimedia processing apparatus 102.
The controller 114 includes a direction-determining unit 222 for determining one or a combination of four directions (i.e., an upward direction, a downward direction, a left direction, and a right direction) from the user input; and an instruction-determining unit 224 for determining an instruction from the user input. The instruction may include a command to present a multimedia content, to terminate the presentation, to invoke a menu screen, and to issue other related commands and/or instructions. Output of the controller 114 is directed to the display output unit 202, the display control unit 204, and the game processor 206.
In the illustrated implementations of
In one implementation, the direction-determining unit 222 may determine the diagonal movements of the button as a binary command in which the movement is ascertained to be in one of two directions. Thus, a diagonal movement between the up direction and the right direction can be ascertained to be in either the up or the right direction. In another implementation, the direction-determining unit 222 may determine the diagonal movements of the button as an analog command in which the movement is ascertained to be in a particular direction up to the accuracy of the measurement. Thus, a diagonal movement between the up direction and the right direction can be ascertained to be in a northeasterly direction.
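The two modes can be sketched as follows. This is a minimal illustration, not the disclosed implementation: the coordinate convention (x positive to the right, y positive upward) and the function names are assumptions.

```python
import math

def binary_direction(x, y):
    """Binary mode: snap a joystick deflection to one of four directions.
    A diagonal is resolved to whichever axis component dominates."""
    if abs(x) >= abs(y):
        return "right" if x >= 0 else "left"
    return "up" if y >= 0 else "down"

def analog_direction(x, y):
    """Analog mode: return the measured angle in degrees, limited only by
    the accuracy of the measurement (0 = right, 90 = up)."""
    return math.degrees(math.atan2(y, x))
```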
The data I/O unit 200 includes a broadcast input unit 212 for inputting broadcast contents via the television line 106; a network communication unit 214 for inputting and outputting data such as web contents via the Internet line 108; a disk reading unit 216 for inputting data stored on a disk 110; and a memory card reading unit 218 for inputting and outputting data to/from a memory card 112. Output of the data I/O unit 200 is directed to the display output unit 202, the display control unit 204, the game processor 206, and the storage unit 208.
The display output unit 202 includes a decoder 232, a synthesizer 234, an output buffer 236, and an on-screen buffer 238. The decoder 232 decodes input data received from the data I/O unit 200 or the storage unit 208. The input data may include broadcast contents, movies, and music. The synthesizer 234 processes the decoded input data based on user direction/instruction received from the controller 114. The output of the synthesizer 234 is stored in the output buffer 236. The on-screen buffer 238 stores image data of the menu screen generated by the display control unit 204. The output of the display output unit 202 is transmitted to the display 104.
The display control unit 204 includes a menu manager 242, an effects processor 244, a contents controller 246, and an image generator 248. The menu manager 242 manages media items and multimedia contents received from the storage unit 208 and the data I/O unit 200, and shown on the menu screen. The effects processor 244 processes operation of icons and icon arrays on the menu screen. The effects processor 244 also manages various actions and effects to be displayed on the menu screen. The contents controller 246 controls processing of media items and multimedia contents, and handling of data from the data I/O unit, the storage unit 208, and the game processor 206. The image generator 248 operates to generate a menu screen including a medium icon array and a contents icon array.
The game processor 206 executes a game program using data read from the data I/O unit 200 or from the storage unit 208. The game processor 206 executes the game program based on user instructions received from the controller 114. The display data of the executed game program is transmitted to the display output unit 202.
The two-dimensional array includes a medium icon array 304 arranged in a horizontal direction, and a contents icon array 306 arranged in a vertical direction. In other implementations, the arrays 304, 306 can be arranged in different directions. The medium icon array 304 and the contents icon array 306 intersect near the center area 308 of the menu screen 302. The medium icon array 304 includes a plurality of medium icons, and the contents icon array 306 includes a plurality of contents icons. The icons can be provided by the apparatus, selected by a user, or retrieved from media.
In
The medium icons 312-324 can be moved or scrolled across the menu screen 302 (e.g., see 330) by horizontally moving the button/joystick on the controller 114. A particular medium icon, for example, a video icon 316 in
The effects processor 244 in the display control unit 204 manipulates the medium icon array 304 in the menu screen 302 by scrolling the medium icons in a horizontal direction. Since the medium icons 312-324 in the medium icon array 304 are organized in a circular database, every medium icon in the medium icon array 304 can be selected and displayed by the effects processor 244 by continuously scrolling in one direction. For example, although the photo icon 312 is to the left of the center area 308 of the menu screen, the icon 312 can be moved into the center area 308 by continuously scrolling left. Alternatively, the medium icons can be arranged in a linear list.
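The circular organization of the medium icon array can be sketched with modular indexing; continuous scrolling in one direction eventually brings every icon into the center area. The class name and the left/right scrolling convention below are assumptions for illustration.

```python
class CircularIconArray:
    """Sketch of a circularly organized icon array: the center index wraps
    around, so scrolling in one direction cycles through every icon."""

    def __init__(self, icons, center=0):
        self.icons = list(icons)
        self.center = center  # index of the icon fixed in the center area

    def scroll_left(self, steps=1):
        # Scrolling the array left moves later icons into the center.
        self.center = (self.center + steps) % len(self.icons)

    def scroll_right(self, steps=1):
        self.center = (self.center - steps) % len(self.icons)

    def centered_icon(self):
        return self.icons[self.center]
```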
As described above, the effects processor 244 displays the medium icons with the same display parameters while the icons are being scrolled. However, when a medium icon is moved into and fixed in the center area 308 of the menu screen 302, the effects processor 244 may change the display parameters for easy viewing. The display parameters can include color, size, brightness, saturation, and/or hue. The display parameters can also include special effects, such as a flashing or blinking action.
In
Although
The TV icon 318 can be selected when a television program received from the TV line 106 is to be viewed or processed. Thus, the contents icon array may include thumbnail icons of broadcast channels and/or programs. Attributes of the television program such as a name/number of the broadcast channel, a title of the program, and a broadcast time can be displayed. The DVD icon 320 can be selected when video and/or audio stored on the optical disk 110 is to be viewed and/or listened to. When the optical disk 110 is recognized as a DVD, a legend “DVD” is displayed on the medium icon 320. Otherwise, when the optical disk is recognized as a CD, a legend “CD” is displayed on the medium icon 320. In one implementation, when a moving image is stored on the DVD or the CD, a thumbnail of a short video clip can be used as a contents icon. In another implementation, when music is stored on the DVD or the CD, a short audio clip of the music can be used as a contents icon.
The Web icon 322 can be selected when data from the Internet line 108 is to be processed or displayed. Thus in this case, the contents icon array may include thumbnail icons of Web sites or links. Attributes of the Web sites such as a URL of the Web site can be displayed adjacent to the selected icon. The game icon 324 can be selected when a game program is to be played or executed. Thus in this case, the contents icon array may include thumbnail icons of different game programs. Attributes of the game program such as a title of the game can be displayed adjacent to the selected icon.
In general, the thumbnail contents icons are still images representing the linked multimedia contents files. However, the thumbnail icons can be a sequence of animated images, which may provide better representation of the contents files. In one implementation, the contents icons are retrieved from data of the contents files (e.g., from thumbnail data stored with the contents data).
In
The effects processor 244 enlarges the icon 340 when the icon is positioned into the attention area 310. The display parameters can include color, size, brightness, saturation, and/or hue. The display parameters can also include special effects, such as a flashing or blinking action. Further, when the video contents icon 340 is positioned into the attention area 310, attributes 350 associated with the icon 340 are displayed adjacent to the icon. For example, the attributes 350 can include a title and a recording date.
When the controller 114 provides a command/instruction to select a particular contents icon or thumbnail 340 (e.g., by entering a select or play command while the icon 340 is positioned in the attention area 310), the image generator 248 in the display control unit 204 removes the menu screen 302 from the display 300. Substantially simultaneously, the contents controller 246 in the display control unit 204 initiates the display of the contents file linked to the selected contents icon 340. In the illustrated implementation, the selected contents file is the Singing Quartet video.
Once the selected video is playing, a command/instruction from the controller 114 to bring the menu screen 302 back up causes the menu screen to be superimposed on top of the currently playing video, as shown in
The contents controller 246 includes a media type determination unit 502, a contents display parameter adjustment unit 504, and a contents display processor 506. The media type determination unit 502 receives an input and determines the media type of the media contents being displayed on the display 300. In one implementation, the input includes data indicating the media type. In this implementation, the media type determination unit 502 passes the received input data directly to the contents display parameter adjustment unit 504. In another implementation, the input includes data related to the media type. In this implementation, the media type determination unit 502 processes the received input data to determine the media type. The media type is then transmitted to the contents display parameter adjustment unit 504.
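The two implementations of the media type determination unit 502 (passing through data that already indicates the media type, or deriving the type from data related to it) can be sketched as follows. The field names and the source-to-type mapping are hypothetical, introduced only for illustration.

```python
def media_type_from_input(input_data):
    """Sketch of the media type determination unit 502."""
    # First implementation: the input data directly indicates the media
    # type, so it is passed through unchanged.
    if "media_type" in input_data:
        return input_data["media_type"]
    # Second implementation: the input data is related to the media type,
    # so the unit processes it to determine the type (assumed mapping).
    source = input_data.get("source")
    if source in ("broadcast", "stream"):
        return "broadcast_video"
    if source in ("dvd", "dvr"):
        return "stored_video"
    return "unknown"
```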
The contents display parameter adjustment unit 504 receives the media type from the media type determination unit 502 and operates to generate at least one adjustment value for at least one display parameter of the displayed media contents. The display parameters can include color, size, brightness, saturation, hue, and other related parameters, which may include the display speed for video. For example, in the illustrated implementation of
In one implementation, the contents display parameter adjustment unit 504 is configured to display the video in full motion if the currently displayed media contents item is broadcast or streaming video. In another implementation, the contents display parameter adjustment unit 504 is configured to display the video in a freeze frame mode if the currently displayed media contents item is stored video such as video on DVD or video from a hard disk (e.g., DVR). In another implementation, the contents display parameter adjustment unit 504 is configured to display the music animation/graphics in slow motion if the currently displayed media contents item is music animation/graphics. In another implementation, if the currently displayed media contents item is a slide presentation, the contents display parameter adjustment unit 504 is configured to complete the last slide transition and display the last slide in still image.
The contents display processor 506 receives the media type from the unit 502 and the display adjustment parameter/value from the unit 504. Using these, the contents display processor 506 appropriately adjusts the media contents currently displayed in the background when the GUI menu screen is invoked to be displayed in the foreground.
If the currently displayed media contents item is determined, at 606, to be broadcast or streaming video, then the video is displayed in full motion in the background, at 608. Otherwise, if the currently displayed media contents item is determined, at 610, to be stored video, then the video is displayed in a freeze frame mode in the background, at 612. Otherwise, if the currently displayed media contents item is determined, at 614, to be music animation/graphics, then the animation/graphics is played in slow motion in the background, at 616. Otherwise, if the currently displayed media contents item is determined, at 618, to be a slide presentation, then the last slide transition is completed, at 620. The last slide is then displayed in still image, at 622.
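The decision sequence above can be sketched as a chain of tests mirroring steps 606 through 622. The returned mode names are illustrative assumptions, not taken from the disclosure.

```python
def background_display_mode(media_type):
    """Mirror the decision sequence (steps 606-622): choose how the
    background contents are displayed while the menu is in the foreground."""
    if media_type == "broadcast_video":     # step 606
        return "full_motion"                # step 608
    if media_type == "stored_video":        # step 610
        return "freeze_frame"               # step 612
    if media_type == "music_animation":     # step 614
        return "slow_motion"                # step 616
    if media_type == "slide_show":          # step 618
        # complete the last slide transition (620), then hold a still (622)
        return "complete_transition_then_still"
    return "unchanged"
```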
In other implementations, more, fewer, or different display adjustments are available, such as adding text to the screen (e.g., “PAUSE”) or resizing the displayed contents to match a portion of the screen not obscured by the menu. When the menu screen is no longer displayed (e.g., through an exit command), the contents are returned to their former mode of display.
Various implementations of the invention are realized in electronic hardware, computer software, or combinations of these technologies. Most implementations include one or more computer programs executed by a programmable computer. For example, in one implementation, the system for control of selected multimedia contents displayed in the background while the graphical user interface (GUI) is present in the foreground includes one or more computers executing software implementing the control of selected multimedia contents displayed in the background as discussed above. In general, each computer includes one or more processors, one or more data-storage components (e.g., volatile or non-volatile memory modules and persistent optical and magnetic storage devices, such as hard and floppy disk drives, CD-ROM drives, and magnetic tape drives), one or more input devices (e.g., mice and keyboards), and one or more output devices (e.g., display consoles and printers).
The computer programs include executable code that is usually stored in a persistent storage medium and then copied into memory at run-time. The processor executes the code by retrieving program instructions from memory in a prescribed order. When executing the program code, the computer receives data from the input and/or storage devices, performs operations on the data, and then delivers the resulting data to the output and/or storage devices.
Although various illustrative implementations of the present invention have been described, one of ordinary skill in the art will see that additional implementations are also possible and within the scope of the present invention. For example, while the contents controller 246 shown in
Accordingly, the present invention is not limited to only those implementations described above.
This application claims the benefit of priority of U.S. Provisional Patent Application Ser. No. 60/570,343, entitled “Control of Background Media when Foreground Graphical User Interface is Invoked”, filed May 11, 2004. Benefit of priority of the filing date of May 11, 2004 is hereby claimed, and the disclosure of the Provisional Patent Application is hereby incorporated by reference.
Publication: US 2005/0257169 A1, Nov. 2005.