Computing platforms typically support a user interface (“UI”) that enables users to interact with the platform. Computing platforms such as multimedia consoles have evolved to include more features and capabilities and to provide access to an ever-increasing array of entertainment, information, and communication options. As a result, there exists a need for UIs that provide a full set of features and capabilities while remaining easy to use, enabling users to get the most out of their computing platforms while maintaining a satisfying and rich user experience.
This Background is provided to introduce a brief context for the Summary and Detailed Description that follow. This Background is not intended to be an aid in determining the scope of the claimed subject matter nor to be viewed as limiting the claimed subject matter to implementations that solve any or all of the disadvantages or problems presented above.
A user interface (“UI”) includes a personalized home screen that can be brought up at any time from any experience provided by applications, games, movies, television, and other content that is available on a computing platform such as a multimedia console using a single button press on a controller, using a “home” gesture, or using a “home” voice command. The personalized home screen features a number of visual objects called tiles that represent the experiences available on the console. The tiles are dynamically maintained on the personalized home screen as their underlying applications run. Within the larger UI, one of the tiles on the personalized home screen is configured as a picture-in-picture (“PIP”) display that can be filled by the graphical output of an application that is currently running. Other tiles show shortcuts to the most recently used and favorite applications. An application can be “snapped” to the application that fills the PIP so that the snapped application renders into a separate window that is placed next to the UI for the filled application. That way, the user can readily engage in multitasking experiences with the snapped and filled applications both in the personalized home screen and in full screen. The user interface is further adapted so that the user can quickly and easily switch focus between the tiles in the personalized home screen and resume an experience in full screen.
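By way of illustration only, the following Python sketch models the relationships described above among the personalized home screen, its tiles, and the filled and snapped applications. All class, method, and identifier names are hypothetical and do not represent any actual platform API.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Tile:
    """A visual object representing an experience on the console."""
    app_id: str
    is_pip: bool = False   # True when the tile shows live app output

@dataclass
class HomeScreen:
    """Hypothetical model of the personalized home screen."""
    resume: Optional[Tile] = None              # PIP filled by the running app
    snapped: Optional[Tile] = None             # PIP for a snapped app
    mru: List[Tile] = field(default_factory=list)    # recently used shortcuts
    pins: List[Tile] = field(default_factory=list)   # favorite shortcuts

    def fill(self, app_id: str) -> None:
        """A running application fills the resume tile's PIP."""
        self.resume = Tile(app_id, is_pip=True)

    def snap(self, app_id: str) -> None:
        """Snap a second app so both render side by side."""
        self.snapped = Tile(app_id, is_pip=True)

screen = HomeScreen()
screen.fill("boxing_game")   # game output appears in the resume tile
screen.snap("voip_app")      # snapped app renders next to the filled app
```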
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
Like reference numerals indicate like elements in the drawings. Elements are not drawn to scale unless otherwise indicated.
Local content 116, including apps, games, and/or media content, may also be utilized and/or consumed to provide a particular user experience in the environment 100. As shown in
The user 110 can typically interact with the multimedia console 112 using a variety of different interface devices including a camera system 122 that can be used to sense visual commands, motions, and gestures, and a headset 124 or other type of microphone or audio capture device. In some cases a microphone and camera can be combined into a single device. The user may also utilize a controller 126 (shown in enlarged view in the lower left of
It is emphasized that the number of controls utilized and the features and functionalities supported by the controls in the controller 126 can vary from what is shown in
A home app 152 executes on the multimedia console 112 in this illustrative example. As shown in
The tiles in the personalized home screen can be configured as “live” tiles in some implementations so that they show or represent activity of any underlying running app/game right on the personalized home screen. For example, a news app could display news headlines, sports scores, and the like from the personalized home screen. In some cases, the tiles can be configured to enable some interactivity with the underlying application through user interaction with the tile itself. For example, a tile could be configured to expose user-accessible controls to change tracks or playlists in an underlying music app that is running on the multimedia console.
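By way of example, and not limitation, the sketch below shows one way a live tile might mirror a running music app and forward a small set of controls back to it. The MusicApp and LiveTile classes are invented for illustration and are not part of any actual platform.

```python
class MusicApp:
    """Stub of a running music app (hypothetical)."""
    name = "Music"

    def __init__(self):
        self._tracks = ["Track A", "Track B", "Track C"]
        self._index = 0

    def now_playing(self):
        return self._tracks[self._index]

    def next_track(self):
        self._index = (self._index + 1) % len(self._tracks)

class LiveTile:
    """A tile that mirrors a running app's activity and exposes
    limited controls without leaving the home screen."""

    def __init__(self, app):
        self.app = app

    def render(self):
        # Show current activity (e.g., the track now playing).
        return f"{self.app.name}: {self.app.now_playing()}"

    def on_user_action(self, action):
        # Forward a small set of actions to the underlying app.
        if action == "next_track":
            self.app.next_track()

tile = LiveTile(MusicApp())
tile.on_user_action("next_track")
print(tile.render())   # -> "Music: Track B"
```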
The user experiences 205 further include a one button to home experience 230 in which the user 110 can push the center button 146 (
Each of the user experiences 205 is described in more detail below.
The personalized home screen 305 in
Below the resume tile 302 is a row of four tiles, in this illustrative example, that represent the most recently used apps/games, referred to here as the MRU tiles 304. The particular apps/games included in the MRU tiles 304 can be expected to change over time as the user 110 launches and closes apps/games during the course of a session. In some implementations, the MRU tiles 304 can also represent links for launching their respective underlying apps/games, live currently executing applications that are running in the background, or both (an example of an application that is running in the background while shown in the MRU tiles 304 is provided below in the text accompanying
Next to the MRU tiles 304, in this illustrative example, are several rows of tiles which comprise pins 306. The pins 306 can represent the apps/games that the user 110 likes the most and/or uses the most frequently and be used as launch tiles. Typically, the system and/or home app is configured to enable the user 110 to pick which apps/games are included as pins on the personalized home screen 305. Alternatively, the system or home app may automatically populate some or all of the pins for the user 110. For example, the system/home app may apply various rules or heuristics to determine which apps/games are included as pins or analyze usage statistics, user-expressed preferences, user behaviors, or the like when populating tiles in the pins 306 on the personalized home screen 305. When a pin is selected and activated by the user, its underlying app or game can be launched. In addition, in some implementations one or more of the pins can be configured to represent a currently executing app/game with user controllability (e.g., experience control, menus, etc.) and/or implement one or more PIPs, as with the MRU tiles described above. Other apps, games, and other content can typically be browsed, selected, and launched from the menu bar 308 that is located above the resume tile 302.
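One plausible implementation of such rules or heuristics is sketched below: pins chosen by the user are honored first, and the remaining slots are filled by launch frequency. The function and its inputs are hypothetical and illustrate only one of many possible policies.

```python
from collections import Counter

def populate_pins(user_picks, usage_log, max_pins=8):
    """Hypothetical heuristic: user-chosen pins come first; the most
    frequently launched apps fill any remaining slots."""
    pins = list(user_picks)
    launch_counts = Counter(usage_log)          # app_id -> launch count
    for app_id, _ in launch_counts.most_common():
        if len(pins) >= max_pins:
            break
        if app_id not in pins:
            pins.append(app_id)
    return pins

# e.g., populate_pins(["voip_app"], ["boxing_game", "boxing_game", "news"])
# -> ["voip_app", "boxing_game", "news"]
```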
In this illustrative example, the user 110 has employed the controller 126 (
If the user 110 wishes to go back to the personalized home screen from the filled game screen shown in
As shown in
When the user 110 selects the snap app button 805 and presses the “A” button 134, for example, on the controller 126, a snap app menu 905 opens on the UI as shown in the screenshot 900 in
If the user 110 uses the center button 146 to go back to the personalized home screen at this point, then the resume tile 302 is broken down into two smaller sub-tiles 1105 and 1110 as shown in the screenshot 1100 in
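The split of the resume tile can be expressed as a simple layout rule, sketched below under assumed names; the actual layout logic may differ.

```python
def layout_resume_area(filled_app, snapped_app=None):
    """Hypothetical layout rule for the resume tile: one large PIP
    when only a filled app is running; two smaller sub-tile PIPs
    (cf. sub-tiles 1105 and 1110 above) when an app is snapped."""
    if snapped_app is None:
        return [{"app": filled_app, "size": "large"}]
    return [{"app": filled_app, "size": "small"},
            {"app": snapped_app, "size": "small"}]
```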
As shown in
As shown in the screenshot 1300 in
As shown in
In this example, the user 110 has positioned the selection indicator and pressed the “A” button 134, for example, on the controller 126 in order to answer the call. This action invokes a link to start up a VoIP app 1705 which replaces the boxing game on the UI as shown in the screenshot 1700 in
If the user presses the center button 146, for example, on the controller 126, the personalized home screen is again brought up on the UI as shown in screenshot 1800 in
In this example, the in-experience menu 1905 provides several ways to interact with the call, including muting, ending, snapping, pinning, and getting call details. As shown, the user 110 has selected “snap.” When the “A” button, for example, is pressed, the VoIP app 1705 is presented at the side of the UI (in this example, on the left side) as a snapped app that renders its experience into a smaller PIP 2005, and the previously running application, the boxing game 310, becomes the filled app as shown in the screenshot 2000 in
If the user 110 uses the center button 146 to go back to the personalized home screen at this point, and chooses to close the snapped app using the close snap button 1305 as described above, then the snapped VoIP app 1705 closes and moves to the first position in the row of MRU tiles 304 below the resume tile 302. This behavior is shown in the screenshot 2100 in
The system can be configured to inform currently running apps/games as to which of the various PIPs are being utilized to support their experiences when the personalized home screen is being displayed. That way, each running app/game has the option to adjust its experience depending on which PIP it is rendering into. For example, as shown in
While some apps and games may render their normal experience into all tiles in the same way, others may change the way they render their experiences based on the size, location, and number of PIPs currently being displayed, and/or other criteria being utilized on a given personalized home screen. For example, if rendering into a relatively small PIP on a live tile (e.g., on an MRU tile or a pin), an application may choose to simplify or modify what it renders compared to what it may render when it has a larger PIP to work with, such as with the resume tile 302 or with a PIP in a snapped experience. Alternatively, the application could choose to render something that is different from its normal output, such as advertising, an attract screen that is designed to catch a user's attention, or other objects if, for example, a tile is not particularly appropriate or suited to normal output, or as a result of a developer's design choice.
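By way of illustration, an application's rendering decision might resemble the following sketch, where Pip and the render methods are assumed names rather than an actual API.

```python
from dataclasses import dataclass

@dataclass
class Pip:
    kind: str     # e.g., "live_tile", "resume_tile", or "snapped"
    width: int    # pixels available to the app

def render_for_pip(app, pip: Pip):
    """Hypothetical per-frame decision based on the PIP the system
    reports this app is rendering into."""
    if pip.kind == "live_tile" and pip.width < 300:
        return app.render_simplified()   # small MRU tile or pin
    if pip.kind in ("resume_tile", "snapped"):
        return app.render_normal()       # larger PIP: full experience
    return app.render_attract_screen()   # alternate content, by choice
```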
At step 2305, the user 110 (
A graphics processing unit (GPU) 2408 and a video encoder/video codec (coder/decoder) 2414 form a video processing pipeline for high speed and high resolution graphics processing. Data is carried from the GPU 2408 to the video encoder/video codec 2414 via a bus. The video processing pipeline outputs data to an A/V (audio/video) port 2440 for transmission to a television or other display. A memory controller 2410 is connected to the GPU 2408 to facilitate processor access to various types of memory 2412, such as, but not limited to, a RAM.
The multimedia console 112 includes an I/O controller 2420, a system management controller 2422, an audio processing unit 2423, a network interface controller 2424, a first USB (Universal Serial Bus) host controller 2426, a second USB controller 2428, and a front panel I/O subassembly 2430 that are preferably implemented on a module 2418. The USB controllers 2426 and 2428 serve as hosts for peripheral controllers 2442(1) and 2442(2), a wireless adapter 2448, and an external memory device 2446 (e.g., Flash memory, external CD/DVD ROM drive, removable media, etc.). The network interface controller 2424 and/or wireless adapter 2448 provide access to a network (e.g., the Internet, home network, etc.) and may be any of a wide variety of wired or wireless adapter components, including an Ethernet card, a modem, a Bluetooth module, a cable modem, or the like.
System memory 2443 is provided to store application data that is loaded during the boot process. A media drive 2444 is provided and may comprise a DVD/CD drive, hard drive, or other removable media drive, etc. The media drive 2444 may be internal or external to the multimedia console 112. Application data may be accessed via the media drive 2444 for execution, playback, etc. by the multimedia console 112. The media drive 2444 is connected to the I/O controller 2420 via a bus, such as a Serial ATA bus or other high speed connection (e.g., IEEE 1394).
The system management controller 2422 provides a variety of service functions related to assuring availability of the multimedia console 112. The audio processing unit 2423 and an audio codec 2432 form a corresponding audio processing pipeline with high fidelity and stereo processing. Audio data is carried between the audio processing unit 2423 and the audio codec 2432 via a communication link. The audio processing pipeline outputs data to the A/V port 2440 for reproduction by an external audio player or device having audio capabilities.
The front panel I/O subassembly 2430 supports the functionality of the power button 2450 and the eject button 2452, as well as any LEDs (light emitting diodes) or other indicators exposed on the outer surface of the multimedia console 112. A system power supply module 2436 provides power to the components of the multimedia console 112. A fan 2438 cools the circuitry within the multimedia console 112.
The CPU 2401, GPU 2408, memory controller 2410, and various other components within the multimedia console 112 are interconnected via one or more buses, including serial and parallel buses, a memory bus, a peripheral bus, and a processor or local bus using any of a variety of bus architectures. By way of example, such architectures can include a Peripheral Component Interconnect (PCI) bus, a PCI-Express bus, etc.
When the multimedia console 112 is powered ON, application data may be loaded from the system memory 2443 into memory 2412 and/or caches 2402 and 2404 and executed on the CPU 2401. The application may present a graphical user interface that provides a consistent user experience when navigating to different media types available on the multimedia console 112. In operation, applications and/or other media contained within the media drive 2444 may be launched or played from the media drive 2444 to provide additional functionalities to the multimedia console 112.
The multimedia console 112 may be operated as a standalone system by simply connecting the system to a television or other display. In this standalone mode, the multimedia console 112 allows one or more users to interact with the system, watch movies, or listen to music. However, with the integration of broadband connectivity made available through the network interface controller 2424 or the wireless adapter 2448, the multimedia console 112 may further be operated as a participant in a larger network community.
When the multimedia console 112 is powered ON, a set amount of hardware resources is reserved for system use by the multimedia console operating system. These resources may include a reservation of memory (e.g., 16 MB), CPU and GPU cycles (e.g., 5%), networking bandwidth (e.g., 8 kbps), etc. Because these resources are reserved at system boot time, the reserved resources do not exist from the application's point of view.
In particular, the memory reservation preferably is large enough to contain the launch kernel, concurrent system applications, and drivers. The CPU reservation is preferably constant such that if the reserved CPU usage is not used by the system applications, an idle thread will consume any unused cycles.
With regard to the GPU reservation, lightweight messages generated by the system applications (e.g., pop-ups) are displayed by using a GPU interrupt to schedule code to render pop-ups into an overlay. The amount of memory needed for an overlay depends on the overlay area size and the overlay preferably scales with screen resolution. Where a full user interface is used by the concurrent system application, it is preferable to use a resolution independent of application resolution. A scaler may be used to set this resolution such that the need to change frequency and cause a TV re-sync is eliminated.
After the multimedia console 112 boots and system resources are reserved, concurrent system applications execute to provide system functionalities. The system functionalities are encapsulated in a set of system applications that execute within the reserved system resources described above. The operating system kernel identifies threads that are system application threads versus gaming application threads. The system applications are preferably scheduled to run on the CPU 2401 at predetermined times and intervals in order to provide a consistent system resource view to the application. The scheduling is to minimize cache disruption for the gaming application running on the console.
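The reservation scheme described above can be summarized in a short sketch. The specific figures (16 MB, 5%, 8 kbps) come from the examples in the text; everything else, including the names and structure, is illustrative only.

```python
# Illustrative system reservation using the example figures given above.
SYSTEM_RESERVATION = {
    "memory_mb": 16,      # launch kernel, concurrent system apps, drivers
    "cpu_percent": 5,     # held constant; an idle thread consumes any
                          # cycles the system applications leave unused
    "network_kbps": 8,
}

def resources_visible_to_game(total_memory_mb, total_cpu_percent=100):
    """Reserved resources do not exist from the game's point of view."""
    return {
        "memory_mb": total_memory_mb - SYSTEM_RESERVATION["memory_mb"],
        "cpu_percent": total_cpu_percent - SYSTEM_RESERVATION["cpu_percent"],
    }
```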
When a concurrent system application requires audio, audio processing is scheduled asynchronously to the gaming application due to time sensitivity. A multimedia console application manager (described below) controls the gaming application audio level (e.g., mute, attenuate) when system applications are active.
Input devices (e.g., controllers 2442(1) and 2442(2)) are shared by gaming applications and system applications. The input devices are not reserved resources, but are to be switched between system applications and the gaming application such that each will have a focus of the device. The application manager preferably controls the switching of the input stream without the gaming application's knowledge, and a driver maintains state information regarding focus switches.
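By way of example only, the focus-switching behavior might be modeled as follows; the class and method names are hypothetical.

```python
class InputRouter:
    """Hypothetical router: input devices are shared, not reserved;
    a driver-maintained focus flag decides where events are delivered."""

    def __init__(self):
        self.focus = "game"   # state information regarding focus

    def switch_focus(self, target):
        # The application manager flips focus without the
        # gaming application's knowledge.
        assert target in ("game", "system")
        self.focus = target

    def route(self, event, game_app, system_app):
        target = game_app if self.focus == "game" else system_app
        target.handle(event)
```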
A number of program modules may be stored on the hard disk, magnetic disk 2533, optical disk 2543, ROM 2517, or RAM 2521, including an operating system 2555, one or more application programs 2557, other program modules 2560, and program data 2563. A user may enter commands and information into the computer system 2500 through input devices such as a keyboard 2566 and pointing device 2568 such as a mouse. Other input devices (not shown) may include a microphone, joystick, game pad, satellite dish, scanner, trackball, touchpad, touch screen, touch-sensitive module or device, gesture-recognition module or device, voice recognition module or device, voice command module or device, or the like. These and other input devices are often connected to the processing unit 2505 through a serial port interface 2571 that is coupled to the system bus 2514, but may be connected by other interfaces, such as a parallel port, game port, or USB. A monitor 2573 or other type of display device is also connected to the system bus 2514 via an interface, such as a video adapter 2575. In addition to the monitor 2573, personal computers typically include other peripheral output devices (not shown), such as speakers and printers. The illustrative example shown in
The computer system 2500 is operable in a networked environment using logical connections to one or more remote computers, such as a remote computer 2588. The remote computer 2588 may be another personal computer, a server, a router, a network PC, a peer device, or other common network node, and typically includes many or all of the elements described above relative to the computer system 2500, although only a single representative remote memory/storage device 2590 is shown in
When used in a LAN networking environment, the computer system 2500 is connected to the local area network 2593 through a network interface or adapter 2596. When used in a WAN networking environment, the computer system 2500 typically includes a broadband modem 2598, network gateway, or other means for establishing communications over the wide area network 2595, such as the Internet. The broadband modem 2598, which may be internal or external, is connected to the system bus 2514 via a serial port interface 2571. In a networked environment, program modules related to the computer system 2500, or portions thereof, may be stored in the remote memory storage device 2590. It is noted that the network connections shown in
The architecture 2600 illustrated in
The mass storage device 2612 is connected to the CPU 2602 through a mass storage controller (not shown) connected to the bus 2610. The mass storage device 2612 and its associated computer-readable storage media provide non-volatile storage for the architecture 2600. Although the description of computer-readable storage media contained herein refers to a mass storage device, such as a hard disk or CD-ROM drive, it should be appreciated by those skilled in the art that computer-readable media can be any available computer storage media that can be accessed by the architecture 2600.
By way of example, and not limitation, computer-readable storage media may include volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data. For example, computer-readable media includes, but is not limited to, RAM, ROM, EPROM (erasable programmable read only memory), EEPROM (electrically erasable programmable read only memory), Flash memory or other solid state memory technology, CD-ROM, DVDs, HD-DVD (High Definition DVD), BLU-RAY, or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by the architecture 2600.
According to various embodiments, the architecture 2600 may operate in a networked environment using logical connections to remote computers through a network. The architecture 2600 may connect to the network through a network interface unit 2616 connected to the bus 2610. It should be appreciated that the network interface unit 2616 also may be utilized to connect to other types of networks and remote computer systems. The architecture 2600 also may include an input/output controller 2618 for receiving and processing input from a number of other devices, including a keyboard, mouse, or electronic stylus (not shown in
It should be appreciated that the software components described herein may, when loaded into the CPU 2602 and executed, transform the CPU 2602 and the overall architecture 2600 from a general-purpose computing system into a special-purpose computing system customized to facilitate the functionality presented herein. The CPU 2602 may be constructed from any number of transistors or other discrete circuit elements, which may individually or collectively assume any number of states. More specifically, the CPU 2602 may operate as a finite-state machine, in response to executable instructions contained within the software modules disclosed herein. These computer-executable instructions may transform the CPU 2602 by specifying how the CPU 2602 transitions between states, thereby transforming the transistors or other discrete hardware elements constituting the CPU 2602.
Encoding the software modules presented herein also may transform the physical structure of the computer-readable storage media presented herein. The specific transformation of physical structure may depend on various factors, in different implementations of this description. Examples of such factors may include, but are not limited to, the technology used to implement the computer-readable storage media, whether the computer-readable storage media is characterized as primary or secondary storage, and the like. For example, if the computer-readable storage media is implemented as semiconductor-based memory, the software disclosed herein may be encoded on the computer-readable storage media by transforming the physical state of the semiconductor memory. For example, the software may transform the state of transistors, capacitors, or other discrete circuit elements constituting the semiconductor memory. The software also may transform the physical state of such components in order to store data thereupon.
As another example, the computer-readable storage media disclosed herein may be implemented using magnetic or optical technology. In such implementations, the software presented herein may transform the physical state of magnetic or optical media, when the software is encoded therein. These transformations may include altering the magnetic characteristics of particular locations within given magnetic media. These transformations also may include altering the physical features or characteristics of particular locations within given optical media to change the optical characteristics of those locations. Other transformations of physical media are possible without departing from the scope and spirit of the present description, with the foregoing examples provided only to facilitate this discussion.
In light of the above, it should be appreciated that many types of physical transformations take place in the architecture 2600 in order to store and execute the software components presented herein. It also should be appreciated that the architecture 2600 may include other types of computing devices, including hand-held computers, embedded computer systems, smartphones, PDAs, and other types of computing devices known to those skilled in the art. It is also contemplated that the architecture 2600 may not include all of the components shown in
As shown in
Various techniques may be utilized to capture depth video frames. For example, in time-of-flight analysis, the IR light component 2710 of the camera system 122 may emit an infrared light onto the capture area and may then detect the backscattered light from the surface of one or more targets and objects in the capture area using, for example, the IR camera 2715 and/or the RGB camera 2720. In some embodiments, pulsed infrared light may be used such that the time between an outgoing light pulse and a corresponding incoming light pulse may be measured and used to determine a physical distance from the camera system 122 to a particular location on the targets or objects in the capture area. Additionally, the phase of the outgoing light wave may be compared to the phase of the incoming light wave to determine a phase shift. The phase shift may then be used to determine a physical distance from the capture device to a particular location on the targets or objects. Time-of-flight analysis may be used to indirectly determine a physical distance from the camera system 122 to a particular location on the targets or objects by analyzing the intensity of the reflected beam of light over time via various techniques including, for example, shuttered light pulse imaging.
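Both time-of-flight techniques reduce to simple relations, sketched below: distance is half the round-trip time multiplied by the speed of light, and a measured phase shift maps to distance within the modulation's ambiguity range. The code is purely illustrative.

```python
import math

C = 299_792_458.0   # speed of light, m/s

def distance_from_pulse(round_trip_s: float) -> float:
    """Pulsed ToF: the light travels out and back, so halve the path."""
    return C * round_trip_s / 2.0

def distance_from_phase(phase_shift_rad: float, mod_freq_hz: float) -> float:
    """Phase-shift ToF: a 2*pi shift equals one ambiguity range,
    C / (2 * modulation frequency)."""
    return (phase_shift_rad / (2.0 * math.pi)) * C / (2.0 * mod_freq_hz)

# e.g., a 20 ns round trip -> ~3.0 m;
# a pi/2 shift at 10 MHz modulation -> ~3.75 m
```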
In other implementations, the camera system 122 may use structured light to capture depth information. In such an analysis, patterned light (i.e., light displayed as a known pattern such as a grid pattern or a stripe pattern) may be projected onto the capture area via, for example, the IR light component 2710. Upon striking the surface of one or more targets or objects in the capture area, the pattern may become deformed in response. Such a deformation of the pattern may be captured by, for example, the IR camera 2715 and/or the RGB camera 2720 and may then be analyzed to determine a physical distance from the capture device to a particular location on the targets or objects.
The camera system 122 may utilize two or more physically separated cameras that may view a capture area from different angles to obtain visual stereo data that may be resolved to generate depth information. Other types of depth image arrangements using single or multiple cameras can also be used to create a depth image. The camera system 122 may further include a microphone 2725. The microphone 2725 may include a transducer or sensor that may receive and convert sound into an electrical signal. The microphone 2725 may be used to reduce feedback between the camera system 122 and the multimedia console 112 in the target recognition, analysis, and tracking system 2700. Additionally, the microphone 2725 may be used to receive audio signals that may also be provided by the user 110 to control applications such as game applications, non-game applications, or the like that may be executed by the multimedia console 112.
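Whether the second viewpoint is another camera (stereo) or a structured-light projector, depth recovery ultimately rests on triangulation: a feature's lateral displacement (disparity) between viewpoints is inversely proportional to its depth. The idealized pinhole-model sketch below illustrates the relation.

```python
def depth_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Idealized triangulation: depth = focal * baseline / disparity.
    The same relation underlies stereo pairs and projector/camera
    structured-light setups."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# e.g., 600 px focal length, 7.5 cm baseline, 30 px disparity -> 1.5 m
```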
The camera system 122 may further include a processor 2730 that may be in operative communication with the image camera component 2705 over a bus 2740. The processor 2730 may include a standardized processor, a specialized processor, a microprocessor, or the like that may execute instructions that may include instructions for storing profiles, receiving the depth image, determining whether a suitable target may be included in the depth image, converting the suitable target into a skeletal representation or model of the target, or any other suitable instruction. The camera system 122 may further include a memory component 2740 that may store the instructions that may be executed by the processor 2730, images or frames of images captured by the cameras, user profiles or any other suitable information, images, or the like. According to one example, the memory component 2740 may include RAM, ROM, cache, Flash memory, a hard disk, or any other suitable storage component. As shown in
The camera system 122 operatively communicates with the multimedia console 112 over a communication link 2745. The communication link 2745 may be a wired connection including, for example, a USB (Universal Serial Bus) connection, a Firewire connection, an Ethernet cable connection, or the like and/or a wireless connection such as a wireless IEEE 802.11 connection. The multimedia console 112 can provide a clock to the camera system 122 that may be used to determine when to capture, for example, a scene via the communication link 2745. The camera system 122 may provide the depth information and images captured by, for example, the IR camera 2715 and/or the RGB camera 2720, including a skeletal model and/or facial tracking model that may be generated by the camera system 122, to the multimedia console 112 via the communication link 2745. The multimedia console 112 may then use the skeletal and/or facial tracking models, depth information, and captured images to, for example, create a virtual screen, adapt the user interface, and control apps/games 2750.
A motion tracking engine 2755 uses the skeletal and/or facial tracking models and the depth information to provide a control output to one or more apps/games 2750 running on the multimedia console 112 to which the camera system 122 is coupled. The information may also be used by a gesture recognition engine 2760, depth image processing engine 2765, and/or operating system 2770.
The depth image processing engine 2765 uses the depth images to track motion of objects, such as the user and other objects. The depth image processing engine 2765 will typically report to the operating system 2770 an identification of each object detected and the location of the object for each frame. The operating system 2770 can use that information to update the position or movement of an avatar, for example, or other images shown on the display 150, or to perform an action on the user interface.
The gesture recognition engine 2760 may utilize a gestures library (not shown) that can include a collection of gesture filters, each comprising information concerning a gesture that may be performed, for example, by a skeletal model (as the user moves). The gesture recognition engine 2760 may compare the frames captured by the camera system 122, in the form of the skeletal model and movements associated with it, to the gesture filters in the gesture library to identify when a user (as represented by the skeletal model) has performed one or more gestures. Those gestures may be associated with various controls of an application and may direct the system to open the personalized home screen as described above. Thus, the multimedia console 112 may employ the gestures library to interpret movements of the skeletal model and to control an operating system or an application running on the multimedia console based on the movements.
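By way of illustration, the comparison against the gesture filters might look like the following sketch, in which GestureFilter and its scoring are invented stand-ins for the library described above.

```python
class GestureFilter:
    """Stub filter: a gesture name plus a match() heuristic."""
    def __init__(self, name, predicate):
        self.name = name
        self._predicate = predicate

    def match(self, frames):
        # Return a similarity score in [0, 1] for the frame sequence.
        return 1.0 if self._predicate(frames) else 0.0

def recognize(frames, filters, threshold=0.8):
    """Compare captured skeletal frames against each filter and
    report any gesture scoring above the threshold."""
    return [f.name for f in filters if f.match(frames) >= threshold]

# A recognized "home" gesture could then direct the system to open
# the personalized home screen as described above.
```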
In some implementations, various aspects of the functionalities provided by the apps/games 2750, motion tracking engine 2755, gesture recognition engine 2760, depth image processing engine 2765, and/or operating system 2770 may be directly implemented on the camera system 122 itself.
Based on the foregoing, it should be appreciated that technologies for multitasking experiences with interactive PIP have been disclosed herein. Although the subject matter presented herein has been described in language specific to computer structural features, methodological and transformative acts, specific computing machinery, and computer-readable storage media, it is to be understood that the invention defined in the appended claims is not necessarily limited to the specific features, acts, or media described herein. Rather, the specific features, acts, and mediums are disclosed as example forms of implementing the claims.
The subject matter described above is provided by way of illustration only and should not be construed as limiting. Various modifications and changes may be made to the subject matter described herein without following the example embodiments and applications illustrated and described, and without departing from the true spirit and scope of the present invention, which is set forth in the following claims.