Video game consoles employ controllers to allow users to interface with software, such as video games. A typical controller has a number of controls. For example, a gamepad type controller will typically incorporate one or more directional controls, such as a number of buttons arranged in a directional keypad, one or more analog sticks, or a combination of such controls. In addition to the directional controls, a controller will typically include one or more action buttons that may be located on the face or shoulders of the controller. In some cases, directional controls may provide action selection functionality as well. For example, the analog sticks in a controller compatible with the XBOX 360® brand video game console, available from Microsoft Corp. of Redmond, Wash., can be pushed in as well as moved directionally.
In addition to controlling the movement and actions of a character in a video game, a controller can be used to navigate a graphic user interface, such as the dashboard presented to users of the XBOX 360® brand video game console. The graphic user interface may include a number of menus and sub-menus that allow a user to, for example, execute game software, access media resources such as image, video, or audio files or media discs, configure system settings, etc. Navigating the graphic user interface is conventionally accomplished via a combination of directional navigation commands, e.g., left, right, up, and down, input using the directional controls and action buttons. While this control scheme can work well for traditional gamepad type controllers, it relies on the use of several buttons and is thus not well-suited for controllers that lack the buttons typically found on gamepad type controllers.
For example, some video games involve singing into a microphone type controller. The limited space available on the surface of the microphone type controller limits the number of buttons that can be implemented on the microphone type controller. In addition, the presence of buttons on such a controller may detract from the overall gaming experience by reducing the degree of realism of the controller.
According to various embodiments, a game controller, such as a microphone controller, incorporates motion sensors that are configured to detect gestures performed by a user of the game controller. The gestures can be used to navigate and perform actions in a graphic user interface that a game console employs to provide a consistent user experience when navigating to different media types available on the game console.
One embodiment is directed to a method for using a game controller to navigate a graphic user interface presented by a video game console to a user. A motion of the game controller is detected and is recognized as a gesture. An operational mode in which the game controller is operating is then determined. If the game controller is operating in a first operational mode, a navigation command corresponding to the recognized gesture is executed in the graphic user interface. On the other hand, if the game controller is operating in a second operational mode, an action corresponding to the recognized gesture is performed in the graphic user interface. This method may be performed by a computer executing instructions stored on a computer readable storage medium.
Another embodiment is directed to a game controller for use with a video game console. The game controller includes a microcontroller in electrical communication with at least one motion sensor configured to detect motion of the game controller. While not required, in some embodiments, the at least one motion sensor is incorporated in the game controller. The microcontroller or the video game console is configured to recognize the detected motion as a gesture. If the game controller is operating in a first operational mode, the recognized gesture is mapped to a navigation command that is executed in a graphic user interface presented by the video game console to a user. If the game controller is operating in a second operational mode, the recognized gesture is mapped to an action that is performed in the graphic user interface.
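As a rough, non-limiting illustration of the two-branch logic summarized above, the following sketch expresses the method as a small function. All names (process_controller_motion, the mode labels, and the stand-in callables) are hypothetical and do not reflect any actual console or controller API.

```python
def process_controller_motion(motion_samples, mode, recognize, navigate, act):
    """Skeleton of the method: detect motion, recognize a gesture, then branch
    on the operational mode in which the game controller is operating."""
    gesture = recognize(motion_samples)   # the detected motion is recognized as a gesture
    if gesture is None:
        return                            # no deliberate movement detected
    if mode == "navigation":              # first operational mode
        navigate(gesture)                 # execute a navigation command in the GUI
    else:                                 # second operational mode
        act(gesture)                      # perform an action in the GUI

# Minimal usage with stand-in callables:
process_controller_motion(
    [(0.0, 1.0, 0.0)],                                     # one (x, y, z) motion sample
    mode="navigation",
    recognize=lambda samples: "UP",                        # stand-in recognizer
    navigate=lambda g: print(f"navigation command: {g}"),
    act=lambda g: print(f"action: {g}"),
)
```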
Various embodiments may realize certain advantages. For example, by using gestures to perform navigation commands and actions in the graphic user interface, the game controller avoids the need to use a separate gamepad-type controller to navigate the graphic user interface, while also avoiding the need to incorporate additional buttons. As a result, the sense of realism and the overall gaming experience may be enhanced.
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
The illustrative embodiments will be better understood after reading the following detailed description with reference to the appended drawings, in which:
The inventive subject matter is described with specificity to meet statutory requirements. However, the description itself is not intended to limit the scope of this patent. Rather, it is contemplated that the claimed subject matter might also be embodied in other ways, to include different steps or combinations of steps similar to the ones described in this document, in conjunction with other present or future technologies.
With reference to
Computer 110 typically includes a variety of computer readable media. Computer readable media can be any available media that can be accessed by computer 110 and includes both volatile and nonvolatile media, removable and non-removable media. By way of example, and not limitation, computer readable media may comprise computer storage media and communication media. Computer storage media include both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CDROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computer 110. Communication media typically embody computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and include any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. Combinations of any of the above should also be included within the scope of computer readable media.
The system memory 130 includes computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) 131 and random access memory (RAM) 132. A basic input/output system 133 (BIOS), containing the basic routines that help to transfer information between elements within computer 110, such as during start-up, is typically stored in ROM 131. RAM 132 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 120. By way of example, and not limitation,
The computer 110 may also include other removable/non-removable, volatile/nonvolatile computer storage media. By way of example only,
The drives and their associated computer storage media discussed above and illustrated in
The computer 110 may operate in a networked or distributed environment using logical connections to one or more remote computers, such as a remote computer 180. The remote computer 180 may be a personal computer, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computer 110, although only a memory storage device 181 has been illustrated in
When used in a LAN networking environment, the computer 110 is connected to the LAN 171 through a network interface or adapter 170. When used in a WAN networking environment, the computer 110 typically includes a modem 172 or other means for establishing communications over the WAN 173, such as the Internet. The modem 172, which may be internal or external, may be connected to the system bus 121 via the user input interface 160, or other appropriate mechanism. In a networked environment, program modules depicted relative to the computer 110, or portions thereof, may be stored in the remote memory storage device. By way of example, and not limitation,
It will be appreciated that one particular application of computer 110 is in the form of a game console 200 as depicted in
A graphics processing unit (GPU) 208 and a video encoder/video codec (coder/decoder) 214 form a video processing pipeline for high speed and high resolution graphics processing. Data is carried from the graphics processing unit 208 to the video encoder/video codec 214 via a bus. The video processing pipeline outputs data to an A/V (audio/video) port 240 for transmission to a television or other display device. A memory controller 210 is connected to the GPU 208 and CPU 201 to facilitate processor access to various types of memory 212, such as, but not limited to, a RAM (Random Access Memory).
Game console 200 includes an I/O controller 220, a system management controller 222, an audio processing unit 223, a network interface controller 224, a first USB controller 226, a second USB controller 228, and a front panel I/O subassembly 230 that may be implemented on a module 218. The USB controllers 226 and 228 serve as hosts for peripheral controllers 242(1)-242(2), a wireless adapter 248, and an external memory unit 246 (e.g., flash memory, external CD/DVD ROM drive, removable media, etc.). The network interface 224 and/or wireless adapter 248 provide access to a network (e.g., the Internet, a home network, etc.) and may be any of a wide variety of wired or wireless interface components including an Ethernet card, a modem, a Bluetooth module, a cable modem, and the like. The game console 200 may be connected to a controller sensing device 254 to sense the position or motion of the peripheral controllers 242(1)-242(2) or other accessories. The controller sensing device may be implemented using, for example, a three-dimensional camera or an ultrasonic triangulation system.
System memory 243 is provided to store application data that is loaded during the boot process. A media drive 244 is provided and may comprise a DVD/CD drive, a hard drive, or a removable media drive, etc. The media drive 244 may be internal or external to the game console 200. When the media drive 244 is a drive or reader for removable media (such as removable optical disks or flash cartridges), then the media drive 244 is an example of an interface onto which (or into which) media are mountable for reading. Application data may be accessed via the media drive 244 for execution, playback, etc. by game console 200. Media drive 244 is connected to the I/O controller 220 via a bus, such as a Serial ATA bus or other high speed connection (e.g., IEEE 1394). While media drive 244 may generally refer to various storage embodiments (e.g., hard disk, removable optical disk drive, etc.), game console 200 may specifically include a hard disk 253, which can be used to store game data.
The system management controller 222 provides a variety of service functions related to assuring availability of the game console 200. The audio processing unit 223 and an audio codec 232 form a corresponding audio processing pipeline with high fidelity, 3D, surround, and stereo audio processing according to aspects of the present subject matter described herein. Audio data is carried between the audio processing unit 223 and the audio codec 232 via a communication link. The audio processing pipeline outputs data to the A/V port 240 for reproduction by an external audio player or device having audio capabilities.
The front panel I/O subassembly 230 supports the functionality of the power button 250 and the eject button 252, as well as any LEDs (light emitting diodes) or other indicators exposed on the outer surface of the game console 200. A system power supply module 236 provides power to the components of the game console 200. A fan 238 cools the circuitry within the game console 200.
The CPU 201, GPU 208, memory controller 210, and various other components within the game console 200 are interconnected via one or more buses, including serial and parallel buses, a memory bus, a peripheral bus, and a processor or local bus using any of a variety of bus architectures.
When the game console 200 is powered on or rebooted, application data can be loaded from the system memory 243 into memory 212 and/or caches 202, 204 and executed on the CPU 201. The game console 200 can present a graphic user interface that provides a consistent user experience when navigating to different media types available on the game console 200. In operation, applications and/or other media contained within the media drive 244 may be launched or played from the media drive 244 to provide additional functionalities to the game console 200.
The game console 200 may be operated as a standalone system by simply connecting the system to a television or other display. In this standalone mode, the game console 200 may allow one or more users to interact with the system, watch movies, listen to music, and the like. However, with the integration of broadband connectivity made available through the network interface 224 or the wireless adapter 248, the game console 200 may further be operated as a participant in a larger network community.
The game console 200 can be used with a variety of controllers 242, such as the controllers 242(1), 242(2), and 242(3) of
One example implementation of a microphone controller is depicted at
The power button 302 is connected to a power control module 404, which is in turn connected to a power source 406, such as one or more batteries. When the power button 302 is actuated, the power control module 404 causes the microphone controller 300 to draw power from the power source 406, activating the microphone controller 300. If the microphone controller 300 is already activated, actuating the power button 302 may cause the power control module 404 to deactivate the microphone controller 300. In some embodiments, the power control module 404 may consist of a simple electrical switch that completes a circuit when the power button 302 is actuated. In other embodiments, the power control module 404 may be more complex, such that, for example, the power button 302 must be continuously actuated for some period to either activate or deactivate the microphone controller 300. As another example, the power control module 404 may be configured to automatically deactivate the microphone controller 300 when the microphone controller 300 is not used beyond a specified timeout period. This configuration may provide the advantage of avoiding excess power consumption when the microphone controller 300 is not in use. In some embodiments, as an alternative to deactivating the microphone controller 300, the power control module 404 may be configured to place the microphone controller 300 in a low-power “sleep” mode, either in response to actuation of the power button 302 or after the timeout period.
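A minimal sketch of the timeout behavior just described is given below, assuming a hypothetical PowerControl class; the state names, the default timeout value, and the wake-on-use policy are assumptions made for illustration, not details of the power control module 404.

```python
import time

class PowerControl:
    """Illustrative power-control logic with an inactivity timeout.

    States: "off", "on", and a low-power "sleep" mode, as described above.
    """
    def __init__(self, timeout_seconds=300.0, sleep_instead_of_off=True):
        self.state = "off"
        self.timeout = timeout_seconds
        self.sleep_instead_of_off = sleep_instead_of_off
        self.last_activity = time.monotonic()

    def press_power_button(self):
        """Toggle between activated and deactivated."""
        self.state = "on" if self.state != "on" else "off"
        self.last_activity = time.monotonic()

    def note_activity(self):
        """Called whenever the controller is used (motion, singing, buttons)."""
        if self.state == "sleep":
            self.state = "on"        # wake on use (one possible policy)
        self.last_activity = time.monotonic()

    def tick(self):
        """Periodically check whether the timeout period has elapsed."""
        idle = time.monotonic() - self.last_activity
        if self.state == "on" and idle > self.timeout:
            self.state = "sleep" if self.sleep_instead_of_off else "off"

# Minimal usage:
pc = PowerControl(timeout_seconds=1.0)
pc.press_power_button()        # controller is now "on"
```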
The microphone controller 300 also includes one or more motion sensors. While
In the embodiment shown in
In embodiments in which the signals are provided to the microcontroller 414, the microcontroller 414 generates an output signal based on the signals received from the motion sensors 408, 410, and 412. This output signal may also be based in part on the signal generated by the transducer 402, as indicated by the dashed line connecting the transducer 402 to the microcontroller 414 in
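Purely as an illustration of one way the combined output signal might be organized, the following sketch packs a block of audio samples from the transducer together with the most recent motion readings and the state of the modifier button into a single report; the field layout and names are invented and are not taken from any actual controller protocol.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class ControllerReport:
    """One outgoing report combining audio and motion data (illustrative only)."""
    audio_samples: List[int]              # e.g., PCM samples from the transducer
    motion: Tuple[float, float, float]    # most recent x, y, z sensor readings
    modifier_pressed: bool                # state of the modifier button

def build_report(audio_block, latest_motion, modifier_pressed):
    """Combine transducer and motion-sensor signals into one output report."""
    return ControllerReport(list(audio_block), tuple(latest_motion), modifier_pressed)

report = build_report([0, 12, -7, 3], (0.02, 0.98, -0.05), modifier_pressed=False)
print(report)
```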
In operation, the motion sensors 408, 410, and 412 detect gestures performed by a user of the microphone controller 300. Specifically, each motion sensor 408, 410, and 412 detects motion in one or more orthogonal directions and outputs motion data. Gestures are derived from the motion data by software. The software that converts the motion data to gestures may reside in the game console 200 or may be embedded in the microphone controller 300. If the software resides in the game console 200, the microphone controller 300 may be configured to output the motion data to the game console 200. The gestures can be simple directional movements, e.g., UP, DOWN, LEFT, or RIGHT, or more complex movements to represent commands such as START, BACK, ENTER, ESCAPE, and the like. More complex movements can be represented by simple movements combined with an actuation of the modifier button 304.
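As a sketch of how such software might derive gestures, the following example integrates a short window of (x, y, z) motion samples, reports the dominant direction when it exceeds a threshold, and promotes the simple movement to a more complex command when the modifier button is actuated. The threshold, window size, and command mapping are assumptions for illustration only.

```python
# Hypothetical mapping from a simple directional gesture plus the modifier
# button to a more complex command (illustrative only).
MODIFIED_GESTURES = {"UP": "START", "DOWN": "BACK", "LEFT": "ESCAPE", "RIGHT": "ENTER"}

def derive_gesture(samples, modifier_pressed, threshold=0.5):
    """Derive a gesture label from a window of (x, y, z) motion samples.

    The sketch sums each axis over the window and reports the dominant
    direction if it exceeds a threshold; a real recognizer would be more robust.
    """
    sx = sum(s[0] for s in samples)
    sy = sum(s[1] for s in samples)
    if max(abs(sx), abs(sy)) < threshold:
        return None                                   # no deliberate movement
    if abs(sx) >= abs(sy):
        gesture = "RIGHT" if sx > 0 else "LEFT"
    else:
        gesture = "UP" if sy > 0 else "DOWN"
    return MODIFIED_GESTURES[gesture] if modifier_pressed else gesture

window = [(0.1, 0.3, 0.0), (0.0, 0.4, 0.1), (0.05, 0.5, 0.0)]
print(derive_gesture(window, modifier_pressed=False))  # UP
print(derive_gesture(window, modifier_pressed=True))   # START
```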
The gestures are then used to control various aspects of the operation of the game console 200, including, for example, selecting and launching a game. Further, as described above, the game console 200 can present a graphic user interface that provides a consistent user experience when navigating to different media types available on the game console 200. One particular example of such a graphic user interface is the dashboard menu used by the XBOX 360® brand video game console. According to various embodiments, gestures detected by the motion sensors 408, 410, and 412 are used to navigate the graphic user interface and to perform actions using the graphic user interface. The modifier button 304 may be used to switch between one operational mode in which gestures are used to perform navigation commands, such as UP, DOWN, LEFT, and RIGHT, and another operational mode in which gestures are used to perform actions, such as START, BACK, ENTER, and ESCAPE. It will be appreciated by those skilled in the art that the modifier button 304 may also be used to place the microphone controller 300 in operational modes other than those specifically described in this disclosure. By using gestures to perform navigation commands and actions in the graphic user interface, the microphone controller 300 avoids the need to use a separate gamepad-type controller to navigate the graphic user interface, while also avoiding the need to incorporate additional buttons on the body of the microphone controller 300. As a result, the sense of realism and the overall gaming experience may be enhanced.
As described above in connection with
On the other hand, if the microphone controller 300 is operating in the operational mode in which gestures are used to perform actions, then, at a step 508, the game console 200 maps the detected gesture to an action, such as START, BACK, ENTER, or ESCAPE. At a step 510, the action is performed, causing the graphic user interface to respond appropriately. For example, if the detected gesture is mapped to the action START, the highlighted item, e.g., a game or a movie, may be initiated. As another example, if the detected gesture is mapped to the action BACK, the graphic user interface may display a previously displayed menu or menu item. Using a combination of navigation and action gestures, a user can thus perform most or all of the functions supported by a gamepad type controller with the microphone controller 300 alone, without needing to use a separate controller.
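To tie the flow together, the following sketch walks a recognized gesture through the two branches described above: in the navigation mode the gesture moves the highlight, and otherwise it is mapped to an action that is performed on the highlighted item, as in steps 508 and 510. The Dashboard class, the action mapping, and the method names are invented stand-ins for the console's actual graphic user interface.

```python
class Dashboard:
    """A toy menu model standing in for the console's graphic user interface."""
    def __init__(self, items):
        self.items = items
        self.highlighted = 0

    def navigate(self, command):
        """Move the highlight in response to a navigation command."""
        if command == "DOWN":
            self.highlighted = min(self.highlighted + 1, len(self.items) - 1)
        elif command == "UP":
            self.highlighted = max(self.highlighted - 1, 0)
        # LEFT/RIGHT would move between menu columns in a fuller model.

    def perform(self, action):
        """Perform an action on the highlighted item (as in step 510)."""
        if action in ("START", "ENTER"):
            print(f"launching {self.items[self.highlighted]}")
        elif action == "BACK":
            print("returning to the previous menu")

def dispatch(gesture, navigation_mode, ui):
    """Route a recognized gesture according to the current operational mode."""
    if navigation_mode:
        ui.navigate(gesture)                 # gesture used directly as a command
    else:
        action_map = {"UP": "START", "DOWN": "BACK",
                      "LEFT": "ESCAPE", "RIGHT": "ENTER"}   # assumed mapping (step 508)
        ui.perform(action_map[gesture])

ui = Dashboard(["My Games", "Movies", "Music"])
dispatch("DOWN", navigation_mode=True, ui=ui)    # highlight moves to "Movies"
dispatch("UP", navigation_mode=False, ui=ui)     # mapped to START: launches "Movies"
```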
While the above embodiments have been described in the context of a microphone controller, it will be appreciated that the principles described herein can be applied to any of a variety of game controllers. These principles may be particularly suitable for application to game controllers for which it is desirable to minimize the number of buttons, for example, to enhance realism. Other types of game controllers for which the principles described herein may be particularly beneficial include, but are not limited to, exercise controllers intended to be worn on the wrists and/or feet of a user for use in exercise or dancing games, pointing controllers such as guns, and specialized sports controllers for use in sports games, such as simulated tennis rackets, baseball bats, and the like.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features or acts described above are disclosed as example forms of implementing the claims.