A user may interact with a computing device via an input device such as a game controller, computer mouse, keyboard, etc., which provides data and control signals to the computing device. Such a computing device and input device may be part of a computing system including a display device that displays a user interface of selectable items. A user may then use the input device to navigate through the items and select a particular item of interest. In some cases, the input device may communicate wirelessly to provide interaction with the user interface.
Accordingly, the present disclosure provides a method of using a wireless controller to interact with a user interface presented on a display. The method includes receiving an audio signal from the wireless controller based on an audio input applied to the wireless controller, and receiving a position signal from the wireless controller based on a position input applied to the wireless controller. Based on the audio signal and the position signal, a selection command is recognized which causes selection of a user interface item on the display. In addition to selection of the user interface item, a position signal or position signals from the wireless controller may be used to navigate the user interface to highlight a user interface item for selection.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
Wireless controller 102 may also include a microphone 110 configured to detect audio inputs such as a user's voice, hand clapping, finger snapping, or any other type of audio input.
In response to position inputs and audio inputs applied to wireless controller 102, the wireless controller may output and transmit corresponding signals to interface module 104. For example, in some embodiments, wireless controller 102 may be configured, based on the applied position inputs, to output position signals (e.g., a stream of positional data signals) to interface module 104. Likewise, wireless controller 102 may be configured, in response to the applied audio inputs, to output audio signals (e.g., a stream of audio signals) for receipt by interface module 104. Wireless controller 102 may be configured to transmit these signal streams to interface module 104 via a wireless local area network, a Bluetooth connection, or any other wireless link appropriate to a given setting. Wireless controller 102 may further include a multicolor LED or other output configured to be software-controlled. Accordingly, in such a case, wireless controller 102 may be further configured to receive one or more signals from interface module 104 to illuminate the LED or otherwise provide an output.
In some embodiments, wireless controller 102 may have a form factor of a handheld microphone such as, for example, a vocal microphone. As will be described in various examples, the audio and accelerometer outputs of the controller may be employed to generate commands to control a user interface, via interaction with interface module 104. Furthermore, the user interface functionality may be achieved without additional buttons, actuators, etc. on the wireless controller. This may be desirable in some cases to preserve a desired form factor or aesthetic for the wireless controller, for example so that it appears to be a “real” microphone of the type used in audio recording and performance.
Interface module 104 may be implemented via executable instructions such as instructions on a data-holding subsystem 112 that are executable by a logic subsystem 114. Data-holding subsystem 112 may be any suitable computer readable medium, such as a hard drive, optical drive, memory chip, etc. Further, in some embodiments, data-holding subsystem 112 may be a removable computer readable medium such as a memory card, CD, DVD, and the like.
As described in more detail hereafter with reference to
In some embodiments, the above-mentioned computing system may include a video gaming system. In such a case, logic subsystem 114 may be included on a game-playing device, such as a video game console, which may be configured to execute instructions on data-holding subsystem 112 for instantiating interface module 104. Such instructions may also include game code for executing a music video game.
In the example setting of a music video game, wireless controller 102 may then be a gaming controller with the form factor of a handheld microphone for delivering an audio performance. The microphone is configured to receive audio inputs (e.g., a user's vocal inputs) during game play. The above-described navigation commands and selection commands allow the wireless controller in this example to also function as a navigation input device, enabling the user to navigate menus and select items displayed on the menus. As previously discussed, the navigation and selection commands may be achieved via audio and accelerometer signals without additional buttons, actuators, etc. Therefore, the wireless controller may maintain the form factor of a handheld microphone, allowing for a more realistic gaming experience for the user.
When included in the present examples, a logic subsystem (e.g., logic subsystem 114) may include one or more physical devices configured to execute one or more instructions. For example, the logic subsystem may be configured to execute one or more instructions that are part of one or more programs, routines, objects, components, data structures, or other logical constructs. Such instructions may be implemented to perform a task, implement a data type, transform the state of one or more devices, or otherwise arrive at a desired result. The logic subsystem may include one or more processors that are configured to execute software instructions. Additionally or alternatively, the logic subsystem may include one or more hardware or firmware logic machines configured to execute hardware or firmware instructions. The logic subsystem may optionally include individual components that are distributed throughout two or more devices, which may be remotely located in some embodiments.
When included in the present examples, a data-holding subsystem (e.g., data-holding subsystem 112) may include one or more physical devices configured to hold data and/or instructions executable by the logic subsystem to implement the herein described methods and processes. When such methods and processes are implemented, the state of the data-holding subsystem may be transformed (e.g., to hold different data). The data-holding subsystem may include removable media and/or built-in devices. The data-holding subsystem may include optical memory devices, semiconductor memory devices, and/or magnetic memory devices, among others. The data-holding subsystem may include devices with one or more of the following characteristics: volatile, nonvolatile, dynamic, static, read/write, read-only, random access, sequential access, location addressable, file addressable, and content addressable. In some embodiments, a logic subsystem and data-holding subsystem may be integrated into one or more common devices, such as an application specific integrated circuit or a system on a chip. The data-holding subsystem may also be in the form of computer-readable removable media, which may be used to store and/or transfer data and/or instructions executable to implement the herein described methods and processes.
The terms “module” and “engine” may be used to describe an aspect of computing system 100 that is implemented to perform one or more particular functions. In some cases, such a module or engine may be instantiated via logic subsystem 114 executing instructions held by data-holding subsystem 112. It is to be understood that different modules and/or engines may be instantiated from the same application, code block, object, routine, and/or function. Likewise, the same module and/or engine may be instantiated by different applications, code blocks, objects, routines, and/or functions in some cases.
When included, a display subsystem (e.g., display subsystem 106) may be used to present a visual representation of data held by a data-holding subsystem. As the herein described methods and processes change the data held by the data-holding subsystem, and thus transform the state of the data-holding subsystem, the state of the display subsystem may likewise be transformed to visually represent changes in the underlying data. The display subsystem may include one or more display devices utilizing virtually any type of technology. Such display devices may be combined with a logic subsystem (e.g., logic subsystem 114) and/or a data-holding subsystem (e.g., data-holding subsystem 112) in a shared enclosure, or such display devices may be peripheral display devices.
At 202, method 200 includes displaying a plurality of user interface items on a display. Typically, the user interface items are selectable items presented via a user interface. Examples of such user interface items may include, for example, menu items, contextual menu items, selectable options, etc. In the context of a gaming system, such user interface items may include game settings, character settings, audio settings, visual settings, or any other selectable item or option. It should be appreciated that these examples are nonlimiting; user interface items may include any other suitable items as indicated by the instructions of the interface module. As previously discussed, the user interface items may be presented in a user interface, such as user interface 116, and may be displayed on a display, such as display device 118 (
Returning to
The position signal received at 204 is an output (e.g., from an accelerometer) based on a position input applied to the wireless controller (e.g., movement, tilting, etc. of the controller). The output may indicate, among other things, a positioning of the wireless controller at an angle. As an example,
At 404, wireless controller 402 is depicted in a horizontal position, i.e., an angle of θ=0°, where θ is measured with respect to the horizon. Accordingly, such positioning corresponds to a z-axis component of gravitational acceleration of 0 g, where g is one gravitational unit (e.g., near the surface of the Earth, 1 g ≈ 32 ft/s²). In other words, when the controller is in the horizontal position shown at 404, the gravitational acceleration along the axial direction of the controller body is zero. At 406, wireless controller 402 is depicted at an intermediate angle (e.g., θ=45°) with respect to the horizon, and such positioning corresponds to a z-axis component of gravitational acceleration of some intermediate value between 0 g and 1 g. At 408, wireless controller 402 is depicted at an angle of θ=90° with respect to the horizon, and such positioning corresponds to a z-axis component of gravitational acceleration of 1 g.
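The relationship just described amounts to a_z = g·sin(θ), so the tilt angle is recoverable from the z-axis accelerometer reading. A minimal sketch in Python (the function name and the clamping step are illustrative assumptions, not part of the disclosure):

```python
import math

def tilt_angle_deg(a_z, g=1.0):
    """Recover the controller's tilt angle (degrees from the horizon)
    from the z-axis component of gravitational acceleration,
    using a_z = g * sin(theta)."""
    # Clamp the ratio so sensor noise near +/-1 g cannot push
    # the argument outside asin's domain.
    ratio = max(-1.0, min(1.0, a_z / g))
    return math.degrees(math.asin(ratio))
```

For the positions shown at 404, 406, and 408, this recovers approximately 0°, 45°, and 90°, respectively.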
Returning to
In some embodiments, a module such as interface module 104 described above with reference to
For example, in the case that the position signal indicates a positioning of the controller at an angle, a first range of angle values may be mapped to a particular navigation command, with a second range of angle values corresponding to another navigation command. As an example,
It will be appreciated that the angle ranges depicted in
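Such a mapping from angle ranges to navigation commands can be sketched as follows. The threshold values, dead zone, and command names are illustrative assumptions, since the actual ranges are given in the figures:

```python
def navigation_command(theta_deg, dead_zone=15.0):
    """Map a tilt angle (degrees from the horizon) to a navigation
    command. Angles within the dead zone produce no command, so small
    unintentional tilts do not scroll the user interface."""
    if theta_deg > dead_zone:
        return "scroll_up"
    if theta_deg < -dead_zone:
        return "scroll_down"
    return None
```

A wider or narrower dead zone trades responsiveness against accidental navigation; the disclosure leaves this tuning to the interface module's instructions.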
Returning to
Further, in some embodiments, the navigation command may also indicate how the navigation action may be displayed. For example, holding the controller at an angle for a given duration of time may not only indicate a navigation command of scrolling but may further indicate a speed at which to scroll.
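One way to derive both the scroll command's speed from how long the tilt is held is a simple ramp, sketched below; the base speed, ramp rate, and cap are assumed values not specified by the disclosure:

```python
def scroll_speed(hold_seconds, base=1.0, ramp=0.5, cap=5.0):
    """Return a scroll speed (items per second) that grows the longer
    the controller is held at a qualifying angle, up to a cap."""
    return min(base + ramp * hold_seconds, cap)
```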
Thus, in some embodiments, a computing system such as computing system 100 may be configured in accordance with method 200 to allow a user to interact with a displayed user interface simply by moving, tilting, rotating, etc. a wireless controller. Executing the instructions of method 200 can therefore allow a wireless controller to be used for navigation without additional buttons, actuators, or a traditional directional pad, which may provide a more natural or otherwise improved user experience. For example, in a music video game, a user could easily navigate through a list of selectable items simply by tilting or moving the same microphone used for singing, as described above.
In addition to navigation of a user interface, the present disclosure encompasses selection of user interface items based on position signals and audio signals from a wireless microphone or other wireless controller. In particular,
At 702, method 700 includes displaying a user interface item on a display. In some embodiments, a method such as method 200, described above, may be performed prior to step 702 to navigate a selection indicator to highlight a particular user interface item of choice.
At 704, method 700 includes receiving an audio signal from a wireless controller. The wireless controller may be a wireless microphone or other wireless controller, such as wireless controller 102 of
At 706, method 700 includes receiving a position signal from a wireless controller. Such a step may be similar to that of step 204 described above with reference to
At 708, method 700 includes recognizing a selection command based on the audio signal and the position signal. Such a selection command indicates a desired selection of the user interface item. In some embodiments, the selection command is recognized when the audio signal and the position signal occur within a predetermined time interval of one another. For example, a selection command may be recognized if the position signal of interest occurs within a few tenths of a second of the audio signal of interest. As an example,
Returning to
At 710, method 700 includes selecting the user interface item in response to recognition of the selection command. Thus, in some embodiments, a computing system such as computing system 100 that is configured to execute instructions of method 700 may allow a user of the wireless controller to select items of a user interface displayed on the display simply by moving the wireless controller and making a noise. Such selection may therefore be independent of any physical button activation on the wireless controller. For example, a user may move the wireless controller in a direction axial to the wireless controller while tapping on the end of the microphone. Upon sensing the axial acceleration and the audio impulse from the tap, which occur close in time as readily identifiable signals, the instructions then execute a selection command for a user interface item.
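The timing-window check described above can be sketched as a simple coincidence test; the 0.3 s window and the timestamp-based interface are assumptions for illustration:

```python
def is_selection(audio_time, motion_time, window=0.3):
    """Recognize a selection command when the audio impulse (e.g., a tap
    on the microphone end) and the axial-acceleration event occur within
    `window` seconds of one another, in either order."""
    return abs(audio_time - motion_time) <= window
```

In practice the interface module would apply this test to the timestamps of the most recent qualifying audio and position events in the two signal streams.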
In addition to the above navigation and selection operations, audio and/or position inputs may be used to provide other interactions with a user interface. It can be determined, for example, that various physical motions applied to a wireless controller produce accelerometer signals having identifiable characteristics. These physical motions can then be mapped to various user interface operations. One example is the use of a long sweeping motion to cancel a selection, or to step backward through a hierarchical menu sequence. More specifically, with reference to
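A sweep gesture of this kind could be detected, for example, by requiring the lateral acceleration to stay elevated for a sustained interval, distinguishing it from a brief tap or selection motion. The thresholds and sample format below are illustrative assumptions:

```python
def is_sweep(samples, min_duration=0.5, min_accel=1.2):
    """Detect a long sweeping motion from (timestamp, lateral
    acceleration) samples: recognized when acceleration magnitude
    exceeds min_accel (in g) over a span of at least min_duration
    seconds. The gesture may then map to a back/cancel command."""
    times = [t for t, a in samples if abs(a) >= min_accel]
    return bool(times) and (max(times) - min(times)) >= min_duration
```

A short tap produces a brief spike and fails the duration test, while a deliberate sweep sustains high acceleration long enough to qualify.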
It is to be understood that the configurations and/or approaches described herein are exemplary in nature, and that these specific embodiments or examples are not to be considered in a limiting sense, because numerous variations are possible. The specific routines or methods described herein may represent one or more of any number of processing strategies. As such, various acts illustrated may be performed in the sequence illustrated, in other sequences, in parallel, or in some cases omitted. Likewise, the order of the above-described processes may be changed.
The subject matter of the present disclosure includes all novel and nonobvious combinations and subcombinations of the various processes, systems and configurations, and other features, functions, acts, and/or properties disclosed herein, as well as any and all equivalents thereof.