A hardware game controller can provide input to a video game (e.g., to control an object or character in the video game) and, optionally, to interact with other software. The video game may be running on a computing device that is locally connected with the hardware game controller (e.g., a mobile phone or tablet) or a cloud-based computing platform, among other possibilities.
As mentioned above, a hardware game controller (also referred to herein as “game controller” or “controller”) can provide input to a video game played on a mobile computing device, such as a mobile phone or tablet. Some games require a touch input on the touch screen of a mobile computing device, but a game controller may not have a physical touch screen to receive and provide the required touch input. The game controller can be configured to convert an actuation of a control surface (e.g., a button, joystick, etc.) of the game controller to a “touch screen input” (e.g., an input that mimics the input a touch screen would provide to the mobile computing device), so that the appropriate input can be provided to the computing device. A game controller that can provide this conversion may be referred to as a “virtual controller.” Some prior virtual controllers require a user to manually create the mapping between actuation of control surfaces and touch screen inputs. This can involve dragging and dropping on a displayed graphical user interface to associate every possible touch screen input with a control surface actuation on the game controller (e.g., swiping across the screen to make a character run is represented by moving the right joystick, etc.). This can be a very cumbersome and frustrating process for the user. Further, there can be limits on the associations that can be made in the set-up phase. For example, there may not be an option to map a combination of control surface actuations. Also, because the set-up process can require the user to use their finger, positioning may not be nuanced.
To address one or more of these problems, in one example embodiment discussed below, developer support is used with some calculations in each area of the game to effectively automatically implement a mapping as opposed to, for example, requiring a developer to build a function for controller support. In one example implementation, individual maps for each game are curated. This can be a manual process by the developer or some other party (though, preferably, not by the end user of the game controller) to indicate all of, or substantially all of, the various moves that the user can make in the game. Preferably, although not necessarily, the map can match the scope of options a developer intended to provide (e.g., not simply a user's interpretation, which can be narrower) and also can include some input regarding user intention (e.g., what users do in the game). In operation according to one example, when a user boots up a game, the computing device can pull down the map from a storage location (e.g., in the cloud or some other storage location), perform some processing, and then provide the game controller with the information it needs to perform the mapping (again, without, or with minimal, manual user configuration), so the user can more simply play the game. In one example implementation, the maps for various games are stored in a single file (though more files can be used for storage) that includes a series of equations. The game controller can receive an input (e.g., screen attributes and capabilities, etc.) from the computing device and use the equations to align the control inputs with the specific touch screen of the computing device.
Before turning to these and other examples, the following section provides an overview of an example game controller and computing device. It should be understood that these are merely examples and other implementations can be used. Accordingly, none of the details presented herein should be read into the claims unless expressly recited therein.
In this example, the computing device takes the form of a mobile phone 200 (e.g., running the Android, iOS, or another operating system). However, in other examples, the computing device takes the form of a tablet or other type of computing device. The mobile phone 200 in this example comprises a touch-screen display, a battery, one or more processors, and one or more non-transitory computer-readable media having program instructions (e.g., in the form of an app) stored therein that, when executed by the one or more processors, individually or in combination, cause the mobile phone 200 to perform various functions, such as, but not limited to, some or all of the functions described herein. The mobile phone 200 can comprise additional or different components, some of which are discussed below.
In operation, the game controller 10 can be used to play a game that is locally stored on the mobile phone 200 (a “native game”) or a game 320 that is playable remotely via one or more digital data networks 250 such as a wide area network (WAN) and/or a local area network (LAN) (e.g., a game offered by a cloud streaming service 300 that is accessible via the Internet). For example, during remote gameplay, the computing device 200 can send data 380 to the cloud streaming service 300 based on input from the game controller 10 and receive streamed data 390 back for display on the phone's touch screen. In one example, a browser on the mobile phone 200 is used to send and receive the data 380, 390 to stream the game 320. The mobile phone 200 can run an application configured for use with the game controller 10 (“game controller app”) to facilitate the selection of a game, as well as other functions of the game controller 10. U.S. patent application Ser. No. 18/214,949, filed Jun. 27, 2023, which is hereby incorporated by reference, provides additional example use cases for the game controller 10 and the mobile phone 200. The game controller app is sometimes referred to herein as a “platform operating service.”
As shown in
As also shown in
In some embodiments (see
Some people who play games on their mobile devices have found that using an attachable game controller provides an overall better gaming experience. Not all games that can be played via a mobile device, however, support attachable game controllers. In fact, many mobile games, including well-known or popular games, do not support such controllers. As an example, not all games available on the Android platform are set up to be played with an alternate input such as a game controller. Instead, these mobile games rely on user input through touchscreens on the mobile device itself. Compared to using a controller to operate a game, touchscreen inputs provide an inferior gaming experience because, for example, touchscreen inputs typically limit the speed, accuracy, or both with which commands can be entered. Unfortunately, such games do not provide a native way to interface with the inputs/outputs of attachable game controllers. This can leave many users unable to play games due to various issues, such as hand size relative to small buttons on a user interface or the inability to use an accessibility controller to play a game.
One possible solution to this problem is to manually map game controller buttons to touchscreen inputs, but this can be cumbersome for the average user. For example, a user may be required to manually drag and drop controls on the screen and may have trouble scaling the touch controls to the screen size. In general, configuration is a user-directed process and requires much work on the user's behalf. For software-only solutions, developers end up having to resort to debugging and developer tooling to get access to touch application programming interfaces (APIs) in the phone. This can require the users to enable high-level permissions that can let a developer have administrative access to their phones. Another issue with software-only solutions with developer mode access is that the game developer can prevent the user from playing to mitigate hackers and cheaters. When this happens, a user's account can get banned and labeled a hacker or cheater account. In addition to overcoming the problems associated with these traditional solutions, the following embodiments do not create a risk of a user being labeled a hacker.
In general, the following embodiments provide custom and intuitive input mappings to a game controller on the Android operating system, or other computing device operating systems, that work from the start with very little, or no, user interaction. These embodiments may also provide an intuitive radial wheel for seamlessly changing between modes in each game. For example, PUBG Mobile has many different modes, such as Infantry, Driving a Vehicle, Passenger in a Vehicle, and Parachuting. Within each mode, similar touch screen inputs can result in different actions in the game. Switching between these modes seamlessly may be desired to allow competitive users to not miss a beat in multiplayer games and to ensure accuracy in their commands.
One aspect of the following embodiment is a Touch Synthesis Engine, which can be implemented by one or more processors in the game controller executing computer-readable program code. The Touch Synthesis Engine is a tool that allows an attachable game controller to input commands to a computing device, such as a mobile phone, that would otherwise be communicated to the computing device through touch on the computing device's touchscreen. In one example, the game controller connects to the computing device via a USB connection, although other wired or wireless connections can be used. When enabled, the Touch Synthesis Engine automatically maps controller inputs to existing touch input commands and provides a seamless, intuitive interface for a user.
The Touch Synthesis Engine allows users to play games on computing devices that do not have native controller support. In one example embodiment, the Touch Synthesis Engine uses a series of calculations and estimations to identify the most user-intuitive mappings for touchscreen inputs to translate to controller settings. The Touch Synthesis Engine can appear to the computing device, such as a mobile phone, as an external touch screen such that the commands it sends to the computing device, such as a mobile phone, mimic the inputs as though they were coming from an internal/native touchscreen.
In this example embodiment, because the computing device 500 supports an external touchscreen, the game controller 400 can function as both a game controller and a virtual touch screen. In operation, the game controller 400 connects to the USB port of the computing device 500 (e.g., a phone running the Android operating system). In other embodiments, another wired or wireless interface is used to connect the game controller 400 to the computing device 500. The USB and HID drivers 505, 520 on the computing device 500 eventually identify two separate devices: a HID game controller 415 and a HID touch screen 420. Both HIDs 415, 420 are natively available to the operating system's input manager 535, which allows the operating system or any foreground application to access them. In addition, the computing device 500 has its own internal touch screen interface 540 that also coexists inside of the input manager 535. Currently, in this embodiment, the Android operating system only allows one pointer device to be actively providing motion inputs at a time. Lastly, vendor USB interfaces are utilized by the game controller's app (the platform operating service 510), which provide internal control over the game controller configuration, which overlays to show, and so on.
On the game controller 400, the gamepad core 440 routes inputs to both HID game controller and HID touch screen interfaces 415, 420. For the HID touch screen 420, the inputs go through an additional layer (the Touch Synthesis Engine 435), which synthesizes the controller inputs into multi-dimensional touch commands that can include not only a touch at a specific location on the screen but also dynamic motion of the touch contact (e.g., the strength of a push and motion such as a swipe). In one embodiment, the Touch Synthesis Engine 435 does this by transforming each physical input into vectorized data that can then be used to stimulate various touch surfaces. Ultimately, these surfaces get translated into digitized touch contacts “on” the virtual touch screen. Lastly, the bulk data interface 425 allows for a command API 430 to manage the Touch Synthesis Engine 435. This is how the platform operating service 510 loads in the game-specific data for the virtual controller.
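For illustration only, the following Kotlin sketch shows one way the input-to-touch pipeline described above could be modeled. The type and function names (InputVector, TouchContact, SurfaceNode, synthesize, buildReport) are hypothetical placeholders rather than the actual firmware API; the sketch simply shows physical inputs being transformed into vectorized data and then into zero or one touch contacts per surface.

```kotlin
// Hypothetical sketch of the input-to-touch pipeline; names are illustrative.
data class InputVector(val x: Float, val y: Float, val active: Boolean)

data class TouchContact(val id: Int, val logicalX: Int, val logicalY: Int, val down: Boolean)

interface SurfaceNode {
    // Converts a transformed input vector into zero or one touch contacts
    // on the virtual (logical) touch screen.
    fun synthesize(input: InputVector): TouchContact?
}

class TapButtonSurface(private val id: Int, private val x: Int, private val y: Int) : SurfaceNode {
    override fun synthesize(input: InputVector): TouchContact? =
        if (input.active) TouchContact(id, x, y, down = true) else null
}

// Each frame, the gamepad core feeds physical inputs through the input
// transforms, and each surface node emits the touch contact (if any) that
// makes up part of the HID touch screen report.
fun buildReport(surfaces: List<SurfaceNode>, inputs: List<InputVector>): List<TouchContact> =
    surfaces.zip(inputs).mapNotNull { (surface, input) -> surface.synthesize(input) }
```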
Although HID is usually associated with USB, some embodiments may choose to substitute an alternate serial communication type. For example, one might replace the USB connection with Lightning. In this case, the HID and bulk data interfaces would still exist but would be carried over their native Lightning equivalents. Conceptually, the system would operate very similarly with any transceiver that supports HID and some general bulk data exchange.
Turning again to the drawings,
The controller map is then parsed, and a default game mode is selected (act 620). Within the controller map, there can be multiple control layouts triggered by activating different “modes” within the game, and one mode can be specified to be loaded by default, or it can be set to a “disabled” mode for when there are no touch surfaces defined. The default mode for a game can be considered the “target” game mode. It should be understood that other game modes, such as a “vehicle” mode, can be programmed to be the “target” game mode. The “target” game mode is what will be processed to show up for the user when they begin using the “virtual controller” with the game. First, all of the defined surface nodes in the file are converted into visual overlay objects and arranged on the screen based on the coordinate data within (act 630). The controller map is then compiled and adjusted to the specific display attributes of the native (or local) touch screen (act 640). In this step, all the surface nodes are compiled into a packed representation that is suitable to be programmed into the physical controller 400. In this step, the coordinate data is also adapted to the local computing device's touch screen, taking into account the device-specific attributes or differences, such as, for example, aspect ratio, display density, and display cutout/notch keep-outs.
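By way of illustration only, the following Kotlin sketch shows how density-independent surface coordinates could be adapted to a specific device screen during this compilation step. The names (ScreenInfo, SurfacePosition, adaptToScreen) and the simple left-inset handling of a display cutout are assumptions made for the sketch, not the actual implementation.

```kotlin
// Illustrative sketch of adapting density-independent coordinates to one device.
data class ScreenInfo(
    val widthPx: Int, val heightPx: Int,
    val density: Float,          // e.g., 2.75 for a ~440 dpi phone
    val cutoutInsetLeftPx: Int   // keep-out for a display cutout/notch, if any
)

data class SurfacePosition(val xDp: Float, val yDp: Float, val widthDp: Float, val heightDp: Float)

data class RectPx(val x: Int, val y: Int, val w: Int, val h: Int)

// Convert a density-independent surface position into absolute pixel
// coordinates on the local touch screen, respecting the cutout keep-out.
fun adaptToScreen(p: SurfacePosition, screen: ScreenInfo): RectPx {
    val x = (p.xDp * screen.density + screen.cutoutInsetLeftPx).toInt()
    val y = (p.yDp * screen.density).toInt()
    val w = (p.widthDp * screen.density).toInt()
    val h = (p.heightDp * screen.density).toInt()
    return RectPx(x, y, w, h)
}
```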
Once the touch synthesis data has been adapted and compiled, the touch synthesis data is transferred to the game controller 400 (act 650). The packed representation is considerably more efficient and can be uploaded in very few transactions, making the operation nearly instantaneous. With all of the data properly loaded in the accessory/game controller, the next step is to activate the HID touch screen, while simultaneously disabling the HID game controller, and starting the game. In most devices, switching HID interfaces is simply a matter of conditionally deciding which HID input reports to send or skip. Here, the HID touch screen is enabled (e.g., “driving”), and the HID game controller is disabled (act 660). If visual overlays are to be shown, the visibility of these floating controls is adjusted at this point as well (act 670), and the game is launched (act 680). In practice, the various steps performed above complete in a very short amount of time, which results in a seamless, or substantially seamless, transition into virtual touch controls.
Additional embodiments of loading the “target” game mode can include storing a previously loaded game mode in local storage on the game controller, the mobile device, or the platform operating service. The previously loaded game mode can be from an earlier play session or what was used on another device.
The Touch Synthesis Engine can mimic a human's touch inputs by translating game controller inputs into touch screen inputs. As game controller inputs are pressed, a digital touch contact is allocated and sent to the computing device 500. For a simple touch button, the position may be static and the touch momentary, but for a more complex control like a joystick, the touch position may move as the joystick position changes, and the touch is released once the joystick returns to its resting position.
According to this example, since the game controller 400 does not in this instance implement a physical touch screen, only the logical coordinate values matter. The computing device 500 will typically convert the logical values into the physical screen dimensions, so the logical min/max really only affects the numerical resolution. Using a range such as 0-65535 should be more than enough to cover typical smartphone and tablet screen sizes (e.g., pixels), while also providing some extra bits for increased numerical resolution.
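As an illustration only (not the actual coordinate-conversion code), the following Kotlin sketch shows how a physical-pixel touch position might be projected into a 0-65535 logical coordinate space; the computing device would then scale the logical values back onto its own physical screen dimensions.

```kotlin
// Sketch of mapping pixel coordinates into the logical HID touch screen space.
const val LOGICAL_MAX = 65535

fun toLogical(px: Float, screenSizePx: Int): Int =
    ((px / screenSizePx) * LOGICAL_MAX).toInt().coerceIn(0, LOGICAL_MAX)

// Example: a tap at pixel (540, 1200) on a 1080 x 2400 screen becomes
// approximately (32767, 32767) in logical coordinates.
```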
In general, the game controller 400 converts various inputs into multiple touch positions on a virtual screen. The game controller 400 can be programmed with the physical locations and constraints of a game's touch screen elements, and then project the positions into the logical space of the virtual touch screen. In another embodiment, the touch screen elements and constraints are calculated on the platform operating service and then sent down to the game controller. The computing device 500 can automatically scale the HID touch screen inputs to the physical touch screen coordinates, ultimately achieving virtual touch events as if the user had tapped on the internal touch screen directly. For example,
The Touch Synthesis Engine can use a Touch Synthesis API communication layer to configure the synthesis engine. This can consist of a proprietary protocol used between the application and the game controller 400. In one embodiment, the synthesis engine uses 16 or more input nodes and 16 or more surface nodes, each of which is a unique input that the game uses to direct user actions. Example input and surface nodes are detailed below.
In this example, the first stage of the virtual controller processing is taking one or more inputs and transforming them into one or more outputs. These input transforms can vary from a simple digital input mapped to a single Boolean output to a more complex multi-axis X/Y output for a virtual joystick. Input transforms can also include multiple game controller elements. For example, a digital input can be combined with multiple joystick inputs to help enable a joystick that responds only when a particular input is held. Example input nodes include the following (an illustrative sketch of one such input transform is provided after this list):
1. Generic Button: The generic button is the simplest input map, which just converts a digital gamepad button to a Boolean output. It is not very common for two buttons to map to the same location. This use case is reserved for games with fewer buttons where we may want to allow flexibility to press whatever button is more intuitive to the user. For example, a fighting game may opt to have A/B/X/Y operate the same as L1/R1/L2/R2 as two alternate ways to do the same thing.
2. Phased Button: The phased button is similar to the generic button but implements basic button phases for short press and long hold. The input transform interprets the press timing to decide which output should be activated. The timing information is encoded in the transform arguments.
3. Excluded Button: The excluded button is the complement to the button chord, where the primary input gets modulated by an exclusion input. This is the simpler, upstream version of the anti-chord surface.
4. 2-axis Joystick: The 2-axis joystick is essentially just a simple combination of two analog values into a single output vector. This is most commonly used to map a joystick to a radial joystick surface. In the normal case, only the axis inputs need to be specified. However, in some advanced cases it may be necessary to provide button inputs which should deactivate the joystick. These button exclusions are often useful when the joystick is also used with multiple surfaces.
5. 1-axis Trigger: The 1-axis trigger is very similar to a generic button except the scalar input supports a threshold value to control its activation. When using a generic button, any non-zero value of the trigger will activate the output. However, with this unique node the output will only be activated once the analog value crosses a specified threshold. It is common in many games to add a dead-band to triggers to provide the right gameplay feel, and also to avoid accidental presses.
6. Button chord 2:1: This node takes in two input buttons and only activates its output when both buttons are pressed. The input processing for this node is quite basic, and therefore does not have any hysteresis about the simultaneity (timing) of the presses. An example where this may be useful is when a game has an aiming mechanic that introduces an additional button to tap when held (e.g., Genshin Impact bow characters). For example, holding down L2 can make it so that pressing R2 has a different tap location.
7. Button Chord 2:3: This is an alternate version of the button chord node that provides additional outputs for the exclusive press of the inputs. This not only makes the controller map more compact and efficient but also implicitly applies masking to the presses. So, when the user presses buttons 1 and 2, the press 1/press 2 outputs are not active. This behavior can also be optionally disabled if masking is not preferred. Like the 2:1 version, this node does not have extra hysteresis on the press timing. See surface chords for a more advanced form of chording.
8. Joystick Aim Button: The joystick aim button is an interesting variation on the 2-axis joystick input node. In some ways, it is the inverse. For this node, the X/Y joystick values only get activated in the output vector when the specified button input is also being held down (inclusion rather than exclusion). For this node, the activation criterion is simply whether or not the button is currently being pressed. In most cases, it is better to ignore the state of the joystick for activation because usually this node is used to map to a character ability that can be optionally aimed, but the user can also just tap the ability to auto-aim as well (e.g., MOBAs such as Wild Rift).
9. Joystick to dpad: This node converts a 2-axis joystick input into four quadrants which then get mapped to the four possible directional pad outputs. If diagonals are allowed, then a total of eight slices are supported allowing for two adjacent directions to be pressed simultaneously. An optional angle overlap parameter controls how many degrees of overlap between quadrants should be allowed for diagonal presses.
10. Dpad Hat: This node converts the four directional pad inputs into a single vector output. Depending on which direction is being pressed, a normalized unit vector is returned. In situations where two directions are pressed simultaneously (diagonals) the directional vectors are simply added together since they are assumed to be orthogonal. As a result, this node is capable of producing up to eight distinct directions in its output vector. The output direction vectors can then be scaled by a surface map to produce a virtual d-pad in game. Although it is possible to do the same mapping manually with four generic buttons, this form is a bit more compact, and is likely what would be most intuitive in a drag/drop controller map creation tool.
11. Axial joystick: A joystick input node which is sensitive in only one direction. It takes in joystick X/Y values and then coerces them into a single float output (in X component of output vector).
12. Joystick plus button: This is a variation of the 2-axis joystick that includes a button (usually the joystick button L3 or R3) as a second output. This can then be wired to a hybrid surface that combines the two vectors. For example, a double tap joystick which overrides the joystick when L3/R3 is pressed in order to overlay a double tap gesture momentarily. Similarly, L3/R3 could also be used to temporarily generate a swipe gesture (e.g. PUBG sprint toggle).
13. Pinch/Zoom Unipolar: This input node is designed for use with a panning surface to produce a pinch/zoom gesture. This particular flavor of the node is designed to support unipolar inputs, such as triggers or digital buttons (0-1). As such, the node requires two separate inputs, one for zoom in and one for zoom out. Then based on these inputs two direction vectors are pointed inward or outward. If both inputs are active at the same time the result is a zero vector.
14. Pinch/Zoom Bipolar: This input node is designed for use with a panning surface to produce a pinch/zoom gesture. This particular flavor of the node is designed to support bipolar inputs, such as joysticks. With a single joystick axis producing values between −1 and 1, this node is able to produce two touch vectors that are capable of both zooming in and out.
15. Button Group: This input node simply groups multiple buttons into the same output path. This is only really useful to pipe multiple buttons into a surface which requires multiple inputs (e.g. surface chord). The main reason you may want to use a surface chord is because it is possible to implement “masking” as well as a proper FSM with hysteresis.
16. Edge detector: The edge detector node converts a digital input into two outputs, one for each edge of the button change. A rising edge occurs when the button transitions from released to pressed, and a falling edge occurs when the button transitions from pressed to released. The outputs are inherently one-shot (single frame pulse), so typically this kind of node is wired to a surface type which supports pulse extension (e.g., tap button with non-zero pulse length).
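For illustration only, the following Kotlin sketch shows one way the 2-axis joystick input transform with exclusion buttons (node 4 above) could be expressed. The names (TwoAxisJoystickTransform, JoystickOutput) are hypothetical, and the logic is a simplified sketch rather than the actual engine code.

```kotlin
// Sketch of a 2-axis joystick input transform with optional exclusion buttons.
data class JoystickOutput(val x: Float, val y: Float, val active: Boolean)

class TwoAxisJoystickTransform(
    // Buttons that, when held, deactivate this joystick so the same physical
    // stick can drive a different surface while they are pressed.
    private val exclusionButtons: List<() -> Boolean> = emptyList()
) {
    fun transform(axisX: Float, axisY: Float): JoystickOutput {
        val excluded = exclusionButtons.any { it() }
        val moved = axisX != 0f || axisY != 0f
        return JoystickOutput(axisX, axisY, active = moved && !excluded)
    }
}
```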
Surface nodes are implicitly linked to a specific input transform and, therefore, can take in multiple inputs depending on the transform type. The responsibility of the surface node is to take in the vector of inputs and conditionally output an absolute touch position. Simple surfaces translate button up/down events to taps, but more complicated surfaces can implement a range of positions. For example, joystick input can be translated into a variable X/Y position and scaled based on a configured radius to implement a traditional radial virtual joystick.
Surface nodes can produce zero or one touch outputs, depending on whether or not the surface is activated by its dependent input. The following are examples (an illustrative sketch of one surface node is provided after this list).
1. Tap Button: This is the most-common surface, which simply taps in a static location. It takes in a single input and produces a single tap when the input is active. This surface has an optional pulse length which will hold the tap down for a specified amount of time. Extending the pulse is useful when taking in input from a transform which produces a one-shot signal (e.g., phased button, multi-press button).
2. Radial Joystick: This surface is intended to be used for virtual joysticks in games that emulate physical joysticks. It takes in an X/Y vector and scales the position based on the specified width/height. Although joysticks are normally circular, this surface can be stretched to any oval shape or even a straight line.
3. Pan Joystick: This surface is intended to be used with 3D games which have a camera which can be panned. It takes in an X/Y vector which is interpreted as a directional vector and scaled by a multiplier. The resulting vector is then added to a tracked surface X/Y which results in a touch location that gradually moves in a particular direction. Once the touch hits the edge of the surface's defined frame, the value will wrap back to the origin. The result is that holding the joystick in a particular direction will result in a series of swipe movements that cause the camera to rotate.
4. Double Tap Button: This surface converts Boolean input to a double tap gesture. The gesture timing can be controlled with specified surface parameters. The gesture will start when the output transitions from false to true, and will continue to run until the gesture is complete.
5. Flex Canvas: This is very similar to the pan joystick but instead of wrapping when hitting the edge of the frame, the behavior is to clamp.
6. Slingshot: This is a variant of the radial joystick which has an additional dead-zone, but instead of reporting 0 within the dead-zone, the previous value is held. This is helpful for certain radial based touch interfaces that are sensitive to the direction/angle of the tap from a center point.
7. Radial Target: This is a variant of the flex canvas that is constrained within a circle. This works well for aimed abilities typically found in MOBAs where you can aim an ability in a circular region, not only controlling the direction but distance to a particular target.
8. Triple Tap Button: This is the same as the double tap button but with a 3rd tap. All of the same timing parameters apply.
9. Directional Swipe Up: This surface implements a swipe gesture in the up direction. The swipe starts from the bottom edge of the surface frame and moves towards the top. The speed of the swipe is specified as a duration, which gets turned into a dx/dy from the width or height.
10. Directional Swipe Down: This surface implements a swipe gesture in the down direction. The swipe starts from the top edge of the surface frame and moves towards the bottom. The speed of the swipe is specified as a duration, which gets turned into a dx/dy from the width or height.
11. Directional Swipe Left: This surface implements a swipe gesture in the left direction. The swipe starts from the right edge of the surface frame and moves towards the left. The speed of the swipe is specified as a duration, which gets turned into a dx/dy from the width or height.
12. Directional Swipe Right: This surface implements a swipe gesture in the right direction. The swipe starts from the left edge of the surface frame and moves towards the right. The speed of the swipe is specified as a duration, which gets turned into a dx/dy from the width or height.
13. Button Chord: This surface takes two inputs and waits until both inputs are active before synthesizing a tap. Unlike the input transform version of the button chord, this version implements a proper state machine and applies hysteresis to the buttons, requiring, for example, a release of both before it can be retriggered.
14. Button Anti-chord: This surface is the counterpart to the button chord surface which instead masks the secondary input when the primary input is not active.
15. Virtual Callback: This is a virtual surface that triggers a callback message to be sent back to the platform operating service when the Boolean input goes from false to true (rising edge). The surface parameter is passed on in the callback packet which can be used to trigger specific behavior. The main use case this is designed for is triggering a controller map change automatically when a button is pressed. For example, pressing the button assigned to “enter vehicle” in PUBG not only taps on the screen but also indicates to the app to load the controller map for vehicle mode.
16. Double Tap Joystick: This is a variation of the radial joystick where a secondary input can override the joystick press in order to execute a double tap gesture. This surface is specifically designed to work with PS Remote Play's L3 button mapping. This surface is used almost exclusively with the Joystick Plus Button input node type. The parameter field of this surface controls the pulse width and spacing timing information for the double tap gesture.
17. Swipe Up Joystick: This is similar to the double tap joystick except the secondary input triggers a swipe gesture instead. Some games, such as PUBG, implement the sprint function by having the user swipe up from the top of the joystick, basically moving the finger well outside of the joystick in the up direction. While the swipe gesture is being executed, the joystick values are forced to 0. In addition, the value override is held for a short period afterwards to suppress any unintentional movement of the joystick resulting from pressing L3/R3. In practice, the swipe timing is long enough to debounce any joystick movement.
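For illustration only, the following Kotlin sketch shows how a radial joystick surface (surface 2 above) could scale an input X/Y vector by the surface's width and height to produce an absolute touch position; the class name and return type are illustrative assumptions rather than the actual engine API.

```kotlin
// Sketch of a radial joystick surface: scale an X/Y vector in [-1, 1] around
// the surface center to produce an absolute touch position.
class RadialJoystickSurface(
    private val centerX: Float, private val centerY: Float,
    private val width: Float, private val height: Float
) {
    // Returns null when the joystick is at rest, so no touch contact is emitted.
    fun synthesize(x: Float, y: Float): Pair<Float, Float>? {
        if (x == 0f && y == 0f) return null
        val touchX = centerX + x * (width / 2f)
        val touchY = centerY + y * (height / 2f)
        return touchX to touchY
    }
}
```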
Input nodes come in a variety of flavors. Some input nodes take in multiple physical controls and combine them to produce an output. For example, the Joystick Aim Button takes in two joystick axis values and a button input. Whenever the button input is pressed, the joystick values are allowed to drive the surface, but, when not pressed, the surface is inactive. On the other hand, a different 2-axis joystick node can have exclusion inputs that cause it to become inactive when the button input is pressed. This essentially allows the two different surfaces to be mutually exclusive, allowing the joystick to be used for multiple purposes depending on what button is pressed.
Touch screens on mobile devices come in a variety of aspect ratios, densities, and even unique shapes and/or cutouts. As a result, game developers are presented with a problem: they need to dynamically scale and position their in-game controls based on the various screen attributes of the specific mobile computing device the game is running on. This makes certain implementations of a “virtual controller” more complicated or precarious because the physical screen locations (for inputs such as tap and swipe) are device-dependent and, thus, can vary greatly from device to device. However, mobile apps and games tend to follow common layout principles and standard practices to adapt to different screen permutations, so the virtual controller can succeed by reproducing the same calculations being performed in the game engine.
While compiling, the screen properties, such as width, height, and pixel density, can be used to transform surface positions and size. It may be desired to have the largest screen size possible to capture the entire scope of inputs from a user. One aspect of the map files and layout process is the consistent use of a density-independent pixel format. It may be easier to scale layout variables relative to the local device pixel density by factoring out the pixel density. In addition, the local screen width and height are used to calculate common anchor positions, such as top, left, bottom, right, and center. These anchors then are referenced in the map file referred to in
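Purely as an illustration (with hypothetical names), the following Kotlin sketch shows how the common anchor positions mentioned above could be derived from the local screen's width, height, and pixel density, expressed in density-independent pixels.

```kotlin
// Sketch of deriving density-independent anchor positions from screen properties.
data class Anchors(
    val left: Float, val right: Float, val top: Float, val bottom: Float,
    val centerX: Float, val centerY: Float
)

fun computeAnchors(widthPx: Int, heightPx: Int, density: Float): Anchors {
    val widthDp = widthPx / density
    val heightDp = heightPx / density
    return Anchors(
        left = 0f, right = widthDp,
        top = 0f, bottom = heightDp,
        centerX = widthDp / 2f, centerY = heightDp / 2f
    )
}

// A surface anchored to the bottom-right corner could then be expressed
// relative to these anchors, e.g., (anchors.right - 72f, anchors.bottom - 72f).
```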
Each game may render its touch controls slightly differently, which may have an impact on the controller mapping designed for that game. In many cases, a relative layout engine is used, often with anchors, safe-area (e.g., the area on the mobile device's screen that accepts touch inputs), and other predetermined criteria. In some cases, a game will use a proportional scheme, where the aspect ratio and/or size of the screen scales the position of controls. Since games may be played on many different screen sizes, there may even be situations where completely different control schemes exist based on device class (e.g., mobile phone vs. tablet). Virtual controller maps therefore mirror how the game's control schemes are organized. For example, if the game has different layout schemes based on the aspect ratio, a different controller map is created for each ratio. On the other hand, if the game uses a relative layout scheme following standard safe area guidelines, a single controller map may suffice. The engine's ability to auto-scale to capture more types of devices makes the feature easy to use. In addition, being able to support multiple game modes makes the virtual controller work much more similarly to how a game controller would natively work.
When building device-independent layouts for a virtual controller map, it may be necessary to first collect positional information while in game. One of the practical approaches is to start from screenshots, which inherently capture the raw contents and coordinates of the local screen. In order to build a robust understanding of the layout, multiple aspect ratios may be processed. By including multiple data points, it is possible to estimate the layout equations, and test their effectiveness.
When annotating the different touch surfaces to create the controller map, the key information is the X, Y position of the touch input as well as the width and height when applicable. The X, Y position may be a key value for the virtual touch location, but the width, height may also be important for overlay rendering, as well as constraining surfaces with complex motion (virtual joystick, pan joystick, etc.). When comparing coordinate data across devices, it may be important to convert into density-independent pixels, and, therefore, it may be important to record metadata for the local device into the controller map, such as display density. When processing the raw coordinate data (either by hand or by tool), the following criteria may be assessed to help narrow down possible layout schemes: (1) do controls have a consistent offset from the edge of the screen? What about the center?; (2) do controls appear to be inset to account for display cutouts/notches?; and (3) do controls scale in their size or remain fixed across screen sizes? If done by hand, these criteria can often be visually recognized by overlaying multiple screenshots in a photo editor or a custom tool. This method is used to create a detailed, comprehensive controller map that can be used with the virtual controller.
To automate some of the controller map “curation” process, an algorithm can be used to evaluate several different possible layout equations and select the one with the lowest error. The algorithm can import three or more controller maps that contain consistent surface allocation but with varied screen positional data. For each surface index, the algorithm can calculate the lowest-error layout scheme across all input controller maps. For this step, the algorithm can iterate over a collection of common layout schemes and sum the error/deviation for each controller map. Each layout scheme can have an implied relational function. For this process, the algorithm can work backwards from the screen position data to solve the value field of the relation. When evaluating each layout scheme, the hypothetical layout parameters are injected into the surface to produce absolute positions based on the screen info. To calculate the error, the algorithm can take the difference between the projected position and the actual position. With each surface now having a “best-fit” layout function, a new density-independent controller map file can be produced.
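As an illustration of this curation process (a simplified sketch under assumed names such as LayoutScheme, Sample, and bestFit), the following Kotlin code evaluates a small set of candidate layout schemes for the X coordinate of one surface across several annotated controller maps and keeps the scheme with the lowest total deviation.

```kotlin
import kotlin.math.abs

// One annotated observation of a surface position on a particular screen.
data class Sample(val screenWidthDp: Float, val screenHeightDp: Float, val xDp: Float)

// A candidate layout scheme with an implied relational function: solve the
// parameter from one observation, then project a position from that parameter.
interface LayoutScheme {
    fun solve(sample: Sample): Float
    fun project(param: Float, sample: Sample): Float
}

object OffsetFromRight : LayoutScheme {
    override fun solve(s: Sample) = s.screenWidthDp - s.xDp
    override fun project(param: Float, s: Sample) = s.screenWidthDp - param
}

object ProportionalX : LayoutScheme {
    override fun solve(s: Sample) = s.xDp / s.screenWidthDp
    override fun project(param: Float, s: Sample) = param * s.screenWidthDp
}

// Pick the scheme whose projected positions deviate least from the annotations.
fun bestFit(samples: List<Sample>, schemes: List<LayoutScheme>): LayoutScheme =
    schemes.minByOrNull { scheme ->
        val param = scheme.solve(samples.first())
        samples.sumOf { s -> abs(scheme.project(param, s) - s.xDp).toDouble() }
    }!!
```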
A virtual controller solution needs to solve for complex “modes” that exist within games. In games, there are often different “modes” that reflect different user experiences within a singular game. The inputs provided in the touch synthesis engine can provide contextual clues to identify when game modes can be shifted and automatically make this choice for the user. Examples of these game modes are provided below.
(a) Controller flows: One embodiment of this solution is “Controller Flows.” A controller flow is a set of controller maps that can be switched between via contextual gestures. Flows can vary from bimodal maps to nested or non-linear sequences. Once the virtual controller is synchronized with the game state, it is possible to control and update a model game state that determines which controller map to use. In a simple sense, multiple controller maps are being linked together via gestures. The model of the game state does not need to be overly complex. In some ways, one can think of the game state as a storyboard, and that certain gestures allow you to transition between the frames.
Bimodal Controller Map: In a simple example, a game like PUBG can have a controller map for combat mode and vehicle mode. Since PUBG already has a distinct button to tap to enter a vehicle, the controller map switch request can piggyback on this touch surface. So, the user taps a button to enter the vehicle and at the same time automatically switches to the vehicle controller map. Similarly, the vehicle mode has a distinct button to exit the vehicle which in turn can switch the controller map back to combat mode.
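By way of illustration only, the following Kotlin sketch models a bimodal controller flow of the kind described above; the enum values and trigger strings (“enter_vehicle”, “exit_vehicle”) are hypothetical placeholders, not actual map data.

```kotlin
// Sketch of a bimodal controller flow: two maps linked by contextual gestures.
enum class GameMode { COMBAT, VEHICLE }

class ControllerFlow(private var mode: GameMode = GameMode.COMBAT) {
    // Called when a surface with a mode-switch callback (e.g., the
    // enter/exit vehicle button) is triggered.
    fun onCallback(trigger: String): GameMode {
        mode = when {
            mode == GameMode.COMBAT && trigger == "enter_vehicle" -> GameMode.VEHICLE
            mode == GameMode.VEHICLE && trigger == "exit_vehicle" -> GameMode.COMBAT
            else -> mode
        }
        return mode // the platform operating service then loads the map for this mode
    }
}
```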
Unreliable game state: In some cases, the button that triggers a new game state may not be a reliable signal to the game controller system. For example, in a game such as Honkai: Star Rail, the player can use a simple attack button to interact with destructible objects in the game world. However, this attack is also used to engage enemies in the world, and when this occurs the game transitions to a turn-based battle state. To handle cases like this, a button can be overloaded with an additional gesture. In this example, a short press of the button could invoke the existing attack function, but a long hold of the button could switch into the battle mode. The player does need to remember to switch into the mode, but because the gesture is on the attack button, it is easier than selecting via a menu. Similarly, the battle may complete automatically when the last enemy is defeated, with no contextual clue that the game is transitioning to the exploration state. In this case, the same button hold gesture can be used to toggle back to the exploration controller map.
Game State reset: Inevitably, the game controller system may become out of sync with the actual game state. This can happen for a variety of reasons, often through no fault of the controller system. However, it is important to provide a means for the player to sync back up the state. There are two approaches that come to mind: (i) provide a button on the controller, such as a software service button (“SSB”), which can operate as a reset to reload the first map in the controller flow (the SSB is available in all maps and modes and can be a reliable way to get back to a known state), or (ii) provide a service overlay or menu to select from a list of possible controller maps in the flow, to immediately jump to the desired controller state. These approaches need not be mutually exclusive. Both have their merits, and they may be used together.
In addition, an overlay menu can potentially have the same issue as the button toggle/cycle approach because you may need to scroll down to the mode you want. To address this, a radial wheel (see
Virtual Cursor: Controller flows can also incorporate virtual cursor “leaf nodes” in situations where the surface action is to open up a complex menu or inventory screen. These simple controller maps generally consist of two surfaces: (i) a dynamic cursor surface that can be moved and clicked, and (ii) a button surface that can exit from the screen. Because the user is basically given an unconstrained mouse pointer to navigate, the system should also handle the case where the user clicks the “close button” via the virtual cursor as opposed to using a game controller button such as the B button.
In many ways, a virtual cursor leaf node is analogous to a modal dialog in traditional user interfaces. In a very complex set of controller maps, it may be most intuitive for a user to press a consistent button to enter the virtual cursor, which could even be accessible from any standard (non-cursor) controller map.
Other virtual cursor embodiments:
(a) The virtual cursor can also be constrained within a subset of the entire screen for usability. This is primarily to avoid the user scrolling too far away from a narrow target area. The region that the virtual cursor moves within can also incorporate “snap points,” where the joystick sensitivity is modulated when the cursor is nearby. This form of aim assist can be fairly helpful for usability, but since the true contents of the screen are not known, the snapping should be low intensity and probably should not hard snap (e.g., snap the cursor directly to the anchor point).
(b) Radial wheel: As mentioned above, another embodiment of this solution offers a radial wheel for users to seamlessly toggle between different modes. Users can also toggle between modes using controller inputs, specifically joystick or button commands. It offers a quick, intuitive way to switch between modes and ensures the user can always see the correct glyph hints in any mode. The user can trigger the radial wheel menu with a shortcut button that they can select (L3 or another trigger button that is subject to change via testing).
In another embodiment, a unique icon set displayed on the radial wheel represents each game mode within the game. The icons can be intuitive and ensure accessibility and user understanding regardless of the user's spoken language. In many cases, icons can be custom to games to ensure maximum accessibility and can be abstractions that are shared with each game's iconography. Example icons are shown in the radial wheel in the screen shot of
Using “systems level access” permission granted via the game controller, the “virtual controller” feature can have several enhanced features, which may require the user to accept “enhanced” Android system-level permissions. If the user grants this “systems level access,” a suite of enhanced features is available. In one embodiment, the Android OS uses an internal hardware accelerometer in the computing device to determine screen orientation. When the computing device is rotated into a portrait orientation, it can be deduced that the player is not in a game, and the visual overlay can be disabled/hidden so as to not obscure the screen while the user is conducting other actions on the device. In addition, in this enhanced mode embodiment, supplemental data from the OS, such as the foreground operation, can be used to determine that the virtual controller game is not in focus, to the same effect. When this occurs, the visual overlay of the button glyphs can be disabled so the user can fully use their phone for other applications, such as reviewing and sending text messages or email, making phone calls, or performing other actions. Once it is detected that the phone has returned to landscape mode, the virtual controller button glyph overlay can be re-enabled, so that the user can seamlessly get back to gaming. In other embodiments of the enhanced mode, the software can automatically enable these custom mappings, with no setup required, whenever the game is detected in the foreground.
This is illustrated in the flow chart 1800 in
This method can detect when a user launches a virtual controller game outside of the game controller app and trigger the hints overlay. This method can also detect if the controls have been modified to impair the appropriate feature usage. One embodiment can identify if a user is using a non-standard control layout in a game and automatically disable the “virtual controller” or implement custom mappings pre-set by the user. Another embodiment can alert a user with a prompt to revert to default custom mappings.
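For illustration only, and assuming the enhanced system-level permission described above has been granted, the following Kotlin sketch shows one way the platform operating service could hide the button glyph overlay when the device rotates to portrait or the virtual controller game leaves the foreground, and restore it in landscape. The OverlayController interface and isVirtualControllerGameForeground() check are hypothetical placeholders, not part of the actual app.

```kotlin
import android.content.res.Configuration

// Hypothetical overlay handle exposed by the platform operating service.
interface OverlayController {
    fun show()
    fun hide()
}

class OverlayVisibilityPolicy(private val overlay: OverlayController) {

    // Invoked from the app's onConfigurationChanged callback.
    fun onConfigurationChanged(newConfig: Configuration) {
        val portrait = newConfig.orientation == Configuration.ORIENTATION_PORTRAIT
        if (portrait || !isVirtualControllerGameForeground()) {
            overlay.hide() // do not obscure the screen during non-game use
        } else {
            overlay.show() // back in landscape with the game in focus
        }
    }

    private fun isVirtualControllerGameForeground(): Boolean {
        // Placeholder: with the enhanced permission, the service could query
        // the foreground package here and compare it to the game's package.
        return true
    }
}
```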
There are several advantages associated with the embodiments described above. For example, with these embodiments, a user can launch a game that does not have official game controller support and instantly start playing. As previously mentioned, custom mapping can be a highly-manual experience where a user needs to map controller inputs to unique game controls. Using the solution proposed herein, the user does not have to deal with the complicated process of dropping in all the controls and mapping things themselves. In addition, the Touch Synthesis Engine of these embodiments provides a great deal of flexibility to map controls, allowing more-advanced rules between the buttons and joysticks. The Touch Synthesis Engine allows the ability to map more-nuanced controls not possible in other solutions. The following provides some examples of nuanced controller commands that may be utilized to actuate nuanced touch inputs using the virtual controller solution described herein:
Radial Joystick vs Pan Joystick: There are two common joystick permutations found in most games. The first is the Radial Joystick, which operates almost identically to a physical joystick where a center point is dragged to an X, Y point, constrained to a unit circle. This is most commonly used for left/right forward/back movement of the player character. The Pan Joystick, on the other hand, operates quite a bit differently and is commonly used to control a 3D camera in first- or third-person games. The way the pan joystick usually works is that the relative distance from where you started dragging/panning is translated into pitch/yaw angle of the camera. Very often, there is no visible UI element for this control, instead tapping anywhere else on the screen (or sometimes on the right half) is interpreted as a pan gesture. Nuanced guidelines can be used to determine which joystick is appropriate for the game at hand.
In order to produce continuous motion with the pan joystick (mapped to a physical joystick on the controller), it may be necessary for the system to generate repeated swipe/pan gestures. As a result, the touch region can be encoded to be as large as possible. The ability to define custom surfaces allows for optimally taking advantage of the screen size and shape to provide the most surface area for these calculations, minimizing the frequency of the calculation loop and providing a smoother experience to the end user. For example, if a user can tap anywhere in empty areas to control the camera, it may be best to set up the X, Y as the center of the screen and the width, height to stretch to the size of the screen.
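The following Kotlin sketch illustrates (under assumed names and parameters, such as PanJoystick and the speed constant) how the pan joystick behavior described above might accumulate the joystick vector into a tracked touch position and wrap back toward the center of the surface frame when an edge is reached, producing a series of swipe gestures.

```kotlin
// Sketch of a pan joystick surface that generates repeated swipe movements.
class PanJoystick(
    private val frameWidth: Float, private val frameHeight: Float,
    private val speed: Float = 8f // position change per update at full deflection
) {
    private var x = frameWidth / 2f
    private var y = frameHeight / 2f

    // Returns the current touch position, or null (touch released) when at rest.
    fun update(joyX: Float, joyY: Float): Pair<Float, Float>? {
        if (joyX == 0f && joyY == 0f) return null
        x += joyX * speed
        y += joyY * speed
        // Wrap back when the edge of the frame is reached, lifting and
        // re-placing the touch to start the next swipe.
        if (x !in 0f..frameWidth || y !in 0f..frameHeight) {
            x = frameWidth / 2f
            y = frameHeight / 2f
        }
        return x to y
    }
}
```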
The joysticks are just one example of many possible touch surfaces the technology can map to in a more advanced way than previous solutions. The joysticks, in particular, just happen to require more-advanced movement. In certain modes of a specific game, the joystick can be mapped to four distinct touch surfaces, rather than a virtual joystick. This is in addition to the dpad also being used for the steering controls. The joystick demux basically decodes the X/Y angle of the joystick and turns that into four binary signals based on what quadrant the user is in (with some overlap for diagonals).
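For illustration only, the following Kotlin sketch decodes a joystick vector into four quadrant signals with a configurable overlap band for diagonals; the names, the dead-zone threshold, and the default overlap angle are assumptions made for the sketch, not values from the actual map data.

```kotlin
import kotlin.math.abs
import kotlin.math.atan2
import kotlin.math.hypot

data class DpadSignals(val up: Boolean, val down: Boolean, val left: Boolean, val right: Boolean)

// Decode the joystick angle into quadrant signals, allowing adjacent
// directions to overlap so diagonals activate two signals at once.
fun demux(x: Float, y: Float, overlapDeg: Double = 15.0): DpadSignals {
    if (hypot(x, y) < 0.2f) return DpadSignals(false, false, false, false) // dead-zone
    val angle = Math.toDegrees(atan2(y.toDouble(), x.toDouble())) // -180..180, 0 = right
    fun near(center: Double): Boolean {
        val diff = abs(((angle - center + 540.0) % 360.0) - 180.0)
        return diff <= 45.0 + overlapDeg
    }
    return DpadSignals(up = near(90.0), down = near(-90.0), left = near(180.0), right = near(0.0))
}
```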
Another example of advanced control mapping is button chords and their exclusion counterparts. In the PUBG map, the right joystick is not only used for camera control but also for choosing grenades or healing. When L1/R1 is held down, the camera controls get excluded while the circular menus for grenade or heal are accessed.
A “gesture first” approach starts with the motion and touch dynamics that the user was making with their finger, and works back from that to map these gestures to game controller inputs. The system is also designed to be modular, so that a special dpad node can be used to convert the four directions into a single X/Y vector and connect to the pan joystick (with a constraint to eight degrees of freedom in camera rotation).
The screen attributes can also be adapted to the user's phone to achieve automatic support. This may not be easily achievable through a user-generated model. In addition, in-depth studies of a target game can be performed to produce a high-quality control scheme that can be on-par with a developer chosen map.
Overall, the aforementioned embodiments can provide an improved (e.g., optimal) user experience for a game player, as compared to previous solutions, which were messy and overly manual in that they required a user to custom map buttons to each supported game in their app in an imprecise way. In contrast, with these embodiments, a user can accept an Android device permission in the game controller's app and simply start playing a virtual controller game without any extra fuss or customization.
Any embodiment, implementation, feature, and/or example described herein is not necessarily to be construed as preferred or advantageous over any other embodiment, implementation, feature, and/or example unless stated as such. Thus, other embodiments, implementations, features, and/or examples may be utilized, and other changes may be made without departing from the scope of the subject matter presented herein. Accordingly, the details described herein are not meant to be limiting. It will be readily understood that the aspects of the present disclosure, as generally described herein, and illustrated in the figures, can be arranged, substituted, combined, separated, and designed in a wide variety of different configurations.
Further, unless the context suggests otherwise, the features illustrated in each of the figures may be used in combination with one another. Thus, the figures should be generally viewed as component aspects of one or more overall embodiments, with the understanding that not all illustrated features are necessary for each embodiment. Additionally, any enumeration of elements, blocks, or steps in this specification or the claims is for purposes of clarity. Thus, such enumeration should not be interpreted to require or imply that these elements, blocks, or steps adhere to a particular arrangement or are carried out in a particular order.
Further, terms such as “A coupled to B” or “A is mechanically coupled to B” do not require members A and B to be directly coupled to one another. It is understood that various intermediate members may be utilized to “couple” members A and B together.
Moreover, terms such as “substantially” or “about” that may be used herein mean that the recited characteristic, parameter, or value need not be achieved exactly, but that deviations or variations, including, for example, tolerances, measurement error, measurement accuracy limitations, and other factors known to those skilled in the art, may occur in amounts that do not preclude the effect the characteristic was intended to provide.
Also, when reference is made in this application to two or more defined steps or operations, such steps or operations can be carried out in any order or simultaneously, unless the context excludes those possibilities. Furthermore, the term “comprises” and its grammatical equivalents are used in this application to mean that other components, features, steps, processes, operations, etc. are optionally present. For example, an article “comprising” or “which comprises” components A, B, and C can contain only components A, B, and C, or it can contain components A, B, and C along with one or more other components. Additionally, directions such as “right” and “left” (or “top,” “bottom,” etc.) are used for convenience and in reference to the views provided in figures. But the game controller may have a number of orientations in actual use. Thus, a feature that is vertical, horizontal, to the right, or to the left in the figures may not have that same orientation or direction in actual use.
It is intended that the foregoing detailed description be understood as an illustration of selected forms that the invention can take and not as a definition of the invention. It is only the following claims, including all equivalents, that are intended to define the scope of the claimed invention. Finally, it should be noted that any aspect of any of the embodiments described herein can be used alone or in combination with one another.