Interaction with a browser or mobile application user interface may involve input using a variety of input devices such as, for example, a keyboard, a mouse, a trackball, a joystick, a touch screen, or other input device. Input mechanisms vary in the number and types of events they are capable of transmitting. In addition, the range of available input devices is expanding as technology advances.
Many aspects of the present disclosure can be better understood with reference to the following drawings. The components in the drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the disclosure. Moreover, in the drawings, like reference numerals designate corresponding parts throughout the several views.
The present disclosure relates to implementing a variety of user actions on a touch sensitive client device for media applications. Various embodiments of the present disclosure facilitate translation of touch events received from a touch sensitive client device into corresponding inputs recognizable by a media application. For example, in some embodiments, a media application may be executed by a computing device such as a server. The media application generates a video transmission that is ultimately rendered in the form of a user interface on a touch sensitive client device. Input from the client device may be received by an input mapping application over a network and subsequently translated into a corresponding input recognized by the media application. The media application performs the corresponding user action and responds with the appropriate changes in output to the video transmission that is transmitted to the touch sensitive client device over the network. In the following discussion, a general description of the system and its components is provided, followed by a discussion of the operation of the same.
With reference to
The computing device 103 may comprise, for example, a server computer or any other system providing computing capability. Alternatively, a plurality of computing devices 103 may be employed that are arranged, for example, in one or more server banks or computer banks or other arrangements. For example, a plurality of computing devices 103 together may comprise a cloud computing resource, a grid computing resource, and/or any other distributed computing arrangement. Such computing devices 103 may be located in a single installation or may be distributed among many different geographical locations. For purposes of convenience, the computing device 103 is referred to herein in the singular. Even though the computing device is referred to in the singular, it is understood that a plurality of computing devices 103 may be employed in the various arrangements as described above.
Various applications and/or other functionality may be executed in the computing device 103 according to various embodiments. Also, various data is stored in a data store 113 that is accessible to the computing device 103. The data store 113 may be representative of a plurality of data stores 113 as can be appreciated. The data stored in the data store 113, for example, is associated with the operation of the various applications and/or functional entities described below.
The components executed on the computing device 103, for example, include a media application 116, an input mapping application 119, and other applications, services, processes, systems, engines, or functionality not discussed in detail herein. The media application 116 is executed to serve up or stream video and/or other media generated by an application to the client 106 that may comprise, for example, a touch screen display device 146. To this end, the media application 116 may generate various streaming or otherwise transmitted content such as, for example, games, simulations, maps, movies, videos, and/or other multimedia files.
The media application 116 may communicate with the client 106 over various protocols such as, for example, hypertext transfer protocol (HTTP), simple object access protocol (SOAP), real-time transport protocol (RTP), real time streaming protocol (RTSP), real time messaging protocol (RTMP), user datagram protocol (UDP), transmission control protocol (TCP), and/or other protocols for communicating data over the network 109. The input mapping application 119 is executed to facilitate receipt of various user inputs from the client 106 that include, for example, hovering, selecting, scrolling, zooming, and/or other operations.
The data stored in the data store 113 includes, for example, touch screen model(s) 123, user account(s) 126, and potentially other data. Each of the touch screen model(s) 123 includes various data associated with a corresponding mobile device including, for example, specifications 129, input mapping regions 133, and/or other information. In addition, specifications 129 associated with each of the touch screen model(s) 123 may include various data including dimensions, size, structure, shape, response time, and/or other data. Input mapping regions 133 are areas defined in a touch screen display device 146 to which specific functions in the media application 116 are assigned. Touch events occurring in such areas are ultimately translated into corresponding inputs recognized by the media application 116. Touch events represent points of contact with the touch screen display device 146 and changes of those points with respect to the touch screen display device 146. Touch events may include, for example, tap events, drag events, pinch events, mouse up events, mouse down events, mouse move events, and/or other points of contact with the touch screen display device 146. Inputs recognized by the media application 116 may comprise, for example, scroll commands, hover commands, zoom commands, or other commands as will be described.
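The following is a minimal sketch, in Python, of how the touch screen model(s) 123, specifications 129, and input mapping regions 133 described above might be represented. The class names, field names, and the use of rectangular bounds are illustrative assumptions made for clarity, not the data layout of the disclosure.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class InputMappingRegion:
    # Rectangle on the touch screen coordinate plane: (left, top, right, bottom).
    bounds: Tuple[int, int, int, int]
    # Input recognized by the media application 116 that this region is
    # assigned to, for example "scroll_left", "zoom", or "select".
    assigned_input: str

@dataclass
class TouchScreenModel:
    # Specifications 129 such as dimensions and response time.
    width_px: int
    height_px: int
    response_time_ms: float
    # Input mapping regions 133 defined for this model.
    regions: List[InputMappingRegion] = field(default_factory=list)
```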
Each user account 126 includes various data associated with a user that employs the client 106 to interact with the media application 116. Each user account 126 may include user information 136 such as usernames, passwords, security credentials, authorized applications, and/or other data. Customization data 139 includes settings made by a user employing a client 106 that specify a user customization or alteration of default versions of the input mapping regions 133. Additionally, customization data 139 may include various other aspects of the user's viewing environment. When a user employing a client 106 customizes the input mapping regions 133, the computing device 103 maintains customization data 139 that defines customized versions of the input mapping regions 133 in the data store 113 for use in interacting with the media application 116 as rendered on the client 106. The customization data 139 may correspond to data associated with the input mapping regions 133 saved normally by the media application 116 or may correspond to a memory image of the media application 116 that may be resumed at any time.
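As an illustration only, the sketch below shows one way customization data 139 might override the default input mapping regions 133 for a particular user account 126. It reuses the InputMappingRegion dataclass sketched earlier and assumes the customization is keyed by the assigned input and stores replacement bounds; neither assumption is taken from the disclosure.

```python
def apply_customization(default_regions, customization):
    """Return per-user input mapping regions, falling back to the defaults
    where no customization exists. customization is assumed to map an
    assigned input name to replacement bounds."""
    customized = []
    for region in default_regions:
        override = customization.get(region.assigned_input)
        if override is not None:
            # The user resized or moved this region; use the saved bounds.
            customized.append(InputMappingRegion(
                bounds=tuple(override),
                assigned_input=region.assigned_input,
            ))
        else:
            customized.append(region)
    return customized
```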
The client 106 is representative of a plurality of client devices that may be coupled to the network 109. The client 106 may comprise, for example, a processor-based system such as a computer system. Such a computer system may be embodied in the form of a desktop computer, a laptop computer, a personal digital assistant, a cellular telephone, a music player, a web pad, a tablet computer system, a game console, a touch screen monitor, a smartphone, or other devices with like capability.
The client 106 may include a touch screen display device 146 and may include one or more other input devices. Such input devices may comprise, for example, devices such as keyboards, mice, joysticks, accelerometers, light guns, game controllers, touch pads, touch sticks, push buttons, optical sensors, microphones, webcams, and/or any other devices that can provide user input.
The client 106 may be configured to execute various applications such as a client side application 143 and/or other applications. The client side application 143 is executed to allow a user to launch, play, and otherwise interact with a media application 116 executed in the computing device 103. To this end, the client side application 143 is configured to receive input provided by the user through a touch screen display device 146 and/or other input devices and send this input over the network 109 to the computing device 103 as input data. The client side application 143 is also configured to obtain output video, audio, and/or other data over the network 109 from the computing device 103 and render a view of the media application 116 on the touch screen display device 146. To this end, the client side application 143 may include one or more video and audio players to play out a media stream generated by the media application 116. In one embodiment, the client side application 143 comprises a plug-in within a browser application. The client side application 143 may be executed in a client 106, for example, to access and render network pages, such as web pages, or other network content served up by the computing device 103 and/or other servers. To this end, the client side application 143 renders streamed or otherwise transmitted content in the form of a user interface 149 on a touch screen display device 146. The client 106 may be configured to execute applications beyond the client side application 143 such as, for example, browser applications, email applications, instant message applications, and/or other applications.
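As a hedged sketch of the input data a client side application 143 might send over the network 109 for each touch event, the Python fragment below posts one coordinate input to the computing device 103. The JSON field names, the "/input" endpoint path, and the function name send_touch_event are assumptions introduced for illustration and are not part of the disclosure.

```python
import json
import urllib.request

def send_touch_event(server_url, event_type, x, y):
    """POST a single touch event with its coordinate input to the computing
    device 103 and return the raw response body."""
    payload = json.dumps({"type": event_type, "x": x, "y": y}).encode("utf-8")
    request = urllib.request.Request(
        server_url + "/input",  # hypothetical endpoint, not from the disclosure
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.read()
```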
Next, a general description of the operation of the various components of the networked environment 100 is provided. To begin, a user at a client 106 sends a request to a computing device 103 to launch a media application 116. The computing device 103 executes media application 116 in response to the appropriate user input. On first access, the media application 116 may query the client 106 in order to determine the type of touch screen model 123 of the client 106. In one embodiment, as an initial setting, the media application 116 may determine, based on the type of touch screen model 123, the input mapping regions 133 that are to be used for various input at the client 106. In another embodiment, as an initial setting, the media application 116 may determine, based on the type of media application 116, the input mapping regions 133 that are to be used for various input at the client 106. Input mapping regions 133 may vary based on different types of applications, classes of applications, different types of clients, different classes of clients and/or other considerations.
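One way the initial setting described above might be realized is a simple lookup from the reported touch screen model 123 to a default set of input mapping regions 133, as sketched below. The model identifiers, the fallback behavior, and the (empty) placeholder layouts are assumptions for illustration only.

```python
DEFAULT_REGIONS_BY_MODEL = {
    # touch screen model identifier -> list of InputMappingRegion instances;
    # the identifiers and empty layouts here are placeholders only.
    "tablet-7in": [],
    "tablet-10in": [],
}

def initial_regions(model_id, fallback="tablet-7in"):
    """Return the default input mapping regions 133 for the reported touch
    screen model 123, or a generic layout if the model is unknown."""
    return DEFAULT_REGIONS_BY_MODEL.get(model_id, DEFAULT_REGIONS_BY_MODEL[fallback])
```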
Additionally, the media application 116 may facilitate the creation of a user account 126 by providing one or more user interfaces 149 for establishing the user account 126 if the user account 126 has not already been established. For instance, the media application 116 may prompt the user to indicate a name for the user account 126, a password for the user account 126, and/or any other parameter or user information 136 for establishing the user account 126. In another embodiment, the media application 116 facilitates specification of customization data 139 associated with input mapping regions 133 if a user employing a client 106 wishes to customize the input mapping regions 133. As a result, the media application 116 may adjust an area of one or more of the input mapping regions 133 based on such customization, where such changes are stored as the customization data 139.
In one embodiment, a user employing a client 106 touches the touch screen display device 146 using a finger, stylus, and/or other device. A coordinate input corresponding to the touch event is generated by the client side application 143 and sent to the input mapping application 119. The input mapping application 119 determines if the touch event occurred within one of the input mapping regions 133. When the input mapping application 119 determines that the touch event occurred within one of the input mapping regions 133, the input mapping application 119 translates the touch event received in the client side application 143 into a corresponding input that is recognizable by the media application 116 such as, for example, hovering, selecting, scrolling, zooming, and/or other actions. The input mapping application 119 then sends the corresponding input to the media application 116.
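A minimal sketch of this determination and translation step, assuming rectangular regions represented as in the earlier data-model sketch, is shown below. The function names region_containing and translate_touch_event are illustrative, not names used by the disclosure.

```python
def region_containing(regions, x, y):
    """Return the first input mapping region whose bounds contain (x, y),
    or None if the touch landed outside every region."""
    for region in regions:
        left, top, right, bottom = region.bounds
        if left <= x <= right and top <= y <= bottom:
            return region
    return None

def translate_touch_event(regions, x, y):
    """Map a coordinate input to an input recognized by the media
    application 116, or None if no region applies."""
    region = region_containing(regions, x, y)
    return region.assigned_input if region else None
```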
The media application 116 performs the appropriate user action and modifies the graphical output in the video transmission. The media application 116 continually transmits the video transmission to the client side application 143 over the network 109 as the output data. Ultimately, the effect of the touch event performed by the user of the client 106 may be reflected in the client side application 143 as a corresponding user action such as, for example, hovering, selecting, scrolling, zooming, and/or other actions. Further, touch events generated at a client 106 may be mapped as other types of inputs generated by another type of input device. For example, a pinch gesture, corresponding to two fingers moving together on a touch screen to enable zooming, may be translated as a scroll wheel zoom action recognized by the media application 116.
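The fragment below is a hedged illustration of translating such a pinch gesture into a synthetic scroll wheel step. The threshold value, the +1/-1 wheel representation, and the function name pinch_to_wheel are assumptions made for the example; the disclosure does not specify this computation.

```python
import math

def pinch_to_wheel(prev_points, curr_points, threshold=2.0):
    """Given the two touch points of a pinch gesture in consecutive frames,
    return a synthetic scroll wheel step: +1 (zoom in) when the fingers move
    apart, -1 (zoom out) when they move together, 0 otherwise."""
    def spread(points):
        (x1, y1), (x2, y2) = points
        return math.hypot(x2 - x1, y2 - y1)

    delta = spread(curr_points) - spread(prev_points)
    if delta > threshold:
        return 1
    if delta < -threshold:
        return -1
    return 0
```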
As a non-limiting example, when a touch event is received in one of the input mapping regions 133 correlated with a scrolling action, the input mapping application 119 maps the touch event to a scrolling input and sends the scroll input to media application 116. Media application 116 scrolls a view of the video transmission in a predefined direction associated with the respective input mapping region 133. The scrolling video transmission is transmitted by the media application 116 to the client 106 over the network 109 as the output data. The client side application 143 obtains the output data and renders a view of the scrolling video transmission on the touch screen display device 146.
Referring next to
Although the example of a map used in
The input mapping regions 133 are correlated to a coordinate plane of the touch screen display device 146. The input mapping regions 133 may include button activation regions, selecting regions, scrolling regions, and/or other regions that are associated with one or more user actions. In one embodiment, each of the input mapping regions 133 has an outer border 203 that is aligned with an edge of the viewing area of the touch screen display device 146, where such input mapping regions 133 are used to generate a scrolling input. In one embodiment, a speed of the scroll action is determined to be proportional to a distance between the outer border 203 and the coordinate input of a touch event relative to the total distance between the outer border 203 and the inner border 206 of the respective one of the input mapping regions 133. In another embodiment, the speed of the scroll action is determined to be proportional to the distance between the inner border 206 and the coordinate input of a touch event relative to the total distance between the outer border 203 and the inner border 206 of the respective one of the input mapping regions 133.
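A worked sketch of both speed embodiments follows, assuming the outer border 203 and inner border 206 are parallel lines and the speed is normalized to a maximum value supplied by the caller. The function name scroll_speed, the max_speed parameter, and the measure_from_inner flag are illustrative, not part of the disclosure.

```python
def scroll_speed(coord, outer, inner, max_speed=1.0, measure_from_inner=False):
    """Return a scroll speed proportional to where the coordinate input falls
    between the outer border 203 and the inner border 206.

    coord, outer, and inner are positions along the axis perpendicular to the
    borders (x for a left or right region, y for a top or bottom region).
    """
    total = abs(inner - outer)
    if total == 0:
        return 0.0
    if measure_from_inner:
        # Second embodiment: proportional to the distance from the inner border 206.
        fraction = abs(coord - inner) / total
    else:
        # First embodiment: proportional to the distance from the outer border 203.
        fraction = abs(coord - outer) / total
    return max(0.0, min(1.0, fraction)) * max_speed
```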
The graphical components, such as input mapping regions 133, comprising information shown in
In another embodiment,
Accordingly, the StarCraft II media application 116 may expect input from a mouse scroll wheel, input from dragging a scroll bar, input from keyboard arrow keys, and/or other scroll input devices. Various embodiments of the present disclosure enable the input mapping application 119 to map the touch event to an appropriate input, such as a scroll input that is recognizable by the media application 116, and to send such input to the StarCraft II media application 116. The StarCraft II media application 116 scrolls a view of the video transmission or takes other appropriate action in accordance with the input. In the case of scrolling, the scrolling direction may be the same as that of the location of the respective input mapping region 133. However, it is noted that scrolling in some clients 106 may happen in a direction opposite the location of the respective input mapping region 133. The viewing area of the touch screen display device 146 may also include various user interface components for controlling the media application 116, exiting the media application 116, communicating with other users, controlling the audio, and/or other components.
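For illustration only, one way to deliver a scroll input in a form such an application already understands is to emit synthetic arrow-key presses, as in the brief sketch below. The direction names and key-event names are assumptions, not inputs defined by StarCraft II or by the disclosure.

```python
ARROW_KEY_FOR_DIRECTION = {
    "scroll_up": "KEY_UP",
    "scroll_down": "KEY_DOWN",
    "scroll_left": "KEY_LEFT",
    "scroll_right": "KEY_RIGHT",
}

def scroll_to_key_event(scroll_input):
    """Map a directional scroll input (e.g. "scroll_up") to the name of a
    synthetic arrow-key event, or None if the input is not directional."""
    return ARROW_KEY_FOR_DIRECTION.get(scroll_input)
```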
Referring next to
The flowchart sets forth an example of the functionality of the input mapping application 119 in translating touch events, combinations of touch events, and/or other touch gestures from the client 106 that specifically involve scrolling. While scrolling is discussed, it is understood that this is merely an example of the many different types of inputs that may be invoked with the use of an input mapping region 133. Specifically, the touch events comprise messages indicating coordinates of a touch or other manipulation of the touch screen display device (
Beginning with box 303, when a user employing a client 106 (
If the coordinate input corresponds to one of the input mapping regions 133 that corresponds to a scrolling action in box 303, the input mapping application 119 moves to box 316 and determines whether the coordinate input is associated with a mouse down event. Assuming the coordinate input does not correspond to a mouse down event, the input mapping application 119 moves to box 321. If the coordinate input is associated with a mouse down event, the input mapping application 119 proceeds to box 319. In box 319, the input mapping application 119 determines the direction of the scroll action based on a predefined direction associated with the respective one of the input mapping regions 133. Such a direction may be vertical, horizontal, diagonal, and/or other directions.
Next, the input mapping application 119 proceeds to box 323 and determines the speed of the scroll action. As an example, the input mapping application 119 (
Assuming that the mouse event is not a mouse down event as determined in box 316, the input mapping application 119 proceeds to box 321. In box 321, the input mapping application 119 determines whether the coordinate input is associated with a drag-action into one of the input mapping regions 133 from a position on the touch screen display device 146 that is located outside of the input mapping regions 133. As an example, a user employing a client 106 may initially provide a touch input to the touch screen display device 146 outside of the input mapping regions 133 (
If the coordinate input is not associated with a drag-action into one of the input mapping regions 133 as determined by box 321, the input mapping application 119 proceeds to box 333. In box 333, the input mapping application 119 determines if the coordinate input is associated with a drag-action within one of the input mapping regions 133. If the coordinate input is associated with a drag-action within one of the input mapping regions 133, the input mapping application 119 moves to box 323 to determine if a change in scroll speed is necessary as described above. Otherwise, the input mapping application 119 proceeds to box 336 and sends a command to the media application 116 to stop the scroll action. Thereafter, the input mapping application 119 ends.
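The sketch below condenses the scroll-handling flow just described into one handler, reusing the region_containing and scroll_speed sketches shown earlier. It assumes, for brevity, a scrolling region along the left screen edge so that only the x coordinate affects the speed; the event dictionary, the "scroll" naming convention, and the media_app methods start_scroll and stop_scroll are illustrative interfaces, not those of the disclosure.

```python
def handle_touch_event(regions, event, media_app, max_speed=1.0):
    """Condensed scroll handling for a region along the left screen edge.
    event is assumed to carry a type ("mouse_down", "drag", "mouse_up")
    and a coordinate input (x, y)."""
    region = region_containing(regions, event["x"], event["y"])
    scrolling = region is not None and region.assigned_input.startswith("scroll")

    if scrolling and event["type"] in ("mouse_down", "drag"):
        # Boxes 316-323 and 321/333: a mouse down in the region, or a drag
        # into or within it, starts or continues the scroll in the region's
        # predefined direction, with the speed recomputed from the
        # coordinate input.
        left, _, right, _ = region.bounds  # outer border 203 at x = left
        speed = scroll_speed(event["x"], outer=left, inner=right,
                             max_speed=max_speed)
        media_app.start_scroll(region.assigned_input, speed)
    else:
        # Box 336: any other case (for example a mouse up, or a drag that
        # leaves every scrolling region) stops the scroll action.
        media_app.stop_scroll()
```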
With reference to
Stored in the memory 403 are both data and several components that are executable by the processor 406. In particular, stored in the memory 403 and executable by the processor 406 are the media application 116, input mapping application 119 and potentially other applications. Also stored in the memory 403 may be a data store 113 and other data. In addition, an operating system may be stored in the memory 403 and executable by the processor 406.
It is understood that there may be other applications that are stored in the memory 403 and are executable by the processors 406 as can be appreciated. Where any component discussed herein is implemented in the form of software, any one of a number of programming languages may be employed such as, for example, C, C++, C#, Objective C, Java, Javascript, Perl, PHP, Visual Basic, Python, Ruby, Delphi, Flash, or other programming languages.
A number of software components are stored in the memory 403 and are executable by the processor 406. In this respect, the term “executable” means a program file that is in a form that can ultimately be run by the processor 406. Examples of executable programs may be, for example, a compiled program that can be translated into machine code in a format that can be loaded into a random access portion of the memory 403 and run by the processor 406, source code that may be expressed in proper format such as object code that is capable of being loaded into a random access portion of the memory 403 and executed by the processor 406, or source code that may be interpreted by another executable program to generate instructions in a random access portion of the memory 403 to be executed by the processor 406, etc. An executable program may be stored in any portion or component of the memory 403 including, for example, random access memory (RAM), read-only memory (ROM), hard drive, solid-state drive, USB flash drive, memory card, optical disc such as compact disc (CD) or digital versatile disc (DVD), floppy disk, magnetic tape, or other memory components.
The memory 403 is defined herein as including both volatile and nonvolatile memory and data storage components. Volatile components are those that do not retain data values upon loss of power. Nonvolatile components are those that retain data upon a loss of power. Thus, the memory 403 may comprise, for example, random access memory (RAM), read-only memory (ROM), hard disk drives, solid-state drives, USB flash drives, memory cards accessed via a memory card reader, floppy disks accessed via an associated floppy disk drive, optical discs accessed via an optical disc drive, magnetic tapes accessed via an appropriate tape drive, and/or other memory components, or a combination of any two or more of these memory components. In addition, the RAM may comprise, for example, static random access memory (SRAM), dynamic random access memory (DRAM), or magnetic random access memory (MRAM) and other such devices. The ROM may comprise, for example, a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), or other like memory device.
Also, the processor 406 may represent multiple processors 406 and the memory 403 may represent multiple memories 403 that operate in parallel processing circuits, respectively. In such a case, the local interface 409 may be an appropriate network 109 (
Although the media application 116, the input mapping application 119, and other various systems described herein may be embodied in software or code executed by general purpose hardware as discussed above, as an alternative the same may also be embodied in dedicated hardware or a combination of software/general purpose hardware and dedicated hardware. If embodied in dedicated hardware, each can be implemented as a circuit or state machine that employs any one of or a combination of a number of technologies. These technologies may include, but are not limited to, discrete logic circuits having logic gates for implementing various logic functions upon an application of one or more data signals, application specific integrated circuits having appropriate logic gates, or other components, etc. Such technologies are generally well known by those skilled in the art and, consequently, are not described in detail herein.
The flowchart of
Although the flowchart of
Also, any logic or application described herein, including the media application 116 and the input mapping application 119, that comprises software or code can be embodied in any non-transitory computer-readable medium for use by or in connection with an instruction execution system such as, for example, a processor 406 in a computer system or other system. In this sense, the logic may comprise, for example, statements including instructions and declarations that can be fetched from the computer-readable medium and executed by the instruction execution system. In the context of the present disclosure, a “computer-readable medium” can be any medium that can contain, store, or maintain the logic or application described herein for use by or in connection with the instruction execution system. The computer-readable medium can comprise any one of many physical media such as, for example, magnetic, optical, or semiconductor media. More specific examples of a suitable computer-readable medium would include, but are not limited to, magnetic tapes, magnetic floppy diskettes, magnetic hard drives, memory cards, solid-state drives, USB flash drives, or optical discs. Also, the computer-readable medium may be a random access memory (RAM) including, for example, static random access memory (SRAM) and dynamic random access memory (DRAM), or magnetic random access memory (MRAM). In addition, the computer-readable medium may be a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), or other type of memory device.
It should be emphasized that the above-described embodiments of the present disclosure are merely possible examples of implementations set forth for a clear understanding of the principles of the disclosure. Many variations and modifications may be made to the above-described embodiment(s) without departing substantially from the spirit and principles of the disclosure. All such modifications and variations are intended to be included herein within the scope of this disclosure and protected by the following claims.