In electronic devices, input devices such as a mouse or track pad provide the user with an on-screen pointer, which the user can position to perform operations such as click, hover, and select. In touch screen devices, however, the device does not know the position of a fingertip on the screen until the screen is touched; position and click information are generally obtained together. Accordingly, in touch screen devices it is challenging to point to a specific position without performing any operation. It is likewise challenging to hover over a specific element displayed in the touch screen user interface to initiate an operation.
The claims set forth the embodiments with particularity. The embodiments are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings in which like references indicate similar elements. Various embodiments, together with their advantages, may be best understood from the following detailed description taken in conjunction with the accompanying drawings.
Embodiments of techniques for graphical interaction in a touch screen user interface are described herein. In the following description, numerous specific details are set forth to provide a thorough understanding of the embodiments. A person of ordinary skill in the relevant art will recognize, however, that the embodiments can be practiced without one or more of the specific details, or with other methods, components, materials, etc. In some instances, well-known structures, materials, or operations are not shown or described in detail.
Reference throughout this specification to “one embodiment”, “this embodiment” and similar phrases means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one of the one or more embodiments. Thus, the appearances of these phrases in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Graphical interaction element 120 is shown in the touch screen user interface of the multi-touch electronic device. The graphical interaction element 120 floats in the touch screen user interface, and the user may move the graphical interaction element 120 to any position or location in the touch screen user interface. Floating refers to dynamic movement of the graphical interaction element 120 in correspondence with any multi-touch gesture received on the graphical interaction element 120. Floating is merely exemplary; the graphical interaction element may also move, drag, glide, etc., in the graphical user interface. The graphical interaction element 120 acts like a pointer and may also be placed at a particular location on the graphical user interface. The graphical interaction element 120 can be of different shapes, sizes, levels of transparency, etc. The appearance of the graphical interaction element 120 can be customized by specifying or defining the shape, size, transparency, color, etc., in user settings of the multi-touch electronic device. The graphical interaction element 120 may be activated by one or more activation types, such as multi-touch gestures like hover, tap, double tap, long press, etc. Individual activation types of the graphical interaction element 120 can be associated with, or bound to, different user interface (UI) elements such as a window, menu, popup screen, context menu, widget, icon, pointer, cursor, selection, handle, text cursor, insertion point, tabs, magnifier, etc.
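By way of illustration only, the binding between activation types and UI elements may be pictured as a lookup table, as in the following minimal Java sketch. All names used here (ActivationType, UiElementKind, GraphicalInteractionElement) and the default appearance values are hypothetical and chosen for illustration; the embodiments do not mandate any particular API.

import java.util.EnumMap;
import java.util.Map;

// All names below are illustrative, not a required API.
enum ActivationType { HOVER, TAP, DOUBLE_TAP, PRESS, LONG_PRESS }

enum UiElementKind { WINDOW, MENU, POPUP_SCREEN, CONTEXT_MENU, TOOL_TIP, MAGNIFIER }

class GraphicalInteractionElement {
    // User-customizable appearance, as held in user settings of the device.
    float size = 48f;            // diameter, in pixels
    float transparency = 0.5f;   // 0 = opaque, 1 = fully transparent
    int colorArgb = 0xFF3366CC;  // ARGB color value

    // Each activation type may be bound to a different UI element.
    final Map<ActivationType, UiElementKind> bindings = new EnumMap<>(ActivationType.class);

    GraphicalInteractionElement() {
        bindings.put(ActivationType.TAP, UiElementKind.POPUP_SCREEN);
        bindings.put(ActivationType.PRESS, UiElementKind.CONTEXT_MENU);
        bindings.put(ActivationType.HOVER, UiElementKind.TOOL_TIP);
    }
}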
Based on the activation received as input on the graphical interaction element 120, an activation type is identified. A UI element corresponding to the activation type associated with the graphical interaction element 120 is identified and displayed. For example, when an activation type ‘tap’ is received on the graphical interaction element 120 co-located with, or located in proximity to, a display element ‘APP15’, a popup screen UI element may be displayed in response to the received activation type. Co-location of the graphical interaction element with the display element is merely exemplary; visually, the graphical interaction element may appear to be superimposed on the display element, positioned in close proximity to the display element, positioned at a pre-defined or user-defined proximity, positioned at a partially or completely overlapping proximity, etc. Since the transparency of the graphical interaction element can be user specified, both the display element and the co-located/superimposed/overlaid graphical interaction element remain visible. Similarly, other UI elements can be associated with other activation types for the graphical interaction element 120.
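Continuing the illustrative sketch above under the same hypothetical names, the identification and dispatch described in this paragraph may be expressed along the following lines; the display call is merely a placeholder.

// Resolving a received activation into the UI element bound to it.
class ActivationDispatcher {
    private final GraphicalInteractionElement element;

    ActivationDispatcher(GraphicalInteractionElement element) {
        this.element = element;
    }

    // Called when an activation of the given type lands on the floating
    // element at screen position (x, y).
    void onActivation(ActivationType type, int x, int y) {
        UiElementKind kind = element.bindings.get(type);
        if (kind == null) {
            return; // no UI element bound to this activation type
        }
        // Display the bound UI element co-located with, or in proximity
        // to, the graphical interaction element.
        System.out.printf("show %s at (%d, %d)%n", kind, x, y);
    }
}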
The graphical interaction element controller 235 may be a device driver, i.e., a computer program that operates or controls the graphical interaction element in the touch screen hardware 210. The graphical interaction element controller 235 may be a component in the operating system 220 and/or may be an interface between the operating system 220 and applications. The graphical interaction element controller 235 may send or push the position (X, Y) 225 coordinates of the graphical interaction element to any requesting application, or to any application co-located with, in proximity to, or positioned below the graphical interaction element, such as application 240. The applications may register application functions with the graphical interaction element controller 235. Whenever an activation input 215 is received from the operating system 220 at the graphical interaction element controller 235 (not illustrated), the registered application functions may be invoked, and the determined position (X, Y) coordinates and activation type 245 are pushed or sent to application 240.
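One possible, purely illustrative shape for the controller's register-and-push behavior is sketched below; the interface and method names are assumptions, not a required API.

import java.util.ArrayList;
import java.util.List;

// Illustrative sketch of controller 235: applications register callbacks,
// and the controller pushes position and activation type to each of them.
interface ActivationListener {
    void onActivation(int x, int y, ActivationType type);
}

class GraphicalInteractionElementController {
    private final List<ActivationListener> registered = new ArrayList<>();
    private int x, y; // last known position of the graphical interaction element

    void register(ActivationListener listener) {
        registered.add(listener);
    }

    // Invoked with the position and activation type determined by the OS.
    void onActivationInput(int x, int y, ActivationType type) {
        this.x = x;
        this.y = y;
        for (ActivationListener listener : registered) {
            listener.onActivation(x, y, type); // push to registered functions
        }
    }

    // Pull-style access for an application that requests the position itself.
    int[] currentPosition() {
        return new int[] { x, y };
    }
}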
For example, ‘app 1’, ‘app 2’ and ‘app 3’ are downloaded and installed from ‘app stores’ and execute in the client device. The ‘apps’ executing in the client device are referred to as applications executing in the client device. The application icons of the installed ‘apps’ displayed in the client device may be referred to as display elements. The graphical interaction element may be associated with various ‘system defined functions’ when the graphical interaction element is activated, whether or not underlying application icons are co-located with the graphical interaction element. The graphical interaction element may be associated with various ‘system defined functions’ depending on the type of activation received on the graphical interaction element. For example, a ‘system defined function’ of displaying a ‘context menu of system functions’ may be associated with the activation input 215 ‘press’. When a user performs a ‘press’ activation on the graphical interaction element, the ‘press’ activation is sent to the operating system 220 executing in the client device. The operating system 220 determines the location or position (X, Y) 225 coordinates of the graphical interaction element, and determines the activation type 230 to be ‘press’. The operating system 220 registers operating system functions with the graphical interaction element controller 235. The determined position (X, Y) 225 coordinates and activation type 230 are sent to the graphical interaction element controller 235 since the operating system functions are registered with it. In response to the received position and activation type, the associated ‘context menu of system functions’ is displayed in the touch screen user interface, e.g., by the graphical interaction element controller 235. Various types of user interface (UI) elements such as a ‘context menu of system functions’, a ‘context menu of application functions’, a pop-up menu, a tool tip, etc., may appear to be superimposed on the graphical interaction element, positioned in close proximity to the graphical interaction element, positioned at a pre-defined or user-defined proximity, positioned at a partially or completely overlapping proximity, etc.
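Under the same illustrative assumptions, the registration of such a ‘system defined function’ for the ‘press’ activation might look like the following sketch.

// Continuing the illustrative sketch: the operating system registers a
// 'system defined function' so that a 'press' opens the context menu of
// system functions regardless of what the element is co-located with.
class SystemFunctions {
    static void registerWith(GraphicalInteractionElementController controller) {
        controller.register((x, y, type) -> {
            if (type == ActivationType.PRESS) {
                System.out.printf("context menu of system functions at (%d, %d)%n", x, y);
            }
        });
    }
}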
In one embodiment, the graphical interaction element may be associated with various ‘application defined functions’ when the graphical interaction element is activated and there is an underlying application icon or application co-located with the graphical interaction element. The graphical interaction element can act as a pointer and dynamically interact with the application icon co-located with it or located within an overlapping proximity, or with the application currently executing in the touch screen user interface. The graphical interaction element is associated with various ‘application defined functions’ depending on the activation type received on the graphical interaction element. For example, an ‘application defined function’ for displaying a ‘context menu of application functions’ may be associated with the activation input 215 ‘press’. When a user performs a ‘press’ activation on the graphical interaction element that is co-located with an application icon, the ‘press’ activation input 215 is sent to the operating system 220 executing in the client device. The operating system 220 determines the location or position (X, Y) coordinates of the graphical interaction element, and determines the activation type to be ‘press’. The determined position (X, Y) 225 coordinates and the activation type 230 are sent from the operating system 220 to the graphical interaction element controller 235. The graphical interaction element controller 235 provides this information to the application 240 since the application functions are registered with the graphical interaction element controller 235. In response to the received activation input 215 ‘press’, the application 240 provides the ‘context menu of application functions’ for display in the touch screen user interface.
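Again purely as a sketch, an ‘application defined function’ gated on co-location with the application's own icon might be registered as follows; the icon bounds are arbitrary example values.

import java.awt.Rectangle;

// Continuing the illustrative sketch: an 'application defined function'
// that fires only when the element is co-located with (here, inside the
// bounds of) the application's own icon.
class IconContextMenu implements ActivationListener {
    private final Rectangle iconBounds = new Rectangle(100, 200, 64, 64);

    @Override
    public void onActivation(int x, int y, ActivationType type) {
        if (type == ActivationType.PRESS && iconBounds.contains(x, y)) {
            System.out.println("context menu of application functions");
        }
    }
}
// Registration with the controller: controller.register(new IconContextMenu());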
In one embodiment, application functions such as event handlers defined by the application 240 are registered with the graphical interaction element controller 235. The application 240 registers the application functions using corresponding event handler interfaces. For the activation type 230 ‘press’, the corresponding event handler interface is identified, and an event handler application function is invoked and executed. The result of the execution is sent to the touch screen user interface. In one embodiment, the application 240 may request the position (X, Y) and activation type 250 from the graphical interaction element controller 235. In response to the request received from the application 240, the requested position (X, Y) and activation type 245 are sent to the application 240. A user may choose to select a function from the displayed ‘context menu of application functions’, and the function is executed by the application 240. The response of the executed function 255 is sent to the touch screen user interface for display.
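A minimal sketch of per-activation-type event handler registration, continuing the hypothetical names above; Runnable stands in for a richer event handler interface.

import java.util.EnumMap;
import java.util.Map;

// Per-activation-type event handlers registered by the application and
// invoked when the controller pushes an activation.
class HandlerRegistry implements ActivationListener {
    private final Map<ActivationType, Runnable> handlers = new EnumMap<>(ActivationType.class);

    // The application registers a handler for a given activation type.
    void on(ActivationType type, Runnable handler) {
        handlers.put(type, handler);
    }

    @Override
    public void onActivation(int x, int y, ActivationType type) {
        Runnable handler = handlers.get(type);
        if (handler != null) {
            handler.run(); // invoke and execute the registered handler
        }
    }
}
// Usage: registry.on(ActivationType.PRESS,
//     () -> System.out.println("context menu of application functions"));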
For example, a ‘system defined function’ of displaying a ‘tool tip pop-up’ may be associated with the activation type ‘hover’. When a ‘hover’ activation input is received on the graphical interaction element 320 that is co-located with the display element ‘app15’ 330, the ‘hover’ activation input is sent to the operating system executing in the client device. The operating system determines the position (X, Y) coordinates of the graphical interaction element 320 and also determines the activation type to be ‘hover’. This information is sent to the graphical interaction element controller, with which the operating system functions are registered. When the application ‘app15’ 330 is installed in the client device, a predefined description of ‘app15’ 330 is registered with the operating system. In response to the received ‘hover’ activation input, a ‘tool tip pop-up’ is displayed with the pre-defined description ‘search for applications, games and music!’ 340. The size and shape of the displayed ‘tool tip pop-up’ can be customized based on user settings.
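As an illustrative sketch of this example, the install-time registration and hover lookup might be expressed as follows; the app id and description text mirror the example in the paragraph above, and all other names are assumptions.

import java.util.HashMap;
import java.util.Map;

// A description registered at install time and shown as a tool tip on 'hover'.
class ToolTips {
    private static final Map<String, String> descriptions = new HashMap<>();

    static void registerDescription(String appId, String text) {
        descriptions.put(appId, text); // done when the application is installed
    }

    static void onHover(String coLocatedAppId) {
        String text = descriptions.get(coLocatedAppId);
        if (text != null) {
            System.out.println("tool tip: " + text);
        }
    }

    public static void main(String[] args) {
        registerDescription("app15", "search for applications, games and music!");
        onHover("app15"); // displays the registered description
    }
}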
The various embodiments described above have a number of advantages. With the graphical interaction element, a user can point to a specific location on the screen without performing any operation. The user can hover over any display element on the screen using the graphical interaction element. The graphical interaction element can interact with the underlying application and dynamically provide UI elements; therefore, the graphical interaction element is not restricted to a set of predefined functions or a restricted set of UI elements.
Some embodiments may include the above-described methods being written as one or more software components. These components, and the functionality associated with each, may be used by client, server, distributed, or peer computer systems. These components may be written in a computer language corresponding to one or more programming languages such as functional, declarative, procedural, object-oriented, lower level languages and the like. They may be linked to other components via various application programming interfaces and then compiled into one complete application for a server or a client. Alternatively, the components may be implemented in server and client applications. Further, these components may be linked together via various distributed programming protocols. Some example embodiments may include remote procedure calls being used to implement one or more of these components across a distributed programming environment. For example, a logic level may reside on a first computer system that is remotely located from a second computer system containing an interface level (e.g., a graphical user interface). These first and second computer systems can be configured in a server-client, peer-to-peer, or some other configuration. The clients can vary in complexity from mobile and handheld devices, to thin clients, and on to thick clients or even other servers.
The above-illustrated software components are tangibly stored on a computer readable storage medium as instructions. The term “computer readable storage medium” should be taken to include a single medium or multiple media that store one or more sets of instructions. The term “computer readable storage medium” should be taken to include any physical article that is capable of undergoing a set of physical changes to physically store, encode, or otherwise carry a set of instructions for execution by a computer system, which causes the computer system to perform any of the methods or process steps described, represented, or illustrated herein. Examples of computer readable storage media include, but are not limited to: magnetic media, such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs, DVDs and holographic devices; magneto-optical media; and hardware devices that are specially configured to store and execute, such as application-specific integrated circuits (ASICs), programmable logic devices (PLDs) and ROM and RAM devices. Examples of computer readable instructions include machine code, such as produced by a compiler, and files containing higher-level code that are executed by a computer using an interpreter. For example, an embodiment may be implemented using Java, C++, or another object-oriented programming language and development tools. Another embodiment may be implemented in hard-wired circuitry in place of, or in combination with, machine readable software instructions.
A data source is an information resource. Data sources include sources of data that enable data storage and retrieval. Data sources may include databases, such as relational, transactional, hierarchical, multi-dimensional (e.g., OLAP), object-oriented databases, and the like. Further data sources include tabular data (e.g., spreadsheets, delimited text files), data tagged with a markup language (e.g., XML data), transactional data, unstructured data (e.g., text files, screen scrapings), hierarchical data (e.g., data in a file system, XML data), files, a plurality of reports, and any other data source accessible through an established protocol, such as Open Database Connectivity (ODBC), produced by an underlying software system (e.g., an ERP system), and the like. Data sources may also include data sources where the data is not tangibly stored or is otherwise ephemeral, such as data streams, broadcast data, and the like. These data sources can include associated data foundations, semantic layers, management systems, security systems and so on.
In the above description, numerous specific details are set forth to provide a thorough understanding of embodiments. One skilled in the relevant art will recognize, however, that the embodiments can be practiced without one or more of the specific details, or with other methods, components, techniques, etc. In other instances, well-known operations or structures are not shown or described in detail.
Although the processes illustrated and described herein include a series of steps, it will be appreciated that the different embodiments are not limited by the illustrated ordering of steps, as some steps may occur in different orders, and some concurrently with other steps, apart from those shown and described herein. In addition, not all illustrated steps may be required to implement a methodology in accordance with the one or more embodiments. Moreover, it will be appreciated that the processes may be implemented in association with the apparatus and systems illustrated and described herein, as well as in association with other systems not illustrated.
The above descriptions and illustrations of embodiments, including what is described in the Abstract, are not intended to be exhaustive or to limit the one or more embodiments to the precise forms disclosed. While specific embodiments of, and examples for, the one or more embodiments are described herein for illustrative purposes, various equivalent modifications are possible within the scope, as those skilled in the relevant art will recognize. These modifications can be made in light of the above detailed description. Rather, the scope is to be determined by the following claims, which are to be interpreted in accordance with established doctrines of claim construction.