Embodiments of the invention generally relate to computer graphics processing and selective visual display systems, and more particularly to dynamic display of actionable items in devices.
In electronic devices, when input devices such as a mouse or track pad are used, a user is provided with a pointer on a graphical user interface (GUI) screen, with which the user can position and perform operations such as click, hover, select, etc. In hand-held devices, however, interaction with the device is based on touch, by positioning a fingertip on the GUI of the device. In applications rendered or displayed on touch-based devices, menu items are displayed statically at fixed positions on the GUI. Users of such hand-held devices may access the applications while holding the touch devices in either hand. Hand-held touch devices may have varying screen sizes, and the devices may be held in landscape orientation instead of portrait orientation. In both scenarios noted above, it is challenging to access statically displayed actionable items across the wide screen of the hand-held device.
The claims set forth the embodiments with particularity. The embodiments are illustrated by way of example and not by way of limitation in the figures of the accompanying drawings, in which like references indicate similar elements. The embodiments, together with their advantages, may be best understood from the following detailed description taken in conjunction with the accompanying drawings.
Embodiments of techniques for dynamic display of user interface elements in hand-held devices are described herein. In the following description, numerous specific details are set forth to provide a thorough understanding of the embodiments. One skilled in the relevant art will recognize, however, that the embodiments can be practiced without one or more of the specific details, or with other methods, components, materials, etc. In other instances, well-known structures, materials, or operations are not shown or described in detail.
Reference throughout this specification to “one embodiment”, “this embodiment”, and similar phrases means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one of the one or more embodiments. Thus, the appearances of these phrases in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
A hand-held device may be a multi-touch electronic device that users control through multi-touch gestures. Multi-touch gestures are predefined motions used to interact with multi-touch devices; some examples are hover, tap, double tap, long press, scroll, pan, pinch, rotate, etc. Users may use the multi-touch gestures to interact with the multi-touch electronic device and with the applications rendered in the multi-touch electronic device. Multi-touch gestures can be performed on various user interface (UI) elements such as a menu, popup screen, context menu, widget, icon, pointer, cursor, selection, handle, text cursor, insertion point, tab, magnifier, window, etc. UI elements may also be referred to as actionable elements, since actions such as selection, hover, clicking, etc., can be performed on them. The multi-touch electronic device may be held by a user in the right hand, the left hand, or both, and the hand used is referred to as handedness. Handedness is a preference for, or better performance with, one hand over the other. Handedness may be left-handedness, right-handedness, mixed-handedness, or ambidexterity. Right-handedness is also referred to as dextrality, and left-handedness is also referred to as sinistrality.
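By way of illustration only, the handedness categories above may be modeled in code. The following minimal Java sketch (the type name and constants are ours, not part of any platform API) is reused in the sketches that follow; an UNRESOLVED constant is included for the conflict case discussed later in this description.

```java
// Illustrative enumeration of the handedness categories described above.
// UNRESOLVED covers the case in which the sensors cannot determine a
// dominant hand (see the conflict-resolution discussion below).
public enum Handedness {
    LEFT, RIGHT, MIXED, AMBIDEXTROUS, UNRESOLVED
}
```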
In one embodiment, the orientation of the hand-held device 625 may be landscape instead of portrait. For example, touch-based gaming remote devices may be held in landscape orientation instead of portrait orientation. Consider the hand-held device 625 held in landscape orientation by a user. When the hand-held device 625 is held in the right hand of the user, ‘sensor E’ 615 and ‘sensor F’ 620 are in close proximity to the right hand. ‘Sensor E’ 615 and ‘sensor F’ 620, individually or in combination, detect that the hand-held device 625 is held in the right hand, and the UI elements ‘list’, ‘play’ and ‘pause’ 630 are dynamically displayed in a first area, towards the right side of GUI 645 of the hand-held device 625. When the user switches the hand-held device 625 to the left hand, ‘sensor D’ 610 and ‘sensor C’ 605 are in close proximity to the left hand. ‘Sensor D’ 610 and ‘sensor C’ 605, individually or in combination, detect that the hand-held device 625 is held in the left hand, and the UI elements ‘list’, ‘play’ and ‘pause’ 630 are dynamically displayed in a second area, towards the left side of GUI 650 of the hand-held device 625.
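By way of a non-limiting sketch, the repositioning described above may be expressed as a small controller that reacts to handedness changes. The Gui interface and its area constants below are hypothetical stand-ins for whatever layout mechanism the host GUI framework provides; the sketch reuses the illustrative Handedness enumeration above.

```java
// Hypothetical controller that repositions actionable UI elements when the
// detected handedness changes. The Gui interface is an illustrative stand-in
// for the device's actual layout mechanism, not a real API.
public final class HandednessLayoutController {
    public interface Gui {
        enum Area { FIRST_RIGHT, SECOND_LEFT }
        void layoutActionableItems(Area area);
    }

    private final Gui gui;

    public HandednessLayoutController(Gui gui) {
        this.gui = gui;
    }

    // Invoked by the sensor logic whenever a new handedness is detected.
    public void onHandednessChanged(Handedness handedness) {
        if (handedness == Handedness.RIGHT) {
            gui.layoutActionableItems(Gui.Area.FIRST_RIGHT);  // e.g., GUI 645
        } else if (handedness == Handedness.LEFT) {
            gui.layoutActionableItems(Gui.Area.SECOND_LEFT);  // e.g., GUI 650
        }
    }
}
```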
In one embodiment, the hand-held device 625 may be held in both hands of a user in landscape orientation. ‘Sensor E’ 615 and ‘sensor F’ 620, individually or in combination, determine the handedness of the hand-held device 625 based on factors such as proximity to a hand and pressure received on the sensors. Numerical values may be associated with these factors, and computation of the numerical values may be based on a program logic or algorithm associated with the sensors. For example, based on the proximity of the right hand to ‘sensor E’ 615 and ‘sensor F’ 620, a numerical value of ‘0.25’ is calculated. Based on the pressure received from the right hand on ‘sensor E’ 615 and ‘sensor F’ 620, a numerical value of ‘0.27’ is calculated. With reference to the right hand, the sum of the calculated numerical values ‘0.25’ and ‘0.27’ is ‘0.52’.
Based on the proximity of the left hand to ‘sensor D’ 610 and ‘sensor C’ 605, a numerical value of ‘0.22’ is calculated. Based on the pressure received from the left hand on ‘sensor D’ 610 and ‘sensor C’ 605, a numerical value of ‘0.20’ is calculated. With reference to the left hand, the sum of the calculated numerical values ‘0.22’ and ‘0.20’ is ‘0.42’. A threshold value of ‘0.05’ can be used in determining the handedness: the delta/difference between the numerical values calculated for the left hand and the right hand is compared with the threshold value ‘0.05’. Here, the delta/difference between the numerical value of the right hand and that of the left hand is ‘0.1’, which is greater than the threshold value ‘0.05’, and accordingly it is determined that the handedness is right. The UI elements ‘list’, ‘play’ and ‘pause’ 630 are displayed in the first area, towards the right side of GUI 645 of the hand-held device 625. Alternatively, if the delta/difference between the numerical value of the left hand and that of the right hand is ‘0.1’ and is greater than the threshold value ‘0.05’, it is determined that the handedness is left.
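The computation above may be expressed, purely as a non-limiting sketch, in a few lines of Java. The method name and signature are ours; a real implementation would obtain the proximity and pressure values from the program logic or algorithm associated with the sensors.

```java
// Minimal sketch of the weighted-score handedness determination described
// above, reusing the illustrative Handedness enumeration. The threshold and
// example values mirror the ones used in the text.
public final class HandednessDetector {
    private static final double THRESHOLD = 0.05;

    public static Handedness determine(double rightProximity, double rightPressure,
                                       double leftProximity, double leftPressure) {
        double rightScore = rightProximity + rightPressure; // e.g., 0.25 + 0.27 = 0.52
        double leftScore = leftProximity + leftPressure;    // e.g., 0.22 + 0.20 = 0.42
        double delta = Math.abs(rightScore - leftScore);    // e.g., 0.10
        if (delta < THRESHOLD) {
            return Handedness.UNRESOLVED; // conflict; see the options below
        }
        return rightScore > leftScore ? Handedness.RIGHT : Handedness.LEFT;
    }

    public static void main(String[] args) {
        // With the example values, the delta of 0.10 exceeds the threshold
        // of 0.05, so the handedness resolves to RIGHT.
        System.out.println(determine(0.25, 0.27, 0.22, 0.20));
    }
}
```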
In one embodiment, if the delta/difference between the numerical values calculated for the left hand and the right hand is below the threshold value ‘0.05’, the conflict in handedness may be resolved using the options explained below. The conflict can be resolved by prompting the user to select between right-handedness and left-handedness. Alternatively, user activity or handedness preference can be maintained as a history/user preference in the program logic or algorithm associated with the sensors; when the delta/difference is below the threshold value ‘0.05’, the stored history/user preference is used to determine the handedness and resolve the conflict.
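As a non-limiting sketch of these two options, the resolver below consults a stored history/user preference first and prompts the user only when no usable preference exists. The PreferenceStore and Prompter interfaces are hypothetical, not part of any real sensor API.

```java
// Illustrative conflict resolution for the below-threshold case: a stored
// preference, when available, resolves the conflict; otherwise the user is
// prompted to choose between left- and right-handedness.
public final class HandednessResolver {
    public interface PreferenceStore { Handedness lastKnown(); } // may return null
    public interface Prompter { Handedness askUser(); }          // e.g., a modal dialog

    private final PreferenceStore store;
    private final Prompter prompter;

    public HandednessResolver(PreferenceStore store, Prompter prompter) {
        this.store = store;
        this.prompter = prompter;
    }

    public Handedness resolve() {
        Handedness remembered = store.lastKnown();
        if (remembered == Handedness.LEFT || remembered == Handedness.RIGHT) {
            return remembered;      // stored history/user preference wins
        }
        return prompter.askUser();  // otherwise prompt for a selection
    }
}
```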
In one embodiment, hardware sensors may be placed on the circumference or periphery of a hand-held device 730. Consider a polygon-shaped hand-held device 730, such as a touch-based gaming remote. ‘Sensor C’ 735 and ‘sensor D’ 740 are placed on the periphery of the hand-held device 730. When the hand-held device 730 is held in the right hand, ‘sensor D’ 740 dynamically detects that pressure is received on it and determines that the hand-held device 730 is held in the right hand. Accordingly, UI elements 745 are displayed in a first area, towards the right side of the GUI of the hand-held device 730. When the hand-held device 730 is shifted to the left hand, ‘sensor C’ 735 dynamically detects that pressure is received on it and determines that the hand-held device 730 is held in the left hand. Accordingly, UI elements 745 are displayed in a second area, towards the left side of the GUI of the hand-held device 730 (not shown).
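This pressure-driven embodiment reduces, in sketch form, to a simple mapping from the sensor that reports pressure to a handedness. The sensor identifiers below are illustrative labels for ‘sensor C’ 735 and ‘sensor D’ 740, not real device identifiers; the sketch again reuses the illustrative Handedness enumeration.

```java
// Hypothetical handler for discrete peripheral sensors: pressure on the
// right-edge sensor implies a right-hand grip; pressure on the left-edge
// sensor implies a left-hand grip.
public final class PeripheralSensorHandler {
    public enum SensorId { SENSOR_C_LEFT, SENSOR_D_RIGHT }

    public Handedness onPressure(SensorId sensor) {
        switch (sensor) {
            case SENSOR_D_RIGHT: return Handedness.RIGHT; // UI elements 745 to the right
            case SENSOR_C_LEFT:  return Handedness.LEFT;  // UI elements 745 to the left
            default:             return Handedness.UNRESOLVED;
        }
    }
}
```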
Some embodiments may include the above-described methods being written as one or more software components. These components, and the functionality associated with each, may be used by client, server, distributed, or peer computer systems. The components may be written in a computer programming language, such as a functional, declarative, procedural, object-oriented, or lower-level language. They may be linked to other components via various application programming interfaces and then compiled into one complete application for a server or a client. Alternatively, the components may be implemented in server and client applications. Further, these components may be linked together via various distributed programming protocols. Some example embodiments may include remote procedure calls being used to implement one or more of these components across a distributed programming environment. For example, a logic level may reside on a first computer system that is remotely located from a second computer system containing an interface level (e.g., a graphical user interface). These first and second computer systems can be configured in a server-client, peer-to-peer, or some other configuration. The clients can vary in complexity from mobile and hand-held devices, to thin clients, and on to thick clients or even other servers.
The above-illustrated software components are tangibly stored on a computer readable storage medium as instructions. The term “computer readable storage medium” should be taken to include a single medium or multiple media that store one or more sets of instructions. The term “computer readable storage medium” should also be taken to include any physical article that is capable of undergoing a set of physical changes to physically store, encode, or otherwise carry a set of instructions for execution by a computer system, which causes the computer system to perform any of the methods or process steps described, represented, or illustrated herein. A computer readable storage medium may be a non-transitory computer readable storage medium. Examples of non-transitory computer readable storage media include, but are not limited to: magnetic media, such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs, DVDs, and holographic devices; magneto-optical media; and hardware devices that are specially configured to store and execute instructions, such as application-specific integrated circuits (“ASICs”), programmable logic devices (“PLDs”), and ROM and RAM devices. Examples of computer readable instructions include machine code, such as produced by a compiler, and files containing higher-level code that is executed by a computer using an interpreter. For example, an embodiment may be implemented using Java, C++, or another object-oriented programming language and development tools. Another embodiment may be implemented in hard-wired circuitry in place of, or in combination with, machine readable software instructions.
A data source is an information resource. Data sources include sources of data that enable data storage and retrieval. Data sources may include databases, such as relational, transactional, hierarchical, multi-dimensional (e.g., OLAP), and object-oriented databases, and the like. Further data sources include tabular data (e.g., spreadsheets, delimited text files), data tagged with a markup language (e.g., XML data), transactional data, unstructured data (e.g., text files, screen scrapings), hierarchical data (e.g., data in a file system, XML data), files, a plurality of reports, and any other data source accessible through an established protocol, such as Open Database Connectivity (ODBC), produced by an underlying software system (e.g., an ERP system), and the like. Data sources may also include data sources where the data is not tangibly stored or is otherwise ephemeral, such as data streams, broadcast data, and the like. These data sources can include associated data foundations, semantic layers, management systems, security systems, and so on.
In the above description, numerous specific details are set forth to provide a thorough understanding of embodiments. One skilled in the relevant art will recognize, however, that the embodiments can be practiced without one or more of the specific details, or with other methods, components, techniques, etc. In other instances, well-known operations or structures are not shown or described in detail.
Although the processes illustrated and described herein include a series of steps, it will be appreciated that the different embodiments are not limited by the illustrated ordering of steps, as some steps may occur in different orders, and some concurrently with other steps, apart from the ordering shown and described herein. In addition, not all illustrated steps may be required to implement a methodology in accordance with the one or more embodiments. Moreover, it will be appreciated that the processes may be implemented in association with the apparatus and systems illustrated and described herein, as well as in association with other systems not illustrated.
The above descriptions and illustrations of embodiments, including what is described in the Abstract, are not intended to be exhaustive or to limit the one or more embodiments to the precise forms disclosed. While specific embodiments and examples are described herein for illustrative purposes, various equivalent modifications are possible within the scope, as those skilled in the relevant art will recognize. These modifications can be made in light of the above detailed description.