In today's software development industry, traditional marking menus can typically call functions based on only a single vector to access and execute system commands. As a result, accessing and executing system commands can be cumbersome and slow, which can reduce software developer productivity.
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
The disclosure is directed to a method including receiving an initial user activation event (e.g., produced from a digit of a hand of a user, a keyboard chord, a keyboard hot key, a stylus, or an action of a pointing device). The method includes receiving a first portion of a direction-specific symbolic swipe gesture, such as a symbolic swipe gesture with a curve, and recording, in response to the initial user activation event, a first path of the first portion of the direction-specific symbolic swipe gesture. In response to a pause in the direction-specific symbolic swipe gesture, a selected number of possible symbolic gestures are displayed based on the recorded first path, revealing the system commands that map to the symbolic gestures. A second path of a second portion of the direction-specific symbolic swipe gesture is then recorded. Examples of the direction-specific symbolic swipe gesture are produced from a digit of a hand of a user, a stylus, or an action of a pointing device. In response to the recorded first and second paths of the direction-specific symbolic swipe gesture and a trigger (e.g., a digit of a hand or a stylus being lifted from a touchscreen, or an action of a pointing device), a first system command is accessed that maps to the direction-specific symbolic swipe gesture. One embodiment of the method executes the accessed first system command.
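The flow described above can be summarized as a small state machine: activate, record a path, pause to reveal options, record another path, then trigger. The following is a minimal TypeScript sketch of that life cycle; the names (SymbolicGestureSession, GestureState, onActivate, and so on) are hypothetical and illustrative only, not part of the disclosure.

```typescript
type Point = { x: number; y: number };

enum GestureState {
  Idle,          // waiting for the initial user activation event
  RecordingPath, // recording a portion of the swipe gesture
  Revealing,     // paused: possible gestures and their commands are shown
}

class SymbolicGestureSession {
  private state = GestureState.Idle;
  private paths: Point[][] = []; // one recorded path per gesture portion

  // Initial user activation event (digit, keyboard chord, hot key, stylus,
  // or pointing-device action) starts a new recording.
  onActivate(): void {
    this.state = GestureState.RecordingPath;
    this.paths = [[]];
  }

  // Each movement sample extends the path of the current portion.
  onMove(p: Point): void {
    if (this.state !== GestureState.Idle) {
      this.paths[this.paths.length - 1].push(p);
      this.state = GestureState.RecordingPath;
    }
  }

  // A pause reveals possible symbolic gestures and their mapped commands,
  // then begins recording the next portion of the gesture.
  onPause(revealOptions: (paths: Point[][]) => void): void {
    if (this.state === GestureState.RecordingPath) {
      revealOptions(this.paths);
      this.paths.push([]);
      this.state = GestureState.Revealing;
    }
  }

  // The trigger (e.g., lifting the digit or stylus) resolves the recorded
  // paths to a system command and executes it.
  onTrigger(execute: (paths: Point[][]) => void): void {
    execute(this.paths);
    this.state = GestureState.Idle;
  }
}
```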
The accompanying drawings are included to provide a further understanding of embodiments and are incorporated in and constitute a part of this specification. The drawings illustrate embodiments and together with the description serve to explain principles of embodiments. Other embodiments and many of the intended advantages of embodiments will be readily appreciated, as they become better understood by reference to the following detailed description. The elements of the drawings are not necessarily to scale relative to each other. Like reference numerals and other indicators (collectively alpha-numerics in this disclosure) designate corresponding similar features.
In the following Detailed Description, reference is made to the accompanying drawings, which form a part hereof, and in which is shown by way of illustration specific embodiments in which the invention may be practiced. It is to be understood that other embodiments may be utilized and structural or logical changes may be made without departing from the scope of the present invention. The following detailed description, therefore, is not to be taken in a limiting sense, and the scope of the present invention is defined by the appended claims. It is also to be understood that features of the various example embodiments described herein may be combined with each other, unless specifically noted otherwise.
Computing device 100 can also have additional features/functionality. For example, computing device 100 may also include additional storage (removable and/or non-removable) including, but not limited to, magnetic or optical disks or solid state memory, or flash storage devices such as removable storage 108 and non-removable storage 110. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any suitable method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Memory 104, removable storage 108 and non-removable storage 110 are all examples of computer storage media. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, universal serial bus (USB) flash drive, flash memory card, or other flash storage devices, or any other storage device that can be used to store the desired information and that can be accessed by computing device 100. Any such computer storage media may be part of computing device 100.
Computing device 100 includes one or more communication connections 114 that allow computing device 100 to communicate with other computers/applications 115. Computing device 100 may also include input device(s) 112, such as keyboard, pointing device (e.g., mouse), stylus (e.g., pen), voice input device, touch input device, touchscreen, etc. Computing device 100 may also include output device(s) 111, such as a display, speakers, printer, etc.
The computing device 100 can be configured to run an operating system software program and one or more software applications, which make up a system platform. In embodiments, the operating system and/or software applications are configured to present a user interface (UI) that is configured to allow a user to interact with the software application in some manner using some type of input device. In one embodiment, this UI is a visual display that is capable of receiving user input and processing that user input in some way. Embodiments of such a UI can, for example, include one or more user interactable components (e.g., links, buttons or controls) that can be selected (e.g., clicked or touched) by a user via a pointing device or touchscreen or other suitable input device.
In one example, the computing device 100 includes a software component referred to as a managed environment. The managed environment can be included as part of the operating system or can be included later as a software download. The managed environment typically includes pre-coded solutions to common programming problems to aid software developers in creating software programs, such as applications, to run in the managed environment. It also typically includes a virtual machine that allows the software applications to run in the managed environment so that the programmers need not consider the capabilities of the specific processors 102. A managed environment can include cache coherency protocols and cache management algorithms.
The computing device 100 can be coupled to a computer network, which can be classified according to a wide variety of characteristics such as topology, connection method, and scale. A network is a collection of computing devices and possibly other devices interconnected by communications channels that facilitate communications and allow sharing of resources and information among the interconnected devices. Examples of computer networks include a local area network, a wide area network, the Internet, or other networks.
One embodiment of touchscreen device 200 (e.g., tablet PC, slate device, or touchscreen phone) is illustrated in schematic diagram form in the accompanying drawings.
One example of this implementation could be touchscreen device 200 where a keyboard or pointing device is not readily available (e.g., a slate device or touchscreen phone). In this example implementation, the initial user activation event is made with a touch of a digit 218 of a second hand 214 (most likely a non-dominant hand) or a stylus to touchscreen 202 to initiate the listening action. The direction-specific symbolic swipe gesture of digit 216 of first hand 212 (most likely a dominant hand) then determines the menu or function called. In one embodiment, releasing digit 216 of first hand 212 selects the function. In one embodiment, releasing digit 218 of second hand 214 at any time after the initial user activation event cancels the action and returns user 210 to a normal work environment.
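A sketch of this two-handed protocol follows, assuming a generic pointer-event model in TypeScript; the names TwoHandController, pointerDown, and pointerUp are hypothetical and not part of the disclosure.

```typescript
type PointerId = number;

class TwoHandController {
  private holdPointer: PointerId | null = null;    // digit 218 of second hand 214
  private gesturePointer: PointerId | null = null; // digit 216 of first hand 212

  pointerDown(id: PointerId): void {
    if (this.holdPointer === null) {
      this.holdPointer = id;    // initial user activation: start listening
    } else if (this.gesturePointer === null) {
      this.gesturePointer = id; // begin the direction-specific swipe gesture
    }
  }

  pointerUp(id: PointerId, onSelect: () => void, onCancel: () => void): void {
    if (id === this.holdPointer) {
      onCancel();               // releasing the activating digit cancels
      this.reset();
    } else if (id === this.gesturePointer) {
      onSelect();               // releasing the gesturing digit selects
      this.reset();
    }
  }

  private reset(): void {
    this.holdPointer = this.gesturePointer = null;
  }
}
```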
In the example scenario for implementing self-revealing symbolic gestures for invoking, displaying, and executing system commands illustrated in the accompanying drawings, position 222 is an initial touch point of digit 216 of first hand 212 on touchscreen 202. Digit 216 of hand 212 is then dragged on touchscreen 202 in a specific direction, in this scenario up to position 224.
When digit 216 of hand 212 is held at position 224 in this scenario, indicators for positions 226, 228, and 230 are revealed (i.e., displayed) as options for further navigation, each corresponding to a possible symbolic gesture that is mapped to a system command. Dragging digit 216 of hand 212 on touchscreen 202 to any of positions 226, 228, and 230 completes a direction-specific symbolic swipe gesture, and then releasing digit 216 of first hand 212 triggers the displayed system command mapped to the completed direction-specific symbolic swipe gesture. The selected command is accessed and executed.
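One assumed way to resolve such a drag (e.g., from position 222 up to position 224) is to snap its angle to a discrete direction, which then keys the options to reveal. The eight-direction scheme below is a TypeScript illustration, not prescribed by the disclosure.

```typescript
type Point = { x: number; y: number };
type Direction = 'up' | 'up-right' | 'right' | 'down-right' |
                 'down' | 'down-left' | 'left' | 'up-left';

const DIRECTIONS: Direction[] = [
  'right', 'down-right', 'down', 'down-left',
  'left', 'up-left', 'up', 'up-right',
];

// Screen coordinates grow downward, so a drag toward the top of the
// touchscreen has a negative dy and snaps to 'up'.
function snapDirection(from: Point, to: Point): Direction {
  const angle = Math.atan2(to.y - from.y, to.x - from.x); // -PI..PI
  const sector = Math.round(angle / (Math.PI / 4));       // -4..4
  return DIRECTIONS[(sector + 8) % 8];
}

// Example: a straight upward drag, as from position 222 to position 224.
console.log(snapDirection({ x: 100, y: 300 }, { x: 100, y: 100 })); // 'up'
```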
In one embodiment of computer implemented method 400, at 402, an initial user activation event is received and, in response, a first path of a first portion of a direction-specific symbolic swipe gesture is recorded. At 404, in response to a pause in the direction-specific symbolic swipe gesture, a selected number of possible symbolic gestures are displayed based on the recorded first path, revealing the system commands that map to the symbolic gestures. In one embodiment, the selected number of possible symbolic gestures is calculated based on the recorded first path, gestures that align with the recorded first path, and/or gestures that are most used by the user.
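A sketch of one way such a calculation could work, filtering candidate gestures to those that align with the recorded path and ranking them by the user's usage frequency; the data shapes and ranking policy below are assumptions for illustration.

```typescript
type Direction = string; // e.g., 'up', 'down-left' from a direction-snapping step

interface CandidateGesture {
  directions: Direction[]; // the full symbolic gesture as direction tokens
  command: string;         // the system command it maps to
  useCount: number;        // how often this user has invoked it
}

function revealCandidates(
  recordedSoFar: Direction[],
  all: CandidateGesture[],
  maxShown: number,
): CandidateGesture[] {
  return all
    // keep only gestures that align with (i.e., begin with) the recorded path
    .filter(g =>
      recordedSoFar.every((d, i) => g.directions[i] === d) &&
      g.directions.length > recordedSoFar.length)
    // prefer the gestures the user invokes most often
    .sort((a, b) => b.useCount - a.useCount)
    .slice(0, maxShown);
}

// Example: after an upward first portion, only 'up ...' gestures are shown.
const shown = revealCandidates(['up'], [
  { directions: ['up', 'right'], command: 'copy', useCount: 42 },
  { directions: ['up', 'left'], command: 'paste', useCount: 40 },
  { directions: ['down'], command: 'undo', useCount: 99 },
], 3);
console.log(shown.map(g => g.command)); // ['copy', 'paste']
```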
At 406, a second path of a second portion of the direction-specific symbolic swipe gesture is recorded. Example first and second portions of the direction-specific symbolic swipe gesture are produced from a digit of a hand of a user, a stylus, and/or an action of a pointing device. In one example, the first and second portions of the direction-specific symbolic swipe gesture are produced from a digit of a first hand (e.g., dominant hand) of the user, and the initial user activation event is produced from a digit of a second hand (e.g., non-dominant hand) of the user.
At 408, in response to the recorded first and second paths of the direction-specific symbolic swipe gesture and a trigger, a first system command is accessed that maps to the completed direction-specific symbolic swipe gesture. Example triggers are produced in response to a digit being lifted from a touchscreen, a stylus being lifted from a touchscreen, and/or a second action of the pointing device. At 410, the accessed first system command is executed.
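A minimal sketch of steps 408 and 410, assuming the recorded paths have already been reduced to direction tokens and that commands live in a simple lookup table; the key format and the registry contents are assumptions.

```typescript
type Direction = string;

// Hypothetical command registry keyed by the token sequence.
const commandMap = new Map<string, () => void>([
  ['up>right', () => console.log('executing: copy')],
  ['up>left', () => console.log('executing: paste')],
]);

// Called when the digit or stylus is lifted from the touchscreen (or on the
// pointing device's second action).
function onTrigger(recorded: Direction[]): void {
  const command = commandMap.get(recorded.join('>'));
  if (command) {
    command(); // access and execute the mapped system command
  }
}

onTrigger(['up', 'right']); // executing: copy
```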
In one embodiment of computer implemented method 500, at 502, an initial user activation event is received. At 504, a first portion of a direction-specific symbolic swipe gesture is received. At 506, in response to the initial user activation event, a first path of the first portion of the direction-specific symbolic swipe gesture is recorded. At 508, a first selected number of possible symbolic gestures based on the recorded first path is calculated. In one embodiment, the first selected number of possible symbolic gestures is calculated based on the recorded first path, gestures that align with the recorded first path, and/or gestures that are most used by the user.
At 510, in response to a pause in the direction-specific symbolic swipe gesture, the calculated first selected number of possible symbolic gestures based on the recorded first path are displayed, revealing the system commands that map to the symbolic gestures.
At 512, a second portion of the direction-specific symbolic swipe gesture is received. At 514, a second path of the second portion of the direction-specific symbolic swipe gesture is recorded. At 516, a second selected number of possible symbolic gestures based on the recorded first and second paths is calculated. At 518, in response to a pause in the direction-specific symbolic swipe gesture, the calculated second selected number of possible symbolic gestures based on the recorded first and second paths are displayed, revealing the system commands that map to the symbolic gestures.
At 520, a third portion of the direction-specific symbolic swipe gesture is received. At 522, a third path of the third portion of the direction-specific symbolic swipe gesture is recorded. Example first, second, and third portions of the direction-specific symbolic swipe gesture are produced from a digit of a hand of a user, a stylus, and/or an action of a pointing device. In one example, the first, second, and third portions of the direction-specific symbolic swipe gesture are produced from a digit of a first hand (e.g., dominant hand) of the user, and the initial user activation event is produced from a digit of a second hand (e.g., non-dominant hand) of the user.
At 524, in response to the recorded first, second, and third paths of the direction-specific symbolic swipe gesture and a trigger, a first system command is accessed that maps to the completed direction-specific symbolic swipe gesture. Example triggers are produced in response to a digit being lifted from a touchscreen, a stylus being lifted from a touchscreen, and/or a second action of the pointing device. At 526, the accessed first system command is executed.
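With three (or more) portions, the gesture set forms a prefix structure: each pause reveals the continuations of the path recorded so far, and the trigger executes the command at the node reached. A TypeScript sketch using an assumed prefix tree over direction tokens follows; the node layout and helper names are illustrative only.

```typescript
type Direction = string;

interface Node {
  command?: string;               // set on leaves that map to a system command
  children: Map<Direction, Node>; // next possible gesture portions
}

function insert(root: Node, directions: Direction[], command: string): void {
  let node = root;
  for (const d of directions) {
    if (!node.children.has(d)) node.children.set(d, { children: new Map() });
    node = node.children.get(d)!;
  }
  node.command = command;
}

function walk(root: Node, recorded: Direction[]): Node | undefined {
  let node: Node | undefined = root;
  for (const d of recorded) node = node?.children.get(d);
  return node;
}

const root: Node = { children: new Map() };
insert(root, ['up', 'right', 'down'], 'format-document');

// At each pause, the children of the current node are the revealable options;
// on the trigger, the reached node's command (if any) is executed.
const atSecondPause = walk(root, ['up', 'right']);
console.log([...(atSecondPause?.children.keys() ?? [])]); // ['down']
console.log(walk(root, ['up', 'right', 'down'])?.command); // 'format-document'
```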
The computer implemented methods 400 and 500 employ the combination of the initial user activation event and the direction-specific symbolic swipe gesture, which allows system commands to be easily accessed and executed. Traditional marking menus typically can call functions based on only single vectors. The combination of an initial activation gesture and a secondary function-selection gesture (e.g., curved gestures or graffiti-style gestures) of the disclosed embodiments can permit a significant increase in the number of system commands available to be accessed and executed. For example, assuming eight discrete swipe directions, a single-vector marking menu can expose eight commands, while a two-segment direction-specific symbolic swipe gesture can expose up to 8 × 8 = 64. Via repeated entry of direction-specific symbolic swipe gestures, a user can develop muscle memory for specific system commands, further increasing productivity.
Although specific embodiments have been illustrated and described herein, it will be appreciated by those of ordinary skill in the art that a variety of alternate and/or equivalent implementations may be substituted for the specific embodiments shown and described without departing from the scope of the present invention. This application is intended to cover any adaptations or variations of the specific embodiments discussed herein. Therefore, it is intended that this invention be limited only by the claims and the equivalents thereof.