Traditional computing devices such as computers, message boards, electronic billboards, and monitoring devices are controlled directly through a user interface using input hardware. Typically, they are controlled using input devices such as a mouse, remote control, keyboard, stylus, touch screen, or the like. Touch-enabled devices, however, are typically controlled through a touch interface by the detection and analysis of touch input from a user. In touch interfaces, input devices such as a keyboard, stylus, or mouse are not fully integrated with the touch-enabled device, and commands for controlling operations on software, applications, or documents on the device are not easily accessible. For example, keyboards have multiple keys for navigating and selecting options, and a typical mouse can be used to select options, scroll, and display and navigate menus utilizing a right-click function. Since these navigation and selection tools are not available in touch interfaces, editing a document or making changes in a program may be limited and may be much slower than on traditional computing devices with integrated input hardware.
Some touch devices integrate menus for navigating and executing commands on a touch-enabled device at the top or bottom edges of the interface screen. The menus may provide more accessible options for editing and navigating documents; however, the menus take up valuable screen space on a touch screen interface and may obstruct the view of the document or provide a smaller working view of a document. Generally, it is desirable to maximize the working view of a document or application by hiding menus and commands until the user needs them.
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This summary is not intended to exclusively identify key features or essential features of the claimed subject matter, nor is it intended as an aid in determining the scope of the claimed subject matter.
Embodiments are directed to providing a compact control menu over an interactive touch interface where a user may interact with a touch-enabled device to execute commands. According to some embodiments, a compact control menu may be provided after a user makes a touch selection in a document to aid in the user's ability to execute common control commands quickly and in the context of the selection. The compact control menu may initially appear in a collapsed state displaying a limited number of commands, and may allow the user to swipe in a particular direction for executing a command. The compact control menu may be expanded to display more command options upon a trigger to expand initiated by a particular user touch motion on the touch interface. The user may execute a command from the expanded command menu and after command execution, the compact control menu may disappear until a further user selection within a document.
These and other features and advantages will be apparent from a reading of the following detailed description and a review of the associated drawings. It is to be understood that both the foregoing general description and the following detailed description are explanatory and do not restrict aspects as claimed.
As briefly described above, a compact control menu may be presented to a user over an interactive touch interface in order for a user to execute commands on a touch-enabled device. When a user makes a touch selection in a document, the compact control menu may be provided to aid in the user's ability to execute common control commands quickly and in the context of the selection. The compact control menu may initially appear in a collapsed state displaying a limited number of commands, allowing the user to swipe in a direction for executing a command; and the compact control menu may be expanded to display more command options upon a trigger to expand initiated by a particular user touch motion on the touch interface. After a command execution, the compact control menu may disappear until a further user selection within a document.
In the following detailed description, references are made to the accompanying drawings that form a part hereof, and in which are shown by way of illustration specific embodiments or examples. These aspects may be combined, other aspects may be utilized, and structural changes may be made without departing from the spirit or scope of the present disclosure. The following detailed description is therefore not to be taken in a limiting sense, and the scope of the present invention is defined by the appended claims and their equivalents.
While the embodiments will be described in the general context of program modules that execute in conjunction with an application program that runs on an operating system on a computing device, those skilled in the art will recognize that aspects may also be implemented in combination with other program modules.
Generally, program modules include routines, programs, components, data structures, and other types of structures that perform particular tasks or implement particular abstract data types. Moreover, those skilled in the art will appreciate that embodiments may be practiced with other computer system configurations, including hand-held devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers, and comparable computing devices. Embodiments may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.
Embodiments may be implemented as a computer-implemented process (method), a computing system, or as an article of manufacture, such as a computer program product or computer readable media. The computer program product may be a computer storage medium readable by a computer system and encoding a computer program that comprises instructions for causing a computer or computing system to perform example process(es). The computer-readable storage medium can for example be implemented via one or more of a volatile computer memory, a non-volatile memory, a hard drive, a flash drive, a floppy disk, or a compact disk, and comparable media.
Throughout this specification, the term “platform” may be a combination of software and hardware components for providing a compact control menu over an interactive touch interface and detecting user touch input for expanding the control menu and executing commands. Examples of platforms include, but are not limited to, a hosted service executed over a plurality of servers, an application executed on a single computing device, and comparable systems. The term “server” generally refers to a computing device executing one or more software programs typically in a networked environment. However, a server may also be implemented as a virtual server (software programs) executed on one or more computing devices viewed as a server on the network. More detail on these technologies and example operations is provided below.
Referring to
In a system according to embodiments, when a user 102 views a document or application over a user interface, control options may not initially be visible on the screen while the user is reading and scrolling through the document. When the user desires to execute a command for editing a portion of the document, the user may select a portion of the document for editing 108 using a touch selection motion, and the compact control menu 110 may appear after a user 102 creates a selection in the document. The compact control menu 110 may be anchored 112 to the selected portion 108 for indicating which selection the compact control menu 110 may be associated with. The user may then use a touch motion to execute a command displayed on the compact control menu 110. The selected portion of the document (or user interface) may be a text portion, a graphic portion, a number of cells in a table, a portion of an image, or a combination of any of those.
As demonstrated in
In another embodiment, the number of available commands may be displayed as defined extensions 114 from the compact control menu 110 representing the commands for executing, such as for example a cross, a star, or a flower, where parts of the shape extend from the center as points or extensions 114. In other embodiments, additional or fewer commands may be represented by defined regions of the shape of the compact control menu and the compact control menu 110 may be of any alternative shape which indicates the number of available commands. An extension as used herein may refer to a defined region of a shape, corner, or point extending from the center of a shape wherein the extension serves to represent a direction for an available command.
As demonstrated in
In an embodiment, one defined region 214 may represent a command for expanding the compact control menu 210 to display more available commands. The user may swipe in the direction of the region for expanding the compact control menu from the collapsed state to the expanded state causing more available commands to be displayed. Alternatively, the user may tap the compact control menu 210 to trigger the menu to expand to display more available commands.
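The distinction between a tap and a swipe can be drawn in several ways; one common approach, sketched below, is to compare the total displacement of a touch against a threshold and, for swipes, pick the dominant axis to obtain a four-way direction. The threshold value, function name, and coordinate conventions here are illustrative assumptions, not part of the described embodiments.

```python
# Illustrative sketch only: distinguish a tap from a swipe by touch
# displacement. The threshold and names are assumptions for illustration.
import math

SWIPE_THRESHOLD = 20.0  # pixels; below this, the touch counts as a tap

def classify_touch(start, end):
    """Classify a touch as ("tap", None) or ("swipe", direction)."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    if math.hypot(dx, dy) < SWIPE_THRESHOLD:
        return ("tap", None)
    # Dominant axis gives a four-way swipe direction (screen y grows down).
    if abs(dx) >= abs(dy):
        return ("swipe", "right" if dx > 0 else "left")
    return ("swipe", "down" if dy > 0 else "up")
```

A tap on the collapsed menu would then trigger expansion, while a classified swipe direction would be mapped to a command region.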
As illustrated in diagram 200, the expanded state compact control menu may display which command is represented by each region of the compact control menu 210. For example, commands such as copy, cut 208, and paste 206 may be displayed at each defined region 214 of the compact control menu 210. The commands may be displayed on the menu using text 202 or graphics 206, 208, or, in other examples, the commands may be represented by symbols, icons, abbreviations, or full text labels in various orientations. In a further example of an expanded state, two or more available commands may be displayed at each region that a user may select 212.
Available commands may be programmed into fixed positions at each defined region of the compact control menu so that the user(s) can develop a habit of always swiping in a particular direction to execute a particular command. For example, as shown in diagram 200, the user may always swipe to the left in order to execute a cut 208 operation. When the user remembers where each command is positioned, the user may utilize the compact control menu 110 in its collapsed state without the need to expand the menu into its expanded state 210 in order to quickly execute routine command functions.
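The fixed association between swipe directions and commands described above amounts to a simple lookup table. The sketch below uses made-up command names and an illustrative four-direction layout; actual commands and positions may be configured differently, as the following paragraphs note.

```python
# Hypothetical direction-to-command mapping; command names are assumptions
# chosen for illustration, not an actual product configuration.
SWIPE_COMMANDS = {
    "left": "cut",
    "right": "copy",
    "up": "paste",
    "down": "expand_menu",
}

def dispatch_swipe(direction):
    """Return the command bound to a swipe direction, or None if unbound."""
    return SWIPE_COMMANDS.get(direction)
```

Because the mapping is stable, a practiced user can execute a command from the collapsed menu without ever seeing the expanded labels.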
In an embodiment, the command positions may be pre-programmed as part of the system, or alternatively, the user(s) may select which commands the user desires to be associated with the defined regions of the compact control menu, the number of commands that should be displayed in the collapsed state menu, and the position of the commands at each region. Additionally, the user may select the type of display of the commands that the user prefers in the expanded state of the compact control menu 210. For example, the user may prefer icons or graphic images to represent a command, or a user may prefer a textual representation. The user may select an abbreviation, or may customize a representation for a command. Additionally the user may choose size, font, and orientation of command displays in the expanded state compact control menu 210 based on the user's custom preferences.
In a further example, an available command may have two or more options associated with the command 314, such that a user may desire to view and select from the available options. For example, as demonstrated in diagram 300, the right extension 312 may display two available commands, a text command 314 and an alignment command 302. A variety of options may be available for executing a command associated with the text command 314, and the compact control menu 310 in the expanded state may be configured to expand further to display a second level control menu 320 with the available options and commands associated with the first level commands 312. In example embodiments, the second level control menu 320 with more available commands may be presented as a popup menu, a dropdown menu, or an extension 320 anchored to the first extension 312. In further examples, more levels of expansion may be provided for presenting available commands and options. The user may execute a command from the second level control menu 320 by tapping the selected option. If the selection is an executable command, the tapping motion may execute the command; alternatively, if the selection is a command with more available options, the tapping may expand the menu to present the further options. After execution of a command to edit the selection of the document or application, the compact control menu 310 may disappear from the screen view so that the user may continue to review a document or application on the touch-enabled device 300 without obstruction of the view by the control menu 310. Upon another selection by the user, the compact control menu may appear again for presenting available commands relating to the newly selected portion of the document.
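Multi-level expansion of this kind can be modeled as a tree in which leaves are executable commands and interior nodes are further menus. The sketch below is a minimal illustration under that assumption; the command tree contents and function names are invented for the example.

```python
# Illustrative nested-menu tree: interior dicts are further menu levels,
# string leaves are executable commands. All names are made up.
MENU = {
    "text": {                  # first-level command with a second-level menu
        "bold": "apply_bold",
        "italic": "apply_italic",
    },
    "cut": "do_cut",           # first-level command that executes directly
}

def resolve(menu, path):
    """Walk a sequence of taps down the menu tree.

    Returns ("execute", action) when the path ends at an executable
    command, or ("expand", submenu) when more options remain.
    """
    node = menu
    for key in path:
        node = node[key]
    if isinstance(node, dict):
        return ("expand", node)
    return ("execute", node)
```

A tap then either executes a leaf or expands the next level, matching the behavior described for the second level control menu.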
In another embodiment, the number of available commands may be displayed as defined extensions 420 from the compact control menu representing the commands for executing, such as for example a cross, a star, or a flower, where parts of the shape extend from the center as points or extensions. In another example, the compact control menu may be in a shape with four corners 430, where each corner region represents four distinct commands. Different numbers of commands may be represented by regions of the shape of the compact control menu, and the compact control menu may be of any alternative shape which indicates the number of available commands. The commands may be displayed as text 434, icons 414, abbreviations 424, or a combination of representations 404. The display options and positions of the commands in defined regions of the control menu may be predefined by the system, or a user may customize them to the user's preferences.
As illustrated in diagram 400, in an embodiment, each defined region may represent one command 404, 406, such that a swipe in a direction executes the command associated with the defined region in that direction. Alternatively each defined region may represent two or more available commands 432, 434, 436. The initial compact control menu may include an additional expansion menu 410 for expanding from the collapsed state to the expanded state control menu 440. The first shape 430 may allow a user to swipe in the direction of any of the defined regions to expand the available commands for that region. For example, if the user swiped in the left direction of menu 430, then the available commands on the left region would be displayed 432. The control menu may also include a second expansion menu 410, that when tapped by a user causes all of the regions of the collapsed state control menu to expand such that all available commands are displayed to the user 440. The user may then select one of the available commands from any of the displayed regions to further expand the control menu, generating a second level control menu 438 displaying more available options for the selected command.
The example systems in
Client applications executed on any of the client devices 511-513 may facilitate communications via application(s) executed by servers 515, or on individual server 516. An application executed on one of the servers may facilitate the detection of a user touch selection in a document, presenting a compact control menu associated with the selection, and detecting user touch input for expanding the control menu and executing commands. The application may retrieve relevant data from data store(s) 519 directly or through database server 518, and provide requested services (e.g. document editing) to the user(s) through client devices 511-513.
Network(s) 510 may comprise any topology of servers, clients, Internet service providers, and communication media. A system according to embodiments may have a static or dynamic topology. Network(s) 510 may include secure networks such as an enterprise network, an unsecure network such as a wireless open network, or the Internet. Network(s) 510 may also coordinate communication over other networks such as Public Switched Telephone Network (PSTN) or cellular networks. Furthermore, network(s) 510 may include short range wireless networks such as Bluetooth or similar ones. Network(s) 510 provide communication between the nodes described herein. By way of example, and not limitation, network(s) 510 may include wireless media such as acoustic, RF, infrared and other wireless media.
Many other configurations of computing devices, applications, data sources, and data distribution systems may be employed to implement a platform for providing a compact control menu over an interactive touch interface and detecting user touch input for expanding the control menu and executing commands. Furthermore, the networked environments discussed in
Control menu application 622 may enable a computing device 600 to continually monitor user touch input from a touch interface in order to detect user selection of a portion of a document, provide a control menu displaying available commands, detect user selection of a command, and execute commands associated with the user selection. Through command module 624, control menu application 622 may display a compact control menu associated with a selected portion of a document, and may detect user touch input to display the compact control menu in a collapsed state or an expanded state, and to execute commands associated with the selected content. The application may continuously detect user input and provide a compact control menu when a user creates a selection in a document, while minimizing user interface interference by disappearing from the screen view when a user has not selected a portion of a document for editing. Application 622 and command module 624 may be separate applications or integrated modules of a hosted service. This basic configuration is illustrated in
Computing device 600 may have additional features or functionality. For example, the computing device 600 may also include additional data storage devices (removable and/or non-removable) such as, for example, magnetic disks, optical disks, or tape. Such additional storage is illustrated in
Computing device 600 may also contain communication connections 616 that allow the device to communicate with other devices 618, such as over a wired or wireless network in a distributed computing environment, a satellite link, a cellular link, a short range network, and comparable mechanisms. Other devices 618 may include computer device(s) that execute communication applications, web servers, and comparable devices. Communication connection(s) 616 is one example of communication media. Communication media can include computer readable instructions, data structures, program modules, or other data. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media.
Example embodiments also include methods. These methods can be implemented in any number of ways, including the structures described in this document. One such way is by machine operations, of devices of the type described in this document.
Another optional way is for one or more of the individual operations of the methods to be performed in conjunction with one or more human operators performing some of the operations. These human operators need not be collocated with each other, but each can be only with a machine that performs a portion of the program.
Process 700 begins with operation 710, where user selection by touch in a document or application on a touch-enabled device is detected. At operation 720, the computing device presents a compact control menu associated with the selected portion of the document. The compact control menu may be initially presented in a collapsed state, such that defined regions of the compact control menu representing available commands associated with the selection are presented.
At operation 730, the computing device analyzes the touch motion and determines whether the user uses a tap or a swipe control motion. If a swipe motion is detected, then the device determines that the user intended to execute the command associated with the direction of the swipe. At operation 740, the device detects the swipe motion direction and executes the command associated with the defined region in the direction in which the user swiped. If a tap motion is detected, then at operation 750, the device operates to expand the compact control menu into an expanded state for displaying available commands. At operation 760, the device detects the control motion on the available commands of the expanded state control menu. If a tap is detected on an executable command, then at operation 780, the device operates to execute the command selected by the user tap motion from the expanded control menu. If a tap is detected on a command which may have more options associated with it, then at operation 770, the process returns to operation 750 to further expand the control menu and display more available command options. When a tap is detected on an executable command, then at operation 780 the device executes the command, and at operation 790 the compact control menu is dismissed and disappears from the screen view of the touch interface. According to some embodiments, a size and a number of operation commands displayed in an expanded state of the control menu may be determined based on a size of the displayed user interface.
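The flow of process 700 can be pictured as a small state machine over the menu states (hidden, collapsed, expanded). The following sketch is one possible rendering under that assumption; the class, method, and command names are invented for illustration and do not correspond to any actual implementation.

```python
# Illustrative state-machine sketch of process 700: a selection shows the
# collapsed menu, a swipe executes a bound command, a tap expands, and a tap
# on an executable command executes it and dismisses the menu. All names here
# are assumptions made for the example.
class CompactControlMenu:
    def __init__(self, commands, submenus=None):
        self.commands = commands          # swipe direction -> command name
        self.submenus = submenus or {}    # command name -> nested options
        self.state = "hidden"
        self.executed = []                # record of executed commands

    def on_selection(self):
        # Operations 710-720: selection detected, collapsed menu presented.
        self.state = "collapsed"

    def on_swipe(self, direction):
        # Operation 740: a swipe on the collapsed menu executes directly.
        if self.state == "collapsed" and direction in self.commands:
            self._execute(self.commands[direction])

    def on_tap(self, command=None):
        if self.state == "collapsed":
            self.state = "expanded"        # operation 750: expand the menu
        elif self.state == "expanded" and command is not None:
            if command in self.submenus:
                self.state = "expanded"    # operation 770: more options remain
            else:
                self._execute(command)     # operation 780: execute selection

    def _execute(self, command):
        self.executed.append(command)
        self.state = "hidden"              # operation 790: dismiss the menu
```

Modeling the flow this way makes the swipe shortcut and the tap-to-expand path two transitions out of the same collapsed state, matching the branching at operation 730.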
The operations included in process 700 are for illustration purposes. User touch input detection and providing a compact control menu may be implemented by similar processes with fewer or additional steps, as well as in different order of operations using the principles described herein.
The above specification, examples and data provide a complete description of the manufacture and use of the composition of the embodiments. Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims and embodiments.