The subject matter disclosed herein relates to human-machine interfaces, and more particularly, to dynamic contextual touch menus.
Multi-touch user interfaces often suffer from low information density, as it is difficult to balance ease of use for a touch device with a large number of user interface (UI) elements. This holds particularly true in control systems, where human-machine interfaces (HMIs) for industrial control software are very rich in detailed information. Progressive disclosure patterns, such as popup or context menus, collapse/expand panels, and semantic zoom, selectively provide and hide access to underlying information. UI elements in a pointer-based environment may not translate well into a multi-touch environment. The term “pointer-based”, as used herein, refers to environments using a movable onscreen pointer or cursor and may include mice, trackballs, touchpads, pointing sticks, joysticks, and the like, where the input device and display device are separate elements. A multi-touch device can recognize the presence of two or more points of contact on a touch-sensitive surface.
As one example, a typical activation gesture in a multi-touch environment for a popup menu is a “tap hold” operation that can be uncomfortable and time consuming. Another common mouse UI element in engineering tools is a property grid, which provides an information-dense UI control with poor usability on multi-touch devices. “Tooltips” are commonly used in pointer-based HMIs and engineering tools to provide details about an element of the UI when a pointer hovers over the element; however, in a multi-touch environment without hover events, the use of tooltips is not possible.
One aspect of the invention is a system for providing a dynamic contextual touch menu. The system includes a multi-touch display and processing circuitry coupled to the multi-touch display. The processing circuitry is configured to detect a contextual menu display request in response to a touch detected on the multi-touch display. The processing circuitry is configured to display a dynamic contextual touch menu associated with a first element as a targeted element in response to the detected contextual menu display request. The processing circuitry is also configured to modify content of the dynamic contextual touch menu to align with a second element as the targeted element in response to a detected motion on the multi-touch display between the first and second elements.
Another aspect of the invention is a method for providing a dynamic contextual touch menu. The method includes detecting, by processing circuitry coupled to a multi-touch display, a contextual menu display request in response to a touch detected on the multi-touch display. The method further includes displaying, on the multi-touch display, a dynamic contextual touch menu associated with a first element as a targeted element in response to detecting the contextual menu display request. The processing circuitry modifies content of the dynamic contextual touch menu to align with a second element as the targeted element in response to a detected motion on the multi-touch display between the first and second elements.
Another aspect of the invention is a computer program product for providing a dynamic contextual touch menu. The computer program product includes a non-transitory computer readable medium storing instructions for causing processing circuitry coupled to a multi-touch display to implement a method. The method includes detecting a contextual menu display request in response to a touch detected on the multi-touch display. The method further includes displaying, on the multi-touch display, a dynamic contextual touch menu associated with a first element as a targeted element in response to detecting the contextual menu display request. The processing circuitry modifies content of the dynamic contextual touch menu to align with a second element as the targeted element in response to a detected motion on the multi-touch display between the first and second elements.
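The detect/display/retarget sequence recited in the aspects above can be illustrated with a minimal Python sketch. The `ContextMenu` class, the element dictionaries, and the `menu_items` key are illustrative assumptions, not part of the disclosure:

```python
class ContextMenu:
    """Minimal sketch of a dynamic contextual touch menu.

    The menu is opened against a first targeted element; as a drag
    motion moves it over a second element, its content is rebuilt
    to align with the new target.
    """

    def __init__(self):
        self.visible = False
        self.target = None
        self.content = None

    def on_display_request(self, element):
        # A contextual menu display request was detected from a touch.
        self.visible = True
        self._retarget(element)

    def on_drag_over(self, element):
        # Motion detected between elements: retarget the open menu.
        if self.visible and element is not self.target:
            self._retarget(element)

    def _retarget(self, element):
        self.target = element
        # Content is customized to the targeted element's properties.
        self.content = element.get("menu_items", [])


# Hypothetical elements with differing properties.
valve = {"name": "valve", "menu_items": ["open", "close", "trend"]}
sensor = {"name": "sensor", "menu_items": ["calibrate", "trend"]}

menu = ContextMenu()
menu.on_display_request(valve)   # menu opens aligned to the first element
menu.on_drag_over(sensor)        # content modified for the second element
```

Because retargeting rebuilds the content rather than merely repositioning the menu, elements with different properties naturally yield different menu entries, which is the behavior each of the three claimed aspects shares.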
These and other advantages and features will become more apparent from the following description taken in conjunction with the drawings.
The subject matter, which is regarded as the invention, is particularly pointed out and distinctly claimed in the claims at the conclusion of the specification. The foregoing and other features and advantages of the invention are apparent from the following detailed description taken in conjunction with the accompanying drawings in which:
The detailed description explains embodiments of the invention, together with advantages and features, by way of example with reference to the drawings.
In the example of
The control system framework 102 may interface to various processing systems 112 via a network 114. The network 114 may also interface to one or more remote data storage systems 116. A local data storage system 118, which can include fixed or removable media, may be accessible to or integrated with the control system framework 102. A wireless interface 120 can enable wireless access to the control system framework 102 by one or more mobile devices 122. In exemplary embodiments, the mobile devices 122 respectively include multi-touch displays 124 that enable touchscreen-based navigation and control of elements within the control system framework 102. The wireless interface 120 may be part of the network 114 or be separately implemented.
The control system framework 102 can also or alternatively interface locally to one or more multi-touch displays 126 via display drivers 128. The multi-touch displays 126 can be large form factor displays, i.e., non-mobile device displays. For example, the multi-touch displays 126 can be mounted vertically or horizontally to a support structure or integrated within a support structure, such as a touch-sensitive computer table surface. The display drivers 128 produce a variety of interactive user interfaces to support access, control, monitoring, and troubleshooting of the control subsystems 104.
The control system framework 102 can also include a number of additional features, such as a human-machine interface (HMI) 130, a trender 132, a device information module 134, and a code module 136. The HMI 130 may provide direct control and monitoring of the control subsystems 104. The trender 132 can monitor, log, and display data from the sensors 108, system status, and various derived signals from the control subsystems 104. The trender 132 may store recorded data locally in the local data storage system 118 for logging and analyzing recent events, while long-term data can be stored to and extracted from the one or more remote data storage systems 116. The device information module 134 can identify, display and edit information associated with selected devices. The device information module 134 may access the remote and/or local data storage systems 116 and 118 for device data. Device data that may be accessed by the device information module 134 can include properties, configurable parameters, data sheets, inventory information, troubleshooting guides, maintenance information, alarms, notifications, and the like. The code module 136 can display underlying code used to design and interface with other modules such as the HMI 130. The code module 136 can access underlying code stored on the remote and/or local data storage systems 116 and 118, and display the code in a graphical format to further assist with troubleshooting of problems within the control system environment 100.
Although a number of features are depicted as part of the control system environment 100 and the control system framework 102, it will be understood that various modules can be added or removed within the scope of various embodiments. For example, the wireless interface 120 can be omitted where the mobile devices 122 are not supported. The code module 136 can be omitted where the underlying code is not made visible to users of the control system framework 102. Additionally, user accounts can be configured with different levels of permissions to view, access, and modify elements and features within the control system framework 102. For example, a user may only be given access to the trender 132 and/or the device information module 134 to support analysis and troubleshooting while blocking access to change states of parameters of the control subsystems 104.
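The add/remove-module and per-user-permission scheme described above might be modeled as follows. The `Framework` class, module names, and permission sets are hypothetical illustrations of the described behavior:

```python
class Framework:
    """Sketch of a framework whose feature modules can be added or
    removed, with per-user permission gating of which modules a
    given user may view or access."""

    def __init__(self):
        self.modules = {}
        self.user_permissions = {}

    def register(self, name, module):
        # Add a feature module (e.g., trender, device information).
        self.modules[name] = module

    def unregister(self, name):
        # Remove an unsupported module (e.g., code module not exposed).
        self.modules.pop(name, None)

    def grant(self, user, *module_names):
        self.user_permissions.setdefault(user, set()).update(module_names)

    def accessible(self, user):
        # Only modules that are both installed and permitted are visible.
        allowed = self.user_permissions.get(user, set())
        return {n: m for n, m in self.modules.items() if n in allowed}


fw = Framework()
fw.register("trender", object())
fw.register("hmi", object())
# "device_info" is granted but not installed, so it stays hidden.
fw.grant("analyst", "trender", "device_info")
```

Under this sketch, a troubleshooting user granted only trender access sees the trender and nothing else, mirroring the example of blocking access to change states of the control subsystems.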
In exemplary embodiments, in terms of hardware architecture, as shown in
The processing circuitry 205 is hardware for executing software, particularly software stored in memory 210. The processing circuitry 205 can include any custom made or commercially available processor, a central processing unit (CPU), an auxiliary processor among several processors associated with the processing system 201, a semiconductor based microprocessor (in the form of a microchip or chip set), a macroprocessor, or generally any device for executing software instructions.
The memory 210 can include any one or combination of volatile memory elements (e.g., random access memory (RAM, such as DRAM, SRAM, SDRAM, etc.)) and nonvolatile memory elements (e.g., ROM, erasable programmable read only memory (EPROM), electronically erasable programmable read only memory (EEPROM), flash memory, memory card, programmable read only memory (PROM), tape, compact disc read only memory (CD-ROM), digital versatile disc (DVD), disk, diskette, cartridge, cassette or the like, etc.). Moreover, the memory 210 may incorporate electronic, magnetic, optical, and/or other types of storage media. The memory 210 can have a distributed architecture, where various components are situated remote from one another but can be accessed by the processing circuitry 205.
Software in memory 210 may include one or more separate programs, each of which includes an ordered listing of executable instructions for implementing logical functions. In the example of
The control system framework 102 as described herein may be implemented in the form of a source program, executable program (object code), script, or any other entity comprising a set of instructions to be performed. When implemented as a source program, the program may be translated via a compiler, assembler, interpreter, or the like, which may or may not be included within the memory 210, so as to operate properly in conjunction with the OS 211. Furthermore, the control system framework 102 can be written in an object oriented programming language, which has classes of data and methods, or a procedural programming language, which has routines, subroutines, and/or functions.
In exemplary embodiments, the input/output controller 235 receives touch-based inputs from the multi-touch display 126 as detected touches, gestures, and/or movements. The multi-touch display 126 can detect input from one finger 236, multiple fingers 237, a stylus 238, and/or another physical object 239. Multiple inputs can be received contemporaneously or sequentially from one or more users. The multi-touch display 126 may also support physical object recognition using, for instance, one or more scannable code labels 242 on each physical object 239. In one example, the multi-touch display 126 includes infrared (IR) sensing capabilities to detect touches, shapes, and/or scannable code labels. Physical object 239 may be, for instance, a user identification card having an associated IR-detectable pattern for the user as one or more scannable code labels 242 to support login operations or user account and permissions configuration.
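The scannable-label login described above might be modeled as a lookup from a detected label code to a user session. The label registry, code strings, and user records here are hypothetical, assumed only for illustration:

```python
# Hypothetical registry mapping IR-detectable label codes to user accounts.
LABEL_REGISTRY = {
    "IR-7F3A": {"user": "operator1",
                "permissions": {"trender", "device_info"}},
    "IR-91C2": {"user": "engineer1",
                "permissions": {"trender", "device_info", "hmi", "code"}},
}


def login_from_label(scanned_code):
    """Resolve a scanned code label to a user session, or None if unknown.

    Models a user identification card placed on the multi-touch surface,
    where the display's sensing recognizes the label and logs the user in
    with that account's permission set.
    """
    record = LABEL_REGISTRY.get(scanned_code)
    if record is None:
        return None
    return {"user": record["user"],
            "permissions": set(record["permissions"])}


session = login_from_label("IR-7F3A")
```

A design choice worth noting: resolving the label to a permission set at login time lets the same physical object drive both the login operation and the permissions configuration mentioned above.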
The I/O devices 240, 245 may include other input or output devices, for example, but not limited to, a printer, a scanner, a microphone, speakers, a secondary display, and the like. The I/O devices 240, 245 may further include devices that communicate both inputs and outputs, for instance but not limited to, components of the wireless interface 120 of
In exemplary embodiments, the system 200 can further include a network interface 260 for coupling to the network 114. The network 114 can be an internet protocol (IP)-based network for communication between the processing system 201 and any external server, client and the like via a broadband connection. The network 114 transmits and receives data between the processing system 201 and external systems. In exemplary embodiments, network 114 can be a managed IP network administered by a service provider. The network 114 may be implemented in a wireless fashion, e.g., using wireless protocols and technologies, such as WiFi, WiMax, etc. The network 114 can also be a packet-switched network such as a local area network, wide area network, metropolitan area network, Internet network, or other similar type of network environment. The network 114 may be a fixed wireless network, a wireless local area network (LAN), a wireless wide area network (WAN), a personal area network (PAN), a virtual private network (VPN), intranet or other suitable network system and includes equipment for receiving and transmitting signals.
If the processing system 201 is a PC, workstation, intelligent device or the like, software in the memory 210 may further include a basic input output system (BIOS) (omitted for simplicity). The BIOS is a set of essential software routines that initialize and test hardware at startup, start the OS 211, and support the transfer of data among the hardware devices. The BIOS is stored in ROM so that the BIOS can be executed when the processing system 201 is activated.
When the processing system 201 is in operation, the processing circuitry 205 is configured to execute software stored within the memory 210, to communicate data to and from the memory 210, and to generally control operations of the processing system 201 pursuant to the software. The control system framework 102, the OS 211, and applications 212 in whole or in part, but typically the latter, are read by the processing circuitry 205, perhaps buffered within the processing circuitry 205, and then executed.
When the systems and methods described herein are implemented in software, as is shown in
When a user desires to display, select, and/or edit contextual information or commands for a targeted element, the user makes a contextual menu display request as a touch-based command on the multi-touch display 126, thereby triggering pop-up display of a dynamic contextual touch menu 302 as depicted in
The example user interface 300 includes a palette of icons 306 as touch-sensitive options, such as work set navigation, layout/view change, orientation/display rotation, and logging in/out. The palette of icons 306 may also include a context icon 308. A user may touch the context icon 308 and apply a dragging motion between the context icon 308 and a targeted element 310, resulting in displaying the dynamic contextual touch menu 302 on the multi-touch display 126. In the example of
The dynamic contextual touch menu 302 is dynamic in that content 314 of the dynamic contextual touch menu 302 is customized to align with the targeted element 310, and the content 314 can be modified as the dynamic contextual touch menu 302 is maneuvered to align with different elements. For example, moving the dynamic contextual touch menu 302 between two elements that have different properties can result in modifying the content 314 displayed by the dynamic contextual touch menu 302, as well as producing layout/formatting changes of the dynamic contextual touch menu 302. In the example of
Example content 314 of the dynamic contextual touch menu 302 of
The example user interface 300 of
As previously described, the content 314 of the dynamic contextual touch menu 302 can be modified as the dynamic contextual touch menu 302 is maneuvered to align with different elements.
The content 314 of the dynamic contextual touch menu 302 is modified between
When a user desires to display, select, and/or edit contextual information or commands for a targeted element, the user makes a contextual menu display request as a touch-based command on the multi-touch display 126, thereby triggering pop-up display of a dynamic contextual touch menu. In response to a detected contextual menu display request, the example of
When a user desires to maintain the first dynamic contextual touch menu 402a and include additional dynamic contextual touch menus 402, the user can make one or more additional contextual menu display requests to open, for instance, a second dynamic contextual touch menu 402b as depicted in
In the example of
As an individual dynamic contextual touch menu 402 is dragged across the trend window 404, it is modified based on the underlying targeted element 410 such that it may appear as the first dynamic contextual touch menu 402a at targeted element 410a, as the second dynamic contextual touch menu 402b at targeted element 410b, and as the third dynamic contextual touch menu 402c at targeted element 410c. As one example, the first dynamic contextual touch menu 402a can adjust thickness and color of one of the selected signals 409, and then be dragged over a different signal line to change that line's thickness. Other variations of content and formatting of each dynamic contextual touch menu 402 can exist in other locations. Other examples can include circularly formatted dynamic contextual touch menus similar to
The dynamic contextual touch menus 402a and 402b may also support updating of parameters by copying values between the dynamic contextual touch menus 402a and 402b. For example, the configurable maximum value 502 of the dynamic contextual touch menu 402a can be copied to the configurable maximum value 512 of the dynamic contextual touch menu 402b using a copying motion by touching the configurable maximum value 502 and applying a dragging motion 520 over to the configurable maximum value 512.
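The value-copy interaction between two open menus can be sketched as below. The `ParamMenu` structure and parameter names are illustrative stand-ins for the configurable maximum values 502 and 512:

```python
class ParamMenu:
    """Sketch of an open contextual menu exposing configurable parameters
    of its targeted element."""

    def __init__(self, target, params):
        self.target = target
        self.params = dict(params)


def copy_parameter(source_menu, dest_menu, name):
    """Copy one parameter value between two open menus.

    Models touching a value on the source menu and applying a dragging
    motion that ends over the corresponding field on the destination menu.
    """
    if name in source_menu.params and name in dest_menu.params:
        dest_menu.params[name] = source_menu.params[name]


menu_a = ParamMenu("signal_a", {"maximum": 100.0, "minimum": 0.0})
menu_b = ParamMenu("signal_b", {"maximum": 50.0, "minimum": 5.0})
copy_parameter(menu_a, menu_b, "maximum")  # analogue of dragging 502 to 512
```

Only the dragged parameter is transferred; the destination menu's other values remain tied to its own targeted element.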
At block 606, after a contextual menu display request is detected at block 604, a dynamic contextual touch menu, such as the dynamic contextual touch menu 302 or 402a-402c, at a targeted element is displayed on the multi-touch display 126. As illustrated in the example of
At block 608, the processing circuitry 205 determines whether there is a new targeted element based on input from the multi-touch display 126. For example, the processing circuitry 205 can detect a motion on the multi-touch display 126, such as the dragging motion 340 of
At block 610, based on detecting a new targeted element in block 608, the processing circuitry 205 modifies the content of the dynamic contextual touch menu, such as content 314 of the dynamic contextual touch menu 302 of
At block 612, the processing circuitry 205 determines whether a close action is detected. In the example of
Multiple instances of the process 600 can operate in parallel such that additional dynamic contextual touch menus can be displayed on the multi-touch display 126 contemporaneously in response to detecting additional contextual menu display requests. An example of this is depicted in
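Running multiple instances of the process in parallel can be modeled as a manager that opens an independent menu per display request and closes only the menu a close action targets. The `MenuManager` class and element dictionaries are assumptions for illustration:

```python
class MenuManager:
    """Sketch of contemporaneous dynamic contextual touch menus.

    Each contextual menu display request opens an independent menu
    instance; a detected close action removes only the menu it targets,
    leaving any other displayed menus in place.
    """

    def __init__(self):
        self.open_menus = []

    def display_request(self, element):
        # Block 606: display a menu aligned to the targeted element.
        menu = {"target": element,
                "content": element.get("menu_items", [])}
        self.open_menus.append(menu)
        return menu

    def close(self, menu):
        # Block 612: a close action ends only this menu's instance.
        self.open_menus.remove(menu)


mgr = MenuManager()
m1 = mgr.display_request({"name": "pump", "menu_items": ["start", "stop"]})
m2 = mgr.display_request({"name": "fan", "menu_items": ["speed"]})
mgr.close(m1)  # the second menu remains displayed
```

Keeping each menu's state in its own instance is what allows the value-copy gesture between two simultaneously displayed menus described earlier.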
In exemplary embodiments, a technical effect is modifying contents of a dynamic contextual touch menu to align with a targeted element as the dynamic contextual touch menu is moved between elements. Modification of the dynamic contextual touch menu presents relevant information and/or commands based on a targeted element. Support for simultaneous display of multiple dynamic contextual touch menus enables copying of values between the dynamic contextual touch menus.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method or computer program product. Accordingly, aspects may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module” or “system.” Furthermore, aspects may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.
Any combination of one or more computer readable medium(s) may be utilized including a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium as a non-transitory computer program product may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
Aspects are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which includes one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the Figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In exemplary embodiments, where the control system framework 102 of
While the invention has been described in detail in connection with only a limited number of embodiments, it should be readily understood that the invention is not limited to such disclosed embodiments. Rather, the invention can be modified to incorporate any number of variations, alterations, substitutions or equivalent arrangements not heretofore described, but which are commensurate with the spirit and scope of the invention. Additionally, while various embodiments have been described, it is to be understood that aspects may include only some of the described embodiments. Accordingly, the invention is not to be seen as limited by the foregoing description, but is only limited by the scope of the appended claims.