Touch screens are often used within industrial automation systems to display machine status and controls, and to control the machines based on user gestures on the touch screen. These gestures may take a wide variety of forms and perform a wide variety of functions. For example, gestures may include moving, activating, deactivating, and rotating control objects corresponding to machine controls.
Example touch screen systems process user touches on the touch screen to determine which of the variety of gestures has occurred, then determine which control object the gesture was performed upon. Based on this information, the control object is modified with a new set of parameters (such as size, location, and the like) according to the user gestures.
In an embodiment, a method of operating a human-machine interface system is provided. The method includes displaying a plurality of objects on a display system, each of the plurality of objects displayed according to object parameters, and receiving touch information corresponding to a plurality of touches on the display system. The method also includes processing the touch information to identify a first object of the plurality of objects that was touched, and upon identifying the first object that was touched, processing the touch information to identify a first gesture represented by the plurality of touches. The method further includes, upon identifying the first gesture, determining a first command based on the first gesture, generating first new parameters for the first object based on the first command, and displaying at least the first object that was touched according to the first new parameters.
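Purely as an illustration, the following Python sketch shows one way this method could be realized in software. The DisplayObject class, the classify_gesture helper, and the drag and rotate behaviors are hypothetical stand-ins introduced here for clarity; they are not defined by this description.

```python
# Illustrative sketch only; all names and gesture rules here are hypothetical.
from dataclasses import dataclass, field
from typing import Dict, List, Optional, Tuple

Point = Tuple[float, float]

@dataclass
class DisplayObject:
    ident: str
    # Object parameters controlling how the object is drawn (x, y, w, h, angle).
    params: Dict[str, float] = field(default_factory=dict)

    def contains(self, p: Point) -> bool:
        return (self.params["x"] <= p[0] <= self.params["x"] + self.params["w"]
                and self.params["y"] <= p[1] <= self.params["y"] + self.params["h"])

def classify_gesture(touches: List[Point]) -> str:
    # Crude stand-in: one touched point is a drag, two or more a rotate.
    return "drag" if len(touches) == 1 else "rotate"

def handle_touches(objects: List[DisplayObject],
                   touches: List[Point]) -> Optional[DisplayObject]:
    # Identify the first object that was touched.
    obj = next((o for o in objects if any(o.contains(t) for t in touches)), None)
    if obj is None:
        return None                      # no object touched: ignore the gesture
    gesture = classify_gesture(touches)  # identify the gesture from the touches
    if gesture == "drag":                # determine a command and new parameters
        obj.params["x"], obj.params["y"] = touches[0]
    else:
        obj.params["angle"] = obj.params.get("angle", 0.0) + 90.0
    return obj       # caller redisplays the object with its new parameters
```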
In another embodiment, a processing device for the operation of a touch screen displaying a plurality of objects is provided. Each of the plurality of objects is displayed according to object parameters. The processing device includes a gesture engine coupled to the touch screen and configured to receive user inputs and determine a set of points touched based on the user inputs; a view engine coupled to the touch screen and the gesture engine and configured to receive the set of points from the gesture engine and determine an identity of an object touched from the plurality of objects based on the set of points touched; and an object engine coupled to the gesture engine and the view engine.
The object engine is configured to receive the identity of the object touched from the view engine, query the gesture engine for a command based on the user inputs and the object touched, receive a command from the gesture engine, and process the object touched and the command to determine a new set of parameters for the object. The view engine is also configured to receive the new set of parameters from the object engine, and to display the object configured by the new set of parameters on the touch screen.
In a further embodiment, a method for the operation of a touch screen is provided. The method includes displaying a plurality of objects on the touch screen, each of the plurality of objects displayed according to object parameters, and receiving user inputs from the touch screen at a gesture engine. The method also includes determining a set of points touched by the user inputs in the gesture engine, and processing the set of points touched in a view engine to determine an identity of an object touched from the plurality of objects. The method further includes receiving the identity of the object touched at an object engine, and transmitting the identity of the object touched to the gesture engine.
The method also includes determining a gesture in the gesture engine based on the user inputs and the object touched, determining a command in the gesture engine based on the gesture, receiving the command at the object engine, processing the command and the object touched in the object engine to determine a new set of parameters for the object, and displaying the object configured by the new set of parameters on the touch screen.
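As a non-limiting illustration of how the gesture engine, view engine, and object engine described above might cooperate, the following Python sketch models the three engines as in-process classes. Direct method calls stand in for the couplings between the engines, and the MOVE and ROTATE commands, like all names in the sketch, are hypothetical.

```python
# Illustrative sketch of the three-engine pipeline; names are hypothetical.
from typing import Dict, List, Optional, Tuple

Point = Tuple[float, float]

class GestureEngine:
    """Turns raw user inputs into touched points and, later, a command."""
    def points_from_inputs(self, inputs: List[Point]) -> List[Point]:
        return list(inputs)  # trivially, each raw input is a touched point

    def command_for(self, inputs: List[Point], obj_id: str) -> str:
        # Crude stand-in: classify the gesture by touch count, then map it
        # to a command for the identified object.
        gesture = "drag" if len(inputs) == 1 else "rotate"
        return {"drag": "MOVE", "rotate": "ROTATE"}[gesture]

class ViewEngine:
    """Hit-tests touched points against objects and displays them."""
    def __init__(self, objects: Dict[str, Dict[str, float]]):
        self.objects = objects  # object id -> parameters (x, y, w, h, angle)

    def object_at(self, points: List[Point]) -> Optional[str]:
        for oid, p in self.objects.items():
            if any(p["x"] <= x <= p["x"] + p["w"] and
                   p["y"] <= y <= p["y"] + p["h"] for x, y in points):
                return oid
        return None

    def display(self, oid: str, params: Dict[str, float]) -> None:
        self.objects[oid] = params
        print(f"redraw {oid}: {params}")  # stand-in for driving the screen

class ObjectEngine:
    """Applies commands to the touched object to produce new parameters."""
    def __init__(self, gestures: GestureEngine, view: ViewEngine):
        self.gestures, self.view = gestures, view

    def on_touch(self, inputs: List[Point]) -> None:
        points = self.gestures.points_from_inputs(inputs)
        oid = self.view.object_at(points)       # identity of the object touched
        if oid is None:
            return                              # no object touched: nothing to do
        command = self.gestures.command_for(inputs, oid)  # query gesture engine
        params = dict(self.view.objects[oid])
        if command == "MOVE":
            params["x"], params["y"] = points[0]
        elif command == "ROTATE":
            params["angle"] = params.get("angle", 0.0) + 90.0
        self.view.display(oid, params)  # view engine redraws with new parameters
```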
In an example, HMI system 108 includes a processor configured to display machine data on a touch screen, and to receive user inputs through the touch screen. HMI system 108 interprets the user inputs and adjusts the parameters of machine systems 102, 104, and 106 accordingly.
Link 110 may use any of a variety of communication media, such as air, metal, optical fiber, or any other signal propagation path, including combinations thereof. Also, the link may use any of a variety of communication protocols, such as internet, telephony, optical networking, wireless communication, wireless fidelity, code division multiple access, worldwide interoperability for microwave access, or any other communication protocols and formats, including combinations thereof. Further, the link could be a direct link or it might include various intermediate components, systems, and networks.
The touch information is processed to identify which of the plurality of objects was touched (operation 204). If no object was touched, the method ends without processing the touch information to identify a gesture. In other examples, when no object is touched, the method continues and applies the touch information to all of the objects shown on the display system. Upon identifying the object that was touched, the touch information is processed to identify a gesture represented by the plurality of touches (operation 206).
Upon identifying the gesture, a command is determined based on the gesture (operation 208). New parameters for the object are generated based on the command (operation 210). At least the object that was touched is then displayed according to the new parameters for the object (operation 212). These operations may take place in HMI system 108 which also includes the display system. In other examples these operations may take place in a computer system coupled to the display system. In such an example, the computer system may be located apart from the display system and may be coupled to the display system through a network.
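For illustration, operation 204 and the two alternative no-object behaviors described above could be sketched as follows, assuming rectangular objects; hit_test and the broadcast flag are illustrative names only.

```python
# Illustrative hit-testing sketch; names and object shapes are hypothetical.
from typing import Dict, List, Tuple

Point = Tuple[float, float]
Params = Dict[str, float]

def hit_test(objects: Dict[str, Params], touches: List[Point]) -> List[str]:
    """Return ids of objects containing any touched point (operation 204)."""
    hits = []
    for oid, p in objects.items():
        if any(p["x"] <= x <= p["x"] + p["w"] and
               p["y"] <= y <= p["y"] + p["h"] for x, y in touches):
            hits.append(oid)
    return hits

def process_touches(objects: Dict[str, Params], touches: List[Point],
                    broadcast: bool = False) -> List[str]:
    hits = hit_test(objects, touches)
    if not hits:
        # Default behavior: end without identifying a gesture. With
        # broadcast=True, apply the touch information to every displayed
        # object instead, as in the alternative example above.
        return list(objects) if broadcast else []
    return hits[:1]  # gesture identification (operation 206) proceeds here
```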
Communication interface 301 comprises components that communicate over communication links, such as network cards, ports, RF transceivers, processing circuitry and software, or some other communication devices. Communication interface 301 may be configured to communicate over metallic, wireless, or optical links. Communication interface 301 may be configured to use TDM, IP, Ethernet, optical networking, wireless protocols, communication signaling, or some other communication format, including combinations thereof. In an example, communication interface 301 may be configured to communicate with a plurality of machine systems, such as machine systems 102, 104, and 106 illustrated in FIG. 1.
User interface 302 includes components that interact with a user. These components may include a keyboard, display system, mouse, touch pad, or some other user input/output apparatus. In this example, user interface 302 includes display system 312. In an example, display system 312 is a touch screen display configured to receive touch data and to display graphical data. Touch screens may detect touches through resistive or capacitive changes.
Processing circuitry 305 comprises a microprocessor and other circuitry that retrieves and executes operating software 307 from memory device 306. Memory device 306 comprises a disk drive, flash drive, data storage circuitry, or some other memory apparatus. Operating software 307 comprises computer programs, firmware, or some other form of machine-readable processing instructions. Operating software 307 may include an operating system, utilities, drivers, network interfaces, applications, or some other type of software. In this example, operating software 307 includes gesture engine 309, view engine 310, and object engine 311. In an example, gesture engine 309, view engine 310, and object engine 311 are configured to control computer system 300 to perform the operations illustrated in FIG. 2.
When executed by processing circuitry 305, operating software 307 directs processing system 303 to operate computer system 300 as described herein. In particular, operating software 307 directs processing system 303 to display a plurality of objects on display system 312. Each of the plurality of objects is displayed according to object parameters.
Processing system 303 receives touch information corresponding to a plurality of touches on display system 312. Processing system 303 processes the touch information to identify which of the plurality of objects was touched. If no object is touched, the method ends without the gesture being processed. Thus gestures that do not operate on any object are ignored. In other examples, when no object is touched, the input from the display system is applied to all of the objects shown on the display system.
Upon identifying the object that was touched, processing system 303 processes the touch information to identify a gesture represented by the plurality of touches. Upon identifying the gesture, processing system 303 determines a command represented by the gesture. Processing system 303 generates new parameters for the object based on the command. Processing system 303 then displays at least the object that was touched according to the new parameters for the object on display system 312.
Links 410, 412, 414, 416, and 418 may use any of a variety of communication media, such as air, metal, optical fiber, or any other signal propagation path, including combinations thereof. Also, the links may use any of a variety of communication protocols, such as internet, telephony, optical networking, wireless communication, wireless fidelity, code division multiple access, worldwide interoperability for microwave access, or any other communication protocols and formats, including combinations thereof. Further, the links could be direct links or they might include various intermediate components, systems, and networks.
In this example, display system 402 includes a touch screen. Display system 402 is configured to display a plurality of objects on the touch screen. Each of the plurality of objects is displayed according to object parameters. Gesture engine 404 receives user inputs from display system 402 through link 410. Gesture engine 404 is configured to determine a set of points touched by the user inputs. Gesture engine 404 transfers the set of points touched by the user inputs to view engine 406 through link 412.
View engine 406 is configured to process the set of points touched to determine an identity of an object touched from the plurality of objects. View engine 406 transfers the identity of the object touched, along with the points within the object that were touched, to object engine 408 through link 416. In some examples, view engine 406 includes a graphics driver configured to drive display system 402. Object engine 408 transfers the identity of the object touched to gesture engine 404 through link 418. In an alternate example, view engine 406 may transfer the identity of the object touched to gesture engine 404 through link 412. Gesture engine 404 determines a gesture based on the user inputs and the object touched. Gesture engine 404 determines a command based on the gesture. Gesture engine 404 transfers the command to object engine 408 through link 418.
Object engine 408 processes the command and the object touched to determine a new set of parameters for the object. Object engine 408 transfers the new set of parameters for the object to view engine 406 through link 416. View engine 406 then displays the object configured by the new set of parameters on display system 402.
View engine 406 is configured to process the set of points touched to determine an identity of an object touched from the plurality of objects (operation 506). If no object was touched, the method ends without processing the touch information to identify a gesture. In other examples, when no object is touched, the input from the display system is applied to all of the objects shown on the display system. View engine 406 transfers the identity of the object touched to object engine 408 through link 416 (operation 508). Object engine 408 transfers the identity of the object touched to gesture engine 404 through link 418 (operation 508). In an alternate example, view engine 406 may transfer the identity of the object touched to gesture engine 404 through link 412. Gesture engine 404 determines a command based on the user inputs and the object touched (operation 510). Gesture engine 404 transfers the command to object engine 408 through link 518 (operation 512).
Object engine 408 processes the command and the object touched to determine a new set of parameters for the object (operation 514). Object engine 408 transfers the new set of parameters for the object to view engine 406 through link 416. View engine 406 then displays the object configured by the new set of parameters on display system 402 (operation 516).
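By way of example, the object engine's computation of new parameters (operation 514) for a two-touch rotate or pinch gesture might resemble the following sketch; the angle and scale parameters and the rotate_scale_params helper are hypothetical names introduced here for illustration.

```python
# Illustrative parameter computation for a two-touch gesture; hypothetical names.
import math
from typing import Dict, Tuple

Point = Tuple[float, float]

def rotate_scale_params(params: Dict[str, float],
                        start: Tuple[Point, Point],
                        end: Tuple[Point, Point]) -> Dict[str, float]:
    """Derive new angle and scale from the start/end positions of two touches."""
    def angle_and_span(a: Point, b: Point) -> Tuple[float, float]:
        dx, dy = b[0] - a[0], b[1] - a[1]
        return math.atan2(dy, dx), math.hypot(dx, dy)

    a0, s0 = angle_and_span(*start)
    a1, s1 = angle_and_span(*end)
    new = dict(params)
    new["angle"] = params.get("angle", 0.0) + math.degrees(a1 - a0)
    new["scale"] = params.get("scale", 1.0) * (s1 / s0 if s0 else 1.0)
    return new

# Example: two fingers rotate a quarter turn while spreading apart.
p = rotate_scale_params({"angle": 0.0, "scale": 1.0},
                        start=((0, 0), (10, 0)), end=((0, 0), (0, 20)))
print(p)  # {'angle': 90.0, 'scale': 2.0}
```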
Note that in this example two gestures are used to modify object 608. These gestures may occur simultaneously on multiple display systems displaying the same object to multiple users, or sequentially on a single display system. Gestures may come from multiple users or a single user. In this example, three objects have been modified at the same time. In other examples, one or any number of objects may be modified at the same time according to one or more user gestures on the touch screen.
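One illustrative way to support concurrent gestures on several objects is to partition the touched points by the object that contains each point, as in the following sketch; all names are hypothetical.

```python
# Illustrative grouping of concurrent touches by target object; names are
# hypothetical stand-ins.
from collections import defaultdict
from typing import Dict, List, Tuple

Point = Tuple[float, float]
Params = Dict[str, float]

def group_touches_by_object(objects: Dict[str, Params],
                            touches: List[Point]) -> Dict[str, List[Point]]:
    """Assign each touched point to the object that contains it, so that
    several objects can be modified concurrently by independent gestures."""
    groups: Dict[str, List[Point]] = defaultdict(list)
    for x, y in touches:
        for oid, p in objects.items():
            if (p["x"] <= x <= p["x"] + p["w"] and
                    p["y"] <= y <= p["y"] + p["h"]):
                groups[oid].append((x, y))
                break  # each point belongs to at most one object here
    return dict(groups)
```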
Industrial automation environment 711 communicates with communication network 715 through aggregation server 710. Aggregation server 710 communicates with human-machine interface (HMI) systems 704 and 706 through enterprise network 705, and with HMI system 720 through network 715. Machine systems 701, 702, and 703 are coupled with HMI system 704, and machine systems 707, 708, and 709 are coupled with HMI system 706. In other examples, there may be any number of machine systems and HMI systems within industrial automation environment 711. In still other examples, the machine systems may be coupled directly with enterprise network 705 without passing through any HMI systems. Thus HMI system 720 has more direct access to the machine systems without having to go through any additional HMI systems.
HMI systems 704 and 706 receive machine data from the machine systems and create a graphical display representing their respective machine systems. This graphical display allows human operators to easily visualize the status of each machine system and to control the machine systems through the HMI systems. In many industrial automation environments, machine systems 701 through 709 may be coupled together in a variety of different configurations. These configurations may change as the industrial automation environment is modified for the production of different articles and as machine systems are updated, repaired, or replaced.
HMI systems 704 and 706 are configured to monitor machine systems 701 through 709 and to display machine status and controls as a set of objects on display systems within or attached to HMI systems 704 and 706. In some examples, the display systems are co-located with the machine systems. HMI system 720 is configured to display machine status and controls from any or all of machine systems 701 through 709. Aggregation server 710 is configured to generate a graphical representation of industrial automation environment 711 and to transmit the graphical representation to HMI system 720 through network 715. In some embodiments, aggregation server 710 may be located outside of industrial automation environment 711. For example, it may reside on a data server within network 715, or may be independent and communicate with industrial automation environment 711 through another communication network. HMI system 720 is configured to display the graphical representation to a user and to respond to user commands received at a user interface within HMI system 720. HMI system 720 may be configured to both receive data from industrial automation environment 711 and also to send commands and data to industrial automation environment 711 based upon gestures received at a display system within or coupled to HMI system 720.
Further, in some embodiments, security measures may limit the data received by HMI system 720 from industrial automation environment 711 according to a security level of HMI system 720 or a user of HMI system 720. Likewise, security measures may limit the data and commands allowed to be sent by HMI system 720 to industrial automation environment 711 according to a security level of HMI system 720 or a user of HMI system 720.
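As an illustration only, such security measures might screen data and commands against a per-user security level, as in the following sketch; the viewer, operator, and engineer levels are hypothetical, as this description does not define a particular scheme.

```python
# Illustrative security filtering; the level scheme here is hypothetical.
from typing import Dict, List

LEVELS = {"viewer": 1, "operator": 2, "engineer": 3}

def filter_machine_data(data: List[Dict], user_level: str) -> List[Dict]:
    """Drop data items above the user's security level before display."""
    lvl = LEVELS[user_level]
    return [d for d in data if d.get("min_level", 1) <= lvl]

def authorize_command(command: Dict, user_level: str) -> bool:
    """Allow a command to be sent only if the user's level is high enough."""
    return LEVELS[user_level] >= command.get("min_level", LEVELS["engineer"])
```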
Enterprise network 705 and network 715 may be any local or wide area network capable of transferring data from one computer system to another. For example, enterprise network 705 may be a local area network (LAN) with limited or no connections to machines outside of industrial automation environment 711, while network 715 may be the Internet with connections to machines and HMI systems throughout the world.
The above description and associated figures teach the best mode of the invention. The following claims specify the scope of the invention. Note that some aspects of the best mode may not fall within the scope of the invention as specified by the claims. Those skilled in the art will appreciate that the features described above can be combined in various ways to form multiple variations of the invention. As a result, the invention is not limited to the specific embodiments described above, but only by the following claims and their equivalents.