This application relates to automation engineering. More particularly, the application relates to cognitive engineering for automation systems.
Performing automation engineering tasks requires a high level of human technical and domain expertise due to the complexity and criticality of modern automation systems in industries such as manufacturing and assembly, chemical, pharmaceutical, food and beverage, paper, and electronics. The growing complexity of automated systems, the increasing requirements for high productivity and quality of the engineering tasks, and the increasing demands for safety and high availability of the automation make it very difficult for teams of human experts to write automation programs fast enough. Rotation of staff aggravates this problem for organizations. Improved systems and methods to address these challenges are desired.
According to some embodiments of this disclosure, a method for representing knowledge in a cognitive engineering system (CES) includes receiving information relating to an automation engineering project from an engineering tool; storing the received information in a cognitive engineering graph (CEG) comprising a plurality of nodes, each representative of an element of the automation engineering project, and at least one edge connecting two of the nodes, the at least one edge representative of a relationship between the connected nodes; storing a plurality of previously generated CEGs representative of other prior automation engineering projects; and establishing a communication path between the CEG storing the received information and the plurality of previously generated CEGs. In some embodiments, the method may further include applying machine learning to the stored CEG based on the received information and the stored plurality of previously generated CEGs. The machine learning may be used to analyze the CEG based on the received information to identify at least one pattern that is representative of a given object of interest from the automation engineering project. In some embodiments, the CES may automatically add an element to the CEG based on the received information and on a query from a user. According to an embodiment, the user may request that a change made by the CES be reversed. An undo action may be performed in which the system identifies any recent automatic changes and any associated dependencies and removes those changes, returning the system to the state it was in prior to the automatic changes being performed.
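The node/edge storage and the undo of automatic changes described above can be sketched as follows. This is a minimal in-memory illustration, not the disclosed implementation; a production CES would use a graph database, and the method names are assumptions for the example.

```python
# Minimal sketch of a cognitive engineering graph (CEG) with undo support
# for automatic changes. The in-memory dict/list store is an assumption;
# a real CES would persist the graph in a graph database.
class CEG:
    def __init__(self):
        self.nodes = {}      # node id -> properties
        self.edges = []      # (source, relationship, target) triples
        self._auto_log = []  # record of automatic changes, for undo

    def add_node(self, node_id, properties=None, automatic=False):
        self.nodes[node_id] = properties or {}
        if automatic:
            self._auto_log.append(("node", node_id))

    def add_edge(self, source, relationship, target, automatic=False):
        edge = (source, relationship, target)
        self.edges.append(edge)
        if automatic:
            self._auto_log.append(("edge", edge))

    def undo_automatic(self):
        """Remove recent automatic changes and their dependent edges,
        returning the graph to its state before those changes."""
        while self._auto_log:
            kind, item = self._auto_log.pop()
            if kind == "node":
                self.nodes.pop(item, None)
                # drop edges that depended on the removed node
                self.edges = [e for e in self.edges
                              if item not in (e[0], e[2])]
            elif kind == "edge" and item in self.edges:
                self.edges.remove(item)
```

Tracking dependencies in the log is what lets the undo remove not only an automatically added node but also the edges that were attached to it.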
The knowledge representation in the form of the CEG may include nodes that represent physical objects in the automation engineering project or an automation program for controlling a corresponding physical object in the automation engineering project. The CEG may include a representation of a human machine interface and/or a programmable logic controller. The CEG may be used to validate a design for the automation engineering project by comparing the CEG to a plurality of previously generated CEGs. In some embodiments, the generated CEG may be compared to previously generated CEGs to provide a suggested course of action to a user.
According to some embodiments of the present disclosure, a system for providing a knowledge representation in a cognitive engineering system (CES) includes a computer-based engineering tool for providing at least one of designing, programming, simulation, and testing of an automation system, and a cognitive system in communication with the computer-based engineering tool, the cognitive system comprising: a knowledge extraction module for identifying and storing information contained in a project of the computer-based engineering tool and from data received from a physical automation system; a machine learning module for analyzing knowledge extracted by the knowledge extraction module and identifying characteristics of the automation system; an inductive programming module for automatically generating control programs for the automation system based on the stored information from the knowledge extraction module; and a knowledge representation comprising a cognitive engineering graph (CEG), the CEG comprising a plurality of nodes, each representative of an element of the automation engineering project, and at least one edge connecting two of the nodes, the at least one edge representative of a relationship between the connected nodes.
The system may include a computer memory that stores a plurality of CEGs from previously designed projects in communication with the machine learning module for analyzing past knowledge. Based on analysis of the CEG of a current design in view of the previously designed projects, a feedback module may be provided to give information to the user. For example, a user may be provided with feedback relating to the validation of the project being designed. In other embodiments, the feedback module may provide a user with a recommended course of action based in part on the actions taken in previous projects.
A communications channel may be established between the knowledge extraction module and a physical automation system. The physical automation system generates data relating to the operating state of the automation system and provides the information in the data to the knowledge representation.
An automated reasoning module may be in communication with the engineering tool and the knowledge representation and may be configured to make certain design decisions including the automatic addition of a component to an engineering project in the engineering tool.
The foregoing and other aspects of the present invention are best understood from the following detailed description when read in connection with the accompanying drawings. For the purpose of illustrating the invention, there is shown in the drawings embodiments that are presently preferred, it being understood, however, that the invention is not limited to the specific instrumentalities disclosed. Included in the drawings are the following Figures:
Embodiments described in this disclosure address bringing artificial intelligence (AI) to automation engineering tools to assist humans in design while increasing their productivity and ensuring high quality. The result is automation programs that provide higher reliability, availability, and safety. Typically, if the complexity of an automation engineering task is significant, additional human engineers are added to the design team in the hope of providing solutions to the problem that are timely and of high quality. Unfortunately, adding more humans to the teams does not scale in practice due to the increased overhead in communication and management.
Cognitive engineering utilizes machine learning to allow the cognitive engineering system 110 to assist the engineer 101. Through machine learning, the CES 110 can recognize aspects of the physical automation system 120, along with past engineering actions previously taken, and use this knowledge to assist the engineer 101 during design. For example, the CES 110 may observe an action the engineer 101 performs in the engineering tool 111 and associate the action with an action previously taken by the same or another engineer. Based on this recognition, the CES 110 may suggest to the engineer 101 that some or all of the project could be preloaded by the CES 110 into the workspace of the engineering tool 111. In some embodiments, the CES may make suggestions to the engineer 101 regarding future design actions. For example, the CES 110 may observe the engineer 101 adding a system component to the engineering tool. The CES 110 may compare the action to actions taken by other engineers in the past who were working on the same problem or in a similar system. If the engineer's action is not aligned with actions the CES 110 has seen as typical in the past, the CES 110 may suggest to the engineer 101 that a different course of action be considered. The engineer has the option of accepting the CES suggestion or continuing with the original action.
The cognitive system 113 leverages captured knowledge to assist the engineer 101. The cognitive system 113 functions to extract knowledge 119 from the system via system input 127 and provide a stored representation of the collected knowledge 117. Machine learning 114 is performed on the stored extracted knowledge to identify and exploit relationships in the data. In some embodiments the knowledge may be used to provide inductive programming 112 for components or control of the system. Automated reasoning 115 is applied to the knowledge representation 117 and may be used to analyze the knowledge representation 117 and provide feedback to the engineer 101 based on the analysis.
With reference to
The representation of knowledge has long been a topic of research, with early attempts focusing on maintaining knowledge in a standard or specialized database, such as a relational database using database languages such as SQL. These databases store information in the form of rules and facts. In later research, the use of knowledge graphs has become popular for representing and analyzing linked information from communication networks, social networks, logistics, marketing systems, and geographical information.
This disclosure presents a Cognitive Engineering Graph (CEG) as the main building block of a Cognitive Engineering System (CES), responsible for the knowledge representation 117 of the system. The CEG represents all relevant information about the engineering process in the form of a graph, in accordance with the techniques used by today's graph databases (e.g., Neo4j, BlazeGraph).
The CEG is created and subsequently updated with information obtained from engineering tools such as TIA Portal. Other related engineering tools such as computer assisted design (CAD)/computer aided engineering (CAE)/computer assisted manufacturing (CAM) may also provide information to the CEG. The information is analyzed and represented in the CEG in the form of nodes, edges and properties. The nodes represent the objects and data elements, while the edges connect two or more nodes and represent relationships between the connected nodes. Both nodes and edges may have properties that describe a particular instance in detail. The CEG may be displayed to the engineer at different levels of detail. The following examples will be explained with reference to an engineering design depicted in a view of an engineering tool as illustrated in
Navigation buttons 211 provide additional functionality to assist in operation of the system. Objects that make up physical components of the automation system may be displayed in a workspace region of the display 210. Objects may include an industrial robot 201, a conveyor 203, a light tower 205 and an object of manufacture 209. Light tower 205 provides an indication of the current operating status of the system. For example, if the green portion of the light tower is illuminated, it may indicate that the system is currently actively operating. The light tower alerts a user or bystander to the fact that the robot 201 or the conveyor 203 may be in motion and present a danger to health and safety. Conveyor 203 is used to transport the object of manufacture 209 throughout the plant. For example, conveyor 203 may transport the object of manufacture 209 to a workstation “manned” by industrial robot 201. Industrial robot 201 may then perform manufacturing actions on the object of manufacture 209. When the actions of the industrial robot 201 are complete, conveyor 203 may transport the object of manufacture 209 to another workstation for further processing, or if manufacturing is complete, may transport the object of manufacture 209 for final inspection or shipping.
The HMI 310 provides a root screen 330 that is displayed to the user. Objects displayed on the screen are elements of the root screen 330. For example, the robot 201 display shown in
PLC 320 provides blocks for monitoring data and providing control to the automation system. There is a first function block (FB) 321 containing the logic required to control the robot 201 and a second FB 323 for controlling the conveyor. Data blocks (DB) 325, 327 store data relating to the robot and the conveyor, respectively. The PLC 320 uses labels that identify the functions and properties of the components of the automation system. The labels or tags are stored in a tag table 340. The tags aid the PLC 320 in interacting with the display of the HMI 310 through an organization block (OB) 329. Via the OB 329, the values of the light tower lights can be controlled and displayed to the user. Using tags 345a for the red light, 345b for the yellow light and 345c for the green light, the OB 329 sets the value at each tag. For example, the value may be a binary value indicating whether the associated light is illuminated or dim. Using the tags 345a, 345b, 345c, the color of the associated rectangle 335a, 335b, 335c may be set to indicate each light's status. To display the light status to the user, for example, a darker shade may be used to indicate when the light is not illuminated, while a lighter, more vibrant shade of color may be used to indicate a light that is illuminated.
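The mapping from binary tag values to display shades described above can be sketched as follows. The tag names and hex color values are illustrative assumptions, not values from an actual PLC project or HMI configuration.

```python
# Hedged sketch of how OB tag values could drive HMI light-tower colors.
# For each light, the first shade is the dim (not illuminated) color and
# the second is the vibrant (illuminated) color; values are assumptions.
LIGHT_COLORS = {
    "red": ("#400000", "#ff0000"),
    "yellow": ("#404000", "#ffff00"),
    "green": ("#004000", "#00ff00"),
}

def render_light_tower(tags):
    """Map binary tag values (0 = dim, 1 = illuminated) to display shades
    for the rectangles representing each light on the HMI screen."""
    return {light: LIGHT_COLORS[light][value]
            for light, value in tags.items()}
```

For example, a running system with only the green light on would render the green rectangle in its vibrant shade and the red and yellow rectangles in their darker shades.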
The structure of the CEG is constructed to include knowledge based on historical data from projects that have been previously designed and analyzed by the CEG. The past knowledge obtained by the CEG may be analyzed to discover useful patterns. For example, through logic-based pattern-matching or statistical machine learning, patterns may be identified that identify certain objects or processes that may be of use in the future. When a pattern is identified, either in a supervised or unsupervised manner, it can be applied to new contexts for various purposes including:
An example will now be illustrated where the discovery and use of a light tower such as the light tower 205 in
It should be noted that other formats or query languages may be used. The CES identifies the pattern associated with the user's request and searches for the light tower pattern in the current CEG to find a match. In this example, a match is found. The elements of this match are shown in
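The search for a light tower pattern in the CEG can be sketched as a simple structural match over edge triples. This is an illustrative simplification, not the disclosed matching algorithm; the relationship name `HAS_LIGHT` and the edge-triple encoding are assumptions for the example.

```python
# Illustrative pattern matcher over a CEG stored as (source, relationship,
# target) triples: find candidate nodes with at least `min_count` outgoing
# edges of a given relationship, e.g. a light tower connected to its lights.
def find_pattern(edges, relationship, min_count):
    counts = {}
    for source, rel, _target in edges:
        if rel == relationship:
            counts[source] = counts.get(source, 0) + 1
    return [node for node, n in counts.items() if n >= min_count]
```

A real CES would express this as a query against the graph database, but the principle is the same: the pattern constrains the relationships a matching subgraph must contain.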
The CES analyzes the CEG and infers properties about the high-level concept of the light tower. These properties are presented to the engineer, and he/she can request the CES to modify them. The CES can determine the steps to carry out these modifications and can also warn the engineer about possible problems that can arise. For the light tower, the CES determines the number and colors of the lights in the light tower as a property: red, yellow and green. This property may be shown to the user in a simplified form as illustrated in
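The property inference described above, determining the number and colors of the lights, can be sketched over the same edge-triple encoding. The relationship name is again an assumption for illustration.

```python
# Sketch of inferring high-level properties of a matched light tower from
# the CEG: count the lights attached to the tower node and collect their
# colors. The HAS_LIGHT relationship name is an assumption.
def infer_light_tower_properties(edges, tower):
    colors = [target for source, rel, target in edges
              if source == tower and rel == "HAS_LIGHT"]
    return {"light_count": len(colors), "colors": colors}
```

The inferred property set is what the CES can then present to the engineer in simplified form for review or modification.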
The pattern for transforming the CEG to add the new light may alternatively be expressed as a query for the graph database, this time with instructions that match the context where the transformation will occur. The instructions that create new nodes and edges in the graph may be expressed as follows:
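One way such a transformation could look, sketched in Python over an in-memory node/edge store rather than an actual graph-database query: the node identifier scheme and the relationship names `HAS_LIGHT` and `CONTROLLED_BY` are hypothetical.

```python
# Illustrative CEG transformation: create a new light node, bind it to its
# light tower, and wire it to the PLC tag that controls it. Node id scheme
# and relationship names are assumptions for the example.
def add_light(nodes, edges, tower, color, tag):
    light_id = f"{tower}_light_{color}"
    nodes[light_id] = {"color": color}               # new node with property
    edges.append((tower, "HAS_LIGHT", light_id))     # attach to the tower
    edges.append((light_id, "CONTROLLED_BY", tag))   # wire the PLC tag
    return light_id
```

In a graph database the same transformation would be a single query that first matches the tower context and then creates the new node and its two edges.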
Referring now to
In addition to the general information stored in the CEG regarding the various components of the system, the CES may require additional information in order to make reasoning decisions about the automation problems to be solved by the system being designed. In order to allow comprehensive reasoning regarding the problems faced, the system needs an understanding of the physical world and not just the general terms of the task to be solved. A typical automation program may include input modules for various sensors, output modules for various actuators and if used, one or more drive modules to control electric motors. However, this approach is missing a connection of these modules to the actual physical devices that are the subject of the automation program. Even if these connections were included in some textual form in the comments of the program, this would not provide a reliable source of information. According to some embodiments of the present disclosure, the CEG is developed to accommodate the representation of objects with varying levels of detail.
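The idea of representing an object at increasing levels of detail can be sketched as a node that starts as a bare placeholder and accumulates properties over its lifecycle. The class and method names are illustrative assumptions.

```python
# Sketch of incremental detail in a CEG node: unlike objects that must be
# fully defined at creation, the node begins as a placeholder and gains
# properties as more is learned about the physical device it represents.
class CEGNode:
    def __init__(self, node_id):
        self.node_id = node_id
        self.properties = {}

    def refine(self, **props):
        """Add detail to the node without recreating it."""
        self.properties.update(props)
        return self
```

For example, a node created as a generic "motor" placeholder during early design could later be refined with a drive type and power rating once those details are known.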
The varying detail level of the CEG can be achieved through the following elements:
Some embodiments may be realized in the form of a graph database and accompanying software interfaces, which implement the knowledge representation functionality of the CEG while adhering to the principles above. While object-oriented databases and programming languages exist, they follow a basic design principle by which all objects are fully defined at their creation, meaning that they lack the flexibility to represent objects with increasing levels of detail throughout the object's lifecycle.
A historical record of previously designed projects may be stored in the form of a CEG. Multiple instances of previously generated CEGs 1209 may be stored. A communications link 1211 is established between the previously stored CEGs 1209 and the current CEG 1207 generated from the information 1205. The information in the current CEG 1207 and the previously generated CEGs 1209 is included in the knowledge representation 1213 of the system. The knowledge representation contains stored knowledge gained from the experience of designers of varying experience and skill levels through the design of the current project and of previously designed projects.
Machine learning 1215 may be applied to the knowledge representation to determine design choices and practices that have been determined to be successful, or conversely, design choices and practices that were determined to be unsuccessful. Machine learning 1215 may use the knowledge representation 1213 to make recommendations to a user via engineering tool 1201. Additionally, machine learning 1215 may examine an engineering project in the engineering tool 1201 and validate 1217 the design based on prior knowledge contained in the knowledge representation 1213.
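One simple way the validation step could compare a current design against prior projects is a structural-similarity score over the stored CEGs. This is an illustrative sketch using Jaccard similarity over edge sets, not the disclosed validation method.

```python
# Hedged sketch of design validation against historical CEGs: score the
# current design by Jaccard similarity of its edge set with each previously
# stored project. A low best score could flag an unusual design for review.
def validate_against_history(current_edges, past_projects):
    current = set(current_edges)
    scores = {}
    for name, edges in past_projects.items():
        past = set(edges)
        union = current | past
        scores[name] = len(current & past) / len(union) if union else 1.0
    return scores
```

A real system would likely weight structurally important relationships more heavily than incidental ones, but the principle of scoring a current CEG against the historical record is the same.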
As shown in
The processors 1020 may include one or more central processing units (CPUs), graphical processing units (GPUs), or any other processor known in the art. More generally, a processor as used herein is a device for executing machine-readable instructions stored on a computer readable medium, for performing tasks and may comprise any one or combination of, hardware and firmware. A processor may also comprise memory storing machine-readable instructions executable for performing tasks. A processor acts upon information by manipulating, analyzing, modifying, converting or transmitting information for use by an executable procedure or an information device, and/or by routing the information to an output device. A processor may use or comprise the capabilities of a computer, controller or microprocessor, for example, and be conditioned using executable instructions to perform special purpose functions not performed by a general-purpose computer. A processor may be coupled (electrically and/or as comprising executable components) with any other processor enabling interaction and/or communication there-between. A user interface processor or generator is a known element comprising electronic circuitry or software or a combination of both for generating display images or portions thereof. A user interface comprises one or more display images enabling user interaction with a processor or other device.
Continuing with reference to
The computer system 1010 also includes a disk controller 1040 coupled to the system bus 1021 to control one or more storage devices for storing information and instructions, such as a magnetic hard disk 1041 and a removable media drive 1042 (e.g., floppy disk drive, compact disc drive, tape drive, and/or solid state drive). Storage devices may be added to the computer system 1010 using an appropriate device interface (e.g., a small computer system interface (SCSI), integrated device electronics (IDE), Universal Serial Bus (USB), or FireWire).
The computer system 1010 may also include a display controller 1065 coupled to the system bus 1021 to control a display or monitor 1066, such as a cathode ray tube (CRT) or liquid crystal display (LCD), for displaying information to a computer user. The computer system includes an input interface 1060 and one or more input devices, such as a keyboard 1062 and a pointing device 1061, for interacting with a computer user and providing information to the processors 1020. The pointing device 1061, for example, may be a mouse, a light pen, a trackball, or a pointing stick for communicating direction information and command selections to the processors 1020 and for controlling cursor movement on the display 1066. The display 1066 may provide a touch screen interface which allows input to supplement or replace the communication of direction information and command selections by the pointing device 1061. In some embodiments, an augmented reality device 1067 that is wearable by a user, may provide input/output functionality allowing a user to interact with both a physical and virtual world. The augmented reality device 1067 is in communication with the display controller 1065 and the user input interface 1060 allowing a user to interact with virtual items generated in the augmented reality device 1067 by the display controller 1065. The user may also provide gestures that are detected by the augmented reality device 1067 and transmitted to the user input interface 1060 as input signals.
The computer system 1010 may perform a portion or all of the processing steps of embodiments of the invention in response to the processors 1020 executing one or more sequences of one or more instructions contained in a memory, such as the system memory 1030. Such instructions may be read into the system memory 1030 from another computer readable medium, such as a magnetic hard disk 1041 or a removable media drive 1042. The magnetic hard disk 1041 may contain one or more datastores and data files used by embodiments of the present invention. Datastore contents and data files may be encrypted to improve security. The processors 1020 may also be employed in a multi-processing arrangement to execute the one or more sequences of instructions contained in system memory 1030. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions. Thus, embodiments are not limited to any specific combination of hardware circuitry and software.
As stated above, the computer system 1010 may include at least one computer readable medium or memory for holding instructions programmed according to embodiments of the invention and for containing data structures, tables, records, or other data described herein. The term “computer readable medium” as used herein refers to any medium that participates in providing instructions to the processors 1020 for execution. A computer readable medium may take many forms including, but not limited to, non-transitory, non-volatile media, volatile media, and transmission media. Non-limiting examples of non-volatile media include optical disks, solid state drives, magnetic disks, and magneto-optical disks, such as magnetic hard disk 1041 or removable media drive 1042. Non-limiting examples of volatile media include dynamic memory, such as system memory 1030. Non-limiting examples of transmission media include coaxial cables, copper wire, and fiber optics, including the wires that make up the system bus 1021. Transmission media may also take the form of acoustic or light waves, such as those generated during radio wave and infrared data communications.
The computing environment 1000 may further include the computer system 1010 operating in a networked environment using logical connections to one or more remote computers, such as remote computing device 1080. Remote computing device 1080 may be a personal computer (laptop or desktop), a mobile device, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to computer system 1010. When used in a networking environment, computer system 1010 may include modem 1072 for establishing communications over a network 1071, such as the Internet. Modem 1072 may be connected to system bus 1021 via user network interface 1070, or via another appropriate mechanism.
Network 1071 may be any network or system generally known in the art, including the Internet, an intranet, a local area network (LAN), a wide area network (WAN), a metropolitan area network (MAN), a direct connection or series of connections, a cellular telephone network, or any other network or medium capable of facilitating communication between computer system 1010 and other computers (e.g., remote computing device 1080). The network 1071 may be wired, wireless or a combination thereof. Wired connections may be implemented using Ethernet, Universal Serial Bus (USB), RJ-6, or any other wired connection generally known in the art. Wireless connections may be implemented using Wi-Fi, WiMAX, and Bluetooth, infrared, cellular networks, satellite or any other wireless connection methodology generally known in the art. Additionally, several networks may work alone or in communication with each other to facilitate communication in the network 1071.
An executable application, as used herein, comprises code or machine-readable instructions for conditioning the processor to implement predetermined functions, such as those of an operating system, a context data acquisition system or other information processing system, for example, in response to user command or input. An executable procedure is a segment of code or machine-readable instruction, sub-routine, or other distinct section of code or portion of an executable application for performing one or more particular processes. These processes may include receiving input data and/or parameters, performing operations on received input data and/or performing functions in response to received input parameters, and providing resulting output data and/or parameters.
A graphical user interface (GUI), as used herein, comprises one or more display images, generated by a display processor and enabling user interaction with a processor or other device and associated data acquisition and processing functions. The GUI also includes an executable procedure or executable application. The executable procedure or executable application conditions the display processor to generate signals representing the GUI display images. These signals are supplied to a display device which displays the image for viewing by the user. The processor, under control of an executable procedure or executable application, manipulates the GUI display images in response to signals received from the input devices. In this way, the user may interact with the display image using the input devices, enabling user interaction with the processor or other device.
The functions and process steps herein may be performed automatically or wholly or partially in response to user command. An activity (including a step) performed automatically is performed in response to one or more executable instructions or device operation without user direct initiation of the activity.
The system and processes of the figures are not exclusive. Other systems, processes and menus may be derived in accordance with the principles of the invention to accomplish the same objectives. Although this invention has been described with reference to particular embodiments, it is to be understood that the embodiments and variations shown and described herein are for illustration purposes only. Modifications to the current design may be implemented by those skilled in the art, without departing from the scope of the invention. As described herein, the various systems, subsystems, agents, managers and processes can be implemented using hardware components, software components, and/or combinations thereof.
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/US2019/066138 | 12/13/2019 | WO |