Processing plants and industrial facilities, such as refineries, power plants, and manufacturing sites, continuously undergo modifications to their equipment. These modifications may be the result of maintenance and repair operations, equipment upgrades, and/or other activities. Such modifications are typically executed through a management of change (MOC) process that involves updating the engineering drawings on file (e.g., piping and instrumentation diagrams (P&IDs)) that represent the facility. However, in many instances, although facilities may have an established MOC process, engineering drawings do not reflect actual field installations.
Generally, it is important to ensure that engineering drawings accurately represent a facility, as the use of incorrect or out-of-date engineering drawings may carry financial and safety implications. For example, operational decisions based on inaccurate engineering drawings may result in serious injury to an operator of the wrongly depicted facility. Typically, to ensure that engineering drawings accurately represent a facility, a physical audit or field survey is conducted to compare the engineering drawings on file to the equipment and installations of the facility. Upon comparison, discrepancies between the equipment and installations depicted in the engineering drawings and those in the field can be identified and annotated on the engineering drawings. Subsequently, the engineering drawings may be updated according to the annotations and uploaded (or re-uploaded) to a drawing management system and/or archive.
The process of conducting a field survey, identifying discrepancies between engineering drawings and a facility, and manually modifying and correcting engineering drawings is time-consuming and laborious.
This summary is provided to introduce a selection of concepts that are further described below in the detailed description. This summary is not intended to identify key or essential features of the claimed subject matter, nor is it intended to be used as an aid in limiting the scope of the claimed subject matter.
Embodiments disclosed herein generally relate to a method for automatically and autonomously comparing the equipment and structure of a facility with one or more drawings representing the facility and, when required, updating the drawings. The method includes obtaining a drawing from a drawing management system representing, at least, a portion of a facility and dispatching an autonomous vehicle (AV) to a location of the facility corresponding to the drawing. The method further includes collecting one or more visual images of the facility using the AV once it has reached the location and identifying discrepancies between the facility and the drawing using the one or more visual images and the drawing. The method further includes generating an updated drawing that corrects the identified discrepancies and accurately represents the facility and replacing the drawing with the updated drawing in the drawing management system upon review and approval by a user.
Embodiments disclosed herein generally relate to a system that includes a drawing management system storing a drawing representing, at least, a portion of a facility, an autonomous vehicle system (AVS) configured to dispatch an autonomous vehicle (AV) to a desired location, wherein the AV is navigated without human interaction and is configured to acquire one or more visual images upon arriving at the desired location, and a computer communicably connected to the AVS. The computer includes one or more computer processors and a non-transitory computer readable medium storing instructions executable by a computer processor. The instructions include functionality for: obtaining the drawing from the drawing management system; transmitting a signal to the AVS to dispatch the AV to the desired location corresponding to the drawing; receiving the one or more visual images; identifying discrepancies between the facility and the drawing using the one or more visual images and the drawing; generating an updated drawing that corrects the identified discrepancies and accurately represents the facility; and replacing the drawing with the updated drawing in the drawing management system upon review and approval by a user.
Embodiments disclosed herein generally relate to a non-transitory computer-readable memory that includes computer-executable instructions stored thereon that, when executed on a processor, cause the processor to perform the following steps. The steps include obtaining a drawing from a drawing management system representing, at least, a portion of a facility and dispatching an autonomous vehicle (AV) to a location of the facility corresponding to the drawing. The steps further include collecting one or more visual images of the facility using the AV once it has reached the location, identifying discrepancies between the facility and the drawing using the one or more visual images and the drawing, and generating an updated drawing that corrects the identified discrepancies and accurately represents the facility. The steps further include replacing the drawing with the updated drawing in the drawing management system upon review and approval by a user.
Other aspects and advantages of the claimed subject matter will be apparent from the following description and the appended claims.
In the following detailed description of embodiments of the disclosure, numerous specific details are set forth in order to provide a more thorough understanding of the disclosure. However, it will be apparent to one of ordinary skill in the art that the disclosure may be practiced without these specific details. In other instances, well-known features have not been described in detail to avoid unnecessarily complicating the description.
Throughout the application, ordinal numbers (e.g., first, second, third, etc.) may be used as an adjective for an element (i.e., any noun in the application). The use of ordinal numbers is not to imply or create any particular ordering of the elements nor to limit any element to being only a single element unless expressly disclosed, such as using the terms “before,” “after,” “single,” and other such terminology. Rather, the use of ordinal numbers is to distinguish between the elements. By way of an example, a first element is distinct from a second element, and the first element may encompass more than one element and succeed (or precede) the second element in an ordering of elements.
It is to be understood that the singular forms “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise. Thus, for example, reference to “engineering drawing” includes reference to one or more of such engineering drawings.
Terms such as “approximately,” “substantially,” etc., mean that the recited characteristic, parameter, or value need not be achieved exactly, but that deviations or variations, including for example, tolerances, measurement error, measurement accuracy limitations and other factors known to those of skill in the art, may occur in amounts that do not preclude the effect the characteristic was intended to provide.
In one aspect, embodiments disclosed herein relate to an autonomous drawing verification and revision system that can receive or acquire one or more engineering drawings depicting a facility (such as an industrial facility and/or processing plant) and receive autonomous vehicle data corresponding to the facility, such as visual images, from one or more autonomous vehicles (AVs). The autonomous drawing verification and revision system may further automatically compare the autonomous vehicle data and engineering drawings to identify discrepancies and generate engineering drawings accurate to the facility (i.e., updated drawings). In one or more embodiments, the autonomous drawing verification and revision system can dispatch one or more AVs (e.g., drones) to a location in or near a facility to acquire autonomous vehicle data (AV data) and subsequently generate a field representation of the facility from the AV data. In one or more embodiments, the autonomous drawing verification and revision system further notifies a user of a needed or proposed change to an engineering drawing such that the engineering drawing accurately reflects the facility (or portion of the facility that the engineering drawing depicts). In one or more embodiments, the autonomous drawing verification and revision system generates an inventory of equipment contained by the facility and can determine a condition of the equipment.
In general, the autonomous drawing verification and revision system disclosed herein can be used with any industrial plant or processing facility such as, for example, gas processing plants, power plants, manufacturing facilities, transportation systems (e.g., pipelines), and industrial facilities. For simplicity, the term “facility” will be adopted herein to generalize to any type of plant or industrial facility. As stated, the autonomous drawing verification and revision system can receive, update, and in some instances, generate one or more engineering drawings. In general, engineering drawings may refer to 3D representations of one or more objects (e.g., equipment of a facility), for example, in the form of so-called “solid models,” 2D representations of a facility (or a portion of a facility) such as a piping and instrumentation diagram (P&ID), or other graphical representations of an object or facility. Again, for simplicity, the term “drawing” or “drawings” will be adopted herein to generalize to any type of engineering drawing.
Turning to FIG. 1, FIG. 1 depicts an example gas processing plant (100) in accordance with one or more embodiments.
In the context of the field of oil and gas, in general, a production fluid that may contain oil, gas, water (or brine), and solid particulates is separated into its constituents and further refined. The gas processing plant (100) depicted in FIG. 1 is an example of a facility that performs such a separation and refinement on an incoming contaminated fluid (102).
As shown in FIG. 1, the gas processing plant (100) processes the incoming contaminated fluid (102) using a series of interconnected equipment items.
In the example gas processing plant (100) of FIG. 1, the incoming contaminated fluid (102) is first processed by a knock-out drum (104). A knock-out drum (104) separates entrained liquids and solid particulates from the bulk gas, and these unwanted portions of the incoming contaminated fluid (102) exit the knock-out drum (104) through an exit (103).
From the knock-out drum (104), the bulk gas is processed by a filter separator (106). A filter separator (106) removes impurities such as mineral precipitates (e.g., pipe scale), water, liquid hydrocarbons, and iron sulfide from the fluid. A filter separator (106) uses filter elements, such as a replaceable sock or a coalescing filter, rather than mechanical components to separate out contaminants. Typically, a filter separator (106) may be composed of one or two stages and may operate at high or low pressure. Again, the unwanted portions of the incoming contaminated fluid (102) exit the filter separator (106) through an exit (103).
After the filter separator (106), the incoming contaminated fluid (102) has been reduced to a gaseous stream. The gaseous stream undergoes another purifying process through an amine contactor (108). An amine contactor (108) absorbs carbon dioxide (CO2) and/or hydrogen sulfide (H2S) contaminants from the gaseous stream. In general, an amine contactor (108) receives the partially processed incoming contaminated fluid (102), or gaseous stream, and a “lean” amine liquid. Common amines are diethanolamine (DEA), monoethanolamine (MEA), methyldiethanolamine (MDEA), diisopropanolamine (DIPA), and aminoethoxyethanol (Diglycolamine) (DGA). The contact between the gaseous stream and the lean amine liquid drives the absorption of CO2 and/or H2S into the amine liquid from the gaseous stream. As a result, decontaminated gas (109), also known as “sweetened gas,” may exit the amine contactor (108). The decontaminated gas (109) should be checked to make sure it meets specifications. If the decontaminated gas (109) does not meet specifications, this indicates that control parameters within the gas processing plant (100) require adjustment. The processes of the knock-out drum (104), filter separator (106), and amine contactor (108) effectively transform the incoming contaminated fluid (102) to a decontaminated gas (109) and complete the objective of the example gas processing plant (100) shown in FIG. 1.
As shown in FIG. 1, the amine liquid leaving the amine contactor (108) is contaminated with the absorbed CO2 and/or H2S and must be regenerated before it can be reused.
The remaining liquid contaminated amines enter a heat exchanger (112). The heat exchanger (112) recovers heat from the decontaminated amine leaving the amine stripper (114), which is described below. Consequently, the heat exchanger (112) heats the contaminated amine before the contaminated amine enters the amine stripper (114).
The amine stripper (114) serves to remove the absorbed contaminants, such as H2S and CO2, from the amine solution so that it can be used again in the amine contactor (108). The amine stripper (114) is equipped with a reboiler (116). The amine stripper (114) contains a tray column consisting of a stripping section and a water wash section at the top. The reboiler (116) takes the amine solution located at the bottom of the amine stripper (114) and partially boils it. Steam (hot, gaseous water) is typically used as the heat source in the reboiler (116). Steam sourced from the reboiler (116) flows up the column in the amine stripper (114) and contacts the contaminated amine solution flowing down within the column. As the contaminated amine contacts the steam, it is heated and the contaminants are stripped out of the rich amine solution and flow to the stripping section of the column.
The stripped gases, commonly referred to as amine acid gas, leave the amine stripper (114) through a stripped gas exit (115). The stripped gases undergo further processing, such as condensing out the water and passing the remaining acid gases to a sulfur recovery process, but these processes are not shown in FIG. 1.
The decontaminated amine solution, leaving the bottom of the amine stripper (114), contains very low quantities of acid gas (such as H2S). This decontaminated amine solution may be recycled in a lean amine storage tank (not shown) and/or returned to the amine contactor (108). As shown in FIG. 1, the decontaminated amine passes back through the heat exchanger (112), transferring its heat to the incoming contaminated amine.
The transport of the various fluids of the gas processing plant of FIG. 1 is accomplished through a network of pipes, valves, and pumps connecting the equipment items described above.
As noted above, it is emphasized that a gas processing facility (100) may implement different processes and mechanisms for achieving adequate gas processing. Some processes may include compression, stabilization, and dehydration. The gas processing plant (100) may also encompass the treatment of removed water for disposal through processes such as filtration and deionization. Additionally, elements for heating may be provided to prevent the formation of hydrates, while elements for cooling may mitigate corrosion and aid in dehydration. With respect to decontaminating the incoming contaminated fluid (102), other chemical and physical washes may be used without departing from the scope of this disclosure.
As shown in FIG. 1, the processes and equipment of the gas processing plant (100) are monitored and controlled by one or more controllers (130).
The one or more controllers (130) may herein be referred to as “controllers” or “controller” where appropriate. Controllers (130) may be distributed, local to the processes and associated device, global, connected, etc. Controllers (130) may include or consist of a programmable logic controller (PLC), a distributed control system (DCS), a supervisory control and data acquisition (SCADA), and/or a remote terminal unit (RTU). For example, a programmable logic controller (PLC) may control valve states, fluid levels, pipe pressures, warning alarms, and/or pressure releases throughout a gas processing plant (100). In particular, a programmable logic controller (PLC) may be a ruggedized computer system with functionality to withstand vibrations, extreme temperatures, wet conditions, and/or dusty conditions, for example, around a refinery. A distributed control system may be a computer system for managing various processes at a gas processing plant (100) using multiple control loops. As such, a distributed control system may include various autonomous controllers (130) (such as remote terminal units) positioned at different locations throughout the facility to manage operations and monitor processes. Likewise, a distributed control system may include no single centralized computer for managing control loops and other operations. On the other hand, a SCADA system may include a control system that includes functionality for enabling monitoring and issuing of process commands through local control at a gas processing facility (100) as well as remote control outside the facility. With respect to an RTU, an RTU may include hardware and/or software, such as a microprocessor, that connects sensors and/or actuators using network connections to perform various processes in the automation system. Likewise, a control system may be coupled to one or more gas processing plant (100) devices.
In general, facilities such as the example gas processing plant (100) of FIG. 1 are documented by one or more drawings on file, such as P&IDs. Ideally, the drawings on file accurately represent the equipment and installations of the facility; in practice, however, they frequently do not.
There are several reasons why one or more drawings may not accurately represent a facility. In some instances, equipment of a facility may be modified in a manner different than that prescribed in a set of design or modification drawings. For example, an installation or repair crew may encounter previously unknown constraints when implementing a modification and may alter an aspect of the modification on-the-fly in view of the newly recognized constraints. In such a case, the drawings on file will not accurately represent the facility as modified. In other instances, a modification may be implemented outside of an established MOC process such that an update to the drawings to reflect the modification is never triggered. In other instances, drawings may not accurately represent a facility because the drawings themselves were incorrectly produced or erroneously updated after a modification to the facility. Similarly, incorrect drawings on file may be the result of human error, for example, through a mistake made while converting a physical drawing (e.g., paper) or markup to a digital drawing (e.g., using a computer aided design and drafting (CADD) software).
Regardless of the reason or origin of drawings that do not accurately represent a facility, inaccurate drawings may result in financial costs and safety hazards. Further, discrepancies in drawings may cause issues for various aspects of managing a facility, including engineering, operations, and maintenance. For example, inaccurate drawings cause delays in the normal operation of a facility, with an associated financial cost, because upon identifying a discrepancy between the drawings and the facility, it must first be determined whether the drawing or the equipment item(s) require correction, and the correction must then be implemented. Such a correction may require rework, redesign, and in some cases reimplementation, which can be costly. Another issue related to inaccurate drawings is safety: decisions made based on inaccurate drawings can lead to serious safety hazards and injuries to operators of the facility (including loss of life).
Conventionally, to ensure that drawings accurately represent a facility, a physical audit or field survey is conducted to compare the drawings on file to the equipment and installations of the facility. Upon comparison, discrepancies between the equipment and installations depicted in the drawings and those in the field can be identified and annotated on the engineering drawings. Subsequently, the engineering drawings may be updated according to the annotations and uploaded (or re-uploaded) to a drawing management system and/or archive.
Conventionally, the process of auditing and updating drawings is performed manually. This manual process begins by conducting an in-person field survey. Typically, the in-person field survey consists of an engineer or auditor acquiring a physical copy of the drawings for a facility (or the portion of a facility that recently underwent a modification) and travelling to the facility to visually inspect the facility and compare it to the drawings in-hand. For example, the drawings may be obtained from a drawing management system, or other database, and processed with a printer to form a physical copy. If, while visually inspecting the facility, the engineer or auditor identifies a difference between the facility and the drawings, the engineer or auditor physically marks up the drawings on-site to indicate the difference. In some instances, the engineer or auditor physically marks up the drawings to indicate how the drawings should be updated to accurately represent the facility. Afterward, the engineer or auditor returns from the facility and transfers any markups placed on the physical drawings to a digital representation of the drawings. That is, the engineer or auditor performs a digital markup of the drawings. The digitally marked up drawings may be submitted to a computer aided design and drafting (CADD) group. The CADD group reviews and approves any changes indicated by the digitally marked up drawings. Further, the CADD group updates the digital version of the drawings according to the details of the digitally marked up versions. That is, any markups are implemented to produce a “clean” drawing accurate to the facility. Finally, the clean, or updated, digital version of the drawings is uploaded (or re-uploaded) to a document management system for storage and future retrieval as needed. The document management system may be a database or archive system. In some instances, the document management system may have an interactive interface and further allow for drafting capabilities. In such a case, the CADD group may operate within the document management system. This manual process and its associated steps are laborious. Further, nearly every step requires human intervention. As such, the conventional process of auditing and updating drawings may be considered slow and costly.
In accordance with one or more embodiments, the autonomous drawing verification and revision system automates and consolidates the steps of conducting a field survey, identifying discrepancies between drawings and a facility, marking up drawings, and producing a new or updated clean drawing accurate to the facility.
In one or more embodiments, the autonomous drawing verification and revision system (300) is configured such that it can receive or otherwise acquire a drawing (304) from a drawing management system (302). The drawing (304) may depict, or otherwise represent, a facility (or a portion of a facility). In one or more embodiments, the drawing is a piping and instrumentation diagram (P&ID). In one or more embodiments, the drawing (304) indicates two or more equipment items, as well as their relationship to each other (e.g., spatial relationship), connectivity (if applicable), and order when equipment items may be described ordinally (e.g., ordered processes associated with the equipment items).
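For illustration only, the following minimal Python sketch shows one way such a drawing (304) could be encoded as labelled equipment items plus directed connectivity; the class names, tags, and structure are hypothetical assumptions and are not prescribed by this disclosure.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class EquipmentItem:
    tag: str   # drawing callout, e.g., "104"
    kind: str  # equipment type, e.g., "knock-out drum"

@dataclass
class Drawing:
    items: dict = field(default_factory=dict)      # tag -> EquipmentItem
    connections: set = field(default_factory=set)  # directed (upstream, downstream) pairs

    def add(self, tag: str, kind: str) -> None:
        self.items[tag] = EquipmentItem(tag, kind)

    def connect(self, upstream: str, downstream: str) -> None:
        self.connections.add((upstream, downstream))

# A fragment of the plant of FIG. 1 encoded as a drawing
pid = Drawing()
pid.add("104", "knock-out drum")
pid.add("106", "filter separator")
pid.add("108", "amine contactor")
pid.connect("104", "106")
pid.connect("106", "108")
```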
In accordance with one or more embodiments, the autonomous drawing verification and revision system (300) is configured to receive process data (345). Process data (345) includes any control parameters or sensor data associated with a facility. For example, considering the example gas processing plant (100) of FIG. 1, process data (345) may include measured temperatures, pressures, flow rates, and valve states throughout the plant.
In accordance with one or more embodiments, the autonomous drawing verification and revision system (300) includes an autonomous vehicle system (AVS) (310). The AVS (310) includes one or more AVs (360) that may be dispatched to a location of a facility. The present disclosure places no restrictions on the type of AV (360) employed by the AVS (310). In general, an AV (360) may be a robotic device such as a wheel- or track-propelled vehicle or a drone. In the instance of a drone, the drone may be of a fixed-wing type or a rotary-wing type (e.g., quadcopter, tricopter, etc.).
In one or more embodiments, one or more AVs (360) may be dispatched to a location of the facility by the dispatcher (320) of the autonomous drawing verification and revision system (300). The dispatcher (320) will be described in greater detail below. In one or more embodiments, each AV (360) may operate autonomously without the need for human guidance and/or control. In general, each AV (360) of the AVS (310) possesses one or more cameras and an object detection and avoidance system. In some embodiments, the AV (360) further includes additional sensors, such as ultrasonic sensors and light detection and ranging (LiDAR) sensors, to detect the presence and proximity of objects surrounding the AV (360). An AV (360) may further include global positioning system (GPS) components for positioning, navigation, and timing (PNT) services. The object detection and avoidance system is configured to receive sensory inputs from the AV (360), for example, visual data acquired from the one or more cameras, object proximity data, and GPS data. The object detection and avoidance system may fuse sensory inputs to maneuver the AV (360) to a desired location (e.g., an area of a facility that underwent an equipment modification) without colliding with an object (e.g., a flue-gas stack).
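As a purely illustrative sketch of the kind of fusion logic such an object detection and avoidance system might apply, the following Python fragment gates forward motion on both a camera-based path check and raw proximity returns; the clearance threshold and function names are assumptions, not disclosed values.

```python
def safe_to_advance(path_clear: bool, proximities_m: list, min_clearance_m: float = 2.0) -> bool:
    """Fuse camera and ranging inputs: advance only if the vision system reports
    a clear path and every ultrasonic/LiDAR return exceeds the clearance threshold."""
    return path_clear and all(d > min_clearance_m for d in proximities_m)

# A 1.4 m return (e.g., from a nearby flue-gas stack) halts forward motion
print(safe_to_advance(True, [5.1, 1.4]))  # False
```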
In one or more embodiments, an AV (360) of the AVS (310) possesses thermal imaging capabilities. Thermal imaging may be performed using a thermal camera, for example, a camera outfitted with an infrared sensor sensitive to infrared light. That is, in one or more embodiments, an AV (360) includes both a visual camera and a thermal camera. In some embodiments, both visual imaging and thermal imaging are performed using the same camera equipped with one or more photosensors and/or light filters (i.e., a filter that limits incoming light according to frequency (or wavelength)). Thermal images acquired by the AV (360) may be used to detect and locate objects and/or equipment items at elevated temperatures, which may aid in the recognition of equipment items.
In accordance with one or more embodiments, an AV (360) of the AVS (310) can simultaneously self-locate and map its surrounding environment, a process known as simultaneous localization and mapping (SLAM). Generally, SLAM is a method used principally by autonomous vehicles to map (spatially orient surrounding objects) and localize the vehicle in that map at the same time. In one or more embodiments, a SLAM algorithm may facilitate or be a part of the object detection and avoidance system of the AV (360). Further, a SLAM algorithm may be used to plan a path for the AV (360) to travel in order to safely arrive at a desired destination. The SLAM algorithm can run in real time or in tractable time. Methods used by the SLAM algorithm may include, but are not limited to, particle filters, extended Kalman filters, and covariance intersection.
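For illustration, a minimal Python sketch of one particle-filter building block named above is given below: candidate AV poses are reweighted by how well each explains a single range measurement to a known landmark. The Gaussian noise model and all values are assumptions, not a prescribed implementation.

```python
import math
import random

def update_particle_weights(particles, measured_range, landmark, noise_std=0.5):
    """One particle-filter step: reweight candidate (x, y) poses by how well
    each explains a range measurement to a known landmark (Gaussian likelihood)."""
    weights = []
    for x, y in particles:
        expected = math.hypot(landmark[0] - x, landmark[1] - y)
        error = measured_range - expected
        weights.append(math.exp(-0.5 * (error / noise_std) ** 2))
    total = sum(weights) or 1.0
    return [w / total for w in weights]  # normalized so the weights sum to 1

particles = [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(100)]
weights = update_particle_weights(particles, measured_range=4.2, landmark=(5.0, 5.0))
```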
Each of the one or more AVs (360) can communicate with the autonomous drawing verification and revision system (300) and thus components of the autonomous drawing verification and revision system (300). Communication may be enabled through wireless or wired connections or a combination of wireless and wired connections. Wireless communication may be facilitated through RFID, NFC, low-energy Bluetooth, low-energy wireless, low-energy radio protocols, LTE-A, and WiFi-Direct technologies. In one or more embodiments, an AV (360) communicates with the autonomous drawing verification and revision system (300), or its components and/or modules, using a wireless protocol. In other embodiments, an AV (360) establishes a wireless link with a ground control station and the ground control station has a wired connection (e.g., ethernet, USB, etc.) with the autonomous drawing verification and revision system (300). In general, one or more ground control stations may be disposed throughout, or near, a facility. In one or more embodiments, ground control stations further act as docking and charging stations for one or more AVs (360) when not in use. Communication between an AV (360) and the autonomous drawing verification and revision system (300) includes any transferred data and control signals that may be sent bidirectionally between an AV (360) and the autonomous drawing verification and revision system (300). For example, an AV (360) may transfer, through an established communication link, its location and acquired visual images to the autonomous drawing verification and revision system (300). Likewise, the autonomous drawing verification and revision system (300) may send a control signal to the AV (360) specifying a target destination for the AV (360). Other data may be communicated between an AV (360) and the autonomous drawing verification and revision system (300), including the battery level of the AV (360) and operational parameters of the AV (360) (e.g., internal temperature, rotor speeds, etc.). Herein, data sent, transferred, or communicated to the autonomous drawing verification and revision system (300) from the one or more AVs (360) is referred to as AV data (316).
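By way of a non-limiting example, AV data (316) of this kind might be serialized as a simple structured message, as in the following Python sketch; the field names and format are hypothetical.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class AVStatus:
    av_id: str
    latitude: float
    longitude: float
    battery_pct: float
    image_refs: list  # identifiers of visual images captured at this pose

# Serialize one status report for transmission over the communication link
message = json.dumps(asdict(AVStatus("AV-1", 26.30, 50.15, 87.5, ["img_0001.jpg"])))
print(message)
```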
In one or more embodiments, the autonomous drawing verification and revision system (300) may perform photogrammetric analysis using received AV data (316). Photogrammetry, defined generally, is the process of collecting and/or displaying physical information from two-dimensional (2D) photos or images. Thus, in one or more embodiments, the autonomous drawing verification and revision system (300) processes one or more images acquired by an AV (360) (and received as AV data (316)) to form a field representation (325). In accordance with one or more embodiments, AV data (316) may undergo preprocessing. For example, preprocessing of acquired visual images may include normalizing the images. Additional techniques such as aggregating multiple images, or other methods designed to reduce noise in an image and increase image quality may be employed. One with ordinary skill in the art will appreciate that many image preprocessing techniques exist and the fact that they are not enumerated herein does not impose a limit on the present disclosure. In some embodiments, preprocessing may not be required.
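A minimal sketch of such preprocessing, assuming grayscale frames and the NumPy library, might normalize each image and average a co-registered stack to suppress per-frame noise:

```python
import numpy as np

def normalize(img):
    """Scale pixel intensities to the [0, 1] range."""
    img = img.astype(np.float32)
    span = float(img.max() - img.min())
    return (img - img.min()) / span if span > 0 else np.zeros_like(img)

def aggregate(images):
    """Average a stack of co-registered frames to suppress sensor noise."""
    return np.mean([normalize(img) for img in images], axis=0)

frames = [np.random.randint(0, 256, (480, 640), dtype=np.uint8) for _ in range(3)]
clean = aggregate(frames)
print(clean.shape, float(clean.min()) >= 0.0, float(clean.max()) <= 1.0)
```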
In accordance with one or more embodiments, the field representation (325) is a digital representation of the facility (or portion of the facility) developed using AV data (316). In one or more embodiments, the field representation simply consists of images acquired using one or more AVs (360), where the images may include metadata regarding the location and orientation of the AV (360) while the images were acquired. In one or more embodiments, the field representation (325) is a 2D or 3D representation of the facility generated from the AV data (316). In one or more embodiments, the field representation (325) is a 2D diagram of the region or area represented in the received image(s), such as the depiction of a gas processing plant (100) in FIG. 1.
In one or more embodiments, the computational requirement or burden of an AV (360) is reduced by performing calculations using one or more computers included in the autonomous drawing verification and revision system (300) (computers such as that depicted in FIG. 7).
In accordance with one or more embodiments, the autonomous drawing verification and revision system (300) further includes a dispatcher (320). The dispatcher (320) determines when, and where, an AV (360) should be sent to inspect a facility. In one or more embodiments, the dispatcher (320) operates according to a schedule, where one or more AVs (360) are dispatched to a location of the facility according to a pre-determined frequency. That is, the dispatcher (320) may mandate that a field survey is conducted using one or more AVs (360) every X days, where X is defined by a user. In one or more embodiments, the dispatcher (320) is configured such that an entire facility is inspected every 60 days. In such a case, the dispatcher (320) may schedule inspections of portions of the facility over various intervals within the prescribed time period to complete an entire facility inspection. In one or more embodiments, the dispatcher (320) may further be configured to dispatch one or more AVs (360) to conduct a field survey of a facility based on one or more pre-defined triggers or identified events. For example, the dispatcher (320) may dispatch an AV (360) to a location of a facility where a maintenance, repair, replacement, or update activity was recently performed. In such a case, the autonomous drawing verification and revision system (300) may ensure that the drawings (304) in the drawing management system (302) are aligned (or still aligned) with the facility after said maintenance, repair, replacement, or update activity. In one or more embodiments, the dispatcher (320) may further be configured to receive a user input, for example, through the user interface (350), such that a user may instruct one or more AVs (360) to inspect any portion of a facility at any given time according to the requirements of the user.
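For illustration, the scheduling and event-trigger behavior of a dispatcher (320) might be sketched as follows in Python; the 60-day cadence mirrors the example above, while the class and method names are hypothetical assumptions.

```python
from datetime import date, timedelta

class Dispatcher:
    """Dispatch AV field surveys on a fixed cadence or in response to events."""

    def __init__(self, survey_interval_days=60):
        self.interval = timedelta(days=survey_interval_days)
        self.last_survey = {}  # location -> date of the most recent survey

    def due_for_survey(self, location, today):
        last = self.last_survey.get(location)
        return last is None or today - last >= self.interval

    def on_maintenance_completed(self, location):
        # Event trigger: re-survey immediately after a recorded modification
        return f"dispatch AV to {location}"

dispatcher = Dispatcher(survey_interval_days=60)
print(dispatcher.due_for_survey("amine unit", date.today()))  # True: never surveyed
print(dispatcher.on_maintenance_completed("filter separator skid"))
```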
In accordance with one or more embodiments, the autonomous drawing verification and revision system (300) further includes an inventory and maintenance module (308). As the autonomous drawing verification and revision system (300) identifies the equipment items that make up a facility, the inventory and maintenance module (308) maintains a current inventory, or list, of all equipment items. Further, in one or more embodiments, the autonomous drawing verification and revision system (300) is further configured to analyze the AV data (316) to determine the condition of identified equipment items. For example, the inventory and maintenance module (308) may make use of one or more of the machine-learned models (330) to categorize the condition of each equipment item given the AV data (316). In this manner, while conducting autonomous field surveys, the autonomous drawing verification and revision system (300) may further determine which equipment items are in need of repair or replacement. In one or more embodiments, the autonomous drawing verification and revision system (300) recommends one or more maintenance options to a user and/or facility operator through the notification system (340). Recommendations produced by the inventory and maintenance module (308) may be proactive or reactive in nature. That is, through analysis of AV data (316), the inventory and maintenance module (308) may identify an equipment item in need of immediate maintenance or replacement and recommend a reactive option. Alternatively, the inventory and maintenance module (308) may identify an equipment item that will be in need of maintenance sooner than expected according to a maintenance schedule (a proactive option). As such, in one or more embodiments, the inventory and maintenance module (308) has access to, or is provided, a maintenance schedule (not shown in FIG. 3).
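A purely illustrative Python sketch of how such proactive and reactive recommendations might be derived from a model-estimated condition score follows; the thresholds and score scale are assumptions, not disclosed values.

```python
def recommend_action(condition_score, days_to_scheduled_maintenance):
    """Map a model-estimated condition score (0.0 = failed, 1.0 = like new)
    to a reactive or proactive maintenance recommendation."""
    if condition_score < 0.3:
        return "reactive: repair or replace immediately"
    if condition_score < 0.6 and days_to_scheduled_maintenance > 30:
        return "proactive: pull scheduled maintenance forward"
    return "no action: follow the existing maintenance schedule"

print(recommend_action(0.25, 90))  # reactive option
print(recommend_action(0.55, 90))  # proactive option
```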
In accordance with one or more embodiments, the facility is monitored and controlled by a control system (370). Examples of a control system (370) may include a distributed control system (DCS) and a supervisory control and data acquisition (SCADA) system. In one or more embodiments, the control system (370) is the same as, or analogous to, the controllers (130) depicted in FIG. 1.
The principal output of the autonomous drawing verification and revision system (300) is an updated drawing (306). Upon conducting an autonomous field survey and developing a field representation (325) from the AV data (316), the field representation (325) is compared to an existing drawing (304). The comparison may take a variety of forms. In one or more embodiments, the field representation is an engineering drawing of the same type and style as the existing drawing (304) such that the autonomous drawing verification and revision system (300) can perform a direct comparison between the field representation (325) and the existing drawing (304). In one or more embodiments, conversion of AV data (316) to a field representation (325) of the same type and style as the existing, or current, drawing (304) is facilitated by one or more machine-learned models (330). The one or more machine-learned models (330) may be configured for object detection and image segmentation tasks. For example, in one or more embodiments, AV data (316) in the form of images and, in some instances, other data such as spatial distances from ultrasonic and/or LiDAR sensors, are used to construct 3D isometrics through edge computing. Then, the autonomous drawing verification and revision system (300) further converts the 3D isometric representation(s) into 2D P&ID(s) to match the existing format of the drawings (304) in the drawing management system (302). In one or more embodiments, one or more machine-learned models (330) accept the AV data (316) and/or the field representation (325) (where the field representation itself may be formed using a machine-learned model) and extract relevant features from the received input. Features may include, but are not limited to, identified and labelled equipment as well as descriptors for various equipment items such as equipment size, color, orientation, relative position, etc. The extracted features are subsequently matched with corresponding features on the current drawing (304), where the current drawing may be 2D (e.g., P&ID) or 3D (e.g., isometric drawing). In one or more embodiments, the drawing (304) and the field representation (325) are each received by one or more machine-learned models (330) and are each converted to a data object, such as a directed graph. The data object (e.g., directed graph) is a compact representation of the equipment, and the relationships between equipment items, of a facility. The data objects are orientation invariant such that alternative viewpoints of a facility (e.g., AV data acquired from two or more AVs (360) from different locations and perspectives) result in the same data object. By converting both the drawing (304) and the field representation (325), where the field representation (325) may be raw AV data (316) (e.g., images and location metadata), to an orientation-invariant data object, direct comparison of the data objects allows for the quick identification of discrepancies, if any, between the facility and the drawing (304).
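For illustration, once a drawing (304) and a field representation (325) are each reduced to an orientation-invariant graph encoding (here, sets of labelled nodes and directed edges), discrepancy identification can amount to set differences, as in the following minimal Python sketch with hypothetical tags:

```python
def find_discrepancies(filed, surveyed):
    """Compare two orientation-invariant graph encodings of a facility.
    Each encoding is (nodes, edges): a set of (tag, kind) pairs and a set
    of directed (upstream_tag, downstream_tag) pairs."""
    (nodes_f, edges_f), (nodes_s, edges_s) = filed, surveyed
    return {
        "removed_equipment": nodes_f - nodes_s,
        "added_equipment": nodes_s - nodes_f,
        "removed_connections": edges_f - edges_s,
        "added_connections": edges_s - edges_f,
    }

filed = ({("104", "knock-out drum"), ("106", "filter separator")}, {("104", "106")})
surveyed = ({("104", "knock-out drum"), ("106A", "filter separator")}, {("104", "106A")})
print(find_discrepancies(filed, surveyed))
```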
In one or more embodiments, when at least one discrepancy between the drawing (304) and the facility is detected by the autonomous drawing verification and revision system (300), the autonomous drawing verification and revision system (300) outputs an updated drawing (306). In one or more embodiments, the updated drawing (306) is a marked-up version of the drawing (304) that overlays the corrections needed on the drawing (304). The overlaid corrections, or markups, are such that once used to alter the drawing (304), the resulting drawing will accurately represent the facility. In one or more embodiments, the updated drawing (306) is a clean (i.e., without markups) drawing that can be used to replace the drawing (304) in the drawing management system (302), where the updated drawing (306) accurately represents the facility.
Once an updated drawing (306) is produced, the updated drawing (306) may be reviewed and approved by a user (e.g., a CADD group member) before the updated drawing (306) is used to correct or replace the existing drawing (304) in the drawing management system (302). The updated drawing (306) may be reviewed alongside the original drawing (304) and the field representation (325) and/or AV data (316). If the updated drawing (306) is not approved upon review, a subject matter expert (e.g., a facility operator and/or a CADD group member) may revert to producing an updated drawing manually using the field representation (325) and/or AV data (316).
In accordance with one or more embodiments, the autonomous drawing verification and revision system (300) further includes a notification system (340). The notification system (340) provides, at least, notification functionalities to the autonomous drawing verification and revision system (300). Notifications simply refer to alerting one or more users that an updated drawing (306) has been produced by the autonomous drawing verification and revision system (300) and is ready for review and approval. Notifications may be provided by the notification system (340) in the form of email, SMS notifications, and other alerts through a user interface (350).
In one or more embodiments, one or more users may interact with the autonomous drawing verification and revision system (300) through a user interface (350). In one or more embodiments, the user interface (350) is a graphical user interface. The user interface (350) acts as a point of human-computer interaction and communication. Thus, the user interface (350) can receive inputs from a user. The user interface (350) may further provide visualizations, such as graphs, reports, and text and image data, to one or more users. In broad terms, the user interface (350) can include display screens, keyboards, and a computer mouse. In one or more embodiments, the user interface (350) is implemented as a computer program, such as a native application or a web application.
It is emphasized that while the example of FIG. 1 depicts a gas processing plant (100), the autonomous drawing verification and revision system (300) may be used with any type of facility without departing from the scope of this disclosure.
As stated, in one or more embodiments, the autonomous drawing verification and revision system (300) uses one or more machine-learned models. Machine learning, broadly defined, is the extraction of patterns and insights from data. The phrases “artificial intelligence”, “machine learning”, “deep learning”, and “pattern recognition” are often conflated, interchanged, and used synonymously throughout the literature. This ambiguity arises because the field of “extracting patterns and insights from data” was developed simultaneously and disjointedly among a number of classical arts like mathematics, statistics, and computer science. For consistency, the term machine learning, or machine-learned, will be adopted herein; however, one skilled in the art will recognize that the concepts and methods detailed hereafter are not limited by this choice of nomenclature.
Machine-learned model types may include, but are not limited to, neural networks, random forests, generalized linear models, and Bayesian regression. Machine-learned model types are usually associated with additional “hyperparameters” which further describe the model. For example, hyperparameters providing further detail about a neural network may include, but are not limited to, the number of layers in the neural network, choice of activation functions, inclusion of batch normalization layers, and regularization strength. The selection of hyperparameters surrounding a model is referred to as selecting the model “architecture.” Generally, multiple model types and associated hyperparameters are tested, and the model type and hyperparameters that yield the greatest predictive performance on a hold-out set of data are selected.
As noted, possible objectives of the machine-learned models used by the autonomous drawing verification and revision system (300) may include detecting and identifying objects in visual images, image segmentation tasks, generating a field representation of a facility, and identifying differences or discrepancies between a facility and the drawings on file that represent the facility. Machine-learned models may act in coordination or independently. In one or more embodiments, the results of multiple machine-learned models are used in an ensemble to form a prediction for a target quantity. In accordance with one or more embodiments, one or more machine-learned model types and associated architectures are selected and trained to perform specific tasks such as object detection and discrepancy identification.
As an example of a machine-learned model that may be included in the autonomous drawing verification and revision system (300), FIG. 5 depicts a diagram of a neural network (500). In general, a neural network (500) is composed of nodes (502) connected by edges (504), where the nodes (502) are commonly organized into layers.
Nodes (502) and edges (504) carry additional associations. Namely, every edge is associated with a numerical value. The edge numerical values, or even the edges (504) themselves, are often referred to as “weights” or “parameters.” While training a neural network (500), numerical values are assigned to each edge (504). Additionally, every node (502) is associated with a numerical variable and an activation function. Activation functions are not limited to any functional class, but traditionally follow the form

A = f(∑_i (node_i × edge_i)),

where i is an index that spans the set of “incoming” nodes (502) and edges (504) and f is a user-defined function. Incoming nodes (502) are those that, when viewed as a graph (as in FIG. 5), direct an edge (504) toward the node (502) under consideration.
When the neural network (500) receives an input, the input is propagated through the network according to the activation functions and incoming node (502) values and edge (504) values to compute a value for each node (502). That is, the numerical value for each node (502) may change for each received input. Occasionally, nodes (502) are assigned fixed numerical values, such as the value of 1, that are not affected by the input or altered according to edge (504) values and activation functions. Fixed nodes (502) are often referred to as “biases” or “bias nodes” (506), displayed in FIG. 5.
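A minimal Python sketch of the activation formula above, including a bias node (506) carrying a fixed value of 1, might read as follows; the choice of f = tanh is only an example.

```python
import math

def node_activation(incoming_values, incoming_weights, f=math.tanh):
    """Compute A = f(sum_i(node_i * edge_i)) for a single node."""
    return f(sum(v * w for v, w in zip(incoming_values, incoming_weights)))

# Third input is a bias node (506) carrying the fixed value 1
print(node_activation([0.5, -1.2, 1.0], [0.8, 0.3, -0.1]))
```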
In some implementations, the neural network (500) may contain specialized layers (505), such as a normalization layer, or additional connection procedures, like concatenation. One skilled in the art will appreciate that these alterations do not exceed the scope of this disclosure.
As noted, the training procedure for the neural network (500) comprises assigning values to the edges (504). To begin training, the edges (504) are assigned initial values. These values may be assigned randomly, assigned according to a prescribed distribution, assigned manually, or assigned by some other mechanism. Once edge (504) values have been initialized, the neural network (500) may act as a function, such that it may receive inputs and produce an output. As such, at least one input is propagated through the neural network (500) to produce an output. Recall that a given data set will be composed of inputs and associated target(s), where the target(s) represent the “ground truth,” or the otherwise desired output. The neural network (500) output is compared to the associated input data target(s). The comparison of the neural network (500) output to the target(s) is typically performed by a so-called “loss function,” although other names for this comparison function, such as “error function,” “misfit function,” and “cost function,” are commonly employed. Many types of loss functions are available, such as the mean-squared-error function; however, the general characteristic of a loss function is that the loss function provides a numerical evaluation of the similarity between the neural network (500) output and the associated target(s). The loss function may also be constructed to impose additional constraints on the values assumed by the edges (504), for example, by adding a penalty term, which may be physics-based, or a regularization term. Generally, the goal of a training procedure is to alter the edge (504) values to promote similarity between the neural network (500) output and associated target(s) over the data set. Thus, the loss function is used to guide changes made to the edge (504) values, typically through a process called “backpropagation.”
While a full review of the backpropagation process exceeds the scope of this disclosure, a brief summary is provided. Backpropagation consists of computing the gradient of the loss function over the edge (504) values. The gradient indicates the direction of change in the edge (504) values that results in the greatest change to the loss function. Because the gradient is local to the current edge (504) values, the edge (504) values are typically updated by a “step” in the direction indicated by the gradient. The step size is often referred to as the “learning rate” and need not remain fixed during the training process. Additionally, the step size and direction may be informed by previously seen edge (504) values or previously computed gradients. Such methods for determining the step direction are usually referred to as “momentum” based methods.
Once the edge (504) values have been updated, or altered from their initial values, through a backpropagation step, the neural network (500) will likely produce different outputs. Thus, the procedure of propagating at least one input through the neural network (500), comparing the neural network (500) output with the associated target(s) with a loss function, computing the gradient of the loss function with respect to the edge (504) values, and updating the edge (504) values with a step guided by the gradient, is repeated until a termination criterion is reached. Common termination criteria are: reaching a fixed number of edge (504) updates, otherwise known as an iteration counter; a diminishing learning rate; noting no appreciable change in the loss function between iterations; and reaching a specified performance metric as evaluated on the data or a separate hold-out data set. Once the termination criterion is satisfied, and the edge (504) values are no longer intended to be altered, the neural network (500) is said to be “trained.”
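For illustration, the loop below applies the described procedure (forward pass, loss gradient, gradient step, fixed-iteration termination) to a one-parameter stand-in for the edge (504) values; the data and learning rate are assumptions.

```python
# Fit y = w * x by gradient descent on a mean-squared-error loss
data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]
w, learning_rate = 0.0, 0.01

for iteration in range(500):  # termination criterion: fixed iteration counter
    gradient = sum(2.0 * (w * x - y) * x for x, y in data) / len(data)
    w -= learning_rate * gradient  # step against the gradient

print(round(w, 2))  # approaches the least-squares value of roughly 2.04
```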
Another type of machine-learned model that may be employed by the autonomous drawing verification and revision system (300) (e.g., for object detection) is a convolutional neural network (CNN). A CNN is similar to a neural network (500) in that it can technically be graphically represented by a series of edges (504) and nodes (502) grouped to form layers. However, it is more informative to view a CNN as structural groupings of weights, where here the term structural indicates that the weights within a group have a relationship. CNNs are widely applied when the data inputs also have a structural relationship, for example, a spatial relationship where one input is always considered “to the left” of another input. Images have such a structural relationship. Consequently, CNNs are particularly adept at processing images.
A structural grouping, or group, of weights is herein referred to as a “filter.” The number of weights in a filter is typically much less than the number of inputs. In a CNN, the filters can be thought of as “sliding” over, or convolving with, the inputs to form an intermediate output or intermediate representation of the inputs which still possesses a structural relationship. As with the neural network (500), the intermediate outputs are often further processed with an activation function. Many filters may be applied to the inputs to form many intermediate representations. Additional filters may be formed to operate on the intermediate representations, creating more intermediate representations. This process may be repeated as prescribed by a user. There is a “final” group of intermediate representations, wherein no more filters act on these intermediate representations. Generally, the structural relationship of the final intermediate representations is ablated, a process known as “flattening.” The flattened representation is usually passed to a neural network (500) to produce the final output. Note that, in this context, the neural network (500) is still considered part of the CNN. As with a neural network (500), a CNN is trained with the backpropagation process in accordance with a loss function, after initialization of the filter weights and, if present, the edge (504) values of the internal neural network (500).
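A minimal Python sketch of a single filter “sliding” over an image (stride 1, no padding), assuming the NumPy library, is shown below; it illustrates how a small group of shared weights produces a structured intermediate representation.

```python
import numpy as np

def convolve2d(image, kernel):
    """Slide a filter over an image to form an intermediate representation
    that preserves spatial structure (stride 1, no padding)."""
    kh, kw = kernel.shape
    out_h, out_w = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.empty((out_h, out_w), dtype=np.float32)
    for i in range(out_h):
        for j in range(out_w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

kernel = np.array([[1.0, 0.0, -1.0]] * 3, dtype=np.float32)  # 9 shared weights
image = np.random.rand(8, 8).astype(np.float32)
print(convolve2d(image, kernel).shape)  # (6, 6) intermediate representation
```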
While a few types of machine-learned models have been briefly described, one with ordinary skill in the art will appreciate that the autonomous drawing verification and revision system (300) is not limited to only using the listed machine-learned models. Machine-learned models such as a random forest, vision transformers (ViTs), or non-parametric methods such as K-nearest neighbors or a Gaussian process may be readily inserted into this framework and do not depart from the scope of this disclosure.
In accordance with one or more embodiments, one or more of the machine-learned models (330) used by the autonomous drawing verification and revision system (300) may be trained to perform a specific task. Training of the one or more machine-learned models (330) is facilitated using modeling data that includes examples of AV data (316), associated drawings (304), and updated drawings (306) and/or identified discrepancies (e.g., differences made apparent with a marked-up drawing). In accordance with one or more embodiments, the modeling data is stored in a historical database (390) accessible by the autonomous drawing verification and revision system (300). In one or more embodiments, the modeling data may be partitioned into various sets of data such as a training set, validation set, and test set. In one or more embodiments, the modeling data is preprocessed before use to train the one or more machine-learned models (330). Preprocessing may include normalization and imputation. Further, in one or more embodiments, modeling data may be augmented by perturbing example instances in the modeling data. For example, data augmentation may include the application of random affine transformations to images of a facility. In accordance with one or more embodiments, the modeling data (or a partition of the modeling data) is used to train the one or more machine-learned models. Training may include the tuning and/or intelligent selection of model hyperparameters scored according to a validation set of the modeling data and an estimation of the generalization error of the machine-learned models. Once trained, one or more machine-learned models (330) may be “deployed” for use in the autonomous drawing verification and revision system (300). As such, the autonomous drawing verification and revision system (300) may obtain or receive drawings (304) and AV data (316) not present in the modeling data and produce updated drawings (306).
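By way of illustration, partitioning modeling data into training, validation, and test sets might be sketched as follows in Python; the split fractions are assumptions, not prescribed values.

```python
import random

def partition(examples, train_frac=0.7, val_frac=0.15, seed=0):
    """Shuffle modeling data and split it into training, validation, and test sets."""
    shuffled = examples[:]
    random.Random(seed).shuffle(shuffled)
    n_train = int(len(shuffled) * train_frac)
    n_val = int(len(shuffled) * val_frac)
    return (shuffled[:n_train],
            shuffled[n_train:n_train + n_val],
            shuffled[n_train + n_val:])

train, val, test = partition(list(range(100)))
print(len(train), len(val), len(test))  # 70 15 15
```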
In Block 602, a drawing is obtained from a drawing management system or other drawings archive or database. The drawing represents a facility, or at least a portion of a facility. The facility may be a processing plant such as the gas processing plant depicted in FIG. 1.
The computer (702) can serve in a role as a client, network component, a server, a database or other persistency, or any other component (or a combination of roles) of a computer system for performing the subject matter described in the instant disclosure. In some implementations, one or more components of the computer (702) may be configured to operate within environments, including cloud-computing-based, local, global, or other environment (or a combination of environments).
At a high level, the computer (702) is an electronic computing device operable to receive, transmit, process, store, or manage data and information associated with the described subject matter. According to some implementations, the computer (702) may also include or be communicably coupled with an application server, e-mail server, web server, caching server, streaming data server, business intelligence (BI) server, or other server (or a combination of servers).
The computer (702) can receive requests over network (730) from a client application (for example, executing on another computer (702)) and respond to the received requests by processing said requests in an appropriate software application. In addition, requests may also be sent to the computer (702) from internal users (for example, from a command console or by other appropriate access methods), external or third parties, other automated applications, as well as any other appropriate entities, individuals, systems, or computers.
Each of the components of the computer (702) can communicate using a system bus (703). In some implementations, any or all of the components of the computer (702), both hardware or software (or a combination of hardware and software), may interface with each other or the interface (704) (or a combination of both) over the system bus (703) using an application programming interface (API) (712) or a service layer (713) (or a combination of the API (712) and service layer (713)). The API (712) may include specifications for routines, data structures, and object classes. The API (712) may be either computer-language independent or dependent and refer to a complete interface, a single function, or even a set of APIs. The service layer (713) provides software services to the computer (702) or other components (whether or not illustrated) that are communicably coupled to the computer (702). The functionality of the computer (702) may be accessible for all service consumers using this service layer. Software services, such as those provided by the service layer (713), provide reusable, defined business functionalities through a defined interface. For example, the interface may be software written in JAVA, C++, or other suitable language providing data in extensible markup language (XML) format or another suitable format. While illustrated as an integrated component of the computer (702), alternative implementations may illustrate the API (712) or the service layer (713) as stand-alone components in relation to other components of the computer (702) or other components (whether or not illustrated) that are communicably coupled to the computer (702). Moreover, any or all parts of the API (712) or the service layer (713) may be implemented as child or sub-modules of another software module, enterprise application, or hardware module without departing from the scope of this disclosure.
The computer (702) includes an interface (704). Although illustrated as a single interface (704) in FIG. 7, two or more interfaces (704) may be used according to particular needs, desires, or particular implementations of the computer (702).
The computer (702) includes at least one computer processor (705). Although illustrated as a single computer processor (705) in FIG. 7, two or more processors may be used according to particular needs, desires, or particular implementations of the computer (702).
The computer (702) also includes a memory (706) that holds data for the computer (702) or other components (or a combination of both) that can be connected to the network (730). The memory may be a non-transitory computer readable medium. For example, memory (706) can be a database storing data consistent with this disclosure. Although illustrated as a single memory (706) in FIG. 7, two or more memories (706) may be used according to particular needs, desires, or particular implementations of the computer (702).
The application (707) is an algorithmic software engine providing functionality according to particular needs, desires, or particular implementations of the computer (702), particularly with respect to functionality described in this disclosure. For example, application (707) can serve as one or more components, modules, applications, etc. Further, although illustrated as a single application (707), the application (707) may be implemented as multiple applications (707) on the computer (702). In addition, although illustrated as integral to the computer (702), in alternative implementations, the application (707) can be external to the computer (702).
There may be any number of computers (702) associated with, or external to, a computer system containing computer (702), wherein each computer (702) communicates over network (730). Further, the terms “client,” “user,” and other appropriate terminology may be used interchangeably as appropriate without departing from the scope of this disclosure. Moreover, this disclosure contemplates that many users may use one computer (702), or that one user may use multiple computers (702).
Although only a few example embodiments have been described in detail above, those skilled in the art will readily appreciate that many modifications are possible in the example embodiments without materially departing from this invention. Accordingly, all such modifications are intended to be included within the scope of this disclosure as defined in the following claims. Although multiple dependent claims are not introduced, it would be apparent to one of ordinary skill that the subject matter of the dependent claims of one or more embodiments may be combined with other dependent claims.