This invention relates generally to the field of monitoring systems, and more particularly but not by way of limitation, to a system and method for automatically determining whether particular events have taken place at a remote location using computer-enabled vision.
Many of the most productive oil and gas assets are located in remote areas, with equipment and personnel often distributed over a large area. In these areas, it is particularly difficult to obtain real-time or near real-time monitoring of equipment, conditions and the status of operations. Numerous operations take place at remote well sites, with a variety of well operators, service companies, and other vendors. Determining whether a particular task or operation has been completed typically requires onsite personnel, which can be expensive, inefficient and dangerous.
In addition to monitoring the remote well site to ensure safe and optimized operations, the remote nature of the well site creates challenges in confirming that a particular operation has been completed to the extent necessary to pay invoices for that operation. There is, therefore, a need for an improved system for monitoring the status of remote operations and determining whether a remote operation has been completed so that invoices for the operation can be approved and processed.
In one aspect, embodiments of the present disclosure are directed to a method for automatically determining whether an operator carried out a first operation on an asset positioned at a location. The method includes the steps of installing a camera module at the location to observe the asset, sending image data from the camera module to an analytics center, processing the image data to automatically determine the identity of the operator or changes at the location resulting from the first operation, developing an event signature based on the image data, wherein the event signature is indicative of whether the first operation on the asset took place, and verifying the status of the first operation based on the event signature.
In another aspect, embodiments of the present disclosure are directed to a method for automatically determining whether an operator carried out a first operation on an asset positioned at a location. In this embodiment, the method includes the steps of installing a camera module at the location to observe the asset, sending image data from the camera module to an analytics center, processing the image data to automatically determine the identity of the operator or changes at the location resulting from the first operation, developing an event signature based on the image data, wherein the event signature is indicative of whether the first operation on the asset took place, verifying the status of the first operation based on the event signature, and automatically processing an invoice for the first operation based on the step of verifying the status of the operation.
In yet another aspect, embodiments of the present disclosure are directed to a method for automatically determining whether an operator carried out a first operation on an asset positioned at a location. In this embodiment, the method includes the steps of installing a camera module at the location to observe the asset, sending image data from the camera module to an analytics center, processing the image data to automatically determine the identity of the operator or changes at the location resulting from the first operation, developing an event signature based on the image data, wherein the event signature is indicative of whether the first operation on the asset took place, verifying the status of the first operation based on the event signature, and automatically commanding a second operation based on the step of verifying the status of the first operation.
Beginning with
The well site 100 includes a number of camera modules 120 that are configured to observe activities, objects and personnel 122 within the well site 100. The camera modules 120 may each include one or more sensors, including visual range detectors, active IR systems, and passive thermal imaging sensors. In this way, the camera modules 120 can be configured to operate as both conventional digital video recorders and IR/thermal digital scanners. As illustrated in the functional data flow diagram in
The communications system 126 is also configured to transmit data from field sensors 128 at the well site 100. The field sensors 128 may include, for example, sensors on the artificial lift system 106, sensors on the wellhead 104, sensors within the well 102, RFID scanners, optical code scanners, wireless radio/phone scanners, and vehicle detection sensors. The field sensors 128 can be configured to transmit data to the analytics center 124 directly or through the communications system 126. It will be appreciated that the analytics center 124 and communications system 126 can include computer equipment that can be consolidated within a single location (e.g., the well site 100) or distributed between facilities over a data network. In some embodiments, the data from the camera modules 120 and field sensors 128 is streamed to the analytics center 124 in real-time (or near real-time). In other embodiments, the data from the camera modules 120 and field sensors 128 is recorded on local devices within the well site 100 and then uploaded or otherwise transferred to the analytics center 124 in a batched process.
Generally, the analytics center 124 is configured to process and analyze the data output by the camera modules 120 and field sensors 128. The analytics center 124 applies optimization, machine learning and artificial intelligence techniques to these inputs to determine the identity of objects and personnel at the well site 100. The analytics center 124 can, for example, be trained to detect, from the image data provided by the camera modules 120, specific types of vehicles, equipment and personnel entering, exiting and operating within the well site 100. The analytics center 124 can also be adapted to automatically determine company-specific affiliations of personnel 122, vehicles 118 and equipment based on logos and color schemes visible to the camera modules 120.
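The affiliation determination described above can be sketched as a simple lookup over visual cues. This is a minimal illustration only: in practice the logo and color labels would be produced by a trained vision model, and all names here (`AFFILIATION_CUES`, `Detection`, `classify_affiliation`) are assumptions for illustration, not part of the disclosed system.

```python
from dataclasses import dataclass

# Hypothetical mapping from visual cues (logos, color schemes) to company
# affiliations; a real system would use a trained image classifier rather
# than a lookup table.
AFFILIATION_CUES = {
    ("logo_b", "blue"): "Service Company B",
    ("logo_a", "red"): "Operator A",
}

@dataclass
class Detection:
    object_type: str   # e.g. "vehicle", "person"
    logo: str          # logo label emitted by the vision model
    color: str         # dominant color scheme
    timestamp: str     # time the frame was captured

def classify_affiliation(det: Detection) -> str:
    """Return the company affiliation inferred from visual cues."""
    return AFFILIATION_CUES.get((det.logo, det.color), "unknown")

truck = Detection("vehicle", "logo_b", "blue", "10:00")
print(classify_affiliation(truck))  # -> Service Company B
```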
The vehicles 118 and personnel 122 can be tracked as they move around the well site 100. The location and time information recorded for the vehicles 118 and personnel 122 can be combined with information obtained from the field sensors 128 to create event signatures indicative of a particular operation taking place at the well site 100. For example, the analytics center 124 can be adapted to automatically determine that a vehicle 118 belonging to Service Company B entered through the access gate 114 at 10:00 am and proceeded along the internal road 116 to the wellhead 104. By automatically recognizing the logo on hardhats worn by the personnel 122, the camera modules 120 and analytics center 124 can then determine that Service Company B personnel 122 were at the wellhead 104 at 10:15 am.
The analytics center 124 can then pair the spatial and temporal information about the personnel 122 with data produced by the field sensors 128 at the wellhead 104 to produce an “event signature” indicating that the personnel 122 from Service Company B opened a choke valve on the wellhead 104 to increase the production of hydrocarbons from the well 102. In this way, the camera modules 120, field sensors 128 and analytics center 124 together form an “automatic event detection” system that is capable of automatically determining an event signature that represents: (i) that a particular event or operation has occurred; (ii) when the event or operation took place; and (iii) which personnel 122 (and company) were responsible for the event or operation. The event signatures can be stored within a database at the analytics center 124 for future review.
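One minimal way to sketch the pairing of vision detections with field-sensor data into event signatures is a time-and-location match, assuming simplified timestamped records. The class and field names below (`VisionEvent`, `SensorEvent`, `pair_events`, the 30-minute window) are hypothetical choices for illustration, not details from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class VisionEvent:
    company: str       # affiliation inferred from the camera modules
    location: str      # e.g. "wellhead"
    minute: int        # minutes since midnight, for simplicity

@dataclass
class SensorEvent:
    location: str
    change: str        # change detected by a field sensor
    minute: int

@dataclass
class EventSignature:
    company: str
    operation: str
    location: str
    minute: int

def pair_events(vision, sensors, window=30):
    """Pair personnel sightings with sensor-detected changes occurring at
    the same location within `window` minutes, yielding event signatures."""
    signatures = []
    for v in vision:
        for s in sensors:
            if v.location == s.location and abs(v.minute - s.minute) <= window:
                signatures.append(
                    EventSignature(v.company, s.change, v.location, s.minute))
    return signatures

sightings = [VisionEvent("Service Company B", "wellhead", 615)]  # 10:15 am
changes = [SensorEvent("wellhead", "choke valve opened", 624)]
print(pair_events(sightings, changes))
```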
The automatic determination of when an event took place at the well site 100 can be used to closely monitor the operation of the well site 100 to improve the efficient, safe and economic recovery of hydrocarbons from the well 102. For example, the automatic event detection system can be used to assess the impact of certain operations carried out by teams of personnel 122 at the well site 100.
In another aspect, the automatic detection of events at the well site 100 can be used to facilitate the processing of invoices attributable to the event. For example, on a specified date, Service Company B 130 was engaged by well site Operator A 132 to adjust the operation of the artificial lift system 106 at the well site 100. The analytics center 124 used data from the camera modules 120 and field sensors 128 to identify an event signature consistent with the operation for which Service Company B was engaged. The camera modules 120 detected personnel 122 from Service Company B at the artificial lift system 106, and the field sensors 128 detected changes in the operation of the artificial lift system 106 and the production of hydrocarbons from the well 102.
As illustrated in
Turning to
At step 206, image and sensor data is harvested by the camera modules 120 and field sensors 128, respectively, and sent to the analytics center 124. At step 208, the image data is processed and analyzed by the analytics center 124. At step 210, the asset operational data from the field sensors 128 is aggregated at the analytics center 124. Next, at step 212, the analytics center 124 pairs the image data and operational data together to form event signatures. At step 214, the event signatures are used to automatically determine or predict whether a particular event or operation has taken place. In some embodiments, the method 200 stops at step 214. The determination of event signatures can then be used for a variety of purposes.
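Steps 206 through 214 can be sketched as a small pipeline in which each stage is a pluggable component. The callables below are toy stand-ins assumed only for illustration; they are not the analytics center's actual algorithms.

```python
def detect_events(image_data, sensor_data, analyze, aggregate, pair, classify):
    """Sketch of method 200, steps 206-214: process image data, aggregate
    operational data, pair the two into event signatures, then determine
    whether an event or operation took place."""
    detections = analyze(image_data)             # step 208: process image data
    operational = aggregate(sensor_data)         # step 210: aggregate sensor data
    signatures = pair(detections, operational)   # step 212: form event signatures
    return [classify(sig) for sig in signatures] # step 214: determine events

# Toy stand-ins for the analytics components (illustrative only).
analyze = lambda frames: [f.upper() for f in frames]
aggregate = lambda readings: sum(readings)
pair = lambda dets, total: [(d, total) for d in dets]
classify = lambda sig: {"signature": sig, "event_occurred": sig[1] > 0}

events = detect_events(["valve_open"], [1, 2], analyze, aggregate, pair, classify)
print(events)
```

The pluggable-component shape mirrors the method's step structure: each numbered step maps to one stage, so later steps (216, 218) can consume the returned event determinations.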
For example, the method 200 can proceed to step 216, where the event signatures are used to optimize the operation of equipment and personnel at the remote location. Correlations can be made between ongoing operations and steps taken by personnel 122 at the remote location. These correlations can be used to optimize future operations at the remote location. In another example, the method moves to step 218, where the event signatures are used to determine whether a particular operation has successfully taken place at the remote location for the purpose of approving invoices issued for that operation. The ability to automatically determine whether a particular entity has successfully completed a prescribed operation at the remote location significantly improves the efficiency of processing invoices for that operation.
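The invoice-approval path of step 218 might be sketched as a match between an invoice and the stored event signatures. The field names and the approval rule below are illustrative assumptions only; the disclosure does not specify a record format.

```python
def verify_invoice(invoice, signatures):
    """Approve an invoice only when a matching event signature confirms
    that the billed operation actually took place at the remote location.
    Record field names here are hypothetical."""
    for sig in signatures:
        if (sig["company"] == invoice["company"]
                and sig["operation"] == invoice["operation"]
                and sig["date"] == invoice["date"]):
            return "approved"
    return "needs manual review"

# Event signatures previously stored by the analytics center (step 214).
signatures = [{"company": "Service Company B",
               "operation": "adjust artificial lift",
               "date": "2021-02-15"}]

invoice = {"company": "Service Company B",
           "operation": "adjust artificial lift",
           "date": "2021-02-15",
           "amount": 4200.00}

print(verify_invoice(invoice, signatures))  # -> approved
```

An unmatched invoice falls through to manual review rather than rejection, reflecting that an absent signature may mean missing data rather than an uncompleted operation.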
It is to be understood that even though numerous characteristics and advantages of various embodiments of the present invention have been set forth in the foregoing description, together with details of the structure and functions of various embodiments of the invention, this disclosure is illustrative only, and changes may be made in detail, especially in matters of structure and arrangement of parts within the principles of the present invention to the full extent indicated by the broad general meaning of the terms in which the appended claims are expressed.
This application claims the benefit of U.S. Provisional Patent Application Ser. No. 63/149,659 entitled, “Operation Identification of Sequential Events” filed Feb. 15, 2021, the disclosure of which is herein incorporated by reference.
U.S. Patent Documents

Number | Name | Date | Kind |
---|---|---|---|
7006920 | Newman | Feb 2006 | B2 |
8326538 | Hobbs | Dec 2012 | B2 |
11823517 | Haci | Nov 2023 | B2 |
20080208475 | Karr | Aug 2008 | A1 |
20110128160 | Overholt et al. | Jun 2011 | A1 |
20120101953 | James | Apr 2012 | A1 |
20150363738 | Haci | Dec 2015 | A1 |
20160118085 | Ptitsyn | Apr 2016 | A1 |
20160130917 | Torrione | May 2016 | A1 |
20160307143 | Mongeon | Oct 2016 | A1 |
20160364661 | Hurst | Dec 2016 | A1 |
20170140469 | Finkel et al. | May 2017 | A1 |
20170145807 | Wendorf et al. | May 2017 | A1 |
20170154341 | Gilbertson | Jun 2017 | A1 |
20170285622 | Figoli et al. | Oct 2017 | A1 |
20180006748 | Quezada | Jan 2018 | A1 |
20190018871 | Zheng | Jan 2019 | A1 |
20190078426 | Zheng | Mar 2019 | A1 |
20190197354 | Law | Jun 2019 | A1 |
20190197424 | Grehant | Jun 2019 | A1 |
20190213525 | Haci | Jul 2019 | A1 |
20200110398 | Cella et al. | Apr 2020 | A1 |
20200126386 | Michalopulos et al. | Apr 2020 | A1 |
20200167931 | Ge | May 2020 | A1 |
20200294390 | Rao | Sep 2020 | A1 |
20210342755 | Mossoba | Nov 2021 | A1 |
20210366256 | Michalopulos | Nov 2021 | A1 |
20210398045 | Hanebeck | Dec 2021 | A1 |
20220131867 | Kojima | Apr 2022 | A1 |
20240020813 | Weber | Jan 2024 | A1 |
Foreign Patent Documents

Number | Date | Country |
---|---|---|
2536710 | Sep 2016 | GB |
2015141563 | Aug 2015 | JP |
WO-2013006165 | Jan 2013 | WO |
WO-2022081881 | Apr 2022 | WO |
Other Publications

R. S. Filho et al., “Semi-Autonomous Industrial Robotic Inspection: Remote Methane Detection in Oilfield,” 2018 IEEE International Conference on Edge Computing (EDGE), San Francisco, CA, USA, 2018, pp. 17-24, doi: 10.1109/EDGE.2018.00010.

ISA/US; International Search Report and Written Opinion for PCT/US22/16210, mailed May 13, 2022.

Ellis, Keith, “Wireless Monitoring Saves Hours on Each Well Workover,” Oil & Gas Engineering, https://www.oilandgaseng.com/articles/wireless-monitoring-saves-hours-on-each-well-workover, May 29, 2019.

Saeed, Saad, et al., “Event Detection for Managed-Pressure Drilling: A New Paradigm,” 2012 SPE Annual Technical Conference and Exhibition, San Antonio, TX, Oct. 8, 2012.
Prior Publication Data

Number | Date | Country |
---|---|---|
20220262118 A1 | Aug 2022 | US |
Related U.S. Application Data

Number | Date | Country |
---|---|---|
63149659 | Feb 2021 | US |