Contactless real-time 3D mapping of surface equipment

Information

  • Patent Grant
  • Patent Number
    11,725,504
  • Date Filed
    Monday, May 24, 2021
  • Date Issued
    Tuesday, August 15, 2023
  • Field of Search
    • CPC
    • E21B47/135
    • E21B47/002
    • E21B47/0025
    • G01B11/002
    • G01S17/894
  • International Classifications
    • E21B47/135
    • G01S17/894
    • E21B47/002
    • G01B11/00
    • Term Extension
      37
Abstract
Systems and methods include a computer-implemented method for providing a photonic sensing system to perform an automated method to characterize displacement of equipment surfaces and monitor changes in real-time. A three-dimensional (3D) point cloud of one or more objects is generated by an analysis and presentation system using light information collected through structured light illumination by an array of structured-light sensors (SLSes) directed toward the one or more objects. Generating the point cloud includes defining points of the 3D point cloud that are relative to reference points on the one or more objects. Real-time contactless 3D surface measurements of the one or more objects are performed using the 3D point cloud. Changes in one or more parts of the one or more objects are determined by the analysis and presentation system by analyzing the real-time contactless 3D surface measurements.
Description
TECHNICAL FIELD

The present disclosure applies to determining the condition of equipment.


BACKGROUND

Wellheads and surface structures are subject to complex forces and thermal gradients that cause structural changes or damage including, for example, anisotropic dilation, fatigue, and displacement. These effects may damage the platform, the wellhead, or both. Conventional systems may measure structural changes manually as part of routine inspections, and measurements may be taken only sporadically due to the number of wells and their locations.


SUMMARY

The present disclosure describes techniques for using a photonic sensing system to characterize the structural displacements of wellheads. In some implementations, a computer-implemented method includes the following. A three-dimensional (3D) point cloud of one or more objects is generated by an analysis and presentation system using light information collected through structured light illumination by an array of structured-light sensors (SLSes) directed toward the one or more objects. Generating the point cloud includes defining points of the 3D point cloud that are relative to reference points on the one or more objects. Real-time contactless 3D surface measurements of the one or more objects are performed using the 3D point cloud. Changes in one or more parts of the one or more objects are determined by the analysis and presentation system by analyzing the real-time contactless 3D surface measurements.


The previously described implementation is implementable using a computer-implemented method; a non-transitory, computer-readable medium storing computer-readable instructions to perform the computer-implemented method; and a computer-implemented system including a computer memory interoperably coupled with a hardware processor configured to perform the computer-implemented method, the instructions stored on the non-transitory, computer-readable medium.


The subject matter described in this specification can be implemented in particular implementations, so as to realize one or more of the following advantages. The techniques of the present disclosure using a photonic sensing system can be used to provide a high-accuracy, high-speed, and contactless method to characterize deformations of surface structures, for example, in oil and gas applications. The techniques can also be used to characterize material changes and contamination by means of absorbance or cross-polarized spectrometry. Structured light sensing can be used to characterize deformations in real-time and in a contactless way. The term real-time can correspond to events that occur within a specified period of time, for example, within a few seconds or under one minute. The techniques can be used in the ongoing development of tools and methods for wellhead deformation characterization, for example, in upstream photonics and advanced sensors programs and in wellhead displacement analysis. The techniques of the present disclosure can solve the problems of conventional systems by improving the resolution of results and reducing the acquisition time. The techniques can be used to generate a three-dimensional (3D) point cloud (of equipment surfaces, for example) using structured light. Further, the techniques can be used to: characterize axial, radial, and azimuthal deformations; characterize mechanical tension and strain by analyzing cross-polarized spectra of reflected beams (for example, with the probing beam being elliptically polarized); identify material degradation and contamination by using reflectance spectroscopy; and derive correlations between displacement, temperature, and flow rates. The techniques can be expanded, modified, or customized to characterize displacement of other surface equipment. This provides an advantage over conventional systems that are typically reactive and may be slower to react to problems associated with equipment.


The details of one or more implementations of the subject matter of this specification are set forth in the Detailed Description, the accompanying drawings, and the claims. Other features, aspects, and advantages of the subject matter will become apparent from the Detailed Description, the claims, and the accompanying drawings.





DESCRIPTION OF DRAWINGS


FIG. 1 is a diagram of an example of a photonic sensing system for characterizing displacement, according to some implementations of the present disclosure.



FIG. 2 is a diagram of a photonic sensing system used in a sensing and characterization process, according to some implementations of the present disclosure.



FIG. 3 is a diagram showing examples of locations of beacons in a wellhead, according to some implementations of the present disclosure.



FIG. 4 is a schematic representation of an example structure of a photonic sensor, according to some implementations of the present disclosure.



FIG. 5 is a diagram showing an example of a photonic sensing system used on a wellhead structure, according to some implementations of the present disclosure.



FIGS. 6A and 6B show example top views of a photonic sensing system, according to some implementations of the present disclosure.



FIG. 7 is a flowchart of an example of a method for using real-time contactless three-dimensional (3D) surface measurements to determine changes in one or more parts of one or more objects, according to some implementations of the present disclosure.



FIG. 8 is a block diagram illustrating an example computer system used to provide computational functionalities associated with described algorithms, methods, functions, processes, flows, and procedures as described in the present disclosure, according to some implementations of the present disclosure.





Like reference numbers and designations in the various drawings indicate like elements.


DETAILED DESCRIPTION

The following detailed description describes techniques for providing a photonic sensing system to perform an automated method to characterize displacement of equipment surfaces and monitor changes in real-time. For example, a method and a system can be used for contactless three-dimensional (3D) surface measurement based on structured light illumination. Laser-patterned illumination can be used to obtain a real-time three-dimensional map of surface equipment (for example, wellheads, production tubing, and manifolds). The term real-time can correspond to events that occur within a specified period of time, for example, within a few seconds or under one minute. The real-time map can be used to track the relative change in position of known markers on the surface of an object. The process creates a point cloud that can be used to derive structural properties and changes, such as displacements, deformations, and stress/strain states. In addition, the use of different laser wavelengths or hyperspectral structured light illumination can enable real-time monitoring of material accretion (contamination) and degradation. The techniques can be incorporated into, or used with, monitoring systems used in the petrochemical industry, such as gas operations, including offshore gas wells.


Various modifications, alterations, and permutations of the disclosed implementations can be made and will be readily apparent to those of ordinary skill in the art, and the general principles defined may be applied to other implementations and applications without departing from the scope of the disclosure. In some instances, details unnecessary to obtain an understanding of the described subject matter may be omitted so as not to obscure one or more described implementations with unnecessary detail, inasmuch as such details are within the skill of one of ordinary skill in the art. The present disclosure is not intended to be limited to the described or illustrated implementations, but to be accorded the widest scope consistent with the described principles and features.



FIG. 1 is a diagram of an example of a photonic sensing system 100 for characterizing displacement, according to some implementations of the present disclosure. The photonic sensing system 100 can provide remote sensing and can use a workflow, for example, that enables the real-time, contactless monitoring of the growth 102 of a wellhead 104. The photonic sensing system 100 can include a single sensor 106 or an array (or group) of sensors 106, a data transmission system, a computerized data collection system, and toolboxes (for example, user interfaces 108) for visualization and data analytics. A user interface 108 can present the growth 102 (for example, in inches (in.)), for example, as one or more graphs showing changes in the wellhead growth and a production rate 110 (for example, in million standard cubic feet per day (MMSCFD)) over time 114 (for example, in weeks, months, or years).


The photonic sensing system 100 can include an array of structured-light sensors (SLSes) (for example, the sensors 106) that are directed toward one or more objects, such as equipment at a wellhead 104. A computerized data collection system is configured to collect light information from the array of SLSes. The light information collected from the array of SLSes can be used by an analysis and presentation system for analysis and visualization. A data transmission system can transmit information between the array of SLSes, the computerized data collection system, and the analysis and presentation system.
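
The following is a minimal organizational sketch of the four subsystems described above. All class names, fields, and methods are illustrative assumptions rather than elements of the disclosure, and the vendor acquisition call is only a placeholder.

```python
# Illustrative sketch of the SLS array, data collection, and analysis/presentation
# subsystems; names and signatures are assumptions, not from the disclosure.
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Tuple

import numpy as np


@dataclass
class StructuredLightSensor:
    """One SLS in the array, identified by its mounting position on the rack."""
    sensor_id: int
    position_m: Tuple[float, float, float]

    def acquire_frame(self) -> np.ndarray:
        # Placeholder for a vendor SDK call; expected to return an (n, 4) array
        # of x, y, z coordinates plus one per-point spectral channel.
        raise NotImplementedError("replace with the vendor acquisition call")


@dataclass
class DataCollectionSystem:
    """Computerized data collection: gathers light information from every SLS."""
    sensors: List[StructuredLightSensor]

    def collect(self) -> Dict[int, np.ndarray]:
        return {s.sensor_id: s.acquire_frame() for s in self.sensors}


@dataclass
class AnalysisAndPresentationSystem:
    """Builds the 3D point cloud and hands results to a visualization callback."""
    present: Callable[[np.ndarray], None] = field(default=print)

    def build_point_cloud(self, frames: Dict[int, np.ndarray]) -> np.ndarray:
        # Merge per-sensor frames into a single (n, 4) point cloud.
        cloud = np.vstack(list(frames.values()))
        self.present(cloud)
        return cloud
```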


The spatial resolution of the sensor can be, for example, 100×10⁻⁶ m (100 micrometers, or 100 μm), with a repetition rate on the order of 60 Hz or greater. Data telecommunication can be attained using mobile networks (for example, Global System for Mobile Communications (GSM) or 4G or 5G networks), a mesh wireless network, or fiber optics.



FIG. 2 is a diagram of a photonic sensing system 200 used in a sensing and characterization process, according to some implementations of the present disclosure. The photonic sensing system 200 can generate a point cloud of the wellhead 104, for example. Generating the point cloud can include system features 204, including a structured light sensor system, signal processing, and point cloud generation and storage, including using optical flow (for example, using artificial intelligence (AI)) to determine deformations and transmission to supervisory control and data acquisition (SCADA) systems. The process can take place before integrating with a SCADA system. The process can use post-analysis routines and a machine learning engine. The photonic sensing system 200 can use structured light, multi-laser ranging, laser-array ranging, or laser-patterned beam-array ranging to obtain a point cloud P of n points whose coordinates can be, for example:

$\{\vec{r}_i\}, \quad i = 1, \ldots, n$  (1)

describing the surface of an object, where $\vec{r}_i$ is the position vector of point i and the index i runs from 1 to n. The photonic sensing system 200 can also record k scalar quantities associated with each point of P as a function of illumination wavelength (λ), such that:

$P = \{(x_i, y_i, z_i, f_i^{1}(\lambda), \ldots, f_i^{k}(\lambda))\}$  (2)

Note that $f_i^{l}(x_i, y_i, z_i; \lambda)$ is the l-th scalar quantity at point i (for example, an absorbance function). The device can transmit the point cloud data directly to a central processing authority. Alternatively, the device can incorporate an edge computer to perform analysis in-situ and transfer that data to the central processing authority. In the latter case, the raw data can be stored (either partially or in full) or transmitted as needed. The central processing authority or edge computing system can calculate the optical flow between the point clouds of two subsequent acquisition times. The process can involve calculating the L2 norm between the spatial positions of the same point (referenced to the structure) in two subsequent frames (the displacement potential), and/or the gradient (or displacement field velocity). In addition, the photonic sensing system 200 can also calculate the divergence of the displacement field.
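
As a concrete illustration of these calculations, the sketch below computes the per-point displacement between two subsequent point clouds, its L2 norm (the displacement potential), and a rough divergence estimate. It assumes the two clouds are already matched row-for-row (the same structural point at the same index); the nearest-neighbor divergence estimate is one possible numerical approach, not the specific method of the disclosure.

```python
import numpy as np


def displacement_field(cloud_t0: np.ndarray, cloud_t1: np.ndarray) -> np.ndarray:
    """Per-point displacement vectors between two matched (n, 3) point clouds."""
    return cloud_t1 - cloud_t0


def displacement_potential(cloud_t0: np.ndarray, cloud_t1: np.ndarray) -> np.ndarray:
    """L2 norm of each point's displacement (the 'displacement potential')."""
    return np.linalg.norm(displacement_field(cloud_t0, cloud_t1), axis=1)


def displacement_divergence(cloud_t0: np.ndarray, disp: np.ndarray, k: int = 8) -> np.ndarray:
    """Rough estimate of the divergence of the displacement field.

    For each point, fits a local linear model of the displacement over its
    k nearest neighbors and sums the diagonal of the fitted gradient.
    This is an assumed finite-difference-style estimate for illustration.
    """
    n = cloud_t0.shape[0]
    div = np.zeros(n)
    for i in range(n):
        d2 = np.sum((cloud_t0 - cloud_t0[i]) ** 2, axis=1)
        nbr = np.argsort(d2)[1 : k + 1]
        dx = cloud_t0[nbr] - cloud_t0[i]   # (k, 3) position offsets
        du = disp[nbr] - disp[i]           # (k, 3) displacement offsets
        grad, *_ = np.linalg.lstsq(dx, du, rcond=None)  # (3, 3) local gradient
        div[i] = np.trace(grad)            # sum of du_i/dx_i
    return div
```

Called on the clouds from two subsequent frames, displacement_potential gives the transient per-point displacement and displacement_divergence its divergence.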


The photonic sensing system 200 can provide a point cloud capturing process, analysis, and transmission. Basic analysis can be conducted at the edge by incorporating a neural engine that calculates the optical flow of the point cloud between two subsequent acquisitions. Alternatively, data can be transmitted to a central processing authority. The information can provide the transient change in position/displacement. The data can further be utilized in other analytics toolboxes to correlate and predict flow using the data available through a SCADA system.



FIG. 3 is a diagram showing examples of locations of beacons 302 in a wellhead 300, according to some implementations of the present disclosure. The beacons 302 can be used in the photonic sensing system 200, for example. In some implementations, the beacons 302 (or tracers), depicted as white squares in FIG. 3, can be placed in known positions of the wellhead 300 to serve as references or anchors in the calculations described with reference to FIG. 2.
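
One common way to use such reference beacons (an assumption here, not a requirement of the disclosure) is to rigidly align each new point cloud to the frame defined by the beacons before computing displacements, so residual motion reflects structural change rather than sensor drift. The sketch below uses the standard Kabsch fit and assumes the beacon coordinates have already been extracted from both clouds.

```python
import numpy as np


def align_to_beacons(beacons_ref: np.ndarray, beacons_now: np.ndarray,
                     cloud_now: np.ndarray) -> np.ndarray:
    """Rigidly align a new point cloud to the reference frame defined by beacons.

    beacons_ref, beacons_now: (m, 3) beacon coordinates in the reference and
    current frames; cloud_now: (n, 3) current point cloud. Returns the current
    cloud expressed in the reference frame (Kabsch algorithm).
    """
    mu_ref, mu_now = beacons_ref.mean(axis=0), beacons_now.mean(axis=0)
    H = (beacons_now - mu_now).T @ (beacons_ref - mu_ref)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T                  # proper rotation
    return (cloud_now - mu_now) @ R.T + mu_ref
```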



FIG. 4 is a schematic representation of an example structure of a photonic sensor 400, according to some implementations of the present disclosure. The photonic sensor 400 includes field housing components 402a-402g (“S1” to “S7”). A sensor 402 can be a patterned light sensor and receiver (for example, commercially available). A lens 404 can be made of titanium dioxide/silicon dioxide (TiO2/SiO2), TiO2/diamond, TiO2/IRFS (infrared fused silica glass), or a hierarchical diamond/diamond window, where the TiO2 layer is used for self-cleaning. A beam 406 can be the beam that is projected and reflected to and from the measurement target. A cooler 408 can be a thermoelectric cooler and heat sink with air circulation to maintain the device operation below a threshold temperature, for example, 50 degrees Celsius (C). Air components 410 can include an air pump and conduits. Circulated air can serve at least two purposes: keeping the photonic sensor 400 cool and cleaning the window using ionized air nozzles 412. Power components 414 can include an electric power supply and battery (for example, Lithium (Li)-Ion or hydrogen cell). A housing 416 can house a high-gain, high-temperature photocell, for example, using Gallium Nitride (GaN).


The photonic sensor 400 of FIG. 4 can be implemented with an integrated sensing head or structured-light sensor (SLS) system and field housing. The photonic sensor 400 can use several structured light sensors and laser ranging sensors that are available on the market, for example, Ladimo, Blackrock, or Velodyne, in various combinations. In most cases, the number of devices used can vary with axial or longitudinal resolution requirements. In some implementations, one sensor can be located and used every 30 centimeters (cm) to 50 cm if axial resolution is to be better than 0.1 millimeters (mm). This ratio can depend on the features of the sensor that is used. The use of an array of sensors, for example, can depend on the position of each SLS, as each beam 406 may be occluded from some parts of the wellhead. An onboard edge-computing system can use all-optical or ASIC processing. The photonic sensor 400 can include a self-cleaning and anti-fogging window that helps to ensure prolonged maintenance-free optical monitoring. Two non-mutually-exclusive routes can be used to achieve this, as described in the following two examples.


A first example includes modifying the surface to exhibit at least one of the following atypical behaviors: hydrophobicity, hydrophilicity, oleophobicity, superhydrophobicity, and superoleophobicity. This allows the surface either to completely repel liquids and particles (hydrophobic/oleophobic or superhydrophobic/superoleophobic surfaces) or to spread liquids evenly across the whole surface in order to avoid lensing or diffusive effects while flushing out contaminants (for example, hydrophilic or superhydrophilic surfaces). Implementations using a diamond window can include a deposition of a tailored diamond nanofilm, or diamond-like carbon structures, to exhibit long-lasting self-cleaning and anti-fogging effects while retaining high optical transmittance.


A second example includes the use of TiO2-based photocatalytic surfaces. When ambient ultraviolet light interacts with TiO2, the surface releases active oxygen species. The oxygen radicals combust the organics attached to the surface, thus cleaning it. Photocatalytic surfaces can exhibit anti-fogging and self-cleaning properties as well as antibacterial and antifouling properties.



FIG. 5 is a diagram showing an example of a photonic sensing system 500 used on a wellhead structure 502, according to some implementations of the present disclosure. The photonic sensing system 500 includes an SLS array and projection 504. The SLS array can be mounted on a rack. Each SLS 506 can incorporate a laser meter to determine the distance between each SLS 506 and the ground. These features can make the photonic sensing system 500 capable of measuring absolute displacement with reference to the ground. Reference/beacon markers 508 are shown as white circles on the wellhead structure 502. The photonic sensing system 500 produces a point cloud 510 representing points on the wellhead structure 502.


Each of the SLSes can be mounted on a rack. Each SLS can use a laser ranging system at the bottom of the SLS to monitor the distance to the next measurement device (for example, below the SLS). This enables the process to know the relative position of each measurement device and hence improve the axial/longitudinal displacement characterization.


The horizontal (H) and vertical (V) resolution of structured light sensors can vary depending on the configuration of the photonic sensing system used. Thus, a resolution $\Delta_{h,v}$ can be represented as a function of a distance $r_{\mathrm{SLS}}$ from a circumscribing cylinder to an optical output and the angular resolution $\phi_{h,v}$, for example, given by:

$\Delta_{h,v} = r_{\mathrm{SLS}} \tan \phi_{h,v}$  (3)


A field of view (FOV) of an SLS can typically span, for example, from 170-360 degrees in a horizontal direction and 5-30 degrees in a vertical direction. These spans and the distance to the target can determine the number of SLSes needed in the array configuration.
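
A back-of-the-envelope sizing calculation that combines Equation (3) with the field-of-view figures above could look like the following. The numeric values in the comments are assumed example inputs, and the overlap heuristic for counting sensors is illustrative rather than prescribed.

```python
import math


def lateral_resolution(r_sls_m: float, angular_resolution_deg: float) -> float:
    """Equation (3): delta = r_sls * tan(phi) for the horizontal or vertical direction."""
    return r_sls_m * math.tan(math.radians(angular_resolution_deg))


def sensors_for_coverage(target_span_deg: float, fov_deg: float,
                         overlap_fraction: float = 0.2) -> int:
    """Rough count of SLSes needed to cover an angular span with some overlap.

    A simple sizing heuristic (assumed, not from the disclosure): each sensor
    contributes its field of view minus the requested overlap.
    """
    effective_fov = fov_deg * (1.0 - overlap_fraction)
    return math.ceil(target_span_deg / effective_fov)


# Assumed example: a sensor 1.5 m from the circumscribing cylinder with
# 0.01-degree angular resolution gives a lateral resolution of ~0.26 mm.
print(lateral_resolution(1.5, 0.01))        # ~2.6e-4 m
# Covering the full 360-degree span with sensors having a 170-degree FOV.
print(sensors_for_coverage(360.0, 170.0))   # number of SLSes around the structure
```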



FIGS. 6A and 6B show example top views 600 and 602 of a photonic sensing system, according to some implementations of the present disclosure. The top views 600 and 602 show a wellhead structure 604 and an SLS array 606, with examples of recommended positions of remote structured light sensors (RSLSes) as seen from the top. The wellhead structure 604 has a maximum radius 608, including the arms of the wellhead. The long and short axes of the parabola can be chosen such that the pattern projected by each beam on the structure covers a pre-determined area, for example, a 30 centimeter (cm) to 50 cm wide plane that is tangent to the cylinder that circumscribes the wellhead structure 604. The suggested positions can be chosen to maximize view and characterization.


The photonic sensing system can also be used to characterize stresses along a structure (for example, the wellhead structure 604). In this case, the photonic sensing system can be modified to perform photonic stress-analysis tomography. In this setting, the output beam of the analyzer can be elliptically polarized using variable or permanent wavelength retarders. The polarization can also be built into the laser system that provides the beam. This polarization step is done prior to the creation of the pattern. The structure, completely or in part, is covered with a birefringent film (for example, an epoxy or oil transparent to the wavelength used in the source). The reflected beam is then separated into its base polarizations, and the intensity patterns for each polarization are compared. The resulting polarization and spectral distribution can be used to derive the stress map according to the strain/stress-optical law.
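
For reference, one common form of the stress-optical law assumed here (standard photoelasticity relations, not quoted from the disclosure) relates the induced birefringence of the coating to the in-plane principal stress difference:

```latex
% Standard photoelasticity relations (assumed common form): C is the
% stress-optic coefficient of the birefringent film, N the observed fringe
% order, f_sigma the material fringe value, and t the film thickness.
\[
  n_1 - n_2 = C\,(\sigma_1 - \sigma_2),
  \qquad
  \sigma_1 - \sigma_2 = \frac{N\, f_\sigma}{t}.
\]
```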



FIG. 7 is a flowchart of an example of a method 700 for using real-time contactless 3D surface measurements to determine changes in one or more parts of one or more objects, according to some implementations of the present disclosure. For clarity of presentation, the description that follows generally describes method 700 in the context of the other figures in this description. However, it will be understood that method 700 can be performed, for example, by any suitable system, environment, software, and hardware, or a combination of systems, environments, software, and hardware, as appropriate. In some implementations, various steps of method 700 can be run in parallel, in combination, in loops, or in any order.


At 702, a 3D point cloud of the one or more objects is generated by an analysis and presentation system using light information collected through structured light illumination by an array of SLSes directed toward the one or more objects. The SLSes can include laser-patterned illumination devices, for example. Generating the point cloud includes defining points of the 3D point cloud that are relative to reference points on the one or more objects. The one or more objects can include equipment used in petrochemical industry operations. The equipment can include one or more of wellheads, production tubing, and manifolds. The reference points can be markers positioned on specific locations on the equipment. The SLSes can provide a spatial resolution, for example, of 100 micrometers and a repetition rate, for example, greater than or equal to 60 Hertz (Hz). The SLSes can determine, for each point in the point cloud, an x,y,z,λ value, where x,y,z are 3D spatial coordinates and λ is an illumination wavelength at the 3D spatial coordinates. In some implementations, the analysis and presentation system can include analytics tools to correlate and predict flow using data available through a supervisory control and data acquisition (SCADA) system. From 702, method 700 proceeds to 704.


At 704, real-time contactless 3D surface measurements of the one or more objects are performed using the 3D point cloud. From 704, method 700 proceeds to 706.


At 706, changes in one or more parts of the one or more objects are determined by an analysis and presentation system by analyzing the real-time contactless 3D surface measurements. As an example, determining the changes in the one or more parts of the one or more objects can include deriving changes in structural properties including displacements, deformations, stress/strain states, and material accretion/contamination and degradation. After 706, method 700 can stop.
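
Putting steps 702 through 706 together, a single monitoring pass might be sketched as below. It reuses the illustrative helpers sketched earlier in this description (align_to_beacons, displacement_field, displacement_divergence) and assumes each frame resolves the same structural points so the clouds are matched row-for-row; the hand-off to downstream analytics or SCADA is left as the returned dictionary.

```python
import numpy as np


def monitoring_step(reference_cloud: np.ndarray,
                    beacons_ref: np.ndarray,
                    frames: dict,
                    beacons_now: np.ndarray) -> dict:
    """One pass of method 700: generate the cloud (702), measure (704), analyze (706)."""
    # 702: merge per-sensor frames into a single 3D point cloud (x, y, z columns),
    # with points defined relative to the beacon reference points.
    cloud_now = np.vstack([f[:, :3] for f in frames.values()])
    cloud_now = align_to_beacons(beacons_ref, beacons_now, cloud_now)

    # 704/706: contactless surface measurements and change determination.
    disp = displacement_field(reference_cloud, cloud_now)
    return {
        "displacement": disp,                              # per-point vectors
        "potential": np.linalg.norm(disp, axis=1),         # L2 displacement norm
        "divergence": displacement_divergence(reference_cloud, disp),
    }
```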



FIG. 8 is a block diagram of an example computer system 800 used to provide computational functionalities associated with described algorithms, methods, functions, processes, flows, and procedures described in the present disclosure, according to some implementations of the present disclosure. The illustrated computer 802 is intended to encompass any computing device such as a server, a desktop computer, a laptop/notebook computer, a wireless data port, a smart phone, a personal data assistant (PDA), a tablet computing device, or one or more processors within these devices, including physical instances, virtual instances, or both. The computer 802 can include input devices such as keypads, keyboards, and touch screens that can accept user information. Also, the computer 802 can include output devices that can convey information associated with the operation of the computer 802. The information can include digital data, visual data, audio information, or a combination of information. The information can be presented in a graphical user interface (UI) (or GUI).


The computer 802 can serve in a role as a client, a network component, a server, a database, a persistency, or components of a computer system for performing the subject matter described in the present disclosure. The illustrated computer 802 is communicably coupled with a network 830. In some implementations, one or more components of the computer 802 can be configured to operate within different environments, including cloud-computing-based environments, local environments, global environments, and combinations of environments.


At a top level, the computer 802 is an electronic computing device operable to receive, transmit, process, store, and manage data and information associated with the described subject matter. According to some implementations, the computer 802 can also include, or be communicably coupled with, an application server, an email server, a web server, a caching server, a streaming data server, or a combination of servers.


The computer 802 can receive requests over network 830 from a client application (for example, executing on another computer 802). The computer 802 can respond to the received requests by processing the received requests using software applications. Requests can also be sent to the computer 802 from internal users (for example, from a command console), external (or third) parties, automated applications, entities, individuals, systems, and computers.


Each of the components of the computer 802 can communicate using a system bus 803. In some implementations, any or all of the components of the computer 802, including hardware or software components, can interface with each other or the interface 804 (or a combination of both) over the system bus 803. Interfaces can use an application programming interface (API) 812, a service layer 813, or a combination of the API 812 and service layer 813. The API 812 can include specifications for routines, data structures, and object classes. The API 812 can be either computer-language independent or dependent. The API 812 can refer to a complete interface, a single function, or a set of APIs.


The service layer 813 can provide software services to the computer 802 and other components (whether illustrated or not) that are communicably coupled to the computer 802. The functionality of the computer 802 can be accessible for all service consumers using this service layer. Software services, such as those provided by the service layer 813, can provide reusable, defined functionalities through a defined interface. For example, the interface can be software written in JAVA, C++, or a language providing data in extensible markup language (XML) format. While illustrated as an integrated component of the computer 802, in alternative implementations, the API 812 or the service layer 813 can be stand-alone components in relation to other components of the computer 802 and other components communicably coupled to the computer 802. Moreover, any or all parts of the API 812 or the service layer 813 can be implemented as child or sub-modules of another software module, enterprise application, or hardware module without departing from the scope of the present disclosure.


The computer 802 includes an interface 804. Although illustrated as a single interface 804 in FIG. 8, two or more interfaces 804 can be used according to particular needs, desires, or particular implementations of the computer 802 and the described functionality. The interface 804 can be used by the computer 802 for communicating with other systems that are connected to the network 830 (whether illustrated or not) in a distributed environment. Generally, the interface 804 can include, or be implemented using, logic encoded in software or hardware (or a combination of software and hardware) operable to communicate with the network 830. More specifically, the interface 804 can include software supporting one or more communication protocols associated with communications. As such, the network 830 or the interface's hardware can be operable to communicate physical signals within and outside of the illustrated computer 802.


The computer 802 includes a processor 805. Although illustrated as a single processor 805 in FIG. 8, two or more processors 805 can be used according to particular needs, desires, or particular implementations of the computer 802 and the described functionality. Generally, the processor 805 can execute instructions and can manipulate data to perform the operations of the computer 802, including operations using algorithms, methods, functions, processes, flows, and procedures as described in the present disclosure.


The computer 802 also includes a database 806 that can hold data for the computer 802 and other components connected to the network 830 (whether illustrated or not). For example, database 806 can be an in-memory database, a conventional database, or another type of database storing data consistent with the present disclosure. In some implementations, database 806 can be a combination of two or more different database types (for example, hybrid in-memory and conventional databases) according to particular needs, desires, or particular implementations of the computer 802 and the described functionality. Although illustrated as a single database 806 in FIG. 8, two or more databases (of the same, different, or combination of types) can be used according to particular needs, desires, or particular implementations of the computer 802 and the described functionality. While database 806 is illustrated as an internal component of the computer 802, in alternative implementations, database 806 can be external to the computer 802.


The computer 802 also includes a memory 807 that can hold data for the computer 802 or a combination of components connected to the network 830 (whether illustrated or not). Memory 807 can store any data consistent with the present disclosure. In some implementations, memory 807 can be a combination of two or more different types of memory (for example, a combination of semiconductor and magnetic storage) according to particular needs, desires, or particular implementations of the computer 802 and the described functionality. Although illustrated as a single memory 807 in FIG. 8, two or more memories 807 (of the same, different, or combination of types) can be used according to particular needs, desires, or particular implementations of the computer 802 and the described functionality. While memory 807 is illustrated as an internal component of the computer 802, in alternative implementations, memory 807 can be external to the computer 802.


The application 808 can be an algorithmic software engine providing functionality according to particular needs, desires, or particular implementations of the computer 802 and the described functionality. For example, application 808 can serve as one or more components, modules, or applications. Further, although illustrated as a single application 808, the application 808 can be implemented as multiple applications 808 on the computer 802. In addition, although illustrated as internal to the computer 802, in alternative implementations, the application 808 can be external to the computer 802.


The computer 802 can also include a power supply 814. The power supply 814 can include a rechargeable or non-rechargeable battery that can be configured to be either user- or non-user-replaceable. In some implementations, the power supply 814 can include power-conversion and management circuits, including recharging, standby, and power management functionalities. In some implementations, the power-supply 814 can include a power plug to allow the computer 802 to be plugged into a wall socket or a power source to, for example, power the computer 802 or recharge a rechargeable battery.


There can be any number of computers 802 associated with, or external to, a computer system containing computer 802, with each computer 802 communicating over network 830. Further, the terms “client,” “user,” and other appropriate terminology can be used interchangeably, as appropriate, without departing from the scope of the present disclosure. Moreover, the present disclosure contemplates that many users can use one computer 802 and one user can use multiple computers 802.


Described implementations of the subject matter can include one or more features, alone or in combination.


For example, in a first implementation, a computer-implemented method includes the following. A three-dimensional (3D) point cloud of one or more objects is generated by an analysis and presentation system using light information collected through structured light illumination by an array of structured-light sensors (SLSes) directed toward the one or more objects. Generating the point cloud includes defining points of the 3D point cloud that are relative to reference points on the one or more objects. Real-time contactless 3D surface measurements of the one or more objects are performed using the 3D point cloud. Changes in one or more parts of the one or more objects are determined by the analysis and presentation system by analyzing the real-time contactless 3D surface measurements.


The foregoing and other described implementations can each, optionally, include one or more of the following features:


A first feature, combinable with any of the following features, where the one or more objects include equipment used in petrochemical industry operations, where the equipment includes one or more of wellheads, production tubing, and manifolds, and where the reference points are markers positioned on specific locations on the equipment.


A second feature, combinable with any of the previous or following features, where the SLSes include laser-patterned illumination devices.


A third feature, combinable with any of the previous or following features, where determining the changes in the one or more parts of the one or more objects includes deriving changes in structural properties including displacements, deformations, stress/strain states, and material accretion/contamination and degradation.


A fourth feature, combinable with any of the previous or following features, where the SLSes provide a spatial resolution of 100 micrometers and a repetition rate greater than or equal to 60 Hertz (Hz).


A fifth feature, combinable with any of the previous or following features, where the SLSes determine, for each point in the point cloud, an x,y,z,λ value, and where x,y,z are 3D spatial coordinates and λ is an illumination wavelength at the 3D spatial coordinates.


A sixth feature, combinable with any of the previous or following features, where the analysis and presentation system includes analytics tools to correlate and predict flow using data available through a supervisory control and data acquisition (SCADA) system.


In a second implementation, a non-transitory, computer-readable medium stores one or more instructions executable by a computer system to perform operations including the following. A three-dimensional (3D) point cloud of one or more objects is generated by an analysis and presentation system using light information collected through structured light illumination by an array of structured-light sensors (SLSes) directed toward the one or more objects. Generating the point cloud includes defining points of the 3D point cloud that are relative to reference points on the one or more objects. Real-time contactless 3D surface measurements of the one or more objects are performed using the 3D point cloud. Changes in one or more parts of the one or more objects are determined by the analysis and presentation system by analyzing the real-time contactless 3D surface measurements.


The foregoing and other described implementations can each, optionally, include one or more of the following features:


A first feature, combinable with any of the following features, where the one or more objects include equipment used in petrochemical industry operations, where the equipment includes one or more of wellheads, production tubing, and manifolds, and where the reference points are markers positioned on specific locations on the equipment.


A second feature, combinable with any of the previous or following features, where the SLSes include laser-patterned illumination devices.


A third feature, combinable with any of the previous or following features, where determining the changes in the one or more parts of the one or more objects includes deriving changes in structural properties including displacements, deformations, stress/strain states, and material accretion/contamination and degradation.


A fourth feature, combinable with any of the previous or following features, where the SLSes provide a spatial resolution of 100 micrometers and a repetition rate greater than or equal to 60 Hertz (Hz).


A fifth feature, combinable with any of the previous or following features, where the SLSes determine, for each point in the point cloud, an x,y,z,λ value, and where x,y,z are 3D spatial coordinates and λ is an illumination wavelength at the 3D spatial coordinates.


A sixth feature, combinable with any of the previous or following features, where the analysis and presentation system includes analytics tools to correlate and predict flow using data available through a supervisory control and data acquisition (SCADA) system.


In a third implementation, a computer-implemented system includes: an array of structured-light sensors (SLSes) directed toward one or more objects; a computerized data collection system configured to collect light information from the array of SLSes; an analysis and presentation system configured to provide analysis and visualization of the light information collected from the array of SLSes; and a data transmission system for transmitting information between the array of SLSes, the computerized data collection system, and the analysis and presentation system. The computer-implemented system includes one or more processors and a non-transitory computer-readable storage medium coupled to the one or more processors and storing programming instructions for execution by the one or more processors. The programming instructions instruct the one or more processors to perform operations including the following. A three-dimensional (3D) point cloud of one or more objects is generated by an analysis and presentation system using light information collected through structured light illumination by an array of structured-light sensors (SLSes) directed toward the one or more objects. Generating the point cloud includes defining points of the 3D point cloud that are relative to reference points on the one or more objects. Real-time contactless 3D surface measurements of the one or more objects are performed using the 3D point cloud. Changes in one or more parts of the one or more objects are determined by the analysis and presentation system by analyzing the real-time contactless 3D surface measurements.


The foregoing and other described implementations can each, optionally, include one or more of the following features:


A first feature, combinable with any of the following features, where the one or more objects include equipment used in petrochemical industry operations, where the equipment includes one or more of wellheads, production tubing, and manifolds, and where the reference points are markers positioned on specific locations on the equipment.


A second feature, combinable with any of the previous or following features, where the SLSes include laser-patterned illumination devices.


A third feature, combinable with any of the previous or following features, where determining the changes in the one or more parts of the one or more objects includes deriving changes in structural properties including displacements, deformations, stress/strain states, and material accretion/contamination and degradation.


A fourth feature, combinable with any of the previous or following features, where the SLSes provide a spatial resolution of 100 micrometers and a repetition rate greater than or equal to 60 Hertz (Hz).


A fifth feature, combinable with any of the previous or following features, where the SLSes determine, for each point in the point cloud, an x,y,z,λ value, and where x,y,z are 3D spatial coordinates and λ is an illumination wavelength at the 3D spatial coordinates.


Implementations of the subject matter and the functional operations described in this specification can be implemented in digital electronic circuitry, in tangibly embodied computer software or firmware, in computer hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them. Software implementations of the described subject matter can be implemented as one or more computer programs. Each computer program can include one or more modules of computer program instructions encoded on a tangible, non-transitory, computer-readable computer-storage medium for execution by, or to control the operation of, data processing apparatus. Alternatively, or additionally, the program instructions can be encoded in/on an artificially generated propagated signal. For example, the signal can be a machine-generated electrical, optical, or electromagnetic signal that is generated to encode information for transmission to a suitable receiver apparatus for execution by a data processing apparatus. The computer-storage medium can be a machine-readable storage device, a machine-readable storage substrate, a random or serial access memory device, or a combination of computer-storage mediums.


The terms “data processing apparatus,” “computer,” and “electronic computer device” (or equivalent as understood by one of ordinary skill in the art) refer to data processing hardware. For example, a data processing apparatus can encompass all kinds of apparatuses, devices, and machines for processing data, including by way of example, a programmable processor, a computer, or multiple processors or computers. The apparatus can also include special purpose logic circuitry including, for example, a central processing unit (CPU), a field-programmable gate array (FPGA), or an application-specific integrated circuit (ASIC). In some implementations, the data processing apparatus or special purpose logic circuitry (or a combination of the data processing apparatus or special purpose logic circuitry) can be hardware- or software-based (or a combination of both hardware- and software-based). The apparatus can optionally include code that creates an execution environment for computer programs, for example, code that constitutes processor firmware, a protocol stack, a database management system, an operating system, or a combination of execution environments. The present disclosure contemplates the use of data processing apparatuses with or without conventional operating systems, such as LINUX, UNIX, WINDOWS, MAC OS, ANDROID, or IOS.


A computer program, which can also be referred to or described as a program, software, a software application, a module, a software module, a script, or code, can be written in any form of programming language. Programming languages can include, for example, compiled languages, interpreted languages, declarative languages, or procedural languages. Programs can be deployed in any form, including as stand-alone programs, modules, components, subroutines, or units for use in a computing environment. A computer program can, but need not, correspond to a file in a file system. A program can be stored in a portion of a file that holds other programs or data, for example, one or more scripts stored in a markup language document, in a single file dedicated to the program in question, or in multiple coordinated files storing one or more modules, sub-programs, or portions of code. A computer program can be deployed for execution on one computer or on multiple computers that are located, for example, at one site or distributed across multiple sites that are interconnected by a communication network. While portions of the programs illustrated in the various figures may be shown as individual modules that implement the various features and functionality through various objects, methods, or processes, the programs can instead include a number of sub-modules, third-party services, components, and libraries. Conversely, the features and functionality of various components can be combined into single components as appropriate. Thresholds used to make computational determinations can be statically, dynamically, or both statically and dynamically determined.


The methods, processes, or logic flows described in this specification can be performed by one or more programmable computers executing one or more computer programs to perform functions by operating on input data and generating output. The methods, processes, or logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, for example, a CPU, an FPGA, or an ASIC.


Computers suitable for the execution of a computer program can be based on one or more of general and special purpose microprocessors and other kinds of CPUs. The elements of a computer are a CPU for performing or executing instructions and one or more memory devices for storing instructions and data. Generally, a CPU can receive instructions and data from (and write data to) a memory.


Graphics processing units (GPUs) can also be used in combination with CPUs. The GPUs can provide specialized processing that occurs in parallel to processing performed by CPUs. The specialized processing can include artificial intelligence (AI) applications and processing, for example. GPUs can be used in GPU clusters or in multi-GPU computing.


A computer can include, or be operatively coupled to, one or more mass storage devices for storing data. In some implementations, a computer can receive data from, and transfer data to, the mass storage devices including, for example, magnetic, magneto-optical disks, or optical disks. Moreover, a computer can be embedded in another device, for example, a mobile telephone, a personal digital assistant (PDA), a mobile audio or video player, a game console, a global positioning system (GPS) receiver, or a portable storage device such as a universal serial bus (USB) flash drive.


Computer-readable media (transitory or non-transitory, as appropriate) suitable for storing computer program instructions and data can include all forms of permanent/non-permanent and volatile/non-volatile memory, media, and memory devices. Computer-readable media can include, for example, semiconductor memory devices such as random access memory (RAM), read-only memory (ROM), phase change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), and flash memory devices. Computer-readable media can also include, for example, magnetic devices such as tape, cartridges, cassettes, and internal/removable disks. Computer-readable media can also include magneto-optical disks and optical memory devices and technologies including, for example, digital video disc (DVD), CD-ROM, DVD+/−R, DVD-RAM, DVD-ROM, HD-DVD, and BLU-RAY. The memory can store various objects or data, including caches, classes, frameworks, applications, modules, backup data, jobs, web pages, web page templates, data structures, database tables, repositories, and dynamic information. Types of objects and data stored in memory can include parameters, variables, algorithms, instructions, rules, constraints, and references. Additionally, the memory can include logs, policies, security or access data, and reporting files. The processor and the memory can be supplemented by, or incorporated into, special purpose logic circuitry.


Implementations of the subject matter described in the present disclosure can be implemented on a computer having a display device for providing interaction with a user, including displaying information to (and receiving input from) the user. Types of display devices can include, for example, a cathode ray tube (CRT), a liquid crystal display (LCD), a light-emitting diode (LED), and a plasma monitor. Display devices can include a keyboard and pointing devices including, for example, a mouse, a trackball, or a trackpad. User input can also be provided to the computer through the use of a touchscreen, such as a tablet computer surface with pressure sensitivity or a multi-touch screen using capacitive or electric sensing. Other kinds of devices can be used to provide for interaction with a user, including to receive user feedback including, for example, sensory feedback including visual feedback, auditory feedback, or tactile feedback. Input from the user can be received in the form of acoustic, speech, or tactile input. In addition, a computer can interact with a user by sending documents to, and receiving documents from, a device that the user uses. For example, the computer can send web pages to a web browser on a user's client device in response to requests received from the web browser.


The term “graphical user interface,” or “GUI,” can be used in the singular or the plural to describe one or more graphical user interfaces and each of the displays of a particular graphical user interface. Therefore, a GUI can represent any graphical user interface, including, but not limited to, a web browser, a touch-screen, or a command line interface (CLI) that processes information and efficiently presents the information results to the user. In general, a GUI can include a plurality of user interface (UI) elements, some or all associated with a web browser, such as interactive fields, pull-down lists, and buttons. These and other UI elements can be related to or represent the functions of the web browser.


Implementations of the subject matter described in this specification can be implemented in a computing system that includes a back-end component, for example, as a data server, or that includes a middleware component, for example, an application server. Moreover, the computing system can include a front-end component, for example, a client computer having one or both of a graphical user interface or a Web browser through which a user can interact with the computer. The components of the system can be interconnected by any form or medium of wireline or wireless digital data communication (or a combination of data communication) in a communication network. Examples of communication networks include a local area network (LAN), a radio access network (RAN), a metropolitan area network (MAN), a wide area network (WAN), Worldwide Interoperability for Microwave Access (WIMAX), a wireless local area network (WLAN) (for example, using 802.11 a/b/g/n or 802.20 or a combination of protocols), all or a portion of the Internet, or any other communication system or systems at one or more locations (or a combination of communication networks). The network can communicate with, for example, Internet Protocol (IP) packets, frame relay frames, asynchronous transfer mode (ATM) cells, voice, video, data, or a combination of communication types between network addresses.


The computing system can include clients and servers. A client and server can generally be remote from each other and can typically interact through a communication network. The relationship of client and server can arise by virtue of computer programs running on the respective computers and having a client-server relationship.


Cluster file systems can be any file system type accessible from multiple servers for read and update. Locking or consistency tracking may not be necessary since the locking of the exchange file system can be done at the application layer. Furthermore, Unicode data files can be different from non-Unicode data files.


While this specification contains many specific implementation details, these should not be construed as limitations on the scope of what may be claimed, but rather as descriptions of features that may be specific to particular implementations. Certain features that are described in this specification in the context of separate implementations can also be implemented, in combination, in a single implementation. Conversely, various features that are described in the context of a single implementation can also be implemented in multiple implementations, separately, or in any suitable sub-combination. Moreover, although previously described features may be described as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can, in some cases, be excised from the combination, and the claimed combination may be directed to a sub-combination or variation of a sub-combination.


Particular implementations of the subject matter have been described. Other implementations, alterations, and permutations of the described implementations are within the scope of the following claims as will be apparent to those skilled in the art. While operations are depicted in the drawings or claims in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed (some operations may be considered optional), to achieve desirable results. In certain circumstances, multitasking or parallel processing (or a combination of multitasking and parallel processing) may be advantageous and performed as deemed appropriate.


Moreover, the separation or integration of various system modules and components in the previously described implementations should not be understood as requiring such separation or integration in all implementations. It should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.


Accordingly, the previously described example implementations do not define or constrain the present disclosure. Other changes, substitutions, and alterations are also possible without departing from the spirit and scope of the present disclosure.


Furthermore, any claimed implementation is considered to be applicable to at least a computer-implemented method; a non-transitory, computer-readable medium storing computer-readable instructions to perform the computer-implemented method; and a computer system including a computer memory interoperably coupled with a hardware processor configured to perform the computer-implemented method or the instructions stored on the non-transitory, computer-readable medium.

Claims
  • 1. A system, comprising: an array of structured-light sensors (SLSes) directed toward one or more objects; a computerized data collection system configured to collect light information from the array of SLSes; an analysis and presentation system configured to provide analysis and visualization of the light information collected from the array of SLSes; a data transmission system for transmitting information between the array of SLSes, the computerized data collection system, and the analysis and presentation system; one or more processors; and a non-transitory computer-readable storage medium coupled to the one or more processors and storing programming instructions for execution by the one or more processors, the programming instructions instructing the one or more processors to perform operations comprising: generating, by the analysis and presentation system using light information collected through structured light illumination by the array of SLSes, a three-dimensional (3D) point cloud of the one or more objects, including defining points of the 3D point cloud that are relative to reference points on the one or more objects; performing, using the 3D point cloud, real-time contactless 3D surface measurements of the one or more objects; and determining, by the analysis and presentation system by analyzing the real-time contactless 3D surface measurements, material changes, damage, deformations, mechanical tension and strain, material degradation and contamination, and correlations between displacement, temperature, and flow rates in one or more parts of the one or more objects.
  • 2. The system of claim 1, wherein the one or more objects include equipment used in petrochemical industry operations, wherein the equipment includes one or more of wellheads, production tubing, and manifolds, and wherein the reference points are markers positioned on specific locations on the equipment.
  • 3. The system of claim 1, wherein the SLSes include laser-patterned illumination devices.
  • 4. The system of claim 1, wherein the SLSes provide a spatial resolution of 100 micrometers and a repetition rate greater than or equal to 60 Hertz (Hz).
  • 5. The system of claim 1, wherein the SLSes determine, for each point in the point cloud, an x,y,z,λ value, and wherein x,y,z are 3D spatial coordinates and λ is an illumination wavelength at the 3D spatial coordinates.
  • 6. The system of claim 1, wherein the analysis and presentation system includes analytics tools to correlate and predict flow using data available through a supervisory control and data acquisition (SCADA) system.
  • 7. A computer-implemented method, comprising:
        generating, by an analysis and presentation system using light information collected through structured light illumination by an array of structured-light sensors (SLSes) directed toward one or more objects, a three-dimensional (3D) point cloud of the one or more objects, including defining points of the 3D point cloud that are relative to reference points on the one or more objects;
        performing, using the 3D point cloud, real-time contactless 3D surface measurements of the one or more objects; and
        determining, by the analysis and presentation system by analyzing the real-time contactless 3D surface measurements, material changes, damage, deformations, mechanical tension and strain, material degradation and contamination, and correlations between displacement, temperature, and flow rates in one or more parts of the one or more objects.
  • 8. The computer-implemented method of claim 7, wherein the one or more objects include equipment used in petrochemical industry operations, wherein the equipment includes one or more of wellheads, production tubing, and manifolds, and wherein the reference points are markers positioned on specific locations on the equipment.
  • 9. The computer-implemented method of claim 7, wherein the SLSes include laser-patterned illumination devices.
  • 10. The computer-implemented method of claim 7, wherein the SLSes provide a spatial resolution of 100 micrometers and a repetition rate greater than or equal to 60 Hertz (Hz).
  • 11. The computer-implemented method of claim 7, wherein the SLSes determine, for each point in the point cloud, an x,y,z,λ value, and wherein x,y,z are 3D spatial coordinates and λ is an illumination wavelength at the 3D spatial coordinates.
  • 12. The computer-implemented method of claim 7, wherein the analysis and presentation system includes analytics tools to correlate and predict flow using data available through a supervisory control and data acquisition (SCADA) system.
  • 13. A non-transitory, computer-readable medium storing one or more instructions executable by a computer system to perform operations comprising:
        generating, by an analysis and presentation system using light information collected through structured light illumination by an array of structured-light sensors (SLSes) directed toward one or more objects, a three-dimensional (3D) point cloud of the one or more objects, including defining points of the 3D point cloud that are relative to reference points on the one or more objects;
        performing, using the 3D point cloud, real-time contactless 3D surface measurements of the one or more objects; and
        determining, by the analysis and presentation system by analyzing the real-time contactless 3D surface measurements, material changes, damage, deformations, mechanical tension and strain, material degradation and contamination, and correlations between displacement, temperature, and flow rates in one or more parts of the one or more objects.
  • 14. The non-transitory, computer-readable medium of claim 13, wherein the one or more objects include equipment used in petrochemical industry operations, wherein the equipment includes one or more of wellheads, production tubing, and manifolds, and wherein the reference points are markers positioned on specific locations on the equipment.
  • 15. The non-transitory, computer-readable medium of claim 13, wherein the SLSes include laser-patterned illumination devices.
  • 16. The non-transitory, computer-readable medium of claim 13, wherein the SLSes provide a spatial resolution of 100 micrometers and a repetition rate greater than or equal to 60 Hertz (Hz).
  • 17. The non-transitory, computer-readable medium of claim 13, wherein the SLSes determine, for each point in the point cloud, an x,y,z,λ value, and wherein x,y,z are 3D spatial coordinates and λ is an illumination wavelength at the 3D spatial coordinates.
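
Claims 1, 7, and 13 recite the same core pipeline: generate a structured-light point cloud whose points are defined relative to reference points on the one or more objects, perform real-time contactless 3D surface measurements from that cloud, and analyze the measurements for changes. The following minimal Python sketch is an illustration only, under stated assumptions, of how such an analysis step might be organized: it assumes a hypothetical marker-based rigid alignment (Kabsch) and a 100-micrometer change threshold borrowed from the spatial resolution recited in claims 4, 10, and 16. The function names, array layout, and synthetic data are assumptions for illustration and do not describe the claimed implementation.

    import numpy as np

    def align_to_references(cloud_xyz, refs_measured, refs_nominal):
        # Rigid (Kabsch) alignment of the measured cloud so that the measured
        # reference markers coincide with their nominal positions; displacements
        # are then expressed relative to the reference points, not the sensor frame.
        c_meas = refs_measured.mean(axis=0)
        c_nom = refs_nominal.mean(axis=0)
        h = (refs_measured - c_meas).T @ (refs_nominal - c_nom)
        u, _, vt = np.linalg.svd(h)
        d = np.sign(np.linalg.det(vt.T @ u.T))
        r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
        return (cloud_xyz - c_meas) @ r.T + c_nom

    def flag_changes(baseline_xyz, current_xyz, threshold_m=100e-6):
        # Per-point displacement between two aligned clouds with identical point
        # ordering; points moving more than the threshold (100 micrometers here,
        # matching the spatial resolution recited in claims 4, 10, and 16) are flagged.
        disp = np.linalg.norm(current_xyz - baseline_xyz, axis=1)
        return disp, disp > threshold_m

    # Synthetic example: each point is stored as (x, y, z, λ), as in claims 5, 11, and 17.
    rng = np.random.default_rng(seed=0)
    xyz = rng.uniform(0.0, 0.5, size=(1000, 3))         # positions in meters
    lam = np.full((1000, 1), 850e-9)                    # illumination wavelength per point
    baseline = np.hstack([xyz, lam])
    refs_nominal = baseline[:4, :3]                     # four hypothetical markers on the equipment
    current = baseline.copy()
    current[500:, 2] += 200e-6                          # simulate a 200-micrometer axial shift
    aligned = align_to_references(current[:, :3], current[:4, :3], refs_nominal)
    disp, changed = flag_changes(baseline[:, :3], aligned)
    print(f"{int(changed.sum())} of {len(changed)} points exceed the 100-micrometer threshold")

Aligning the current scan to the reference markers before differencing means displacements are reported relative to the reference points rather than the sensor frame, which mirrors the claim language about points of the 3D point cloud that are relative to reference points on the one or more objects.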
Related Publications (1)
Number Date Country
20220372868 A1 Nov 2022 US