In the oil and gas industry, daily operations include real-world tasks that occur in the field, such as drilling and producing hydrocarbons. Real-time augmented reality lies at the intersection of these real-world tasks and technology. Technology now advances at a rapid rate, allowing it to enter traditionally manual realms. One such realm is vision itself, that is, how people view the world.
In general, in one aspect, embodiments relate to a method for augmenting an immediate user task. The method includes obtaining role information identifying a role of a user within an oilfield company. The user is performing oilfield operations in a field. The method further includes identifying a current location of the user in the field to identify the immediate user task being performed by the user in the field, defining, using the role information, a user perspective of the user, selecting metadata corresponding to the user perspective to obtain selected metadata, and presenting the selected metadata to the user.
Other aspects will be apparent from the following detailed description and the appended claims.
Specific embodiments will now be described in detail with reference to the accompanying figures. Like elements in the various figures are denoted by like reference numerals for consistency.
In the following detailed description of embodiments, numerous specific details are set forth in order to provide a more thorough understanding. However, it will be apparent to one of ordinary skill in the art that embodiments may be practiced without these specific details. In other instances, well-known features have not been described in detail to avoid unnecessarily complicating the description.
In general, embodiments provide a method and system for augmenting an immediate first user task of a user in a field. Specifically, embodiments identify a role of a user to define a user perspective. The current location of the user and the role are then used to identify the immediate user task of the user. Metadata is selected based on the user perspective on the immediate user task. In one or more embodiments, the selected metadata is presented to the user.
For the purposes of this application, a user task is “immediate” if the user is performing the user task in the field while the user is located at the field. In other words, an immediate task is a task that the user is currently performing or will at least start performing within the next hour.
Sensors (S), such as gauges, may be positioned about the field to collect data relating to various field operations as described previously. The data gathered by the sensors (S) may be collected by the surface unit (134) and/or other data collection sources for analysis or other processing. The data collected by the sensors (S) may be used alone or in combination with other data. Further, the data outputs from the various sensors (S) positioned about the field may be processed for use. The data may be collected in one or more databases and/or transmitted on or offsite. The data or select portions of the data may be selectively used for analyzing and/or predicting operations of the immediate and/or other wellbores. The data may be historical data, real-time data, or combinations thereof. The real-time data may be used in real time, or stored for later use. The data may also be combined with historical data or other inputs for further analysis. The data may be stored in separate data repositories, or combined into a single data repository.
The collected data may be used to perform analysis, such as modeling operations. For instance, seismic data output may be used to perform geological, geophysical, and/or reservoir engineering. The reservoir, wellbore, surface and/or process data may be used to perform reservoir, wellbore, geological, geophysical or other simulations. The data outputs from the operation may be generated directly from the sensors (S), or after some preprocessing or modeling. These data outputs may act as inputs for further analysis.
The data is collected and stored at the surface unit (134). One or more surface units (134) may be located at the field (100), or connected remotely thereto. The surface unit (134) may be a single unit, or a complex network of units used to perform the data management functions throughout the field (100). The surface unit (134) may be a manual or automatic system. The surface unit (134) may be operated and/or adjusted by a user.
The surface unit (134) may be provided with a transceiver (137) to allow communications between the surface unit (134) and various portions of the field (100) or other locations. The surface unit (134) may also be provided with or functionally connected to one or more controllers for actuating mechanisms at the field (100). The surface unit (134) may then send command signals to the field (100) in response to data received. The surface unit (134) may receive commands via the transceiver or may itself execute commands to the controller. A processor may be provided to analyze the data (locally or remotely) and make the decisions and/or actuate the controller. In this manner, the field (100) may be selectively adjusted based on the data collected. This technique may be used to optimize portions of the operation, such as controlling wellhead pressure, choke size or other operating parameters. These adjustments may be made automatically based on computer protocol, and/or manually by an operator. In some cases, well plans may be adjusted to select optimum operating conditions, or to avoid problems.
As shown, the sensor (S) may be positioned in the production tool (106) or associated equipment, such as the Christmas tree, gathering network, surface facilities and/or the production facility, to measure fluid parameters, such as fluid composition, flow rates, pressures, temperatures, and/or other parameters of the production operation.
Data plots (308.1-308.3) are static data plots that may be generated by the data acquisition tools (302.1-302.4), respectively. Static data plot (308.1) is a seismic two-way response time. Static data plot (308.2) is core sample data measured from a core sample of the formation (304). Static data plot (308.3) is a logging trace. Production decline curve or graph (308.4) is a dynamic data plot of the fluid flow rate over time, similar to the graph (206) of production data over time.
The subterranean formation (304) has a plurality of geological formations (306.1-306.4). As shown, the structure has several formations or layers, including a shale layer (306.1), a carbonate layer (306.2), a shale layer (306.3) and a sand layer (306.4). A fault line (307) extends through the layers (306.1-306.2). The static data acquisition tools are adapted to take measurements and detect the characteristics of the formations.
While a specific subterranean formation (304) with specific geological structures is depicted, it will be appreciated that the field may contain a variety of geological structures and/or formations, sometimes having extreme complexity. In some locations, including below the water line, fluid may occupy pore spaces of the formations. Each of the measurement devices may be used to measure properties of the formations and/or its geological features. While each acquisition tool is shown as being in specific locations in the field, it will be appreciated that one or more types of measurement may be taken at one or more location across one or more fields or other locations for comparison and/or analysis.
The data collected from various sources, such as the data acquisition tools described above, may be used for analysis and modeling as described herein.
Data may be collected by various sensors, for example, during drilling operations. Specifically, drilling tools suspended by a rig may advance into the subterranean formations to form a wellbore (i.e., a borehole). The borehole may have a trajectory in the subterranean formations that is vertical, horizontal, or a combination thereof. Specifically, the trajectory defines the path of the drilling tools in the subterranean formation. A mud pit (not shown) is used to draw drilling mud into the drilling tools via a flow line for circulating drilling mud through the drilling tools, up the wellbore and back to the surface. The drilling mud is filtered and returned to the mud pit. Occasionally, such mud invades the formation surrounding the borehole, resulting in an invasion. Continuing with the discussion of drilling operations, a circulating system may be used for storing, controlling, or filtering the flowing drilling mud. The drilling tools are advanced into the subterranean formations to reach the reservoir. Each well may target one or more reservoirs.
The drilling tools are adapted for measuring downhole properties using logging while drilling tools. Specifically, the logging while drilling tools include sensors for gathering well logs while the borehole is being drilled. In one or more embodiments, during the drilling operations, the sensors may pass through the same depth multiple times. The data collected by the sensors may be similar to or the same as the data collected by the sensors discussed below.
Each wellsite (402) has equipment that forms a wellbore (436) (i.e., borehole) into the earth. The wellbores extend through subterranean formations (406) including reservoirs (404). These reservoirs (404) contain fluids, such as hydrocarbons. The wellsites draw fluid from the reservoirs and pass it to the processing facilities via surface networks (444). The surface networks (444) have tubing and control mechanisms for controlling the flow of fluids from the wellsite to the processing facility (454).
Wellbore production equipment (564) extends from a wellhead (566) of wellsite (402) and to the reservoir (404) to draw fluid to the surface. The wellsite (402) is operatively connected to the surface network (444) via a transport line (561). Fluid flows from the reservoir (404), through the wellbore (436), and onto the surface network (444). The fluid then flows from the surface network (444) to the process facilities (454).
One or more surface units (534) may be located at the field (400), or linked remotely thereto. The surface unit (534) may be a single unit, or a complex network of units used to perform the data management functions throughout the field (400). The surface unit may be a manual or automatic system. The surface unit may be operated and/or adjusted by a user. The surface unit is adapted to receive and store data. The surface unit may also be equipped to communicate with various field equipment. The surface unit may then send command signals to the field in response to data received or modeling performed.
The analyzed data (e.g., based on modeling performed) may then be used to make decisions. A transceiver (not shown) may be provided to allow communications between the surface unit (534) and the field (400). The controller (522) may be used to actuate mechanisms at the field (400) via the transceiver and based on these decisions. In this manner, the field (400) may be selectively adjusted based on the data collected. These adjustments may be made automatically based on computer protocol and/or manually by an operator. For example, based on revised log data, commands may be sent by the surface unit to the downhole tool to change the speed or trajectory of the borehole. In some cases, well plans are adjusted to select optimum operating conditions or to avoid problems.
To facilitate the processing and analysis of data, simulators may be used to process the data for modeling various aspects of the operation. Specific simulators are often used in connection with specific operations, such as reservoir or wellbore simulation. Data fed into the simulator(s) may be historical data, real time data or combinations thereof. Simulation through one or more of the simulators may be repeated or adjusted based on the data received.
As shown, the operation is provided with wellsite and non-wellsite simulators. The wellsite simulators may include a reservoir simulator (340), a wellbore simulator (342), and a surface network simulator (344). The reservoir simulator (340) solves for hydrocarbon flow through the reservoir rock and into the wellbores. The wellbore simulator (342) and surface network simulator (344) solve for hydrocarbon flow through the wellbore and the surface network (444) of pipelines. As shown, some of the simulators may be separate or combined, depending on the available systems.
The non-wellsite simulators may include a process simulator (346) and an economics simulator (348). The processing unit has a process simulator (346). The process simulator (346) models the processing plant (e.g., the process facilities (454)) where the hydrocarbons are separated into their constituent components (e.g., methane, ethane, propane) and prepared for sale. The field (400) is provided with an economics simulator (348). The economics simulator (348) models the costs of part or all of the field (400) throughout a portion or the entirety of the operation. Various combinations of these and other field simulators may be provided.
In one or more embodiments, a data repository (616) is any type of storage unit and/or device (e.g., a file system, database, collection of tables, or any other storage mechanism) for storing data. Further, the data repository (616) may include multiple different storage units and/or devices. The multiple different storage units and/or devices may or may not be of the same type or located at the same physical site.
In one or more embodiments, the data in the data repository (616) includes metadata (618). In one or more embodiments, metadata (618) is any data that describes the field, including the subsurface of the earth, wellsites, wellbores, pump stations, personnel located at the field or any other portion of the field. The metadata (618) is additional information to aid a user that is performing an oilfield task.
In one or more embodiments, metadata (618) includes historical data, alerts, measurements from sensors (e.g., the sensors (S) described above), processed data from measurements, personnel data, and supply chain data.
In one or more embodiments, an alert is a notice of an irregular event or dangerous event to a user. For example, a driller may receive an alert. The alert may notify the driller that the subsurface is unstable at the driller's current location.
In one or more embodiments, a measurement from a sensor is real-time data that is collected by the sensor positioned about the field. In one or more embodiments, real-time data is data that is accessible during an oilfield task in the field. For example, the temperature, pressure, and flow rates within a wellbore may be determined by the sensor in the wellbore. Processed data from a measurement may correspond to data interpolated or derived from a measurement from a sensor. For example, the pressure and temperature may be determined by the sensor in the well. The density of the fluid may then be derived from the temperature and pressure.
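The following is a minimal sketch, in Python, of deriving a fluid density from downhole pressure and temperature readings. The ideal-gas relation and the numeric values are assumptions used purely for illustration; the embodiments do not prescribe any particular correlation or fluid model.

```python
# Minimal sketch: deriving a fluid density from sensor pressure and temperature.
# The ideal-gas relation rho = P * M / (R * T) is only one possible model and is
# assumed here for illustration; a real workflow would use an appropriate
# equation of state or fluid correlation.

R = 8.314  # universal gas constant, J/(mol*K)

def gas_density(pressure_pa: float, temperature_k: float, molar_mass_kg_per_mol: float) -> float:
    """Return an approximate gas density (kg/m^3) from downhole P and T readings."""
    return pressure_pa * molar_mass_kg_per_mol / (R * temperature_k)

# Hypothetical example: methane (M ~ 0.01604 kg/mol) at 20 MPa and 350 K.
print(gas_density(20e6, 350.0, 0.01604))  # roughly 110 kg/m^3
```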
In one or more embodiments, personnel data is any data about a user in the field. Personnel data may include the hours a user has expended on an immediate task, a role of the user, and an immediate task of the user. For example, a production manager may arrive at a rig site on the field. The production manager may be interested in the time spent by the drillers on fixing the pump at the rig site. The metadata of interest to the production manager is the hours spent fixing the pump and which of the personnel at the rig site have the role of driller.
Supply chain data is any data describing equipment or resources used in an oilfield task or equipment or resources that are out of service. For example, supply chain data may include a drilling tool that is out of service, a field tool that is not available because the field tool is in use, a driller required for an oilfield task, and a rig that is out of service. As a further example, a driller may decide the next wellsite at which to drill based on which rigs are available.
In one or more embodiments, metadata (618) includes predictive data and current data. Predictive data is data that is expected based on extrapolating current data. In other words, predictive data is a possible future result if the trends in historical and current data remain the same. Predictive data may be data calculated based on a scenario. For example, historical data shows that the flow of drawing fluid from a wellsite has decreased each year. A production engineer may then obtain predictive data describing an expected amount of fluid in the next month from the wellsite. The production engineer may then choose to decrease the manpower at the wellsite. Current data is any data that is collected at the time during which an oilfield task is performed. For example, if the oilfield task is drilling a borehole using a logging while drilling tool, current data may be logs recorded by a sensor on the logging while drilling tool.
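The following is a minimal sketch of producing predictive data by extrapolating a historical trend, as in the declining-flow example above. The linear fit, the yearly flow values, and the function name are assumptions; an actual implementation could use any decline model.

```python
# Minimal sketch of "predictive data": extrapolating a declining flow rate to
# next month. A simple linear fit stands in for whatever decline model an actual
# implementation would use; the data values are hypothetical.
import numpy as np

years = np.array([2009, 2010, 2011, 2012], dtype=float)
flow_bbl_per_day = np.array([950.0, 900.0, 860.0, 810.0])  # hypothetical historical data

slope, intercept = np.polyfit(years, flow_bbl_per_day, 1)  # linear trend

def predicted_flow(when: float) -> float:
    """Predicted flow rate if the historical trend continues unchanged."""
    return slope * when + intercept

# Expected flow roughly one month (1/12 year) after the last data point.
print(predicted_flow(2012 + 1.0 / 12.0))
```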
In one or more embodiments, metadata (618) augments an oilfield task by adding content to an oilfield output. In one or more embodiments, the oilfield output is any data a user in the field may visualize in a field of view of a computing device (e.g., computing device X (602.1) and computing device Y (602.2)) discussed below. A field of view is the extent of the real world a computing device can capture. Said another way, the field of view is the area of the real world visible using a computing device. For example, a camera on a computing device is limited to a field of view of 50 cm by 40 cm.
A user is any person working in the field. For example, the user may be a driller, a drilling engineer, a production engineer, a geologist, a field engineer, and a geophysicist.
An oilfield task is any task a user performs during an oilfield operation (e.g., exploration, drilling, production). An oilfield task includes performing Blocks in a workflow, reading measurements from wellsites, drilling, troubleshooting at a pump station, and determining fluid flow reservoir characteristics, characteristics of subterranean formations, and locations of wellsites.
The data repository (616) is operatively connected to a data collection system (622) and an oilfield application (608) in accordance with one or more embodiments.
In one or more embodiments, a data collection system (622) may be software, hardware, or a combination thereof. For example, Avocet is software that collects any production-related information (Avocet is a mark of Schlumberger, Inc. located in Houston, Tex., USA). The data collection system (622) includes one or more data collectors (e.g., data collector A (624.1), data collector B (624.2), and data collector C (624.3)). The data collection system (622) includes functionality to receive metadata (618) from one or more data collectors and store the metadata in the data repository (616).
In one or more embodiments, a data collector (e.g., data collector A (624.1), data collector B (624.2), and data collector C (624.3)) is operatively connected to one or more information sources (e.g., information source A (626.1), information source C (626.2), information source D (626.3), and information source F (626.4)) in accordance with one or more embodiments. A data collector may be software, hardware, or a combination thereof. A data collector includes functionality to receive metadata (618) from one or more information sources. A data collector may be located at the field or linked remotely thereto. A data collector may include a user manually inputting data from an information source or a surface unit (e.g., the surface unit (534) described above).
In one or more embodiments, an information source (e.g., information source A (626.1), information source C (626.2), information source D (626.3), and information source F (626.4)) is a source of metadata (618). An information source includes a sensor positioned about the field (e.g., the sensors (S) described above).
In one or more embodiments, an oilfield application (608) may be software, hardware, or a combination thereof. The oilfield application (608) includes a data aggregation system (614), an oilfield management program (612), and a sensory data manager (610). Each of these components is described below.
In one or more embodiments, the data aggregation system (614) includes functionality to aggregate information obtained from the data collection system (622). The data aggregation system includes further functionality to store the aggregated information in the data repository (616). For example, Studio is software that aggregates and manages information (Studio is a mark of Schlumberger, Inc. located in Houston, Tex., USA). For example, any data collected from a sensor that affects production may be aggregated into a production set of data. Any data collected from a sensor that affects drilling may be aggregated into a drilling set of data.
In one or more embodiments, the oilfield management program (612) includes functionality to perform an immediate user task with a user. In one or more embodiments, an immediate user task of the user is a real-time oilfield task of the user, including troubleshooting at a pump station, verifying measurements at a wellsite, and drilling.
In one or more embodiments, the sensory data manager (610) includes functionality to obtain role information of the user to identify a role of the user, and select metadata (618) accordingly. The sensory data manager (610) includes further functionality to encode oilfield output with selected metadata. The selected metadata is based on the user perspective of the user on an immediate user task the user is performing in the field.
The oilfield application (608) is operatively connected to the data repository (616) and one or more computing devices (e.g., computing device X (602.1) and computing device Y (602.2)) in accordance with one or more embodiments.
In one or more embodiments, a computing device (e.g., computing device X (602.1) and computing device Y (602.2)) is a hardware device that includes at least a microprocessor. A microprocessor may be an integrated circuit for processing instructions. The computing device may include a tablet, smart safety glasses, headphones, a wearable tactile device with at least a microprocessor (e.g., gloves that heat up, cool down, and/or change color based on input), clothing and other accessories with at least a microprocessor, a laptop computer, a smartphone, or any other computing device that may operate in the field. In one or more embodiments, computing devices are located on the field. However, the computing devices may be remote from each other, the data repository (616), the data collection system (622), and the information sources (e.g., information source A (626.1), information source C (626.2), information source D (626.3), and information source F (626.4)).
In one or more embodiments, the computing device includes a client application (e.g., client application X (604.1) and client application Y (604.2)) and a local data store (e.g., local data store X (606.1) and local data store Y (606.2)). The client application is an instance of the oilfield application (608). An instance of the oilfield application is an executable copy of the oilfield application (608). The client application includes functionality to query a user of the client application on role information. Role information is data about the user that may identify the role of the user. For example, the role of the user may be determined from the name of the user. In one or more embodiments, the role of a user is the job title of the user in an immediate user task in the field. The role may include driller, field engineer, production engineer, geologist, drilling engineer, geophysicist, or any other profession or job title of the user. The client application includes further functionality to present any metadata (618) from the oilfield application to a user of the client application.
In one or more embodiments, each client application may not present or encode the same metadata, or have access to the same features of the oilfield application. A feature of the oilfield application is a functionality in the oilfield application. A feature may include a tool, a workflow, and access to, display of, and analysis of metadata. For example, a driller using client application X (604.1) may be performing a drilling task. The production features of the oilfield application may be disabled on client application X (604.1). When the driller faces a tablet towards a wellsite, the metadata visible to the driller in client application X (604.1) describes the location of the drilling tools. Facing a computing device, such as a smartphone or tablet, towards a wellsite is to position the camera of the computing device in the direction of the wellsite, such that an image of the wellsite as currently viewable is shown in the display of the computing device. Similarly, a production engineer using client application Y (604.2) may be performing a production task. The drilling features of the oilfield application may be disabled on client application Y (604.2). When the production engineer faces a smartphone towards a wellsite, the metadata visible to the production engineer in client application Y (604.2) describes the predicted production from the wellsite.
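A minimal sketch of per-role feature gating in a client application is shown below. The role names, feature names, and the ROLE_FEATURES mapping are hypothetical; the embodiments only require that features be selectively enabled or disabled according to the role.

```python
# Minimal sketch of per-role feature gating in a client application. The role
# and feature names are hypothetical; the description above only states that,
# e.g., production features may be disabled for a driller and drilling features
# may be disabled for a production engineer.

ROLE_FEATURES = {
    "driller": {"drilling_metadata", "drilling_tools_overlay"},
    "production engineer": {"production_metadata", "predicted_production_overlay"},
}

def enabled_features(role: str) -> set:
    """Features of the oilfield application available to this client instance."""
    return ROLE_FEATURES.get(role, set())

def is_enabled(role: str, feature: str) -> bool:
    return feature in enabled_features(role)

print(is_enabled("driller", "production_metadata"))              # False: disabled for drillers
print(is_enabled("production engineer", "production_metadata"))  # True
```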
In one or more embodiments, a local data store (e.g., local data store X (606.1) and local data store Y (606.2)) is located on the computing device. The local data store is any type of storage unit (e.g., database, collection of tables, or any other storage mechanism) for storing data. The local data store stores metadata and the role information for a user of the computing device.
Further, an alternate configuration may not include a component. For example, a computing device may directly access the oilfield application or data repository rather than executing a client application or having a local data store.
In Block 702, the execution of the oilfield application is initiated. In one or more embodiments, the execution of the oilfield application is initiated by a user on a computing device, such as a tablet, smartphone, or laptop. In one or more embodiments, the execution of the oilfield application is automatically initiated at the boot-up of a computing device. In one or more embodiments, the oilfield application is a background process on a computing device. The oilfield application may execute while a user is located on the field and/or travelling to the field in one or more embodiments.
In Block 704, role information is obtained to identify a role of a user. In one or more embodiments, role information may be obtained by a user manually entering the name of the user or the role of the user on a computing device. The role may also be derived based on information a user manually enters on a computing device, including an oilfield task and project location.
In one or more embodiments, the role information may be automatically obtained from a human resources system. For example, the username of a user on a computing device may be mapped to a record in a human resources system. The mapping may then be used to identify a role of the user. In one or more embodiments, the human resources system stores data about each user that has a role at an oilfield company, including role information. An oilfield company is a company that drills and produces hydrocarbons.
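The following sketch illustrates one way a username on a computing device might be mapped to a role through a human resources record. The HR_RECORDS dictionary, the usernames, and the field names are hypothetical stand-ins for an actual human resources system.

```python
# Minimal sketch of deriving role information from a human resources record, as
# described above. The HR lookup is mocked with a dictionary; a real system
# would query a human resources service instead.

HR_RECORDS = {
    # username -> record (hypothetical data)
    "nisha.k": {"name": "Nisha", "role": "field engineer", "project": "Field 400"},
    "bob.m": {"name": "Bob", "role": "production engineer", "project": "Field 400"},
}

def role_for_user(username: str) -> str:
    """Map the username on the computing device to a role via the HR record."""
    record = HR_RECORDS.get(username)
    if record is None:
        raise KeyError(f"no HR record for {username!r}; ask the user to enter a role manually")
    return record["role"]

print(role_for_user("nisha.k"))  # "field engineer"
```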
In Block 706, a current location of the user is identified to identify an immediate user task of the user that includes a decision. In one or more embodiments, the immediate user task is identified by the current location of the user and the role of the user obtained in Block 704. In one or more embodiments, the current location of the user is the real-time location of the user in the field. As described above, in one or more embodiments, real-time corresponds to the present time in the field.
Continuing with Block 706, in one or more embodiments, the current location of the user may be identified automatically using a global positioning system (GPS) or a GPS embedded in a computing device. In one or more embodiments, the user may manually enter the current location of the user in the oilfield application. In one or more embodiments, the oilfield application may automatically identify the current location of the user using assigned immediate user tasks from the human resources system described above.
Continuing with Block 706, the current location of the user identifies an immediate user task of the user by locating the objects in the field that are within a radius of the current location in one or more embodiments. Objects are any component of the field, such as geological structures, personnel, and oilfield equipment. Objects include a well, a wellsite, a pump, a rig, a pipe, or any other component of the field. The object and the role identified in Block 704 may identify an immediate user task of the user. The supply chain data described above may also be used to describe the real-time equipment of the user. In one or more embodiments, the real-time equipment of the user and the current location of the user may identify an immediate user task of the user.
For example, a driller's current location is at a rig site. The driller's equipment is a transmission pump. From the driller's current location and the driller's equipment, an immediate user task of replacing the transmission of the pump at the rig site is identified.
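A minimal sketch of locating field objects within a radius of the user's current GPS position is shown below. The object catalog, coordinates, and radius are hypothetical; only the great-circle (haversine) distance computation is standard.

```python
# Minimal sketch of Block 706: find field objects within a radius of the user's
# GPS location. The object list and radius are hypothetical.
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6_371_000.0

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two latitude/longitude points."""
    p1, p2 = radians(lat1), radians(lat2)
    dphi, dlam = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(p1) * cos(p2) * sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

FIELD_OBJECTS = [
    {"name": "rig site 12", "type": "rig", "lat": 29.7500, "lon": -95.3600},
    {"name": "pump 7", "type": "pump", "lat": 29.7510, "lon": -95.3610},
    {"name": "wellsite 23-X-1", "type": "wellsite", "lat": 29.7700, "lon": -95.3900},
]

def objects_near(lat, lon, radius_m=500.0):
    """Field objects within radius_m of the user's current location."""
    return [o for o in FIELD_OBJECTS
            if haversine_m(lat, lon, o["lat"], o["lon"]) <= radius_m]

# A user standing near the rig would see the rig and the pump, but not the wellsite.
print([o["name"] for o in objects_near(29.7502, -95.3602)])
```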
In Block 708, a user perspective based on the role information obtained in Block 704 is defined. In one or more embodiments, a user perspective is a point of view of the user on the immediate user task identified in Block 706. The point of view of the user defines what is interesting to the user based on the skills required by a role of the user. For example, a faulty field tool is of no interest to a driller. The drilling tools are of interest to the driller as the driller is trained to use the drilling tools. As an example, a user perspective of a driller at a wellsite may correspond to ensuring the drilling is safe. In contrast, a user perspective of a production engineer at a wellsite may correspond to ensuring that a rate of flow of a wellbore meets the required rate of flow determined by a production manager.
Continuing with Block 708, the user perspective is defined by limiting the point of view of the user to an oilfield area of interest determined by the role information in one or more embodiments. An oilfield area of interest is a portion of the operations in the field that is interesting to the user. For example, a production engineer may be limited to the operational phase of production. In one or more embodiments, the user perspective of a user based on role information is not the same as the user perspective of another user that has different role information.
For example, a user perspective of a structural geologist looking at a portion of the surface of the earth is the characteristics of subterranean formation. However, a user perspective of a driller looking at the same portion of the surface of the earth as the structural geologist differs. The user perspective of the driller is the danger of drilling at that portion of the surface of the earth. Although both the structural geologist and the driller are looking at the same portion of the earth, the user perspective differs.
In Block 712, a determination is made whether additional context filters exist based on the user perspective. In one or more embodiments, an additional context filter is a filter that further limits the area of interest of the user. For example, a user perspective of a production engineer may be the production phase in the field. However, the user perspective may be further limited by an urgent matters context filter. Using the urgent matters context filter, the production engineer is limited to urgencies that affect productivity of the immediate task, such as a blocked pipe.
Additional context filters may include the current location of the user, schedule of the user, historical trends of the immediate user task, equipment of the immediate user task, team information of the user, and goals of the immediate user task. The current location of the user is described above. In one or more embodiments, a schedule of the user is a list of the oilfield tasks for the user, including the immediate task of the user. In one or more embodiments, historical trends of the immediate user task are historical data that may predict a trend for the immediate user task. In one or more embodiments, equipment of the immediate user task is any tools or mechanical devices that a user uses to complete the immediate user task. In one or more embodiments, team information of the user is the personnel working alongside the user on an immediate user task. Goals of the immediate user task are any requirements to complete the immediate user task in one or more embodiments.
In one or more embodiments, determining whether additional context filters exist based on the user perspective may correspond to a search of a set of additional context filters. In one or more embodiments, the set of additional context filters includes additional context filters regardless of the user perspective. A search of the set of additional context filters may be based on keywords from the user perspective defined in Block 708. As an example, a user perspective of a driller may correspond to drilling. Additional context filters may be searched based on the keyword “drilling” and any variation of the keyword, such as “driller” and “drill”. Additional context filters are found and may include available drilling tools at the present time, drillers at the driller's current location, and drilling safety at the driller's current location. If the determination is made that additional context filters exist based on the user perspective, the method may proceed to Block 714.
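The following sketch illustrates a keyword search over a set of additional context filters, as described above for the keyword “drilling” and its variations. The filter names, the crude keyword stemming, and the function names are assumptions for illustration only.

```python
# Minimal sketch of Block 712: search the set of additional context filters by
# keyword variations derived from the user perspective. The filter names are
# hypothetical.

ADDITIONAL_CONTEXT_FILTERS = [
    "available drilling tools at the present time",
    "drillers at the current location",
    "drilling safety at the current location",
    "production urgent matters",
]

def keyword_variants(perspective_keyword: str) -> list:
    # Crude stemming for the sketch only: "drilling" -> also match "driller", "drill".
    stem = perspective_keyword.rstrip("ing").rstrip("er")
    return [perspective_keyword, stem + "er", stem]

def find_context_filters(perspective_keyword: str) -> list:
    variants = keyword_variants(perspective_keyword)
    return [f for f in ADDITIONAL_CONTEXT_FILTERS
            if any(v in f for v in variants)]

print(find_context_filters("drilling"))
# Matches the drilling-related filters; the production filter is excluded.
```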
In Block 714, the selected context filters are ranked to obtain ranked context filters. In one or more embodiments, the selected context filters are ranked by each context filter's relevance with respect to the immediate user task and/or the current location of the user. For example, consider a driller that has an immediate user task of drilling and whose current location is dangerous. Based on the current location and the immediate user task, a context filter for drilling safety at the driller's current location is more relevant to the driller than a context filter for drilling tools at the present time.
In Block 716, the ranked context filters are applied to select metadata according to the user perspective and the additional context filters. In one or more embodiments, metadata is first selected according to the user perspective and then further limited by applying the ranked context filters. In one or more embodiments, applying the ranked context filters may correspond to first applying the top-ranked context filter, then applying the next-ranked context filter, and so on.
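A minimal sketch of ranking context filters and applying them in order to narrow the selected metadata (Blocks 714-716) is shown below. The relevance scores, filter predicates, and metadata records are hypothetical.

```python
# Minimal sketch of Blocks 714-716: rank selected context filters by relevance
# and apply them in order to narrow the selected metadata. All values are
# hypothetical.

def rank_filters(filters, relevance):
    """Order filters by their relevance to the immediate task and/or location."""
    return sorted(filters, key=lambda f: relevance.get(f["name"], 0.0), reverse=True)

def apply_filters(metadata_items, ranked_filters):
    """Apply the top-ranked filter first, then each next-ranked filter."""
    selected = metadata_items
    for f in ranked_filters:
        selected = [m for m in selected if f["predicate"](m)]
    return selected

metadata = [
    {"kind": "alert", "topic": "safety", "location": "rig site 12"},
    {"kind": "measurement", "topic": "tools", "location": "rig site 12"},
    {"kind": "measurement", "topic": "tools", "location": "wellsite 23-X-1"},
]

filters = [
    {"name": "drilling tools at present time", "predicate": lambda m: m["topic"] == "tools"},
    {"name": "drilling safety at current location", "predicate": lambda m: m["location"] == "rig site 12"},
]

relevance = {"drilling safety at current location": 0.9, "drilling tools at present time": 0.4}

for item in apply_filters(metadata, rank_filters(filters, relevance)):
    print(item)  # only metadata passing both filters, most relevant filter applied first
```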
If the determination is made that additional context filters do not exist based on the user perspective, the method may proceed to Block 718. In Block 718, metadata is selected according to the user perspective. In one or more embodiments, the selected metadata in Block 716 is more limited than the selected metadata from Block 718. As an example, selecting metadata according to the user perspective of a production engineer may result in color coding of each pipe at the production engineer's current location to show the rate of flow in each pipe. In contrast, selecting metadata according to the user perspective and the ranked context filters may further limit the selected metadata to color coding of each pipe that has a rate of flow that is less than half of the previous day's.
In Block 722, a determination is made whether to present the selected metadata to the user based on a viewpoint. If a determination is made to present the selected metadata to the user based on a viewpoint, the method proceeds to Block 726. In Block 726, oilfield output is obtained from a viewpoint in the current location of the user. As described above, an oilfield output is any data a user in the field may visualize in a field of view of a computing device in one or more embodiments. In one or more embodiments, the viewpoint is the direction a computing device is facing. Said another way, the viewpoint is the point of view of a computing device. The field of view may be visible using a camera, or any device that provides vision to the user.
For example, a production engineer has a current location by a well in the field and faces the well. Without changing direction, the production engineer raises a tablet. The viewpoint of the production engineer is the direction in which the production engineer faces the tablet towards the well. The oilfield output is the image of the well displayed in the field of view of a camera in the tablet from the viewpoint of the production engineer.
In Block 728, the oilfield output obtained in Block 726 is encoded with the selected metadata to obtain a revised output. In one or more embodiments, a revised output is the oilfield output with additional information displayed in the form of the selected metadata.
In one or more embodiments, encoding the oilfield output may correspond to overlaying the selected metadata on the oilfield output. Overlaying metadata may include overlaying text or a graphic. A graphic may include an image, a video, or any visualization that may be overlaid on an oilfield output. For example, a user visualizing a well may have the recordings from a sensor in the well overlaid on the well as text. The selected metadata overlaid on the oilfield output is the revised output.
For example, a geologist facing a smartphone towards a portion of the surface of the earth may visualize an image of the portion of the surface. Percentages of the minerals and/or elements in the subsurface below the portion of the surface are overlaid on the image. As another example, a production engineer may visualize the flow of fluid in a pipe by facing a tablet towards the pipe. The pipe is visible in the field of view of a camera of the tablet from the viewpoint of the production engineer. While the tablet faces the pipe, no alerts are displayed signifying that the flow of the pipe is in the normal range. Although the previous example uses a camera on a tablet to visualize the pipe, one of ordinary skill in the art recognizes that any augmented reality device may be used to visualize a field of view of a person in the field.
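The following sketch shows one way selected metadata could be overlaid as text on an oilfield output using OpenCV. The frame source, label text, and pixel position are assumptions; any augmented reality toolkit could serve the same purpose.

```python
# Minimal sketch of Block 728: overlay selected metadata as text on the oilfield
# output. The frame, text, and position are hypothetical.
import cv2
import numpy as np

def overlay_metadata(frame: np.ndarray, text: str, position=(20, 40)) -> np.ndarray:
    """Return a revised output: the oilfield output with metadata text overlaid."""
    revised = frame.copy()
    cv2.putText(revised, text, position, cv2.FONT_HERSHEY_SIMPLEX,
                1.0, (255, 255, 255), 2, cv2.LINE_AA)
    return revised

# Stand-in for a camera frame of a well; a real device would use cv2.VideoCapture.
oilfield_output = np.zeros((480, 640, 3), dtype=np.uint8)
revised_output = overlay_metadata(oilfield_output, "Well 23-X-1: 2150 psi, 84 C")
```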
In one or more embodiments, encoding the oilfield output with the selected metadata may correspond to altering the display of objects in the oilfield output. Altering the display may include color coding an object in the oilfield output based on the selected metadata in one or more embodiments. Color coding corresponds to assigning a color to an object to identify a property. A property is a characteristic, attribute or quality of an object. A property may include a temperature, a thickness, and a material of an object.
For example, a driller feels that a drilling tool is overheating and wants to verify the temperature. The driller may face a tablet towards the drilling tool, such that the drilling tool is in the field of view of a camera of the tablet. On the display of the tablet, the drilling tool is displayed in orange signifying that although the drilling tool is hot, the drilling tool is safe to use. As another example, a production engineer visualizes a pipe as the oilfield output from the production engineer's laptop. The fluid flow of the pipe may be color coded based on the rate of flow. The production engineer then gains an understanding of the real-time rate of flow through the color coding.
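A minimal sketch of color coding an object by a property value (here, temperature) is shown below. The temperature thresholds, colors, and the object mask are hypothetical; in practice the mask marking the object's pixels would come from an object recognition step.

```python
# Minimal sketch of color coding an object in the oilfield output by temperature.
# Thresholds and colors are hypothetical.
import numpy as np

def temperature_color(temp_c: float):
    """Map a temperature to a BGR color: green = normal, orange = hot but safe, red = unsafe."""
    if temp_c < 60:
        return (0, 255, 0)
    if temp_c < 90:
        return (0, 165, 255)   # orange
    return (0, 0, 255)

def color_code(frame: np.ndarray, object_mask: np.ndarray, temp_c: float) -> np.ndarray:
    """Tint the masked object pixels with the color for its temperature."""
    revised = frame.copy()
    revised[object_mask] = temperature_color(temp_c)
    return revised

frame = np.zeros((480, 640, 3), dtype=np.uint8)
mask = np.zeros((480, 640), dtype=bool)
mask[100:200, 150:300] = True            # hypothetical drilling-tool region
revised = color_code(frame, mask, 75.0)  # rendered orange: hot, but safe to use
```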
In Block 730, the revised output is presented to the user. In one or more embodiments, the user is presented with the revised output in a computing device. In one or more embodiments, the user is presented with the revised output during an immediate user task and in a current location of the user in the field. The revised output is presented to the user by showing a visual of the revised output in one or more embodiments. Since the revised output is presented to the user during the immediate user task, the user may base a decision of the immediate user task on the revised output that is presented.
For example, a drilling engineer assesses the current state of a well by facing a tablet towards the well. The decision of a drilling engineer to continue the drilling operation at the well is based on the revised output on the tablet. The revised output shows the pressure and temperature of the well overlaid on the image of the well.
In Block 732, a determination is made whether the selected metadata has updated. In one or more embodiments, the determination that the selected metadata has updated occurs if any of the selected metadata is different compared to the selected metadata initially selected in Block 716 or Block 718. In one or more embodiments, the determination whether the selected metadata has updated occurs while the oilfield output remains the same. The oilfield output remains the same when the objects in the oilfield output are the same. In one or more embodiments, an object recognition algorithm, such as background subtraction, may be used to verify that the oilfield output remains the same. Background subtraction removes the background from an image and emphasizes the foreground objects of the image. The oilfield output may remain the same when the foreground objects of the image have moved, but remain in the field of view. For example, a tablet held by a drilling engineer to visualize a well may not remain perfectly still; however, the movements of the tablet do not remove the well from the field of view of a camera in the tablet.
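The following sketch illustrates using background subtraction to decide whether the oilfield output remains the same, i.e., whether the foreground objects are still in the field of view. The OpenCV MOG2 subtractor is one possible choice, and the change-fraction threshold is an assumption.

```python
# Minimal sketch: background subtraction to check that the oilfield output
# remains the same across consecutive camera frames. The threshold is a
# hypothetical tuning parameter.
import cv2
import numpy as np

subtractor = cv2.createBackgroundSubtractorMOG2(history=100, detectShadows=False)

def output_unchanged(frame: np.ndarray, max_foreground_fraction: float = 0.3) -> bool:
    """True if only a small fraction of pixels changed, so the same objects remain in view."""
    foreground_mask = subtractor.apply(frame)
    changed = np.count_nonzero(foreground_mask) / foreground_mask.size
    return changed <= max_foreground_fraction

# Feed consecutive camera frames; small camera movements keep the output "the same".
frame = np.zeros((480, 640, 3), dtype=np.uint8)
print(output_unchanged(frame))
```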
If a determination is made that an update to the selected metadata exists, the method may proceed to Block 734. In Block 734, the selected metadata is updated. In one or more embodiments, the update is a real-time change to the selected metadata. Since an oilfield output exists, Block 728 is then executed to encode the update to the selected metadata that is overlaid on the oilfield output. For example, a sensor positioned about the field may record a new measurement in real time, and the updated measurement replaces the previously overlaid measurement on the oilfield output.
If a determination is made that an update to the metadata does not exist, the method may proceed to Block 736. In Block 736, a determination is made whether to update the viewpoint. In one or more embodiments, a user may visualize more than one oilfield output captured from a different viewpoint to complete an immediate user task. For example, a production engineer may need to verify the rate of flow of fluid in pipes at two locations to determine the location at which to add more drillers.
If a determination is made to update the viewpoint, the method proceeds to obtain a different viewpoint. In one or more embodiments, the determination to update the viewpoint is based on whether different objects are displayed in the field of view of a camera in a computing device. In one or more embodiments, the determination to update the viewpoint may use an object recognition computer algorithm to automatically identify objects. In one or more embodiments, the identification of objects may correspond to matching a computer-aided design (CAD) model of a tool to a tool in the image. A CAD model is a mechanical drawing of a tool produced on a computing device. In one or more embodiments, the identification of objects may also correspond to matching features to objects in the image. A feature is a visual property of an object. For example, the size and shape of a well are features that distinguish the well from other objects in the field.
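A minimal sketch of feature-based object identification is shown below, using ORB keypoint matching from OpenCV. The reference image, match-count threshold, and file paths are hypothetical; CAD-model matching, as described above, would be an alternative approach.

```python
# Minimal sketch: feature-based check of whether a known tool is visible in the
# current frame, using ORB keypoint matching. Thresholds and paths are hypothetical.
import cv2

orb = cv2.ORB_create(nfeatures=500)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def tool_visible(reference_gray, frame_gray, min_matches: int = 25) -> bool:
    """True if enough ORB features of the reference tool are found in the frame."""
    _, ref_desc = orb.detectAndCompute(reference_gray, None)
    _, frame_desc = orb.detectAndCompute(frame_gray, None)
    if ref_desc is None or frame_desc is None:
        return False
    matches = matcher.match(ref_desc, frame_desc)
    return len(matches) >= min_matches

# Hypothetical usage with grayscale images of a known drill bit and the current view:
# reference = cv2.imread("drill_bit_reference.png", cv2.IMREAD_GRAYSCALE)
# frame = cv2.imread("current_view.png", cv2.IMREAD_GRAYSCALE)
# print(tool_visible(reference, frame))
```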
The different viewpoint is obtained by changing the direction of a computing device to capture another field of view. The method then returns to Block 726 to obtain the oilfield output from the different viewpoint obtained.
Returning to Block 722, if a determination is made not to present the selected metadata to the user based on a viewpoint, the method proceeds to Block 724. In one or more embodiments, the determination not to present the selected metadata to the user is based on a delivery method of the selected metadata. In one or more embodiments, the delivery method is the method by which the selected data is presented to the user. Delivery methods include visual, auditory, and vibratory. In one or more embodiments, the determination not to present the selected metadata occurs for an auditory and/or a vibratory delivery method of the selected metadata. The method then proceeds to Block 732 to determine if any updates to the selected metadata exist.
For example, a drilling engineer has a smartphone in the drilling engineer's tool belt and travels to a wellbore. The drilling engineer's smartphone vibrates to send an alert to the drilling engineer that an immediate user task of creating fractures in the wellbore is delayed. The alert may also suggest an alternate immediate user task the drilling engineer may complete during the delay. As another example, background music on a production engineer's tablet increases in volume while the production engineer is walking on the surface of the earth in areas where the subterranean formations are predicted to produce oil.
Consider an example in which Nisha is a level 1 field engineer in the field. Additional context filters exist based on Nisha's user perspective as a level 1 field engineer. The additional context filters include a context filter for Nisha's current field location and field schedule (hereinafter, “schedule filter”) and a pump functionality context filter (hereinafter, “functionality filter”). The schedule filter is ranked as more relevant than the functionality filter based on the requirements of a level 1 field engineer. The level 1 field engineer is required to visit each pump site on the schedule of the level 1 field engineer. However, a field engineer is only recommended to submit a report on the functionality of pumps visited each day. The metadata is then selected first according to the level 1 field engineer perspective; then the schedule filter is applied, followed by the functionality filter.
In another scenario, the additional context filters include a field engineering tools context filter, a field engineering functionality context filter, and a field engineering location context filter. The metadata is first selected based on Nisha's user perspective. The metadata is then limited by the field engineering location context filter, followed by the field engineering tools context filter, and finally the field engineering functionality context filter.
In another example, Sarah is a diagnostic drilling engineer in the field. No additional context filters exist. The selection of metadata is based on Sarah's user perspective as a diagnostic drilling engineer. The selected metadata (e.g., 902 and 910) is overlaid on the oilfield output (904). The location callout (902) displays Sarah's current location as well 23-X-1. The drill callout (910) is a visual reminder to check the drill bit. Sarah then recognizes that the drill bit is faulty.
In a further example, Bob is a lead production engineer in the field. No additional context filters exist. The selection of metadata is based on Bob's user perspective as a lead production engineer.
Embodiments may be implemented on virtually any type of computing system regardless of the platform being used. For example, the computing system may be one or more mobile devices (e.g., laptop computer, smart phone, personal digital assistant, tablet computer, or other mobile device), desktop computers, servers, blades in a server chassis, or any other type of computing device or devices that includes at least the minimum processing power, memory, and input and output device(s) to perform one or more embodiments. For example, the computing system (1000) may include one or more computer processors, associated memory, one or more storage devices, and numerous other elements and functionalities.
Software instructions in the form of computer readable program code to perform embodiments may be stored, in whole or in part, temporarily or permanently, on a non-transitory computer readable medium such as a CD, DVD, storage device, a diskette, a tape, flash memory, physical memory, or any other computer readable storage medium. Specifically, the software instructions may correspond to computer readable program code that when executed by a processor(s), is configured to perform embodiments.
Further, one or more elements of the aforementioned computing system (1000) may be located at a remote location and connected to the other elements over a network (1014). Further, embodiments may be implemented on a distributed system having a plurality of nodes, where each portion may be located on a different node within the distributed system. In one embodiment, the node corresponds to a distinct computing device. The node may also correspond to a computer processor with associated physical memory. The node may also correspond to a computer processor or micro-core of a computer processor with shared memory and/or resources.
While augmenting an immediate first user task has been described with respect to a limited number of embodiments, those skilled in the art, having benefit of this disclosure, will appreciate that other embodiments can be devised which do not depart from the scope as disclosed herein. Accordingly, the scope should be limited only by the attached claims.
This application claims benefit under 35 U.S.C. §119(e) of U.S. Provisional Patent Application No. 61/746,446, filed on Dec. 27, 2012, and entitled “Augmented Reality For Oilfield”, which is hereby incorporated by reference.