Imaging systems employed on downhole tools generally generate large amounts of data, which cannot be communicated in real-time through low bandwidth telemetry systems such as, for example, mud pulse telemetry systems. Further, the optical fields of view of imaging systems employed on downhole tools are often obstructed by opaque fluids and debris.
This summary is provided to introduce a selection of concepts that are further described below in the detailed description. This summary is not intended to identify key or essential features of the claimed subject matter, nor is it intended to be used as an aid in limiting the scope of the claimed subject matter.
An example method disclosed herein includes projecting flushing fluid into an optical field of view of an imaging system disposed on a downhole tool. The example method also includes directing a pattern of light onto a target in the optical field of view via a light source of the imaging system and determining three-dimensional shape information of the target based on the light directed from the target and received via an image detection plane of the imaging system. The example method further includes determining a characteristic of the target based on the three-dimensional shape information.
Another example method includes projecting flushing fluid from a downhole tool into a field of view of an imaging system disposed on the downhole tool. The imaging system includes a light source and an image detection plane. The example method also includes determining three-dimensional shape information of a target via a processor of the imaging system based on a first pattern of light directed onto the target via the light source and a second pattern of light received by the image detection plane. The example method further includes generating an image based on the three-dimensional shape information and controlling the downhole tool based on the image.
Another example method includes determining three-dimensional shape information of a target via an imaging system and determining shape characteristic data of the target based on the three-dimensional shape information. The example method also includes matching the shape characteristic data with first predetermined target data stored in a first database and determining a database index associated with the first predetermined target data. The example method further includes retrieving second predetermined target information from a second database using the database index.
The figures are not to scale. Instead, to clarify multiple layers and regions, the thickness of the layers may be enlarged in the drawings. Wherever possible, the same reference numbers will be used throughout the drawing(s) and accompanying written description to refer to the same or like parts. As used in this patent, stating that any part (e.g., a layer, film, area, or plate) is in any way positioned on (e.g., positioned on, located on, disposed on, or formed on, etc.) another part, means that the referenced part is either in contact with the other part, or that the referenced part is above the other part with one or more intermediate part(s) located therebetween. Stating that any part is in contact with another part means that there is no intermediate part between the two parts.
Downhole imaging systems and methods are disclosed herein. An example imaging system disclosed herein includes a light source, an image sensor, and an image processor. In some examples, the light source directs a pattern of light such as, for example, an array of spots, onto a target. The target may be, for example, a casing, a borehole wall, and/or any other object(s) and/or area(s). Light is directed (e.g., reflected) from the target based on a shape of the target. For example, some of the light directed from the target may be received via the image sensor and some of the light may be directed away from the image sensor and, thus, not received via the image sensor. In some examples, the image sensor includes an image detection plane having a plurality of photo detectors disposed on a plane. In some examples, the image processor determines where on the image sensor the light is received and determines a plurality of measurements based on where the light is received relative to where the light source directed the pattern of light. The example image processor may generate an image based on the measurements and/or determine a characteristic of the target such as, for example, texture, shape, size, position, etc.
In some examples, the imaging system retrieves first predetermined target information from a first database based on the three-dimensional shape information. For example, the image processor may associate (e.g., match) the three-dimensional shape information and/or the characteristic of the target with the first predetermined target information using spatial correlation. In some examples, a database index is assigned to and/or associated with the first predetermined target information, and the imaging system communicates in real-time the database index to a surface system employing a second database. In some examples, the second database employs an organizational structure similar or identical to the first database, and the second database includes second predetermined target information assigned and/or associated with the database index. In some examples, the surface system retrieves the second predetermined target information, which may include a variety of information related to the target and/or similar targets. The second predetermined target information may be logged and/or displayed to an operator of a downhole tool including the example imaging system. Thus, the example imaging system enables communication of a small amount of information (e.g., database indexes) uphole while enabling monitoring and/or detection of downhole targets in real-time.
For example, the imaging system may determine texture data of a downhole target and match the texture data to predetermined texture data stored in the first database. The example imaging system may then determine a database index associated with the predetermined texture data and communicate in real-time the database index to the surface system. When the surface system receives the database index, the surface system may retrieve a composition of a subterranean formation from the second database associated with the database index. The composition of the subterranean formation may be logged with a depth of the downhole tool when the database index was received to generate a map and/or facilitate navigation of a borehole.
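The index-based lookup described above can be sketched as follows; the database entries, field names, and the `match_texture` helper are illustrative assumptions for this sketch, not part of the disclosure:

```python
# Sketch of the two-database index scheme: the downhole (first) database
# maps measured texture data to a compact index; the surface (second)
# database maps the same index to richer target information. All entries,
# names, and values are illustrative.
first_db = {
    0: (0.10, 0.02),   # (mean roughness, variance): shale-like texture
    1: (0.45, 0.10),   # sandstone-like texture
    2: (0.80, 0.25),   # fractured-carbonate-like texture
}
second_db = {
    0: {"composition": "shale", "notes": "low permeability"},
    1: {"composition": "sandstone", "notes": "potential reservoir"},
    2: {"composition": "fractured carbonate", "notes": "possible lost circulation"},
}

def match_texture(measured, db):
    """Return the database index of the stored texture nearest the measurement."""
    return min(db, key=lambda k: sum((a - b) ** 2
                                     for a, b in zip(db[k], measured)))

# Downhole: match the measured texture and send only the small index uphole.
index = match_texture((0.43, 0.11), first_db)

# Surface: the received index retrieves the full predetermined information.
info = second_db[index]
```

Only the integer index crosses the low-bandwidth telemetry link; both databases share the same organizational structure, so the index resolves identically at the surface.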
In some examples, the three-dimensional shape information determined via the imaging system is used to control a downhole tool. For example, the imaging system may determine three-dimensional shape information and/or generate images of a borehole wall as the downhole tool is lowered in a multilateral well. When the downhole tool moves past a borehole window (e.g., an opening from a first borehole to a second borehole in the multilateral well), the example imaging system may be used to detect the window. For example, three-dimensional shape information may be communicated to the surface system, and images of the window may be presented to an operator of the downhole tool. The operator may use images to align the downhole tool with the window and move the downhole tool from the first borehole into the second borehole.
A drill string 12 is suspended within the borehole 11 and has a bottom hole assembly 100 which includes a drill bit 105 at its lower end. The surface system includes platform and derrick assembly 10 positioned over the borehole 11, the derrick assembly 10 including a rotary table 16, a kelly 17, a hook 18 and a rotary swivel 19. The drill string 12 is rotated by the rotary table 16, energized by means not shown, which engages the kelly 17 at an upper end of the drill string 12. The drill string 12 is suspended from the hook 18, attached to a traveling block (also not shown), through the kelly 17 and the rotary swivel 19, which permits rotation of the drill string 12 relative to the hook 18. In some examples, a top drive system can be used.
In the illustrated example, the surface system further includes drilling fluid or mud 26 stored in a pit 27 formed at the well site. A pump 29 delivers the drilling fluid 26 to the interior of the drill string 12 via a port in the swivel 19, causing the drilling fluid 26 to flow downwardly through the drill string 12 as indicated by directional arrow 8. The drilling fluid 26 exits the drill string 12 via ports in the drill bit 105, and then circulates upwardly through the annulus region between the outside of the drill string 12 and the wall of the borehole 11, as indicated by directional arrows 9. In this manner, the drilling fluid 26 lubricates the drill bit 105 and carries formation cuttings up to the surface as it is returned to the pit 27 for recirculation.
The bottom hole assembly 100 of the illustrated example includes a logging-while-drilling (LWD) module 120, a measuring-while-drilling (MWD) module 130, a roto-steerable system and motor, and the drill bit 105.
The LWD module 120 is housed in a special type of drill collar, as is known in the art, and can contain one or more logging tools. It will also be understood that more than one LWD and/or MWD module can be employed, for example, as represented at 120A. References throughout to a module at the position of module 120 can mean a module at the position of module 120A. The LWD module 120 includes capabilities for measuring, processing, and storing information, as well as for communicating with the surface equipment. In the illustrated example, the LWD module 120 includes a fluid sampling device.
The MWD module 130 is also housed in a special type of drill collar, as is known in the art, and can contain one or more devices for measuring characteristics of the drill string 12 and the drill bit 105. The MWD module 130 further includes an apparatus (not shown) for generating electrical power to the downhole system. This may include a mud turbine generator powered by the flow of the drilling fluid 26, and/or other power and/or battery systems. In the illustrated example, the MWD module 130 includes one or more of the following types of measuring devices: a weight-on-bit measuring device, a torque measuring device, a vibration measuring device, a shock measuring device, a stick slip measuring device, a direction measuring device, and an inclination measuring device.
The example extendable probe assembly 316 is configured to selectively seal off or isolate selected portions of the wall of the wellbore 302 to fluidly couple to an adjacent formation F and/or to draw fluid samples from the formation F. The extendable probe assembly 316 may be provided with a probe having an embedded plate. Formation fluid may be expelled through a port (not shown) or it may be sent to one or more fluid collecting chambers 326 and 328. In the illustrated example, the electronics and processing system 306 and/or a downhole control system are configured to control the extendable probe assembly 316 and/or the drawing of a fluid sample from the formation F.
The coiled tubing system 402 of
With reference still to
In the illustrated example, the control unit 436 is computerized equipment secured to the truck 408. However, the control unit 436 may be portable computerized equipment such as, for example, a smartphone, a laptop computer, etc. Additionally, powered controlling of the application may be hydraulic, pneumatic and/or electrical. In some examples, the control unit 436 controls the operation, even in circumstances where subsequent different application assemblies are deployed downhole. That is, subsequent mobilization of control equipment may not be required.
The control unit 436 may be configured to wirelessly communicate with a transceiver hub 438 of the coiled tubing reel 410. The transceiver hub 438 is configured for communication onsite (surface and/or downhole) and/or offsite as desired. In some examples, the control unit 436 communicates with the sensing system 426 and/or logging tool 428 for conveying data therebetween. The control unit 436 may be provided with and/or coupled to databases, processors, and/or communicators for collecting, storing, analyzing, and/or processing data collected from the sensing system and/or logging tool.
During operation of the example drill bit 500, the drill bit 500 and, thus, the example imaging system 502 rotate relative to the target 506, and the example imaging system 502 acquires three-dimensional shape information of the target 506 and/or captures images of the target 506 based on the light projected by the light source 504 and the light received by the image sensor 510. For example, the image processor 512 detects where light is received on the image sensor 510 and, based on where the light is received, the image processor 512 determines a plurality of measurements of the target 506. Based on the measurements, the example image processor 512 determines three-dimensional shape information such as texture data, size data, shape data, and/or other three-dimensional shape information of the target 506. In some examples, the image processor 512 also determines information related to the target 506 such as, for example, color(s) of the target 506, a position of the target 506, a distance of the target 506 relative to one or more components of the drill bit 500, and/or any other target information. In some examples, the image processor 512 analyzes one or more captured images of the target 506 and determines three-dimensional shape information and/or other target information based on the image(s).
In some examples, the example image processor 512 processes and/or formats the target information to facilitate storage of the target information in one or more databases, enable the image processor 512 to associate (e.g., match) the target information or a portion of the target information with predetermined target information stored in one or more databases, facilitate communication of the target information toward a surface of Earth via a low bandwidth telemetry link 513 (e.g., a mud-pulse telemetry link), enable one or more images of the target 506 to be generated, and/or perform and/or facilitate other actions. For example, the image processor 512 may generate vector data based on the image(s) of the target 506, the three-dimensional shape information, and/or other information. In some examples, the image processor 512 generates a spatial gradient vector field such as, for example: grad
In some examples, the vector data is communicated toward the surface in real-time to enable a surface system to generate an image of the target and/or retrieve additional information related to the target.
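As a rough sketch of the kind of vector data mentioned above, a spatial gradient vector field can be computed from a depth or intensity image with finite differences; the array values here are illustrative:

```python
import numpy as np

# Illustrative 4x4 "depth map" of a target surface (arbitrary units);
# depth increases linearly along the x (column) direction.
depth = np.array([[0., 1., 2., 3.],
                  [0., 1., 2., 3.],
                  [0., 1., 2., 3.],
                  [0., 1., 2., 3.]])

# grad f = (df/dy, df/dx): one vector per pixel, computed with central
# differences in the interior and one-sided differences at the edges.
gy, gx = np.gradient(depth)

# For this planar surface the field is constant: slope 1 along x, 0 along y,
# so the entire surface compresses to a few vector components for telemetry.
```

A compact field of this kind can stand in for a full image when bandwidth is limited, and the surface system can reconstruct an approximate image from it.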
In the illustrated example, the drill bit 500 includes a port 514 to project flushing fluid 516 into a borehole 518 and the optical field of view of the example imaging system 502. The example flushing fluid 516 is substantially transparent or clear to enable the light generated via the light source 504 to propagate through the flushing fluid 516 to the target 506 and from the target 506 to the image sensor 510. In some examples, the light source 504 generates light at a predetermined wavelength (e.g., infrared wavelengths) to facilitate propagation of the light through the flushing fluid 516.
In the illustrated example, the drill bit 500 includes a flushing fluid system 520 to control the projection of flushing fluid 516 via the drill bit 500. In some examples, the flushing fluid system 520 includes a controller, one or more valves, nozzles, pumps, motors, and/or other components to control an amount of time and/or a schedule during which the flushing fluid 516 is projected into the borehole 518, a rate at which the flushing fluid 516 is expelled from the drill bit 500 via the port 514, a direction in which the flushing fluid 516 is projected, and/or other aspects of operation of the flushing fluid system 520, the drill bit 500, and/or the imaging system 502.
In some examples, the flushing fluid is projected momentarily during times when the example imaging system 502 is directing and receiving light, capturing images of the target 506, and/or determining three-dimensional information of the target 506. In some examples, the flushing fluid is projected substantially continuously, during predetermined intervals of time, and/or using any other pattern or sequence of operation. Example methods and apparatus that can be used to implement the example flushing fluid system 520 of
In the illustrated example, the downhole tool 700 includes an orientation sensor 710 such as, for example, a gyroscope to determine an orientation (e.g., vertical, horizontal, thirty degrees from vertical, etc.) of the downhole tool. In some examples, the downhole tool 700 includes a depth sensor to determine a depth of the downhole tool 700.
In the illustrated example, a flushing fluid system 712 is disposed on the downhole tool 700 to project flushing fluid through a first port 714 and/or a second port 716 to flush or wash away opaque fluid (e.g., mud, formation fluid, etc.) and/or debris from the fields of view of the first imaging system 702 and/or the second imaging system 704.
In the illustrated example, the downhole tool is disposed in a multilateral well 718 including a first borehole 720 and a second borehole 722 in communication with the first borehole 720. In some examples, the example first imaging system 702 is employed to detect a borehole window 724. In the illustrated example, the borehole window 724 is an opening defined by the first borehole 720 through which the downhole tool 700 may enter the second borehole 722.
In some examples, as the downhole tool 700 is moved (e.g., lowered) in the first borehole 720, the first imaging system 702 generates three-dimensional shape information and/or captures images of a wall 726 of the first borehole 720. In the illustrated example, the three-dimensional shape information, the images and/or other information is communicated to a surface system 725 (e.g., the control unit 436 of
In some examples, the images and/or the three-dimensional shape information is used to determine characteristics of the borehole wall 726 and/or the window 724. For example, the images and/or the three-dimensional shape information may be used to detect corrosion, chemical buildup, physical damage, perforations, surface texture, a size and/or shape of the window 724, a position of the window 724 relative to the downhole tool 700, and/or other characteristics.
In some examples, entry of the downhole tool 700 into the second borehole 722 is detected and/or verified based on an orientation of the bent sub 900 determined via the orientation sensor 710. For example, if the orientation sensor 710 determines that the bent sub 900 is oriented at a predetermined angle away from being vertical, the entry of the downhole tool 700 into the second borehole 722 is detected and/or verified. In some examples, entry of the downhole tool 700 into the second borehole 722 is fully automated and/or semi-automated via the surface system 725 and/or downhole controllers employing the images 800, 100 and/or three-dimensional shape information generated via the first imaging system 702 and/or the second imaging system 704.
The example image detection plane 1104 includes a plurality of detectors disposed in a substantially planar array. In some examples, the image processor 1106 includes an array of photo detectors and/or pixel sensors in communication with processing elements. In some examples, each of the processing elements determines three-dimensional shape information of a portion of the target 1106 that corresponds to a portion (e.g., pixel) of the image of the target 1106. In some examples, the example imaging system 1100 of
In some examples, the imaging system 1100 of
In the illustrated example, each measured spot lies on the intersection of two lines: a projection line and a vision constraining line. If geometric information about the projected line is known, a three-dimensional point Mi=[Xw, Yw, Zw]t can be determined from an image point mi=[Xv, Yv]t. Suffix i indicates the spot number. The expression for the projection line is shown in Equation 1:
Mi=c+δsi (i=1, . . . , Np). Equation 1:
The projection line of Equation 1 is a line with gradient si, passing through a projection center c and on which the measured spot i lies. Np is a total number of projected spots. An expression for the vision constraining line is shown in Equation 2 below:
P{tilde over (M)}i=w{tilde over (m)}i. Equation 2:
The expression of the vision constraining line illustrates a relationship between the image point {tilde over (m)}i=[mit, 1]t of spot i and the three-dimensional point {tilde over (M)}i connected by the perspective projection matrix P.
In Equations 1 and 2, c, si, and P are known parameters, and mi is observed data. The three-dimensional point Mi is obtained from Equations 1 and 2 from the observed image points. The example imaging system 1100 enables high-speed image processing employing a large number of calculations by using a parallel and dedicated vision processing unit as a co-processor. An example vision processing unit is described in Watanabe, et al., "955-fps Real-time Shape Measurement of a Moving/Deforming Object using High-speed Vision for Numerous-point Analysis," 2007 IEEE International Conference on Robotics and Automation, Roma, Italy, 10-14 Apr. 2007.
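Under the assumptions of Equations 1 and 2, the three-dimensional point for a spot can be recovered by intersecting the projection line with the perspective constraint. A minimal sketch follows; the projection matrix and line parameters are illustrative, not values from the disclosure:

```python
import numpy as np

def triangulate_spot(c, s, P, m):
    """Intersect the projection line Mi = c + delta*si (Equation 1) with the
    vision constraint P*[Mi, 1] = w*[mi, 1] (Equation 2) for one spot.

    c, s : (3,) projection center and projection-line gradient
    P    : (3, 4) perspective projection matrix
    m    : (2,) observed image point (Xv, Yv)
    """
    # Each image coordinate yields one linear equation in delta:
    # (P[k] - m[k]*P[2]) . [c + delta*s, 1] = 0, for k = 0, 1.
    A, b = [], []
    for k in range(2):
        row = P[k] - m[k] * P[2]
        A.append(row[:3] @ s)
        b.append(-(row[:3] @ c + row[3]))
    delta = np.linalg.lstsq(np.array(A)[:, None], np.array(b), rcond=None)[0][0]
    return c + delta * s

# Example: camera P = [I | 0]; the projection line passes through (1, 2, 0)
# with gradient (0, 0, 1). The image point (0.25, 0.5) is the perspective
# projection of the 3-D point (1, 2, 4), which the intersection recovers.
P = np.hstack([np.eye(3), np.zeros((3, 1))])
M = triangulate_spot(np.array([1., 2., 0.]), np.array([0., 0., 1.]),
                     P, np.array([0.25, 0.5]))
```

Solving the two constraints by least squares tolerates small measurement noise in the observed image point.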
In some examples, the image processor 1106 calculates image moments as spot information. The image moments are parameters that can be converted or formatted to various geometric features such as, for example, size, centroid, orientation, shape information, and/or other geometric features. The (i+j)th image moments mij are calculated from Equation 3 below:
mij=ΣxΣyxiyjI(x, y). Equation 3:
In Equation 3, I(x, y) is the value at pixel (x, y). In the illustrated example, by employing a parallel processing unit, the example image processor 1106 uses O(√n) calculations and enables observation or monitoring of a few thousand objects at frame rates of thousands of frames per second.
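Equation 3 can be sketched directly; the array below stands in for a captured spot image (an illustrative value, not measured data), and the spot centroid is recovered from the first-order moments:

```python
import numpy as np

def image_moment(I, i, j):
    """(i+j)th image moment mij = sum over x, y of x**i * y**j * I(x, y)."""
    ys, xs = np.mgrid[0:I.shape[0], 0:I.shape[1]]
    return float(np.sum(xs ** i * ys ** j * I))

# A single bright spot of intensity 3 at pixel (x=2, y=1).
I = np.zeros((4, 4))
I[1, 2] = 3.0

m00 = image_moment(I, 0, 0)       # size (total intensity)
cx = image_moment(I, 1, 0) / m00  # centroid x = m10 / m00
cy = image_moment(I, 0, 1) / m00  # centroid y = m01 / m00
```

Higher-order moments computed the same way yield orientation and shape descriptors for each spot.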
A geometrical relationship between the image detection plane 1104 and each spot projected via the light source 1102 is predetermined via calibration. Calibration can be set by determining the following three functions of Equation 4 from known pairs of three-dimensional points Mi and image points mi of each projected spot i without obtaining intrinsic parameters c, si, and P:
[xw,yw,zw]t=[f1i(zw),f2i(zw),f3i(Xv)]t. Equation 4:
Functions f1i and f2i are used to determine the xw and yw coordinates of the three-dimensional point for spot i from a depth distance zw. The relationships are expressed as linear functions in Equation 5 below:
fji(zw)=αj,1(i)zw+αj,0(i) (j=1, 2). Equation 5:
The function f3i is used to determine the depth distance zw from the Xv coordinate of an image point. In some examples, the function f3i is expressed as a hyperbola about Xv and Yv. In other examples (e.g., over a small range), the function f3i can be determined via a polynomial expression shown in Equation 6 below:
f3i(Xv)=Σk=1nα3,k(i)Xvk. Equation 6:
In some examples, a two-dimensional polynomial approximation is employed. In some examples, the function f3i is determined by projecting multiple spot patterns onto xw-yw planes at known distances zw.
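A minimal calibration sketch for Equations 5 and 6, assuming synthetic calibration pairs for one spot (the ground-truth relationships below are illustrative, not measured values):

```python
import numpy as np

# Synthetic calibration pairs for one projected spot: known depths zw,
# the corresponding xw coordinates, and the measured image coordinate Xv.
zw = np.array([10., 20., 30., 40.])
xw = 0.5 * zw + 1.0     # xw is linear in zw, as in Equation 5
Xv = 100.0 / zw         # depth-to-image relationship (hyperbolic)

# Fit Equation 5: f1i(zw) = a1*zw + a0, by least squares.
a1, a0 = np.polyfit(zw, xw, 1)

# Fit Equation 6: f3i(Xv) as a polynomial in Xv approximating the
# hyperbola over this small range (degree 3 over four samples).
poly3 = np.polyfit(Xv, zw, 3)
zw_est = np.polyval(poly3, 5.0)   # estimated depth at image coordinate Xv = 5
```

Once the coefficients are fitted per spot, the intrinsic parameters c, si, and P of Equations 1 and 2 are not needed at run time.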
In some examples, the image processor 1106 determines which image point corresponds to each projected spot based on a previous frame via a tracking-based technique, which can perform dynamic modification of a search area according to pattern changes. In some examples, at a beginning or an outset of the measurement, initialization is performed.
A start time t(i) of projecting about each spot i is expressed as follows:
t(i)=Tδ(i∈Aδ: δ=1, . . . , Ne). Equation 7:
In Equation 7, Aδ is a class of projected spots having epipolar lines li(Yv=li(Xv)) constraining movement of spot i in the image space that do not intercross. Ne is the number of divided classes. Initialization enables high versatility. Moreover, because this spot pattern is already projected when commencing sequential frame operation, substantially no loss of three-dimensional shape information occurs after the measurement begins.
After initialization, three-dimensional shape information is measured in input frames. When the frame rate is high relative to changes in the target shape, differences between spots projected on a smooth surface in successive frames are small. Thus, the operation of corresponding an image point to a spot i can be expressed as a tracking operation between frames, in which a point mi(t−1) corresponding to a point m(t) is searched for using the corresponded points at time t−1 based on the following evaluation:
min{|mi(t−1)−m(t)|+|Mi(t−1)−{tilde over (M)}(t)|}. Equation 8:
Searching of neighbor points in two-dimensional image space can be performed using a bucket method, which can efficiently perform the search operation of the nearest point to an input point by dividing the search space into grids and accessing neighbor areas. The bucket method enables the number of calculations to have a linear relationship relative to the number of measured image points if the points are distributed substantially equally, which results in an equal number of points included within each grid.
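The bucket method described above can be sketched as a grid hash; searching a query's cell and its eight neighbors finds the nearest previous-frame point when inter-frame motion is smaller than the cell size. All coordinates here are illustrative:

```python
import math
from collections import defaultdict

def build_buckets(points, cell):
    """Hash each point index into a grid cell of size `cell`."""
    buckets = defaultdict(list)
    for idx, (x, y) in enumerate(points):
        buckets[(int(x // cell), int(y // cell))].append(idx)
    return buckets

def nearest(query, points, buckets, cell):
    """Search only the query's cell and its 8 neighbors for the nearest point.

    Valid when the true nearest neighbor lies within one cell of the query,
    as in frame-to-frame tracking with small spot motion.
    """
    qx, qy = int(query[0] // cell), int(query[1] // cell)
    best, best_d = None, math.inf
    for gx in (qx - 1, qx, qx + 1):
        for gy in (qy - 1, qy, qy + 1):
            for idx in buckets.get((gx, gy), ()):
                d = (points[idx][0] - query[0]) ** 2 + (points[idx][1] - query[1]) ** 2
                if d < best_d:
                    best, best_d = idx, d
    return best

# Previous-frame image points; the new observation (5.2, 4.8) is matched
# to its nearest previous point without scanning the full list.
prev = [(1.0, 1.0), (5.0, 5.0), (9.0, 1.0)]
buckets = build_buckets(prev, cell=2.0)
match = nearest((5.2, 4.8), prev, buckets, cell=2.0)
```

With points spread roughly evenly across the grid, each query touches a bounded number of candidates, giving the linear overall cost noted above.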
In some examples, points move discontinuously because they are on points of contact between the measured object and the projected line of the spot. These points are mapped exceptionally by using the epipolar line based on the following evaluation:
min{|Yv(t)−li(Xv(t))|}. Equation 9:
The number of these discontinuously moving points can be assumed to be small. In some examples, constraints are defined for the speed at which these points jump or change in the depth direction between frames in order to avoid overlapping spots in the image space.
In some examples, the light source 1202 includes one or more lasers, light emitting diodes, and/or any other light source. Light generated via the light source 1202 may be directed toward a target via an optical fiber, an optical fiber bundle and/or optics (e.g., lenses, filters, etc.). In some examples, the light source 1202 generates light having a wavelength that enables the light to propagate through flushing fluid projected into a field of view of the example imaging system 1200. In some examples, the light source 1202 directs a pattern of light such as, for example, an array of spots onto and/or toward the target.
In the illustrated example, the image sensor 1204 can be implemented via a camera, a video camera, an image detection plane such as the example image sensor 1104 of
The example three-dimensional shape information determiner 1208 of the example imaging system 1200 determines three-dimensional shape information of the target based on the images captured and/or the light received via the image sensor 1204. For example, the three-dimensional shape information determiner 1208 may determine three-dimensional shape information based on the technique described above in conjunction with
The example formatter 1210 formats and/or processes the three-dimensional shape information to facilitate storage of the three-dimensional shape information, real-time communication of the three-dimensional shape information, and/or generation of image(s). In some examples, the formatter 1210 generates vector data based on the image(s) and/or the three-dimensional shape information. In some examples, the vector data is a spatial gradient vector field (e.g., grad
In some examples, the vector data includes a shape, a size, a plurality of measurements, and/or other three-dimensional shape information.
In the illustrated example, the first database 1214 includes predetermined target information such as, for example, target names or types, target three-dimensional patterns (e.g., textures), shapes, sizes, and/or other predetermined target information. In some examples, the predetermined target data is organized and/or indexed via one or more database indexes (e.g., numbers, letters, and/or any database index and/or organizational scheme). In some examples, the first database 1214 is used to store downhole tool depth information, downhole tool orientation information, and/or any other information generated via the downhole tool sensor(s) 1218.
The example database manager 1212 of
The example output generator 1216 generates an output and communicates the output to the surface system 1222 via a telemetry system 1226 employing, for example, a transmitter, a telemetry link (e.g., a mud-pulse telemetry link, etc.) and/or any other telemetry tools. In some examples, the output generator 1216 generates an output including one or more images, three-dimensional shape information, vector data, one or more database indexes, and/or outputs including other information. In some examples, the telemetry system 1226 has limited or low bandwidth, and the output generator 1216 generates an output communicable in real-time to the surface system 1222. For example, the output generator 1216 may communicate the database index and/or vector data without images of the target.
The example surface system 1222 of
In some examples, if the data manager 1228 receives a database index from the example imaging system 1200, the data manager 1228 may retrieve predetermined target information stored in the second database 1224 that is assigned to and/or associated with the database index. In some examples, the second database 1224 includes more predetermined target information than the first database 1214. For example, the first database 1214 may include predetermined texture data, and the second database 1224 may include information associated with the predetermined texture data such as, for example, a composition of a portion of a subterranean formation, an indication of a condition of a casing (e.g., presence of corrosion, cracks, perforations, etc.), an indication of a borehole window, an indication of material build-up around the borehole window, and/or other target information. Thus, the three-dimensional shape information determined via the example imaging system 1200 may be used to determine and/or retrieve information related to the target.
The predetermined target information may be presented to an operator of a downhole tool via the display 1232 and/or used by the downhole tool controller 1234 to control operation of the downhole tool. In some examples, the image generator 1230 generates images of the target based on the output communicated to the example surface system 1222. For example, if the output is vector data, the example image generator 1230 may generate one or more images based on the vector data, and the images may be displayed via the example display 1232 of
In some examples, the example downhole tool controller 1234 controls operation of the imaging system 1200 and/or the downhole tool on which the example imaging system 1200 is disposed based on the output generated via the output generator 1216. For example, if the data manager 1228 receives three-dimensional shape information and/or images from the imaging system 1200 and determines that the downhole tool is adjacent a borehole window, the example downhole tool controller 1234 may operate the downhole tool to move the downhole tool through the borehole window and into a lateral borehole as described in conjunction with
While an example manner of implementing the example imaging system 502 of
Flowcharts representative of example methods for implementing the example imaging system 502 of
As mentioned above, the example methods of
The example method 1300 of
Three-dimensional shape information of the target is determined based on the light received via an image detection plane of the imaging system (block 1306). In some examples, the three-dimensional shape information includes a plurality of measurements based on where the light is received by the image detection plane relative to the pattern of light directed toward the target. In some examples, the example image processor 1106 determines the three-dimensional information using the technique described in Watanabe, et al., “955-fps Real-time Shape Measurement of a Moving/Deforming Object using High-speed Vision for Numerous-point Analysis,” 2007 IEEE International Conference on Robotics and Automation, Roma, Italy, 10-14 Apr. 2007.
A characteristic of the target is determined based on the three-dimensional shape information (block 1308). The characteristic may include a size; a shape; a texture; recognition and/or identification of an object such as, for example, a composition of a subterranean formation, a borehole window, material buildup, a crack, a perforation, etc.; recognition and/or identification of a condition of an object such as, for example, corrosion, wear, etc.; movement of an object; and/or any other characteristic. In some examples, the characteristic of the target is determined by analyzing the three-dimensional shape information and/or one or more images generated based on the three-dimensional shape information. The example method 1300 then returns to block 1302 and, thus, may be used to monitor targets in the optical field of view of an imaging system while a downhole tool is operating such as, for example, during drilling, navigation of the downhole tool through a multilateral well, sampling, etc.
A first pattern of light is directed into an optical field of view of the imaging system (block 1404). For example, a light source (e.g., the example light source 1102) of the example first imaging system 702 may direct an array of spots toward the wall 726 of the first borehole 720. Three-dimensional shape information of a target is determined via a processor of the imaging system based on the first pattern of light and a second pattern of light received via an image sensor (block 1406). For example, some of the spots of light directed onto the wall 726 may be directed to the image detection plane 1104. In some examples, the spots of light may be directed from the wall 726 to the image detection plane 1104 at angles different than angles at which the spots of light were directed onto the wall 726 via the light source 1102 because of a shape (e.g., curvature, texture, presence of cracks or apertures, etc.) of the wall 726. In some examples, the image processor 1106 determines a plurality of measurements based on where the spots of light are received on the image detection plane 1104 and/or where the spots of light are not received on the image detection plane 1104 to determine three-dimensional shape information of the target. For example, the technique described in Watanabe, et al., “955-fps Real-time Shape Measurement of a Moving/Deforming Object using High-speed Vision for Numerous-point Analysis,” 2007 IEEE International Conference on Robotics and Automation, Roma, Italy, 10-14 Apr. 2007 may be employed to determine the three-dimensional shape information.
An image is generated based on the three-dimensional shape information (block 1408). For example, the three-dimensional shape information may be formatted and/or processed to generate vector data, and the vector data is communicated to a surface system (e.g., the example electronics and processing unit 306 of
The downhole tool is controlled based on the image (block 1410). For example, an operator of the downhole tool 700 may operate the example bent sub 900 to move the downhole tool 700 from the first borehole 720 through the window 724 and into the second borehole 722 by orienting the bent sub 900 such that an optical field of view of the second imaging system 704 is substantially centered relative to the window 724 using the image generated via the first imaging system 702 and/or an image generated via the second imaging system 704. In some examples, if corrosion and/or material buildup on and/or near the window 724 is detected based on the image generated via the first imaging system 702 and/or the second imaging system 704, treatment fluid is projected toward and/or near the window 724 to remove and/or reduce the corrosion and/or material buildup. In other examples, the downhole tool 700 is operated in other ways based on the image(s). The example method 1400 then returns to block 1402.
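The centering operation at block 1410 can be sketched as a simple feedback step: compute the offset of the detected window's centroid from the center of the field of view and steer until the offset falls within a tolerance. The function names and the coarse left/right/up/down command are illustrative, not from the disclosure; the actual bent sub steering interface is tool-specific.

```python
# Sketch: center the imaging field of view on a detected borehole window
# (hypothetical names; the real steering interface is tool-specific).

def window_offset(window_pixels, image_width, image_height):
    """Offset of the window centroid from the image center, in pixels."""
    cx = sum(x for x, _ in window_pixels) / len(window_pixels)
    cy = sum(y for _, y in window_pixels) / len(window_pixels)
    return cx - image_width / 2.0, cy - image_height / 2.0

def steering_command(offset, tolerance_px=5.0):
    """Return a coarse steer direction, or None once roughly centered."""
    dx, dy = offset
    if abs(dx) <= tolerance_px and abs(dy) <= tolerance_px:
        return None  # substantially centered: advance through the window
    return ("right" if dx > 0 else "left", "down" if dy > 0 else "up")
```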
The shape characteristic data is matched with first predetermined target data stored in a first database (block 1506). For example, the formatter 1210 may generate vector data based on the shape characteristic data, and the database manager 1212 may match the vector data to predetermined target data such as, for example, texture data stored in the first database 1214 via spatial correlation. A database index associated with the first predetermined target data is determined (block 1508). For example, the first predetermined target data stored in the first database 1214 may be assigned one of a plurality of database indexes (e.g., letters, numbers and/or other designations), and the database manager 1212 determines which one of the database indexes is assigned to the first predetermined target data.
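Blocks 1506 and 1508 can be sketched as a nearest-match lookup: score the vector data against each stored predetermined target entry with a correlation measure and return the index of the best match. The normalized dot-product score below stands in for the spatial correlation mentioned above; the database contents are illustrative.

```python
# Sketch of matching shape-characteristic vector data against a database
# of predetermined target data via correlation (contents illustrative).

def correlation(a, b):
    """Normalized dot-product similarity of two equal-length vectors."""
    num = sum(x * y for x, y in zip(a, b))
    den = (sum(x * x for x in a) * sum(y * y for y in b)) ** 0.5
    return num / den if den else 0.0

def best_database_index(vector, database):
    """Return the index whose stored target data best matches the vector."""
    return max(database, key=lambda idx: correlation(vector, database[idx]))
```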
The database index is communicated to a receiver at or near a surface of Earth (block 1510). For example, the database index may be communicated via the telemetry system 1226 to a receiver (e.g., the transceiver hub 438 of the coiled tubing reel 410) of the surface system 1222. In some examples, the three-dimensional shape information and/or the shape characteristic data is stored in the first database 1214, and the database index is communicated to the receiver via a low bandwidth telemetry link such as, for example, a mud pulse telemetry link.
Second predetermined target information is retrieved from a second database using the database index (block 1512). For example, the second database 1224 may be organized using the same or similar database indexes as the example first database 1214. Thus, the example data manager 1228 of the example surface system 1222 may use the database index communicated from the example imaging system 1200 to retrieve second predetermined target data from the second database 1224 that is assigned and/or associated with the database index and different than the first predetermined target data. In some examples, the retrieved predetermined target data includes, for example, information related to a subterranean formation (e.g., a composition of a portion of the subterranean formation), information related to a borehole window (e.g., a size of the borehole window, mapping information of a lateral borehole defining the borehole window, identification of corrosion and/or material buildup), a condition of a target (e.g., presence of cracks, perforations, wear, etc. of a casing) and/or any other information. In some examples, the predetermined target information is presented in real-time to an operator of the downhole tool. Thus, the operator may be presented with information related to objects detected downhole via the imaging system 1200.
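The two-database arrangement of blocks 1510 and 1512 can be sketched as follows: the bulky shape data stays in the downhole database, only a short index crosses the low-bandwidth telemetry link, and the uphole database, keyed by the same indexes, supplies the richer predetermined target information. All names and database contents below are illustrative assumptions, and the encode/decode pair merely stands in for a mud pulse telemetry link.

```python
# Sketch of the two-database pattern: only a short index crosses the
# low-bandwidth link; the uphole database, keyed identically, supplies
# the second predetermined target information (contents illustrative).

downhole_db = {"W1": [0.2, 0.7, 0.1]}           # shape data kept downhole
uphole_db = {"W1": {"type": "borehole window",  # richer uphole record
                    "size_in": 4.5}}

def send_index(idx):
    """Stand-in for mud pulse telemetry: transmit a few bytes, not images."""
    return idx.encode("ascii")

def lookup_uphole(payload):
    """Surface-side retrieval using the shared database index."""
    return uphole_db[payload.decode("ascii")]
```

The design choice here is that the telemetry payload is a few bytes regardless of how large the stored shape data is, which is what makes real-time presentation over mud pulse telemetry feasible.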
The processor platform 1600 of the illustrated example includes a processor 1612. The processor 1612 of the illustrated example is hardware. For example, the processor 1612 can be implemented by one or more integrated circuits, logic circuits, microprocessors or controllers from any desired family or manufacturer.
The processor 1612 of the illustrated example includes a local memory 1613 (e.g., a cache). The processor 1612 of the illustrated example is in communication with a main memory including a volatile memory 1614 and a non-volatile memory 1616 via a bus 1618. The volatile memory 1614 may be implemented by Synchronous Dynamic Random Access Memory (SDRAM), Dynamic Random Access Memory (DRAM), RAMBUS Dynamic Random Access Memory (RDRAM) and/or any other type of random access memory device. The non-volatile memory 1616 may be implemented by flash memory and/or any other desired type of memory device. Access to the main memory 1614, 1616 is controlled by a memory controller.
The processor platform 1600 of the illustrated example also includes an interface circuit 1620. The interface circuit 1620 may be implemented by any type of interface standard, such as an Ethernet interface, a universal serial bus (USB), and/or a PCI express interface.
In the illustrated example, one or more input devices 1622 are connected to the interface circuit 1620. The input device(s) 1622 permit(s) a user to enter data and commands into the processor 1612. The input device(s) can be implemented by, for example, an audio sensor, a microphone, a camera (still or video), an image detection plane, a keyboard, a button, a mouse, a touchscreen, a track-pad, a trackball, isopoint and/or a voice recognition system.
One or more output devices 1624 are also connected to the interface circuit 1620 of the illustrated example. The output devices 1624 can be implemented, for example, by display devices (e.g., a light emitting diode (LED), an organic light emitting diode (OLED), a liquid crystal display, a cathode ray tube display (CRT), a touchscreen, a tactile output device, a printer and/or speakers). The interface circuit 1620 of the illustrated example, thus, may include a graphics driver card, a graphics driver chip or a graphics driver processor.
The interface circuit 1620 of the illustrated example also includes a communication device such as a transmitter, a receiver, a transceiver, a modem and/or network interface card to facilitate exchange of data with external machines (e.g., computing devices of any kind) via a network 1626 (e.g., an Ethernet connection, a digital subscriber line (DSL), a telephone line, coaxial cable, a cellular telephone system, etc.).
The processor platform 1600 of the illustrated example also includes one or more mass storage devices 1628 for storing software and/or data. Examples of such mass storage devices 1628 include floppy disk drives, hard disk drives, compact disk drives, Blu-ray disk drives, RAID systems, and digital versatile disk (DVD) drives.
The coded instructions 1632 of
From the foregoing, it will be appreciated that the above disclosed methods, apparatus and articles of manufacture enable three-dimensional shape information to be determined and/or used to monitor downhole objects and/or conditions substantially in real-time. Some examples disclosed herein enable real-time communication of the three-dimensional shape information acquired downhole to a surface system. As a result, image generation and, thus, image monitoring and/or analysis may be performed uphole and/or at the surface system in real-time. In some examples, the three-dimensional shape information is used to control operation of a downhole tool. Some examples disclosed herein employ a downhole database and an uphole database to enable uphole retrieval and/or presentation of predetermined information related to a downhole target based on the three-dimensional shape information.
Although only a few examples have been described in detail above, those skilled in the art will readily appreciate that many modifications are possible in the examples without materially departing from this disclosure. Accordingly, such modifications are intended to be included within the scope of this disclosure as defined in the following claims. In the claims, means-plus-function clauses are intended to cover the structures described herein as performing the recited function and not only structural equivalents, but also equivalent structures. Thus, although a nail and a screw may not be structural equivalents in that a nail employs a cylindrical surface to secure wooden parts together, whereas a screw employs a helical surface, in the environment of fastening wooden parts, a nail and a screw may be equivalent structures. It is the express intention of the applicant not to invoke 35 U.S.C. §112, paragraph 6 for any limitations of any of the claims herein, except for those in which the claim expressly uses the words ‘means for’ together with an associated function.
The Abstract at the end of this disclosure is provided to comply with 37 C.F.R. §1.72(b) to allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims.