Disclosed embodiments relate to the field of inspection of storage tanks that hold liquids, and more particularly to the automatic inspection of such storage tanks.
It is standard practice to use large metal storage tanks for storing a variety of liquids, such as beverage and petroleum products. Conventional large storage tanks are usually made from non-stainless steel plates, and in the case of petroleum products the storage tanks are generally made from ¼ inch (0.63 cm) to ½ inch (1.27 cm) thick steel plates welded together. Conventional large storage tanks typically measure on the order of hundreds of feet (100 feet = 30.5 meters) in height and hundreds of feet in diameter. Tanks of this type require regular inspection and cleaning. Inspection of a large storage tank may include determining the sediment or contamination levels within the tank and evaluating the structural integrity of the tank (e.g., the tank shell).
Currently, the inspection and cleaning processes are performed manually, which poses certain disadvantages. First, manual inspection of conventional large storage tanks is prone to human error. Second, manual inspection of such expansive units can be cumbersome for the human inspector. Third, such a process can pose serious health hazards to the personnel involved. In addition, the inspection process usually requires taking the storage tank out of production, which can lead to revenue losses.
Another problem associated with the manual inspection of conventional large storage tanks involves the presence of multiple layers formed within the tank due to various compositions. In a petroleum product application, for example, multiple layers, such as crude, paraffin, water, and sediment (collectively referred to as “sludge”), may form within the storage tank. Because of its higher density compared to the stored petroleum product, sludge may settle at the bottom of the tank or, with prolonged storage, stick to the sides of the tank. The sludge layer may be anywhere from several millimeters to one meter or more thick.
One of the goals of a proper inspection is to estimate the sludge level in a storage tank, as well as the other layer levels, as accurately as possible, in order to proceed with cleaning if required. It is difficult, however, for human inspectors to measure the sludge level beneath other layers of liquid, and it is not easy for them to manually measure tens of sampling points from the top of the tank. Another concern is the integrity of the tank shell, which can degrade due to aging, earthquakes, severe weather, and similar impacts; assessing shell integrity with traditional visual methods generally requires an accurate inspection of an emptied storage tank.
Therefore, there is a need to improve upon the inspection of storage tanks that store liquids, and, more specifically, a need for an automated system and method for inspecting large storage tanks that can accurately determine liquid levels within, as well as the sludge level.
Disclosed embodiments include a multi-sensor method for inspecting storage tanks that hold a liquid. Although each sensor has its own particular limitations when working in isolation, as disclosed herein, data from multiple sensors is combined so that the sensors complement one another to provide a reliable and accurate solution for the inspection of storage tanks that hold liquids.
The method includes generating an infrared (IR) image and an ultrasonic image of the exterior of the storage tank, and generating at least one of an ultrasonic (ultrasound) image and a radar image of the interior of the storage tank. The method further includes fusing the respective images, to generate a sludge level in the storage tank, a liquid level in the storage tank, and optionally an integrity profile of a shell of the storage tank.
Another disclosed embodiment comprises a multi-sensor imaging system for a storage tank that holds a liquid. The imaging system comprises an IR sensor including a scanning vertical mount positioned outside the storage tank, an ultrasonic sensor positioned outside the storage tank, and at least one of an ultrasonic sensor positioned inside the storage tank and a phased array radar positioned inside the storage tank secured to the interior top surface of the storage tank. The imaging system further includes a processor coupled to receive multi-sensor data from the respective sensors. The processor fuses (i.e. combines) the multi-sensor data to generate a sludge level in the storage tank, a liquid level in the storage tank, and optionally an integrity profile for a shell of the storage tank.
Disclosed embodiments are described with reference to the attached figures, wherein like reference numerals are used throughout the figures to designate similar or equivalent elements. The figures are not drawn to scale and they are provided merely to illustrate certain disclosed aspects. Several disclosed aspects are described below with reference to example applications for illustration. It should be understood that numerous specific details, relationships, and methods are set forth to provide a full understanding of the disclosed embodiments. One having ordinary skill in the relevant art, however, will readily recognize that the subject matter disclosed herein can be practiced without one or more of the specific details or with other methods. In other instances, well-known structures or operations are not shown in detail to avoid obscuring certain aspects. This Disclosure is not limited by the illustrated ordering of acts or events, as some acts may occur in different orders and/or concurrently with other acts or events. Furthermore, not all illustrated acts or events are required to implement a methodology in accordance with the embodiments disclosed herein.
Disclosed embodiments include a multi-sensor method and system for inspecting a storage tank that includes a liquid.
The phased array radar system 112 includes an antenna array 113, where the radiating elements of the antenna array 113 are connected to an electronics and processing unit comprising high-resolution RF transmitter-receiver modules 118. The wide field of view (FOV) provided by the phased array radar system 112 is shown in
Data fusion of the data from the respective sensors (e.g., infrared, ultrasonic, and phased array radar) can be realized by converting the respective sensor measurements into a common coordinate system, since each of the sensors is placed at a different location and the reference axis therefore varies from sensor to sensor. Measurements taken from the individual sensors can be mapped to a common reference system. Positioning data, such as from a global positioning system (GPS) associated with the sensors, helps to map the measurements from each of them onto one common coordinate system.
Alternatively, cross-sensor calibration, which refers to calibration between heterogeneous (i.e., different) sensors, can be used to convert the measurements from the different sensors into the common coordinate system. In one embodiment, mounting both the IR sensor 104 and the ultrasound sensor array 110 on the same vertical mount outside the tank 102 can ease the data fusion processing.
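By way of illustration, the mapping of sensor-frame measurements into the common coordinate system can be expressed as a rigid transform (rotation plus translation). The following is a minimal sketch, assuming for simplicity a yaw-only rotation and a translation taken from the sensor's surveyed or GPS-derived mounting position; the function and parameter names are illustrative only.

```python
import numpy as np

def to_common_frame(points_sensor, yaw_deg, translation):
    """Map 3D measurement points from a sensor's local frame into the
    common tank coordinate system using a yaw rotation and a translation
    derived from the sensor's surveyed (e.g., GPS) mounting position.

    points_sensor: (N, 3) array of (x, y, z) points in the sensor frame.
    yaw_deg:       sensor heading relative to the tank's reference axis.
    translation:   (3,) sensor origin expressed in the tank frame.
    """
    yaw = np.deg2rad(yaw_deg)
    rot = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                    [np.sin(yaw),  np.cos(yaw), 0.0],
                    [0.0,          0.0,         1.0]])
    return points_sensor @ rot.T + np.asarray(translation)

# Example: a point measured 5 m in front of a sensor mounted 18 m above the
# tank floor, with the sensor boresight rotated 30 degrees (illustrative values).
common = to_common_frame(np.array([[5.0, 0.0, -2.0]]), 30.0, [0.0, 0.0, 18.0])
```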
Once the measurements from the different sensors are transformed into a common coordinate system, non-uniform interpolation (or non-linear interpolation, since the profiling deals with rough surface fluctuation) can be performed to populate the missing measurements and produce a continuous 3D sludge profile. Since the measurements from the different sensors are distributed randomly inside the tank, the missing measurement points need to be filled in to obtain a continuous 3D (complete 360°) profile. Many approaches for filling missing measurements from the available measurements are known in the literature, including, but not limited to, cubic spline, volume spline, and polynomial interpolation. This data fusion process can provide a complete 360° sludge profile 120 inside the tank 102. Further, in one embodiment, the sludge profile 120 may be visualized with an interactive visualization tool provided on a computer, through which a tank operator can view the complete sludge profile by panning and zooming the tank image displayed on a suitable display device.
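As a concrete illustration of this interpolation step, the following minimal sketch fills a regular grid over the tank floor from scattered sludge-height measurements. It assumes the fused measurements are already expressed in the common coordinate system; scipy's general-purpose scattered-data interpolation is used here only as a stand-in for whichever spline or polynomial method is chosen, and the measurement values are illustrative.

```python
import numpy as np
from scipy.interpolate import griddata

def sludge_profile(xy, heights, radius, grid_step=0.25):
    """Interpolate scattered sludge-height measurements (metres) onto a
    regular grid covering a circular tank floor of the given radius."""
    xs = np.arange(-radius, radius + grid_step, grid_step)
    gx, gy = np.meshgrid(xs, xs)
    # Cubic interpolation inside the convex hull of the measurements;
    # nearest-neighbour fill for grid cells outside that hull.
    z = griddata(xy, heights, (gx, gy), method="cubic")
    z_near = griddata(xy, heights, (gx, gy), method="nearest")
    z = np.where(np.isnan(z), z_near, z)
    z[gx**2 + gy**2 > radius**2] = np.nan  # mask cells outside the tank
    return gx, gy, z

# Example with a handful of fused measurement points (illustrative values):
xy = np.array([[0.0, 0.0], [5.0, 3.0], [-6.0, 2.0], [4.0, -7.0], [-3.0, -5.0]])
heights = np.array([0.40, 0.55, 0.35, 0.70, 0.50])
gx, gy, profile = sludge_profile(xy, heights, radius=20.0)
```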
The phased array radar 112 for the multi-sensor imaging system 100 can comprise a high-resolution phased-array imaging radar for imaging the liquid (e.g., oil) surface and the sludge surface 120 within the storage tank 102. Radar (radio detection and ranging) instruments are conventionally used for level measurement in tanks. Existing radar-based gauging solutions, however, are limited to measuring a single liquid level, which is a single-point measurement. The disclosed use of phased array radar facilitates measuring non-uniform profiles of sludge, which requires multi-point measurements. Depending on the product type and level within the tank, the phase and power of the radar signal can be varied to accommodate different dielectric media.
Moreover, conventional radar level gauges cannot be used for imaging inside a large storage tank because they are developed for distance measurements, because they have large beam widths covering a wide footprint on the surface of the liquid inside the tank, and because their antennas must be sealed to the nozzle on the storage tank to prevent leakage (e.g., of petrochemical gas) from the tank. Further, moving parts on the antenna and radar body inhibit their use in storage tanks. Imaging the contents of a tank with conventional radar gauging technology is therefore not generally possible.
Disclosed embodiments overcome the above-described problems by including a phased array radar 112 that can be a miniaturized high-resolution phased-array radar, which has a narrow beam-width, a smaller footprint, broad bandwidth, and electronically-controlled beam scanning over the surface of the liquid and/or sludge on the tank bottom.
The phased-array radar 112 is mounted on a fixed mechanical structure 208 inside the tank, shown at the top of the tank 102. The electronics and processing unit comprising the high-resolution RF transmitter-receiver modules 118 associated with the phased array radar 112 are generally located outside the tank 102, as shown in
The beamforming of the phased-array radar 112 can be performed by software, hardware, or any combination of the two. Since the antenna array 113 is sealed and generally fixed to the tank nozzle, the radar beam is electronically steered through an angle 212 to cover the area of the tank content. The phased-array radar 112 can not only provide the surface profile of the tank bottom and the sludge, but it can also provide accurate intermediate liquid-level data, which are currently measured by level gauges. Since movement of the liquid(s) within the tank 102 tends to be much more frequent than changes in the sludge and/or bottom, the phased-array radar 112 can be used as a level gauging radar when liquid movement takes place, in addition to being used as an imaging radar to detect the sludge and any defects in the bottom of the tank 102.
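For illustration, electronic beam steering of a uniform linear array can be sketched as applying a progressive phase shift across the array elements. The element count, spacing, operating frequency, and steering angle below are assumed values chosen for the example, not parameters taken from this disclosure.

```python
import numpy as np

def steering_weights(n_elements, spacing_m, wavelength_m, steer_deg):
    """Complex weights that steer a uniform linear array's main beam to
    steer_deg off boresight: a progressive phase shift of
    2*pi*d*sin(theta)/lambda between adjacent elements."""
    k = 2.0 * np.pi / wavelength_m
    n = np.arange(n_elements)
    phase = -k * spacing_m * n * np.sin(np.deg2rad(steer_deg))
    return np.exp(1j * phase)

# Example: 16 elements at half-wavelength spacing for a 77 GHz radar,
# steering the beam 20 degrees off boresight (illustrative values only).
wavelength = 3e8 / 77e9
weights = steering_weights(16, wavelength / 2, wavelength, 20.0)
```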
Multi-sensor imaging system 100 shown in
Ultrasound technology can be used for calculating the sludge level 120 in a tank 102. The acoustic waves transmitted by the ultrasonic sensor into the tank provide different responses for different liquid levels, since sound waves travel at different velocities in different media and the reflected signal varies between sludge and other liquids. Ultrasonic waves can further penetrate through metallic material, such as the tank wall and tank bottom. Consequently, ultrasound technology can also detect defects/cracks in the tank shell, which aids in preventing the potentially large losses caused by leakages. Ultrasound technology further includes 3D ultrasonic sensing that formats sound wave data into 3D images. Moreover, four-dimensional (4D) ultrasound is 3D ultrasound in motion.
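As a simple illustration of the underlying time-of-flight principle, an echo's round-trip delay can be converted into a distance once the speed of sound in the traversed medium is known. The sound speeds below are typical published values and are assumptions for the sketch, not calibration data from this disclosure.

```python
# Minimal time-of-flight sketch: distance to a reflecting interface from the
# round-trip echo delay, given the speed of sound in the medium traversed.
SOUND_SPEED_M_PER_S = {
    "water": 1480.0,      # approximate, near room temperature
    "crude_oil": 1300.0,  # approximate; varies with composition and temperature
    "steel": 5900.0,      # longitudinal waves in the tank wall
}

def echo_distance(round_trip_s, medium="crude_oil"):
    """Distance (m) to the interface that produced an echo after
    round_trip_s seconds, assuming propagation through a single medium."""
    return 0.5 * SOUND_SPEED_M_PER_S[medium] * round_trip_s

# Example: an echo returning after 12 ms through crude oil corresponds to an
# interface roughly 7.8 m away.
d = echo_distance(12e-3, "crude_oil")
```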
Obtaining a sludge profile on only one side of a liquid storage tank is generally not helpful, due to the large circumference of the tank. Thus, obtaining the sludge profile on all sides of the tank is generally desirable. This can be provided by placing more than one sensor around the exterior of the tank 102 such that the sensors cover the complete circumference of the tank. Once data from all sensors is obtained, the data can be used to generate a 2D profile. To obtain a 3D profile, either real 3D data can be used by adding data for a third dimension as described below, or a quasi-3D profile can be synthetically generated by non-linear interpolation as described above. In the case of real 3D data, the top profile of the sludge within the tank 102 can be obtained, such as by an array of sensors placed on the tank that transmit narrow-beam signals; the reflected signals are captured and used to calculate the height distribution of the top profile of the sludge so that a 3D volumetric profile may be generated.
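The following minimal sketch illustrates one way such a height distribution and volumetric estimate could be formed, assuming downward-looking, narrow-beam sensors on the tank top, a single propagation medium above the sludge, and a nominal floor footprint per sensor; all of these, and the numeric values, are assumptions made only for illustration.

```python
import numpy as np

def sludge_heights(echo_delays_s, tank_height_m, sound_speed_m_s=1300.0):
    """Convert round-trip echo delays from downward-looking, narrow-beam
    sensors on the tank top into sludge heights above the tank floor."""
    range_to_sludge = 0.5 * sound_speed_m_s * np.asarray(echo_delays_s)
    return tank_height_m - range_to_sludge

def sludge_volume(heights_m, footprint_m2_per_sensor):
    """Approximate sludge volume by treating each sensor's narrow beam as
    sampling one footprint cell of the tank floor."""
    return float(np.sum(heights_m) * footprint_m2_per_sensor)

# Example: three sensors over an 18 m tall tank, each nominally covering 2 m^2.
heights = sludge_heights([0.0268, 0.0270, 0.0265], tank_height_m=18.0)
volume = sludge_volume(heights, footprint_m2_per_sensor=2.0)
```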
As disclosed above, multi-sensor imaging system 100 also comprises IR sensor(s) or “thermal” cameras. There are several example single-view and multi-view camera configuration options for tank farm monitoring applications disclosed herein. The benefits and limitations of these configurations are also described with respect to complexity, pixel resolution and measurement accuracy.
Disclosed embodiments include many other sensor combinations. In another example arrangement, the imaging system includes an array of ultrasonic sensors on the bottom of the tank that are connected to a single ultrasound sensor or an array of ultrasound sensors on the top of the tank. This arrangement provides direct transmission measurements, in comparison with the more conventional back-reflection sensor measurements.
Where X is the height 312 of the tank 102 in meters and θ is the FOV in radians.
For the above-considered tank dimensions, the pixel resolution is 6.25 cm in the vertical direction and 1 cm in the horizontal direction. If the image-based measurement of the sludge level 120 deviates from the actual level by 1 pixel in the vertical direction, it translates to a variation of 7,960 liters of liquid for the tank dimensions under consideration.
For a tank 102 of 40 meters in diameter and an IR camera 104 having a 24° FOV and a resolution of 640×480, the achievable pixel resolution (and hence measurement accuracy) as a function of the distance between the camera and the tank is summarized in Table 1 provided below.
As evident from the above table, it is desirable to place the IR camera 104 as close as possible to the tank 102 for better pixel resolution and hence measurement accuracy.
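Resolution figures of the kind summarized in Table 1 can be reproduced with a simple pinhole-camera calculation. The sketch below assumes that the 24° FOV spans the 480-pixel dimension of the sensor; that assignment, and the example stand-off distances, are assumptions made for illustration rather than values stated in this disclosure.

```python
import numpy as np

def pixel_resolution_m(distance_m, fov_deg=24.0, pixels=480):
    """Ground-sample distance of one pixel at the tank surface for a camera
    at distance_m, assuming a pinhole model in which fov_deg spans the
    given pixel count."""
    span = 2.0 * distance_m * np.tan(np.deg2rad(fov_deg) / 2.0)
    return span / pixels

# Pixel resolution for a few candidate camera stand-off distances (metres),
# illustrating how values such as those in Table 1 can be computed.
for d in (20.0, 40.0, 60.0, 80.0):
    print(f"{d:5.1f} m -> {100 * pixel_resolution_m(d):.2f} cm/pixel")
```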
While this solution provides advantages for determining the topography of sludge in both uniform and non-uniform deposition scenarios, it faces several technological challenges. These include image registration from texture-less tank surfaces and data/inference fusion from IR cameras with potentially different temperature calibrations.
Consequently, the accuracy of the above solution may be bolstered by taking input from other systems while estimating the sludge level within the tank 102. For example, a predictive model, which takes as input operational data (such as the type of storage, duration of storage, tank age, fluid properties, etc.), inspection data, maintenance data, and prevailing ambient weather data (temperature, humidity, etc.), may be used to produce a statistical prediction of the sludge level 120 within the tank 102. Conventional predictive model systems provide predictions with accuracies of a few feet (such as 4-5 feet). A more detailed description of how a predictive model may improve the performance of the multi-sensor imaging systems described herein is provided below.
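One minimal, purely illustrative sketch of such a predictive model is the following ordinary least-squares fit over a few candidate features. The feature names, training values, and the choice of a linear model are assumptions made only for illustration; this disclosure does not prescribe a specific model form.

```python
import numpy as np

# Illustrative features: [storage duration (days), tank age (years),
# fluid density (kg/m^3), mean ambient temperature (deg C)].
X_train = np.array([
    [120.0, 10.0, 870.0, 25.0],
    [400.0, 25.0, 900.0, 18.0],
    [60.0,   5.0, 850.0, 30.0],
    [300.0, 15.0, 890.0, 22.0],
    [500.0, 30.0, 910.0, 15.0],
])
y_train = np.array([0.30, 1.10, 0.15, 0.80, 1.40])  # illustrative sludge levels (m)

# Fit a linear model with an intercept term via ordinary least squares.
A = np.hstack([X_train, np.ones((X_train.shape[0], 1))])
coeffs, *_ = np.linalg.lstsq(A, y_train, rcond=None)

def predict_sludge_level(features):
    """Predict an approximate sludge level (m) for one feature vector."""
    return float(np.append(features, 1.0) @ coeffs)

estimate = predict_sludge_level([200.0, 12.0, 880.0, 24.0])
```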
As described above, one disclosed embodiment predicts the approximate level of the sludge 120 using the predictive model. Using the railing controller 706, the camera position can be adjusted to point at those regions of the tank indicated by the predictive model as containing the sludge level, so as to cover those areas within the camera's FOV. Image pre-processing, such as de-noising (e.g., to remove speckle noise) and image enhancement (e.g., to counter poor contrast), may be performed on the captured image. The preprocessed image can then be subjected to an image segmentation algorithm for estimating the sludge level. The image segmentation can exploit the gray-level difference that results in the captured image from the temperature differences between crude oil and sludge in the tank 102.
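The following minimal sketch illustrates one way such pre-processing and segmentation could be performed on a single IR frame, using generic denoising, contrast enhancement, and Otsu thresholding. The specific operations, and the assumption that the oil/sludge boundary appears as the sharpest horizontal change in the thresholded image, are illustrative choices rather than requirements of this disclosure.

```python
import cv2
import numpy as np

def estimate_sludge_row(ir_frame_gray):
    """Return the image row of the estimated oil/sludge boundary in a
    single-channel (8-bit grayscale) IR frame."""
    # De-noise and enhance contrast before segmentation.
    denoised = cv2.fastNlMeansDenoising(ir_frame_gray, None, h=10)
    enhanced = cv2.equalizeHist(denoised)
    # Otsu thresholding separates the two dominant gray-level populations
    # (e.g., warmer liquid versus cooler sludge).
    _, mask = cv2.threshold(enhanced, 0, 255,
                            cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # Take the boundary as the row where the column-averaged mask value
    # changes most sharply.
    column_mean = mask.mean(axis=1)
    return int(np.argmax(np.abs(np.diff(column_mean))))
```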
Using the following expression, the image-based measurement can be translated to an absolute value for sludge level:
Where Hc is the height of the camera, and α0 and α1 are the tilt angles of the topmost and bottommost points relative to the camera's optical axis, computed using the following equation:
where v = vbot or vtop, and (U0, V0) is the image center.
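Since the expressions themselves are given in the figures, the following is only a plausible sketch of the underlying pinhole-camera geometry. It assumes the tilt angle of an image row v is arctan((v − V0)/f) for a focal length of f pixels, and that the level of a point on the tank wall at horizontal distance D from a level (non-pitched) camera is Hc − D·tan(α); these relations, the symbols D and f, and the numeric values are assumptions introduced here for illustration.

```python
import numpy as np

def tilt_angle_rad(v, v0, focal_px):
    """Tilt of image row v relative to the optical axis (pinhole model),
    positive for rows below the image center."""
    return np.arctan2(v - v0, focal_px)

def level_from_row(v, v0, focal_px, camera_height_m, distance_m):
    """Height above the tank floor of the point on the tank wall imaged at
    row v, for a level camera mounted camera_height_m above the floor at
    horizontal distance distance_m from the wall."""
    return camera_height_m - distance_m * np.tan(tilt_angle_rad(v, v0, focal_px))

# Example: boundary detected at row 300 of a 640x480 image (center row 240),
# assumed focal length 1100 px, camera 10 m high and 70 m from the tank wall.
sludge_level_m = level_from_row(300, 240, 1100.0, 10.0, 70.0)
```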
The IR camera configuration shown in
A summary of the various characteristics of the different IR camera configuration choices for image-based tank farm monitoring is provided in Table 3 below.
As explained above, IR camera configurations can provide a 360 degree panoramic view of the sludge profile 120. For a given tank 102, such as shown in
While various disclosed embodiments have been described above, it should be understood that they have been presented by way of example only, and not limitation. Numerous changes to the subject matter disclosed herein can be made in accordance with this Disclosure without departing from the spirit or scope of this Disclosure. In addition, while a particular feature may have been disclosed with respect to only one of several implementations, such feature may be combined with one or more other features of the other implementations as may be desired and advantageous for any given or particular application.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting. As used herein, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. Furthermore, to the extent that the terms “including,” “includes,” “having,” “has,” “with,” or variants thereof are used in either the detailed description and/or the claims, such terms are intended to be inclusive in a manner similar to the term “comprising.”
As will be appreciated by one skilled in the art, the subject matter disclosed herein may be embodied as a system, method or computer program product. Accordingly, this Disclosure can take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module” or “system.” Furthermore, this Disclosure may take the form of a computer program product embodied in any tangible medium of expression having computer usable program code embodied in the medium.
Any combination of one or more computer usable or computer readable medium(s) may be utilized. The computer-usable or computer-readable medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium include the following non-transitory media: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a portable compact disc read-only memory (CDROM), an optical storage device, or a magnetic storage device.
Computer program code for carrying out operations of the disclosure may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the “C” programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
The Disclosure is described below with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
These computer program instructions may also be stored in a physical computer-readable storage medium that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable medium produce an article of manufacture including instruction means which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.