This application claims priority to GB Application No. 2201020.1, filed on Jan. 27, 2022, which is incorporated herein by reference in its entirety.
The invention relates generally to processing industrial ultrasound data, in particular, large streams of raw channel data collected from high-density phased arrays.
Ultrasound is commonly used to inspect parts and tubulars, such as pipelines and wells, for defects. The ultrasound waves reflect off surface features and penetrate materials to reflect off cracks and voids, then return to the ultrasound transducer. Devices for in-line inspection (ILI) of pipelines and Non-Destructive Testing (NDT) of parts electronically sample the ultrasound reflections and convert them to digital data. These radio frequency data are termed ‘RF data’ as they are sampled in the radio frequency band. The data may be stored or streamed to an operator.
Downhole devices typically beamform on the device in real-time and store the beamformed image in local memory.
While increasing the transducer count and sampling frequency to improve image resolution is desirable, the devices eventually struggle to store, send or process the additional data. Thus, the transducer count and/or the sampling rate is kept within a manageable range.
Phased array transducers are particularly effective for improving the image quality. These arrays typically include hundreds of transducer elements that acquire huge amounts of data, especially when running continuously over the length of a pipeline, and this data must be beamformed or otherwise processed before it can be viewed. In future tools, it is predicted that the data may accumulate at 50 TB per kilometer of pipe. This is a formidable amount to store on the tool or transfer off it.
In portable Ultrasonic Testing (UT), the issue is less about data storage and more about real-time processing and transfer of data to the operator's computing device. Future scanners could generate over 10 Gbit/sec of data, exceeding the capacity of common network technologies and data transfer solutions.
One general aspect includes a method of processing ultrasound data that may include the steps of: sampling ultrasound reflections at high frequency to create raw sampled data; converting the raw radio frequency (RF) data into in-phase and quadrature (I/Q) data; video compressing the I/Q data to create compressed I/Q data; and then outputting the compressed I/Q data. Other embodiments of this aspect include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.
Implementations may include one or more of the following features. The method may include imaging a tubular using an ultrasonic imaging tool and storing the compressed I/Q data on a local memory of the tool. The method may include capturing ultrasound images of a mechanical or manufactured part using a handheld imaging tool and outputting the compressed I/Q data in real-time to a separate computing device. The method may include decompressing the compressed I/Q data into decompressed I/Q data and beamforming the decompressed I/Q data to create ultrasound images. I/Q conversion may include converting the radio frequency raw sampled data to in-phase/quadrature data through quadrature sampling or demodulation, then downsampling and decimating the I/Q data to create downsampled I/Q data having a lower data rate than the radio frequency raw sampled data. The method may include combining multiple frames of I/Q data for the step of video compressing, preferably combined to a file size matched to a memory of a compression chip that performs the video compression. The combining may include grouping I and Q terms together in the same tile, thus improving the video compression ratio. The video compressing may include separately compressing I and Q terms in the converted I/Q data. The video compressing may use the H.264/H.265/H.266 standard or newer, H.265 being High Efficiency Video Coding (HEVC), also known as MPEG-H. The transducer, processing circuits and memory may be parts of a handheld non-destructive testing tool. The transducer, processing circuits and memory may be included in each of plural imaging modules of an inline inspection tool, preferably where the plural imaging modules each transfer their respective compressed I/Q data to a central network switch for uploading to external computing devices.
One general aspect includes an industrial inspection device having an ultrasonic phased-array transducer; one or more processing chips arranged to: sample ultrasonic reflections from the transducer at high frequency to create radio-frequency raw sampled data, convert the raw sampled data into I/Q data, and video compress the I/Q data to create compressed I/Q data. The device also includes a memory to store the compressed I/Q data. Other embodiments of this aspect include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.
Implementations may include one or more of the following features. The device where the one or more processing chips include a dedicated video compression unit (VCU), preferably arranged for high efficiency video coding through the MPEG-H video compression standard. The one or more processing chips may include an ultrasonic front-end chip for driving the transducer, sampling received signals at the transducer, and converting the signals to the raw sampled data in digital format. The device may include a computing device communicatively connectable to a transfer network of the device and arranged to receive and decompress the compressed I/Q data from the memory and then beamform the decompressed data to create ultrasound images. The device may include a transfer network for outputting the compressed I/Q data from memory. The transfer network may use gigabit or faster Ethernet. The transfer network may be a wireless network arranged for real-time streaming of the compressed I/Q data.
Thus, preferred embodiments of the invention enable the device to image conduits, such as pipes and wells, over long distances, providing the image data for real-time monitoring or subsequent visualization.
Various objects, features and advantages of the invention will be apparent from the following description of embodiments of the invention, as illustrated in the accompanying drawings. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating the principles of various embodiments of the invention.
With reference to the accompanying figures, devices and methods are disclosed for capturing, processing, and storing ultrasonic reflections from an ultrasonic transducer array. In accordance with the embodiment of
As a persistent example used hereinbelow, the imaging device may be used to image long pipelines (e.g. >100 km) and comprise plural (e.g. 10) ultrasonic phased arrays, each having hundreds (e.g. 128) transducer elements. Acoustic reflections may be sampled at 40 MHz for several hours. The total data may thus exceed 5000 TB (terabytes).
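As a rough, non-limiting sanity check of that figure, the raw acquisition volume can be estimated from the array count, element count and sample rate; the 16-bit sample depth and 14-hour continuous run assumed below are purely illustrative.

```python
# Order-of-magnitude estimate of the raw channel data volume quoted above.
# The 16-bit sample depth and 14-hour continuous run are assumptions.
arrays = 10                # phased arrays on the tool
elements = 128             # transducer elements per array
sample_rate_hz = 40e6      # 40 MHz sampling
bytes_per_sample = 2       # assumed 16-bit samples
hours = 14                 # assumed continuous acquisition time

bytes_per_second = arrays * elements * sample_rate_hz * bytes_per_sample
total_tb = bytes_per_second * hours * 3600 / 1e12
print(f"{bytes_per_second / 1e9:.1f} GB/s, {total_tb:.0f} TB")  # ~102.4 GB/s, ~5161 TB
```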
The device may comprise hardware, exemplified by
To make the ultrasonic channel data more manageable, it is compressed. This compression can happen in multiple steps.
The downsampled, compressed I/Q data are then stored in device memory or transferred to a remote computer for real-time inspection or offline analysis. On this computer 5, 19, separate from the imaging device, the compressed data is decompressed and disassembled back into the original I/Q data, still downsampled, for individual image captures. The processor then beamforms the I/Q data directly, i.e. without up-sampling or modulation back to RF data. This is different from most existing systems, where no compression is applied to raw channel data and beamforming is done on the original RF data to create an ultrasound image.
This chip is capable of capturing the channels at some settable sample frequency (e.g., 20-40 MHz), converting this RF data to I/Q data, then downsampling or decimating it. The combined output from these chips may provide 128 channels of downsampled I/Q data, which may be configured to 10 bits per channel. This data may be sent to FPGA 20 for further arrangement of data and to facilitate communication with the compression chip 30.
The FPGA may receive the downsampled I/Q data for plural frames and place them into the memory buffer 50 to assemble a tile of data. The FPGA may be programmed to re-arrange the order of the streamed data, change bit rates or data formats to create frames of I/Q data in the correct format. Each frame N represents the data captured for a full transmit event using the phased array, which frame may be for different or repeated steered angles M. As shown in
The image processor 30 reads the I/Q frame and compresses it using a standard such as H.264, H.265 (HEVC), or H.266. The compressed image is stored to device memory 35. The image processor may be a VCU with dedicated compression hardware and may be part of a larger System on Module
For efficiency, the dimensions of the I/Q tile match the video compression unit's native image dimensions, which allows I/Q data from plural ultrasound images to be combined into one super I/Q tile, shown as the assembled I and Q buffer 50. For example, the VCU may be designed to compress 4K video, which is normally 4096×2160 pixels at up to 10-bit depth. The VCU's memory and processing are optimized for data blocks of those dimensions. An exemplary 256 channel transducer, downsampled to 480 points and split into I and Q terms, could thus be tiled 8 down (8×256 pixels<2160) by 8 across (8×480<4096) to fit into the 4K memory. In this manner 32 ultrasound acquisitions can be compressed in a single operation.
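The tiling arithmetic can be checked with a short calculation, sketched below; treating I and Q as separate tiles is only one possible layout (interleaving them within a tile, as noted above, is another).

```python
# Sketch of the tiling arithmetic for packing I/Q tiles into one 4K frame.
frame_w, frame_h = 4096, 2160       # native 4K frame handled by the VCU
channels, samples = 256, 480        # one tile: channels x downsampled points

tiles_across = frame_w // samples   # 8 tiles fit horizontally
tiles_down = frame_h // channels    # 8 tiles fit vertically
tiles_total = tiles_across * tiles_down   # 64 tiles per 4K frame
acquisitions = tiles_total // 2           # I and Q each occupy one tile
print(tiles_across, tiles_down, acquisitions)   # 8 8 32
```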
Quadrature sampling and demodulation are known techniques for representing RF signals as in-phase and quadrature carriers. This technique is ideal for narrowband signals, such as the ultrasonic reflections, which may be sampled at 40 MHz and centered on 5 MHz for a 5 MHz centered array, with 80% of the signal bandwidth between 3 MHz and 7 MHz. The I/Q terms sample a specific part of the spectrum. This does initially double the data size as each channel is now represented by both an I and a Q term.
In quadrature sampling, the RF signals can be converted into I/Q signals by sampling with a quarter-wavelength (90°) shift between the I and Q terms. If done in the analogue domain, this results directly in digital I/Q signals without the need to sample the raw RF signals themselves. A decimation step to reduce the number of samples can be performed at the same time, during the sampling that yields the I/Q terms.
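One common realization, assumed here purely for illustration, samples at four times the center frequency so that successive samples land 90° apart on the carrier; de-interleaving with alternating signs then yields I and Q directly and decimates by four at the same time.

```python
import numpy as np

# fs/4 quadrature sampling sketch: with the sample rate at four times the
# carrier, samples of A*cos(2*pi*fc*t + phi) follow the pattern +I, -Q, -I, +Q.
fc = 5e6                                   # array center frequency (5 MHz)
fs = 4 * fc                                # sample at 4x the carrier
t = np.arange(0, 20e-6, 1 / fs)
rf = np.cos(2 * np.pi * fc * t + 0.3)      # toy narrowband echo, phase 0.3 rad

n = (len(rf) // 4) * 4
x = rf[:n].reshape(-1, 4)                  # groups of four consecutive samples
i_term = (x[:, 0] - x[:, 2]) / 2           # +I and -I samples averaged
q_term = (x[:, 3] - x[:, 1]) / 2           # +Q and -Q samples averaged
print(i_term[0], q_term[0])                # ~cos(0.3), ~sin(0.3)
```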
In demodulation shown in
The downmixed I/Q signals are then decimated 5×, 10×, 20×, or 40× to reduce the total samples per channel.
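A software equivalent of this downmix-and-decimate step, sketched below with NumPy/SciPy on a synthetic echo, is given purely as an illustration; the filter built into scipy's decimate stands in for the low-pass stage, and the 10× factor is one of the options listed above.

```python
import numpy as np
from scipy.signal import decimate

fs, fc = 40e6, 5e6                         # 40 MHz sampling, 5 MHz center frequency
t = np.arange(0, 50e-6, 1 / fs)
echo = np.cos(2 * np.pi * fc * t) * np.exp(-((t - 25e-6) ** 2) / 2e-11)  # toy RF echo

# Mix down to baseband with a complex exponential at -fc; the anti-alias
# low-pass filter is applied inside decimate(), which also drops samples 10x.
baseband = 2 * echo * np.exp(-2j * np.pi * fc * t)   # factor 2 restores amplitude
i_term = decimate(baseband.real, 10)
q_term = decimate(baseband.imag, 10)
```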
The FPGA may assemble these downmixed, decimated I/Q signals into buffer 50, preferably with interleaved I and Q terms to improve video compression. Preferably this data is also encoded into smooth positive integer ranges. The MIPI transfers this buffer frame comprising multiple channels to the image processor. The image processor has a VCU for real-time video compression, using a codec such as MJPEG or High Efficiency Video Coding (HEVC), also known as H.265 and MPEG-H Part 2.
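A minimal sketch of such packing is given below, assuming a 10-bit offset encoding; the actual ordering and scaling used by the FPGA firmware may differ.

```python
import numpy as np

def pack_iq_for_vcu(i_term, q_term, bits=10):
    """Interleave I and Q and offset-encode them as unsigned integers.
    Illustrative only: real firmware chooses its own ordering and scaling."""
    channels, samples = i_term.shape
    # Interleave so the I and Q samples of a channel sit side by side,
    # keeping locally correlated values adjacent for the video encoder.
    tile = np.empty((channels, 2 * samples))
    tile[:, 0::2] = i_term
    tile[:, 1::2] = q_term
    # Map the signed, zero-centered values into a smooth positive integer
    # range, e.g. 0..1023 for a 10-bit encoder input.
    full_scale = max(np.max(np.abs(tile)), 1e-12)
    half_range = 2 ** (bits - 1)
    encoded = np.round(tile / full_scale * (half_range - 1) + half_range)
    return encoded.astype(np.uint16)
```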
HEVC compression on the assembled frame is able to compress the data by 10 to 100 times depending upon the sparsity and noisiness of the input data. Combining the 10× video compression with 40× decimation on the I/Q data (where the I and Q signals together are double the size of the RF data alone) leads to a 200× overall (10 × 40 ÷ 2) compression factor. The compressed data can then be stored on a memory of the tool and transferred to a remote computing device. When a user is ready to view the ultrasound image, the compressed data is decompressed, but left in I/Q format.
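The overall factor for other settings follows the same simple arithmetic; the values below are indicative only.

```python
# Overall compression = video ratio x decimation / 2, the division by two
# accounting for I and Q doubling the per-channel data.
for decimation in (5, 10, 20, 40):
    for video_ratio in (10, 100):
        print(decimation, video_ratio, video_ratio * decimation / 2)
# e.g. 40x decimation with 10x video compression gives 200x overall
```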
In preferred embodiments, the phased array emits a single defocused wave toward the inspection target. The wavefront may be a plane wave or curved wave insonifying a large area of the target at once. This may use substantially all of the array elements for transmission and receiving for each image capture. Compared to a focused scan line, which uses only a subset of the array elements, a defocused wave returns reflections over a wide volume. Parallel beamforming deconvolves the reflections in the data to create an image: for each pixel, it samples each channel at a time corresponding to the time-of-flight to that location and combines the samples to calculate the image value for the corresponding pixel.
Normally beamforming is performed on the original RF data or restored RF data using standard delay and sum algorithms. In preferred embodiments of the present system, beamforming is performed by the separate computing device (5, 19), on the decompressed I/Q data directly using a delay-and-phase beamformer coupled to a phase rotation algorithm. The beamforming is performed on the I/Q data without upsampling or modulating back to RF data.
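A minimal sketch of such a delay-and-phase beamformer operating directly on the decompressed baseband I/Q data is given below; the plane-wave geometry, nearest-sample interpolation and uniform apodization are simplifications, and all names are illustrative.

```python
import numpy as np

def beamform_iq(iq, elem_x, pixels_x, pixels_z, fs_iq, fc, c=1480.0):
    """Delay-and-phase beamforming directly on decimated baseband I/Q traces.

    iq       : complex array, shape (elements, samples)
    elem_x   : element x-positions in meters (linear array at z = 0 assumed)
    pixels_x, pixels_z : image grid coordinates in meters
    fs_iq    : sample rate of the decimated I/Q data
    fc       : transducer center frequency used for the phase rotation
    c        : assumed speed of sound in the coupling medium
    """
    n_elem, n_samp = iq.shape
    image = np.zeros((len(pixels_z), len(pixels_x)), dtype=complex)
    for iz, z in enumerate(pixels_z):
        for ix, x in enumerate(pixels_x):
            # Two-way time of flight: plane-wave transmit travelling in +z,
            # plus the return path from the pixel to each element.
            tau = z / c + np.sqrt((elem_x - x) ** 2 + z ** 2) / c
            idx = np.clip(np.round(tau * fs_iq).astype(int), 0, n_samp - 1)
            samples = iq[np.arange(n_elem), idx]
            # The phase rotation restores the carrier phase lost in the
            # baseband representation, replacing the fine delays that a
            # full-rate RF delay-and-sum beamformer would apply.
            image[iz, ix] = np.sum(samples * np.exp(2j * np.pi * fc * tau))
    return np.abs(image)
```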
To clarify, ‘decompression’ is used herein to describe the process of uncompressing compressed data into a second uncompressed data file, being distinct from uncompressed data that was never compressed.
In one application of the present method, ultrasonic imaging devices are present in an in-line inspection tool, commonly referred to as a PIG. Such a tool may have multiple imaging devices, each comprising ultrasonic sensors, ultrasonic driver 14, FPGA 20, image processor 30, and local memory 35. During in-line inspection, captured data is video compressed and stored in real-time to local memory. When a pipeline job is complete, there will be several terabytes of compressed data to transfer out to a computer external to the tool.
Similar ultrasonic imaging devices and methods may be used in a downhole imaging tool, typically used to inspect oil and gas wells (cased or uncased). Wells are tubular in shape and typically a few kilometers long, instead of the hundreds of kilometers seen in pipelines, but there is still a vast amount of data generated that needs to be stored and uploaded.
In one application the present method and processing circuit may be used in a Non-Destructive Test (NDT) device, as shown in
The proposed compression method and circuitry is thus used to compress RF data from the NDT transducer array and transfer it to an external computer using the wired or wireless network. The external computer then decompresses the demodulated, downsampled I/Q video stream and beamforms the I/Q terms to create an ultrasound image or video, which is then displayed on a screen. The external computer may be a mobile computing device running applications to beamform and render the ultrasound images.
The devices and system may comprise one or more computers configured to perform particular operations or actions by virtue of having software, firmware, hardware, or a combination of them installed on the system that in operation causes or cause the system to perform the actions. One or more computer programs can be configured to perform particular operations or actions by virtue of including instructions that, when executed by data processing apparatus, cause the apparatus to perform the actions.
Without loss of generality, each of the processing components may comprise multiples of such chips, e.g., the memory may be multiple memory chips. For the sake of computing efficiency, several of the functions and operations described separately above may actually be combined and integrated within a chip. Conversely, certain functions described above may be provided by multiple chips, operating in parallel.
The term ‘processor’ is intended to include computer processors, cloud processors, microcontrollers, firmware, GPUs, FPGAs, and electrical circuits that manipulate analogue or digital signals. While it can be convenient to process data as described herein, using software on a general computer, many of the steps could be implemented with purpose-built circuits. In preferred embodiments of the present system, the device processing circuits (14, 20, 30) provide signal conditioning, data compression and data storage, while the remote processor 19 provides data decompression and image processing.
It will be appreciated that the various memories discussed may be implemented as one or more memory units. Non-volatile memory is used to store the compressed data and instructions so that the device can function without continuous power. Volatile memory (RAM and cache) may be used to temporarily hold raw data and intermediate computations.
Although the present invention has been described and illustrated with respect to preferred embodiments and preferred uses thereof, it is not to be so limited since modifications and changes can be made therein which are within the full, intended scope as understood by those skilled in the art.