In the related application mentioned above, processes are described that assist with the identification of potential hydrocarbon deposits. These processes include performing a structural interpretation of a three-dimensional seismic volume, transforming the three-dimensional seismic volume into a stratal-slice volume, performing a stratigraphic interpretation of the stratal-slice volume, which includes extracting bounding surfaces and faults, and transforming the stratal-slice volume into the spatial domain. As illustrated, an exemplary seismic volume before domain transformation is presented in
This workflow, an automated or semi-automated method and system for identifying and interpreting depositional environments, depositional systems and elements of depositional systems from 3-D seismic volumes, benefits from data pre-processing.
It is an aspect of the present invention to provide systems, methods and techniques for data processing.
It is another aspect of this invention to provide systems, methods and techniques for seismic data pre-processing.
It is a further aspect of this invention to provide systems, methods and techniques for 3-D seismic data pre-processing.
A further aspect of this invention is directed toward determining a voxel connectivity score.
A still further aspect of this invention relates to reducing data “clutter” based on the voxel connectivity score.
Still other exemplary aspects of the invention relate to reducing a seismic response of a reflector to a lobe.
Still other exemplary aspects of the invention relate to reducing a seismic response of a reflector to a main lobe.
Another exemplary aspect of the invention is directed toward removing extraneous reflections in seismic data.
Further exemplary aspects of the invention relate to highlighting and enhancing lithologic boundaries to assist with interpretation of seismic data.
Additional aspects of the invention relate to scoring and utilizing confidence in a 3-D data volume.
Still further aspects of the invention relate to using local data redundancy to generate and output a stable estimate of confidence in a data set.
Additional aspects of the invention relate to one or more means for performing the steps enumerated in claims 1-28.
Additional aspects of the invention relate to a data volume stored on a computer-readable storage media including data representing geologic information formed in accordance with any one or more of the steps in claims 1-28.
Additional aspects of the invention relate to a computer-readable storage media having stored thereon instructions that when executed by a processor performs the steps of any one of claims 1-28.
These and other features and advantages of this invention are described in, or are apparent from, the following detailed description of the exemplary embodiments.
The exemplary embodiments of the invention will be described in detail, with reference to the following figures. It should be understood that the drawings are not necessarily shown to scale. In certain instances, details which are not necessary for an understanding of the invention or which render other details difficult to perceive may have been omitted. It should be understood, of course, that the invention is not necessarily limited to the particular embodiments illustrated herein.
a-f) illustrate an exemplary application of voxel connectivity with progressively higher connectivity score thresholds to seismic data: a—input “sparse” seismic section; b-f—input seismic data filtered by voxel connectivity with connectivity scores progressively increasing from 100 in b to 20,000 in f;
a-c) illustrate an exemplary application of voxel connectivity with progressively higher connectivity score thresholds to seismic data: a—input “sparse” seismic section; b-c—input seismic data filtered by voxel connectivity with progressively increasing connectivity scores;
a-c) illustrate an exemplary application of voxel connectivity with progressively higher connectivity score thresholds to a different vertical seismic section from the same seismic data volume used in
a-d) illustrate an exemplary seismic section processed using reflection collapser: a—input data; b—reflection collapser processed section; c—reflection collapser processed section for peaks only; d—reflection collapser processed section for troughs only;
a-d) illustrate an exemplary reflection collapser applied to a sparse input data set: a—input data; b—reflection collapser processed section; c—reflection collapser processed section for peaks only; d—reflection collapser processed section for troughs only;
a-b) illustrate an example of voxel suppression applied to an example seismic volume: a—a section from the input seismic volume; b—the same section from the volume filtered with voxel suppression;
a-b) illustrate an example of voxel suppression applied to the same example seismic volume as in
a-b) illustrate an example of voxel suppression applied to a second example seismic volume: a—a section from the input seismic volume; b—the same section from the volume filtered with voxel suppression;
a-c) illustrate an example of the numerical results of voxel density calculations on a 10×10 data array: a—the input two-dimensional array of data; b—the output two-dimensional array of data processed using a 3×3 voxel density operator accepting all input values greater than or equal to 6; c—the results of further constraining the output density score to be greater than or equal to 4;
a-b) illustrate an example of the numerical results of voxel density calculations on a 10×10 data array from
a-d) illustrate an example of a graphical representation of the results described in
a-f) illustrate an exemplary comparison of some standard data smoothing operators with density-guided smoothing: a, b—the results of applying 3×3 mean and median filters, respectively, to the raw data in
a-d) illustrate various exemplary described filters applied to a horizontal slice through a continuity or coherence volume showing a canyon: a—input data; b—removing voxels that have a density score lower than a specified cutoff; c—the results of applying confidence-adaptive smoothing where voxels that failed the minimum density test and were outside the valid threshold range were included in smoothing; d—the result of applying contrast enhancement to the data in a;
a-b) illustrate exemplary numerical and graphical results of applying contrast enhancement to the sample data array from
a-e) illustrate an exemplary effect of locally adaptive voxel density-controlled smoothing and contrast enhancement on a time slice from a Gulf of Mexico data set: a—the raw seismic data with a 3×3 median filter applied to reduce random noise; b—the result of calculating coherence on the data in a; c—the result of calculating the variance of the data in b; d,e—the result of applying locally adaptive contrast enhancement (d) and smoothing (e), controlled by the variance distribution in c, to the data in b;
a-d) illustrate an exemplary effect of locally adaptive voxel density-controlled smoothing and contrast enhancement on a deeper time slice from the Gulf of Mexico data set used in
a-f) illustrate exemplary effects of applying contrast enhancement to coherence data on the output of a fault enhance calculation: a—a coherence time slice showing a portion of a salt body, with surrounding faults; b,c—the result of applying two levels of contrast enhancement to the data in a; d—the fault enhanced output using the raw coherence data (a) as input; e—the fault enhanced output using the contrast enhanced data from (b) as input; f—the fault enhanced output using the contrast enhanced data (c) as input;
a-d) illustrate an example of voxel density applied to coherence. Panel (a) contains a coherence image of a submarine canyon. Panel (b) shows the result of applying binary voxel density filtering to the data in Panel (a). Voxels that fail the minimum density threshold test are assigned a null value. Panel (c) shows the result of voxel density-controlled smoothing. Voxel density scores are used to alter the data contrast in Panel (d). Voxel density-controlled smoothing and contrast enhancement preserve the original context of the data, rather than simply removing voxels that fail the density threshold test;
a-b) illustrate an example of a voxel suppression result. Panel (a) contains a raw amplitude section. The top and bottom of the horizontal tabular salt body are indicated by the arrows in panel (a). Panel (b) shows the result of applying voxel suppression to the data in panel (a);
a-d) illustrate an example of the reflection collapser applied to sparse seismic data;
a-c) illustrate exemplary results of applying voxel connectivity; and
a-d) illustrate an example of the workflow applied to real data.
The exemplary embodiments of this invention will be described in relation to processing and interpretation of data, and in particular seismic data. However, it should be appreciated that, in general, the systems and methods of this invention will work equally well for any type of data representing any environment, object or article.
The exemplary systems and methods of this invention will also be described in relation to seismic data interpretation and manipulation. However, to avoid unnecessarily obscuring the present invention, the following description omits well-known structures and devices that may be shown in block diagram form or otherwise summarized.
For purposes of explanation, numerous details are set forth in order to provide a thorough understanding of the present invention. However, it should be appreciated that the present invention may be practiced in a variety of ways beyond the specific details set forth herein.
Furthermore, while the exemplary embodiments illustrated herein show the various components of the system collocated, it is to be appreciated that the various components of the system can be located at distant portions of a distributed network, such as a communications network and/or the Internet, or within a dedicated secure, unsecured and/or encrypted system. Thus, it should be appreciated that the components of the system can be combined into one or more devices or collocated on a particular node of a distributed network, such as a communications network. As will be appreciated from the following description, and for reasons of computational efficiency, the components of the system can be arranged at any location within a distributed network without affecting the operation of the system.
Furthermore, it should be appreciated that various links can be used to connect the elements and can be wired or wireless links, or any combination thereof, or any other known or later developed element(s) that is capable of supplying and/or communicating data to and from the connected elements. The term module as used herein can refer to any known or later developed hardware, software, firmware, or combination thereof that is capable of performing the functionality associated with that element. The terms determine, calculate and compute, and variations thereof, as used herein are used interchangeably and include any type of methodology, process, mathematical operation or technique, including those performed by a system, such as a processor, an expert system or neural network.
Additionally, all references identified herein are incorporated herein by reference in their entirety.
The voxel connectivity module 110 assists with the mapping of connected voxels. Seismic data volumes can be rendered sparse by data processing steps designed to remove insignificant data points. Similarly, some seismic attributes result in large areas of null or undefined data. In both these cases, the null and undefined areas are commonly speckled with insignificant data that ‘leak’ through the processing step used to create the volume. This visual clutter can complicate the use of such volumes for segmentation or user or computer-interpretation of important features present in the volume. The removal of some or all of this visual clutter is one exemplary goal for increasing the utility of these data volumes.
An exemplary embodiment of the operation of the voxel connectivity mapping module 110 determines which voxels are constituent members of connected features in the data volume. The ‘connectivity score’ (how many voxels make up the feature) can then be used to remove what are identified as small, and thus insignificant, features through instituting a minimum feature size threshold for the output volume.
An exemplary embodiment of voxel connectivity maps out all connected non-null voxels in a volume. After mapping connected voxels, the connectivity score of each connected feature in the volume is defined as its number of constituent voxels. Visual clutter can then be filtered by removing features from the output data volume that have a connectivity score lower than some minimum threshold. In this manner, small features are removed from the data volume which can then be output and saved.
If voxel connectivity is applied to a sparse amplitude volume, amplitude polarity can be used as an additional optional constraint for connectivity mapping. For example, if mapping out a positive amplitude reflection, only positive amplitudes are considered to be non-null.
The output volume from the voxel connectivity module 110 allows the user to remove unwanted visual clutter from a sparse data set. Connected bodies are scored based on the number of constituent voxels. Features with a connectivity score lower than a user-specified threshold are then removed from the output data volume. This technique can be a powerful tool for preparing data sets for seismic interpretation such as segmentation.
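By way of a non-limiting illustration, the connectivity mapping and score-threshold filtering described above can be sketched in Python. The use of SciPy's connected-component labelling, the 26-connectivity, the treatment of finite nonzero samples as non-null, and all names are assumptions made for this sketch, not details taken from the module itself:

```python
import numpy as np
from scipy import ndimage

def filter_by_connectivity(volume, min_score):
    """Null out connected features smaller than min_score voxels (a sketch)."""
    # Assumption: finite, nonzero samples are the non-null voxels.
    non_null = np.isfinite(volume) & (volume != 0)

    # Map all connected non-null voxels into labelled bodies
    # (assumed 26-connectivity).
    structure = np.ones((3, 3, 3), dtype=bool)
    labels, n_features = ndimage.label(non_null, structure=structure)

    # Connectivity score of each body = its number of constituent voxels.
    ids = np.arange(1, n_features + 1)
    scores = ndimage.sum(non_null, labels, index=ids)

    # Remove bodies whose connectivity score is below the threshold.
    out = volume.astype(float)
    out[np.isin(labels, ids[scores < min_score])] = np.nan
    return out
```

For the optional polarity constraint, the `non_null` mask would instead be restricted to one amplitude polarity, for example `volume > 0` when mapping a positive amplitude reflection.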
Seismic reflections consist of a main amplitude response and several minor flanking amplitude responses. These additional responses, other than the main response, complicate the use of amplitude volumes for computer segmentation of significant reflectors. In most processed seismic data volumes, a zero-phase wavelet as illustrated in
An exemplary embodiment of the operation of the reflection collapser module 120 reduces the seismic response of a given reflector to a main lobe. This removes ‘clutter’ that may be unnecessary for the interpretation of high amplitude reflections in the seismic volume. Computer interpretation processes and algorithms are also aided by removing extraneous reflections from a seismic data volume.
In its most basic exemplary form, the reflection collapser module 120, for example in cooperation with the controller 130, convolves a 1-D operator with the input volume. For each operator position, a test is run to see if the operator's center voxel has the highest absolute amplitude of all the voxels contained within the operator. If the highest absolute amplitude voxel is not in the center of the operator, nothing is done and the operator moves to the next voxel. However, if the highest absolute amplitude is at the center of the operator, the module writes that voxel value to the output volume in its original position. Two searches are then performed to determine the extent of this reflection lobe.
The first search is performed upward from the center voxel. The search extends upward until a zero-crossing is encountered. The search then ceases. The second search is performed in a similar manner in the downward direction. In this manner, the full extent of the main reflection lobe is written to the output volume by the module.
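A minimal single-trace sketch of this center-voxel test and the two zero-crossing searches follows; the function and parameter names are illustrative assumptions, and the operator is taken to be a vertical 1-D window of 2*half_width+1 samples:

```python
import numpy as np

def collapse_trace(trace, half_width):
    """Keep only main reflection lobes in one trace (a sketch)."""
    out = np.zeros_like(trace)
    n = len(trace)
    for i in range(half_width, n - half_width):
        window = trace[i - half_width : i + half_width + 1]
        # Test: does the operator's center voxel have the highest
        # absolute amplitude of all voxels within the operator?
        if np.argmax(np.abs(window)) != half_width:
            continue  # no -- move on to the next voxel
        # Search upward until a zero-crossing is encountered.
        top = i
        while top > 0 and trace[top - 1] * trace[i] > 0:
            top -= 1
        # Search downward in a similar manner.
        bottom = i
        while bottom < n - 1 and trace[bottom + 1] * trace[i] > 0:
            bottom += 1
        # Write the full extent of the main lobe to the output.
        out[top : bottom + 1] = trace[top : bottom + 1]
    return out
```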
Other steps are also performed in order to ensure stability of the module's performance. Local variations in amplitude, as well as random noise, can cause a reflection side lobe to locally have a greater absolute amplitude than the main reflection lobe. In order to prevent this from introducing discontinuities to the main reflection lobe, a pre-processing step can be performed to regularize the amplitude of all reflection lobes present.
In this exemplary pre-processing step, all member voxels of each reflection lobe are mapped using connected polarity analysis. Connected polarity analysis is similar to connected threshold analysis in that it determines which voxels are connected in a 3-D body. The difference lies in the fact that the polarity of the voxel is the only parameter used to determine connectivity, rather than the use of a threshold range. Once all member voxels of a reflection lobe are mapped, the mean amplitude of that lobe is calculated. This mean is the amplitude value used to determine which lobe in a reflection is the main reflection lobe. The main process described above is then used to remove the side lobes of the reflection.
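This pre-processing step might be sketched as follows, again only as an assumption-laden illustration: connected polarity analysis is approximated here by labelling connected regions of same-signed voxels, and each voxel is mapped to its lobe's mean amplitude so that the main-lobe test above can compare lobe means rather than raw values:

```python
import numpy as np
from scipy import ndimage

def lobe_mean_volume(volume):
    """Replace each voxel by the mean amplitude of its reflection lobe
    (a sketch of connected polarity analysis)."""
    out = volume.astype(float)
    structure = np.ones((3,) * volume.ndim, dtype=bool)
    for sign in (1, -1):
        # Polarity is the only parameter used to determine connectivity.
        mask = np.sign(volume) == sign
        labels, n = ndimage.label(mask, structure=structure)
        if n == 0:
            continue
        means = ndimage.mean(volume, labels, index=np.arange(1, n + 1))
        out[mask] = means[labels[mask] - 1]
    return out
```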
a contains the raw data for this section. Both high amplitude and low amplitude reflections are present.
a shows a sparse version of the data in
The exemplary reflection collapser module 120 removes one or more side lobes (either above and/or below a main lobe) from reflectors in a seismic data volume. This removes unnecessary clutter from a volume being used to interpret high-amplitude reflections. Human interpretation and computer segmentation of these high-amplitude reflections can benefit from these data processing techniques.
The expression of lithologic boundaries in a seismic data volume can be quite complicated. In the case of salt or diagenetic boundaries, they can cut through the data set in any imaginable orientation and configuration. The interpretation of such interfaces can be extremely time consuming when performed by hand. The automation of this type of interpretation is a very important research goal in seismic data interpretation. Voxel suppression is a first step towards highlighting and enhancing the lithologic boundaries to aid their human and computer-automated interpretation.
An exemplary embodiment of voxel suppression is a method to emphasize high amplitude events in a seismic volume. This is accomplished by the voxel suppression module 160 rendering a volume sparse, while maintaining locally high amplitude events in their original positions. Preserved voxel values can optionally be rescaled in order to boost the strength of weak events. This rescaling normalizes the expression of significant reflections throughout the volume.
The exemplary voxel suppression module 160 convolves a 3-dimensional operator with the input seismic data volume. For each operator position, all voxels within the operator are sorted by absolute value. A user-specified percent (entered via a user input device (not shown)) of the highest values are preserved in their original positions. This percent of preservation is typically small, less than 15% for all applications.
If the user prefers, these preserved values can be rescaled by the voxel suppression module 160 to regularize the expression of locally significant reflections throughout the volume. This is accomplished by calculating the standard deviation of all the voxels contained within the operator, and rescaling these values to make the local standard deviation match the standard deviation of the whole volume. To accomplish this, all preserved voxels are multiplied by a rescaling factor (RF). The RF is calculated as:
RF = Whole Volume Std. Dev. / Operator Std. Dev.
In some cases voxel values are boosted, while in other cases they may be lowered. The end result is that all features preserved within the volume have a similar appearance.
The data can also be rescaled by the voxel suppression module 160 to give emphasis to voxels in the center of the operator. A radial cosine taper can be used to give more emphasis to preserved voxels in the center of the operator, rather than at its edges. This cosine taper rescales voxels based on their distance from the center of the operator. The center voxel is rescaled by a factor of 1 (no change). The most distal voxels are rescaled by a factor of zero (zeroed out). In between, a sinusoidal taper can define the rescale factor for each individual voxel contained within the operator.
This exemplary combination of steps can enhance locally high-amplitude reflections, while removing extraneous surrounding voxels. The result is a visually cleaner volume that is more easily enhanced by other attributes for the purpose of automating interpretation.
a shows the raw data volume before the voxel suppression module 160 applies voxel suppression. Locally high-amplitude reflections are preserved by the voxel suppression technique (
a contains raw seismic data from another survey.
Thus, one exemplary operational embodiment of the operation of the voxel suppression module 160 is a running window operator that is convolved with the whole volume. For each operator position, a series of exemplary processing steps are performed. They are:
sort the voxels based on absolute value,
rescale all voxel values to make the local operator's standard deviation match the global standard deviation,
preserve the upper user-specified percent of rescaled values (zero out all other values),
scale the preserved values based on position within the operator using a cosine taper, and
output the tapered values in their original position.
In this manner, the locally significant amplitude events are preserved and given a regular expression, while insignificant reflections are removed. The resulting saved volume is sparse, including only the rescaled highest amplitude reflectors.
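The enumerated steps, applied at one operator position, might look as follows in Python. This is a simplified sketch: it processes a single block, treats the whole-volume standard deviation as a precomputed input, and assumes the radial cosine taper form described above; all names are illustrative:

```python
import numpy as np

def suppress_block(block, global_std, keep_percent):
    """Apply the voxel-suppression steps to one operator position (a sketch)."""
    # Rescale so the local standard deviation matches the global one (RF).
    local_std = block.std()
    rf = global_std / local_std if local_std > 0 else 0.0
    rescaled = block * rf
    # Preserve the upper keep_percent of absolute values (sorting is
    # implicit in the percentile); zero out all other values.
    cutoff = np.percentile(np.abs(rescaled), 100 - keep_percent)
    keep = np.abs(rescaled) >= cutoff
    # Radial cosine taper: factor 1 at the center, 0 at the most
    # distal voxel.
    idx = np.indices(block.shape)
    center = (np.array(block.shape) - 1) / 2.0
    r = np.sqrt(sum((ax - c) ** 2 for ax, c in zip(idx, center)))
    taper = np.cos(0.5 * np.pi * r / r.max())
    # Output the tapered, preserved values in their original positions.
    out = np.zeros_like(block, dtype=float)
    out[keep] = (rescaled * taper)[keep]
    return out
```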
The voxel suppression module 160 renders a volume sparse by removing all but the most significant reflections throughout the volume. The resulting volume emphasizes major acoustic impedance boundaries. These high impedance contrasts will be present at major lithology changes. As such, the application of voxel suppression can be a useful first step to highlight complex interfaces such as salt boundaries.
Attributes calculated from 3D seismic data volumes are commonly noisy and chaotic in their representation of geologic trends. The complex morphology and expression of geologic features can result in inconsistent performance of a given attribute for highlighting features of interest. Structural and diagenetic overprinting can also complicate attribute results.
The handling of noise and regularization of uneven attribute performance is potentially a very important research goal. An exemplary embodiment of a voxel density technique is a way to score the local significance of data trends within a 3-D seismic volume. Significant regions can then be enhanced or normalized, while insignificant regions can be suppressed or filtered out.
An exemplary operational embodiment of the voxel density module 170 includes a running window algorithm carried out by the running window module. For each operator position, the number of data points within the window that fall within a given threshold range is counted, yielding a density score. Areas of high density score are considered to have high confidence. Conversely, areas of low density score are assumed to be noise and are filtered out or deemphasized. Noise can be filtered by removing data points from regions of low density score. By smoothing high confidence regions less aggressively, significant edges can be preserved during smoothing. Volume contrast can also be enhanced in an attribute volume, boosting the signal-to-noise (S/N) ratio.
A variety of techniques can be used to control noise in a data volume. Mean and median filtering are methods of filtering that work well for random noise. Similarly, wavelet transforms are another powerful tool for the filtering of random noise. However, noise is not the only issue that plagues attribute results. Uneven performance is perhaps a greater impediment to the rapid utilization of attribute results.
The realities of geology rarely mirror the simplicity of conceptual models. Factors not accounted for by conceptual models commonly confuse an attribute designed to image a given geologic feature. Further complicating attribute performance is the variety of scales imaged by seismic surveys. Sub-seismic resolution features can introduce tuning effects into the data that are indistinguishable from noise by many attributes.
The exemplary voxel density technique uses local data redundancy to create a stable estimate of confidence in a data set. Features of interest in a data volume generally persist for some distance in each direction. The persistence of these features can be used to overcome their uneven expression in a given data volume. This is accomplished by the convolution of a 3-D operator with the data set. A measure of confidence is calculated by the voxel density module 170 for all voxels. This confidence score can then be used to guide filtering and enhancement operations.
The exemplary voxel density module 170 convolves a 3-D operator with the input data volume. For each position of the running window operator, the number of voxels that fall within a given threshold range is counted. The result of this counting operation is the density score of the window's center voxel. High density scores indicate voxels of high confidence. Low density scores highlight voxels of low confidence. In this manner, a stable, non-chaotic estimate of volume (or attribute) quality can be achieved. The user can select a specific range of density values that are significant, and highlight the areas where the values exist in a high concentration.
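A compact sketch of this counting operation uses a box convolution over a 0/1 in-range mask; the uniform box operator and all names are assumptions made for illustration:

```python
import numpy as np
from scipy import ndimage

def voxel_density(volume, low, high, op_shape=(3, 3, 3)):
    """Density score: per-voxel count of in-range neighbours (a sketch)."""
    # 1 where the voxel falls within the threshold range, else 0.
    in_range = ((volume >= low) & (volume <= high)).astype(np.float32)
    # Convolving a box of ones with the mask counts, for each operator
    # position, the in-range voxels; the count lands on the center voxel.
    kernel = np.ones(op_shape, dtype=np.float32)
    return ndimage.convolve(in_range, kernel, mode="constant", cval=0.0)
```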
Density estimates may be determined in two exemplary ways. In the first, the voxel density module 170 determines a density score for every operator position. This is the manner of calculation used in
When applied to a data volume, voxel density produces a density score volume. This volume is similar to the results presented in
The density score volume can also be thought of as a volumetric confidence estimate. With this estimate of data confidence, a variety of operations can be performed. These operations include one or more of noise filtering, edge-preserving smoothing, and volume contrast enhancement.
An input data volume can be modified and enhanced in a variety of ways using the voxel density module 170. The density score volume can be thought of as an estimate of confidence for trends in the volume. Using this confidence estimate, it is possible to enhance the volume through density score-guided rescaling of voxel values. Threshold filtering can remove data that are not of interest. It is also possible to control the degree of smoothing, where regions of low confidence are smoothed more than regions of high confidence.
Filtering by Density Threshold
Binary filtering can be accomplished by removing voxels that have a density score lower than a specified cutoff. In this manner, insignificant data regions can be removed. Voxels that have a density score lower than the specified minimum are replaced by null values. This is demonstrated on a numerical array in
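In terms of the `voxel_density` sketch given earlier, the binary filter reduces to a mask; the thresholds below (values greater than or equal to 6, density score greater than or equal to 4) simply echo the 10×10 numerical example and are otherwise arbitrary:

```python
import numpy as np

# Binary density filtering (a sketch): voxels whose density score is
# lower than the specified minimum are replaced by null values.
# A 3x3 operator matches the 2-D 10x10 example.
density = voxel_density(volume, low=6.0, high=np.inf, op_shape=(3, 3))
filtered = np.where(density >= 4, volume, np.nan)
```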
Density-Guided Smoothing
The same criterion of whether a voxel passes the minimum density threshold can be used to control smoothing operations within a data volume. By smoothing voxels that fail the minimum density test more than voxels that pass the test, insignificant data regions can be deemphasized. It is also possible to control which voxels are included in the smoothing operations.
a and b contain the results of applying 3×3 mean and median filters, respectively, to the raw data in
c contains the results of applying this confidence-adaptive smoothing to the coherence image of a submarine canyon. Only voxels that failed the minimum density test were smoothed, and only voxels that fall outside the valid threshold range were included in smoothing. It should be noted that this method of adaptive smoothing has preserved the fine detail present in the edges of the canyon.
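One way to sketch this confidence-adaptive smoothing is shown below, with the local mean computed only over out-of-range voxels and applied only where the minimum density test fails; the argument names and the uniform window are assumptions:

```python
import numpy as np
from scipy import ndimage

def density_guided_smooth(volume, density, in_range, min_density, size=3):
    """Smooth only low-confidence voxels (a sketch)."""
    # Local mean over voxels outside the valid threshold range only.
    num = ndimage.uniform_filter(np.where(in_range, 0.0, volume), size=size)
    den = ndimage.uniform_filter((~in_range).astype(float), size=size)
    local_mean = np.divide(num, den, out=np.zeros_like(num), where=den > 0)
    # Voxels that fail the minimum density test are smoothed; voxels
    # that pass keep their original values, preserving edges.
    return np.where(density >= min_density, volume, local_mean)
```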
Contrast Enhancement of Data Volumes
Whether a voxel passes or fails the minimum density test can be used to control rescaling of the data volume. Voxels that pass are rescaled by a factor greater than 1. Those that fail the test are rescaled by a factor less than 1. The precise rescaling factor depends on the original data value, and the density score of that voxel. In concept, each voxel is rescaled by a percent of the difference between its own value and the extreme value it is being rescaled towards. The percent rescaling is controlled by two equations. For voxels that pass the minimum density test:
Ratio_pass = 1 + (D_score − D_neutral)/(N_values − D_neutral)
For voxels that fail the minimum density test, the rescaling ratio equation is:
Ratio_fail = (D_neutral − D_score)/D_neutral
This ratio is then multiplied by the original voxel value to obtain the rescaled voxel value. The addition of a ‘rescaling strength’ term allows for a more subtle rescaling operation.
d shows the result of applying this operation to the coherence image of the submarine canyon. This rescaling was done with a strength of 0.5. The same type of contrast enhancement was performed on the curvature image of a channel for
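A sketch of the two ratio equations and the strength term follows. Here D_neutral is taken to be the minimum-density cutoff, N_values the number of voxels in the operator, and the strength term is interpreted as a linear blend of the ratio toward 1; all three readings are assumptions, since the text does not spell them out:

```python
import numpy as np

def contrast_enhance(volume, density, d_neutral, n_values, strength=1.0):
    """Density-controlled contrast enhancement (a sketch)."""
    ratio = np.where(
        density >= d_neutral,
        1.0 + (density - d_neutral) / (n_values - d_neutral),  # pass
        (d_neutral - density) / d_neutral,                     # fail
    )
    # Assumed form of the 'rescaling strength' term: blend toward 1.
    ratio = 1.0 + strength * (ratio - 1.0)
    return volume * ratio
```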
Locally Adaptive Operations
It is also possible to modify the rescaling operation to only rescale locally high values. By linking the threshold range to local variance in the data volume, only locally high voxels will be counted in the density calculations. This prevents a high noise background from overwhelming the voxel density process, and provides a more robust result where the characteristic voxel value of a feature varies significantly.
a is a coherence image showing a portion of a salt body, with surrounding faults.
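The variance linkage might be sketched as follows: a voxel is counted as locally high only when it exceeds its local mean by some multiple of the local standard deviation, and the resulting mask replaces the fixed in-range test in the density calculation. The window size, the multiplier k, and this particular linkage are illustrative assumptions:

```python
import numpy as np
from scipy import ndimage

def locally_high_mask(volume, size=5, k=1.0):
    """In-range test tied to local statistics (a sketch)."""
    local_mean = ndimage.uniform_filter(volume, size=size)
    local_sq = ndimage.uniform_filter(volume * volume, size=size)
    local_std = np.sqrt(np.maximum(local_sq - local_mean**2, 0.0))
    # Count only voxels that stand k local standard deviations above
    # the local mean, so a noisy background cannot dominate the score.
    return volume > local_mean + k * local_std
```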
The voxel density module provides a way to score confidence in a data volume. The number of voxels that fall within a given threshold range is counted within a moving 3-D operator. The result of this count is the density score of the operator's center voxel. Voxels with a high density score are considered significant, while those with a low density score can be considered noise. Significant data regions can be preserved or enhanced, while insignificant data regions are smoothed or filtered out.
Voxel density-guided smoothing and rescaling operations are edge preserving. Significant trends can be enhanced while maintaining their overall shape and internal texture. This is accomplished by selectively smoothing insignificant areas more than valid data trends. Similarly, significant data trends can be selectively gained upward while muting surrounding noise. Such an operation preserves the original flavor of the data, but with an increased S/N ratio.
The threshold range included in density scoring can be linked to local variance in the data volume. In this manner, locally significant data trends are preserved. This allows voxel density to be used for data sets that have significant data value ranges that vary between data regions.
Voxel density represents a potentially very valuable tool when used to pre-process data for visual and automated interpretation. For example, the S/N ratio can be improved, and visual emphasis can be given to important trends through selective smoothing.
In optional step S240, and based on a connectivity score, features can be filtered. Similarly, in step S250, features can be filtered if they are within a connectivity range. Control then continues to step S260.
In step S260, a visual-clutter reduced seismic data volume is output and saved. Control then continues to step S270 where the control sequence ends.
In step S330, and for each operator position, steps S332-S338 are performed. In particular, in step S332, a determination is made whether the operator's center voxel has the highest absolute amplitude of all voxels within the operator. Next, in step S334, and if the highest amplitude voxel is not in the center of the operator, the process moves to the next voxel.
In step S336, and if the highest absolute amplitude voxel is at the center of the operator, the voxel value is written to the output volume in its original position. Next, in step S337, a search upward and downward from the center voxel is performed to determine the extent of the main reflection lobe. Then, in step S338, the full extent of the main reflection lobe is saved to the output. Control then continues to step S340 where the control sequence ends.
In step S440, voxels in the center of the operator are rescaled such that they are emphasized. Next, in step S450, the visually improved volume is output and saved.
In step S530, the density score volume is output and saved, for example, as a volumetric confidence estimate. Control then continues to step S540 where the control sequence ends.
While the above-described flowcharts have been discussed in relation to a particular sequence of events, it should be appreciated that changes to this sequence can occur without materially affecting the operation of the invention. Additionally, the exact sequence of events need not occur as set forth in the exemplary embodiments. Additionally, the exemplary techniques illustrated herein are not limited to the specifically illustrated embodiments but can also be utilized with the other exemplary embodiments and each described feature is individually and separately claimable.
The systems, methods and techniques of this invention can be implemented on a special purpose computer, a programmed microprocessor or microcontroller and peripheral integrated circuit element(s), an ASIC or other integrated circuit, a digital signal processor, a hard-wired electronic or logic circuit such as a discrete element circuit, a programmable logic device such as a PLD, PLA, FPGA, PAL, any means, or the like. In general, any device capable of implementing a state machine that is in turn capable of implementing the methodology illustrated herein can be used to implement the various methods and techniques according to this invention.
Furthermore, the disclosed methods may be readily implemented in processor executable software using object or object-oriented software development environments that provide portable source code that can be used on a variety of computer or workstation platforms. Alternatively, the disclosed system may be implemented partially or fully in hardware using standard logic circuits or VLSI design. Whether software or hardware is used to implement the systems in accordance with this invention is dependent on the speed and/or efficiency requirements of the system, the particular function, and the particular software or hardware systems or microprocessor or microcomputer systems being utilized. The systems, methods and techniques illustrated herein can be readily implemented in hardware and/or software using any known or later developed systems or structures, devices and/or software by those of ordinary skill in the applicable art from the functional description provided herein and with a general basic knowledge of the computer and geologic arts.
Moreover, the disclosed methods may be readily implemented in software that can be stored on a computer-readable storage medium, executed on a programmed general-purpose computer with the cooperation of a controller and memory, a special purpose computer, a microprocessor, or the like. The systems and methods of this invention can be implemented as a program embedded on a personal computer such as an applet, JAVA® or CGI script, in C or C++, Fortran, or the like, as a resource residing on a server or computer workstation, as a routine embedded in a dedicated system or system component, or the like. The system can also be implemented by physically incorporating the system and/or method into a software and/or hardware system, such as the hardware and software systems of a dedicated seismic interpretation device.
It is therefore apparent that there has been provided, in accordance with the present invention, systems and methods for interpreting data. While this invention has been described in conjunction with a number of embodiments, it is evident that many alternatives, modifications and variations would be or are apparent to those of ordinary skill in the applicable arts. Accordingly, it is intended to embrace all such alternatives, modifications, equivalents and variations that are within the spirit and scope of this invention.
This application claims the benefit of and priority under 35 U.S.C. §119(e) to U.S. Patent Application No. 60/987,906, filed 14 Nov. 2007, entitled “Seismic Data Processing,” and is related to PCT Application PCT/US2007/071733 (Published as WO2008/005690), both of which are incorporated herein by reference in their entirety.