Systems and methods to determine depth of soil coverage along a right-of-way

Information

  • Patent Grant
  • Patent Number
    12,087,002
  • Date Filed
    Wednesday, February 28, 2024
  • Date Issued
    Tuesday, September 10, 2024
  • Inventors
    • Miller; Luke R. (Findlay, OH, US)
    • Beard; Joshua J. (Findlay, OH, US)
    • Battles; Brittan (Findlay, OH, US)
  • Original Assignees
  • Examiners
    • Nguyen; Leon Viet Q
  • Agents
    • Womble Bond Dickinson (US) LLP
Abstract
Embodiments of systems and methods to determine depth of soil coverage for an underground feature along a right-of-way are disclosed. In an embodiment, the method may include receiving a depth of cover measurement for the right-of-way. The method may include capturing baseline images of the right-of-way within a first selected time of the depth of cover measurement. The method may include rendering a three dimensional elevation model of the right-of-way from the baseline images. The method may include georeferencing the three dimensional elevation model to generate a georeferenced three dimensional elevation model. The method may include adding the depth of cover measurement to the georeferenced three dimensional elevation model. The method may include rendering an updated three dimensional elevation model of the right-of-way from subsequently captured images. The method may include determining a delta depth of coverage based on the georeferenced and the updated three dimensional elevation model.
Description
FIELD OF DISCLOSURE

The present disclosure relates to systems and methods for determining a current depth of soil coverage along a right-of-way. In particular, the present disclosure relates to systems and methods for determining depth of soil coverage of a pipeline (or, in other embodiments, other buried or underground features) along a right-of-way.


BACKGROUND

Pipeline is positioned throughout various environments worldwide to transport various fluids, such as hydrocarbons and/or renewable hydrocarbons, as well as water and/or other fluids, each in a liquid or gaseous state. For example, hundreds of thousands of miles of pipeline are positioned throughout the United States alone. A majority of such pipeline is buried or underground. Other buried or underground features are positioned worldwide as well, such as utility lines, sewage or septic lines or tanks, tunnels, and/or other various underground features.


Exposure of such pipeline or other underground features, caused by erosion, weather events, or unintentional interference (for example, digging along a right-of-way), may result in damage to, or risk of damage to, the pipeline or other underground feature. To determine whether a pipeline is at risk of exposure, a person or technician may walk along the right-of-way and physically measure the distance from the surface to the top of the pipeline or other underground feature. However, such methods or techniques are expensive, time-consuming, and prone to error (for example, mismeasurement, measurement in the wrong area, and/or lack of sufficient measurements).


Another typical method utilized involves flying over the right-of-way to determine whether a pipeline or other feature has actually been exposed. However, such methods simply detect actual exposures and do not predict whether there is a risk of exposure.


SUMMARY

Thus, in view of the foregoing, Applicant has recognized these problems and others in the art, and has recognized a need for enhanced systems and methods for determining a current depth of soil coverage along a right-of-way. Particularly, the present disclosure relates to systems and methods for determining depth of soil coverage of a pipeline (or, in other embodiments, other buried or underground features) along a right-of-way.


The disclosure herein provides embodiments of systems and methods for determining depth of soil coverage for a pipeline or other underground feature along a portion of a right-of-way or the entire right-of-way quickly (in relation to typical walking surveys or inline inspections), utilizing fewer resources, and/or before the pipeline and/or the other underground feature is actually exposed. Further, the systems and methods may determine the depth of soil coverage to within centimeters, or even less, of the actual depth of soil coverage.


Such systems and methods may include obtaining or receiving a depth of coverage measurement or survey. The depth of coverage survey may be obtained via a walking survey, original construction records, and/or via inline inspection. Such a survey may include a distance from the surface or top of the soil to a top of the feature (for example, a pipeline) along the right-of-way. In other words, the measurements may include measurements for varying points along the right-of-way or for the continuous length of the right-of-way.


Once the depth of coverage measurement is available, the systems and methods may capture or prompt capture of baseline images of the right-of-way. The baseline images may include a plurality of images captured at various angles, for example, top-down images, horizontal images, and/or images at various angles. Further, the baseline images may be captured by one or more vehicles. The one or more vehicles may include a transportation vehicle (for example, a car or truck), an all-terrain vehicle, and/or an aerial vehicle, each of the vehicles being manned or unmanned (for example, a drone).


The systems and methods may then utilize the baseline images to render a three dimensional elevation model of the right-of-way. Such a rendering may include utilization of photogrammetry and/or other models to generate such a three dimensional model. The three dimensional elevation model may include various surface features and/or heights along the path of the right-of-way. The systems and methods may georeference the three dimensional elevation model. In other words, coordinates may be assigned to the three dimensional model, enabling the systems and methods to correlate actual areas of the right-of-way with other values in other models and/or measurements (for example, an updated three dimensional model and/or depth of coverage measurement). The georeferenced three dimensional elevation model may include spatial references. The systems and methods may utilize mathematical formulas and/or known coordinates, spatial references, and/or other models to perform such georeferencing.


The systems and methods may then add the depth of coverage measurements to the georeferenced three dimensional model. The systems and methods may utilize coordinates included in the depth of cover measurements to add the depth of coverage to the correct location in the georeferenced three dimensional elevation model. In an embodiment, the systems and methods may include numbers associated with the depth of coverage in the three dimensional elevation model, allowing users to view current depth of coverage. Further, the systems and methods may relate the current depth of coverage with the current elevation; thus, as the elevation changes (as determined by subsequently captured images), the subsequent depth of coverage may be determined.


The systems and methods, as noted, may capture additional and/or subsequent images. The time that the subsequent images may be captured may be dependent on the location of the right-of-way, among other factors. The systems and methods may render an updated three dimensional elevation model based on these subsequent images. The systems and methods may utilize the updated three dimensional elevation model and the previously determined three dimensional elevation model to determine a delta depth of coverage. The systems and methods may determine whether this delta depth of coverage has exceeded a threshold (dependent, for example, on location and/or type of feature or pipeline) and, if the delta depth of coverage has exceeded the threshold, generate an alert and/or recommended remedial action.


Thus, as noted, the depth of coverage may be determined before a feature or pipeline is actually exposed. Further, the depth of coverage may be determined in less time and/or with fewer resources than typical methods.


Accordingly, an embodiment of the disclosure is directed to a method to determine depth of soil coverage for a pipeline along a pipeline right-of-way. The method may include receiving a right-of-way walking survey for the pipeline right-of-way. The right-of-way walking survey may include a depth of soil coverage over the pipeline for the pipeline right-of-way. The method may include capturing baseline images of the pipeline right-of-way within a first selected time of the right-of-way walking survey. The method may include rendering a three dimensional elevation model of the pipeline right-of-way from the baseline images. The method may include georeferencing the three dimensional elevation model to generate a georeferenced three dimensional elevation model. The method may include adding or superimposing the soil coverage to the georeferenced three dimensional elevation model. The method may include capturing subsequent images of the pipeline right-of-way after a second selected time. The method may include rendering an updated three dimensional elevation model of the pipeline right-of-way from the subsequent images. The method may include determining a delta depth of soil coverage of the pipeline based on the georeferenced three dimensional elevation model and the updated three dimensional elevation model.


In an embodiment, the right-of-way walking survey may comprise a survey grade quality and may include one or more of latitude, longitude, elevation, depth of soil coverage from a top of the pipeline to a surface of the pipeline right-of-way, XY coordinates, Z coordinates, or a measurement process. In another embodiment, the delta depth of soil coverage may be within about 3 centimeters of an actual depth of soil coverage.


In an embodiment, the method may include determining whether the delta depth of soil coverage exceeds a selected threshold and, in response to a determination that the delta depth of soil coverage exceeds the selected threshold, generating an alert. The selected threshold may comprise a variable length based on a location of the pipeline and an environment surrounding the pipeline. The alert may include one or more of the delta depth of soil coverage, a previous depth of soil coverage, the selected threshold, a location of the pipeline right-of-way where the delta depth of soil coverage exceeds the selected threshold, a severity level, or a next action, and the next action may include one or more of adjustment of a current depth of soil coverage, adjustment of pipeline depth, or installation of pipeline protection.


In another embodiment, the method may include updating the georeferenced three dimensional elevation model with the updated three dimensional model. An updated georeferenced three dimensional elevation model may include one or more of symbols or colors indicating a change in the depth of soil coverage and a severity of the change.


In an embodiment, the second selected time may comprise a time based on a location of the pipeline and an environment surrounding the pipeline. The second selected time may comprise a time of less than a year, about 1 year, about 2 years, about 3 years, about 5 years, about 8 years, or about 10 years. The capture of second subsequent images may occur based on the second selected time and the delta depth of soil coverage.


The method may further include verifying the delta depth of soil coverage at a portion of the pipeline right-of-way via a second right-of-way walking survey.


In an embodiment, the baseline images and/or the subsequent images may be high resolution aerial images captured by an image sensor on an unmanned and/or manned aerial vehicle.


Another embodiment of the disclosure is directed to a method to determine depth of soil coverage for an underground feature along a right-of-way. The method may include capturing subsequent images of the right-of-way after a selected time. The method may include rendering an updated three dimensional elevation model of the right-of-way from the subsequent images. The method may include determining a delta depth of soil coverage of the underground feature based on the georeferenced three dimensional elevation model and the updated three dimensional elevation model. The method may include superimposing a depth of soil coverage and the delta depth of soil coverage on the updated three dimensional elevation model. In an embodiment, the underground feature may comprise a pipeline, a utility line, or a septic system. In a further embodiment, if the underground feature comprises a pipeline, then the pipeline may transport a hydrocarbon fluid.


In another embodiment, the method may include receiving a depth of cover measurement from ground level for the right-of-way. The depth of cover measurement may be received from one or more of a walking survey, original construction records, or an inline inspection. The method may include capturing baseline images of the right-of-way within a prior selected time of reception of the depth of cover measurement. The method may include rendering a three dimensional elevation model of the right-of-way from the baseline images. The method may include georeferencing the three dimensional elevation model to generate the georeferenced three dimensional elevation model. The method may include superimposing the soil coverage onto the georeferenced three dimensional elevation model.


In another embodiment, the method may include, in response to the delta depth of soil coverage being outside of a threshold range, generating an alert to include an indication of an area with a depth of soil coverage below a selected limit. The selected limit may be based on a type of the underground feature. The alert may be superimposed onto the updated three dimensional model. The alert may include a remedial action, and the remedial action may include one or more of adding surface coverage over the underground feature or lowering the underground feature further below ground.


Another embodiment of the disclosure is directed to a system for determining depth of soil coverage for a pipeline along a pipeline right-of-way. The system may include survey and image capture circuitry. The survey and image capture circuitry may be configured to receive, at an initial selected time, (a) a right-of-way walking survey and (b) baseline captured images of the pipeline right-of-way, and to receive subsequent images of the pipeline right-of-way at one or more subsequent selected times. The system may include baseline modeling circuitry. The baseline modeling circuitry may be configured to determine a baseline depth of soil coverage model for the pipeline along the pipeline right-of-way based on (a) the right-of-way walking survey and (b) the baseline captured images of the pipeline right-of-way. The system may include depth of coverage modeling circuitry. The depth of coverage modeling circuitry may be configured to, in response to reception of subsequent images, determine an updated surface level of the pipeline right-of-way and update the baseline depth of soil coverage model based on the updated surface level to generate a subsequent depth of soil coverage model.


In an embodiment, the depth of coverage modeling circuitry may be further configured to, if the subsequent depth of soil coverage model is available, update the subsequent depth of soil coverage model.


In an embodiment, the baseline depth of soil coverage model and the subsequent depth of soil coverage model may each comprise a three dimensional elevation model indicating a distance from a top of the pipeline along the pipeline right-of-way to a surface level. The distance may comprise a length within about 3 centimeters of an actual distance from the top of the pipeline in the pipeline right-of-way to the surface level.


In another embodiment, the system may include one or more controllers. The one or more controllers may include the survey and image capture circuitry and/or the depth of coverage modeling circuitry.


Another embodiment of the disclosure is directed to a computing device for determining depth of soil coverage for a pipeline along a pipeline right-of-way, the computing device comprising a processor and a non-transitory computer-readable storage medium storing software instructions that, when executed by the processor, in response to reception of (a) a right-of-way walking survey for the pipeline right-of-way including a depth of soil coverage over the pipeline for the pipeline right-of-way and (b) captured baseline images of the pipeline right-of-way within a first selected time of the right-of-way walking survey, may render a three dimensional elevation model of the pipeline right-of-way from the baseline images. The software instructions, when executed by the processor, may georeference the three dimensional elevation model to generate a georeferenced three dimensional elevation model. The software instructions, when executed by the processor, may superimpose or add the soil coverage to the georeferenced three dimensional elevation model. The software instructions, when executed by the processor, may capture subsequent images of the pipeline right-of-way after a second selected time. The software instructions, when executed by the processor, may render an updated three dimensional elevation model of the pipeline right-of-way from the subsequent images. The software instructions, when executed by the processor, may determine a delta depth of soil coverage of the pipeline based on the georeferenced three dimensional elevation model and the updated three dimensional elevation model.


In another embodiment, the software instructions, when executed by the processor, may integrate the delta depth of soil coverage of the pipeline into the updated three dimensional elevation model. The software instructions, when executed by the processor, may determine an indicator for each section of the updated three dimensional elevation model based on the delta depth of soil coverage and a selected threshold for each section. The selected threshold may comprise a value based on one or more of a location of the pipeline right-of-way or an environment of the pipeline right-of-way.


Still other aspects and advantages of these embodiments and other embodiments, are discussed in detail herein. Moreover, it is to be understood that both the foregoing information and the following detailed description provide merely illustrative examples of various aspects and embodiments, and are intended to provide an overview or framework for understanding the nature and character of the claimed aspects and embodiments. Accordingly, these and other objects, along with advantages and features herein disclosed, will become apparent through reference to the following description and the accompanying drawings. Furthermore, it is to be understood that the features of the various embodiments described herein are not mutually exclusive and may exist in various combinations and permutations.





BRIEF DESCRIPTION OF DRAWINGS

These and other features, aspects, and advantages of the disclosure will become better understood with regard to the following descriptions, claims, and accompanying drawings. It is to be noted, however, that the drawings illustrate only several embodiments of the disclosure and, therefore, are not to be considered limiting of the disclosure's scope.



FIG. 1A is a simplified diagram that illustrates a system for determining a depth of coverage for a right-of-way, according to an embodiment of the disclosure.



FIG. 1B, FIG. 1C, and FIG. 1D are simplified schematic diagrams that illustrate a system for determining a depth of coverage for a right-of-way and a vehicle for capturing images of the right-of-way, according to an embodiment of the disclosure.



FIG. 2 is a simplified diagram that illustrates an apparatus for determining a depth of coverage for a right-of-way, according to an embodiment of the disclosure.



FIG. 3 is a simplified diagram that illustrates a control system for controlling determination of a depth of coverage for a right-of-way, according to an embodiment of the disclosure.



FIG. 4 is a simplified flow diagram for determining a depth of coverage for a right-of-way, according to an embodiment of the disclosure.



FIG. 5A, FIG. 5B, and FIG. 5C are examples of a three dimensional elevation model, according to an embodiment of the disclosure.





DETAILED DESCRIPTION

So that the manner in which the features and advantages of the embodiments of the systems and methods disclosed herein, as well as others, which will become apparent, may be understood in more detail, a more particular description of embodiments of systems and methods briefly summarized above may be had by reference to the following detailed description of embodiments thereof, in which one or more are further illustrated in the appended drawings, which form a part of this specification. However, it is to be noted that the drawings illustrate only various embodiments of the systems and methods disclosed herein and are therefore not to be considered limiting of the scope of the systems and methods disclosed herein as it may include other effective embodiments as well.


Typically, to determine depth of coverage (in other words, the distance from the surface to the top of a buried feature) along a right-of-way, a walking survey, an inline inspection, and/or construction records may be utilized. Each technique or method may utilize large amounts of time and/or resources. For example, for a walking survey, a user or technician may walk varying lengths of the right-of-way physically measuring depth of cover. Finally, capturing images over a right-of-way may not accurately predict potential pipeline exposure and, typically, simply captures currently exposed pipeline.


Thus, in view of the foregoing, Applicant has recognized these problems and others in the art, and has recognized a need for enhanced systems and methods for determining a current depth of soil coverage along a right-of-way. Such systems and methods may include obtaining or receiving a depth of coverage measurement or survey. The depth of coverage survey may be obtained via a walking survey, original construction records, and/or via inline inspection. Such a walking survey may include a distance from the surface or ground surface or top of the soil or other material to a top of the feature (for example, a pipeline) along the right-of-way. In other words, the measurements may include measurements for varying points along the right-of-way or for the continuous length of the right-of-way that show the amount or depth of coverage of the underground feature.


Once the depth of coverage measurement is available, the systems and methods may capture or prompt capture of baseline images of the right-of-way. The baseline images may include a plurality of images captured at various angles, for example, top-down images, horizontal images, and/or images at various angles. Further, the baseline images may be captured by one or more vehicles. The one or more vehicles may include a transportation vehicle (for example, a car or truck), an all-terrain vehicle, and/or an aerial vehicle, each of the vehicles being manned or unmanned (for example, a drone). Such systems and methods may capture the baseline images within a selected time of reception of the depth of coverage measurement. For example, the baseline images may be captured within about 1 week, about 2 weeks, about 30 days, or about 2 months of the depth of coverage measurement. Such a time frame may ensure that an accurate, current depth of coverage is correlated to a current three dimensional elevation model.
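
For illustration only, the following sketch shows one way such a time-window check might be expressed in software; the 30-day window, the function name, and the dates are assumptions for illustration and are not part of the disclosure.

```python
from datetime import datetime, timedelta

# Illustrative sketch only: verify that baseline images were captured within a
# selected time of the depth of coverage measurement (30 days assumed here).
SELECTED_WINDOW = timedelta(days=30)

def images_within_window(measurement_date: datetime,
                         image_dates: list[datetime],
                         window: timedelta = SELECTED_WINDOW) -> bool:
    """Return True if every baseline image falls within the window of the survey."""
    return all(abs(capture - measurement_date) <= window for capture in image_dates)

survey_date = datetime(2024, 2, 1)
captures = [datetime(2024, 2, 10), datetime(2024, 2, 12)]
print(images_within_window(survey_date, captures))  # True
```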


The systems and methods may then utilize the baseline images to render a three dimensional elevation model of the right-of-way. Such a rendering may include utilization of photogrammetry and/or other models to generate the three dimensional model featuring varying heights and geographic features and/or elements. The three dimensional elevation model may include various surface features and/or heights along the path of the right-of-way. The systems and methods may georeference the three dimensional elevation model. In other words, coordinates may be assigned to the three dimensional model, enabling the systems and methods to correlate actual areas of the right-of-way with other values in other models and/or measurements (for example, an updated three dimensional model and/or depth of coverage measurement). The georeferenced three dimensional elevation model may include spatial references. The systems and methods may utilize mathematical formulas and/or known coordinates, spatial references, and/or other models to perform such georeferencing.
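
For illustration only, the following sketch shows one way a gridded elevation model might be georeferenced with a simple affine (origin plus cell size) mapping; the grid layout, origin coordinates, cell size, and function name are assumptions for illustration.

```python
import numpy as np

# Illustrative sketch only: assign world coordinates to a gridded elevation
# model produced by photogrammetry, using a north-up origin and cell size.
def georeference_grid(dem: np.ndarray, origin_easting: float,
                      origin_northing: float, cell_size_m: float):
    """Return easting/northing arrays aligned with the DEM cells."""
    rows, cols = dem.shape
    eastings = origin_easting + np.arange(cols) * cell_size_m
    northings = origin_northing - np.arange(rows) * cell_size_m
    return np.meshgrid(eastings, northings)

dem = 200.0 + 5.0 * np.random.rand(100, 120)   # synthetic surface elevations (m)
E, N = georeference_grid(dem, origin_easting=500000.0,
                         origin_northing=4430000.0, cell_size_m=0.5)
```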


The systems and methods may then superimpose or add the depth of coverage measurements to the georeferenced three dimensional model. The systems and methods may utilize coordinates included in the depth of cover measurements to superimpose or add the depth of coverage to the correct location or correlated location in the georeferenced three dimensional elevation model. In an embodiment, the systems and methods may include the actual numbers in the three dimensional elevation model, allowing users to view current depth of coverage measurements in relation to the surface. Further, as noted, the systems and methods may relate or correlate the current depth of coverage with the current elevation; thus, as the elevation changes (as determined based upon subsequently captured images), the subsequent depth of coverage may be determined.
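
For illustration only, the following sketch shows one way surveyed depth of coverage points might be tied to the baseline surface so that later surface changes translate directly into updated depths; the nearest-cell lookup and all names and values are assumptions for illustration.

```python
import numpy as np

# Illustrative sketch only: fix the pipe-top elevation at each surveyed point so
# that a later change in surface elevation yields an updated depth of cover.
def nearest_cell(easting, northing, eastings, northings):
    col = int(np.argmin(np.abs(eastings - easting)))
    row = int(np.argmin(np.abs(northings - northing)))
    return row, col

def pipe_top_elevations(dem, eastings, northings, survey_points):
    """survey_points: iterable of (easting, northing, depth_of_cover) in meters."""
    records = []
    for e, n, cover in survey_points:
        row, col = nearest_cell(e, n, eastings, northings)
        records.append((row, col, float(dem[row, col]) - cover))  # surface minus cover
    return records

dem = np.full((50, 50), 210.0)                 # synthetic baseline surface (m)
eastings = 500000.0 + np.arange(50)            # 1 m cells
northings = 4430000.0 - np.arange(50)
print(pipe_top_elevations(dem, eastings, northings,
                          [(500010.0, 4429990.0, 1.2)]))   # approx. [(10, 10, 208.8)]
```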


The systems and methods, as noted, may capture additional and/or subsequent images. The time at which the subsequent images may be captured may depend on the location of the right-of-way, among other factors. For example, for rights-of-way in an arid environment with few inhabitants or near agricultural land, geographic changes to the surface may be unexpected; thus, monitoring may occur yearly or at intervals of multiple years. On the other hand, rights-of-way near numerous inhabitants (for example, near a city or town), near a river or other waterway, and/or near other areas prone to potential changes to the surface may be monitored frequently, for example, weekly, monthly, and/or a plurality of times within a year. The systems and methods may render an updated three dimensional elevation model based on these subsequent images. The systems and methods may utilize the updated three dimensional elevation model and the previously determined three dimensional elevation model to determine a delta depth of coverage. The systems and methods may determine whether this delta depth of coverage has exceeded a threshold (dependent, for example, on location and/or type of feature or pipeline) and, if the threshold has been exceeded, generate an alert and/or a recommended next or remedial action. Such a next or remedial action may include adding or adjusting surface coverage (such as dirt or other materials) and/or lowering the pipeline further below grade or ground, among other actions.
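
For illustration only, the following sketch shows one way a delta depth of coverage and a threshold-based alert might be computed from a baseline and an updated elevation model; the 0.3 meter threshold and the names are assumptions for illustration.

```python
import numpy as np

# Illustrative sketch only: the delta depth of cover is the surface-elevation
# change between the baseline and updated models; soil loss beyond a threshold
# triggers an alert with a suggested remedial action.
THRESHOLD_M = 0.3   # assumed value; the disclosure ties the threshold to location/feature type

def delta_depth_of_cover(baseline_dem: np.ndarray, updated_dem: np.ndarray) -> np.ndarray:
    """Positive values indicate lost cover relative to the baseline."""
    return baseline_dem - updated_dem

def build_alerts(delta: np.ndarray, threshold: float = THRESHOLD_M) -> list:
    rows, cols = np.where(delta > threshold)
    return [{"cell": (int(r), int(c)),
             "soil_loss_m": float(delta[r, c]),
             "next_action": "add surface coverage or lower the feature"}
            for r, c in zip(rows, cols)]

baseline = np.full((10, 10), 210.0)
updated = baseline.copy()
updated[4, 5] -= 0.5                      # half a meter of erosion at one cell
print(build_alerts(delta_depth_of_cover(baseline, updated)))
```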


Thus, as noted, the depth of coverage may be determined before an underground feature or pipeline is actually exposed. Further, the depth of coverage may be determined in less time and/or with fewer resources than typical methods.



FIG. 1A is a simplified diagram that illustrates a system or a depth of coverage system 102 for determining a depth of coverage for a right-of-way, according to an embodiment of the disclosure. The depth of coverage system 102 may include a processor 104 and memory 106. The memory 106 may store instructions, such as survey and image capture instructions 108, baseline modeling instructions 112, and/or depth of coverage modeling instructions 110. The depth of coverage system 102 may connect to a vehicle 114 or a sensor 116 (such as an image sensor) of the vehicle 114. In an embodiment, the depth of coverage system 102 may be in signal communication with the vehicle 114 or sensor 116. In such embodiments, the depth of coverage system 102 may receive images (for example, baseline or subsequent images) as they are captured by the vehicle 114 or sensor 116. In another embodiment, the depth of coverage system 102 may receive captured images after the images are captured and after the vehicle 114 has returned to a selected location. In yet another embodiment, the depth of coverage system 102 may connect to a database 118. In such an embodiment, the captured images may be transferred to the database 118 from the vehicle 114 and the depth of coverage system 102 may obtain images from the database 118. Further, the depth of coverage system 102 may connect to a user interface 120 and/or a controller 122. The depth of coverage system 102 may request and/or receive images and/or depth of coverage measurements from the user interface 120 and/or controller 122 (and/or other sources in other embodiments). Further, the depth of coverage system 102 may generate alerts and transmit such alerts to a user via the user interface 120 and/or the controller 122.


As noted, the memory 106 may include instructions. The instructions may include survey and image capture instructions 108. When executed by the processor 104, the survey and image capture instructions 108 may initially cause the depth of coverage system 102 to request and/or initiate depth of coverage measurements for a selected right-of-way or section or portion of the selected right-of-way. Such a depth of coverage measurement may be obtained via a walking survey (for example, via a technician or user), via inline inspection (for example, a device may pass through a pipeline and measure depth of coverage, and the device may be automatically initiated via the depth of coverage system 102), and/or via original construction records (for example, as stored in the database 118). In an embodiment, the inline inspection tool may comprise a smart pig configured to capture latitude, longitude, and elevation measurements as the smart pig progresses through a pipeline. In an embodiment, the walking survey, the inline inspection, and/or the original construction records may include one or more of latitude, longitude, elevation, depth of soil coverage from a top of the pipeline to a surface of the pipeline right-of-way, XY coordinates, Z coordinates, or a measurement process.
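
For illustration only, the following sketch shows one possible record for a single depth of coverage observation, regardless of whether it originates from a walking survey, an inline inspection, or original construction records; the field names and example values are assumptions for illustration.

```python
from dataclasses import dataclass

# Illustrative sketch only: one depth-of-cover observation along the right-of-way.
@dataclass
class DepthOfCoverPoint:
    latitude: float           # decimal degrees
    longitude: float          # decimal degrees
    elevation_m: float        # surface (Z) elevation
    depth_of_cover_m: float   # surface to top of the pipeline
    source: str               # "walking_survey", "inline_inspection", or "construction_record"

point = DepthOfCoverPoint(41.05, -83.65, 235.2, 1.1, "walking_survey")
print(point.depth_of_cover_m)   # 1.1
```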


In another embodiment, the pipeline may transport hydrocarbon fluids and/or renewable fluids. “Hydrocarbon fluids” as used herein, may refer to petroleum fluids, renewable fluids, and other hydrocarbon based fluids. “Renewable fluids” as used herein, may refer to fluids containing and/or based on plant and/or animal derived feedstock. Further, the renewable fluids may be hydrocarbon based. For example, a renewable fluid may be a pyrolysis oil, oleaginous feedstock, biomass derived feedstock, or other fluids, as will be understood by those skilled in the art.


In another embodiment, the memory 106 may include baseline modeling instructions 112. The baseline modeling instructions 112 may, when executed by the processor 104, cause the depth of coverage system 102 to initiate and/or receive baseline images. For example, when the baseline modeling instructions 112 are executed, the depth of coverage system 102 may send a signal to the vehicle 114 to capture baseline images. The vehicle 114 may, in an embodiment, comprise a transportation vehicle (such as a car or truck), an all-terrain vehicle, an unmanned aerial vehicle (such as a drone), and/or a manned aerial vehicle (such as an airplane). In a further embodiment, the depth of coverage system 102 may initiate or cause an unmanned aerial vehicle to automatically traverse a selected right-of-way and capture baseline images of the selected right-of-way.


Once the baseline images have been captured and received by the depth of coverage system 102, the baseline modeling instructions 112 may cause the depth of coverage system 102 to render a three dimensional elevation model of the right-of-way. As noted, the baseline images may include top-down images, angled top-down images, horizontal images, and/or images from other directions. All of the baseline images may be utilized to generate a three dimensional elevation model, for example, via a photogrammetry algorithm and/or other models. Such a three dimensional elevation model may include the surface of the right-of-way, including various heights and/or features of the surface or ground level of the right-of-way. In an embodiment, the three dimensional baseline model may include ground control points. The ground control points may be determined based on metadata included in the captured images. Further, the ground control points may include known points including visible objects (for example, an aerial marker, waterway crossing, edge of road crossings, an encumbrance, and/or above grade features or pipeline features). The ground control points may provide a highly accurate benchmark measurement to improve the overall accuracy of surrounding elevation data.
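
For illustration only, the following sketch shows one way ground control points with known elevations might be used to remove a constant vertical bias from a rendered elevation model; the bias-correction approach and the names and values are assumptions for illustration.

```python
import numpy as np

# Illustrative sketch only: use ground control points (cells whose true elevation
# is known, such as aerial markers or road crossings) to remove a constant
# vertical bias from the rendered elevation model.
def apply_gcp_bias_correction(dem: np.ndarray, gcps: list):
    """gcps: list of (row, col, true_elevation_m); returns corrected DEM and bias."""
    residuals = [true_z - dem[row, col] for row, col, true_z in gcps]
    bias = float(np.mean(residuals))
    return dem + bias, bias

dem = np.full((20, 20), 211.4)
corrected, bias = apply_gcp_bias_correction(dem, [(2, 3, 211.1), (10, 15, 211.2)])
print(round(bias, 2))   # -0.25
```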


Once the baseline three dimensional elevation model has been generated, the baseline modeling instructions 112 may cause the depth of coverage system 102 to georeference the three dimensional elevation model. In other words, the depth of coverage system 102 may bind or assign the points in the three dimensional elevation model to a spatial reference via one or more mathematical formulas and/or known ground and/or location points. In an embodiment, georeferencing may occur prior to or in parallel with generation of the baseline three dimensional elevation model. Finally, the baseline modeling instructions 112 may cause the depth of coverage system 102 to superimpose or add the depth of coverage measurements to the georeferenced three dimensional elevation model, based on the georeferencing. In an embodiment, the depth of coverage measurement may be georeferenced prior to addition to the georeferenced three dimensional elevation model, allowing the depth of coverage system 102 to include the depth of coverage measurements in an accurate portion of the three dimensional elevation model.


In another embodiment, the memory 106 may include depth of coverage modeling instructions 110. The depth of coverage modeling instructions 110 may, once a baseline model (for example, the georeferenced three dimensional elevation model with depth of coverage measurements) has been established and when executed by the processor 104, cause the depth of coverage system 102 to initiate or request subsequent images. The vehicle 114 or sensor 116 may capture the subsequent images in a manner similar to the capture of the baseline images. In an embodiment, the subsequent images may be captured after a second selected time. The second selected time may be based upon the environment of the right-of-way (for example, environments prone to change may be monitored at shorter intervals than environments that do not change as often). In an embodiment, the second selected time may include less than a year, about 1 year, about 2 years, about 3 years, about 5 years, about 8 years, about 10 years, or even longer in some examples (such an amount of time dependent on the environment of the right-of-way).
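
For illustration only, the following sketch shows one way the second selected time might be keyed to the environment of the right-of-way; the environment categories and interval values are assumptions for illustration and are not values from the disclosure.

```python
from datetime import datetime, timedelta

# Illustrative sketch only: re-imaging interval keyed to the right-of-way environment.
REIMAGING_INTERVAL = {
    "urban_or_populated": timedelta(days=30),
    "waterway_adjacent":  timedelta(days=90),
    "agricultural":       timedelta(days=365),
    "arid_remote":        timedelta(days=3 * 365),
}

def next_capture_due(environment: str, last_capture: datetime) -> datetime:
    return last_capture + REIMAGING_INTERVAL[environment]

print(next_capture_due("urban_or_populated", datetime(2024, 9, 1)))   # 2024-10-01 00:00:00
```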


Once the subsequent images have been captured and received by the depth of coverage system 102, the depth of coverage modeling instructions 110 may be executed to cause the depth of coverage system 102 to generate or render an updated three dimensional elevation model. Further, the depth of coverage system 102 may georeference the updated three dimensional elevation model. Finally, the depth of coverage modeling instructions 110 may cause the depth of coverage system 102 to determine a delta depth of coverage. The depth of coverage system 102 may compare the updated three dimensional elevation model with the previous three dimensional elevation model to determine such a delta depth of coverage. In an embodiment, to verify that the delta depth of coverage is accurate, after a determination of the delta depth of coverage, the depth of coverage system 102 may prompt and/or initiate a walking survey or inline inspection.


In another embodiment, the depth of coverage modeling instructions 110, when executed, may cause the depth of coverage system 102 to determine whether the delta depth of coverage exceeds a selected threshold. The selected threshold may vary depending on the environment of the right-of-way. For example, the selected threshold for an area where depth of coverage is not expected to change frequently and/or significantly may be relatively small, such as on the order of centimeters, whereas the selected threshold for an area where depth of coverage may change significantly over time may be relatively larger, such as on the order of meters. If the delta depth of coverage exceeds the selected threshold, then the depth of coverage system 102 may generate an alert. The alert may indicate and/or include the delta depth of coverage, the location of the right-of-way where the alert occurred (for example, including, in an embodiment, the coordinates), and/or a next action or remedial action. In an embodiment, the next action or remedial action may include adding soil or other material over the top of the affected area and/or digging out the underground feature (such as a pipeline) and lowering the underground feature further below ground.


In another embodiment, the depth of coverage system 102 may include and/or generate another updated three dimensional elevation model based on an alert, or that further updated three dimensional elevation model may be the alert or include the alert therein. For example, if an issue is detected or if a portion of the right-of-way includes a delta depth of coverage that exceeds the selected threshold, then that portion of the right-of-way may be highlighted. Further, additional information may be added to such a portion of the right-of-way in the further updated three dimensional elevation model, such as, for example, location data, actual depth of coverage (for example, within about 1 centimeter or within about 3 centimeters to about 10 centimeters), severity (for example, high, medium, or low), and/or other relevant data.
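
For illustration only, the following sketch shows one way a severity level and display color might be assigned to a section of the updated model; the 0.9 meter minimum cover, the margin bands, and the colors are assumptions for illustration.

```python
# Illustrative sketch only: classify a right-of-way section by its remaining
# cover margin and pick a display color for the updated elevation model.
def severity(remaining_cover_m: float, minimum_cover_m: float = 0.9) -> str:
    margin = remaining_cover_m - minimum_cover_m
    if margin < 0.0:
        return "high"
    if margin < 0.3:
        return "medium"
    return "low"

SEVERITY_COLOR = {"high": "red", "medium": "yellow", "low": "green"}

level = severity(0.8)
print(level, SEVERITY_COLOR[level])   # high red
```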


As noted, the depth of coverage system 102 may connect to a controller 122 or a plurality of controllers. In such embodiments, the controller 122 may be utilized to monitor the right-of-way. The controller 122 may control various vehicles and/or other components associated with underground features (such as a pipeline).


In some examples, the depth of coverage system 102 may be a computing device. The term “computing device” is used herein to refer to any one or all of programmable logic controllers (PLCs), programmable automation controllers (PACs), industrial computers, servers, virtual computing device or environment, desktop computers, personal data assistants (PDAs), laptop computers, tablet computers, smart books, palm-top computers, personal computers, smartphones, virtual computing devices, cloud based computing devices, and similar electronic devices equipped with at least a processor and any other physical components necessary to perform the various operations described herein. Devices such as smartphones, laptop computers, and tablet computers are generally collectively referred to as mobile devices.


The term “server” or “server device” is used to refer to any computing device capable of functioning as a server, such as a master exchange server, web server, mail server, document server, or any other type of server. A server may be a dedicated computing device or a server module (e.g., an application) hosted by a computing device that causes the computing device to operate as a server. A server module (e.g., server application) may be a full function server module, or a light or secondary server module (e.g., light or secondary server application) that is configured to provide synchronization services among the dynamic databases on computing devices. A light server or secondary server may be a slimmed-down version of server type functionality that can be implemented on a computing device, such as a smart phone, thereby enabling it to function as an Internet server (e.g., an enterprise e-mail server) only to the extent necessary to provide the functionality described herein.


As used herein, a “non-transitory machine-readable storage medium” or “memory” may be any electronic, magnetic, optical, or other physical storage apparatus to contain or store information such as executable instructions, data, and the like. For example, any machine-readable storage medium described herein may be any of random access memory (RAM), volatile memory, non-volatile memory, flash memory, a storage drive (e.g., hard drive), a solid state drive, any type of storage disc, and the like, or a combination thereof. The memory may store or include instructions executable by the processor.


As used herein, a “processor” or “processing circuitry” may include, for example, one processor or multiple processors included in a single device or distributed across multiple computing devices. The processor (such as processor 104 shown in FIG. 1A or processing circuitry 202 shown in FIG. 2) may be at least one of a central processing unit (CPU), a semiconductor-based microprocessor, a graphics processing unit (GPU), a field-programmable gate array (FPGA) to retrieve and execute instructions, a real time processor (RTP), other electronic circuitry suitable for the retrieval and execution of instructions stored on a machine-readable storage medium, or a combination thereof.



FIG. 1B, FIG. 1C, and FIG. 1D are simplified schematic diagrams that illustrate a system for determining a depth of coverage for a right-of-way and a vehicle for capturing images of the right-of-way, according to an embodiment of the disclosure. Turning first to FIG. 1B, the underground feature comprises a pipeline 124 buried underground or positioned beneath the surface 128. While a pipeline 124 is illustrated, as a non-limiting embodiment, as the underground feature, it will be understood that other underground features may be positioned underground along a right-of-way. The depth of coverage 126 may refer to the amount of material or soil covering the pipeline 124 at a particular point. While the surface 128 is illustrated as a slope, it will be understood that the surface may comprise a flat or relatively uniform surface, an uneven surface, a surface including various objects (for example, trees, other foliage, buildings, waterways, and/or other objects), and/or combinations thereof along the entirety of a right-of-way.


The vehicle, in such embodiments, may comprise a manned aerial vehicle 132, such as an airplane. The manned aerial vehicle 132 may include an image sensor (for example, a camera, a series of image sensors, a normal camera, a wide-angle camera, an ultra-wide angle camera, an infrared camera, a video camera, a camera configured to take a plurality of images consecutively and at high speed, and/or a camera configured to capture multi-spectral images, among other types of cameras) configured to capture images (as illustrated by 130 in FIG. 1B). The manned aerial vehicle 132 may fly along a portion of or the entirety of the right-of-way, capturing images along the route. The manned aerial vehicle 132 may connect to the depth of coverage system 102. In an embodiment, as the manned aerial vehicle 132 captures images, the manned aerial vehicle may transmit those images to the depth of coverage system 102 in real time. In another embodiment, the manned aerial vehicle 132 may capture the images and, once the flight along the right-of-way is complete, return to a location. At the location, the manned aerial vehicle 132 may connect to the depth of coverage system 102, a database, and/or another computing device. The manned aerial vehicle 132 may then proceed to transmit the images to the depth of coverage system 102, the database, and/or the other computing device. In embodiments, the depth of coverage system 102 may connect to the database and/or the other computing device and, if the manned aerial vehicle 132 transmits images to the database and/or the other computing device, receive and/or scan for images collected and stored in the database and/or the other computing device.


Turning to FIGS. 1C and 1D, other vehicles may be used to capture images, rather than or in addition to the manned aerial vehicle 132. For example, an unmanned aerial vehicle 134 (such as a drone) and/or a transportation vehicle 136 (such as a truck or all-terrain vehicle) may be utilized to capture images. Other vehicles may be utilized, as well as satellite based imagery.


In an embodiment, each captured image may include a time stamp and/or location data. If multiple vehicles and/or other image sources (for example, satellites) are used to capture images, then the depth of coverage system 102 may utilize the location data for georeferencing and/or the time stamp to determine the latest delta depth of coverage.
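
For illustration only, the following sketch shows one way time stamps might be used to keep only the most recent capture per location when multiple image sources are available; the tile identifiers, paths, and function name are assumptions for illustration.

```python
from datetime import datetime

# Illustrative sketch only: when several vehicles or satellites image the same
# area, keep only the most recent capture per location tile before computing the
# latest delta depth of coverage.
def latest_per_tile(captures: list) -> dict:
    """captures: iterable of (tile_id, timestamp, image_path)."""
    latest = {}
    for tile_id, timestamp, path in captures:
        if tile_id not in latest or timestamp > latest[tile_id][0]:
            latest[tile_id] = (timestamp, path)
    return latest

captures = [
    ("ROW-017", datetime(2024, 6, 1), "drone/0173.jpg"),
    ("ROW-017", datetime(2024, 7, 4), "plane/0551.jpg"),
]
print(latest_per_tile(captures)["ROW-017"][1])   # plane/0551.jpg
```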



FIG. 2 is a simplified diagram that illustrates an apparatus 200 for determining a depth of coverage for a right-of-way, according to an embodiment of the disclosure. Such an apparatus 200 may be comprised of a processing circuitry 202, a memory 204, a communications circuitry 206, a survey and image capture circuitry 208, a baseline modeling circuitry 210, and a depth of coverage modeling circuitry 212, each of which will be described in greater detail below. While the various components are illustrated in FIG. 2 as being connected with processing circuitry 202, it will be understood that the apparatus 200 may further comprise a bus (not expressly shown in FIG. 2) for passing information amongst any combination of the various components of the apparatus 200. The apparatus 200 may be configured to execute various operations described herein, such as those described above in connection with FIGS. 1A-1D and below in connection with FIGS. 3-4.


The processing circuitry 202 (and/or co-processor or any other processor assisting or otherwise associated with the processor) may be in communication with the memory 204 via a bus for passing information amongst components of the apparatus. The processing circuitry 202 may be embodied in a number of different ways and may, for example, include one or more processing devices configured to perform independently. Furthermore, the processor may include one or more processors configured in tandem via a bus to enable independent execution of software instructions, pipelining, and/or multithreading.


The processing circuitry 202 may be configured to execute software instructions stored in the memory 204 or otherwise accessible to the processing circuitry 202 (e.g., software instructions stored on a separate storage device). In some cases, the processing circuitry 202 may be configured to execute hard-coded functionality. As such, whether configured by hardware or software methods, or by a combination of hardware with software, the processing circuitry 202 represents an entity (for example, physically embodied in circuitry) capable of performing operations according to various embodiments of the present disclosure while configured accordingly. Alternatively, as another example, when the processing circuitry 202 is embodied as an executor of software instructions, the software instructions may specifically configure the processing circuitry 202 to perform the algorithms and/or operations described herein when the software instructions are executed.


Memory 204 is non-transitory and may include, for example, one or more volatile and/or non-volatile memories. In other words, for example, the memory 204 may be an electronic storage device (for example, a computer readable storage medium). The memory 204 may be configured to store information, data, content, applications, software instructions, or the like, for enabling the apparatus 200 to carry out various functions in accordance with example embodiments contemplated herein.


The communications circuitry 206 may be any means such as a device or circuitry embodied in either hardware or a combination of hardware and software that is configured to receive and/or transmit data from/to a network and/or any other device, circuitry, or module in communication with the apparatus 200. In this regard, the communications circuitry 206 may include, for example, a network interface for enabling communications with a wired or wireless communication network. For example, the communications circuitry 206 may include one or more network interface cards, antennas, buses, switches, routers, modems, and supporting hardware and/or software, or any other device suitable for enabling communications via a network. Furthermore, the communications circuitry 206 may include the processing circuitry for causing transmission of such signals to a network or for handling receipt of signals received from a network. The communications circuitry 206, in an embodiment, may enable reception of depth of coverage measurements, baseline images, and/or subsequent images and may enable transmission of three dimensional elevation models and/or alerts.


The apparatus 200 may include survey and image capture circuitry 208 configured to initiate a survey or depth of coverage measurement operation, receive the survey or depth of coverage measurements, and/or initiate capture of baseline images and/or other images. Initiation of a survey or depth of coverage measurement operation may include prompting a user or technician to begin such a survey, automatically prompting an inline inspection apparatus or tool to begin such a survey, and/or obtaining construction records from a database or the memory 204. Similarly, initiating capture of baseline and/or other images may include prompting a user or technician to begin a fly-over of a selected right-of-way and/or automatically initiating an unmanned vehicle (such as an unmanned aerial vehicle) to begin traversing (such as via flying over) a right-of-way. The survey and image capture circuitry 208 may utilize processing circuitry 202, memory 204, or any other hardware component included in the apparatus 200 to perform these operations, as described in connection with FIG. 4 below. The survey and image capture circuitry 208 may further utilize communications circuitry 206, as noted above, to gather data (such as a depth of cover measurement) from a variety of sources (for example, from a database, the memory 204, via a user interface, and/or from another source). The output of the survey and image capture circuitry 208 may be transmitted to other circuitry of the apparatus 200 (such as the baseline modeling circuitry 210).


In addition, the apparatus 200 further comprises the baseline modeling circuitry 210 that may render a three dimensional elevation model based on the baseline images, georeference the three dimensional elevation model, and add the depth of coverage measurement to the georeferenced three dimensional model to form a baseline three dimensional elevation model. The baseline modeling circuitry 210 may also georeference the depth of coverage measurement. The baseline three dimensional elevation model may be subsequently utilized to determine whether a delta depth of coverage exceeds a selected threshold. The baseline modeling circuitry 210 may utilize processing circuitry 202, memory 204, or any other hardware component included in the apparatus 200 to perform these operations, as described in connection with FIG. 4 below. The baseline modeling circuitry 210 may further utilize communications circuitry 206 to gather data (for example, the depth of coverage measurement and/or baseline images or other types of images) from a variety of sources (such as the survey and image capture circuitry 208) and, in some embodiments, output the baseline three dimensional elevation model. In such examples, the output of the baseline modeling circuitry 210 may be utilized by and/or transmitted to the depth of coverage modeling circuitry 212.


The apparatus 200 further comprises the depth of coverage modeling circuitry 212 that may initiate or prompt capture of subsequent images of the right-of-way, render an updated three dimensional elevation model, determine a delta depth of coverage based on the baseline three dimensional elevation model and the updated three dimensional elevation model, and determine whether the delta depth of coverage exceeds a selected threshold. The depth of coverage modeling circuitry 212 may utilize processing circuitry 202, memory 204, or any other hardware component included in the apparatus 200 to perform these operations, as described in connection with FIG. 4 below. The depth of coverage modeling circuitry 212 may further utilize communications circuitry 206 to transmit an alert to a user, controller, and/or computing device.
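
For illustration only, the following skeleton shows one way the three functional blocks described above might be composed in software; the class and method names mirror the circuitry but are assumptions for illustration, and the bodies are placeholders rather than working implementations.

```python
# Illustrative sketch only: composition of the survey/image capture, baseline
# modeling, and depth of coverage modeling blocks of apparatus 200.
class SurveyAndImageCapture:
    def collect(self, right_of_way_id: str) -> dict:
        # request/receive the depth of cover survey and the captured images
        return {"survey": [], "images": []}

class BaselineModeling:
    def build(self, survey: list, images: list) -> dict:
        # render the DEM, georeference it, and superimpose the depth of cover
        return {"baseline_model": None}

class DepthOfCoverageModeling:
    def update(self, baseline_model: dict, subsequent_images: list) -> dict:
        # render an updated DEM, compute the delta, and evaluate thresholds
        return {"delta": None, "alerts": []}

data = SurveyAndImageCapture().collect("ROW-017")
baseline = BaselineModeling().build(data["survey"], data["images"])
result = DepthOfCoverageModeling().update(baseline, data["images"])
```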


Although components 202-212 are described in part using functional language, it will be understood that the particular implementations necessarily include the use of particular hardware. It should also be understood that certain of these components 202-212 may include similar or common hardware. For example, the survey and image capture circuitry 208, the baseline modeling circuitry 210, and the depth of coverage modeling circuitry 212 may, in some embodiments, each at times utilize the processing circuitry 202, memory 204, or communications circuitry 206, such that duplicate hardware is not required to facilitate operation of these physical elements of the apparatus 200 (although dedicated hardware elements may be used for any of these components in some embodiments, such as those in which enhanced parallelism may be desired). Use of the term “circuitry” with respect to elements of the apparatus therefore shall be interpreted as necessarily including the particular hardware configured to perform the functions associated with the particular element being described. Of course, while the term “circuitry” should be understood broadly to include hardware, in some embodiments, the term “circuitry” may in addition refer to software instructions that configure the hardware components of the apparatus 200 to perform the various functions described herein.


Although the survey and image capture circuitry 208, the baseline modeling circuitry 210, and the depth of coverage modeling circuitry 212 may leverage processing circuitry 202, memory 204, or communications circuitry 206 as described above, it will be understood that any of these elements of apparatus 200 may include one or more dedicated processors, specially configured field programmable gate arrays (FPGAs), or application specific integrated circuits (ASICs) to perform its corresponding functions, and may accordingly leverage processing circuitry 202 executing software stored in a memory, such as memory 204, or communications circuitry 206 for enabling any functions not performed by special-purpose hardware elements. In all embodiments, however, it will be understood that the survey and image capture circuitry 208, the baseline modeling circuitry 210, and the depth of coverage modeling circuitry 212 are implemented via particular machinery designed for performing the functions described herein in connection with such elements of apparatus 200.


In some embodiments, various components of the apparatus 200 may be hosted remotely (e.g., by one or more cloud servers) and thus need not physically reside on the corresponding apparatus 200. Thus, some or all of the functionality described herein may be provided by third party circuitry. For example, a given apparatus 200 may access one or more third party circuitries via any sort of networked connection that facilitates transmission of data and electronic information between the apparatus 200 and the third party circuitries. In turn, that apparatus 200 may be in remote communication with one or more of the other components described above as comprising the apparatus 200.


As will be appreciated based on this disclosure, example embodiments contemplated herein may be implemented by an apparatus 200 (or by a controller 302). Furthermore, some example embodiments (such as the embodiments described for FIGS. 1 and 3) may take the form of a computer program product comprising software instructions stored on at least one non-transitory computer-readable storage medium (such as memory 204). Any suitable non-transitory computer-readable storage medium may be utilized in such embodiments, some examples of which are non-transitory hard disks, CD-ROMs, flash memory, optical storage devices, and magnetic storage devices. It should be appreciated, with respect to certain devices embodied by apparatus 200 as described in FIG. 2, that loading the software instructions onto a computing device or apparatus produces a special-purpose machine comprising the means for implementing various functions described herein.



FIG. 3 is a simplified diagram that illustrates a control system for controlling determination of a depth of coverage for a right-of-way, according to an embodiment of the disclosure. As noted, system 300 or control system may include a controller 302. Further, controller 302 may connect to an image sensor 314, a vehicle, a database 316, and/or other electronic devices positioned at various locations. The controller 302 may include memory 306 and one or more processors 304. The memory 306 may store instructions executable by the one or more processors 304. In an example, the memory 306 may be a non-transitory machine-readable storage medium. As used herein, a "machine-readable storage medium" may be any electronic, magnetic, optical, or other physical storage apparatus to contain or store information such as executable instructions, data, and the like. For example, any machine-readable storage medium described herein may be any of random access memory (RAM), volatile memory, non-volatile memory, flash memory, a storage drive (e.g., hard drive), a solid-state drive, any type of storage disc, and the like, or a combination thereof. As noted, the memory 306 may store or include instructions executable by the processor 304. As used herein, a "processor" may include, for example, one processor or multiple processors included in a single device or distributed across multiple computing devices. The processor 304 may be at least one of a central processing unit (CPU), a semiconductor-based microprocessor, a graphics processing unit (GPU), a field-programmable gate array (FPGA) to retrieve and execute instructions, a real-time processor (RTP), other electronic circuitry suitable for the retrieval and execution of instructions stored on a machine-readable storage medium, or a combination thereof.


As used herein, "signal communication" refers to electric communication such as hardwiring two components together or wireless communication, as understood by those skilled in the art. For example, wireless communication may be Wi-Fi®, Bluetooth®, ZigBee, or other forms of near-field communication. In addition, signal communication may include one or more intermediate controllers or relays disposed between elements in signal communication.


As noted, the memory 306 may store instructions, such as survey and image capture instructions 308. The survey and image capture instructions 308 may prompt or initiate capture of depth of cover survey measurements. Such a prompt may be transmitted to the user interface 318. In another embodiment, the prompt may be transmitted to another computing device. In another embodiment, the survey and image capture instructions 308, when executed, may cause the controller 302 to initiate a survey, for example, via an inline inspection tool or assembly and/or via an unmanned vehicle. The survey and image capture instructions 308 may also prompt or initiate capture of baseline images. Such a prompt may also be transmitted to the user interface 318. In another embodiment, the prompt may be transmitted to another computing device and/or to a controlled unmanned vehicle (with the image sensor 314 disposed thereon to capture one or more images along the right-of-way).
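
By way of a non-limiting illustration, the Python sketch below shows one way such a prompt could be routed either to a user interface or to an unmanned vehicle. The names CaptureRequest, build_capture_request, and the identifier "ROW-042" are hypothetical and are not part of this disclosure.

```python
from dataclasses import dataclass
from datetime import datetime


@dataclass
class CaptureRequest:
    """A request to collect depth-of-cover measurements and baseline images."""
    right_of_way_id: str
    requested_at: datetime
    target: str          # e.g., "user_interface" or "unmanned_vehicle"
    note: str


def build_capture_request(right_of_way_id: str, unmanned_vehicle_available: bool) -> CaptureRequest:
    """Decide where the capture prompt should be routed.

    If an unmanned vehicle (carrying an image sensor) is available, the request
    is addressed to it; otherwise the prompt is routed to the user interface so
    a technician can collect the survey and images manually.
    """
    target = "unmanned_vehicle" if unmanned_vehicle_available else "user_interface"
    return CaptureRequest(
        right_of_way_id=right_of_way_id,
        requested_at=datetime.utcnow(),
        target=target,
        note="Capture depth-of-cover survey and baseline images along the right-of-way.",
    )


if __name__ == "__main__":
    print(build_capture_request("ROW-042", unmanned_vehicle_available=True))
```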


The memory 306 may store instructions, such as baseline modeling instructions 310. The baseline modeling instructions 310 may utilize the data gathered based on execution of the survey and image capture instructions 308. The baseline modeling instructions 310 may, after receiving such data, render a three dimensional elevation model based on the baseline images. The baseline modeling instructions 310 may utilize a photogrammetry algorithm and/or other models to generate the three dimensional elevation model. For example, the baseline modeling instructions 310 may additionally use optical methods, projective geometry, and/or other computational models. The resulting three dimensional elevation model may comprise an accurate representation of actual conditions at the right-of-way. In another embodiment, the baseline modeling instructions 310 may georeference the three dimensional elevation model using, for example, known locations and/or known coordinates. The baseline modeling instructions 310 may also georeference the depth of coverage measurements. Finally, the baseline modeling instructions 310 may add the depth of coverage measurements to the three dimensional elevation model.
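
As a minimal sketch of how these baseline modeling steps could be chained together, the Python stubs below render a placeholder elevation grid (standing in for an actual photogrammetry pipeline), attach a simple grid-to-world mapping, and record georeferenced depth of coverage measurements. The helper names render_elevation_model, georeference, and add_depth_of_cover, and all coordinate values, are assumptions for illustration only.

```python
import numpy as np


def render_elevation_model(baseline_images):
    """Placeholder for a photogrammetry step (e.g., structure-from-motion).

    A production implementation would triangulate a dense surface from the
    overlapping baseline images; here a synthetic elevation grid is returned so
    the downstream steps can be exercised end to end.
    """
    rng = np.random.default_rng(0)
    return 250.0 + rng.normal(scale=0.5, size=(50, 50))   # metres above datum


def georeference(elevation_grid, origin_easting, origin_northing, cell_size_m):
    """Attach a simple grid-to-world mapping to the elevation model."""
    return {"elevation": elevation_grid,
            "origin": (origin_easting, origin_northing),
            "cell_size_m": cell_size_m}


def add_depth_of_cover(georeferenced_model, measurements):
    """Store georeferenced depth-of-cover points alongside the elevation grid.

    `measurements` is a list of (easting, northing, depth_of_cover_m) tuples.
    """
    georeferenced_model["depth_of_cover"] = list(measurements)
    return georeferenced_model


baseline = add_depth_of_cover(
    georeference(render_elevation_model(baseline_images=[]), 512000.0, 4_545_000.0, 0.5),
    [(512010.0, 4_545_012.5, 1.2)],
)
```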


The memory 306 may store instructions, such as depth of coverage modeling instructions 312. The depth of coverage modeling instructions 312 may utilize the baseline three dimensional elevation model, as well as subsequently captured images. The depth of coverage modeling instructions 312 may render an updated three dimensional elevation model. The depth of coverage modeling instructions 312 may then determine a delta depth of coverage based on the updated three dimensional elevation model and the baseline three dimensional model.



FIG. 4 is a simplified flow diagram for determining a depth of coverage for a right-of-way, according to an embodiment of the disclosure. Unless otherwise specified, the actions of method 400 may be completed within controller 302. Specifically, method 400 may be included in one or more programs, protocols, or instructions loaded into the memory of controller 302 and executed on the processor or one or more processors of the controller 302. In other embodiments, method 400 may be implemented in or included in components of FIGS. 1-3. The order in which the operations are described is not intended to be construed as a limitation, and any number of the described blocks may be combined in any order and/or in parallel to implement the methods.


At block 402, controller 302 may determine whether a depth of cover or a depth of coverage measurement has been received. In other embodiments, the controller 302 may initiate capture or actual measurement of the depth of cover for a right-of-way, using one or more automated components and/or apparatus. If no depth of cover or coverage measurement has been received, then the controller 302 may wait until the depth of cover or coverage measurement has been received.


At block 404, if the depth of cover or coverage has been received by the controller 302, then the controller may capture baseline images of a right-of-way. In an embodiment, the controller 302 may prompt an automated vehicle to capture such baseline images. In another embodiment, the controller 302 may prompt a user or technician to obtain or capture the baseline images. In an embodiment, capture of the baseline images may occur within a selected time from the reception or capture of the depth of cover or coverage measurement. Such a selected time may ensure that the resulting three dimensional elevation model accurately captures the current condition of the right-of-way.


At block 406, the controller 302 may render a three dimensional elevation model of the right-of-way based on the baseline images. The controller 302 may utilize photogrammetry techniques. In another embodiment, the controller 302 may utilize other algorithms, such as light detection and ranging (LiDAR) algorithms, photoclinometry (for example, using shadows cast by nearby objects to generate a three dimensional elevation model), an interpolation algorithm (for example, using known points to estimate elevation), and/or other algorithms. The use of such algorithms may ensure that an accurate three dimensional representation of the actual landscape is generated based on the baseline images.
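
As one example of an interpolation algorithm that could be used to estimate elevation from known points, the sketch below implements inverse-distance weighting with NumPy. This is only one of many possible approaches (photogrammetry, LiDAR, or photoclinometry would differ substantially), and the surveyed coordinates and elevations shown are hypothetical.

```python
import numpy as np


def idw_elevation(known_xy, known_z, query_xy, power=2.0, eps=1e-9):
    """Inverse-distance-weighted elevation estimate at query points.

    known_xy : (N, 2) array of surveyed easting/northing coordinates
    known_z  : (N,)  array of surveyed elevations
    query_xy : (M, 2) array of grid-cell centres to estimate
    """
    known_xy = np.asarray(known_xy, dtype=float)
    known_z = np.asarray(known_z, dtype=float)
    query_xy = np.asarray(query_xy, dtype=float)

    # Pairwise distances between query points and known points: shape (M, N).
    d = np.linalg.norm(query_xy[:, None, :] - known_xy[None, :, :], axis=2)
    w = 1.0 / np.maximum(d, eps) ** power        # closer points weigh more
    return (w @ known_z) / w.sum(axis=1)


# Hypothetical surveyed points (easting, northing) and elevations in metres.
pts = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0), (10.0, 10.0)]
elev = [251.0, 250.4, 250.9, 250.1]
print(idw_elevation(pts, elev, [(5.0, 5.0), (1.0, 9.0)]))
```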


At block 408, the controller 302 may georeference the three dimensional elevation model. In such embodiments, the controller 302 may utilize various mathematical formulas and/or computational algorithms to georeference the three dimensional elevation model.
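
One computational approach consistent with this step is to fit a two-dimensional affine transform that maps model coordinates to known world coordinates via least squares, as sketched below. The control-point coordinates are illustrative assumptions, not values from the disclosure.

```python
import numpy as np


def fit_affine(model_xy, world_xy):
    """Least-squares 2-D affine transform mapping model coordinates to world coordinates.

    Solves world ≈ [x, y, 1] @ A for each axis; requires at least three
    non-collinear control points with known world coordinates.
    """
    model_xy = np.asarray(model_xy, dtype=float)
    world_xy = np.asarray(world_xy, dtype=float)
    design = np.hstack([model_xy, np.ones((model_xy.shape[0], 1))])   # (N, 3)
    coeffs, *_ = np.linalg.lstsq(design, world_xy, rcond=None)        # (3, 2)
    return coeffs


def apply_affine(coeffs, model_xy):
    model_xy = np.asarray(model_xy, dtype=float)
    return np.hstack([model_xy, np.ones((model_xy.shape[0], 1))]) @ coeffs


# Hypothetical control points: model-space (x, y) and their known world coordinates.
model_pts = [(0, 0), (100, 0), (0, 100), (100, 100)]
world_pts = [(512000.0, 4545000.0), (512050.0, 4545001.0),
             (511999.0, 4545050.0), (512049.0, 4545051.0)]
A = fit_affine(model_pts, world_pts)
print(apply_affine(A, [(50, 50)]))    # world coordinates of the model's centre
```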


At block 410, the controller 302 may add the depth of cover or coverage measurement (for example, depth of soil coverage) to the georeferenced three dimensional elevation model. At block 412, the controller 302 may capture subsequent images of the right-of-way based on a second selected time. The controller 302, in such examples, may initiate an unmanned vehicle to capture the subsequent images. In another example, the controller 302 may prompt a user to capture the subsequent images. In an embodiment, the first georeferenced three dimensional elevation model of a selected area may be considered a baseline depth of soil coverage model.
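
As a rough illustration of adding the depth of cover measurement to a georeferenced elevation grid, the sketch below converts each measurement's coordinates to a grid cell and records the implied pipeline-top elevation (surface elevation minus measured depth of cover), which later comparisons can reuse. The helper name attach_depth_of_cover and all numeric values are assumptions.

```python
import numpy as np


def attach_depth_of_cover(surface_grid, origin, cell_size_m, measurements):
    """Record pipeline-top elevations implied by depth-of-cover measurements.

    surface_grid : 2-D array of surface elevations (metres above datum)
    origin       : (easting, northing) of grid cell [0, 0]
    cell_size_m  : ground distance represented by one cell
    measurements : iterable of (easting, northing, depth_of_cover_m)

    Because the buried feature itself does not move, surface elevation minus the
    measured depth of cover gives an elevation for the top of the feature that
    can be compared against later surface models.
    """
    records = []
    for easting, northing, depth in measurements:
        col = int(round((easting - origin[0]) / cell_size_m))
        row = int(round((northing - origin[1]) / cell_size_m))
        surface = float(surface_grid[row, col])
        records.append({"easting": easting, "northing": northing,
                        "surface_elev_m": surface,
                        "pipe_top_elev_m": surface - depth})
    return records


grid = np.full((20, 20), 250.0)
print(attach_depth_of_cover(grid, (512000.0, 4545000.0), 1.0,
                            [(512005.0, 4545010.0, 1.2)]))
```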


At block 414, the controller 302 may render an updated three dimensional elevation model of the right-of-way. At block 416, the controller 302 may determine a delta depth of coverage (for example, soil coverage) using the updated three dimensional elevation model and the baseline three dimensional elevation model. In an embodiment, the controller 302 may subtract the values included in the updated three dimensional elevation model from the corresponding values in the baseline. In another embodiment, the controller 302 may compare each three dimensional elevation model to determine the delta depth of coverage.
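
A minimal sketch of the delta computation, assuming both models have been rendered on the same georeferenced grid, is shown below: the delta is the element-wise difference between the updated and baseline surfaces, with negative values indicating lost soil coverage. The grid values are illustrative.

```python
import numpy as np


def delta_depth_of_cover(baseline_surface, updated_surface):
    """Element-wise change in surface elevation between two aligned grids.

    Both grids must share the same georeferencing (origin, cell size, and
    orientation). A negative value indicates the surface has lowered, i.e.
    soil coverage over the buried feature has decreased by that amount.
    """
    baseline_surface = np.asarray(baseline_surface, dtype=float)
    updated_surface = np.asarray(updated_surface, dtype=float)
    if baseline_surface.shape != updated_surface.shape:
        raise ValueError("models must be rendered on the same grid before comparison")
    return updated_surface - baseline_surface


baseline = np.full((4, 4), 250.0)
updated = baseline.copy()
updated[1, 2] -= 0.35            # 35 cm of soil lost over one cell
print(delta_depth_of_cover(baseline, updated))
```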


At block 418, the controller 302 may determine whether the delta depth of cover or coverage exceeds a threshold. If the delta depth of cover or coverage does not exceed the threshold, the controller 302 may capture additional images and proceed to determine a new delta depth of coverage. At block 420, if the delta depth of cover or coverage does exceed the threshold, then the controller 302 may generate an alert. In an embodiment, the threshold may include a variable length or depth or other measurement. Such a variable length may be based on a location of the pipeline or underground feature, an environment surrounding the surface above the pipeline or underground feature, and/or the type of underground feature (for example, a pipeline, utility line, sewage or septic line or tank, tunnel, and/or other various underground features).
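
As a hedged example of the threshold check at blocks 418 and 420, the sketch below applies a per-feature-type threshold and emits an alert record when soil loss exceeds it. The threshold values, dictionary keys, and function name check_delta are illustrative assumptions only.

```python
FEATURE_THRESHOLDS_M = {           # assumed, per-feature-type soil-loss thresholds
    "pipeline": 0.30,
    "utility_line": 0.20,
    "septic_line": 0.25,
}


def check_delta(delta_m, feature_type, location):
    """Return an alert dict if soil loss exceeds the threshold for this feature.

    `delta_m` follows the convention above: negative means soil was lost.
    """
    threshold = FEATURE_THRESHOLDS_M.get(feature_type, 0.30)
    soil_lost = max(0.0, -delta_m)
    if soil_lost <= threshold:
        return None
    return {
        "location": location,
        "feature_type": feature_type,
        "soil_lost_m": round(soil_lost, 3),
        "threshold_m": threshold,
        "message": "Delta depth of cover exceeds threshold; inspection recommended.",
    }


print(check_delta(-0.35, "pipeline", (512005.0, 4545010.0)))
print(check_delta(-0.10, "pipeline", (512006.0, 4545011.0)))   # below threshold -> None
```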



FIG. 5A is an example of a three dimensional elevation model 502, according to an embodiment of the disclosure. In such an embodiment, the three dimensional elevation model may include geographical features within a preselected distance of the right-of-way 504. In other embodiments, actual values may be overlaid onto certain elevations and/or features. In particular, for elevations and/or features near the right-of-way, the overlaid values may indicate a current depth of coverage. Further, the overlaid values may be colored, with each color indicating a severity level (for example, green indicating good condition, yellow indicating potential issues, and red indicating actual issues occurring). Other data may be included in the three dimensional elevation model, such as coordinates and/or identification of varying issues (such as highlighting areas with a high risk of potential exposure due to erosion, degradation, weather events, landslides, flooding, sinkholes, and/or other events).
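
One simple way such a severity coloring could be computed is sketched below: the current depth of coverage is compared against a required depth, with an assumed 25% margin separating the yellow and red bands. The margin and the required-cover value are illustrative, not taken from the disclosure.

```python
def severity_color(current_cover_m, required_cover_m):
    """Map a current depth of cover to a display colour for the model overlay.

    Green: at or above the required cover; yellow: within an assumed 25% margin
    of the requirement; red: below that margin.
    """
    if current_cover_m >= required_cover_m:
        return "green"
    if current_cover_m >= 0.75 * required_cover_m:
        return "yellow"
    return "red"


for cover in (1.3, 1.0, 0.6):
    print(cover, severity_color(cover, required_cover_m=1.2))
```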



FIG. 5B and FIG. 5C are other examples of a three dimensional elevation model 502, according to an embodiment of the disclosure. In an embodiment, the three dimensional elevation model 502 may include ground control points 508. The ground control points 508 may be determined based on metadata included in the captured images. Further, the ground control points 508 may include known points corresponding to visible objects (for example, an aerial marker, waterway crossing, edge of road crossing, an encumbrance, and/or above grade features or pipeline features). The ground control points 508 may provide a highly accurate benchmark measurement to improve the overall accuracy of surrounding elevation data. In another embodiment, a user may select a ground control point to view the surface elevation for that particular ground control point via a separate user interface.
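
As an illustrative sketch of how ground control points could improve model accuracy, the code below estimates the mean vertical offset between the rendered model and surveyed ground control point elevations and applies that offset as a correction to the whole grid. The function names and elevation values are hypothetical.

```python
import numpy as np


def vertical_bias_from_gcps(model_elev_at_gcps, surveyed_gcp_elev):
    """Mean vertical offset between the rendered model and surveyed ground control points."""
    model = np.asarray(model_elev_at_gcps, dtype=float)
    surveyed = np.asarray(surveyed_gcp_elev, dtype=float)
    return float(np.mean(surveyed - model))


def apply_vertical_correction(elevation_grid, bias_m):
    """Shift the elevation grid by the bias estimated at the control points."""
    return np.asarray(elevation_grid, dtype=float) + bias_m


# Hypothetical GCPs: the model reports these elevations; the survey says they should be higher.
bias = vertical_bias_from_gcps([249.8, 250.1, 249.9], [250.0, 250.3, 250.1])
corrected = apply_vertical_correction(np.full((3, 3), 250.0), bias)
print(bias, corrected[0, 0])
```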




Although specific terms are employed herein, the terms are used in a descriptive sense only and not for purposes of limitation. Embodiments of systems and methods have been described in considerable detail with specific reference to the illustrated embodiments. However, it will be apparent that various modifications and changes can be made within the spirit and scope of the embodiments of systems and methods as described in the foregoing specification, and such modifications and changes are to be considered equivalents and part of this disclosure.

Claims
  • 1. A method to determine depth of soil coverage for a pipeline along a pipeline right-of-way, the method comprising: receiving a right-of-way walking survey for the pipeline right-of-way, the right-of-way walking survey includes a depth of soil coverage over the pipeline for the pipeline right-of-way; capturing baseline images of the pipeline right-of-way within a first selected time of the right-of-way walking survey; rendering a three dimensional elevation model of the pipeline right-of-way from the baseline images; georeferencing the three dimensional elevation model to generate a georeferenced three dimensional elevation model; superimposing the depth of soil coverage on the georeferenced three dimensional elevation model; capturing subsequent images of the pipeline right-of-way after a second selected time; rendering an updated three dimensional elevation model of the pipeline right-of-way from the subsequent images; and determining a delta depth of soil coverage of the pipeline based on the georeferenced three dimensional elevation model and the updated three dimensional elevation model.
  • 2. The method of claim 1, wherein the right-of-way walking survey comprises a survey grade quality and includes one or more of latitude, longitude, elevation, depth of soil coverage from a top of the pipeline to a surface of the pipeline right-of-way, XY coordinates, Z coordinates, or a measurement process.
  • 3. The method of claim 1, wherein the delta depth of soil coverage comprises an actual depth of soil coverage within about 3 centimeters.
  • 4. The method of claim 1, further comprising: determining whether the delta depth of soil coverage exceeds a selected threshold; and in response to a determination that the delta depth of soil coverage exceeds the selected threshold, generating an alert.
  • 5. The method of claim 4, wherein the selected threshold comprises a variable length based on a location of the pipeline and an environment surrounding the pipeline.
  • 6. The method of claim 4, wherein the alert includes one or more of the delta depth of soil coverage, a previous depth of soil coverage, the selected threshold, a location of the pipeline right-of-way where the delta depth of soil coverage exceeds the selected threshold, a severity level, or a next action, and wherein the next action includes one or more of adjustment of a current depth of soil coverage, adjustment of pipeline depth, or installation of pipeline protection.
  • 7. The method of claim 1, further comprising: updating the georeferenced three dimensional elevation model with the updated three dimensional model.
  • 8. The method of claim 7, wherein an updated georeferenced three dimensional elevation model includes one or more of symbols or colors indicating a change in the depth of soil coverage and a severity of the change.
  • 9. The method of claim 1, wherein the second selected time comprises a time based on a location of the pipeline and an environment surrounding the pipeline.
  • 10. The method of claim 9, wherein capture of second subsequent images occurs based on the second selected time and the delta depth of soil coverage.
  • 11. The method of claim 1, further comprising: verifying the delta depth of soil coverage at a portion of the pipeline right-of-way via a second right-of-way walking survey.
  • 12. The method of claim 1, wherein the baseline images and the subsequent images comprise high resolution aerial images captured by an image sensor on one of an unmanned aerial vehicle or a manned aerial vehicle.
  • 13. A method to determine depth of soil coverage for an underground feature along a right-of-way, the method comprising: capturing images of the right-of-way after a selected time; rendering an updated three dimensional elevation model of the right-of-way from the images; determining a delta depth of soil coverage of the underground feature based on a georeferenced three dimensional elevation model and the updated three dimensional elevation model; and superimposing a depth of soil coverage and the delta depth of soil coverage on the updated three dimensional elevation model.
  • 14. The method of claim 13, further comprising: receiving a depth of cover measurement from ground level for the right-of-way; capturing baseline images of the right-of-way within a prior selected time of reception of the depth of cover measurement; rendering a three dimensional elevation model of the right-of-way from the baseline images; georeferencing the three dimensional elevation model to generate the georeferenced three dimensional elevation model; and superimposing the soil coverage to the georeferenced three dimensional elevation model.
  • 15. The method of claim 14, wherein the depth of cover measurement is received from one or more of a walking survey, original construction records, or via inline inspection.
  • 16. The method of claim 13, wherein the underground feature comprises a pipeline, a utility line, or a septic system.
  • 17. The method of claim 16, wherein, if the underground feature comprises a pipeline, the pipeline facilitates transport of hydrocarbon fluids.
  • 18. The method of claim 13, further comprising: in response to the delta depth of soil coverage being outside of a threshold range, generating an alert to include an indication of an area with a depth of soil coverage below a selected limit.
  • 19. The method of claim 18, wherein the selected limit is based on a type of the underground feature.
  • 20. The method of claim 18, wherein the alert is superimposed on the updated three dimensional model.
  • 21. The method of claim 18, wherein the alert includes a remedial action, and wherein the remedial action includes one or more of adding surface coverage over the underground feature or lowering the underground feature further below ground.
  • 22. A computing device for determining depth of soil coverage for a pipeline along a pipeline right-of-way, the computing device comprising a processor and a non-transitory computer-readable storage medium storing software instructions that, when executed by the processor: in response to reception of (a) a right-of-way walking survey for the pipeline right-of-way including a depth of soil coverage over the pipeline for the pipeline right-of-way and (b) captured baseline images of the pipeline right-of-way within a first selected time of the right-of-way walking survey, render a three dimensional elevation model of the pipeline right-of-way from the captured baseline images; georeference the three dimensional elevation model to generate a georeferenced three dimensional elevation model; add the soil coverage to the georeferenced three dimensional elevation model; capture subsequent images of the pipeline right-of-way after a second selected time; render an updated three dimensional elevation model of the pipeline right-of-way from the subsequent images; and determine a delta depth of soil coverage of the pipeline based on the georeferenced three dimensional elevation model and the updated three dimensional elevation model.
  • 23. The computing device of claim 22, wherein the non-transitory computer-readable storage medium further stores software instructions that, when executed by the processor: integrate the delta depth of soil coverage of the pipeline into the updated three dimensional elevation model.
  • 24. The computing device of claim 23, wherein the non-transitory computer-readable storage medium further stores software instructions that, when executed by the processor: determine an indicator for each section of the updated three dimensional elevation model based on the delta depth of soil coverage and a selected threshold for each section.
  • 25. The computing device of claim 24, wherein the selected threshold comprises a value based on one or more of a location of the pipeline right-of-way or an environment of the pipeline right-of-way.
CROSS-REFERENCE TO RELATED APPLICATIONS

The present application claims priority to and the benefit of U.S. Provisional Application No. 63/540,822, filed Sep. 27, 2023, titled “SYSTEMS AND METHODS TO DETERMINE DEPTH OF SOIL COVERAGE ALONG A RIGHT-OF-WAY,” U.S. Provisional Application No. 63/540,692, filed Sep. 27, 2023, titled “SYSTEMS AND METHODS TO DETERMINE VEGETATION ENCROACHMENT ALONG A RIGHT-OF-WAY,” and U.S. Provisional Application No. 63/539,039, filed Sep. 18, 2023, titled “SYSTEMS AND METHODS TO DETERMINE DEPTH OF SOIL COVERAGE ALONG A RIGHT-OF-WAY,” the disclosures of which are incorporated herein by reference in their entirety.

Provisional Applications (3)
Number Date Country
63540822 Sep 2023 US
63540692 Sep 2023 US
63539039 Sep 2023 US