Many large objects require routine inspection. Examples include ships, aircraft, bridges, and large storage structures such as tanks, as well as buildings and roadways. If these objects are outdoors (i.e., exposed to GPS signals) and an accuracy of about 2-5 cm for the locations of the inspection points is acceptable, a precision GPS receiver attached to the inspection sensor can be used to location-tag the inspection data. However, if more accurate localization of the inspection is required, or if the object being inspected is indoors or surrounded by structures that obscure line of sight to a sufficient number of GPS satellites (at least five), an alternative localization method is needed.
For less precise measurement, GPS substitutes such as pseudolites can be used; however, the achievable accuracy is only comparable to that of GPS, and these devices are costly and difficult to deploy. Such a system is described in U.S. Pat. No. 6,882,315 [1].
[1] Richley et al., Object Location System and Method, U.S. Pat. No. 6,882,315, Apr. 19, 2005.
Optical measurement approaches have been employed at least since the advent of the telescope [2] and its use for surveying [3]. Gelbart et al., in U.S. Pat. No. 5,305,091 [4], describe an optical coordinate measurement approach that consists of multiple optical transceivers (transmitter-receivers) mounted on a stable reference frame such as the walls of a room. The object to be measured is touched with a hand-held measuring probe. To take a measurement, the probe triggers the transceivers to read the distances to two retroreflectors mounted on the probe. The location of the probe tip relative to the reference frame is then computed from at least six transceiver readings (three for each retroreflector).
[2] The telescope was invented and patented by Dutch eyeglass maker Hans Lippershey in 1608; Galileo built his own in 1609.
[3] Joshua Habermel made the first theodolite with a compass in 1576; Jonathan Sisson incorporated the telescope into it in 1725. As a practice, surveying in some form dates back to at least the Egyptians in 1400 B.C.
[4] Gelbart, et al., Optical Coordinate Measuring System for Large Objects, U.S. Pat. No. 5,305,091, Apr. 19, 1994.
More recently, Borghese et al. disclose Autoscan, a two-camera 3D imaging system for capturing large-area objects [5]. Borghese's approach essentially employs stereo computer vision like that described by Ohta et al. [6] and Baker et al. [7]. Neitzel et al. disclose a system that uses a UAV to move a camera around a large object to capture a 3D mapping of the object [8]. Neitzel's system employs 3D reconstruction from multiple views of an object, a technology that dates back to Hildreth [9] and, later, Matthies, Kanade, et al. [10].
[5] Borghese, Nunzio Alberto, et al. "Autoscan: A flexible and portable 3D scanner." IEEE Computer Graphics and Applications 18.3 (1998): 38-41.
[6] Ohta, Yuichi, and Takeo Kanade. "Stereo by intra- and inter-scanline search using dynamic programming." IEEE Transactions on Pattern Analysis and Machine Intelligence 7.2 (1985): 139-154.
[7] Bolles, Robert C., H. Harlyn Baker, and David H. Marimont. "Epipolar-plane image analysis: An approach to determining structure from motion." International Journal of Computer Vision 1.1 (1987): 7-55 (citing work as early as 1982).
[8] Neitzel, Frank, and J. Klonowski. "Mobile 3D mapping with a low-cost UAV system." Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 38 (2011): 1-6.
[9] Hildreth, Ellen C. "Computations underlying the measurement of visual motion." Artificial Intelligence 23.3 (1984): 309-354.
[10] Matthies, Larry, Takeo Kanade, and Richard Szeliski. "Kalman filter-based algorithms for estimating depth from image sequences." International Journal of Computer Vision 3.3 (1989): 209-238.
Guidi et al. disclose the application of 3D mapping to large-area cultural (archeological) sites. Their approach employs 3D time-of-flight laser radar units of the kind often used for aerial surveys. This technology was invented in the early 1960s, is disclosed in U.S. Pat. No. 4,935,616 [11], was pioneered at the Environmental Research Institute of Michigan (formerly the University of Michigan Willow Run Laboratories) in the 1980s as described by McManamon et al. [12], and has been used in mapping as described by Wesolowicz et al. [13]. Localization on aircraft is described by Hadley et al. in U.S. Pat. No. 7,873,494 [14]. Their method does not directly measure the location of arbitrary points on the aircraft, but rather identifies where a point is relative to other known locations on the aircraft (features readily identifiable in an image of the aircraft and designated as reference points with known locations in the three-dimensional coordinate system of the aircraft). This approach assumes a geometric or CAD representation of the aircraft that defines its coordinate system, with the reference points identified in that CAD database.
[11] Scott, et al., Range Imaging Laser Radar, U.S. Pat. No. 4,935,616, Jun. 19, 1990.
[12] McManamon, Paul F., Gary Kamerman, and Milton Huffaker. "A history of laser radar in the United States." Laser Radar Technology and Applications XV, Vol. 7684, International Society for Optics and Photonics, 2010.
[13] Wesolowicz, Karl G., and Robert E. Sampson. "Laser Radar Range Imaging Sensor for Commercial Applications." SPIE Vol. 783, 1987.
[14] Hadley, et al., Method and Apparatus for an Aircraft Location Position System, U.S. Pat. No. 7,873,494, Jan. 18, 2011.
This invention enables continuous, multiple-point surveying and measurements of large areas and objects. The results may be coordinated or combined with 3D localization systems or methods employing GPS, manual theodolites, range finders, laser radars or pseudolites. The invention is ideally suited to the routine and repeated inspection of aircraft and other objects with large surfaces including ships, bridges, tanks, buildings, and roadways.
In accordance with a method of inspecting such surfaces, a marker is placed on a surface providing a unique computer-readable code. A camera gathers an image of the surface containing the marker. A programmed computer processes the image to develop a coordinate system defining the surface, with the location of the marker being defined as a point with particular coordinates with respect to the surface. This facilitates tracking or determining characteristics of the surface relative to the location of the marker.
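As a minimal illustrative sketch of this step (not the disclosed implementation), an augmented-reality tag library such as OpenCV's ArUco module can decode a uniquely coded marker in a camera image and estimate its pose; the marker's own frame can then serve as the origin of a coordinate system for the surface. The camera intrinsics, marker size, tag dictionary, and image file name below are assumptions made only for this example (OpenCV 4.7+ ArUco API).

```python
import cv2
import numpy as np

# Hypothetical camera calibration and tag size (assumptions for this sketch;
# real values come from camera calibration and the printed tag).
camera_matrix = np.array([[1200.0, 0.0, 960.0],
                          [0.0, 1200.0, 540.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)
MARKER_SIZE_M = 0.10  # assumed 10 cm square tag

# 3D corners of the tag in its own frame (z = 0 plane), ordered to match
# the ArUco corner convention (top-left, top-right, bottom-right, bottom-left).
half = MARKER_SIZE_M / 2.0
object_points = np.array([[-half,  half, 0.0],
                          [ half,  half, 0.0],
                          [ half, -half, 0.0],
                          [-half, -half, 0.0]], dtype=np.float32)

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

image = cv2.imread("inspection_view.png")        # image containing the marker
corners, ids, _ = detector.detectMarkers(image)   # decode the unique tag codes

if ids is not None:
    for marker_id, quad in zip(ids.flatten(), corners):
        # Recover the tag pose in the camera frame from its four image corners.
        ok, rvec, tvec = cv2.solvePnP(object_points, quad.reshape(4, 2),
                                      camera_matrix, dist_coeffs,
                                      flags=cv2.SOLVEPNP_IPPE_SQUARE)
        if ok:
            # The tag frame can serve as the surface coordinate system;
            # subsequent surface measurements are expressed relative to it.
            print(f"tag {marker_id}: position in camera frame (m) = {tvec.ravel()}")
```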
Multiple markers may be positioned at different locations on the surface, each marker having a different unique computer-readable code, in which case the coordinate system may define a full six-degree-of-freedom coordinate space. The computer-readable code may be a barcode or other passive code. Alternatively, the computer-readable code may be an encoded, light-emitting code or other active code. The step of tracking or determining characteristics of the surface relative to the location of the marker may include mapping the surface to create a computer-aided design (CAD) representation.
The method may further include the steps of coupling the marker to a sensor operative to collect sensor data at or in the vicinity of the marker, and merging the coordinates of the marker with the sensor data. For example, the sensor data may be imaging data, and the step of tracking or determining characteristics of the surface relative to the location of the marker may include generating a multi-stage or multi-dimensional map of the surface.
The sensor data may be derived from a non-destructive inspection sensor, and the step of tracking or determining characteristics of the surface relative to the location of the marker may include the step of monitoring flaws or defects in the surface. The flaws or defects in the surface may be monitored over time.
The method may include the step of mounting the marker on a fixture, thereby enabling the computer-readable code to be imaged from multiple directions, lines of sight, or preferred viewing angles. The method may also further include the step of patching leapfrogged inspection areas to enable a contiguous inspection map.
A system for inspecting a surface in accordance with the invention may include a marker supported on the surface providing a unique computer-readable code; a camera operative to gather an image of the surface containing the marker; and a programmed computer operative to receive the image from the camera and develop a coordinate system defining the surface, with the location of the marker being defined as a point with particular coordinates on the surface. A human interface coupled to the programmed computer enables a user to track or determine characteristics of the surface relative to the location of the marker.
This invention provides a system and related methods for performing continuous, multiple-point surveying or measurement of large areas or objects. The measurement results may be coordinated or combined with other 3D localization systems employing GPS, manual theodolites, range finders, laser radars, pseudolites, and so forth. Disclosed examples deploy small passive unique targets that are attached to inspection sensors, and the targets are tracked accurately by one or more focal-plane camera units set back at an offset from the area to be inspected.
The invention is not limited in terms of application area, and is particularly well suited to large areas, objects, structures, and surfaces requiring routine inspection. Examples include ships, aircraft, bridges, and large storage structures such as tanks, as well as buildings and roadways. To find defects, target areas must be systematically scanned, making sure no critical area is overlooked; accurate localization of each sensor scan is necessary to ensure this (and also enables location-based depictions of the inspection data). To track defects over time, the sensor location must likewise be known accurately, so that the same defect can be revisited repeatedly and its progression followed.
Alternative uses of the disclosed location tag approach include:
As one non-limiting example, tags and the tracking of them have been employed for large-area inspection of aircraft. To take measurements relative to the aircraft coordinate system, we typically place one of the small passive targets at the center of the aircraft fuselage and then offset individual tag measurements from this central aircraft point, thus eliminating the need for aircraft geometry or CAD data (although the measurements can be registered to, or overlaid on, aircraft CAD information if it is available).
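A minimal sketch of this offsetting step, assuming each tag pose has already been estimated as a 4x4 homogeneous transform in the camera frame (the poses and names below are illustrative only, not measured values): the pose of any tag is re-expressed relative to the central fuselage reference tag by left-multiplying with the inverse of the reference pose.

```python
import numpy as np

def relative_to_reference(T_cam_ref: np.ndarray, T_cam_tag: np.ndarray) -> np.ndarray:
    """Express a tag pose relative to the central reference tag.

    Both arguments are 4x4 homogeneous transforms (tag frame -> camera frame);
    the result maps the tag frame into the reference-tag frame.
    """
    return np.linalg.inv(T_cam_ref) @ T_cam_tag

# Illustrative poses (in practice these come from marker pose estimation).
T_cam_ref = np.eye(4)
T_cam_ref[:3, 3] = [0.0, 0.0, 3.0]      # reference tag 3 m in front of the camera
T_cam_tag = np.eye(4)
T_cam_tag[:3, 3] = [1.2, -0.4, 3.1]     # another tag elsewhere on the fuselage

T_ref_tag = relative_to_reference(T_cam_ref, T_cam_tag)
print("tag offset from the fuselage reference point (m):", T_ref_tag[:3, 3])
```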
This disclosed application is driven by the need to inspect surfaces and features, including composite surfaces, to detect corrosion and delaminations that may weaken aircraft structures but are often completely invisible to external visual inspection. Because delaminations are often progressive, and because the size of the defective area is important, target areas have to be found and tracked over time as part of the aircraft preventative maintenance process. The same is true for the detection of cracks and the progression of cracks over time.
The invention is not limited in terms of the sensor technology used, and may include any NDI (nondestructive inspection/evaluation) method(s), including ultrasonics, eddy-current measurement, x-radiography, laser interferometry, holographic interferometry and electronic speckle shearography (ES). In the preferred embodiments, the inspection is carried out with the NDI sensors described in U.S. Pat. Nos. 6,043,870 [15] and 6,040,900 [16], the entire content of both being incorporated herein by reference.
[15] Chen, Compact Fiber Optic Electronic Laser Speckle Pattern Interferometer, U.S. Pat. No. 6,043,870, Mar. 28, 2000.
[16] Chen, Compact Fiber-Optic Electronic Laser Speckle Pattern Shearography, U.S. Pat. No. 6,040,900, Mar. 21, 2000.
Now making reference to the accompanying drawings, one or more unique, two-dimensional (2D) markers are placed at known locations on the area over which an inspection is to be performed (for example, an aircraft skin for aircraft inspection). Versions of this system accommodate anywhere from a single marker to many markers to define the inspection space.
As shown in
By identifying the markers and their locations, the position of the inspection becomes known. By referencing the known inspection sensor position to the coordinate system defining the inspection space (incorporating knowledge of where the handle-mounted marker is mounted on the inspection sensor), it is possible to attach to each inspection sensor record the location and orientation of the sensor reading within the inspection space or area.
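As an illustrative sketch of this referencing step (not the disclosed implementation), the sensor-tip pose in the inspection frame can be obtained by composing three transforms: the pose of the inspection-space marker in the camera frame, the pose of the handle-mounted marker in the camera frame, and a fixed, measured offset from the handle marker to the sensor tip. All pose and offset values below are placeholders.

```python
import numpy as np

# 4x4 homogeneous transforms (illustrative values; in practice these come from
# marker pose estimation and from measuring the sensor geometry once).
T_cam_space = np.eye(4)                  # inspection-space marker in camera frame
T_cam_space[:3, 3] = [0.0, 0.0, 2.5]

T_cam_handle = np.eye(4)                 # handle-mounted marker in camera frame
T_cam_handle[:3, 3] = [0.3, 0.1, 2.4]

T_handle_tip = np.eye(4)                 # fixed offset: handle marker -> sensor tip
T_handle_tip[:3, 3] = [0.0, -0.05, 0.12]

# Sensor tip expressed in the inspection-space coordinate system:
#   T_space_tip = inv(T_cam_space) @ T_cam_handle @ T_handle_tip
T_space_tip = np.linalg.inv(T_cam_space) @ T_cam_handle @ T_handle_tip

position = T_space_tip[:3, 3]            # where the reading was taken
orientation = T_space_tip[:3, :3]        # how the sensor was oriented
print("sensor reading location in the inspection space (m):", position)
```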
The markers can be active or passive. For passive markers, augmented reality barcode tag-containing markers (
Software operative to implement the system and method is depicted in
In parallel, the sensor information 614 is read and fused with the sensor location information relative to the area being inspected (for example, an aircraft fuselage). This allows a user interface 616 to be presented to the operator that displays where inspections are made relative to the inspected object, with the inspection results referenced to this three-dimensional space. The data may be archived 618 in a longitudinal database for later reference, so that detected defects can be tracked over time. As shown, the data in the database is readily exported in exchange formats (for example, as .PDF 620) for insertion into other applications for analysis, storage, and display 622.
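One way to picture such a fused, longitudinal record is sketched below; the field names, SQLite schema, and database choice are assumptions for illustration only, not part of the disclosure. Each sensor reading is stored together with its located pose so that later inspections of the same spot can be compared.

```python
import json
import sqlite3
from datetime import datetime, timezone

# Hypothetical schema for a longitudinal inspection archive.
conn = sqlite3.connect("inspections.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS inspection_record (
        recorded_at TEXT,                     -- UTC timestamp of the reading
        x REAL, y REAL, z REAL,               -- location in the inspection frame (m)
        qw REAL, qx REAL, qy REAL, qz REAL,   -- sensor orientation (quaternion)
        sensor_type TEXT,                     -- e.g. shearography, eddy current
        payload TEXT                          -- sensor data or a reference to it (JSON)
    )
""")

def archive_reading(pose_xyz, quat_wxyz, sensor_type, payload):
    """Fuse a sensor reading with its location and store it for later trending."""
    conn.execute(
        "INSERT INTO inspection_record VALUES (?,?,?,?,?,?,?,?,?,?)",
        (datetime.now(timezone.utc).isoformat(), *pose_xyz, *quat_wxyz,
         sensor_type, json.dumps(payload)))
    conn.commit()

# Example: one reading taken at a located point on the fuselage.
archive_reading((1.20, -0.40, 3.10), (1.0, 0.0, 0.0, 0.0),
                "shearography", {"defect_likelihood": 0.07})
```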
As disclosed in U.S. Pat. No. 6,801,637, the entire content of which is incorporated herein by reference, it is also possible to employ active markers that are identified either by tracking their positions from a known starting configuration (i.e., an emitter is tracked in real time from a starting position so that its expected next location is approximately known and can be used to disambiguate the emitter from any others visible in the same camera view), or by detection through a time-modulated code sequence (essentially a Morse-code-like scheme in which each active emitter generates a unique code, distinguished either by its pulse sequence or by its pulse timing). The system defined in the '637 Patent uses a code in which each emitter produces a pulse at a time unique to that emitter relative to an elongated pulse from a master emitter. Each uniquely identified active marker is then used to locate the inspection sensor relative to the inspection area in the same way as was described previously for passive markers.
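A toy sketch of the timing-based identification idea follows; it is not the '637 Patent's actual encoding, and the delay table and tolerance are assumptions. Each emitter is assigned a unique delay after the master pulse, so the observed delay of a detected pulse identifies the emitter.

```python
# Illustrative assignment: emitter ID -> delay (ms) after the master pulse.
EMITTER_DELAYS_MS = {1: 10.0, 2: 20.0, 3: 30.0, 4: 40.0}
SLOT_TOLERANCE_MS = 2.0  # assumed timing tolerance

def identify_emitter(master_pulse_t_ms: float, pulse_t_ms: float):
    """Return the emitter ID whose assigned delay matches the observed pulse time."""
    delay = pulse_t_ms - master_pulse_t_ms
    for emitter_id, assigned in EMITTER_DELAYS_MS.items():
        if abs(delay - assigned) <= SLOT_TOLERANCE_MS:
            return emitter_id
    return None  # pulse does not match any known emitter

# Example: a pulse observed 29.3 ms after the master pulse maps to emitter 3.
print(identify_emitter(100.0, 129.3))
```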
Note that passive markers that are not uniquely coded can also be tracked and disambiguated from one another by tracking their positions from a known starting configuration. Some body-tracking systems in the field have used non-unique white balls for this type of application.
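A minimal sketch of this kind of identity maintenance follows; it is a simple greedy nearest-neighbor association under assumed motion limits, not the specific tracker used by any cited system. Each detection in the new frame inherits the label of the closest position from the previous frame.

```python
import numpy as np

def associate(previous, detections, max_jump=0.05):
    """Assign labels to new detections by nearest previous position.

    previous:   dict of label -> (x, y, z) from the last frame
    detections: list of (x, y, z) points seen in the current frame
    max_jump:   assumed largest plausible motion between frames (m)

    A real tracker would enforce one-to-one matching; this greedy version
    is only meant to illustrate the idea.
    """
    labeled = {}
    for point in detections:
        p = np.asarray(point)
        label, dist = min(((lbl, np.linalg.norm(p - np.asarray(prev)))
                           for lbl, prev in previous.items()),
                          key=lambda item: item[1])
        if dist <= max_jump:
            labeled[label] = point
    return labeled

# Example: three non-unique markers move slightly between frames.
prev = {"A": (0.0, 0.0, 2.0), "B": (0.5, 0.0, 2.0), "C": (1.0, 0.0, 2.0)}
now = [(0.01, 0.0, 2.0), (0.49, 0.02, 2.0), (1.02, -0.01, 2.0)]
print(associate(prev, now))
```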
While the invention is ideally suited to the identification of inspection locations relative to an object to be routinely and repeatedly inspected, the technology can also be used to track any type of motion in a coordinate space (for instance in
The embodiment of the invention shown in
Use of additional markers enables a leapfrogging approach to extend inspection coverage beyond the initial inspection area. As long as one or more existing markers appears in the new inspection area defined by the additional markers, the system will patch the scans together as a contiguous inspection map.
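As a sketch of how such patching could work under simple assumptions (homogeneous-transform poses and a single shared marker visible from both camera placements; the variable names and values are illustrative only), the transform between the old and new inspection frames can be recovered from the shared marker's pose in each frame, after which new scans are mapped back into the original, contiguous map.

```python
import numpy as np

def frame_to_frame(T_old_shared: np.ndarray, T_new_shared: np.ndarray) -> np.ndarray:
    """Transform from the new inspection frame into the old one.

    Both arguments are 4x4 poses of the same shared marker, expressed in the
    old and new inspection frames respectively.
    """
    return T_old_shared @ np.linalg.inv(T_new_shared)

# Illustrative poses of the shared marker seen from both camera placements.
T_old_shared = np.eye(4); T_old_shared[:3, 3] = [4.0, 0.0, 0.0]
T_new_shared = np.eye(4); T_new_shared[:3, 3] = [0.5, 0.0, 0.0]

T_old_new = frame_to_frame(T_old_shared, T_new_shared)

# A scan point located in the new area is patched into the contiguous map.
scan_point_new = np.array([1.0, 0.2, 0.0, 1.0])      # homogeneous coordinates
scan_point_old = T_old_new @ scan_point_new
print("scan point in the original inspection frame:", scan_point_old[:3])
```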
This invention relates generally to inspection and measurement and, in particular, to apparatus and methods for inspecting and measuring large structures, objects and areas.