The present disclosure generally pertains to the field of non-destructive testing (NDT). More specifically, the disclosure pertains to supporting a person conducting NDT by the use of augmented reality (AR).
Non-destructive testing includes methods that do not damage the parts being tested. NDT uses various inspection techniques to assess individual components or groups of components. By employing different principles from physics, chemistry, and mathematics, NDT can test components without causing damage. One special field of NDT is directed to the testing of buildings and other structures, e.g., by determining the position and condition of reinforcement bars (rebar) inside the structure, e.g., inside a wall of a building.
According to known solutions for NDT, the data acquisition follows a grid pattern. The grid must have known dimensions so that a plurality of 2D line scans can be related to each other. This grid may be painted by hand, which is time-consuming, or a paper grid may be used, which may be difficult to place at challenging locations. Also, paper grids come in specific sizes, so that a fitting grid size is not always available for the job at hand.
Often, post-processing of the data in the office is necessary. Known solutions often do not provide a satisfying way to visualize the collected data. The visualized data may be sketched onto the tested wall or structure. However, this requires repositioning within the room relative to the collected data. Also, there is no option for efficiently visualizing depth with this approach.
It is therefore an object to provide an improved system and an improved method for non-destructive testing of a structure that overcome the problems of the prior art.
At least one of these objects is achieved by the system according to claim 1, the method according to claim 13, and/or the dependent claims.
A first aspect pertains to a system for non-destructive testing (NDT) of a structure, the system comprising a data-acquisition device (NDT device) comprising at least one NDT sensor configured to detect features inside the structure by means of NDT and to generate feature data related to the detected features. According to this aspect, the system comprises an augmented-reality (AR) device comprising at least one camera and a display unit, wherein the at least one camera is configured to capture images of a surrounding and to generate image data, the surrounding comprising at least one surface of the structure. The system further comprises a computing unit that is configured to generate AR image data based on the image data and on NDT data and to provide the AR image data to the display unit. The display unit is configured to present, based on the AR image data, AR images in real time to a user of the system. The NDT data comprises the feature data, instruction data, or both, wherein the instruction data comprises measuring instructions for the user for performing the non-destructive testing of the structure using the data-acquisition device.
According to some embodiments of the system, the AR device is configured as a wearable, e.g., as AR goggles, to be worn by the user. According to some embodiments, it is configured as a projector, e.g., as a head-up display. Alternatively, the AR device may be configured as a handheld device having a touchscreen, e.g., a smartphone or tablet computer. According to some embodiments, the AR device comprises the computing unit. According to some embodiments, the AR device comprises an inertial measuring unit and/or a gyroscope.
According to some embodiments of the system, the at least one camera is configured to capture the images as an image stream, and the display unit is configured to present the AR images in real time as a live AR image stream.
According to some embodiments of the system, the NDT data comprises at least the feature data, and at least a subset of the AR images shows the structure overlaid with representations of the detected features. In some embodiments, the data-acquisition device comprises a plurality of different NDT sensors, including at least a radar sensor, an inductive sensor and/or a capacitive sensor. In some embodiments, the representations of the detected features are presented in the AR images in a 3D view.
According to some embodiments, the at least one NDT sensor is configured to determine properties of the features, the properties comprising at least a kind of the feature or a material of the feature, the feature data comprising information about the determined properties, and the representations of the detected features are presented in the AR images together with the information about the determined properties, wherein different kinds of features or different materials are highlighted or coloured differently. For instance, the highlighting or colouring of the different kinds of features and/or the different materials may be user-selectable.
According to some embodiments, the data-acquisition device comprises a plurality of NDT sensors of different sensor types, such as inductive sensors, capacitive sensors, and/or radar sensors. The computing unit can then be configured to determine, based on feature data generated by NDT sensors of two or more different sensor types, object types of the features, such as water pipes, plastic pipes, copper pipes, live wires, and reinforcement bars. The representations of the detected features may then be presented in the AR images together with information about their object type, wherein different object types are highlighted or coloured differently (the highlighting or colouring may be user-selectable).
According to some embodiments, the computing unit is configured to generate further information based on the feature data (and optionally other information). At least a subset of the AR images then shows the structure overlaid with said further information, the further information comprising areas that are safe or unsafe to drill into and/or locations that should be inspected further.
According to some embodiments of the system, the NDT data comprises at least the instruction data, and at least a subset of the AR images shows the structure overlaid with the measuring instructions. The measuring instructions may comprise visual instructions on how to move the data-acquisition device over the structure.
In some embodiments, the measuring instructions comprise a grid that provides a pattern for moving the data-acquisition device over the structure. A size and shape of the grid may be user-selectable or may be selected automatically depending on a detected size and shape of the structure or of a scanning area on the structure.
In some embodiments, the computing unit is configured to identify the data-acquisition device in at least a subset of the images of the surrounding and to derive an actual position of the data-acquisition device relative to the structure. The measuring instructions then may be updated in real time based on the actual position of the data-acquisition device.
In some embodiments, in which the NDT data comprises the feature data and at least a subset of the AR images shows the structure overlaid with representations of the detected features, the AR images show representations of the detected features positioned relative to the structure based on the actual position of the data-acquisition device when detecting the respective features.
In some embodiments, the computing unit is configured to perform a user-guidance process, in the course of which the measuring instructions are updated in real time based on the actual position of the data-acquisition device to guide its user through the non-destructive testing of the structure.
In some embodiments, the computing unit is configured for generating documentation data of steps of the non-destructive testing of the structure, the documentation data comprising at least one of image, video and audio data. The computing unit is then further configured to automatically capture the documentation data, and/or to prompt the user to capture the documentation data, e.g., by displaying a visual indicator on the display unit.
According to some embodiments of the system, the computing unit is configured to detect a fiducial marker in at least a subset of the images of the surrounding and to extract computer-readable information from the fiducial marker. For instance, the fiducial marker can be provided on a surface of the structure, can be an ArUco marker, and/or can be an elongated element, such as a ribbon band.
In some embodiments, the fiducial marker comprises information that allows identifying the structure, and the computing unit is configured to identify the structure based on the information of the fiducial marker. Optionally, the computing unit may have access to a database providing a multitude of datasets with information about individual structures.
In some embodiments, the NDT data comprises the feature data, the computing unit has access to a database providing past NDT data of a previously performed NDT of the individual structure, and the computing unit is configured, upon identifying the structure, to generate the AR image data based on the image data and on the past NDT data.
In some embodiments, the fiducial marker comprises information that allows aligning the NDT data relative to the image data, and the computing unit is configured to generate the AR image data based on the information of the fiducial marker. Optionally, the computing unit may be configured to generate, based on the information of the fiducial marker, the AR image data by anchoring coordinates of the NDT data to a 3D location, orientation and scale of the surface of the structure.
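By way of a non-limiting illustration, the following Python sketch shows one possible way of anchoring NDT coordinates to the surface via a fiducial marker of known physical size; the function names and the use of OpenCV's solvePnP are assumptions of this example and are not prescribed by the disclosure.

```python
# Illustrative sketch only: anchoring NDT data coordinates to the 3D location,
# orientation and scale of the scanned surface using a fiducial marker of
# known physical size (e.g., a square ArUco marker).
import numpy as np
import cv2

def marker_pose(corners_px, marker_size_m, camera_matrix, dist_coeffs):
    """Estimate the marker pose from its four detected corner pixels.
    The 3D corner order below (top-left, top-right, bottom-right, bottom-left)
    must match the order of corners_px."""
    half = marker_size_m / 2.0
    obj_pts = np.array([[-half,  half, 0.0],
                        [ half,  half, 0.0],
                        [ half, -half, 0.0],
                        [-half, -half, 0.0]], dtype=np.float32)
    ok, rvec, tvec = cv2.solvePnP(obj_pts, np.asarray(corners_px, np.float32),
                                  camera_matrix, dist_coeffs)
    if not ok:
        raise RuntimeError("pose estimation failed")
    R, _ = cv2.Rodrigues(rvec)          # 3x3 rotation, marker frame -> camera frame
    return R, tvec.reshape(3)

def anchor_ndt_points(points_marker_frame, R, t):
    """Transform NDT feature coordinates (given relative to the marker, in
    metres) into the camera frame used for rendering the AR overlay."""
    pts = np.asarray(points_marker_frame, dtype=np.float32)
    return (R @ pts.T).T + t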
According to some embodiments of the system, the structure is a wall, a ceiling or a floor, and the features comprise at least one of reinforcement bars, conduits and utilities.
According to some embodiments of the system, the structure is a bridge, a viaduct, a pillar, a dam, or a tower (or a part thereof), and the features comprise at least one of reinforcement bars and post-tensioning cables.
Additionally, the features may comprise non-homogeneous compositions of a material of the structure, such as a fault or a defect in the material or an absence of material (e.g., a hole) inside the structure.
A second aspect pertains to a computer-implemented method for NDT of a structure, e.g., using a system according to the first aspect. Said method comprises, in real time:
According to some embodiments, the method comprises, while the user performs the NDT of the structure using the data-acquisition device:
A third aspect pertains to a computer program product comprising program code having computer-executable instructions for performing, in particular when run in a system according to the first aspect, the method according to the second aspect.
Aspects will be described in detail by referring to exemplary embodiments that are accompanied by figures, in which:
This allows for a flexible grid size depending on needs, wherein a positioning relative to a known location is possible, e.g., by using fiducial markers. There is no need for markings on the scanning surfaces or for attaching a paper grid to the scanning surfaces, which is particularly difficult on walls or ceilings. Also, the size of the scanning area or the scanning density is not restricted to pre-defined reference papers, and custom scan path patterns are possible. Additionally, in some embodiments, detected objects may be visualized in 3D relative to a known location (e.g., to fiducial markers). This also makes it possible to visualize locations which are or are not allowed to be drilled into, and/or to detect and highlight critical locations that must be opened for inspection.
Instead of AR goggles 10 as shown here, other suitable devices may be used that allow displaying augmented-reality information to the user 2. For instance, such devices may include other wearables, projectors, or head-up displays. Alternatively, the AR device may be configured as a handheld device having a touchscreen, a camera and an IMU, e.g., a smartphone or tablet computer on which a software application (app) may be installed, which allows the AR device to cooperate with the NDT device and to display the AR images on the touchscreen depending on a pose of the AR device, the pose being detectable by means of the IMU and/or the camera.
The system further comprises an NDT data acquisition device (NDT device) 20, comprising one or more NDT sensors 21 configured for detecting features inside of a structure. Optionally, the NDT device 20 allows material discrimination for detecting and classifying features in the structure. For instance, this can be achieved by combining different NDT sensors 21 like radar, inductive sensors and capacitive sensors.
Various features may be detected, classified and visualized by the methods and systems described herein. For instance, these features may comprise reinforcement bars, tendon cables, post-tensioning cables, conduits, pipes (e.g., gas or water pipes) and utilities (e.g., electric cables). These features may also comprise hidden defects and other anomalies of the structure, such as voids, cracks, foreign objects or material anomalies.
The cameras 13, capturing images of a surrounding, provide image data 33 to the computing unit 15. The computing unit generates AR image data 35 and provides it to the display unit 17, which displays AR images to the user based on the AR image data 35.
Generating the AR image data 35 is based at least on the provided image data 33 and on AR content. The AR content may comprise measuring instructions for the user, e.g. (as shown in
The AR content may also comprise features detected by the NDT device 20. In this case the NDT device sends NDT data 31 to the computing unit, e.g. via a wireless data connection, such as Bluetooth or WiFi. The detected features are then projected on the structure in the AR images.
For correctly positioning and orienting the AR content relative to the structure in the AR images, the cameras 13 optionally may capture images of a fiducial marker on the structure. Also, the cameras 13 may capture images of the NDT device 20 while the NDT is performed.
Optionally, the computing unit 15 may be configured to provide instruction data 32 with measuring instructions for the user to the NDT device 20, and the NDT device 20 may comprise means for displaying the instructions to the user, e.g. in addition to a grid being displayed in AR images on the display unit 17. These means, for instance, may comprise a small display or LEDs on the NDT device indicating a distance to follow along the grid.
For instance, as illustrated in
In some embodiments, a single fiducial marker 4 (e.g., a single ArUco marker) may be provided permanently at the wall 3. This allows unambiguously identifying the structure to be tested. Also, the manual grid definition may be based on the single fiducial marker 4 to map the grid scan to a serial number or specific fiducial marker pattern.
Optionally, the fiducial marker 4 can be an elongated fiducial marker, e.g. in the form of a ribbon band. This may allow for a better detection of the marker, and an improved referencing and scaling. For instance, the ribbon band may be about 5 cm wide and up to 10 m long. It can be provided at an arbitrary position on the wall 3 or another surface, either permanently or temporarily, i.e. only during the measurement.
The size and shape of the grid 52 may vary depending on a detected size and shape of the scanning area. In particular, grid shapes need not be limited to rectangles. Also, obstacles on the surface which need to be scanned may be taken into account when defining the grid 52. Optionally, the size and shape of the grid 52 may be selectable or customizable by the user 2.
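As a non-limiting illustration of such a grid definition, the following Python sketch generates a serpentine scan path for a rectangular scanning area; the function and parameter names as well as the default spacing are assumptions of this example.

```python
# Illustrative sketch: generating horizontal grid lines for a rectangular
# scanning area; the line spacing (scan density) may be user-selectable.
def grid_lines(width_m, height_m, spacing_m=0.1):
    """Return a list of ((x0, y0), (x1, y1)) segments in surface coordinates,
    ordered as a serpentine path so the NDT device sweeps back and forth."""
    lines, y, flip = [], 0.0, False
    while y <= height_m + 1e-9:
        start, end = (0.0, y), (width_m, y)
        lines.append((end, start) if flip else (start, end))
        flip = not flip
        y += spacing_m
    return lines
```

In a non-rectangular scanning area or in the presence of obstacles, the segments returned by such a routine could additionally be clipped against the detected outline of the area.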
The camera may also detect which parts of the grid 52 have already been tested and update the grid 52 in real time, e.g. by only displaying the remaining parts of the grid 52 or by highlighting those parts of the grid 52 that have already been tested differently from the remaining parts (e.g. using different colours). The process may also be split into sub-processes, so that only one line of the grid 52 is displayed at a time, i.e. the line of the grid 52 along which the user is performing the current part (or the next part) of the measurement.
The user may be guided through the process by providing feedback in the AR images to correct or improve the data collection process. For instance, if the position of the sensor of the NDT device 20 deviates from the current line, the user can be given feedback, e.g. by displaying arrows indicating how to correct the sensor position (up, down, left, right). If the whole grid is displayed and the user does not follow the correct sequence of grid lines, this may be indicated in the AR images, e.g. by highlighting the correct line. Optionally, if a deviation is detected, this may be taken into account in the data processing, in order to automatically correct the user's error.
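The following Python sketch illustrates, purely by way of example, how such a correction hint could be derived from the tracked sensor position and the current grid line; the sign convention, tolerance and returned labels are assumptions of this sketch.

```python
import math

# Illustrative sketch: real-time deviation check of the tracked NDT sensor
# against the grid line currently being scanned (2D surface coordinates, metres).
def signed_deviation(p, a, b):
    """Signed perpendicular distance of point p from the line through a and b;
    positive means p lies to the left of the direction a -> b (a and b must differ)."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    return (dx * (py - ay) - dy * (px - ax)) / math.hypot(dx, dy)

def correction_arrow(p, a, b, tolerance_m=0.01):
    """Return the feedback to overlay in the AR image; the mapping of the sign
    to an arrow direction is a convention of this sketch."""
    d = signed_deviation(p, a, b)
    if abs(d) <= tolerance_m:
        return "on line"
    return "move right" if d > 0 else "move left"
```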
A scan surface has been defined over points A, B and C, providing a 2D plane on which the device 20 will always be located during data collection. The AR goggles 10 have a position reference to the grid 52 through points A, B, C. The camera on the AR goggles 10 can track the NDT device 20 by recognizing significant features or geometries of the device.
As the NDT device 20 will always move on the previously defined 2D plane (or slightly curved surface), images from one or more cameras can add sufficient information to track the device 20 in three dimensions, fully defining the device position relative to points A, B, C.
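By way of a non-limiting example, the following Python sketch shows how a single viewing ray towards the detected device can be intersected with the plane spanned by points A, B and C to obtain the 3D device position; the variable names and the planar assumption are illustrative.

```python
import numpy as np

# Sketch under the stated constraint: the NDT device stays on the plane spanned
# by the reference points A, B, C, so one camera ray determines its 3D position.
# All vectors are 3D, expressed in the AR device's coordinate frame.
def device_position_on_plane(cam_origin, ray_dir, A, B, C):
    """Intersect the viewing ray (cam_origin + t * ray_dir) with the plane
    through A, B, C and return the 3D intersection point."""
    cam_origin, ray_dir, A, B, C = (np.asarray(v, dtype=float)
                                    for v in (cam_origin, ray_dir, A, B, C))
    n = np.cross(B - A, C - A)                  # plane normal
    denom = np.dot(n, ray_dir)
    if abs(denom) < 1e-9:
        raise ValueError("ray is parallel to the scan plane")
    t = np.dot(n, A - cam_origin) / denom
    if t < 0:
        raise ValueError("plane is behind the camera")
    return cam_origin + t * ray_dir
```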
Detected features (e.g., rebars 5 and utilities) in the wall 3 are referenced to the grid corner points A, B, C, which allows the AR goggles 10 to visualize the objects at their corresponding locations after the initial setup has been done. The data visualization can be used for several purposes. The data captured by the AR device 10, particularly the AR images, and the data captured by the NDT device 20, i.e. the feature data related to the detected features, are related to each other in 3D space.
Optionally, the process or important steps thereof may be documented by the AR goggles 10 by taking and storing pictures, videos and/or audio of the process, which optionally may be combined with 3D space information, e.g. of the NDT sensor. This documentation may be captured automatically, or the user 2 may be prompted to capture it.
Optionally, further information may be generated based on the detected features and visualized in the AR image 51. Preferably, the representations 55 may be visualized in 3D. Presently available AR solutions that allow this include the Oculus Quest 2 and the Microsoft HoloLens, in combination with 3D rendering software such as Unity.
To prevent hitting structural elements within the wall, in the shown example this further information comprises highlighting areas 58 that are safe to drill into. Alternatively, the NDT sensor data (e.g., a detected live wire) may be used to highlight dangerous zones where drilling should be avoided. Also, locations may be defined and highlighted that should be inspected further. The depth of detected utilities may be highlighted using colour grading; material properties may be highlighted using overlaid symbols, marks, icons, text or abstracted objects (e.g., spheres or cylinders). Detected features may be classified (e.g., as rebars, conduits, water pipes, pockets) and highlighted by representative geometrical overlays (e.g. cylinders, rectangular objects, tubes or spheres).
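As a non-limiting example of such depth colour grading, the following Python sketch maps a feature's depth below the surface to an overlay colour; the depth range and the colour ramp are assumptions of this example, not values from the disclosure.

```python
# Illustrative sketch: colour-grading the depth of a detected feature for the
# AR overlay.  The 0.3 m maximum depth and the red-to-blue ramp are assumed.
def depth_to_rgb(depth_m, max_depth_m=0.3):
    """Map depth below the surface (0 .. max_depth_m) to an RGB triple:
    shallow features are shown red, deep features are shown blue."""
    x = min(max(depth_m / max_depth_m, 0.0), 1.0)
    return (int(255 * (1.0 - x)), 0, int(255 * x))
```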
Visualizing the feature data may comprise object discrimination. This involves sensor fusion on the side of the NDT device, so that the different detected properties or objects can be visualized. For instance, this may include highlighting different kinds of objects (e.g., highlighting representations of rebars differently than representations of pipes) or showing different materials (e.g. plastic, iron, copper) in different colours.
A plurality of NDT sensors of different sensor types may be used for detecting the features, for instance inductive sensors, capacitive sensors, and radar sensors. Based on feature data generated by NDT sensors of two or more different sensor types, object types of the features may be determined, such as water pipes, plastic pipes, copper pipes, live wires, and reinforcement bars. Representations of the detected features may then be presented in the AR images according to their object type, e.g., highlighted or coloured differently.
For instance, if radar detects a strong signal, whereas the inductive sensor gives no signal, it is likely that the feature is a non-metal object like a water pipe (containing water). If radar detects a weak signal, and the inductive sensor gives no signal, it is likely that the feature is a non-metal object like an empty plastic pipe. If radar detects a strong signal, and the inductive sensor senses a non-ferrous metal object, it is likely that the feature is a copper pipe as used for heating. If radar detects a strong signal, and the inductive sensor senses a ferrous metal object, it is likely that the feature is a reinforcement bar. If radar detects a signal, the inductive sensor senses a non-ferrous metal object, and the capacitive sensor detects 50 Hz, it is likely that the feature is a live wire.
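Purely by way of illustration, the heuristics above can be expressed as a simple rule set. The following Python sketch does so; the signal categories, parameter names and returned labels are assumptions of this example rather than part of the disclosure.

```python
# Sketch of the classification heuristics from the preceding paragraph.
# radar_signal: 'none' | 'weak' | 'strong'
# inductive_metal: None | 'ferrous' | 'non-ferrous'
# mains_50hz_detected: True if the capacitive sensor picks up a 50 Hz signal.
def classify_feature(radar_signal, inductive_metal, mains_50hz_detected):
    if radar_signal != 'none' and inductive_metal == 'non-ferrous' and mains_50hz_detected:
        return 'live wire'
    if radar_signal == 'strong' and inductive_metal == 'ferrous':
        return 'reinforcement bar'
    if radar_signal == 'strong' and inductive_metal == 'non-ferrous':
        return 'copper pipe'
    if radar_signal == 'strong' and inductive_metal is None:
        return 'water-filled non-metal pipe'
    if radar_signal == 'weak' and inductive_metal is None:
        return 'empty plastic pipe'
    return 'unknown'
```

In practice, such a rule set could of course be replaced by a probabilistic or learned fusion model; the table-like rules above merely restate the examples given in the preceding paragraph.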
The method 100 comprises, in real time:
As described above, e.g. with respect to
Although aspects are illustrated above, partly with reference to some preferred embodiments, it must be understood that numerous modifications and combinations of different features of the embodiments can be made. All of these modifications lie within the scope of the appended claims.
Number | Date | Country | Kind
---|---|---|---
23213675.4 | Dec 2023 | EP | regional