The embodiments disclosed herein relate to an inspection support apparatus that supports an inspection related to the frames of buildings and civil-engineering structures.
In the construction of buildings of reinforced concrete, a bar arrangement inspection is performed for checking whether rebars are arranged correctly according to the bar arrangement drawing or the like. For this bar arrangement inspection, a system that supports the bar arrangement inspection (hereinafter, referred to as a “bar arrangement inspection system”) has been developed from the viewpoint of greater efficiency of the inspection, reduction of burden on the inspector, and so on. As the bar arrangement inspection system, techniques for performing the bar arrangement inspection by analyzing rebar image data captured by a digital camera have been actively developed.
An inspection support apparatus that supports an inspection related to a frame of building and civil-engineering structures according to an aspect includes a memory and a processor connected to the memory, and the processor is configured to obtain three-dimensional data of the structure that includes a plurality of planes formed by the frame; from the three-dimensional data, detect a plurality of first planes that include at least three points of the three-dimensional data; for each of the detected first planes, compute, from the three-dimensional data, a number of three-dimensional points at a distance that is equal to or shorter than a prescribed distance from the first plane; and according to the computed number of three-dimensional points, identify, from the detected plurality of first planes, a second plane positioned on a forefront of the frame of the structure.
For example, Japanese Laid-Open Patent Publication No. 2015-001146 mentioned above proposes a rebar inspection apparatus that measures, according to a captured image of the rebars, the distance between adjacent joints of rebars and derives the diameter of the corresponding rebars according to the measured distance between the joints.
Usually, bar arrangement is executed with division into a plurality of layers (planes), such as the front plane, the rear plane, the side planes, and the like. While the bar arrangement inspection is often conducted with the front plane layer being the target, it is difficult to capture the image of only the bar arrangement of the front plane layer on site, and rebars of the respective layers are often mixed in the captured image. For this reason, when conducting a bar arrangement inspection according to a captured image, it is necessary to extract rebars that belong to the layer being the inspection target. For example, in Japanese Laid-Open Patent Publication No. 2015-001146 mentioned above, three-dimensional points that belong to the front plane area are manually specified at the start to detect the front plane area of the bar arrangement. However, the method in which the bar arrangement area that is to be the inspection target is manually specified interferes with the automation of the inspection and also easily causes errors.
Therefore, it has been desired to provide an inspection support apparatus that accurately detects the bar arrangement area being the inspection target.
Hereinafter, embodiments of the present invention are explained according to the drawings.
the bar arrangement inspection system 1 includes an information processing apparatus 10 and a stereo camera 100. The stereo camera 100 is an example of a three-dimensional data generating apparatus (also referred to as a three-dimensional sensor). The three-dimensional data generating apparatus may also be a 3D laser scanner.
A pair of images of the bar arrangement H is captured by the stereo camera 100, and the three-dimensional data of the bar arrangement H is generated. The stereo camera 100 includes a right image capturing unit 110R, a left image capturing unit 110L, and a three-dimensional data generating unit 120. The right image capturing unit 110R captures a right-eye viewpoint image viewed from the right eye. The left image capturing unit 110L captures a left-eye viewpoint image viewed from the left eye. Meanwhile, the images captured by the right image capturing unit 110R and the left image capturing unit 110L may be color images or multi-level monochrome images such as grayscale images. In the present embodiment, grayscale images are used.
The three-dimensional data generating unit 120 generates three-dimensional data by applying a known stereo matching process to the image data of the right-eye viewpoint image and the image data of the left-eye viewpoint image. Meanwhile, the three-dimensional data is obtained as an image that holds three-dimensional point information in units of pixels. The three-dimensional data is also called a three-dimensional image or a distance image.
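The stereo matching process itself is treated here as known. As a rough illustrative sketch (the function name and the focal length and baseline values below are hypothetical, not from the source), once a disparity map has been obtained by stereo matching, depth can be recovered per pixel as Z = f * B / d:

```python
import numpy as np

def disparity_to_depth(disparity, f=1400.0, baseline=0.12):
    """Convert a disparity map (in pixels) to a depth map (in metres).

    f and baseline are illustrative values, not from the source:
    f        -- focal length in pixels
    baseline -- distance between the two image capturing units in metres
    """
    disparity = np.asarray(disparity, dtype=float)
    depth = np.full_like(disparity, np.inf)
    valid = disparity > 0          # zero disparity means no match / infinite depth
    depth[valid] = f * baseline / disparity[valid]
    return depth
```

Such a depth map, combined with the pixel coordinates, is one common way to hold "three-dimensional point information in units of pixels" as described above.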
The information processing apparatus 10 is, for example, a PC (Personal Computer), a tablet device, dedicated hardware, or the like. The generated three-dimensional data is obtained by the information processing apparatus 10, and the measurement target is identified by a front plane parameter computing process or the like according to the obtained three-dimensional data. Then, various measurement processes for the rebars identified as the measurement target are performed by the information processing apparatus 10. In addition, processes may also be performed by the information processing apparatus 10 to display or store the results of the processes, and so on. Hereinafter, the front plane bar arrangement HA positioned on the forefront of the bar arrangement H is assumed as the measurement target. The front plane bar arrangement HA is the bar arrangement on the side that is closest to the stereo camera 100.
Meanwhile, the means for the information processing apparatus 10 to obtain the three-dimensional data from the stereo camera 100 may be any of wired (for example, a USB cable or the Internet), wireless (for example, a wireless Local Area Network or the Internet), or an external recording medium.
Next, the overall process in the bar arrangement inspection system 1 is briefly explained.
By the person who captures the image, the stereo image capturing of the bar arrangement H is performed, using the stereo camera 100 mentioned above (Step S1). The right-eye viewpoint image and the left-eye viewpoint image are captured by the stereo camera 100, and the three-dimensional data is generated. The information processing apparatus 10 obtains the three-dimensional data from the stereo camera 100 (Step S2). The information processing apparatus 10 identifies the plane on the forefront from the three-dimensional data (Step S3). The plane on the forefront refers to the plane positioned on the forefront with respect to the stereo camera 100 among the planes formed by the frame of building and civil-engineering structures. Specifically, the plane on the forefront is the plane in
The information processing apparatus 10 creates a plane area image according to the identified plane on the forefront (Step S4) and identifies the arrangement of rebars that is to be the measurement target (Step S5). The plane area image is explained in
The CPU 510 is a control unit that integrally controls the entire information processing apparatus 10. Meanwhile, the CPU 510 is an example of a processor, and the processes performed by the CPU 510 may also be performed by the processor. The CPU 510 loads a control program from the ROM 530 and performs various control processes according to the loaded control program.
The RAM 520 is a work area that temporarily stores various data such as the control program, three-dimensional data from the stereo camera 100, and the like. The RAM 520 is a memory such as a DRAM (Dynamic Random Access Memory) or the like, for example. The ROM 530 is a non-volatile storage unit that stores the control program, data, and the like. The ROM 530 is a memory such as a flash memory or the like, for example.
The input/output IF 540 performs transmission and reception of data with an external device. The external device is the stereo camera 100 connected by a USB cable or the like, or an exchangeable storage unit 600, for example. The information processing apparatus 10 obtains the three-dimensional data by the input/output IF 540 from the stereo camera 100. The input/output IF 540 is also referred to as a three-dimensional data obtaining unit. The storage unit 600 is a recording medium such as a so-called memory card. The storage unit 600 may also be an HDD (Hard Disk Drive). The three-dimensional data generated by the stereo camera 100, the plane area image generated by the information processing apparatus 10 according to the three-dimensional data, and the like may be stored in the storage unit 600.
The communication unit 550 performs communication of various data wirelessly with an external device. The operation unit 560 is a keyboard, a touch panel, or the like that inputs operation instructions. The display unit 570 is an LCD (liquid crystal display) for example and displays input data or a captured image, an image according to the three-dimensional data, and the like. The CPU 510 is connected to the RAM 520, the ROM 530 and so on by the bus 580.
As illustrated in
The three-dimensional information obtaining unit 32 obtains the three-dimensional data from the stereo camera 100. The three-dimensional information obtaining unit 32 is the input/output IF 540 for example, as mentioned above.
The multiple plane detection unit 34 detects a plurality of planes that include at least three points of the three-dimensional data, from the obtained three-dimensional data. Specifically, for example, the multiple plane detection unit 34 selects three points from the obtained three-dimensional data, sets a temporary plane (also referred to as a first plane) formed by the three points, and detects a plurality of such temporary planes.
Then, the multiple plane detection unit 34 extracts, for each of the detected temporary planes, three-dimensional points that are at a distance to the temporary plane that is equal to or smaller than a prescribed distance, as three-dimensional points belonging to the temporary plane. The three-dimensional points are the respective points of the three-dimensional data. The multiple plane detection unit 34 calculates the number of three-dimensional points belonging to each of the temporary planes. The multiple plane detection unit 34 outputs the parameters (referred to as a plane parameter) of the detected temporary planes and the number of three-dimensional points belonging to each of the temporary planes.
The front plane identification unit 36 identifies the plane on the forefront from the plurality of temporary planes output from the multiple plane detection unit 34. Specifically, the front plane identification unit 36 identifies the plane positioned on the forefront of the frame of the structure (the plane on the forefront), from the detected plurality of temporary planes, according to the number of three-dimensional points belonging to each of the temporary planes. The plane on the forefront is also referred to as the second plane. More specifically, the front plane identification unit 36 identifies, as the plane on the forefront, the temporary plane that has the largest number of calculated three-dimensional points in the detected plurality of temporary planes. The front plane identification unit 36 outputs the plane parameter of the identified plane on the forefront to the plane area image creating unit 38 as bar arrangement front plane information.
Meanwhile, the parameters of the plane on the forefront are also referred to as front plane parameters, and the multiple plane detection unit 34 and the front plane identification unit 36 are also referred to as a front plane parameter detection apparatus 50 together. The processing by the front plane parameter detection apparatus 50 is also referred to as a front plane parameter detection.
The plane area image creating unit 38 creates an image of the front plane bar arrangement HA from the obtained three-dimensional data, according to the parameters of the plane on the forefront. The bar arrangement identification unit 40 identifies the arrangement of rebars, from the image of the front plane bar arrangement HA. The measurement unit performs various measurement processes such as measurement of the diameters of rebars, measurement of the number of rebars, measurement of intervals between rebars, and the like, according to the identified arrangement of rebars.
First, the multiple plane detection unit 34 and the front plane identification unit 36 initialize the variables (t, N) for loop processing. The multiple plane detection unit 34 selects at least three points from the set ϕ of the three-dimensional points (Step S100). The multiple plane detection unit 34 computes, according to the selected points, the parameters of a plane L_tmp (Step S102). The plane L_tmp is the temporary plane mentioned above.
Meanwhile, the parameters of the plane L_tmp are coefficients a, b, c, d of a plane equation expressed as the expression (1) below.
ax+by+cz+d=0 Expression (1)
Here, (x, y, z) represent the coordinates of a point in a three-dimensional space. The parameters of the plane L_tmp may be computed using a known technique such as the least-squares method. Meanwhile, when exactly three points are selected, a single plane is determined directly by the calculation. The points to be selected are not limited to three points and may be three points or more.
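As an illustrative sketch (the function name is hypothetical; the apparatus's actual computation is not specified beyond expression (1)), the coefficients a, b, c, d of the plane through three selected points can be obtained with a cross product, with the normal normalized to unit length so that |ax + by + cz + d| directly gives the distance from a point (x, y, z) to the plane:

```python
import numpy as np

def plane_from_points(p0, p1, p2):
    """Return (a, b, c, d) of the plane ax + by + cz + d = 0 through three points.

    The normal (a, b, c) is normalized to unit length, so
    |ax + by + cz + d| equals the Euclidean point-to-plane distance.
    """
    p0, p1, p2 = (np.asarray(p, dtype=float) for p in (p0, p1, p2))
    normal = np.cross(p1 - p0, p2 - p0)   # vector perpendicular to the plane
    norm = np.linalg.norm(normal)
    if norm == 0:
        raise ValueError("the three points are collinear; no unique plane")
    a, b, c = normal / norm
    d = -np.dot((a, b, c), p0)            # the plane passes through p0
    return a, b, c, d
```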
The multiple plane detection unit 34 computes the distance between the plane L_tmp and each point of the set ϕ of three-dimensional points (Step S104). The multiple plane detection unit 34 computes the number N_tmp of points at a distance to the plane L_tmp that is equal to or smaller than a threshold D, in the set ϕ of three-dimensional points (Step S106). That is, the multiple plane detection unit 34 calculates the number of three-dimensional points near the plane L_tmp. The threshold D is a value set in advance, and it may be stored in the ROM 530. The threshold D may be the radius or the diameter of the rebar to be the target. Three-dimensional points at a distance to the plane L_tmp equal to or smaller than the threshold D are referred to as close points. From the set ϕ of three-dimensional points, the number N_tmp of close points, that is, points whose distance to the plane L_tmp satisfies the expression (2) below with respect to the threshold D, is calculated. Here, the left side of the expression (2) equals the point-to-plane distance on the assumption that the plane parameters are normalized so that a^2 + b^2 + c^2 = 1.
|ax+by+cz+d|<D Expression (2)
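Steps S104 and S106 can be sketched as follows (the function name and array layout are illustrative, not from the source; the plane coefficients are assumed normalized so that expression (2) measures a true distance):

```python
import numpy as np

def count_close_points(points, plane, D):
    """Count the three-dimensional points whose distance to the plane is below D.

    points -- (N, 3) array of three-dimensional points (the set phi)
    plane  -- (a, b, c, d), normalized so a**2 + b**2 + c**2 == 1
    D      -- distance threshold, e.g. the radius of the target rebar
    """
    points = np.asarray(points, dtype=float)
    a, b, c, d = plane
    dist = np.abs(points @ np.array([a, b, c]) + d)   # |ax + by + cz + d|
    return int(np.count_nonzero(dist < D))            # N_tmp of expression (2)
```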
Referring to
A plurality of planes L_tmp are detected from the set ϕ of three-dimensional points. The image D4 in
The image D5 in
The image D7 in
According to
Then, from Step S110 in
As illustrated in
Back to
The front plane identification unit 36 determines whether N<N_tmp is true (Step S108). N is the tentative largest value of the number N_tmp of close points. When the front plane identification unit 36 determines that N<N_tmp is true (YES in Step S108), the front plane identification unit 36 updates N with N_tmp and updates the plane L with the plane L_tmp (Step S110). The plane L is the tentative parameters of the plane on the forefront.
The front plane identification unit 36 sets t=t+1 (Step S112). t is a loop counter for sequentially comparing the values N_tmp of the respective planes L_tmp.
The front plane identification unit 36 determines whether t<T is true (Step S114). T is the total number of temporary planes computed in Step S102. The process is terminated when t=T. Upon determining that t<T is true (YES in Step S114), the front plane identification unit 36 returns to Step S100 and performs the process for the next plane L_tmp.
Upon determining that N<N_tmp is not true (NO in Step S108), the front plane identification unit 36 proceeds to Step S112. After Step S112, upon determining that t<T is not true (NO in Step S114), the front plane identification unit 36 identifies the plane L as the plane parameter corresponding to the front plane of the bar arrangement and outputs it to the plane area image creating unit 38. Then, the front plane parameter detection process is terminated.
The processes from Step S100 through Step S114 described above are common with the parameter estimation procedure of the RANSAC (Random Sample Consensus) method. RANSAC was originally developed and used to estimate a numerical model from measured values that include outliers (abnormal values). In the present embodiment, a process equivalent to the RANSAC process is performed with an intention that is different from this original one.
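The Step S100 through Step S114 loop can be sketched as a RANSAC-style search (all names, the trial count T, and the random sampling scheme below are illustrative assumptions, not the apparatus's actual implementation):

```python
import numpy as np

def detect_front_plane(points, D, T=500, rng=None):
    """Sketch of the Step S100-S114 loop: keep the plane with most close points.

    Repeatedly fits a temporary plane L_tmp to three randomly selected
    points, counts its close points N_tmp (distance < D), and retains
    the plane with the largest count as the front plane L.
    """
    rng = rng or np.random.default_rng(0)
    points = np.asarray(points, dtype=float)
    best_plane, best_count = None, 0                 # tentative L and N
    for _ in range(T):                               # loop counter t
        i = rng.choice(len(points), size=3, replace=False)  # Step S100
        p0, p1, p2 = points[i]
        normal = np.cross(p1 - p0, p2 - p0)          # Step S102
        norm = np.linalg.norm(normal)
        if norm == 0:                                # collinear sample; skip
            continue
        normal /= norm                               # unit normal (a, b, c)
        d = -np.dot(normal, p0)
        dist = np.abs(points @ normal + d)           # Step S104
        n_tmp = int(np.count_nonzero(dist < D))      # Step S106
        if best_count < n_tmp:                       # Steps S108 and S110
            best_plane, best_count = (*normal, d), n_tmp
    return best_plane, best_count
```

Because the front plane layer faces the camera and is the least occluded, it accumulates the most three-dimensional points, so the plane with the largest close-point count is taken as the plane on the forefront.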
The front plane parameter detection process explained above is summarized as processes below.
Then, according to the front plane parameter detection process explained above, the bar arrangement on the forefront is automatically detected by exploiting the property that, in the obtained three-dimensional data, the largest number of three-dimensional points corresponds to the front plane. Accordingly, the bar arrangement on the forefront is reliably detected by the front plane parameter detection process according to the present embodiment.
Next, the second embodiment is explained. In the front plane parameter detection process described above, the multiple plane detection unit 34 excludes the floor plane and the wall plane in advance from the temporary planes. In the second embodiment, furthermore, three-dimensional data that are away from the three-dimensional data generating apparatus (the stereo camera 100) by a distance equal to or larger than a prescribed distance are excluded from the set ϕ of three-dimensional points used by the multiple plane detection unit 34. This is because the measurement target is the bar arrangement on the forefront, so three-dimensional points on the far side, away from the stereo camera 100 by the prescribed distance or more, are not needed.
The second embodiment has many common portions with the first embodiment explained with drawings up to
In an inspection support apparatus 30b in the second embodiment, a three-dimensional data removal unit 70 is added to the inspection support apparatus 30 of the first embodiment. The three-dimensional data removal unit 70 identifies three-dimensional data that are away from the three-dimensional data generating apparatus (the stereo camera 100) by a distance equal to or larger than a prescribed distance and excludes the identified three-dimensional data from the set ϕ of three-dimensional points. Then, the three-dimensional data removal unit 70 outputs, to the multiple plane detection unit 34, the set ϕ of three-dimensional points from which the identified three-dimensional data are excluded. The prescribed distance is 3 m in the case in which the distance to the front plane bar arrangement HA is 2 m, for example.
The three-dimensional data removal unit 70 reads three-dimensional points one by one from the set ϕ of three-dimensional points and performs the processes of Step S80 through Step S86 below.
The three-dimensional data removal unit 70 computes, from the coordinates (x, y, z) of a three-dimensional point, the distance K between the three-dimensional point and the stereo camera 100 (Step S80). The three-dimensional data removal unit 70 determines whether K<P is true (Step S82). P is the distance to the front plane bar arrangement HA positioned on the forefront, plus a certain margin. As the distance to the front plane bar arrangement HA, a distance value measured by the stereo camera 100 may be used, or a value input by the person who captures the image may be used.
Upon determining that K<P is not true (NO in Step S82), the three-dimensional data removal unit 70 removes the three-dimensional point (Step S84) and proceeds to Step S86. This is because the three-dimensional point can be determined as a point that is not included in the front plane bar arrangement HA.
Upon determining that K<P is true (YES in Step S82), the three-dimensional data removal unit 70 determines whether the process has been completed for all of the three-dimensional points (Step S86). Upon determining that the process has not been completed for all of the three-dimensional points (NO in Step S86), the three-dimensional data removal unit 70 returns to Step S80. Upon determining that the process has been completed for all of the three-dimensional points (YES in Step S86), the three-dimensional data removal unit 70 proceeds to Step S100. Explanation for Step S100 and subsequent steps is omitted as they have already been explained.
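The Step S80 through Step S86 filter can be sketched as follows (the function name is hypothetical; the camera is assumed to sit at the origin of the point coordinates, so K is simply the Euclidean norm of each point):

```python
import numpy as np

def remove_far_points(points, P):
    """Remove points farther than P from the stereo camera (Steps S80-S86).

    points -- (N, 3) array of three-dimensional points (the set phi),
              in camera coordinates, so the camera sits at the origin
    P      -- distance to the front bar arrangement plus a margin,
              e.g. P = 3.0 when the front plane is about 2 m away
    """
    points = np.asarray(points, dtype=float)
    K = np.linalg.norm(points, axis=1)   # distance of each point to the camera
    return points[K < P]                 # keep only points satisfying K < P
```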
According to the second embodiment, the three-dimensional data based on the bar arrangement on the far side and the wall plane, and the like may be efficiently removed by the three-dimensional data removal unit 70. By excluding unnecessary three-dimensional points in advance, the volume of three-dimensional data is reduced, and the time taken for the front plane parameter detection process is shortened. Meanwhile, the removal process described above may also be performed in the multiple plane detection unit (in the RANSAC process).
In
According to the embodiments described above, an inspection support apparatus that properly detects the area of the bar arrangement to be the inspection target may be provided.
Meanwhile, the present invention is not limited to the exact embodiments described above, and at the stage of implementation, embodiments may be made while applying variation to the components without departing from its scope. In addition, various inventions may be formed by appropriately combining a plurality of components disclosed in the embodiments described above. For example, all of the components disclosed in an embodiment may be appropriately combined. Furthermore, components across different embodiments may be appropriately combined. It goes without saying that various variations and applications may be made without departing from the gist of the invention.
Foreign Application Priority Data
Number | Date | Country | Kind
2017-069434 | Mar 2017 | JP | national
This application is based upon and claims the benefit of priority from prior Japanese Patent Application No. 2017-069434, filed Mar. 31, 2017, the entire contents of which are incorporated herein by this reference. This application is a continuation application of International Application PCT/JP2018/009622 filed on Mar. 13, 2018, the entire contents of which are incorporated herein by reference.
Related Application Data
Relation | Number | Date | Country
Parent | PCT/JP2018/009622 | Mar 2018 | US
Child | 16574020 | | US