The present invention relates to an inspection support device for a structure, an inspection support method for a structure, and a program.
Structures such as bridges and tunnels exist as social infrastructure. Since damage occurs in these structures and the damage progresses over time, it is required to perform regular inspections.
An inspector who inspects a structure needs to create an inspection record in a predetermined format, as a form indicating a result of the inspection, based on an inspection procedure determined by a structure manager or the like. By viewing a damage diagram created in the predetermined format, even an expert other than the inspector who actually performed the inspection can grasp the progression of damage to the structure and formulate a maintenance plan for the structure.
Similarly, for a structure such as a condominium or an office building, the state of the structure is also periodically inspected, and repair or mending is performed based on an inspection result. In the inspection, an inspection report is created. Regarding the creation of the inspection report, JP2019-082933A discloses a system capable of reducing the time required to create the inspection report.
When an inspection record is checked, it may be desired to refer to damage fact information (a captured image, a drawing, three-dimensional model data, or the like) related to the damage situation from the text information of the inspection record. However, the damage fact information is not associated with the inspection record created in the predetermined format, and it therefore takes time and effort to refer to the damage fact information.
The present invention has been made in view of such circumstances, and an object of the present invention is to provide an inspection support device for a structure, an inspection support method for a structure, and a program capable of easily displaying related information from text data included in an inspection record or the like.
An inspection support device for a structure according to a first aspect is an inspection support device for a structure including a processor. The processor acquires three-dimensional model data of the structure, inspection data mutually associated with the three-dimensional model data, and a list of text data of a plurality of inspection points related to an inspection work of the structure, displays the list of text data on a display device, receives selection of the text data of at least one inspection point from the displayed list of text data, analyzes the selected text data to extract a corresponding portion on the three-dimensional model data and/or the inspection data, which correspond to the text data of the inspection point, and displays the extracted corresponding portion on the three-dimensional model data and/or the extracted inspection data on the display device.
In the inspection support device for a structure according to a second aspect, the processor analyzes the selected text data to extract the corresponding portion on the three-dimensional model data corresponding to the inspection point and extracts the inspection data associated with the extracted corresponding portion on the three-dimensional model data.
In the inspection support device for a structure according to a third aspect, a memory that stores the three-dimensional model data, the inspection data mutually associated with the three-dimensional model data, and the list of text data is further provided, and the processor acquires the three-dimensional model data, the inspection data, and the list of text data from the memory.
In the inspection support device for a structure according to a fourth aspect, the processor maps the extracted inspection data to the three-dimensional model data and displays the mapped data on the display device.
In the inspection support device for a structure according to a fifth aspect, the list of text data is an inspection record.
In the inspection support device for a structure according to a sixth aspect, the three-dimensional model data includes at least data of a member region and a member name.
In the inspection support device for a structure according to a seventh aspect, the inspection data includes a plurality of types of data.
In the inspection support device for a structure according to an eighth aspect, the plurality of types of data include a captured image, a panoramic composite image, damage information, and a two-dimensional drawing.
In the inspection support device for a structure according to a ninth aspect, the processor displays at least one type of data on the display device from the plurality of types of data included in the inspection data.
In the inspection support device for a structure according to a tenth aspect, the inspection data includes a plurality of captured images, and the processor displays the captured image satisfying a condition on the display device from the plurality of captured images to be displayed.
In the inspection support device for a structure according to an eleventh aspect, the processor analyzes the selected text data to extract past inspection data corresponding to the inspection point, and displays the extracted past inspection data on the display device.
An inspection support method for a structure according to a twelfth aspect is an inspection support method for a structure with use of an inspection support device for a structure including a processor. The inspection support method for a structure comprises, via the processor, acquiring three-dimensional model data of the structure, inspection data mutually associated with the three-dimensional model data, and a list of text data of a plurality of inspection points related to an inspection work of the structure, displaying the list of text data on a display device, receiving selection of the text data of at least one inspection point from the displayed list of text data, analyzing the selected text data to extract a corresponding portion on the three-dimensional model data and/or the inspection data, which correspond to the text data of the inspection point, and displaying the extracted corresponding portion on the three-dimensional model data and/or the extracted inspection data on the display device.
A program according to a thirteenth aspect is a program causing an inspection support device for a structure including a processor to execute an inspection support method for a structure. The program causes the processor to execute acquiring three-dimensional model data of the structure, inspection data mutually associated with the three-dimensional model data, and a list of text data of a plurality of inspection points related to an inspection work of the structure, displaying the list of text data on a display device, receiving selection of the text data of at least one inspection point from the displayed list of text data, analyzing the selected text data to extract a corresponding portion on the three-dimensional model data and/or the inspection data, which correspond to the text data of the inspection point, and displaying the extracted corresponding portion on the three-dimensional model data and/or the extracted inspection data on the display device.
With the inspection support device for a structure, the inspection support method for a structure, and the program according to the present invention, it is possible to easily display the related information from the text data included in the inspection record or the like.
Hereinafter, preferred embodiments of an inspection support device for a structure, an inspection support method for a structure, and a program according to one aspect of the present invention will be described with reference to accompanying drawings. Here, the “structure” includes a construction, for example, a civil-engineering structure such as a bridge, a tunnel, or a dam, and also includes an architectural structure such as a building, a house, or a wall, pillar, or beam of a building.
[Hardware Configuration of Inspection Support Device for Structure]
As an inspection support device 10 for a structure shown in
With the input/output interface 12, various pieces of data (information) can be input to the inspection support device 10 for the structure. For example, data stored in the storage unit 16 is input via the input/output interface 12.
The CPU (processor) 20 reads out various programs stored in the storage unit 16, the ROM 24, or the like, expands the programs in the RAM 22, and performs calculations to integrally control each unit. Further, the CPU 20 reads out a program stored in the storage unit 16 or the ROM 24 and performs a calculation using the RAM 22 to perform various types of processing of the inspection support device 10 for the structure.
The CPU 20 has an information acquisition unit 51, a list display unit 53, a selection reception unit 55, an information extraction unit 57, and an extraction information display unit 59. A specific processing function of each unit will be described below. The information acquisition unit 51, the list display unit 53, the selection reception unit 55, the information extraction unit 57, and the extraction information display unit 59 are a part of the CPU 20, and thus it can also be said that the CPU 20 executes the processing of each unit.
Returning to
The storage unit 16 mainly stores three-dimensional model data 101, inspection data 103, and inspection record data 105.
The three-dimensional model data 101 is, for example, data of a three-dimensional model of the structure created based on a plurality of captured images. The three-dimensional model data 101 includes data of a member region constituting the structure and a member name, and each member region and member name are specified in the three-dimensional model data 101. The member region and the member name are specified for the three-dimensional model data 101 based on, for example, a user operation. Alternatively, the member region and the member name may be automatically specified for the three-dimensional model data 101 from information about the shape, dimensions, and the like of the member.
The inspection data 103 can include a plurality of types of data necessary for inspection. The inspection data 103 can include, for example, a captured image, a panoramic composite image, damage information, and a two-dimensional drawing. The captured images are a plurality of images obtained by capturing the structure, and the panoramic composite image is an image (or a set of images) corresponding to a specific member, obtained by combining the captured images. The two-dimensional drawing can include a general diagram, a damage diagram, a repair diagram, and the like. In a case where the structure is damaged, the captured image includes the damage, and thus the damage can be extracted from the captured image. Further, the damage diagram and the repair diagram can be automatically created from the extracted damage. The three-dimensional model data 101 and the inspection data 103 are associated with each other. For example, the inspection data 103 is stored in association with a position on the three-dimensional model data 101, a member, and the like. By designating positional information on the three-dimensional model data 101, the inspection data 103 can be displayed, and conversely, by designating the inspection data 103, the three-dimensional model data 101 can be displayed.
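As a purely illustrative sketch of such an association (the class names, fields, and helper below are assumptions, not part of the disclosure), each piece of inspection data 103 can be thought of as a record carrying a member name and a position on the three-dimensional model data 101, so that designating a member returns the associated data:

    from dataclasses import dataclass, field

    @dataclass
    class Member:
        # A member region and member name specified on the three-dimensional model data 101.
        name: str                                        # for example "deck slab"
        vertex_ids: list = field(default_factory=list)   # region of the 3D mesh the member occupies

    @dataclass
    class InspectionItem:
        # One piece of inspection data 103 tied to a position on the model.
        kind: str             # "captured image", "panoramic composite image", "two-dimensional drawing", ...
        path: str             # file holding the image or drawing
        member_name: str      # member the data is associated with
        position: tuple       # point on the three-dimensional model data 101

    def inspection_data_for(member_name, items):
        # Designating a member on the model returns the associated inspection data.
        return [item for item in items if item.member_name == member_name]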
The inspection record data 105 is an example of a list of text data of a plurality of inspection points related to an inspection work of the structure. The list of text data is data created by a user (for example, a diagnostician) inputting a plurality of pieces of text data at predetermined positions in a template (a document file having a designated format). The template may be in a format specified by the Ministry of Land, Infrastructure, Transport and Tourism or a local government. The list of text data preferably includes text data such as findings described by the user (for example, a diagnostician).
The operation unit 18 shown in
The display device 30 is a device such as a liquid crystal display and can display the three-dimensional model data 101, the inspection data 103, and the inspection record data 105.
First, the information acquisition unit 51 acquires the three-dimensional model data 101 of the structure, the inspection data 103 mutually associated with the three-dimensional model data 101, and the list of text data (information acquisition step: step S1). In the present example, the list of text data is the inspection record data 105.
Next, the list display unit 53 displays the acquired inspection record data 105 on the display device 30 (list display step: step S2). Next, the selection reception unit 55 receives selection of the text data of at least one inspection point from the displayed inspection record data 105 (selection reception step: step S3). Next, the information extraction unit 57 analyzes the selected text data to extract a corresponding portion on the three-dimensional model data 101 and/or the inspection data 103, which correspond to the text data of the inspection point (information extraction step: step S4). Next, the extraction information display unit 59 displays the extracted corresponding portion on the three-dimensional model data 101 and/or the extracted inspection data 103 on the display device 30 (extraction information display step: step S5). Each step will be described below.
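The following Python sketch only illustrates how steps S1 to S5 fit together; the data shapes, example text, and function names are assumptions made for readability, not the disclosed configuration.

    # Data shapes are simplified: the record is a list of text rows, the inspection
    # data a list of dictionaries, and the model a mapping from member name to region.
    inspection_record = [                                                 # 105 (step S1)
        {"member": "deck slab", "text": "Fissuring is observed in the deck slab."},
        {"member": "main girder", "text": "Water leakage is observed."},
    ]
    inspection_data = [                                                   # 103 (step S1)
        {"member": "deck slab", "kind": "captured image", "path": "image_001.jpg"},
        {"member": "deck slab", "kind": "damage diagram", "path": "damage_01.svg"},
    ]
    model_members = {"deck slab": "region_A", "main girder": "region_B"}  # 101, simplified

    def display_list(record):                                             # step S2
        for index, row in enumerate(record):
            print(index, row["text"])

    def extract(selected_row):                                            # step S4
        portion = model_members.get(selected_row["member"])               # portion on the 3D model
        related = [d for d in inspection_data if d["member"] == selected_row["member"]]
        return portion, related

    display_list(inspection_record)
    selected = inspection_record[0]                                       # step S3 (selection received)
    portion, related = extract(selected)
    print("corresponding portion:", portion)                              # step S5
    print("related inspection data:", related)                            # step S5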
<Information Acquisition Step>
The information acquisition step (step S1) is executed by the information acquisition unit 51. The information acquisition unit 51 acquires the three-dimensional model data 101, inspection data 103, and inspection record data 105 of the structure stored in the storage unit 16. In a case where the three-dimensional model data 101, the inspection data 103, and the inspection record data 105 are not stored in the storage unit 16, the information acquisition unit 51 acquires the three-dimensional model data 101, the inspection data 103, and the inspection record data 105 from the outside. For example, the information acquisition unit 51 acquires the three-dimensional model data 101, the inspection data 103, and the inspection record data 105 through the network via the input/output interface 12.
As camera parameters (focal length, image size and pixel pitch of the image sensor, and the like) necessary for applying a structure from motion (SfM) method, parameters stored in the storage unit 16 can be used. Further, since an absolute scale cannot be obtained by the SfM method, the absolute scale (three-dimensional position) can be obtained, for example, by designating a known dimension of the structure (such as a distance between two points).
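Because SfM recovers geometry only up to an unknown scale, one common way to obtain the absolute scale is to divide the known real-world distance by the corresponding distance measured in the reconstruction and multiply all reconstructed points by that factor. A minimal sketch with placeholder values (not the disclosed implementation):

    import numpy as np

    def apply_absolute_scale(points, point_a, point_b, known_distance_m):
        # Scale the SfM reconstruction so that the distance between two designated
        # points matches a known dimension of the structure.
        reconstructed = np.linalg.norm(np.asarray(point_a) - np.asarray(point_b))
        scale = known_distance_m / reconstructed
        return np.asarray(points) * scale

    # Placeholder values: two reconstructed points 1.0 apart are known to be 2.5 m apart.
    points = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.5, 0.7, 0.2]])
    scaled_points = apply_absolute_scale(points, points[0], points[1], known_distance_m=2.5)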
In the example of
Hereinafter, the three-dimensional model data 101A and the three-dimensional model data 101B may be referred to as the three-dimensional model data 101 without distinguishing between the data 101A and the data 101B.
The inspection data 103 includes, for example, a damage diagram 103D. In the damage diagram 103D, for example, for each piece of damage generated in a deck slab of a bridge to be inspected, a damage display (fissuring display, water leakage display, free lime display, or the like), a member name (“deck slab” or the like), an element number, a type of damage (“fissuring”, “water leakage”, “free lime”, or the like), and an evaluation classification (rank information) of the degree of damage are described for each panel coffer.
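Purely as an illustration, one entry of the damage diagram 103D can be pictured as a small record per panel coffer holding the items listed above; the class and field names are assumptions:

    from dataclasses import dataclass

    @dataclass
    class DamageDiagramEntry:
        # One damage entry described in the damage diagram 103D for a panel coffer.
        member_name: str        # for example "deck slab"
        element_number: str     # element number of the panel coffer
        damage_type: str        # "fissuring", "water leakage", "free lime", ...
        evaluation_rank: str    # evaluation classification (rank) of the degree of damage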
The captured image 103B, the damage detection result image 103C, and the damage diagram 103D of
<List Display Step and Selection Reception Step>
The list display step (step S2) is executed by the list display unit 53, and the selection reception step (step S3) is executed by the selection reception unit 55.
(A) of
(B) of
<Information Extraction Step>
Next, the information extraction step (step S4) is executed by the information extraction unit 57.
The information extraction unit 57 (refer to
As shown in
The information extraction unit 57 extracts the damage diagram (member unit of deck slab, bridge, or the like) that is the two-dimensional drawing, the panoramic composite image (member unit of deck slab, bridge, or the like), the damage information (type, degree, size, or the like of damage), the captured image, and the like, which are included in the inspection data 103.
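The embodiment does not prescribe a particular text-analysis technique; one simple possibility is keyword matching against known member names and damage types, as sketched below (the vocabularies are assumed):

    MEMBER_NAMES = ("deck slab", "main girder", "pier", "abutment")          # assumed vocabulary
    DAMAGE_TYPES = ("fissuring", "water leakage", "free lime", "corrosion")  # assumed vocabulary

    def analyze_text(text):
        # Return the member names and damage types mentioned in the selected text data.
        lowered = text.lower()
        return {
            "members": [m for m in MEMBER_NAMES if m in lowered],
            "damage_types": [d for d in DAMAGE_TYPES if d in lowered],
        }

    result = analyze_text("Fissuring and water leakage are observed in the deck slab.")
    # result == {"members": ["deck slab"], "damage_types": ["fissuring", "water leakage"]}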
The information extraction unit 57 can execute not only the extraction of the information based on the text data but also the extraction of the information by a user operation.
In
As shown in
In another example of the information extraction step, the inspection data 103 is extracted via the three-dimensional model data 101. Even in a case where the inspection data 103 cannot be directly extracted, the information extraction unit 57 can indirectly extract the inspection data 103 via the three-dimensional model data 101. In a case where the text data of a plurality of inspection points is selected and received in the selection reception step (step S3), the information extraction unit 57 can, in the information extraction step, extract a plurality of corresponding portions on the three-dimensional model data 101 and/or a plurality of pieces of inspection data 103, which correspond to the plurality of pieces of text data.
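A minimal sketch of such indirect extraction, assuming simple look-up tables between the text of an inspection point, a region on the three-dimensional model data 101, and the inspection data 103 tied to that region (all names and values are illustrative):

    # Assumed look-up tables: which region on the three-dimensional model data 101 an
    # inspection point refers to, and which inspection data 103 is tied to each region.
    text_to_region = {"deck slab, element 0101": "region_12"}
    region_to_inspection_data = {"region_12": ["image_034.jpg", "damage_diagram_12.svg"]}

    def extract_via_model(inspection_point_text):
        # Indirect extraction: resolve the corresponding portion on the 3D model first,
        # then return the inspection data associated with that portion.
        region = text_to_region.get(inspection_point_text)
        if region is None:
            return []
        return region_to_inspection_data.get(region, [])

    print(extract_via_model("deck slab, element 0101"))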
<Extraction Information Display Step>
The extraction information display step (step S5) is executed by the extraction information display unit 59. The extraction information display unit 59 displays the information extracted by the information extraction unit 57 on the display device 30.
As shown in
The extraction information display unit 59 causes the display device 30 to display at least one type of data from the plurality of types of data (captured image, panoramic composite image, damage information, two-dimensional drawing, and the like) included in the inspection data 103.
Next, another first aspect of the selection reception step, the information extraction step, and the extraction information display step will be described with reference to
As shown in
The information extraction unit 57 (refer to
The information extraction unit 57 individually extracts images and drawings corresponding to the captured image 103B, the damage detection result image 103C, the damage diagram 103D, and the like, as the inspection data 103 corresponding to the text data (information extraction step).
The extraction information display unit 59 individually displays, on the display device 30, the images and drawings of the inspection data 103 extracted by the information extraction unit 57 (refer to
Further, the extraction information display unit 59 can display, as the inspection data 103, three-dimensional model data 120 to which the captured image 103B is mapped. Further, the extraction information display unit 59 can display, as the inspection data 103, three-dimensional model data 122 to which the captured image 103B and the damage detection result image 103C are mapped (extraction information display step).
Another second aspect of the selection reception step, the information extraction step, and the extraction information display step will be described with reference to
The information extraction unit 57 individually extracts images and drawings corresponding to the captured image 103B, the damage detection result image 103C, the damage diagram 103D, and the like, as the inspection data 103 corresponding to the text data (information extraction step). Further, in a case where the text data is analyzed and is determined to include text related to the progressiveness of the damage, the information extraction unit 57 can extract past inspection data 203 corresponding to the inspection data 103 (information extraction step). The past inspection data 203 includes a past captured image group 203A, a past captured image 203B, a past damage detection result image 203C, a past damage diagram 203D, and the like. The past inspection data 203 can be stored in the storage unit 16 or an external storage unit. Examples of the text data relating to the progressiveness of the damage include expressions such as “almost no progressing is seen”, “progressing is slow”, or “progressing is fast”.
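Given the example expressions above, a simple phrase check could decide whether the past inspection data 203 should also be retrieved; the sketch below assumes that phrasing and is not the disclosed implementation.

    PROGRESSION_PHRASES = (                  # expressions relating to the progressiveness of damage
        "almost no progressing is seen",
        "progressing is slow",
        "progressing is fast",
    )

    def mentions_progression(text):
        lowered = text.lower()
        return any(phrase in lowered for phrase in PROGRESSION_PHRASES)

    def extract_with_history(text, current_data, past_data):
        # Return the current inspection data 103, and also the past inspection data 203
        # when the findings mention how the damage is progressing.
        return current_data + past_data if mentions_progression(text) else current_data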
The extraction information display unit 59 individually displays, on the display device 30, images and drawings of the inspection data 103 extracted by the information extraction unit 57 and the past inspection data 203 (refer to
Another third aspect of the selection reception step, the information extraction step, and the extraction information display step will be described with reference to
The information extraction unit 57 specifies the corresponding portion on the three-dimensional model data 101 corresponding to the text data. A plurality of captured images 103B (refer to
The captured images 103B of the captured image group 103A have regions that overlap with each other. Therefore, a plurality of captured images 103B in the captured image group 103A correspond to the same positional information on the three-dimensional model data 101.
The extraction information display unit 59 displays, as the inspection data 103 to be displayed, the mapped three-dimensional model data 120 satisfying a condition from among a plurality of pieces of the three-dimensional model data 120 to which the captured images 103B are mapped (extraction information display step). In the present example, the mapped three-dimensional model data 120 satisfying the condition is displayed. In this case as well, the captured image 103B satisfying the condition is displayed from among the plurality of captured images 103B.
Here, the condition may be determined arbitrarily by the user or may be determined automatically. For example, “normalization degree of the captured image 103B” or “distance of the captured image 103B to the structure” can be applied as the condition, and the extraction information display unit 59 can display the captured image 103B satisfying this condition from among the plurality of captured images 103B.
As another condition, in the case of the captured image 103B including damage, “image quality is good” or “damage is at the center of captured image 103B” can be applied as the condition, and the extraction information display unit 59 can display the captured image 103B satisfying this condition from the plurality of captured images 103B.
The extraction information display unit 59 can display at least one of an optimum captured image 103B satisfying the condition or the damage detection result image 103C for the optimum captured image 103B.
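One conceivable way to realize the display of the captured image satisfying a condition is to score each overlapping candidate on the conditions mentioned above (shooting distance, how centered the damage is, image quality) and display the best-scoring image; the fields and weights below are assumptions, not the disclosed selection logic.

    from dataclasses import dataclass

    @dataclass
    class CapturedImage:
        path: str
        distance_to_structure_m: float   # shooting distance to the structure
        damage_offset: float             # 0.0 = damage at the image center, 1.0 = at the edge
        sharpness: float                 # simple image-quality score, higher is better

    def score(image):
        # Higher is better; the weighting is purely illustrative.
        return image.sharpness - 0.5 * image.damage_offset - 0.1 * image.distance_to_structure_m

    def best_image(candidates):
        # Pick the captured image 103B to display from the overlapping candidates.
        return max(candidates, key=score)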
As described above, the user can easily check the corresponding portion 102 of the three-dimensional model data 101, the inspection data 103, and the mapped three-dimensional model data 120 and 122 from the text data included in the inspection record data 105, which is the list of text data. In a case where the corresponding portions on the three-dimensional model data 101 and/or the pieces of inspection data 103, which correspond to the plurality of pieces of text data, are extracted in the information extraction step, the extraction information display unit 59 can also display, in the extraction information display step (step S5), the plurality of pieces of inspection data 103 and the corresponding portions 102 of the three-dimensional model data 101 corresponding to each of the pieces of text data of the plurality of inspection points.
<Others>
In the above description, a form has been described in which the information acquisition unit 51 acquires the information stored in the storage unit 16, but the present invention is not limited thereto. For example, in a case where necessary information is not stored in the storage unit 16, the information acquisition unit 51 may acquire the information from the outside via the input/output interface 12. Specifically, the information acquisition unit 51 acquires the information input from the outside of the inspection support device 10 for the structure via the input/output interface 12.
In the above embodiment, a hardware structure of a processing unit that executes various types of processing is the following various processors. The various processors include a central processing unit (CPU) which is a general-purpose processor that executes software (program) to function as various processing units, a programmable logic device (PLD) which is a processor whose circuit configuration can be changed after manufacturing such as a field programmable gate array (FPGA), a dedicated electric circuit which is a processor having a circuit configuration specifically designed to execute specific processing such as an application specific integrated circuit (ASIC), and the like.
One processing unit may be configured of one of these various processors or may be configured of two or more processors of the same type or different types (for example, a plurality of FPGAs or a combination of CPU and FPGA). Further, a plurality of processing units can be configured by one processor. As an example of configuring the plurality of processing units by one processor, first, there is a form in which one processor is configured of a combination of one or more CPUs and software, as represented by a computer such as a client or a server, and the one processor functions as the plurality of processing units. Second, there is a form in which a processor that realizes the functions of the entire system including the plurality of processing units by one integrated circuit (IC) chip is used, as represented by a system on chip (SoC) or the like. As described above, the various processing units are configured by using one or more various processors as a hardware structure.
Further, as the hardware structure of the various processors, more specifically, an electric circuit (circuitry) in which circuit elements such as semiconductor elements are combined may be used.
Each of the above configurations and functions can be realized by any hardware, software, or a combination of both, as appropriate. For example, the present invention can also be applied to a program causing a computer to execute the above processing steps (processing procedure), a computer-readable recording medium (non-transitory recording medium) on which such a program is recorded, or a computer on which such a program can be installed.
Although the examples of the present invention have been described above, it is needless to say that the present invention is not limited to the embodiments described above and various modifications can be made within a range not departing from the spirit of the present invention.
Number | Date | Country | Kind
---|---|---|---
2020-167558 | Oct 2020 | JP | national
The present application is a Continuation of PCT International Application No. PCT/JP2021/031985 filed on Aug. 31, 2021 claiming priority under 35 U.S.C § 119(a) to Japanese Patent Application No. 2020-167558 filed on Oct. 2, 2020. Each of the above applications is hereby expressly incorporated by reference, in its entirety, into the present application.
 | Number | Date | Country
---|---|---|---
Parent | PCT/JP2021/031985 | Aug 2021 | US
Child | 18193388 | | US