The present disclosure relates to an information processing apparatus, an information processing method, and an information processing program.
JP2007-192818A discloses a technique for non-destructive inspection of parts, including extracting a point cloud from a three-dimensional image generated using a plurality of tomographic images obtained by imaging a part with a computed tomography (CT) apparatus, and registering the extracted point cloud to a computer aided design (CAD) coordinate system.
In the non-destructive inspection, it is preferable that information on a discontinuity in a two-dimensional inspection image captured by irradiating an inspection target object with radiation can be associated with a position of the discontinuity in a three-dimensional structure of the inspection target object. This is because the inspection efficiency can be improved by referring to the shape and texture information of the discontinuity in the two-dimensional inspection image and the position of the discontinuity in the three-dimensional structure of the inspection target object.
A technique that identifies the three-dimensional position of a discontinuity from a three-dimensional image generated using a plurality of tomographic images leaves room for improvement in terms of the inspection efficiency of non-destructive inspection, because an imaging apparatus such as a CT apparatus is expensive and capturing the plurality of tomographic images takes a relatively long time.
The present disclosure has been made in view of the above circumstances, and an object thereof is to provide an information processing apparatus, an information processing method, and an information processing program that allow improvement of the inspection efficiency of non-destructive inspection.
An information processing apparatus according to a first aspect is an information processing apparatus including at least one processor, the processor being configured to acquire a conversion parameter for converting a position in an inspection image captured by irradiating an inspection target object with radiation along an irradiation direction into a position in structure information representing a three-dimensional structure of the inspection target object, the conversion parameter including the irradiation direction in the structure information; and convert a position of a discontinuity of the inspection target object in the inspection image into a position of the discontinuity in the structure information by using the conversion parameter.
An information processing apparatus according to a second aspect is the information processing apparatus according to the first aspect, in which the processor is configured to derive a depth of the discontinuity along the irradiation direction, based on the inspection image; and derive, as a position of the discontinuity in the structure information, a position separated from a position corresponding to a surface of the inspection target object in the structure information by the derived depth along the irradiation direction.
An information processing apparatus according to a third aspect is the information processing apparatus according to the second aspect, in which the processor is configured to derive the depth of the discontinuity, based on a degree of blur of a region of the discontinuity in the inspection image.
An information processing apparatus according to a fourth aspect is the information processing apparatus according to the first aspect, in which a plurality of the inspection images are captured from a plurality of different irradiation directions, and the processor is configured to convert a position of the discontinuity in each of the plurality of the inspection images into a position in the structure information in a plane for which each of the irradiation directions is set as a viewpoint direction, by using the conversion parameter; and derive, as a position of the discontinuity in the structure information, an intersection point of a plurality of straight lines each extending along one of the irradiation directions and each passing through the position of the discontinuity after conversion for one of the plurality of the inspection images.
An information processing apparatus according to a fifth aspect is the information processing apparatus according to any one of the first aspect to the fourth aspect, in which the processor is configured to integrate the structure information and a position of the discontinuity in the structure information into data.
An information processing method according to a sixth aspect is an information processing method in which a processor included in an information processing apparatus executes processing including acquiring a conversion parameter for converting a position in an inspection image captured by irradiating an inspection target object with radiation along an irradiation direction into a position in structure information representing a three-dimensional structure of the inspection target object, the conversion parameter including the irradiation direction in the structure information; and converting a position of a discontinuity of the inspection target object in the inspection image into a position of the discontinuity in the structure information by using the conversion parameter.
An information processing program according to a seventh aspect is an information processing program for causing a processor included in an information processing apparatus to execute processing including acquiring a conversion parameter for converting a position in an inspection image captured by irradiating an inspection target object with radiation along an irradiation direction into a position in structure information representing a three-dimensional structure of the inspection target object, the conversion parameter including the irradiation direction in the structure information; and converting a position of a discontinuity of the inspection target object in the inspection image into a position of the discontinuity in the structure information by using the conversion parameter.
According to the present disclosure, the inspection efficiency of non-destructive inspection can be improved.
Exemplary embodiments according to the technique of the present disclosure will be described in detail based on the following figures, wherein:
Hereinafter, exemplary embodiments for implementing the technology of the present disclosure will be described in detail with reference to the drawings.
First, a configuration of a radiographic image capturing apparatus 1 according to the present embodiment will be described with reference to
The radiation source 12 irradiates an inspection target object O with radiation R such as X-rays. The radiation source 12 according to the present embodiment emits the radiation R in a cone-beam shape. A direction from the radiation source 12 to one point on the radiation detector 14 at which the radiation R transmitted through the inspection target object O arrives is referred to as an “irradiation direction D”. In the present embodiment, a case where a direction from the radiation source 12 to one point on the radiation detector 14 at which the radiation R transmitted through the center of the inspection target object O arrives is set as the irradiation direction D will be described as an example.
The radiation detector 14 includes a scintillator, as an example of a light emitting layer that emits light when irradiated with the radiation R, and a thin film transistor (TFT) substrate. The scintillator and the TFT substrate are laminated together. The TFT substrate includes a plurality of pixels arranged in a two-dimensional manner, and each pixel includes a field-effect thin film transistor and a sensor unit as an example of a conversion element that generates more charge as the amount of emitted radiation increases. The sensor unit absorbs the light emitted by the scintillator to generate charge, and accumulates the generated charge. The field-effect thin film transistor converts the charge accumulated in the sensor unit into an electrical signal and outputs the electrical signal. With this configuration, the radiation detector 14 generates a two-dimensional radiographic image corresponding to the radiation R emitted from the radiation source 12 to the inspection target object O, and outputs the generated radiographic image to the information processing apparatus 10.
In this manner, in the radiographic image capturing apparatus 1, a radiographic image captured by irradiating the inspection target object O with the radiation R from the radiation source 12 along the irradiation direction D is stored in the information processing apparatus 10. A radiographic image captured by irradiating the inspection target object O with the radiation R along the irradiation direction D is hereinafter referred to as an “inspection image”.
Next, a hardware configuration of the information processing apparatus 10 according to the present embodiment will be described with reference to
The storage unit 22 is implemented by a hard disk drive (HDD), a solid state drive (SSD), a flash memory, or the like. The storage unit 22 serving as a storage medium stores an information processing program 30. The CPU 20 reads the information processing program 30 from the storage unit 22, loads the information processing program 30 into the memory 21, and executes the loaded information processing program 30.
The storage unit 22 further stores an inspection image, structure information 32, and a trained model 34. The structure information 32 includes information representing a three-dimensional structure of the inspection target object O. As illustrated in
As illustrated in
Next, a functional configuration of the information processing apparatus 10 according to the present embodiment will be described with reference to
The detection unit 40 detects, based on the inspection image and the trained model 34 stored in the storage unit 22, a discontinuity of the inspection target object O in the inspection image. Specifically, the detection unit 40 inputs the inspection image to the trained model 34. The trained model 34 outputs information on the discontinuity of the inspection target object O corresponding to the input inspection image. Accordingly, the detection unit 40 detects the discontinuity of the inspection target object O in the inspection image.
The detection unit 40 may detect the discontinuity of the inspection target object O in the inspection image by a known detection algorithm. The discontinuity of the inspection target object O in the inspection image may be designated by the user via the input device 24. In this case, the detection unit 40 detects the discontinuity designated by the user.
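As a non-limiting sketch of the detection step (the model interface, the dictionary keys, and the 0.5 score threshold are assumptions for illustration and are not specified by the present disclosure), the processing of the detection unit 40 can be expressed as follows:

```python
def detect_discontinuities(image, model, threshold=0.5):
    """Run a trained model on an inspection image and keep candidate
    discontinuities whose confidence score reaches the threshold.

    The model interface (a callable returning dicts with "score" and
    "box" keys) is a hypothetical stand-in for the trained model 34.
    """
    results = model(image)
    return [r for r in results if r["score"] >= threshold]
```

A discontinuity designated by the user via the input device 24 could bypass this step and be passed through directly.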
The first derivation unit 42 derives a conversion parameter that is for converting a position in the two-dimensional inspection image into a position in the three-dimensional structure information 32 and that includes the irradiation direction D in the structure information 32. A specific example of a process of deriving the irradiation direction D in the structure information 32 by the first derivation unit 42 will be described with reference to
As illustrated on the right side of
The first derivation unit 42 generates respective simulation images for a plurality of different viewpoint directions. Further, the first derivation unit 42 derives a level of similarity between each of the generated simulation images and the inspection image. Specifically, for example, the first derivation unit 42 derives a level of similarity between the silhouette of the inspection target object O in the simulation image and the silhouette of the inspection target object O in the inspection image. The first derivation unit 42 may extract a contour line of the inspection target object O in the simulation image and a contour line of the inspection target object O in the inspection image and derive a level of similarity between the outer shapes of the inspection target object O formed by the extracted contour lines.
Then, the first derivation unit 42 sets the viewpoint direction for the simulation image having the highest level of similarity to the inspection image as the irradiation direction D in the structure information 32.
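The viewpoint search performed by the first derivation unit 42 can be sketched as follows, assuming the simulation images and the inspection image have already been reduced to binary silhouette masks; intersection over union is one possible similarity measure, and the disclosure does not fix a particular one:

```python
import numpy as np

def silhouette_iou(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two binary silhouette masks
    (intersection over union)."""
    a, b = a.astype(bool), b.astype(bool)
    union = np.logical_or(a, b).sum()
    if union == 0:
        return 0.0
    return float(np.logical_and(a, b).sum() / union)

def best_viewpoint(inspection_sil, candidate_sils, candidate_dirs):
    """Return the candidate viewpoint direction whose simulated
    silhouette best matches the inspection-image silhouette."""
    scores = [silhouette_iou(inspection_sil, s) for s in candidate_sils]
    return candidate_dirs[int(np.argmax(scores))]
```

In practice, the candidate directions would sample the viewing sphere around the virtual inspection target object O, and each candidate silhouette would be rendered from the structure information 32.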
The second derivation unit 44 derives the three-dimensional coordinates of the discontinuity in the structure information 32 by converting the position of the discontinuity of the inspection target object O in the inspection image detected by the detection unit 40 into the position of the discontinuity in the structure information 32 by using the irradiation direction D derived by the first derivation unit 42. The second derivation unit 44 may acquire the irradiation direction D derived by an external device, instead of the first derivation unit 42, via the network I/F 25.
Specifically, first, the second derivation unit 44 converts the position of the discontinuity of the inspection target object O in the inspection image into the position in the structure information 32 in a plane for which the irradiation direction D is set as the viewpoint direction. For example, the second derivation unit 44 can perform the conversion described above by a process of matching the scale of the outer shape of the inspection target object O in the inspection image with the scale of the outer shape of the virtual inspection target object O viewed from the irradiation direction D in the structure information 32.
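The scale-matching conversion can be sketched as a bounding-box alignment; this is a simplification that assumes the two outer shapes are related by an axis-aligned scale and offset, whereas the actual conversion parameter may also account for rotation and distortion:

```python
def image_to_plane(pt_px, img_bbox, model_bbox):
    """Map a 2D point in the inspection image into the viewing plane of
    the structure information by aligning the object's bounding boxes.

    img_bbox:   outline of the object in the image, (x0, y0, x1, y1) in pixels
    model_bbox: outline of the virtual object in the viewing plane
    """
    ix0, iy0, ix1, iy1 = img_bbox
    mx0, my0, mx1, my1 = model_bbox
    sx = (mx1 - mx0) / (ix1 - ix0)   # horizontal scale factor
    sy = (my1 - my0) / (iy1 - iy0)   # vertical scale factor
    x, y = pt_px
    return (mx0 + (x - ix0) * sx, my0 + (y - iy0) * sy)
```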
In this way, the position of the discontinuity of the inspection target object O in a plane in a case where the virtual inspection target object O is viewed from the irradiation direction D in the structure information 32 can be determined. Accordingly, as illustrated in
The second derivation unit 44 according to the present embodiment derives the depth of the discontinuity along the irradiation direction D on the basis of one inspection image. As illustrated in
The above processing allows the second derivation unit 44 to convert the position of the discontinuity of the inspection target object O in the two-dimensional inspection image into the three-dimensional coordinates in the structure information 32.
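The offset along the irradiation direction D can be sketched as follows; the linear blur-to-depth calibration is a hypothetical model for illustration, since the disclosure does not fix how the degree of blur of the discontinuity region maps to depth:

```python
import numpy as np

def depth_from_blur(blur_radius_px, slope, intercept=0.0):
    """Hypothetical calibration: geometric blur grows with distance from
    the detector, so a fitted linear model maps blur radius to depth."""
    return slope * blur_radius_px + intercept

def position_from_depth(surface_point, irradiation_dir, depth):
    """Point separated from the surface position by `depth` along the
    (normalized) irradiation direction D."""
    d = np.asarray(irradiation_dir, dtype=float)
    d /= np.linalg.norm(d)
    return np.asarray(surface_point, dtype=float) + depth * d
```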
The second derivation unit 44 may convert the position of the discontinuity of the inspection target object O in the inspection image into the position of the discontinuity in the structure information 32 on the basis of a plurality of inspection images captured from a plurality of different irradiation directions D. In this case, the second derivation unit 44 converts the position of the discontinuity in each of the plurality of inspection images into a position in the structure information 32 in a plane for which each of the irradiation directions D is set as the viewpoint direction, by using the irradiation directions D derived by the first derivation unit 42. Then, as illustrated in
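For the multi-view case, the intersection point of the straight lines can be computed as the least-squares point closest to all the lines, which remains well defined even when measurement error prevents the lines from intersecting exactly; this least-squares formulation is one possible implementation, not mandated by the disclosure:

```python
import numpy as np

def ray_intersection(points, dirs):
    """Least-squares point closest to several 3D lines, each defined by a
    converted discontinuity position points[i] and a direction dirs[i]."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, d in zip(points, dirs):
        d = np.asarray(d, dtype=float)
        d /= np.linalg.norm(d)
        M = np.eye(3) - np.outer(d, d)   # projector onto the plane normal to d
        A += M
        b += M @ np.asarray(p, dtype=float)
    return np.linalg.solve(A, b)
```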
The generation unit 46 generates data in which the structure information 32 and the position of the discontinuity in the structure information 32 derived by the second derivation unit 44 are integrated. For example, the generation unit 46 generates data in which the position of the discontinuity in the structure information 32 derived by the second derivation unit 44 is included in the property of a data file representing the structure information 32.
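The integration performed by the generation unit 46 can be sketched as attaching the derived coordinates to the structure data as a property; the record layout and the file name below are assumptions for illustration, since the disclosure does not specify the data file format:

```python
def integrate_discontinuity(structure_file, discontinuity_xyz):
    """Build a record that ties the structure information to the derived
    three-dimensional position of the discontinuity.

    `structure_file` is a hypothetical file name; the returned record could
    be serialized (e.g. with json.dump) or embedded in the property of the
    data file representing the structure information 32.
    """
    return {
        "structure_file": structure_file,
        "discontinuities": [
            {"position": [float(v) for v in discontinuity_xyz]}
        ],
    }
```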
The display control unit 48 performs control to display the virtual inspection target object O and an object representing the discontinuity on the display 23 on the basis of the data generated by the generation unit 46. The object representing the discontinuity may be, for example, a mark such as a circle or an image of the region of the discontinuity in the inspection image.
Next, the operation of the information processing apparatus 10 according to the present embodiment will be described with reference to
In step S10 of
In step S14, as described above, the second derivation unit 44 converts the position of the discontinuity of the inspection target object O in the inspection image detected in step S10 into the position of the discontinuity in the structure information 32 by using the irradiation direction D derived in step S12.
In step S16, the generation unit 46 generates data in which the structure information 32 and the position of the discontinuity in the structure information 32 derived in step S14 are integrated. In step S18, the display control unit 48 performs control to display the virtual inspection target object O and an object representing the discontinuity on the display 23 on the basis of the data generated in step S16. When the processing of step S18 is finished, the discontinuity position conversion process ends.
As described above, according to the present embodiment, the position of the discontinuity of the inspection target object O can be determined as a three-dimensional position on the basis of a simple X-ray image rather than a tomographic image such as a CT image. This allows the user to grasp the position of the discontinuity of the inspection target object O as a three-dimensional position. According to the present embodiment, therefore, the inspection efficiency of non-destructive inspection can be improved.
In the embodiment described above, as illustrated in
In the embodiment described above, furthermore, the conversion parameter may include correction data for correcting a distortion of the inspection image, in addition to the irradiation direction D.
In the embodiment described above, furthermore, the hardware structures of processing units that execute various processes, such as the functional units of the information processing apparatus 10, may be implemented using various processors described below. The various processors include a CPU that is a general-purpose processor configured to execute software (a program) to function as various processing units, a programmable logic device (PLD) that is a processor whose circuit configuration can be changed after manufacturing, such as a field programmable gate array (FPGA), a dedicated electric circuit that is a processor having a circuit configuration designed exclusively for executing specific processing, such as an application specific integrated circuit (ASIC), and so on.
One processing unit may be constituted by one of these various processors, or may be constituted by a combination of two or more processors of the same type or different types (e.g., a combination of a plurality of FPGAs or a combination of a CPU and an FPGA). Alternatively, a plurality of processing units may be configured by one processor.
Examples of configuring a plurality of processing units by one processor include, first, a form in which, as typified by a computer such as a client and a server, one processor is configured by a combination of one or more CPUs and software and the processor functions as a plurality of processing units. The examples include, second, a form in which, as typified by a system on chip (SoC) or the like, a processor is used that implements the functions of the entire system including a plurality of processing units by one integrated circuit (IC) chip. As described above, the various processing units are configured by using one or more of the various processors described above as a hardware structure.
More specifically, the hardware structure of these various processors may be an electric circuit (circuitry) in which circuit elements such as semiconductor elements are combined.
In the embodiment described above, the information processing program 30 is stored (installed) in the storage unit 22 in advance; however, the present disclosure is not limited to this form. The information processing program 30 may be provided in a form recorded on a recording medium such as a compact disc read only memory (CD-ROM), a digital versatile disc read only memory (DVD-ROM), or a universal serial bus (USB) memory. Alternatively, the information processing program 30 may be downloaded from an external device via a network.
The disclosure of JP2022-099870 filed on Jun. 21, 2022 is incorporated herein by reference in its entirety. All publications, patent applications, and technical standards mentioned herein are incorporated herein by reference to the same extent as if each individual publication, patent application, or technical standard was specifically and individually indicated to be incorporated by reference.
Number | Date | Country | Kind
---|---|---|---
2022-099870 | Jun 2022 | JP | national
This application is a continuation application of International Application No. PCT/JP2023/014617, filed Apr. 10, 2023, the disclosure of which is incorporated herein by reference in its entirety. Further, this application claims priority from Japanese Patent Application No. 2022-099870, filed Jun. 21, 2022, the disclosure of which is incorporated herein by reference in its entirety.
 | Number | Date | Country
---|---|---|---
Parent | PCT/JP2023/014617 | Apr 2023 | WO
Child | 18963487 | | US