The present invention relates to an information processing apparatus, an information processing method, and an information processing program for processing discontinuity information of a subject.
In various technical fields, an inspector visually determines discontinuities or defects of a subject, and in a case where defects are found, the subject is classified as a defective product. In such an inspection, an image of the subject may be used. In addition, a technique of acquiring defect information or discontinuity information from the image of the subject through an image processing apparatus or the like is also known. For example, JP4250898B describes extracting a defect candidate from an image by using first and second threshold values.
However, it is burdensome for the inspector to individually determine defects or discontinuities, and simply extracting defects as in JP4250898B makes it difficult to utilize or analyze information on discontinuities that do not meet the criteria for defects, resulting in low utility of the extracted information.
The present invention has been made in view of such circumstances, and an object of the present invention is to provide an information processing apparatus, an information processing method, and an information processing program capable of recording useful discontinuity information of a subject.
In order to achieve the above-described object, according to a first aspect of the present invention, there is provided an information processing apparatus comprising: a processor, in which the processor is configured to: acquire first discontinuity information obtained by analyzing an image of a subject with a first criterion, the first discontinuity information including information indicating a feature of a discontinuity; acquire second discontinuity information obtained by analyzing the image of the subject with a second criterion that is stricter than the first criterion, the second discontinuity information including information indicating the feature of the discontinuity; and record the second discontinuity information in association with the first discontinuity information.
According to the first aspect, since the second discontinuity information obtained by analyzing the image of the subject with the second criterion that is stricter than the first criterion is recorded in association with the first discontinuity information, not only the first discontinuity information but also the second discontinuity information is recorded, and a relationship between the first discontinuity information and the second discontinuity information can be understood. Such first and second discontinuity information can be utilized for the analysis, prediction, and the like of the discontinuity of the subject. As described above, according to the first aspect, it is possible to record useful discontinuity information of the subject.
The processor can record the first discontinuity information and the second discontinuity information on various recording devices (a magneto-optical recording device or a semiconductor recording device). Those recording devices may use non-transitory and tangible recording media.
In the first aspect and each of aspects to be described below, for example, a discontinuous portion in an inspection target object (subject) such as a product can be defined as a “discontinuity”. In addition, among the “discontinuities”, a discontinuity that exceeds a determination criterion defined in a standard, a specification, or the like, resulting in the product (subject) or the like being deemed defective in tests or inspections, can be defined as a “defect”. In this case, the defect refers to a discontinuity that has been determined as a “defect” either by an interpreter or automatically among the detected “discontinuities”. For example, a discontinuity whose major axis is equal to or larger than a predetermined value can be determined as a “defect”.
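For illustration only, the following is a minimal Python sketch (not the claimed implementation) of the first aspect: the same measured discontinuities are screened with a lenient first criterion and a stricter second criterion, and the two results are recorded in association. The helper names and the list-of-dictionaries format are assumptions; the 4 mm defect-level size follows an example given later in this description, and the 1 mm value for the second criterion is illustrative.

```python
def screen(discontinuities, min_size_mm):
    """Keep discontinuities whose feature (here, size) meets the criterion."""
    return [d for d in discontinuities if d["size_mm"] >= min_size_mm]

def record_associated(discontinuities, storage):
    first_info = screen(discontinuities, min_size_mm=4.0)   # first criterion (defect level)
    second_info = screen(discontinuities, min_size_mm=1.0)  # second, stricter criterion
    # Recording the second information in association with the first keeps
    # their relationship available for later analysis or prediction.
    storage.append({"first": first_info, "second": second_info})

measured = [{"id": 1, "size_mm": 5.2}, {"id": 2, "size_mm": 1.8}]
log = []
record_associated(measured, log)
```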
According to a second aspect, in the information processing apparatus described in the first aspect, the first discontinuity information and the second discontinuity information each have information regarding an arrangement of a plurality of the subjects, and the processor is configured to associate the first discontinuity information and the second discontinuity information with each other based on the information regarding the arrangement. The “information regarding the arrangement” may be information based on which the order of the subjects can be determined.
According to a third aspect, in the information processing apparatus described in the second aspect, the information regarding the arrangement has a component related to time. The “component related to time” is, for example, a specific period, a manufacturing time, an inspection time, or an image capturing time, but is not limited thereto.
According to a fourth aspect, in the information processing apparatus described in the second or third aspect, the processor is configured to obtain a range of the first discontinuity information and/or the second discontinuity information to be recorded, based on the information regarding the arrangement.
According to a fifth aspect, in the information processing apparatus described in any one of the first to fourth aspects, the processor is configured to calculate a recording condition based on the first discontinuity information, and acquire and/or record the second discontinuity information based on the recording condition. In the fifth aspect, the processor can calculate a condition for at least one of a specific period, a manufacturing number, a lot number, an inspection time, an inspection number, or an image capturing time as the recording condition and can record the discontinuity information under the calculated recording condition.
According to a sixth aspect, in the information processing apparatus described in the fifth aspect, the processor is configured to associate the first discontinuity information and the second discontinuity information with each other based on the recording condition.
According to a seventh aspect, in the information processing apparatus described in any one of the first to sixth aspects, the image is a radiation transmission image. The radiation transmission image can be used for a non-destructive inspection of the subject.
According to an eighth aspect, in the information processing apparatus described in any one of the first to seventh aspects, the first criterion is a criterion for at least one of a type, a number, a position in the subject, a size, a shape, a presence density, or a distance from another discontinuity of the discontinuity, and the processor is configured to perform the recording by using information indicating at least one of the type, the number, the position in the subject, the size, the shape, the presence density, or the distance from the other discontinuity of the discontinuity as the feature of the discontinuity. The eighth aspect defines a specific aspect of the “first criterion” for the “discontinuity”. A value of the criterion may differ depending on a type of the discontinuity.
According to a ninth aspect, in the information processing apparatus described in any one of the first to eighth aspects, the second criterion is a criterion for at least one of a type, a number, a position in the subject, a size, a shape, a presence density, or a distance from another discontinuity of the discontinuity, and the processor is configured to perform the recording by using information indicating at least one of the type, the number, the position in the subject, the size, the shape, the presence density, or the distance from the other discontinuity of the discontinuity as the feature of the discontinuity. The ninth aspect defines a specific aspect of the “second criterion” for the “discontinuity”, and the value of the criterion may differ depending on the type of the discontinuity.
According to a tenth aspect, in the information processing apparatus described in any one of the first to ninth aspects, the processor is configured to acquire the second discontinuity information based on the second criterion for a discontinuity in the first discontinuity information and for a discontinuity that has a lower influence on quality and/or performance of the subject than the discontinuity in the first discontinuity information. The tenth aspect specifically defines that the “discontinuity” which is an acquisition target of the second discontinuity information is determined by the “second criterion that is stricter than the first criterion” defined in the first aspect.
According to an eleventh aspect, in the information processing apparatus described in any one of the first to tenth aspects, the first discontinuity information and the second discontinuity information include information related to an occurrence timing of the discontinuity, and the processor is configured to acquire at least one of a manufacturing time, a manufacturing number, a lot number, an inspection time, an inspection number, or an image capturing time of the subject as the information related to the occurrence timing of the discontinuity. As defined in the eleventh aspect, the “information related to the occurrence timing of the discontinuity” may be information acquired in time series, instead of the time itself.
According to a twelfth aspect, in the information processing apparatus described in any one of the first to eleventh aspects, the processor is configured to perform the recording for a predetermined region in the subject.
According to a thirteenth aspect, in the information processing apparatus described in any one of the first to twelfth aspects, the processor is configured to perform the recording for a period in which an occurrence frequency of the discontinuity is higher than a determination criterion value. Performing the recording for the period in which the occurrence frequency of the discontinuity is higher than the determination criterion value is one aspect of the recording according to the recording condition.
According to a fourteenth aspect, in the information processing apparatus described in any one of the first to twelfth aspects, the processor is configured to perform the recording for a period in which an occurrence frequency of the discontinuity is equal to or lower than a determination criterion value. Performing the recording for the period in which the occurrence frequency of the discontinuity is equal to or lower than the determination criterion value is one aspect of the recording according to the recording condition.
According to a fifteenth aspect, in the information processing apparatus described in any one of the first to twelfth aspects, the processor is configured to perform the recording for a period in which an occurrence frequency of the discontinuity falls within a determination criterion range. Performing the recording for the period in which the occurrence frequency of the discontinuity falls within the determination criterion range is one aspect of the recording according to the recording condition.
According to a sixteenth aspect, in the information processing apparatus described in any one of the first to fifteenth aspects, the processor is configured to perform the recording for a first period in which the first discontinuity information is present. Performing the recording for the first period in which the first discontinuity information is present is one aspect of the recording according to the recording condition.
According to a seventeenth aspect, in the information processing apparatus described in the sixteenth aspect, the processor is configured to perform the recording for the first period and a second period preceding the first period. Performing the recording for the first period and the second period preceding the first period is one aspect of the recording according to the recording condition.
According to an eighteenth aspect, in the information processing apparatus described in any one of the first to seventeenth aspects, an image analyzer that is constructed by using a machine learning algorithm and performs the analysis on the image of the subject is further provided, and the processor is configured to acquire the first discontinuity information and/or the second discontinuity information by using the image analyzer. The image analyzer can be constructed by, for example, a neural network.
According to a nineteenth aspect, in the information processing apparatus described in the eighteenth aspect, the image analyzer performs one or more of detection, classification, and measurement as the analysis, and the processor is configured to acquire the first discontinuity information and/or the second discontinuity information based on a result of the analysis.
According to a twentieth aspect, in the information processing apparatus described in the eighteenth or nineteenth aspect, the processor is configured to correct a result of the analysis according to an operation of a user. In some cases, the analysis result of the image analyzer may include an error, an inappropriate result, unnecessary information, or the like, but according to the twentieth aspect, the user can correct the analysis result and acquire appropriate information.
According to a twenty-first aspect, in the information processing apparatus described in any one of the first to twentieth aspects, the processor is configured to output the recorded information and/or information extracted from the recorded information to a recording device and/or a display device. The extracted information may be displayed by being associated with or superimposed on the image of the subject, a design drawing, or the like.
According to a twenty-second aspect, in the information processing apparatus described in the twenty-first aspect, the processor is configured to extract information from the recorded information under a condition designated by an operation of a user, and output the extracted information. According to the twenty-second aspect, the user can extract and refer to the information according to a desired condition.
In order to achieve the above-described object, according to a twenty-third aspect of the present invention, there is provided an information processing method executed by an information processing apparatus including a processor, the information processing method comprising: causing the processor to execute: a first information acquisition step of acquiring first discontinuity information obtained by analyzing an image of a subject with a first criterion, the first discontinuity information including information indicating a feature of a discontinuity; a second information acquisition step of acquiring second discontinuity information obtained by analyzing the image of the subject with a second criterion that is stricter than the first criterion, the second discontinuity information including information indicating the feature of the discontinuity; and a recording step of recording the second discontinuity information in association with the first discontinuity information.
According to the twenty-third aspect, useful discontinuity information of the subject can be recorded as in the first aspect. The information processing method according to the twenty-third aspect may have the same configurations as those of the second to twenty-second aspects.
In order to achieve the above-described object, according to a twenty-fourth aspect of the present invention, there is provided an information processing program for causing an information processing apparatus including a processor to execute an information processing method, the information processing program causing the processor to execute, using the information processing method: a first information acquisition step of acquiring first discontinuity information obtained by analyzing an image of a subject with a first criterion, the first discontinuity information including information indicating a feature of a discontinuity; a second information acquisition step of acquiring second discontinuity information obtained by analyzing the image of the subject with a second criterion that is stricter than the first criterion, the second discontinuity information including information indicating the feature of the discontinuity; and a recording step of recording the second discontinuity information in association with the first discontinuity information.
According to the twenty-fourth aspect, useful discontinuity information of the subject can be recorded as in the first and twenty-third aspects. The information processing program according to the twenty-fourth aspect may have the same configurations as those of the second to twenty-second aspects. In addition, a non-transitory and tangible recording medium having a computer-readable code of the information processing program of these aspects recorded thereon can also be employed as an aspect of the present invention.
As described above, with the information processing apparatus, the information processing method, and the information processing program of the aspects of the present invention, it is possible to record useful discontinuity information of the subject.
The inspection apparatus 10 (information processing apparatus) according to the present embodiment is an apparatus that acquires discontinuity information from an image obtained by imaging an inspection target (inspection target object, subject), that records and outputs (displays or the like) the acquired information under a designated condition, and that assists an interpreter in the diagnosis, analysis, and the like of a discontinuity of the inspection target object. Here, the “inspection target object” refers to an object to be inspected, and various articles are assumed. For example, in a case where the inspection apparatus 10 is used for industrial inspection, industrial parts or industrial products that require high precision in design and manufacturing can be assumed as the inspection target object (subject). However, the application range of the present invention is not limited to these inspection target objects (for other examples of the inspection target object, refer to the section of “Other Examples of Inspection Target Object (Subject)” to be described below).
In addition, a “light-reception image” is an image that is obtained by irradiating the inspection target object with radiation, visible light, infrared light, or the like and that is created based on transmitted light or reflected light of the inspection target object.
As shown in
The control unit 12 (processor) includes a processor (for example, a central processing unit (CPU)) that controls an operation of each unit of the inspection apparatus 10. The control unit 12 receives operation input from the interpreter via the operation unit 14, and transmits a control signal corresponding to the operation input to each unit of the inspection apparatus 10 to control the operation of each unit including the processing unit 22. In addition, the control unit 12 causes the display unit 18 to display a display image, the discontinuity information, product information, inspection information, and the like. As a result, the interpreter can inspect an inspection target object OBJ by interpreting the image displayed on the display unit 18.
In addition, the control unit 12 includes a memory (non-transitory and tangible recording medium) (not shown) configured with a flash memory, a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), various semiconductor storage elements, or the like. A computer-readable code of an information processing program according to the embodiment of the present invention is recorded on this memory, and the control unit 12 refers to this memory to execute an information processing method according to the embodiment of the present invention.
The operation unit 14 is an input device that receives the operation input from the interpreter, and includes a keyboard for character input and a pointing device (a mouse, a trackball, or the like) for operating a pointer, an icon, or the like displayed on the display unit 18 (not shown). A touch panel can also be provided on a surface of the display unit 18 as the operation unit 14, instead of or in addition to the means listed above. A user can perform an operation necessary to execute the information processing method and the information processing program according to the embodiment of the present invention via these devices. This operation includes, for example, operations such as setting a criterion of the discontinuity (the first criterion or the second criterion), correcting the discontinuity information, and setting conditions for extracting, displaying, or otherwise handling the discontinuity information.
The I/F 16 is means for performing communication with an external device via the network NW. As a method of transmitting and receiving data between the inspection apparatus 10 and the external device, wired communication (for example, a local area network (LAN), a wide area network (WAN), Internet connection, or the like) or wireless communication (for example, LAN, WAN, Internet connection, or the like) can be used. The I/F 16 functions as an image acquisition unit and acquires a light-reception image that is obtained by irradiating the inspection target object with light rays (visible light, infrared light, or the like) or radiation and that is created based on reflected light or transmitted light from the inspection target object.
The inspection apparatus 10 can receive, via the I/F 16, input of inspection target object imaging data D100 including captured image data of the inspection target object OBJ (subject) imaged by the imaging system 100. Here, the captured image data is image data constituting the light-reception image. In addition, the method of inputting the inspection target object imaging data D100 from the imaging system 100 to the inspection apparatus 10 is not limited to the communication via the network NW listed above. For example, the inspection apparatus 10 and the imaging system 100 may be connected to each other by a universal serial bus (USB) cable, Bluetooth (registered trademark), infrared communication, or the like, or the inspection target object imaging data D100 may be stored in a memory card that is attachable to, detachable from, and readable by the inspection apparatus 10, and the image data may be input to the inspection apparatus 10 via the memory card.
Further, the inspection apparatus 10 can communicate with the product database (product DB 200) via the network NW. Product data D200 for each industrial product that can be an inspection target is stored in the product DB 200. The control unit 12 can retrieve and read out inspection target object specification information for specifying the inspection target object from the inspection target object imaging data of the inspection target object OBJ acquired from the imaging system 100, and can acquire the product data D200 corresponding to the readout inspection target object specification information from the product DB 200. By using this product data D200, defect information and discontinuity information corresponding to the type or the feature of the inspection target object OBJ can be acquired.
The product DB 200 may be installed on the network NW as in the present embodiment so that the product data D200 is updatable by the manufacturer or the like, or the product DB 200 may be provided in the inspection apparatus 10.
The display unit 18 (display device) is a device that is controlled by the control unit 12 and that is used to display the light-reception image (image), the inspection information, the product information, the discontinuity information, and the like. As the display unit 18, for example, a liquid crystal monitor can be used, and a screen of a device such as a smartphone or a tablet terminal may also be used as the display unit 18.
The buffer memory 20 is configured with, for example, a storage device such as a random access memory (RAM) and is used as a work area of the control unit 12 or an area for temporarily storing the image data output to the display unit 18.
The extraction section 220 functions as an image processing unit and performs image processing (for example, color conversion processing, monochrome conversion processing, edge enhancement processing, conversion processing into three-dimensional data, and the like) on the captured image data to detect changes in color, brightness value, or the like of the inspection target object OBJ, thereby detecting discontinuities (for example, scratches, fissuring (cracks), wear, rust, and the like) of the inspection target object OBJ. The extraction section 220 analyzes the image with the first criterion and the second criterion, which will be described below, to detect the defect and the discontinuity based on, for example, a color change, an edge detection result, and the like. As a result, the position and the shape of the discontinuity are specified.
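As a hedged illustration of this kind of extraction (the embodiment does not name a specific library or algorithm), the following Python sketch uses OpenCV edge detection and contour extraction to obtain discontinuity candidates, each with a position and an extent; the thresholds and the record format are assumptions.

```python
import cv2
import numpy as np

def extract_candidates(gray_image: np.ndarray):
    """Return discontinuity candidates as positions/extents in pixels."""
    # Edge detection on the light-reception image; thresholds are illustrative.
    edges = cv2.Canny(gray_image, threshold1=50, threshold2=150)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    candidates = []
    for c in contours:
        x, y, w, h = cv2.boundingRect(c)  # position and extent of the candidate
        candidates.append({"x": x, "y": y, "w": w, "h": h,
                           "area_px": float(cv2.contourArea(c))})
    return candidates
```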
The extraction section 220 may detect the discontinuity by, for example, incorporating, into the product data D200, product image data including an image of a product (new product) in which the same defect of the inspection target object OBJ is not detected, and comparing the product image data with the captured image data of the inspection target object OBJ.
In the present invention, for example, a discontinuous portion in the inspection target object (subject) such as a product can be defined as a “discontinuity (discontinuity)”. In addition, among the “discontinuities”, a discontinuity that exceeds a determination criterion defined in a standard, a specification, or the like, resulting in the product (subject) or the like being deemed defective in tests or inspections, can be defined as the “defect”. In this case, the defect refers to a discontinuity that has been determined as the “defect” either by the interpreter or automatically among the detected “discontinuities”. For example, a discontinuity having a predetermined major axis can be determined as the “defect”. Therefore, the same processing (including analysis such as detection, classification, and measurement) can be performed on the “defect” and the “discontinuity”.
The measurement section 222 measures dimensions of each part of the inspection target object OBJ based on the captured image data of the inspection target object OBJ and imaging condition data. The measurement section 222 measures a size of the inspection target object OBJ based on, for example, the imaging condition data such as a distance between a camera during imaging and the inspection target object OBJ, a focal length, and a zoom magnification, and a size of the inspection target object OBJ in the captured image data (the size of the inspection target object OBJ may be acquired via the product data D200). The measurement section 222 calculates the size of the discontinuity (for example, a maximum dimension, a minimum dimension, a depth of the fissuring, an angle, and the like) by using the measured size of the inspection target object OBJ, the size of the inspection target object OBJ in the captured image data, and the size of the discontinuity in the captured image data. The measurement section 222 measures at least one of the type, the number, the position in the inspection target object OBJ (subject), the size, the shape, the presence density, or the distance from another discontinuity of the discontinuity.
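The scale calculation can be illustrated as follows. This is a sketch under the assumption of a pinhole camera model: the embodiment names the inputs (camera distance, focal length, and the like) but not a specific formula, and all numeric values are illustrative.

```python
# Object-side footprint of one pixel under a pinhole camera model:
# scale = (distance / focal length) * sensor pixel pitch.
def mm_per_pixel(distance_mm: float, focal_length_mm: float,
                 pixel_pitch_mm: float) -> float:
    return (distance_mm / focal_length_mm) * pixel_pitch_mm

scale = mm_per_pixel(distance_mm=500.0, focal_length_mm=50.0,
                     pixel_pitch_mm=0.01)  # -> 0.1 mm per pixel
defect_size_mm = 42 * scale                # a 42-pixel discontinuity is about 4.2 mm
```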
The measurement section 222 may measure a thickness of the inspection target object OBJ for each position by using the dimensions of each part of the inspection target object OBJ and information indicating, for example, the reflectivity and the transmittance (transmission attenuation) of irradiation light during imaging of the inspection target object OBJ. The thickness may be measured by the imaging system 100 during imaging and incorporated into the inspection target object imaging data D100.

[Example of Measurement by Measurement Section]
An example of the measurement performed by the measurement section 222 will be described.
The classification section 224 assigns a discontinuity classification to the extracted discontinuity based on classification information stored in the storage unit 24. For example, the classification section 224 calculates a degree of similarity between the discontinuity extracted by the processing unit 22 and at least one of a discontinuity image corresponding to a discontinuity extracted in the past or information indicating the feature of the discontinuity image, and assigns the discontinuity classification based on the degree of similarity. The classification section 224 calculates the degree of similarity by a known method. For example, the classification section 224 can calculate the degree of similarity by performing block matching between a discontinuity image extracted by the processing unit 22 and the discontinuity image extracted in the past. In addition, for example, the classification section 224 can calculate a degree of similarity between an extracted discontinuity candidate image and the discontinuity image (or the information indicating the feature of the discontinuity image) stored in the storage unit 24, and can assign, as the discontinuity classification of the discontinuity candidate image, the classification result assigned to the discontinuity image having the highest degree of similarity.
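The following Python sketch illustrates one such known method, assuming normalized cross-correlation as the block-matching score and same-sized image patches; it is an illustration, not the classification section's actual algorithm.

```python
import numpy as np

def similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Normalized cross-correlation in [-1, 1]; epsilon guards flat patches.
    a = (a - a.mean()) / (a.std() + 1e-8)
    b = (b - b.mean()) / (b.std() + 1e-8)
    return float((a * b).mean())

def classify(candidate: np.ndarray, past_examples: list) -> str:
    # past_examples: [(image patch, class label), ...], all patches same shape.
    scores = [(similarity(candidate, img), label) for img, label in past_examples]
    return max(scores)[1]  # label of the most similar past discontinuity
```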
The function of each unit of the processing unit 22 described above can be implemented by using various processors. The various processors include, for example, a central processing unit (CPU) which is a general-purpose processor that implements various functions by executing software (programs). In addition, the various processors described above include a programmable logic device (PLD) which is a processor whose circuit configuration can be changed after manufacturing such as a field programmable gate array (FPGA). Further, the various processors also include a dedicated electrical circuit which is a processor that has a dedicated electrical circuit configuration designed to execute specific processing such as an application specific integrated circuit (ASIC), and the like.
The function of each unit may be implemented by one processor or a combination of a plurality of processors. In addition, a plurality of functions may be implemented by one processor. A first example in which a plurality of functions are configured with one processor is, as represented by a computer such as a client or a server, an aspect in which one or more CPUs and software are combined to configure one processor and the processor implements the plurality of functions. A second example of the configuration is, as represented by a system-on-chip (SoC) or the like, an aspect in which a processor that implements the functions of the entire system by using one integrated circuit (IC) chip is used. As described above, various functions are configured by using one or more of the above-described various processors as a hardware structure. Further, the hardware structure of these various processors is, more specifically, electrical circuits (circuitry) in which circuit elements, such as semiconductor elements, are combined.
In a case where the above-described processor or electrical circuit executes software (programs), a processor-readable code (or a computer-readable code, the computer being constructed by using the processor) of the software to be executed is stored in a non-transitory and tangible recording medium, such as a ROM, and the processor refers to the software. The software stored in such a recording medium includes a program (information processing program) for executing the information processing method according to the embodiment of the present invention. The code may be recorded on a non-transitory and tangible recording medium, such as various magneto-optical recording devices and semiconductor memories, instead of the ROM. In a case of processing using software, for example, the buffer memory 20 can be used as a transitory storage area, and data stored in an electrically erasable programmable read-only memory (EEPROM) (not shown) can also be referred to.
The above-described “non-transitory and tangible recording medium” does not include a non-tangible recording medium such as a carrier wave signal or a propagation signal itself.
The storage unit 24 is means for storing data including a control program used by the control unit 12. As the storage unit 24, for example, a device including a magnetic disk such as a hard disk drive (HDD), or a device including a flash memory such as an embedded multi-media card (eMMC) or a solid state drive (SSD) can be used. These devices are examples of the recording device or the non-transitory and tangible recording medium. The storage unit 24 stores the inspection target object imaging data D100, the product data D200, the discontinuity information, and the like.
In addition, the storage unit 24 can store a discontinuity classification result input via the operation unit 14. In this case, the operation unit 14 functions as an input unit for inputting the classification result to be stored in the storage unit 24, and the storage unit 24 stores the classification result based on classification input through the input unit. For example, the storage unit 24 may receive, as input for classification and assignment, the type of the discontinuity (defect) described in “JIS Z 2300”, or may store the type of the discontinuity in a case where the type of the discontinuity defined by the user himself/herself is input via the operation unit 14. Further, for example, the storage unit 24 may store the classification result classified based on an occurrence cause of the discontinuity. Here, examples of the “occurrence cause of the discontinuity” include air bubbles due to gas inclusion, cracks in members, foreign matter adhesion, welding, and welding defects.
Further, the information (data) stored in the storage unit 24 may be updated. For example, the information stored in the storage unit 24 may be updated by sequentially storing an interpretation result interpreted by the interpreter through the inspection apparatus 10 in the storage unit 24.
The interpreter can input, via the operation unit 14, for example, diagnosis results, such as “immediately replace the inspection target object OBJ with a new one”, “observe the progress (re-inspect)”, or “leave it untouched (not a discontinuity)”, for each piece of the information related to the defect candidate image assigned to the image displayed on the display unit 18. The control unit 12 creates inspection target object inspection result data D10 (refer to
The inspection target object specification information is information for specifying the inspection target object OBJ and includes, for example, information indicating a product name, a product number, a manufacturer name, and a technical classification of the inspection target object OBJ.
The captured image data is image data (for example, an X-ray image (radiation transmission image), a visible light image, or an infrared light image) obtained by imaging the inspection target object OBJ.
The imaging condition data is stored for each piece of the captured image data of the inspection target object OBJ and includes information indicating an imaging date and time, an imaging target location of each piece of the captured image data, a distance between the inspection target object OBJ and the camera during imaging, and an angle with respect to the camera.
The illumination condition data includes information indicating a type of radiation or light used for imaging of the inspection target object OBJ (for example, X-rays, visible light rays, infrared light rays, transmitted light rays, reflected light rays), an irradiation intensity, and an irradiation angle.
The product specification information is information for specifying the product and includes, for example, information indicating a product name, a product number, a manufacturer name, and a technical classification.
The product attribute information includes, for example, information indicating a material and dimensions of each part of the product, and an application of the product. The information indicating the application of the product includes, for example, information regarding a name, a type, a processing state, and an attachment method (for example, joints, welds, screwing, fitting, and soldering) of a device to which the product is attached, or the like. In addition, the product attribute information includes discontinuity occurrence information. The discontinuity occurrence information includes, for example, at least one of a past inspection date and time, a material of the inspection target object OBJ, a type of the discontinuity that has occurred in the past (for example, gas hole, porosity, foreign material less dense (FMLD), foreign material more dense (FMMD), fissuring, delamination, or the like), a shape, a size, a depth, an occurrence site (site coordinates, a material thickness, and a processing state (for example, joints, welds, or the like)), frequency information related to the occurrence frequency of the discontinuity, or a discontinuity image.
The inspection region designation information includes information indicating an inspection region in the inspection target object, which is designated by the manufacturer or the like of each product. The inspection region designation information is created, for example, by specifying a location where the discontinuity is likely to occur statistically or structurally based on information in a case where the manufacturer or the like has repaired the product in the past.
The inspection target object measurement data includes information indicating the size of the inspection target object OBJ and a measurement result of the thickness by the measurement section 222 for each position of the inspection target object OBJ.
The information regarding the discontinuity includes information indicating the feature of the discontinuity (for example, a position, a size, a thickness change amount, and a type of the discontinuity). The position of the discontinuity can be represented by, for example, coordinates on a coordinate system (for example, a three-dimensional rectangular Cartesian coordinate system, a polar coordinate system, a cylindrical coordinate system, or the like) set according to the shape of the inspection target object OBJ. The information indicating the type of the discontinuity is information to be assigned to the classification section 224 as described above and is, for example, information such as a granular discontinuity, a spot-like discontinuity, and a crack-like discontinuity.
The diagnosis result data may include an inspection date and time and information additionally input or corrected by the interpreter with respect to the discontinuity. The diagnosis result data is, for example, “It is a discontinuity. The classification is a foreign matter shadow.”, “It is not a discontinuity.”, and the like. Further, the diagnosis result data may include, for example, information indicating the diagnosis result input by the interpreter, such as “immediately replace the inspection target object OBJ with a new one”, “observe the progress (re-inspect)”, “leave it untouched”, and “classify it as a defective product because a defect is present”.
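For illustration, the inspection result data described above might be organized as records such as the following; the field names and types are assumptions, not the actual data format of the inspection target object inspection result data D10.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class DiscontinuityRecord:
    position: Tuple[float, float, float]  # coordinates in the object's coordinate system
    size_mm: float                        # measured size
    kind: str                             # e.g., "granular", "spot-like", "crack-like"
    diagnosis: Optional[str] = None       # interpreter input, e.g., "observe the progress"

@dataclass
class InspectionResult:
    object_id: str     # product name, product number, and the like
    captured_at: str   # imaging date and time from the imaging condition data
    discontinuities: List[DiscontinuityRecord] = field(default_factory=list)
```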
Next, the imaging system 100 for capturing the image of the inspection target object OBJ will be described.
The imaging control unit 102 includes a central processing unit (CPU) that controls an operation of each unit of the imaging system 100. The imaging control unit 102 receives operation input from an operator (imaging technician) via the imaging operation unit 104 and transmits a control signal corresponding to the operation input to each unit of the imaging system 100 to control the operation of each unit.
The imaging operation unit 104 is an input device that receives the operation input from the operator, and includes a keyboard for character input and a pointing device (a mouse, a trackball, or the like) for operating a pointer, an icon, and the like displayed on the display unit 18. The operator can perform, via the imaging operation unit 104, input of information regarding the inspection target object OBJ, input of an instruction to execute imaging to the camera 108 (including settings for imaging conditions, such as an exposure time, a focal length, and a stop, and an imaging angle, an imaging location, and the like), input of an instruction of radiation irradiation to the radiation sources 110 and 112 (including settings for an irradiation start time, an irradiation duration, an irradiation angle, an irradiation intensity, and the like), and input of an instruction to record the acquired image data on the image recording unit 106.
The image recording unit 106 records the image data (light-reception image) of the inspection target object OBJ imaged by the camera 108. The image recording unit 106 records the information for specifying the inspection target object OBJ in association with the image data.
The camera 108 and the radiation sources 110 and 112 are disposed inside the imaging room 114. The radiation sources 110 and 112 are, for example, X-ray sources, and partition walls and entrances/exits between the imaging room 114 and the outside are shielded with X-ray protective materials (for example, lead, concrete, or the like) to provide X-ray protection. In a case where the inspection target object OBJ is irradiated with visible light for imaging, there is no need to use the shielded imaging room 114.
The radiation sources 110 and 112 irradiate the inspection target object OBJ placed in the imaging room 114 with radiation in accordance with an instruction from the imaging control unit 102.
The camera 108 images the inspection target object OBJ by receiving radiation emitted from the radiation source 110 to the inspection target object OBJ and reflected by the inspection target object OBJ, or radiation emitted from the radiation source 112 to the inspection target object OBJ and transmitted through the inspection target object OBJ, in accordance with an instruction to execute imaging from the imaging control unit 102. The inspection target object OBJ is held in the imaging room 114 by a holding member (not shown) (for example, a manipulator, a mounting table, or a movable mounting table), and the distance and the angle of the inspection target object OBJ with respect to the camera 108 and the radiation sources 110 and 112 are adjustable. The operator can image a desired location of the inspection target object OBJ by controlling the relative positions between the inspection target object OBJ, the camera 108, and the radiation sources 110 and 112 via the imaging control unit 102.
The radiation sources 110 and 112 end the irradiation of the inspection target object OBJ with the radiation in synchronization with the end of the execution of the imaging by the camera 108.
In the example shown in
The “size of the discontinuity/defect” indicates the size (an example of the first discontinuity information and the second discontinuity information) of the discontinuity (or defect) in the image of each product, which is extracted and measured by the processing unit 22 (the extraction section 220 and the measurement section 222: processor), in the number of pixels. In the example of
In the example of
In the first embodiment, the image of the inspection target object OBJ (subject) can be acquired by the imaging system 100 and the inspection apparatus 10. The image can be acquired in time series. The imaging may be performed at a predetermined timing such as every day, every week, or every month. In addition, images of different individual objects (parts, products, or the like that are continuously manufactured) of the same type of the inspection target object OBJ may be sequentially captured, or images of the same individual object may be repeatedly captured (information on changes over time may be acquired). It is preferable to image the inspection target object OBJ from the same direction (which may be one direction or a plurality of directions).
The processing unit 22 can acquire the discontinuity information (first discontinuity information) by analyzing the obtained time-series image of the inspection target object OBJ (subject) with the first criterion (a first information acquisition step and a first information acquisition process). Here, the “first criterion” is a criterion for at least one of the type, the number, the position in the subject, the size, the shape, the presence density, or the distance from another discontinuity of the discontinuity. In addition, the discontinuity information includes information indicating the feature of the discontinuity and information related to the occurrence timing of the discontinuity. The “information indicating the feature of the discontinuity” is information indicating at least one of the type, the number, the position in the inspection target object OBJ (subject), the size, the shape, the presence density, or the distance from another discontinuity of the discontinuity. Further, the “information related to the occurrence timing of the discontinuity” may be the occurrence timing (for example, the occurrence time) itself or may be other information related to the occurrence timing (for example, at least one of the manufacturing time, the manufacturing number (serial number), the lot number, the inspection time, the inspection number, or the image capturing time of the inspection target object OBJ (subject)). The occurrence timing or the occurrence time need not be strictly specified (for example, need not be specified down to the date and time or hours, minutes, and seconds).
The processing unit 22 (processor) can also acquire information on the “defect” (a discontinuity at a level at which the product is defective) as the “first discontinuity information” by setting a level of the “first criterion” (for example, a size of 4 mm or more).
The condition calculation section 226 can obtain a range of the discontinuity information to be recorded, based on the information regarding the arrangement and/or the first discontinuity information. For example, the condition calculation section 226 (processor) calculates the recording target period (a period in which the first discontinuity information and/or the second discontinuity information is recorded) and the recording condition based on the first discontinuity information (the condition calculation step and the condition calculation process). The condition calculation section 226 can calculate, for example, a period in which the occurrence frequency of the discontinuity having a predetermined feature (for example, the size) is higher than the determination criterion value, a period in which the occurrence frequency is equal to or lower than the determination criterion value, and a period in which the occurrence frequency falls within the determination criterion range, as the recording target period. The condition calculation section 226 may calculate a period in which the first discontinuity information is present (first period) as the recording target period or may calculate the first period and a second period preceding the first period as the recording target period.
As the determination criterion value (determination criterion range) of the occurrence frequency, the condition calculation section 226 can use a criterion set for the number of occurrences of the discontinuity having a predetermined feature in a predetermined period (one day, one week, one month, or the like) or a predetermined number of inspection target objects (one, ten, one hundred, or the like).
In a case of being based on the information regarding the arrangement, the condition calculation section 226 can record the discontinuity information for the subject having a designated range of manufacturing numbers or lot numbers, for example.
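A minimal sketch of such a frequency-based calculation of the recording target period is shown below, assuming daily counting, an illustrative determination criterion value, and a second period of one day preceding each first period.

```python
from collections import Counter
from datetime import date, timedelta

def recording_periods(first_info, criterion_per_day=3, lookback_days=1):
    """first_info: records each carrying a datetime.date 'inspection_date'.
    Returns the set of days for which discontinuity information is recorded."""
    per_day = Counter(rec["inspection_date"] for rec in first_info)
    # First period: days on which the occurrence frequency of the
    # discontinuity is higher than the determination criterion value.
    first_period = {d for d, n in per_day.items() if n > criterion_per_day}
    # Second period: a span preceding each first period (here, one day).
    second_period = {d - timedelta(days=lookback_days) for d in first_period}
    return first_period | second_period

info = [{"inspection_date": date(2024, 5, 2)}] * 4  # four occurrences on one day
print(recording_periods(info))  # days of May 1 and May 2, 2024
```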
The discontinuity information for such a recording target period is effective for predicting the occurrence of the discontinuity (or defect) or analyzing the discontinuity information before the occurrence of the defect. That is, according to the above-described aspect, it is possible to record useful discontinuity information of the subject.
The condition calculation section 226 can calculate a condition for at least one of a specific period, a manufacturing number, a lot number, an inspection time, an inspection number, or an image capturing time as the “recording condition”.
The processing unit 22 (processor) can acquire the second discontinuity information (including the information indicating the feature of the discontinuity and the information related to the occurrence timing of the discontinuity) by analyzing the image of the inspection target object OBJ (subject) with the second criterion that is stricter than the first criterion (a second information acquisition step and a second information acquisition process). The processing unit 22 may acquire the second discontinuity information based on the above-described recording condition. The second discontinuity information may be time-series information. In addition, the item of the “information indicating the feature of the discontinuity” and the item of the “information related to the occurrence timing of the discontinuity” may be the same as in the case of the first discontinuity information.
Further, similar to the first criterion, the second criterion is a criterion for at least one of the type, the number, the position in the inspection target object OBJ (subject), the size, the shape, the presence density, or the distance from another discontinuity of the discontinuity, but is a criterion that is stricter than the first criterion. The “second criterion being stricter than the first criterion” means that the discontinuity information is acquired by using the second criterion for a discontinuity that has a lower influence on quality and/or performance of the inspection target object OBJ (subject) than the discontinuity obtained by being analyzed by the first criterion. For example, discontinuities such as air bubbles, pores, or fissuring may be considered “defects” and result in the inspection target object OBJ being deemed defective, for example, in a case where the discontinuities are large or have a high density. However, for example, in a case where the discontinuities are small or have a low density, the discontinuities can be considered to have a low influence on the quality and/or the performance of the inspection target object OBJ (considered acceptable as a product). The processing unit 22 (processor) can use the criterion set by the user via the operation unit 14 as the first criterion and the second criterion.
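The relationship between the two criteria can be illustrated as follows. The per-type threshold values are assumptions (only the 4 mm defect-level size appears in this description), and a lower size threshold is “stricter” in the sense that it also captures discontinuities with a lower influence on quality and/or performance.

```python
# Illustrative per-type criteria; the criterion value may differ
# depending on the type of the discontinuity.
FIRST_CRITERION = {"gas hole": 4.0, "fissuring": 2.0}    # defect level, in mm
SECOND_CRITERION = {"gas hole": 1.0, "fissuring": 0.5}   # stricter, in mm

def meets(criterion, d):
    # A discontinuity meets a criterion if its size reaches the
    # threshold registered for its type.
    return d["size_mm"] >= criterion.get(d["type"], float("inf"))

d = {"type": "gas hole", "size_mm": 1.8}
assert not meets(FIRST_CRITERION, d)  # not a defect under the first criterion
assert meets(SECOND_CRITERION, d)     # still captured under the second criterion
```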
In addition to the example shown in
The recording control section 228 (processor) records the discontinuity information (second discontinuity information) in the recording target period on the storage unit 24 (recording device) in association with the information indicating the recording target period (a recording step and a recording process). Here, the “information indicating the recording target period” may be, instead of the occurrence timing itself, other information (for example, at least one of the manufacturing time, the manufacturing number (serial number), the lot number, the inspection time, the inspection number, or image capturing time of the inspection target object OBJ (subject)), similar to the “information related to the occurrence timing of the discontinuity” described above. In addition, the format of the recording is not particularly limited and may be a format illustrated in
The recording control section 228 can associate, during recording, the first discontinuity information and the second discontinuity information with each other based on the recording condition. Examples of the association include creating separate files with partially common file names, storing the two pieces of discontinuity information in the same folder, and recording both pieces of information for the recording target period in a single file.
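As a hedged sketch of the first of these association methods (separate files with partially common file names), consider the following; the JSON format, directory layout, and naming scheme are assumptions.

```python
import json
from pathlib import Path

def record(first_info, second_info, out_dir: Path, stem: str):
    # The common stem makes the association between the two files recoverable.
    out_dir.mkdir(parents=True, exist_ok=True)
    (out_dir / f"{stem}_first.json").write_text(json.dumps(first_info))
    (out_dir / f"{stem}_second.json").write_text(json.dumps(second_info))

record([{"size_mm": 5.2}], [{"size_mm": 1.8}],
       Path("records/lot_0012"), stem="2024-05-01")
```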
The recording control section 228 (processor) may perform recording for a predetermined region in the inspection target object OBJ (subject). The “predetermined region” may be a partial region of the inspection target object OBJ (for example, a region identified to have a high occurrence frequency of discontinuities in past inspections, a region having a high influence on the performance or the quality of the entire product, a region designated by the user, or the like). By performing the recording for such a region, useful discontinuity information can be recorded. Whether or not a region is the “predetermined region” can be determined based on the measured position. In addition, the recording control section 228 may refer to, during recording, the “inspection region designation information” included in the product data D200 shown in
The information extraction section 230 (processor) and the output control section 232 (processor) can extract information from the recorded information under the condition designated by the operation of the user and can output the extracted information. In addition, the information extraction section 230 and the output control section 232 may extract and output the information regardless of the operation of the user. Hereinafter, an example of the extraction and output (recording and display) of the information will be described. In the following examples, it is assumed that the same type of the inspection target object OBJ is continuously manufactured and imaged, and the time-series discontinuity information as shown in
In the above-described situation, the information extraction section 230 extracts, from the information recorded on the storage unit 24, the discontinuity information (which may include information on the defect and information on the discontinuity that does not lead to the defect; the same applies hereinafter), for example, in the period from time t3 to time t4, and the output control section 232 outputs the extracted discontinuity information. The output control section 232 may output the discontinuity information to the recording device, such as the storage unit 24, or may output the discontinuity information to the display device, such as the display unit 18.
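A minimal sketch of such condition-designated extraction is shown below, assuming records keyed by an inspection time and an illustrative period from t3 to t4; the record layout and dates are assumptions.

```python
from datetime import datetime

def extract_by_period(records, t_start, t_end):
    """Filter recorded discontinuity information to a designated period."""
    return [r for r in records if t_start <= r["inspection_time"] <= t_end]

recorded_info = [
    {"inspection_time": datetime(2024, 5, 1, 10, 0), "size_mm": 1.2},
    {"inspection_time": datetime(2024, 5, 3, 10, 0), "size_mm": 4.5},
]
t3, t4 = datetime(2024, 5, 2), datetime(2024, 5, 4)  # user-designated period
selected = extract_by_period(recorded_info, t3, t4)
```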
In the example of
In
In a case where the discontinuity information is displayed in a graph as shown in
In the examples of
In the example shown in
In the example of
The processing unit 22 may, for example, automatically move the timeline bar 610 in a time axis direction in accordance with an information playback operation of the user and continuously change the superimposed display in accordance with the movement.
As described above, with the inspection apparatus 10 (information processing apparatus), the information processing method, and the information processing program according to the first embodiment, it is possible to record useful discontinuity information of the subject.
A modification example of the above-described first embodiment will be described below.
In the first embodiment, an aspect has been described in which the processing unit 22 acquires the discontinuity information through the image processing, but the discontinuity information may be acquired by using an image analyzer (learning model) that is constructed by using a machine learning algorithm and that analyzes the image of the subject. That is, the extraction section 220, the measurement section 222, and the classification section 224 among the functions of the processing unit 22 may be constructed by the image analyzer.
An example of a layer configuration of the processing unit 22 in a case where a convolutional neural network (CNN) is used as the image analyzer will be described.
The intermediate layer 252 calculates a feature amount by a convolution operation and a pooling process. The convolution operation performed in the convolutional layer 256 is processing of acquiring a feature map through convolution operations using filters, and plays a role of feature extraction such as edge extraction from the image. Through the convolution operations using filters, a “feature map” of one channel (one sheet) is generated for one filter. The size of the “feature map” is downscaled by convolution and becomes smaller as convolution is performed at each layer. The pooling process performed in the pooling layer 258 is processing of reducing (or enlarging) the feature map output by the convolution operation to create a new feature map and plays a role of providing robustness such that the extracted features are not affected by parallel translation or the like. The intermediate layer 252 can be composed of one or a plurality of layers that perform these processes.
Similar to the first convolutional layer, convolution operations using filters F2 to Fn are performed in the second to n-th convolutional layers. The size of the "feature map" in the n-th convolutional layer is smaller than that in the second convolutional layer because the feature map has been downscaled by the convolutional layers or the pooling layers in the preceding stages.
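As an illustration of this downscaling, the following PyTorch sketch stacks two convolutional layers with pooling; the channel counts and kernel sizes are arbitrary assumptions, not the actual configuration of the intermediate layer 252.

```python
# Illustrative sketch of how stacked convolution/pooling layers
# shrink the feature map. All sizes are arbitrary assumptions.
import torch
import torch.nn as nn

x = torch.randn(1, 1, 256, 256)                       # one single-channel input image
conv1 = nn.Conv2d(1, 16, kernel_size=3, padding=1)    # 16 filters -> 16 feature maps (one per filter)
pool = nn.MaxPool2d(2)                                # pooling halves height and width
conv2 = nn.Conv2d(16, 32, kernel_size=3, padding=1)   # filter set for the second layer

f1 = pool(conv1(x))
f2 = pool(conv2(f1))
print(f1.shape)  # torch.Size([1, 16, 128, 128])
print(f2.shape)  # torch.Size([1, 32, 64, 64]) -- smaller at deeper layers
```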
Among the layers of the intermediate layer 252, lower-order feature extraction (such as edge extraction) is performed in the convolutional layers closer to the input side, while higher-order feature extraction (extraction of features related to the shape, the structure, and the like of the object) is performed in the layers closer to the output side. In a case where segmentation is performed for the purpose of measuring a region of interest (for example, the discontinuity), the feature map is upscaled in the convolutional layers of the latter half portion, and a "feature map" having the same size as the input image is obtained in the last convolutional layer. Meanwhile, in a case where detection of the region of interest is performed, the upscaling is not essential because only positional information needs to be output.
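The upscaling in the latter half portion can be sketched, for example, with transposed convolutions as below; the layer sizes are illustrative assumptions, not the actual configuration of the learning model 240.

```python
# Sketch of upscaling a downscaled feature map back to the input
# size so that a per-pixel segmentation result can be produced.
import torch
import torch.nn as nn

decoder = nn.Sequential(
    nn.ConvTranspose2d(32, 16, kernel_size=2, stride=2),  # 64 -> 128
    nn.ReLU(),
    nn.ConvTranspose2d(16, 8, kernel_size=2, stride=2),   # 128 -> 256
    nn.ReLU(),
)

f = torch.randn(1, 32, 64, 64)      # downscaled feature map
out = decoder(f)
print(out.shape)                    # torch.Size([1, 8, 256, 256]) -- same H x W as the input image
```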
The intermediate layer 252 may include a layer for performing batch normalization in addition to the convolutional layer 256 and the pooling layer 258. The batch normalization process is processing of normalizing the distribution of data in units of mini-batch during learning and plays a role of advancing learning quickly, reducing dependence on initial values, suppressing overtraining, and the like.
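A minimal sketch of inserting batch normalization between a convolutional layer and its activation is shown below; the channel counts and batch size are illustrative assumptions.

```python
# Sketch of a conv -> batch norm -> activation block. Batch
# normalization normalizes each channel over the mini-batch.
import torch
import torch.nn as nn

block = nn.Sequential(
    nn.Conv2d(16, 32, kernel_size=3, padding=1),
    nn.BatchNorm2d(32),   # normalizes each of the 32 channels over the mini-batch
    nn.ReLU(),
)
y = block(torch.randn(8, 16, 64, 64))   # mini-batch of 8
```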
The output layer 254 is a layer that performs the position detection of the region of interest appearing in the input image (patch image or the like) based on the feature amount output from the intermediate layer 252 and that outputs the result. In a case of performing segmentation (class classification of the discontinuity or the like), the output layer 254 uses the “feature map” obtained from the intermediate layer 252 to understand the position of the region of interest appearing in the image at the pixel level. That is, it is possible to detect whether or not each pixel of the image belongs to the region of interest (whether or not it belongs to a specific class) and output the detection result. On the other hand, in a case of performing the object detection (detection of the discontinuity), it is not necessary to make a determination at the pixel level, and the output layer 254 outputs the positional information of the target object (discontinuity). In the present invention, various “discontinuities” can be considered to be a “specific class”. The output layer 254 may finely classify and output the discontinuities (gas hole, porosity, foreign material less dense (FMLD), foreign material more dense (FMMD), fissuring, delamination, and the like).
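As one possible illustration of a segmentation-style output layer, the sketch below maps the final feature map to per-pixel class logits with a 1x1 convolution; the class list is a simplified assumption for illustration.

```python
# Sketch of a segmentation head: one logit per class per pixel, with
# an argmax assigning each pixel to "normal" or a discontinuity class.
import torch
import torch.nn as nn

classes = ["normal", "gas_hole", "porosity", "FMLD", "FMMD", "fissuring"]
head = nn.Conv2d(8, len(classes), kernel_size=1)  # per-pixel class logits

features = torch.randn(1, 8, 256, 256)   # full-size feature map from the last convolutional layer
logits = head(features)                   # (1, num_classes, 256, 256)
label_map = logits.argmax(dim=1)          # class index of every pixel
print(label_map.shape)                    # torch.Size([1, 256, 256])
```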
The output layer 254 (learning model 240) can output the detection result (the result of the segmentation and the result of the class classification) for a unit region (each region in a case where the image of the inspection target object OBJ is divided into a plurality of regions). In addition, the output layer 254 can output the certainty (probability, reliability degree) of the detection result. The “unit region” may be composed of one pixel or a plurality of pixels. The learning model 240 may use a region obtained by dividing the image of the subject into a predetermined shape and size as the unit region.
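For illustration, the sketch below divides a per-pixel probability map into 32x32 unit regions and reports a mean certainty per region; the pooling-based aggregation is an assumption for illustration only, not the actual method of the learning model 240.

```python
# Sketch of per-unit-region output: the image is divided into fixed
# 32x32 regions and a detection certainty is reported for each.
import torch
import torch.nn.functional as F

pixel_probs = torch.rand(1, 1, 256, 256)          # per-pixel discontinuity probability
unit = 32
region_probs = F.avg_pool2d(pixel_probs, unit)    # mean certainty per 32x32 unit region
print(region_probs.shape)                         # torch.Size([1, 1, 8, 8])
print(region_probs[0, 0, 0, 0])                   # certainty of the top-left unit region
```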
The output layer 254 may execute the discrimination (classification) of the region of interest and output the discrimination result. For example, the output layer 254 may classify the image into two categories, that is, “normal” and “discontinuity”, and may output two scores corresponding to “normal” and “discontinuity” as the discrimination result (the total of the two scores is 100% or 1) or output the classification result in a case where the classification can be clearly made from the two scores. The classification may be two classes or may be multi-class. For example, the output layer 254 may classify the image into three categories, that is, “normal”, “discontinuity at a level that does not lead to the defect”, and “discontinuity corresponding to the defect”. In addition, the output layer 254 may classify the “discontinuity” into “discontinuity corresponding to the defect”, “discontinuity to be focused on”, and “other discontinuities”.
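A minimal sketch of the two-category discrimination follows; the softmax makes the two scores sum to 1, and the 0.8 threshold for a "clear" classification is an arbitrary assumption.

```python
# Sketch of two-category discrimination with scores summing to 1.
import torch

logits = torch.tensor([[0.3, 2.1]])               # raw scores for ["normal", "discontinuity"]
scores = torch.softmax(logits, dim=1)             # approx. tensor([[0.14, 0.86]]), sums to 1
if scores.max() >= 0.8:                           # classify only when clearly separable (assumed threshold)
    label = ["normal", "discontinuity"][scores.argmax().item()]
    print(label, scores.max().item())
```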
In a case of outputting the discrimination result, the intermediate layer 252 or the output layer 254 may or may not include one or a plurality of fully connected layers as the last layers.
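One possible configuration with fully connected layers as the last layers can be sketched as follows; the sizes are illustrative assumptions.

```python
# Sketch of a discrimination head ending in a fully connected layer.
import torch
import torch.nn as nn

fc_head = nn.Sequential(
    nn.AdaptiveAvgPool2d(1),   # collapse the feature map to 1x1 per channel
    nn.Flatten(),
    nn.Linear(32, 2),          # fully connected layer -> 2 discrimination scores
)
scores = fc_head(torch.randn(1, 32, 64, 64))
print(scores.shape)            # torch.Size([1, 2])
```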
The output layer 254 may output a measurement result of the region of interest. In a case of performing the measurement, the measurement value of the region of interest to be measured can be directly output from the learning model 240. Alternatively, as described above, segmentation may first be performed on the region of interest to be measured, and the measurement may then be performed based on the result. In a case where the measurement value is directly output, the task becomes a regression problem because the measurement value itself is learned for the image.
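The "segment first, then measure" path can be sketched, for example, with scikit-image as below; the mask is synthetic illustration data, and this is not the actual measurement routine of the apparatus.

```python
# Sketch of measuring the major axis from a binary segmentation mask.
import numpy as np
from skimage.measure import label, regionprops

mask = np.zeros((64, 64), dtype=np.uint8)
mask[20:24, 10:40] = 1                      # an elongated discontinuity region (synthetic)

for region in regionprops(label(mask)):
    print(region.major_axis_length)         # measured major axis in pixels
    # e.g. compare against the threshold that defines a "defect"
```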
In a case of using the learning model 240 having the above-described configuration, it is preferable, during the learning process, to calculate a loss function (error function) by comparing the result output by the output layer 254 with the correct answer of recognition for the image set, and to reduce (minimize) the loss function by performing processing (error backpropagation) of updating the weight parameters in the intermediate layer 252 from the layer on the output side toward the layer on the input side.
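A minimal sketch of such a learning loop is shown below; the model, data, and hyperparameters are placeholder assumptions for illustration.

```python
# Sketch of training: compare output with the correct answer via a
# loss function, then update the weights by error backpropagation.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(),
                      nn.Conv2d(8, 2, 1))          # 2 classes per pixel (placeholder model)
criterion = nn.CrossEntropyLoss()                  # loss (error) function
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

images = torch.randn(4, 1, 64, 64)                 # training image set (placeholder)
answers = torch.randint(0, 2, (4, 64, 64))         # per-pixel correct answers (placeholder)

for epoch in range(10):
    optimizer.zero_grad()
    loss = criterion(model(images), answers)       # compare output vs. correct answer
    loss.backward()                                # backpropagate the error from output side to input side
    optimizer.step()                               # update weight parameters to reduce the loss
```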
The learning model 240 after the learning is completed performs at least one of detection, classification, or measurement as the image analysis, and the processing unit 22 (processor) can acquire the discontinuity information based on the result of the analysis through the learning model 240.
The processing unit 22 may correct the result of the analysis by the learning model 240 according to the operation of the user via the operation unit 14 or the like. This correction may be, for example, addition of the discontinuity information, deletion of unnecessary information, amendment of an error, grouping of information, or the like.
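Such corrections can be sketched, for example, as simple operations on a list of records; the record layout is a hypothetical assumption, not the actual data format of the processing unit 22.

```python
# Sketch of user corrections applied to the analysis result:
# addition, deletion, amendment of an error, and grouping.
results = [
    {"id": 1, "class": "gas_hole", "major_axis_mm": 1.0},
    {"id": 2, "class": "porosity", "major_axis_mm": 0.2},
]

results.append({"id": 3, "class": "fissuring", "major_axis_mm": 2.4})  # addition of discontinuity information
results = [r for r in results if r["id"] != 2]                         # deletion of unnecessary information
results[0]["class"] = "FMLD"                                           # amendment of an error
group = {"group": "cluster_A", "members": [1, 3]}                      # grouping of information
```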
In the first embodiment, a case where the inspection target object (subject) is an industrial part or an industrial product has been described, but the inspection target object (subject) in the present invention is not limited to these articles. For example, structures or buildings (including a concrete structure) such as a bridge, a road, a tunnel, a building, or a dam can be used as subjects, and fissuring, delamination, or the like in these subjects can be treated as the discontinuity. In this case as well, the image may be a radiation image (or a radiation transmission image), a visible light image, or an infrared light image.
It should be noted that, in the modification example described above, the recording, extraction, display, and the like of the discontinuity information can be performed in the same manner as in the first embodiment, and useful discontinuity information of the subject can be recorded.
Number | Date | Country | Kind |
---|---|---|---|
2022-004404 | Jan 2022 | JP | national |
The present application is a Continuation of PCT International Application No. PCT/JP2022/045978 filed on Dec. 14, 2022, claiming priority under 35 U.S.C. § 119(a) to Japanese Patent Application No. 2022-004404 filed on Jan. 14, 2022. Each of the above applications is hereby expressly incorporated by reference, in its entirety, into the present application.
 | Number | Date | Country
---|---|---|---
Parent | PCT/JP2022/045978 | Dec 2022 | WO
Child | 18769951 | | US