This disclosure relates to a system for autonomously diagnosing a defect in a component, in particular using penetrant testing. It also relates to a corresponding method. This patent application is a result of a research project that has received funding from the European Union’s Horizon 2020 research and innovation program under grant agreement MNET17_ADMA-1246.
Penetrant testing (PT) is a non-destructive test method used in manufacturing. PT is used for detecting porosity, cracks, fractures, laps, seams and other surface defects in a component. The component to be inspected is coated with a penetrant (dye) that is visible under ultraviolet (UV) or visible (white) light. The penetrant penetrates small surface imperfections, and then the surface of the component is wiped or cleaned in order to remove any penetrant on the surface of the component. Only penetrant that has penetrated into any surface-level discontinuity will remain. A developer may be applied to draw out the penetrant from such discontinuities, so that the penetrant is visible on the surface of the component. The component is then inspected by a qualified inspector to determine whether any penetrant is visible. If so, the shape, position and size of the visible penetrant indication are evaluated to determine the integrity of the component. For example, an inspector may conclude that a defect is present, and assess the type, position and size of that defect.
Evaluation of a defect by an inspector is subjective, and may be inconsistent between inspectors, or inconsistent over time as an inspector becomes fatigued and loses concentration.
It is therefore generally desirable to improve the quality and consistency of diagnosis of defects in components.
According to a first aspect of the disclosure, there is provided a system for autonomously diagnosing a defect in a component to which a penetrant has been applied and at least partially removed for penetrant testing, the system comprising: a device for positioning the component for inspection; a camera configured to take an image of the component when positioned by the device; and a first image evaluation module configured to: process the image of the component with a machine learning algorithm to detect from the image remaining penetrant on the component and based on characteristics of any detected penetrant to provide a first determination of whether or not the image indicates a defect in the component.
As discussed above, evaluation of a defect by an inspector is subjective and may be inconsistent. By providing the above-described system for autonomously diagnosing a defect in a component to which a penetrant has been applied and at least partially removed for penetrant testing, wherein a machine learning algorithm is used to provide a determination of whether or not an image indicates a defect in the component, the quality and consistency of diagnosis of defects in components through penetrant testing can be improved.
As used herein, the term “component” includes a manufactured part that is intended for use in a vehicle such as an aircraft, or in any other machine or equipment. The term “component” as used herein is not limited, however, to a part intended for use in a larger whole; a “component” in the sense of the present disclosure can also be an item which can function on its own.
A “component to which a penetrant has been applied and at least partially removed for penetrant testing” is a component which has been prepared for penetrant testing by the application of a penetrant and the removal of excess penetrant. In other words, the component has been prepared broadly as described in the “Background” section of the present disclosure, by being at least partially coated with penetrant and then wiped or cleaned to remove the penetrant from the surface of the component.
The characteristics on which the provision of the first determination is based may include: the presence or absence of penetrant in the image; and the size, shape and position of any penetrant detected from the image.
The device for positioning the component for inspection may be a robotic device. For example, the device may be a robotic arm comprising a gripper for holding the component. In this way, the device can position the component for inspection by moving the arm or moving the gripper. The device may alternatively be a platform on which the component can rest. For example, the device may be a rotating table, which can position the component for inspection by rotating so as to turn the component.
The system may further comprise: a second image evaluation module, the second image evaluation module configured to: apply predetermined image processing and feature classification rules to the image of the component to detect from the image remaining penetrant on the component and based on characteristics of any detected penetrant to provide a second determination of whether or not the image indicates a defect in the component; and an evaluation comparison module configured to compare the first determination with the second determination and: if the first and second determination are both that the image indicates a defect in the component, to determine that a defect is present in the component; if the first and second determination are both that the image does not indicate a defect in the component, to determine that a defect is not present in the component.
By providing a second image evaluation module that uses predetermined image processing and feature classification rules, in addition to the first image evaluation module that uses a machine learning algorithm, the quality and consistency of diagnosis of defects in components can be further improved, at least in that each module acts as a check on the other.
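The comparison logic described above can be sketched as follows (a minimal illustrative Python sketch; the names are assumptions, and the handling of the disagreement case, where the two modules differ, anticipates the further evaluation methods described later in this disclosure):

```python
from enum import Enum

class Verdict(Enum):
    DEFECT = "defect present"
    NO_DEFECT = "no defect"
    ESCALATE = "further evaluation required"

def compare_determinations(first_indicates_defect: bool,
                           second_indicates_defect: bool) -> Verdict:
    """Combine the outputs of the two image evaluation modules."""
    if first_indicates_defect and second_indicates_defect:
        return Verdict.DEFECT      # both modules agree: defect present
    if not first_indicates_defect and not second_indicates_defect:
        return Verdict.NO_DEFECT   # both modules agree: no defect
    return Verdict.ESCALATE        # disagreement: resolve by a further method
```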
The first determination and second determination may each comprise a measure of the extent of defects of the component and the evaluation comparison module may be configured to determine whether or not a defect is present based on the measure of the extent of defects of the component.
The measure of the extent of defects of the component may be, or may provide, a measure of the reliability of the component. For example, the measure of reliability of the component may be the inverse of the measure of the extent of defects of the component. In this way, a high value for the measure of the extent of defects would correspond to a low value for the measure of reliability of the component, and vice versa. The measure of the reliability of the component may be a value from a range in which the upper limit is a value indicating that no defects are present and the lower limit is a value indicating that the component is unacceptably damaged. The range may be, for example, 0 to 100, wherein a measure of the reliability of the component of 100 indicates that no defects are present and a measure of the reliability of the component of 0 indicates that the component is unacceptably damaged. Values between the lower limit and the upper limit may indicate that one or more defects are present in the component, but that their features, such as type, class, size or position, mean that the component is not unacceptably damaged. In other words, even if an indication of a defect is present, it may be determined that the indication is not significant. In such a case, the measure of the reliability of the component will be higher than in a case in which a significant defect is present.
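The inverse relationship between defect extent and reliability described above could be sketched as follows (an illustrative Python sketch assuming the 0-to-100 scale of this example; the function name is hypothetical):

```python
def reliability_from_defect_extent(extent: float) -> float:
    """Map a defect-extent measure (0 = no defects, 100 = unacceptably
    damaged) to the inverse reliability scale described above."""
    if not 0.0 <= extent <= 100.0:
        raise ValueError("extent must lie in the range [0, 100]")
    return 100.0 - extent
```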
The evaluation module may be configured to determine that no defect is present if a weighted sum of the measures of reliability of the component is greater than a first threshold measure.
By using a weighted sum of the measures of the reliability of the component from the first and second evaluation modules, the outputs of these modules can be combined while taking into account the certainty of the evaluation of the first evaluation module and the second evaluation module. For example, the determination provided by the first image evaluation module, comprising a measure of the reliability of the component, can be weighted based on the certainty of the first image evaluation module, represented by a percentage between 0 and 100%. The weighted-sums approach also represents a simple way of combining the first determination and second determination without requiring undue processing.
The evaluation module may be configured to determine that a defect is present if a weighted sum of the measures of the reliability of the component is less than or equal to a second threshold measure.
The first threshold measure and the second threshold measure may be the same. In that case, the evaluation module will have two outputs: that a defect is not present, or that a defect is present.
The first threshold measure may be higher than the second threshold measure. In this case, the evaluation module may be configured to determine that the status of the component is undecided if the weighted sum of the measures of the reliability of the component is at or between the two thresholds. When the weighted sum of the measures of the reliability of the component is less than or equal to the first threshold measure and greater than or equal to the second threshold measure, the evaluation module may be configured to provide the image of the component to a qualified inspector for assessment.
In this way, the system can help to ensure that false determinations of either acceptability or unacceptability of the component are reduced, by providing borderline cases to a qualified inspector for assessment.
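The two-threshold scheme described above can be sketched as follows (an illustrative Python sketch; the threshold values, the normalisation of the weighted sum onto the 0-to-100 reliability scale, and all names are assumptions, not taken from this disclosure):

```python
def classify_reliability(scores, weights, upper=90.0, lower=70.0):
    """Combine per-module reliability scores (0-100) by a weighted sum.

    The weights represent the certainty of each image evaluation module.
    The sum is normalised by the total weight so that the result stays
    on the 0-100 reliability scale (an assumption for this sketch).
    """
    weighted = sum(w * s for w, s in zip(weights, scores)) / sum(weights)
    if weighted > upper:
        return "no defect"   # above the first threshold measure
    if weighted < lower:
        return "defect"      # below the second threshold measure
    return "undecided"       # borderline: refer to a qualified inspector
```

When the two thresholds are set to the same value, the undecided band collapses and the classifier behaves essentially as the two-output variant described above.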
The evaluation module may be configured to determine whether or not a defect is present by processing the measures of the reliability of the component using a fuzzy logic inference model. For example, the evaluation module may be configured to process the measures of the reliability of the component using a fuzzy logic inference model to convert their values onto a scale descriptive of the component status and to apply fuzzy rules to the converted values. In one example, the evaluation module may be configured to process the measures of the reliability of the component using a fuzzy logic inference model and convert these measures to membership functions including at least one selected from the group comprising: “no defect”, “negligible defect”, “dangerous defect” and “undecided”. Fuzzy rules are then applied to these outputs by a fuzzy inference system to provide a final determination on the component.
Processing the measures of the reliability of the component using a fuzzy logic inference model can take into account the certainty of the evaluation of the first evaluation module and the second evaluation module. For example, as with the weighted-sums approach described above, the first and second determination (by the first and second image evaluation modules, respectively), can each be weighted based on the certainty of evaluation of their respective image evaluation module. Fuzzy rules taking into account this weighting can be applied to provide an output in the form of a value representing the reliability of the component.
When the evaluation module is configured in this way, the evaluation can be more flexible than in the case in which a weighted-sums approach is used. In the weighted-sums approach, the inputs are combined by a single equation, but the reality might be more complex than that. For example, if the shape found by the first image evaluation module is bigger than a threshold value and the position is close to one of the zones (i.e. edges of the component), then the first image evaluation module’s evaluation might be more important than that of the second image evaluation module. This is because the second image evaluation module will determine that there is a distance between the defect and the edge, so the defect is not at the edge. The first image evaluation module, on the other hand, based on human inputs, will determine that the defect is distant from the edge, but that it is better to consider the defect as belonging to the other zone, because the defect might also extend beneath the visible area. Such complex cases can be dealt with easily with fuzzy rules, which can adapt the weights given to the outputs of the first and second image evaluation modules based on defect type and position, so as to provide an adaptive way of producing the final evaluation.

As mentioned above, the first image evaluation module is configured to provide a first determination of whether or not the image indicates a defect in the component. This first determination may further include a determination of one or more of the following: the type of a defect; the shape of a defect; the size of a defect; and the position of a defect. Similarly, the second image evaluation module may be configured to provide a second determination of whether or not the image indicates a defect in the component, wherein the second determination includes a determination of one or more of: the type of a defect; the shape of a defect; the size of a defect; and the position of a defect.
The above-described methods of determining whether or not a defect is present using a weighted-sums or fuzzy logic approach may be performed for each defect. A final evaluation of the reliability of the part can then be provided based on the results for each defect. For example, where one defect is sufficient for a component to be deemed unacceptable, then if any of the weighted sums of the measures of reliability of the component is less than or equal to a threshold measure, it may be determined that the component is unacceptable. In this way, it is possible to avoid aggregating errors and to evaluate each defect individually.
The system may further comprise a controller configured to control movement of the device for positioning a component for inspection. The device for positioning the component for inspection may comprise a robotic device. The device may comprise: a developer applicator for applying a developer to the component and optionally a cleaning device for removing excess penetrant. The controller may be configured additionally to control operation of the developer applicator and optionally of the cleaning device.
The controller may be configured to receive at least one of the first and second determinations of whether or not the image indicates a defect in the component, including the determination of one or more of: the type of a defect; the shape of a defect; the size of a defect; and the position of a defect, and to control the operation of the developer applicator and cleaning device based on the first and/or second determination. For example, the controller can evaluate the amount of developer to be used and movement of the developer applicator needed to cover the defect area.
The system may comprise a graphical user interface (GUI) configured to receive inputs specifying the position, size, shape and type of a defect in a component corresponding to an image of the component and to provide these inputs to the first image evaluation module with the corresponding image of the component for training the first image evaluation module.
Generally, existing systems for recording defects in a component include data such as the size and type of a defect, but do not capture accurately the position of a defect on a component. By providing a GUI configured to receive inputs specifying the position of the defect, improved training data can be provided to the first image evaluation module, resulting in a more accurate determination of whether or not an image indicates a defect in the component. This is particularly the case in examples in which the GUI is configured to display a 3D model of the component and to receive inputs specifying the position of the defect on the 3D model.
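A training record captured via such a GUI might be represented as follows (an illustrative Python sketch; the field names, types and units are assumptions for illustration only):

```python
from dataclasses import dataclass

@dataclass
class DefectAnnotation:
    """One training record as captured via the GUI (hypothetical fields)."""
    image_path: str     # image of the prepared component
    position_3d: tuple  # (x, y, z) position of the defect on the 3D model
    size_mm: float      # characteristic size of the indication
    shape: str          # e.g. "linear", "rounded"
    defect_type: str    # e.g. "crack", "porosity"
```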
According to a second aspect of the disclosure, there is provided a method for autonomously diagnosing a defect in a component, the method comprising: using a device, positioning the component for inspection; using a camera, taking an image of the component positioned for inspection; and under control of one or more computing systems configured with executable instructions, processing the image of the component with a machine learning algorithm to provide a first determination of whether or not the image indicates a defect in the component.
As discussed above, by providing a method in which autonomous diagnosis of a defect in a component is performed using a machine learning algorithm, the quality and consistency of diagnosis of defects in components can be improved. Effects described above in relation to the first aspect apply to the corresponding features of this second aspect.
The method may further comprise, under control of the one or more computing systems: applying predetermined image processing and feature classification rules to the image of the component to provide a second determination of whether or not the image indicates a defect in the component; and comparing the first determination with the second determination and: if the first and second determination are both that the image indicates a defect in the component, determining that a defect is present in the component; if the first and second determination are both that the image does not indicate a defect in the component, determining that a defect is not present in the component.
The first determination and second determination may each comprise a measure of the extent of defects of the component, and determining whether or not a defect is present may be based at least in part on the measures of the extent of defects of the component. The measures of the extent of defects may provide a measure of the reliability of the component, as discussed above.
Determining whether or not a defect is present may comprise calculating a weighted sum of the measures of reliability of the component and determining that no defect is present if the weighted sum of the measures of the reliability of the component is greater than a first threshold measure.
Determining whether or not a defect is present may comprise: processing the measures of the reliability of the component using a fuzzy logic inference model.
The device may comprise a robotic device. The method may further comprise: using a developer applicator of the device, applying a developer to the component; and, using a cleaning device of the device, removing excess penetrant.
The method may further comprise: at a graphical user interface, receiving inputs specifying the position, size and type of a defect in a component corresponding to an image of the component; providing these inputs to the first image evaluation module with the corresponding image of the component; and training the first image evaluation module based on the provided inputs and the corresponding image of the component.
The method may comprise, at the graphical user interface, displaying a 3D model of the component. Receiving an input specifying the position of a defect in the component may comprise receiving an input specifying the position of the defect on the 3D model.
The method may comprise, using the one or more computing systems, controlling the device to position the component for inspection and controlling the camera to take an image of the component when positioned for inspection. The method may further comprise repeating these steps for a plurality of different positions of the component. In this case, the remaining method steps can be repeated for the image taken for each position of the component. In this way, all relevant parts of the component can be analysed for defects.
According to a third aspect of the disclosure, there is provided a computer-readable medium comprising computer-implementable instructions for causing a computer to cause a device to position a component for inspection and a camera to take an image of the component; and to process the image of the component with a machine learning algorithm to provide a first determination of whether or not the image indicates a defect in the component.
The computer-implementable instructions may additionally be for causing a computer to apply predetermined image processing and feature classification rules to the image of the component to provide a second determination of whether or not the image indicates a defect in the component; and to compare the first determination with the second determination and: if the first and second determination are both that the image indicates a defect in the component, determine that a defect is present in the component; if the first and second determination are both that the image does not indicate a defect in the component, determine that a defect is not present in the component.
The first determination and second determination may each comprise a measure of the reliability of the component, and the instructions may be for causing a computer to determine whether or not a defect is present based at least in part on the measures of the reliability of the component.
The instructions may be for causing a computer to determine that no defect is present if a weighted sum of the measures of the reliability of the component is greater than a first threshold measure. The instructions may be for causing a computer to determine that a defect is present if a weighted sum of the measures of the reliability of the component is less than a second threshold measure.
As discussed above, the first threshold measure and the second threshold measure may be the same. In that case, there will be two outputs: that a defect is not present, or that a defect is present.
As also discussed above, the first threshold measure may be higher than the second threshold measure. In this case, the instructions may be for causing a computer to determine that the status of the component is undecided if the weighted sum of the measures of the reliability of the component is at or between the two thresholds. When the weighted sum of the measures of the reliability of the component is less than or equal to the first threshold measure and greater than or equal to the second threshold measure, the instructions may be for causing a computer to provide the image of the component to a qualified inspector for assessment.
The instructions may be for causing a computer to determine whether or not a defect is present at least in part by: processing the measures of the reliability of the component using a fuzzy logic inference model.
The instructions may be for causing a computer to cause a robotic device to apply developer to the component and remove excess penetrant.
Optional features of each aspect are also optional features of each other aspect, with changes of terminology being inferred by the skilled addressee where necessary for these to make sense.
Specific embodiments will be described below by way of example only and with reference to the accompanying drawings, in which:
The camera 103 can detect UV or white light. It would be understood by the skilled person how to select an appropriate camera based on the type of PT implemented. The camera, or the device on which it is mounted, can also include a lighting device (not shown), which is arranged so as to illuminate the component when the system 100 is in use. Again, the light can be UV or white light depending on the type of PT implemented.
The robotic device 101 will now be described in more detail with reference to
In other examples, the device for positioning a component for inspection can take the form of any other device suitable for this purpose. For example, the device may be a platform on which the component can be placed. The platform can rotate so as to turn the component with respect to the camera. This is particularly useful in the case of larger parts which cannot easily be manipulated by the robotic device. The device may comprise both a platform and a robotic device. For example, where the device has a developer applicator and/or a cleaning device, as discussed below, these may form part of a robotic device (as discussed below), but the device may nevertheless include a platform on which the component can be placed.
Returning, now, to
It would be understood by the skilled person that although in this example, training data for the first image evaluation module 104 is gathered via the GUI 108, the data could be gathered by other means, for example by inputs into the database 109 from paper or electronic records of the position, size and type of defects indicated by images of a component.
With continued reference to
As shown in
The first image evaluation module 104, the second image evaluation module 105 and the evaluation comparison module 106 in this example are modules implemented in hardware on the computer 107. However, in other examples, one or more of the first image evaluation module 104, the second image evaluation module 105 and the evaluation comparison module 106 could be implemented as logical features of general-purpose circuitry such as a CPU or GPU. One or more of these modules 104, 105 or 106 could also or alternatively be implemented in an application-specific integrated circuit (ASIC) or field-programmable gate array (FPGA).
With reference to
In overview, the system 100 operates as follows: using the device 101, the component is positioned 303 for inspection; using the camera 103, an image of the component is taken 304; and under control of one or more computing systems 107 configured with executable instructions, the image of the component is processed 308 with a machine learning algorithm to detect from the image remaining penetrant on the component and based on characteristics of any detected penetrant to provide 309 a first determination of whether or not the image indicates a defect in the component.
These steps will be described in more detail below but first, with continued reference to
The GUI 108 is configured to display a 3D model of the component and to receive inputs specifying the position of the defect on the 3D model. Thus, in use, a qualified inspector views the 3D model of the component on the GUI, views a component suitably prepared for PT, and, based on the visible indications on the component, specifies the position of a defect in the component on the 3D model. The inspector also inputs the size and type of the defect. This information, along with a corresponding image or images of the component, is input 301 into the database. The first image evaluation module 104 is trained 306 based on the data and corresponding image. This process is repeated until the first image evaluation module 104 has been trained to a desired level. As just one example, 50 images with their corresponding inputs specifying the size, type and position of a defect may be provided 305 to the first image evaluation module 104 as training data for each position of a defect on a particular type of component. More or fewer images and corresponding inputs may be provided 305 based on the availability of such images and inputs and the desired accuracy of the determination of the first image evaluation module 104 of whether or not the image indicates a defect in the component.
With continued reference to
This determination can include a measure of the reliability of the component, for example a score from 0 to 100, where a score of 100 represents no defects and a score of 0 indicates that the component is unacceptably damaged. The score can be considered to be an indication of the probability that the component is reliable, where 0 represents that the component is not reliable, and 100 represents a certainty that the component is reliable.
Optionally, a qualified inspector may inspect 310 the component to approve the decision output by the first image evaluation module 104.
With reference now to
In overview, the system 100 operates as follows. As described above in relation to the operation of the system when it does not include a second image evaluation module 105 or evaluation comparison module 106, the following steps are performed: using the device 101, the component is positioned 303 for inspection; using the camera, an image of the component is taken 304 when positioned for inspection; and under control of one or more computing systems configured with executable instructions, the image of the component is processed 308 with a machine learning algorithm to detect from the image remaining penetrant on the component and based on characteristics of any detected penetrant to provide 309 a first determination of whether or not the image indicates a defect in the component.
In addition to these steps, when, as in
Accordingly, in the system 100 including both first and second image evaluation modules 104, 105, the first image evaluation module 104 is trained as described above and provides 309 a first determination of whether or not an image of a component being inspected indicates a defect in the component, as described above. In addition to this first determination, a second determination is also provided 409. This second determination is provided 409 by the second image evaluation module 105, which is programmed to identify indications of defects in images of components using standard image processing techniques which would be understood by the skilled person. The first determination provided 309 by the first image evaluation module 104 based on the machine learning algorithm developed by the first image evaluation module 104, and the second determination provided 409 by the second image evaluation module 105 based on its programming are compared by the evaluation comparison module 106. In the simple case in which both the first and second determinations indicate that a defect is present in the component, the evaluation comparison module 106 determines 411 that a defect is present in the component. In the other simple case, in which both the first and second determinations indicate that a defect is not present in the component, the evaluation comparison module 106 determines 413 that a defect is not present in the component.
Whenever the first and second determinations are other than the simple cases described above, the evaluation comparison module 106 evaluates 412 the determinations in a more complex manner. Two principal methods of evaluating 412 the determinations are described in this disclosure, although other methods of evaluation are also contemplated (for example an artificial intelligence (AI)-based evaluation). Each of these two methods of evaluating 412 the determinations using the evaluation comparison module 106 will now be described, with reference to
In the methods of both
In the method of
The method will now be described in more detail with continued reference to
The sum of the weighted scores is compared 506 to a threshold measure. For example, the threshold measure in the present case could be set at 90. If the sum is less than or equal to the threshold measure, it is determined 507 that a defect is present. If the sum is greater than the threshold measure, it is determined 508 that no defect is present. Accordingly, with the weightings, scores and threshold measure set out above for this particular example, it would be determined 508 that no defect is present.
In some examples, the sum of the weighted scores can also be compared against a second threshold measure (this comparison is not illustrated in
In the method shown in
In the present example, the fuzzy rules are as follows (where M1 indicates the first determination, weighted for certainty of the first image evaluation module, and M2 indicates the second determination, weighted for certainty of the second image evaluation module):
The output is a value of the reliability of the component, resulting in the surface plotted in
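Since the disclosure's specific fuzzy rules are set out with reference to the accompanying figures, the following is only an illustrative Python sketch of the general scheme: triangular membership functions over the 0-to-100 reliability scale, hypothetical zero-order Sugeno-style rules combining the weighted determinations M1 and M2, and a weighted-average defuzzification producing a single reliability value. All breakpoints, rules and consequent values are assumptions.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b (illustrative shapes)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Fuzzy sets on the 0-100 reliability scale (hypothetical breakpoints).
SETS = {
    "dangerous":  lambda x: tri(x, -1, 0, 50),
    "negligible": lambda x: tri(x, 30, 60, 90),
    "no_defect":  lambda x: tri(x, 70, 100, 101),
}

# Hypothetical rules: (set for M1, set for M2) -> crisp consequent value.
RULES = [
    ("no_defect",  "no_defect",  100.0),
    ("no_defect",  "negligible",  75.0),
    ("negligible", "no_defect",   75.0),
    ("negligible", "negligible",  60.0),
    ("no_defect",  "dangerous",   30.0),
    ("dangerous",  "no_defect",   30.0),
    ("negligible", "dangerous",   20.0),
    ("dangerous",  "negligible",  20.0),
    ("dangerous",  "dangerous",    0.0),
]

def infer_reliability(m1: float, m2: float) -> float:
    """Combine the two weighted module scores by fuzzy inference:
    each rule fires with strength min(mu_A(m1), mu_B(m2)) and the
    output is the strength-weighted average of the consequents."""
    fired = [(min(SETS[a](m1), SETS[b](m2)), out) for a, b, out in RULES]
    total = sum(s for s, _ in fired)
    if total == 0.0:
        return (m1 + m2) / 2.0  # fall back if no rule fires
    return sum(s * out for s, out in fired) / total
```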
Accordingly, as shown in
In the methods shown in
The methods shown in
The methods shown in
Computer-implementable instructions for causing a computer to carry out the method described herein (where relevant, by causing a device suitable for positioning a component for inspection and camera to carry out certain method steps) can be on a computer-readable medium (CRM). This is shown in
A computer readable medium may include non-transitory type media such as physical storage media including storage discs and solid state devices. A computer readable medium may also or alternatively include transient media such as carrier signals and transmission media. A computer-readable storage medium is defined herein as a non-transitory memory device. A memory device includes memory space within a single physical storage device or memory space spread across multiple physical storage devices.
There have now been described a system, method and computer-readable medium for autonomously diagnosing a defect in a component.
Number | Date | Country | Kind |
---|---|---|---|
20190337.4 | Aug 2020 | EP | regional |
This application is a national stage of, and claims priority to, Patent Cooperation Treaty Application No. PCT/EP2021/072087, filed on Aug. 6, 2021, which application claims priority to European Patent Application No. EP 20190337.4, filed on Aug. 10, 2020, which applications are hereby incorporated herein by reference in their entireties.
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/EP2021/072087 | 8/6/2021 | WO |