The present invention relates to a defect inspection apparatus and a defect inspection method.
PTL 1 discloses an apparatus for detecting a defect on a surface of a work. This apparatus acquires a plurality of images of a measurement target portion of the work while a bright and dark pattern created by an illumination device is moved relative to the work targeted for the detection of the surface defect, extracts a feature point in each image by performing binarization processing with a threshold value applied to each of the images, acquires a multi-dimensional feature amount for each feature point, and extracts a tentative defect candidate. If, among the plurality of images in which the tentative defect candidate is extracted, the number of images in which the tentative defect candidate matches estimated coordinates is equal to or greater than a preset threshold value, the apparatus determines that the tentative defect candidate is a defect candidate, synthesizes the plurality of images containing the determined defect candidate to generate a synthesized image, and detects the defect based on the generated synthesized image.
However, the surface defect inspection apparatus disclosed in PTL 1 may end up erroneously detecting the defect because it synthesizes the plurality of images containing the determined defect candidate to generate the synthesized image and detects the defect based on the single generated synthesized image.
One object of the present invention is to provide a defect inspection apparatus and a defect inspection method capable of improving the determination accuracy of an inspection.
According to one aspect of the present invention, a defect inspection method includes causing a defect inspection apparatus to perform an image acquisition step of changing a relative pose between an inspection target and an imaging portion and capturing an image each time the relative pose is changed, thereby acquiring a plurality of images, a feature amount acquisition step of acquiring a specific feature acquired by scanning each of the plurality of images as a first feature amount, a waveform data acquisition step of acquiring first waveform data in which the first feature amount is arranged in an order of a change in the relative pose, and a defect determination step of determining a defect according to the first waveform data.
Therefore, according to the one aspect of the present invention, the determination accuracy of the inspection can be improved by determining a stain or a defect according to the first waveform data in which the first feature amount is arranged in the order of the change in the relative pose.
An inspection apparatus 1 according to the first embodiment includes a camera (an imaging portion) 2, a robot 3, and a computer 4.
The camera 2 captures an image of a crown surface 5a of a piston (an inspection target) 5.
The crown surface 5a of the piston 5 has a machined surface and an as-cast surface significantly different from each other in surface property.
The robot 3 changes the pose (the angle) of the piston 5 relative to the camera 2.
The computer 4 is, for example, a personal computer, and includes a CPU 6.
The CPU 6 includes an image acquisition portion 7, a feature amount acquisition portion 8 including a region extraction portion 8a, a waveform data acquisition portion 9, and a defect determination portion 10.
The image acquisition portion 7 acquires a plurality of images of the crown surface 5a of the piston 5 captured by the camera 2 (an image acquisition step).
The region extraction portion 8a extracts and identifies a defect candidate region by means of machine learning and/or image processing (a defect candidate region identification step).
The machine learning is, for example, learning using a neural network, and a CNN (Convolutional Neural Network) is employed in the first embodiment.
The feature amount acquisition portion 8 acquires images of a plurality of defect candidate regions extracted and identified by the region extraction portion 8a by means of the machine learning and/or the image processing, and entirely scans a pixel region of each of them, thereby acquiring a luminance difference (a first feature amount) (a feature amount acquisition step).
The luminance difference refers to a difference between a maximum value and a minimum value of acquired luminances.
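As an illustrative sketch only (not the apparatus's actual implementation), the luminance difference described above can be computed by scanning a pixel region and taking the difference between the maximum and minimum luminance values; the function name and array representation are assumptions for illustration:

```python
import numpy as np

def luminance_difference(region):
    """Difference between the maximum and minimum luminance in a
    scanned pixel region (the first feature amount of the first
    embodiment). `region` is a 2-D array of grayscale pixel values."""
    region = np.asarray(region, dtype=float)
    return float(region.max() - region.min())

# A dark defect against a bright field yields a large difference,
# while a faint stain yields a small one.
defect_patch = np.array([[200, 200], [200, 20]])   # dark spot
stain_patch = np.array([[200, 200], [200, 170]])   # faint spot
```

In this sketch the defect patch produces a luminance difference of 180 and the stain patch a difference of 30, matching the characteristic that the difference is large at a defect and small at a stain.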
The waveform data acquisition portion 9 acquires first waveform data in which the acquired first feature amount is arranged in the order of a change in the relative pose (the angle) (a waveform data acquisition step).
The luminance difference is set to zero at an angle number not extracted and identified as the defect candidate region.
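The arrangement of the first feature amount in pose order, with zero filled in at angle numbers that were not identified as defect candidate regions, can be sketched as follows; the helper name, the dictionary input, and the default of 25 angles (taken from the embodiment) are assumptions for illustration:

```python
import numpy as np

def first_waveform(feature_by_angle, num_angles=25):
    """Arrange the first feature amount in the order of the change in
    the relative pose (angle). Angle numbers with no identified defect
    candidate region are set to zero. `feature_by_angle` maps 1-based
    angle numbers to feature values."""
    waveform = np.zeros(num_angles)
    for angle_number, value in feature_by_angle.items():
        waveform[angle_number - 1] = value
    return waveform

# Hypothetical example: candidate regions at angle numbers 14 to 17.
wave = first_waveform({14: 80.0, 15: 90.0, 16: 85.0, 17: 75.0})
```

The resulting vector is the first waveform data: consecutive nonzero entries correspond to a feature amount that emerges consecutively over the pose change, while scattered nonzero entries correspond to one that emerges randomly.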
The defect determination portion 10 determines a defect based on the shape of the first waveform data by means of the machine learning (a defect determination step).
This machine learning is, for example, learning using a neural network, and an RNN (Recurrent Neural Network) is employed in the first embodiment.
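A minimal sketch of a recurrent pass over the first waveform data is given below. The hand-rolled Elman-style update and the randomly initialized weights are illustrative stand-ins for a trained RNN, not learned values from the apparatus:

```python
import numpy as np

def rnn_classify(waveform, w_in, w_rec, w_out, hidden_size=4):
    """Feed the per-angle feature values to a recurrent cell in pose
    order and map the final hidden state to a scalar defect score.
    A trained model would threshold this score (e.g. > 0 -> defect)."""
    h = np.zeros(hidden_size)
    for x in waveform:                      # consume values in angle order
        h = np.tanh(w_in * x + w_rec @ h)   # simple Elman-style update
    return float(w_out @ h)

# Illustrative (untrained) weights.
rng = np.random.default_rng(0)
w_in = rng.normal(size=4)
w_rec = rng.normal(size=(4, 4)) * 0.1
w_out = rng.normal(size=4)
```

Because the hidden state is updated once per angle, the RNN can exploit the shape of the waveform, e.g. whether nonzero feature amounts emerge consecutively or randomly, which is exactly the distinction the defect determination portion relies on.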
In step S1, the robot 3 changes the relative pose (the angle) of the piston 5, and the image acquisition portion 7 acquires the plurality of images of the crown surface 5a of the piston 5 captured by the camera 2 (the image acquisition step).
More specifically, the total number of relative poses (angles) is 25.
In step S2, the region extraction portion 8a extracts and identifies the defect candidate region by means of the machine learning and/or the image processing (the defect candidate region identification step).
In step S3, the feature amount acquisition portion 8 acquires the images of the plurality of defect candidate regions extracted and identified by the region extraction portion 8a by means of the machine learning and/or the image processing, and entirely scans the pixel region of each of them, thereby acquiring the luminance difference (the feature amount acquisition step).
In Step S4, the waveform data acquisition portion 9 acquires the first waveform data in which the acquired luminance difference is arranged in the order of the change in the relative pose (the angle) (the waveform data acquisition step).
The luminance difference is set to zero at an angle number not extracted and identified as the defect candidate region.
In step S5, the defect determination portion 10 determines a defect based on the shape of the first waveform data by means of the machine learning (the defect determination step).
More specifically, the luminance difference has such characteristics that it is small at a stain, which is imaged in light black, and large at a defect, which is imaged in dark black, as illustrated in
Further, the luminance difference has such characteristics that it emerges randomly at a stain and emerges consecutively in a bright field at a defect, even when the value thereof is the same between them, as illustrated in
More specifically,
In the defect candidate region identification step, the region extraction portion 8a extracts and identifies images numbered with angle numbers 14 and 18 to 20 as the defect candidate regions for a machined surface 1, and extracts and identifies images numbered with angle numbers 14 to 17 as the defect candidate regions for a machined surface 2.
In the feature amount acquisition step and the waveform data acquisition step, the feature amount acquisition portion 8 acquires the images numbered with the angle numbers 14 and 18 to 20 of the machined surface 1 and the images numbered with the angle numbers 14 to 17 of the machined surface 2 extracted and identified as the defect candidate regions by the region extraction portion 8a, and entirely scans the pixel region of each of them, thereby acquiring the luminance difference. Then, the waveform data acquisition portion 9 acquires the first waveform data arranged in the order of the change in the relative pose (the angle).
In the defect determination step, the defect determination portion 10 determines a defect using the RNN (Recurrent Neural Network) based on the acquired first waveform data of the luminance difference.
That is, the defect determination portion 10 determines that the images numbered with the angle numbers 14 and 18 to 20 of the machined surface 1 are non-defective (a stain) because a luminance difference a1 thereof is small, and the images numbered with the angle numbers 14 to 17 of the machined surface 2 are defective (a defect) because a luminance difference a2 thereof is large.
More specifically,
In the defect candidate region identification step, the region extraction portion 8a extracts and identifies images numbered with angle numbers 10, 12, and 15 as the defect candidate regions for the as-cast surface, and extracts and identifies images numbered with angle numbers 14 to 17 as the defect candidate regions for the machined surface.
In the feature amount acquisition step and the waveform data acquisition step, the feature amount acquisition portion 8 acquires the images numbered with the angle numbers 10, 12, and 15 of the as-cast surface and the images numbered with the angle numbers 14 to 17 of the machined surface extracted and identified as the defect candidate regions by the region extraction portion 8a, and entirely scans the pixel region of each of them, thereby acquiring the luminance difference. Then, the waveform data acquisition portion 9 acquires the first waveform data arranged in the order of the change in the relative pose (the angle).
In the defect determination step, the defect determination portion 10 determines a defect using the RNN (Recurrent Neural Network) based on the acquired first waveform data of the luminance difference.
That is, there is no difference between the luminance difference a3 of the images numbered with the angle numbers 10, 12, and 15 of the as-cast surface and the luminance difference a3 of the images numbered with the angle numbers 14 to 17 of the machined surface, but the luminance difference of the as-cast surface emerges randomly while the luminance difference of the machined surface emerges consecutively.
Therefore, the defect determination portion 10 determines that the images numbered with the angle numbers 10, 12, and 15 of the as-cast surface are non-defective (a stain) and the images numbered with the angle numbers 14 to 17 of the machined surface are defective (a defect).
Next, advantageous effects of the first embodiment will be described.
The first embodiment brings about the following advantageous effects.
Therefore, the first embodiment can improve the determination accuracy of inspecting whether the target is defective (a defect) or non-defective (a stain).
Therefore, the first embodiment can further improve the determination accuracy of the inspection, because the luminance more sharply changes as the difference increases between the maximum value and the minimum value of the luminances.
(3) The region extraction portion 8a of the feature amount acquisition portion 8 is configured to extract and identify the defect candidate region using the CNN (Convolutional Neural Network), which is machine learning.
Therefore, the first embodiment facilitates the determination about a defect.
Therefore, the first embodiment can further improve the determination accuracy of inspecting whether the target is defective (a defect) or non-defective (a stain).
More specifically, the luminance gradient has such characteristics that the maximum value of the luminance gradient is small at a stain and the maximum value of the luminance gradient is large at a defect as illustrated in
Further, the luminance gradient has such characteristics that the luminance gradient emerges randomly at a stain and emerges consecutively in a bright field at a defect as illustrated in
More specifically,
In the defect candidate region identification step, the region extraction portion 8a extracts and identifies images numbered with angle numbers 14 and 18 to 20 as the defect candidate regions for the machined surface 1, and extracts and identifies images numbered with angle numbers 21 to 24 as the defect candidate regions for the machined surface 2.
In the feature amount acquisition step and the waveform data acquisition step, the feature amount acquisition portion 8 acquires the images numbered with the angle numbers 14 and 18 to 20 of the machined surface 1 and the images numbered with the angle numbers 21 to 24 of the machined surface 2 extracted and identified as the defect candidate regions by the region extraction portion 8a, and entirely scans the pixel region of each of them, thereby acquiring the luminance gradient. Then, the waveform data acquisition portion 9 acquires the first waveform data arranged in the order of the change in the relative pose (the angle).
The luminance gradient is set to zero at an angle number not extracted and identified as the defect candidate region.
In the defect determination step, the defect determination portion 10 determines a defect using the RNN (Recurrent Neural Network) based on the acquired first waveform data of the luminance gradient.
That is, the defect determination portion 10 determines that the images numbered with the angle numbers 14 and 18 to 20 of the machined surface 1 are non-defective (a stain) because the luminance gradient thereof is small, and the images numbered with the angle numbers 21 to 24 of the machined surface 2 are defective (a defect) because the luminance gradient thereof is large.
More specifically,
In the defect candidate region identification step, the region extraction portion 8a extracts and identifies images numbered with angle numbers 14, 16, and 18 as the defect candidate regions for the as-cast surface, and extracts and identifies images numbered with angle numbers 21 to 24 as the defect candidate regions for the machined surface.
In the feature amount acquisition step and the waveform data acquisition step, the feature amount acquisition portion 8 acquires the images numbered with the angle numbers 14, 16, and 18 of the as-cast surface and the images numbered with the angle numbers 21 to 24 of the machined surface extracted and identified as the defect candidate regions by the region extraction portion 8a, and entirely scans the pixel region of each of them, thereby acquiring the luminance gradient. Then, the waveform data acquisition portion 9 acquires the first waveform data arranged in the order of the change in the relative pose (the angle).
The luminance gradient is set to zero at an angle number not extracted and identified as the defect candidate region.
In the defect determination step, the defect determination portion 10 determines a defect using the RNN (Recurrent Neural Network) based on the acquired first waveform data of the luminance gradient.
That is, the defect determination portion 10 determines that the images numbered with the angle numbers 14, 16, and 18 of the as-cast surface are non-defective (a stain) because the luminance gradient thereof is small and emerges randomly, and the images numbered with the angle numbers 21 to 24 of the machined surface are defective (a defect) because the luminance gradient thereof is large and emerges consecutively.
In this manner, while the first embodiment uses the luminance difference as the first feature amount, the second embodiment is configured to use the luminance gradient as the first feature amount.
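As a hypothetical sketch of the second embodiment's feature amount (the function name and finite-difference approximation are assumptions, not the claimed implementation), the maximum luminance gradient of a scanned region can be computed as follows. The maximum is used because a plurality of gradients is acquired per region:

```python
import numpy as np

def luminance_gradient(region):
    """Maximum luminance gradient within a scanned pixel region (the
    first feature amount of the second embodiment), approximated here
    by finite differences of the grayscale values."""
    region = np.asarray(region, dtype=float)
    gy, gx = np.gradient(region)            # per-pixel gradients
    return float(np.max(np.hypot(gx, gy)))  # magnitude, then maximum

# A sharp-edged defect yields a larger maximum gradient than a
# gradually shaded stain with the same overall luminance range.
sharp = np.array([[200.0, 200.0, 20.0], [200.0, 200.0, 20.0]])
soft = np.array([[200.0, 110.0, 20.0], [200.0, 110.0, 20.0]])
```

Here the sharp edge produces a maximum gradient twice that of the soft edge even though both regions span the same luminance range, which is the characteristic the second embodiment exploits.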
Other than that, the second embodiment is configured similarly to the first embodiment, and therefore similar components to the first embodiment will not be described here while being indicated by the same reference numerals.
Next, advantageous effects of the second embodiment will be described.
The second embodiment can bring about the following advantageous effects similarly to the first embodiment.
Therefore, the second embodiment can improve the determination accuracy of inspecting whether the target is defective (a defect) or non-defective (a stain).
Therefore, although a plurality of luminance gradients can be acquired, the use of the maximum value can further improve the determination accuracy of inspecting whether the target is defective (a defect) or non-defective (a stain).
Therefore, the second embodiment facilitates the determination about a defect.
Therefore, the second embodiment can further improve the determination accuracy of inspecting whether the target is defective (a defect) or non-defective (a stain).
More specifically, the luminance dispersion (a standard deviation of a luminance distribution) σ has such characteristics that σ is high at a stain and low at a defect, as illustrated in
More specifically,
In the defect candidate region identification step, the region extraction portion 8a extracts and identifies images numbered with angle numbers 14, 16, and 18 as the defect candidate regions for the as-cast surface, and extracts and identifies images numbered with angle numbers 22 to 24 as the defect candidate regions for the machined surface.
In the feature amount acquisition step and the waveform data acquisition step, the feature amount acquisition portion 8 acquires the images numbered with the angle numbers 14, 16, and 18 of the as-cast surface and the images numbered with the angle numbers 22 to 24 of the machined surface extracted and identified as the defect candidate regions by the region extraction portion 8a, and entirely scans the pixel region of each of them, thereby acquiring the luminance dispersion (the standard deviation of the luminance distribution) σ. Then, the waveform data acquisition portion 9 acquires the first waveform data arranged in the order of the change in the relative pose (the angle).
The luminance dispersion (the standard deviation of the luminance distribution) is set to zero at an angle number not extracted and identified as the defect candidate region.
In the defect determination step, the defect determination portion 10 determines a defect using the RNN (Recurrent Neural Network) based on the acquired first waveform data of the luminance dispersion (the standard deviation of the luminance distribution).
That is, the defect determination portion 10 determines that the images numbered with the angle numbers 14, 16, and 18 of the as-cast surface are non-defective (a stain) because the luminance dispersion (the standard deviation of the luminance distribution) σ is high, and the images numbered with the angle numbers 22 to 24 of the machined surface are defective (a defect) because the luminance dispersion (the standard deviation of the luminance distribution) σ is low.
In this manner, while the first embodiment uses the luminance difference as the first feature amount, the third embodiment is configured to use the luminance dispersion (the standard deviation of the luminance distribution) σ as the first feature amount.
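The third embodiment's feature amount can be sketched in a few lines; the function name is an illustrative assumption, and the population standard deviation is used here for simplicity:

```python
import numpy as np

def luminance_dispersion(region):
    """Standard deviation (sigma) of the luminance distribution in a
    scanned pixel region (the first feature amount of the third
    embodiment). A stain shows a high sigma, a defect a low one."""
    return float(np.std(np.asarray(region, dtype=float)))
```

For example, a uniformly dark defect region yields σ near zero, while a mottled stain region mixing bright and dark pixels yields a large σ.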
Other than that, the third embodiment is configured similarly to the first embodiment, and therefore similar components to the first embodiment will not be described here while being indicated by the same reference numerals.
Therefore, the third embodiment can bring about similar advantageous effects to the first embodiment.
More specifically,
As the geometric characteristics, a stain is characterized by being small in width, area, and/or the like, and a defect is characterized by being large in width and/or area.
In the defect candidate region identification step, the region extraction portion 8a extracts and identifies images having a predetermined or lower luminance as the defect candidate regions of the machined surface 1, and extracts and identifies images having the predetermined or lower luminance as the defect candidate regions of the machined surface 2 (
The geometric characteristics are set to zero at an angle number not extracted and identified as the defect candidate region.
In the defect determination step, the defect determination portion 10 determines a defect using the RNN (Recurrent Neural Network) based on the acquired waveform data of the geometric characteristics.
That is, the defect determination portion 10 determines that the images of the machined surface 1 are defective (a defect) because the geometric characteristics thereof are large in width and/or area, and the images of the machined surface 2 are non-defective (a stain) because the geometric characteristics thereof are small in width and/or area.
In this manner, while the first embodiment uses the luminance difference as the first feature amount, the fourth embodiment is configured to use the geometric characteristics as the first feature amount.
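A minimal sketch of the fourth embodiment's feature amount follows. The thresholding scheme, the threshold value, and the function name are assumptions for illustration; the width is taken as the column span of the dark pixels and the area as their count:

```python
import numpy as np

def geometric_characteristics(region, threshold=100):
    """Width and area of the dark portion of a defect candidate region
    (the first feature amount of the fourth embodiment). Pixels at or
    below `threshold` (an assumed value) belong to the candidate."""
    mask = np.asarray(region) <= threshold
    area = int(mask.sum())                        # number of dark pixels
    cols = np.flatnonzero(mask.any(axis=0))       # columns containing them
    width = int(cols.max() - cols.min() + 1) if cols.size else 0
    return width, area
```

Under this sketch, a defect with a wide dark band yields large width and area, while a narrow stain yields small values, matching the stated characteristics.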
Other than that, the fourth embodiment is configured similarly to the first embodiment, and therefore similar components to the first embodiment will not be described here while being indicated by the same reference numerals.
Therefore, the fourth embodiment can bring about similar advantageous effects to the first embodiment.
For example, when the color intensity has such characteristics that the luminance differences of G (green) and B (blue) are large at a defect and the luminance differences of G (green) and R (red) are large at a stain, whether the target is a defect can be determined based on the magnitudes of the luminance differences of B (blue) and R (red).
More specifically,
In the defect candidate region identification step, the region extraction portion 8a extracts and identifies images as the defect candidate regions of the machined surface and extracts and identifies images as the defect candidate regions of the as-cast surface (
In the feature amount acquisition step and the waveform data acquisition step, the feature amount acquisition portion 8 acquires the plurality of images extracted and identified as the defect candidate regions by the region extraction portion 8a, and entirely scans the pixel region of each of them, thereby acquiring the color intensity (the luminance difference) of each of them. Then, the waveform data acquisition portion 9 acquires the first waveform data arranged in the order of the change in the relative pose (the angle).
The color intensity (the luminance difference) is set to zero at an angle number not extracted and identified as the defect candidate region.
In the defect determination step, the defect determination portion 10 determines a defect using the RNN (Recurrent Neural Network) based on the acquired waveform data of the intensity (the luminance difference) for each color (RGB).
That is, the defect determination portion 10 determines that the images of the machined surface are defective (a defect) because the luminance difference of B (blue) is large and the luminance difference of R (red) is small, and that the images of the as-cast surface are non-defective (a stain) because the luminance difference of B (blue) is small and the luminance difference of R (red) is large.
In this manner, while the first embodiment uses the luminance difference as the first feature amount, the fifth embodiment is configured to use the color intensity (the luminance difference) as the first feature amount.
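The per-channel color intensity of the fifth embodiment can be sketched as the max-min luminance difference computed separately for each of R, G, and B; the function name and the channel ordering of the array are assumptions for illustration:

```python
import numpy as np

def color_intensity(region_rgb):
    """Per-channel luminance difference (max - min) of an RGB region,
    the first feature amount of the fifth embodiment. A large B (blue)
    difference with a small R (red) difference suggests a defect; the
    reverse suggests a stain."""
    region = np.asarray(region_rgb, dtype=float)
    return {channel: float(region[..., i].max() - region[..., i].min())
            for i, channel in enumerate(("R", "G", "B"))}
```

The three resulting values would each be arranged in pose order, so the defect determination can compare how the B and R differences behave across the waveform.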
Other than that, the fifth embodiment is configured similarly to the first embodiment, and therefore similar components to the first embodiment will not be described here while being indicated by the same reference numerals.
Therefore, the fifth embodiment can bring about similar advantageous effects to the first embodiment.
The first embodiment uses only the first feature amount, whereas the sixth embodiment is configured to use a second feature amount in addition to the first feature amount.
More specifically, in step S3a, the feature amount acquisition portion 8 acquires the plurality of images of the defect candidate regions extracted and identified by the region extraction portion 8a, and entirely scans the pixel region of each of them, thereby acquiring the luminance difference (the first feature amount) and the luminance gradient (the second feature amount). In step S4a, the waveform data acquisition portion 9 acquires the first waveform data and second waveform data in which the luminance difference (the first feature amount) and the luminance gradient (the second feature amount) are arranged in the order of the change in the relative pose (the angle), respectively. In step S5a, the defect determination portion 10 determines a defect based on the shapes of the first waveform data and the second waveform data by means of the machine learning.
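The pairing of the first and second waveform data in steps S4a and S5a can be sketched as a two-channel arrangement, zero-filled at angle numbers with no defect candidate region; the function name, the dictionary inputs, and the 25-angle default are assumptions for illustration:

```python
import numpy as np

def stacked_waveforms(diff_by_angle, grad_by_angle, num_angles=25):
    """Arrange the luminance difference (first feature amount) and the
    luminance gradient (second feature amount) in pose order as the
    first and second waveform data, and stack them as a 2-channel
    input for the defect determination (sixth embodiment)."""
    stacked = np.zeros((2, num_angles))
    for row, mapping in enumerate((diff_by_angle, grad_by_angle)):
        for angle_number, value in mapping.items():
            stacked[row, angle_number - 1] = value
    return stacked
```

Feeding both channels to the determination step lets a case with a large gradient but a small difference (a stain) be separated from one where both are large (a defect), as in the example below.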
Other than that, the sixth embodiment is configured similarly to the first embodiment, and therefore similar components to the first embodiment will not be described here while being indicated by the same reference numerals.
More specifically,
In the defect candidate region identification step, the region extraction portion 8a extracts and identifies images numbered with angle numbers 18, 20, and 23 as the defect candidate regions for the as-cast surface, and extracts and identifies images numbered with angle numbers 22 to 25 as the defect candidate regions for the machined surface.
In the feature amount acquisition step and the waveform data acquisition step, the feature amount acquisition portion 8 acquires the images numbered with the angle numbers 18, 20, and 23 of the as-cast surface and the images numbered with the angle numbers 22 to 25 of the machined surface extracted and identified as the defect candidate regions by the region extraction portion 8a, and entirely scans the pixel region of each of them, thereby acquiring the luminance difference and the luminance gradient. Then, the waveform data acquisition portion 9 acquires the first waveform data and the second waveform data in which the luminance difference and the luminance gradient are arranged in the order of the change in the relative pose (the angle), respectively.
Further, the luminance difference and the luminance gradient are set to zero at an angle number not extracted and identified as the defect candidate region.
In the defect determination step, the defect determination portion 10 determines a defect using the RNN (Recurrent Neural Network) based on the acquired first waveform data and second waveform data.
That is, a luminance gradient b1 of the image numbered with the angle number 18 of the as-cast surface and the luminance gradient b1 of the image numbered with the angle number 22 of the machined surface are large, but the luminance difference a1 of the as-cast surface is small and the luminance difference a2 of the machined surface is large.
Therefore, the defect determination portion 10 determines that the image numbered with the angle number 18 of the as-cast surface is non-defective (a stain) and the image numbered with the angle number 22 of the machined surface is defective (a defect).
The defect determination portion 10 also makes the determination with respect to the images numbered with the other extracted and identified angle numbers in a similar manner.
More specifically,
In the defect candidate region identification step, the region extraction portion 8a extracts and identifies images numbered with angle numbers 18, 20, and 23 as the defect candidate regions for the as-cast surface, and extracts and identifies images numbered with angle numbers 22 to 25 as the defect candidate regions for the machined surface.
In the feature amount acquisition step and the waveform data acquisition step, the feature amount acquisition portion 8 acquires the images numbered with the angle numbers 18, 20, and 23 of the as-cast surface and the images numbered with the angle numbers 22 to 25 of the machined surface extracted and identified as the defect candidate regions by the region extraction portion 8a, and entirely scans the pixel region of each of them, thereby acquiring the luminance difference and the luminance gradient. Then, the waveform data acquisition portion 9 acquires the first waveform data and the second waveform data in which the luminance difference and the luminance gradient are arranged in the order of the change in the relative pose (the angle), respectively.
Further, the luminance difference and the luminance gradient are set to zero at an angle number not extracted and identified as the defect candidate region.
In the defect determination step, the defect determination portion 10 determines a defect using the RNN (Recurrent Neural Network) based on the acquired first waveform data and second waveform data.
That is, the luminance gradient b1 of the image numbered with the angle number 18 of the as-cast surface is small and a luminance gradient b2 of the image numbered with the angle number 22 of the machined surface is large, and the luminance difference a1 of the as-cast surface and the luminance difference a1 of the machined surface are large.
Therefore, the defect determination portion 10 determines that the image numbered with the angle number 18 of the as-cast surface is non-defective (a stain) because the luminance difference a1 is large but the luminance gradient b1 is small, and the image numbered with the angle number 22 of the machined surface is defective (a defect) because both the luminance difference a1 and the luminance gradient b2 are large.
The defect determination portion 10 also makes the determination with respect to the images numbered with the other extracted and identified angle numbers in a similar manner.
Next, advantageous effects of the sixth embodiment will be described.
The sixth embodiment brings about the following advantageous effects in addition to the advantageous effects of the first embodiment.
The sixth embodiment uses the first feature amount and the second feature amount, whereas the seventh embodiment is configured to use a third feature amount in addition to the first feature amount and the second feature amount.
More specifically, in step S3b, the feature amount acquisition portion 8 acquires the plurality of images of the defect candidate regions extracted and identified by the region extraction portion 8a, and entirely scans the pixel region of each of them, thereby acquiring the luminance difference (the first feature amount), the luminance gradient (the second feature amount), and the luminance dispersion (the third feature amount). In step S4b, the waveform data acquisition portion 9 acquires the first waveform data, the second waveform data, and third waveform data in which the luminance difference (the first feature amount), the luminance gradient (the second feature amount), and the luminance dispersion (the third feature amount) are arranged in the order of the change in the relative pose (the angle), respectively. In step S5b, the defect determination portion 10 determines a defect based on the shapes of the first waveform data, the second waveform data, and the third waveform data by means of the machine learning.
Other than that, the seventh embodiment is configured similarly to the sixth embodiment, and therefore similar components to the sixth embodiment will not be described here while being indicated by the same reference numerals.
More specifically,
In the defect candidate region identification step, the region extraction portion 8a extracts and identifies images numbered with angle numbers 14, 17, 18, and 20 as the defect candidate regions for the as-cast surface, and extracts and identifies images numbered with angle numbers 19, 22, 23, and 25 as the defect candidate regions for the machined surface.
In the feature amount acquisition step and the waveform data acquisition step, the feature amount acquisition portion 8 acquires the images numbered with the angle numbers 14, 17, 18, and 20 of the as-cast surface and the images numbered with the angle numbers 19, 22, 23, and 25 of the machined surface extracted and identified as the defect candidate regions by the region extraction portion 8a, and entirely scans the pixel region of each of them, thereby acquiring the luminance difference, the luminance gradient, and the luminance dispersion. Then, the waveform data acquisition portion 9 acquires the first waveform data, the second waveform data, and the third waveform data in which the luminance difference, the luminance gradient, and the luminance dispersion are arranged in the order of the change in the relative pose (the angle), respectively.
The luminance difference, the luminance gradient, and the luminance dispersion are set to zero at an angle number not extracted and identified as the defect candidate region.
In the defect determination step, the defect determination portion 10 determines a defect using the RNN (Recurrent Neural Network) based on the acquired first waveform data, second waveform data, and third waveform data.
That is, the defect determination portion 10 determines that the as-cast surface is non-defective (a stain) because the luminance gradient and the luminance difference are large but the luminance dispersion is high, and the machined surface is defective (a defect) because the luminance gradient and the luminance difference are large and the luminance dispersion is low.
Next, advantageous effects of the seventh embodiment will be described.
The seventh embodiment brings about the following advantageous effects in addition to the advantageous effects of the sixth embodiment.
Therefore, the seventh embodiment further improves the determination accuracy due to an increase in the information amount.
In the first embodiment, the region extraction portion 8a extracts and identifies the defect candidate regions using the CNN (Convolutional Neural Network) to acquire the luminance difference (the first feature amount), whereas the eighth embodiment is configured to entirely scan the pixel region of each of the images at the plurality of angles to acquire the luminance difference (the first feature amount).
Other than that, the eighth embodiment is configured similarly to the first embodiment, and therefore similar components to the first embodiment will not be described here while being indicated by the same reference numerals.
Next, the advantageous effects of the eighth embodiment will be described.
The eighth embodiment brings about the following advantageous effects in addition to the advantageous effects of the first embodiment.
Therefore, the eighth embodiment can reduce the cost of the computer 4.
While the embodiments for implementing the present invention have been described, the specific configuration of the present invention is not limited to the configurations of the embodiments, and the present invention also includes design modifications and the like made within a range that does not depart from the spirit of the present invention.
For example, the inspection target is not limited to the piston, and the machine learning is not limited to the CNN (Convolutional Neural Network) and the RNN (Recurrent Neural Network).
Further, the pose control portion changes the pose of the piston by means of the robot, but may fulfill this intended purpose by changing the position of the camera.
The present application claims priority under the Paris Convention to Japanese Patent Application No. 2022-98638 filed on Jun. 20, 2022, the entire disclosure of which, including the specification, the claims, the drawings, and the abstract, is incorporated herein by reference in its entirety.
Number | Date | Country | Kind |
---|---|---|---|
2022-098638 | Jun 2022 | JP | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2023/021136 | 6/7/2023 | WO |