This application is based on and claims priority under 35 USC 119 from Japanese Patent Application No. 2021-095766 filed Jun. 8, 2021.
The present invention relates to a surface inspection apparatus, a non-transitory computer readable medium storing a program, and a surface inspection method.
Today, in various products, parts made by molding synthetic resin (hereinafter referred to as “molded products”) are used. On the other hand, visually observable defects may appear on the surface of the molded product. This type of defect includes a “sink mark” that is an unintentionally formed dent, a “weld” that is formed at a portion where the molten resin joins, and the like. In addition, even in the case of texture processing in which unevenness is intentionally formed on the surface, a difference from the expected texture may appear. The texture changes due to combined factors of color, luster, and unevenness.
As an example of apparatuses that inspect the quality of the surface of an object, there is an apparatus that displays the quality of the inspected portion as a single numerical value. The quantified numerical value is highly objective, unlike the result of a sensory test. On the other hand, the primary portion that has contributed to the calculation of the numerical value cannot be understood from the display of the numerical value alone. Therefore, an operator cannot check whether the portion to which he/she pays attention is used for the calculation of the numerical value, or whether a portion that he/she does not expect is used instead.
An example of related art includes JP2009-264882A.
Aspects of non-limiting embodiments of the present disclosure relate to a surface inspection apparatus and a non-transitory computer readable medium storing a program that make it possible to check whether or not a portion that has contributed to the calculation of a numerical value representing a quality of a surface matches the portion to which an operator pays attention, unlike a case where a relationship between the numerical value representing the quality of the surface and the portion that has contributed to the calculation is unknown.
Aspects of certain non-limiting embodiments of the present disclosure overcome the above disadvantages and/or other disadvantages not described above. However, aspects of the non-limiting embodiments are not required to overcome the disadvantages described above, and aspects of the non-limiting embodiments of the present disclosure may not overcome any of the disadvantages described above.
According to an aspect of the present disclosure, there is provided a surface inspection apparatus including an imaging device configured to image a surface of an object to be inspected, and a processor configured to calculate a numerical value representing a quality of the surface by processing an image captured by the imaging device, and display, on a display device, the image including an index for specifying a position of a portion that has contributed to the calculation of the numerical value and the numerical value.
Exemplary embodiment(s) of the present invention will be described in detail based on the following figures, wherein:
Usage Example of Surface Inspection Apparatus
Hereinafter, exemplary embodiments of the present invention will be described with reference to the drawings.
An imaging unit of the surface inspection apparatus 1 used in the first exemplary embodiment is a so-called area camera, and a range to be imaged (hereinafter referred to as an “imaging range”) is defined as a two-dimensional surface. Illuminations (not shown) are configured to include light components that satisfy specular reflection conditions over the entire imaging range.
In the case of
In the case of the inspection by the area camera, the inspection is performed with the surface inspection apparatus 1 and the inspection target 10 in a stationary state. In other words, the surface of the inspection target 10 is inspected in a state where the surface inspection apparatus 1 and the inspection target 10 do not move relative to each other.
In the case of
The actual inspection target 10 may have holes, notches, protrusions, steps, and the like.
The types of surface finishes of the inspection target 10 include no processing, mirror finish processing, semi-mirror finish processing, and texturing processing.
The surface inspection apparatus 1 inspects defects on the surface and textures of the inspection target 10.
Defects include, for example, sink marks and welds. The sink mark refers to a dent on the surface generated in the thick portion or the rib portion, and the weld refers to a streak generated in the portion where the tips of the molten resin join in the mold. The defects also include scratches and dents caused by hitting an object.
The texture is a visual or tactile impression, and is influenced by the color, luster, and unevenness of the surface of the object. The unevenness of the surface also includes streaks generated in cutting the mold. This type of streak is different from a defect.
The surface inspection apparatus 1 according to the present exemplary embodiment is used not only for inspection of defects and texture, but also for inspection of surface stains.
The surface inspection apparatus 1 generates an image in which defects on the surface of the inspection target 10 are emphasized, and quantifies a result of evaluating the texture to output the result.
The defects herein are unevenness and streaks appearing in the portion that should be flat, that is, sink marks and welds. The texture is evaluated by a numerical value (hereinafter also referred to as a “score”). The score is an example of a numerical value representing the quality of the surface of the inspection target 10.
For example, multivariate analysis is used to calculate the score. In multivariate analysis, features appearing in the luminance distribution are analyzed. An example of such a feature is a streaky pattern extending along a direction of the sink mark.
In addition, there is also a method of using artificial intelligence to calculate the score. For example, the score of a partial region within an inspection range is calculated by giving the image captured by the camera to a learning model obtained by deep machine learning of the relationship between the image of the defect and the score.
The inspection target 10 shown in
On the other hand, the surface inspection apparatus 1 is arranged vertically above the inspection target 10. In other words, an optical axis of an optical system used by the surface inspection apparatus 1 for imaging the inspection target 10 is set substantially parallel to the normal of the surface of the inspection target 10. Hereinafter, the conditions required for this optical axis are also referred to as “imaging conditions”.
In this case, the surface inspection apparatus 1 is installed at a position that satisfies the imaging conditions. The surface inspection apparatus 1 may be installed by fixing the surface inspection apparatus to a specific member, or may be detachably attached to the specific member.
However, the surface inspection apparatus 1 may be a portable apparatus. In a case where the surface inspection apparatus is portable, an operator inspects any surface by, for example, holding the surface inspection apparatus 1 in his/her hand and directing the light receiving surface toward the inspection target 10.
In
Configuration of Surface Inspection Apparatus
The surface inspection apparatus 1 shown in
The processor 101, the ROM 102, and the RAM 103 function as so-called computers.
The processor 101 realizes various functions through the execution of a program. For example, the processor 101 performs the calculation or the like of the score for evaluating the texture of the imaged surface of the inspection target 10 through the execution of the program.
Image data obtained by imaging the surface of the inspection target 10 is stored in the auxiliary storage device 104. For the auxiliary storage device 104, for example, a semiconductor memory or a hard disk device is used. Firmware and application programs are also stored in the auxiliary storage device 104. In the following, firmware and application programs are collectively referred to as a “program”.
The program that realizes the functions described in the present exemplary embodiment and other exemplary embodiments which will be described later can be provided not only by a communication unit but also by storing the program in a recording medium such as a CD-ROM.
The display 105 is, for example, a liquid crystal display or an organic EL display, and displays an image of the entire inspection target 10 or a specific portion of the inspection target 10. The display 105 is also used for positioning the imaging range with respect to the inspection target 10.
In the case of the present exemplary embodiment, the display 105 is integrally provided in the main body of the surface inspection apparatus, but may be an external device connected through the communication IF 109 or a part of another device connected through the communication IF 109. For example, the display 105 may be a display of another computer connected through the communication IF 109.
The operation reception device 106 is configured with a touch sensor arranged on the display 105, physical switches and buttons arranged on a housing, and the like.
In the case of the present exemplary embodiment, a power button and an imaging button are provided as an example of physical buttons. In a case where the power button is operated, for example, the light source 108 is turned on and the imaging by the camera 107 is started. Further, in a case where the imaging button is operated, a specific image captured by the camera 107 at the time of operation is acquired as an image for inspection.
A device that integrates the display 105 and the operation reception device 106 is called a touch panel. The touch panel is used to receive operations of a user on keys displayed in software (hereinafter also referred to as “soft keys”).
In the case of the present exemplary embodiment, a color camera is used as the camera 107. For the image sensor of the camera 107, for example, a charge coupled device (CCD) imaging sensor element or a complementary metal oxide semiconductor (CMOS) imaging element is used.
Since a color camera is used as the camera 107, it is possible in principle to observe not only the luminance of the surface of the inspection target 10 but also the color tone. The camera 107 is an example of an imaging device.
In the case of the present exemplary embodiment, a white light source is used as the light source 108. The white light source generates light in which components across the visible light band are evenly mixed.
In the case of the present exemplary embodiment, a parallel light source is used as the light source 108. Further, a telecentric lens is arranged on the optical axis of the camera 107.
The light source 108 in the present exemplary embodiment is arranged at an angle at which a light component specular-reflected on the surface of the inspection target 10 is mainly incident on the camera 107.
The communication IF 109 is configured with a module conforming to a wired or wireless communication standard. For the communication IF 109, for example, an Ethernet (registered trademark) module, a universal serial bus (USB), a wireless LAN, or the like is used.
Structure of Optical System
The opening portion 111 is provided with an opening 111A through which illumination light illuminating the surface of the inspection target 10 and reflected light reflected by the surface of the inspection target 10 pass, and a flange 111B surrounding an outer edge of the opening 111A.
In the case of
The opening 111A and the flange 111B do not have to have similar shapes; for example, the opening 111A may have a circular shape while the flange 111B has a rectangular shape.
The flange 111B is used for positioning the surface inspection apparatus 1 in an imaging direction with respect to the surface of the inspection target 10. In other words, the flange 111B is used for positioning the camera 107 and the light source 108 with respect to the surface to be inspected. The flange 111B also serves to prevent or reduce the incidence of external light or ambient light on the opening 111A.
The housing 100 shown in
Further, the display 105 and the operation reception device 106 are attached to the side surface of the housing 100 on the side where the camera 107 is attached.
A modulation transfer function (MTF) in the field of view of the camera 107 is generally uniform. Therefore, the variation in contrast due to the difference in the position in the field of view is small, and the surface of the inspection target 10 can be faithfully imaged.
In the case of
The surface of the actual inspection target 10 has structural or design unevenness, curved surfaces, steps, joints, fine unevenness formed in the molding process, and the like.
Therefore, as the normal N0 of the inspection target 10, an average value of the normal N0 of a region AR of interest in the inspection target 10 or the normal N0 of a specific position P of interest may be used.
Further, as the normal N0 of the inspection target 10, the normal N0 of an average virtual surface or a representative portion of the inspection target 10 may be used.
In the case of
Inspection Operation
The process shown in
In the surface inspection apparatus 1 according to the present exemplary embodiment, the light source 108 (see
In the captured image field 121, a distribution of luminance values, that is, a grayscale image is displayed. In the case of
In the example of
The legend 124 is shown on the right side of the captured image field 121. In the case of
In the case of the operation screen 120 shown in
In the present exemplary embodiment, in a case where an operator checking the image displayed on the display 105 operates the imaging button, the image used for evaluating the quality of the surface is determined.
Therefore, the processor 101, which has started the inspection operation in response to the operation of the power button, determines whether or not the operation of the imaging button has been received (step S1). The operation of the imaging button is an example of an operation of giving an instruction to start an inspection.
While a negative result is obtained in step S1, the processor 101 repeats the determination in step S1.
In a case where a positive result is obtained in step S1, the processor 101 acquires an image to be used for inspection (step S2). Specifically, the image displayed on the display 105 at the time when the imaging button is operated is acquired.
In the case of the present exemplary embodiment, in a case where the imaging button is operated, the update of the image displayed in the captured image field 121 (see
Next, the processor 101 calculates the score using the luminance profile within the inspection range (step S3).
The dent shown in the part (B) in
In this case, the luminance profile S shown in the part (C) in
The representative luminance value herein is given as an integral value of the luminance values of the pixels having an identical X coordinate. The convex waveform of the luminance profile S shows a bright region as compared with the surroundings, and the concave waveform of the luminance profile S shows a dark region as compared with the surroundings.
The score is calculated as, for example, a difference between the maximum value and the minimum value of the luminance profile S.
The score depends on the width, height, depth, number, and the like of the unevenness formed on the surface. For example, even if the height of the convex portion and the depth of the concave portion are identical, the score of a partial region in which a wider convex portion or concave portion is formed becomes higher.
Further, even if the widths of the convex portion and the concave portion are identical, the score of a partial region in which a higher convex portion or a deeper concave portion is formed becomes higher. In the case of the present exemplary embodiment, a high score means poor quality.
In the present exemplary embodiment, the partial region that contributes to the calculation of the score is defined as a space between the start point of the convex waveform and the end point of the concave waveform of the luminance profile S shown in the part (C) in
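The score calculation described above can be sketched as follows. This is a minimal illustration assuming the captured image is held as a 2-D NumPy array of luminance values; it simplifies the start point of the convex waveform and the end point of the concave waveform to the X positions of the profile's maximum and minimum, and the function names are illustrative, not from the original.

```python
import numpy as np

def luminance_profile(image):
    # Representative luminance per X coordinate: the integral (here, the sum)
    # of the luminance values of all pixels sharing that X coordinate.
    return image.sum(axis=0).astype(float)

def score_from_profile(profile):
    # Score as the difference between the maximum and minimum of the
    # luminance profile; in this scheme a higher score means poorer quality.
    return float(profile.max() - profile.min())

def contributing_region(profile):
    # Span between the bright (convex) peak and the dark (concave) trough,
    # a simplified stand-in for the waveform start/end points in the text.
    x_bright = int(np.argmax(profile))
    x_dark = int(np.argmin(profile))
    return min(x_bright, x_dark), max(x_bright, x_dark)
```

For a surface with one bright streak and one dark streak, the score is the sum of the two deviations from the flat background, and the contributing region spans the two streaks.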
In a case where the score is calculated, the processor 101 extracts the partial region corresponding to the maximum value of the score (step S4). The partial region herein is an image portion that has contributed to the calculation of the score in the inspection region. In a case where a plurality of maximum values of scores are found, a plurality of partial regions are extracted. Basically, one partial region is extracted.
Subsequently, the processor 101 displays a frame line indicating the extracted partial region in the image (step S5). Further, the processor 101 generates an image in which the features of the extracted partial region are emphasized (hereinafter referred to as an “emphasized image”) and displays the generated image separately (step S6).
In the present exemplary embodiment, the processor 101 extracts a specific periodic component appearing in a specific direction from the extracted partial region, inversely transforms the extracted periodic component into a feature image, and generates an emphasized image by superimposing the feature image on the original image.
For the extraction of periodic components, for example, a two-dimensional discrete cosine transform (DCT), a discrete sine transform (DST), a fast Fourier transform (FFT), or the like is used.
In the inverse transformation into the feature image, the intensity component (that is, the luminance value) of each pixel is normalized by the maximum value, so that the gradation range of the feature image is expanded. In addition, by mapping a color component to the intensity component of the feature image, the feature image can be distinguished from the original image portion expressed in gray scale.
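A minimal sketch of this emphasis process, assuming the FFT variant is used for the periodic-component extraction: the band of horizontal spatial frequencies to keep, the choice of red as the mapped color component, and the function name are all assumptions for illustration.

```python
import numpy as np

def emphasized_image(gray, band=(0.05, 0.30)):
    # Extract a periodic component along the X direction: keep only
    # horizontal spatial frequencies inside `band` (cycles/pixel; an
    # assumed range) and zero everything else in the 2-D FFT spectrum.
    spectrum = np.fft.fft2(gray.astype(float))
    fx = np.abs(np.fft.fftfreq(gray.shape[1]))
    keep = (fx >= band[0]) & (fx <= band[1])
    spectrum[:, ~keep] = 0.0

    # Inverse-transform into the feature image and normalize its intensity
    # by the maximum value to expand the gradation range.
    feature = np.abs(np.fft.ifft2(spectrum))
    feature /= max(feature.max(), 1e-12)

    # Map a color component (here, red) onto the feature intensity so it
    # stands out against the grayscale original.
    rgb = np.repeat(gray.astype(float)[..., None], 3, axis=2)
    rgb[..., 0] = np.clip(rgb[..., 0] + 255.0 * feature, 0.0, 255.0)
    return rgb.astype(np.uint8)
```

A DCT- or DST-based variant would differ only in the transform pair used; the normalization and color mapping are the same.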
By displaying the emphasized image, it is possible to check the surface state even in a case where it is difficult to visually recognize the minute structure in the grayscale image obtained by imaging the surface of the partial region where the score is calculated.
In the case of the present exemplary embodiment, the generated emphasized image is displayed side by side in the operation screen identical to the grayscale image captured by the camera 107.
In addition, the processor 101 displays the corresponding score on the operation screen 120 (see
Example of Operation Screen
Hereinafter, an example of a screen displayed at the time of inspection of the inspection target 10 by the surface inspection apparatus 1 will be described.
Screen Example 1
In the case of
In the score field 122 of the operation screen 120, a numerical value of “3.1” is displayed as the score calculated for this partial region.
Further, in the captured image field 121, a frame line 125 indicating a portion that has contributed to the calculation of the score (hereinafter referred to as a “main factor portion”) is displayed to be superimposed on the original image. In the case of
The display of the frame line 125 makes it possible to determine whether or not the partial region used to calculate the numerical value displayed in the score field 122 matches the partial region to which the operator pays attention.
The frame line 125 herein is an example of “an index for specifying a position of a portion that has contributed to the calculation of the score”.
The frame line 125 shown in
On the operation screen 120 shown in
In the case of
Screen Example 2
In the case of
The captured image field 121 shown in
Meanwhile, the portion surrounded by the frame line 125 corresponds to the structural outer edge of the inspection target 10 (see
Therefore, the operator can notice that the score displayed on the operation screen is not the score of the portion of his/her interest. In this case, the operator retakes the image to be inspected.
The processor 101 may output an error signal notifying that the captured image is unsuitable for surface quality inspection in a case where a pattern of luminance difference exceeding the threshold value is detected in the image.
Further, the processor 101 may output an error signal even in a case where the calculated score exceeds the threshold value. For example, in a case where the threshold value is “4”, the processor 101 outputs an error signal in a case where “5” is calculated as the score.
In a case where the error signal is output, the processor 101 may discard the image used to calculate the score and autonomously reacquire another image. In addition, the processor 101 may notify the operator of the need for reimaging. The notification may use a display or a sound. In this case, the processor 101 may display the reason why reimaging is considered necessary.
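The error conditions described above can be sketched as follows. The threshold values and the scalar luminance-range measure are assumptions for illustration (the text gives only the score threshold example of "4"), and the function name is not from the original.

```python
def reimaging_reasons(score, luminance_range,
                      score_threshold=4.0, luminance_threshold=200.0):
    # Collect the reasons why reimaging is considered necessary, so the
    # operator can be shown why the captured image was rejected.
    reasons = []
    if luminance_range > luminance_threshold:
        reasons.append("luminance difference pattern exceeds threshold; "
                       "image is unsuitable for surface quality inspection")
    if score > score_threshold:
        reasons.append("calculated score exceeds threshold")
    return reasons
```

An empty list means no error signal; a non-empty list triggers the notification (display or sound) together with the listed reasons.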
Screen Example 3
In the case of
Even in this case, the processor 101 (see
From this display, the operator knows that the score in the score field 122 is calculated from the lower pattern.
Screen Example 4
In the case of
Here, in a case where the two scores calculated for the two partial regions corresponding to each of the two streaky patterns have an identical value, the two frame lines 125 are displayed in the captured image field 121.
However, the operation screen 120 shown in
Even in a case where there are three or more scores whose difference from the maximum score is smaller than the threshold value, the number of frame lines 125 displayed in the captured image field 121 may be limited to a predetermined number. For example, only two frame lines 125 may be displayed in the captured image field 121: the frame line 125 indicating the partial region corresponding to the maximum score and the frame line 125 indicating the partial region corresponding to the second highest score.
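The selection of which frame lines to display can be sketched as follows. The difference threshold and the cap of two frames follow the example above, but the function name and the data layout (a list of score/region pairs) are assumptions.

```python
def frames_to_display(scored_regions, diff_threshold=0.5, max_frames=2):
    # `scored_regions` is a list of (score, region) pairs. Keep the regions
    # whose score differs from the maximum by less than `diff_threshold`,
    # capped at `max_frames` frame lines.
    ranked = sorted(scored_regions, key=lambda sr: sr[0], reverse=True)
    best_score = ranked[0][0]
    kept = [sr for sr in ranked if best_score - sr[0] < diff_threshold]
    return kept[:max_frames]
```

With three candidate regions scored 3.1, 3.0, and 2.0, only the first two fall within the threshold of the maximum and are framed.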
In the example of
Meanwhile, in the display form shown in
Also in the case of
Screen Example 5
In the screen example shown in
In a case where the difference in the magnitude of the score can be identified, the luminance of the frame line 125A and the luminance of the frame line 125B may be changed, or only one of the frame line 125A and the frame line 125B may be displayed by blinking.
Further, the difference in the magnitude of the score may be expressed by the difference in the line type such as a double line or a broken line. That is, the difference in the magnitude of the score may be expressed by the difference in the display form.
In the case of
Screen Example 6
In
The screen examples shown in
In the case of
For example, in a case where the operator taps the captured image field 121, the screen of the part (A) in
“3.1” is displayed in the score field 122 of the operation screen 120 shown in the part (A) in
On the other hand, “3” is displayed in the score field 122 of the operation screen 120 shown in the part (B) in
Although the operation screen 120 shown in
However, in a case where the number of screens to be switched is larger than a predetermined threshold value, providing a function of returning to the previous screen by a specific operation makes the check work of the operator more efficient than in a case where the screens can be switched only in the forward order.
Further, the screen switching may be executed by the processor 101 in units of several seconds.
Screen Example 7
In the case of the operation screen 120 shown in
Meanwhile, in the case of the operation screen 120 shown in
In the example of
In the case of
Meanwhile, in the case of the operation screen 120 shown in
In the case of
Screen Example 8
In the part (A) in
The legend 126 of the part (B) in
In the case of the part (B) in
The designation of the portion where the score is to be calculated is not limited to the case where the outer edge of the corresponding partial region is designated, but can also be performed by tapping the point of attention. In this case, the processor 101 may determine the partial region including the tapped point and calculate the score.
Screen Example 9
On the operation screen 120 shown in
On the other hand, on the operation screen 120 shown in
In the display by the arrow 125E, the range of the main factor portion cannot be known. However, as in the case of screen example 2 (see
Screen Example 10
In the case of the operation screen 120 shown in
The frame line 125 in the other screen examples is substantially rectangular, but the frame line 125 shown in
In the case of the present exemplary embodiment, the surface inspection apparatus 1 (see
The appearance configuration and the like of the surface inspection apparatus 1 according to the present exemplary embodiment are identical to the appearance configuration and the like of the surface inspection apparatus 1 described in the first exemplary embodiment.
In the case of
Therefore, in a case where the processor 101 acquires the image being captured by the camera 107 (step S11), the processor 101 calculates the score using the luminance profile within the inspection range (step S3).
Hereinafter, the processor 101 extracts a partial region corresponding to the maximum value of the score (step S4), displays the frame line 125 (see
After the end of step S7, the processor 101 returns to step S11 and repeatedly executes a series of processes. By repeating this series of processes, the score displayed on the operation screen 120 (see
That is, in a case where the position imaged by the camera 107 changes, the positions of the score and the frame line 125 also change.
A so-called line camera is used for an imaging unit of the surface inspection apparatus 1A used in the present exemplary embodiment. Therefore, the imaging range is linear.
In the case of the present exemplary embodiment, at the time of inspection, an inspection target 10 is moved in the direction of the arrow while being installed on a uniaxial stage 20. By moving the uniaxial stage 20 in one direction, the entire inspection target 10 is imaged.
The positional relationship between the camera 107 (see
(1) Although the exemplary embodiments of the present invention have been described above, the technical scope of the present invention is not limited to the scope described in the above-described exemplary embodiments. It is clear from the description of the claims that the above-described exemplary embodiments with various modifications or improvements are also included in the technical scope of the present invention.
(2) In the above-described exemplary embodiments, a color camera is used as the camera 107 (see
(3) In the above-described exemplary embodiments, a white light source is used as the light source 108 (see
Further, the illumination light is not limited to visible light, but may be infrared light, ultraviolet light, or the like.
(4) In the above-described exemplary embodiments, the surface inspection apparatus 1 (see
For example, two light sources may be used. In that case, one light source may be arranged at an angle at which a specular-reflected light component is mainly incident on the camera 107 (see
(5) In the above-described exemplary embodiments, a parallel light source is used as the light source 108 (see
(6) In the above-described exemplary embodiments, the processor 101 (see
(7) In the embodiments above, the term “processor” refers to hardware in a broad sense. Examples of the processor include general processors (e.g., CPU: Central Processing Unit) and dedicated processors (e.g., GPU: Graphics Processing Unit, ASIC: Application Specific Integrated Circuit, FPGA: Field Programmable Gate Array, and programmable logic device).
In the embodiments above, the term “processor” is broad enough to encompass one processor or plural processors in collaboration which are located physically apart from each other but may work cooperatively. The order of operations of the processor is not limited to one described in the embodiments above, and may be changed.
The foregoing description of the exemplary embodiments of the present invention has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise forms disclosed. Obviously, many modifications and variations will be apparent to practitioners skilled in the art. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, thereby enabling others skilled in the art to understand the invention for various embodiments and with the various modifications as are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the following claims and their equivalents.
| Number | Date | Country | Kind |
| --- | --- | --- | --- |
| 2021-095766 | Jun 2021 | JP | national |

| Number | Name | Date | Kind |
| --- | --- | --- | --- |
| 5544256 | Brecher | Aug 1996 | A |
| 20070201739 | Nakagaki | Aug 2007 | A1 |
| 20100162401 | Sakaki | Jun 2010 | A1 |
| 20180347970 | Sasaki | Dec 2018 | A1 |
| 20190120770 | Chen | Apr 2019 | A1 |
| 20190139212 | Hanzawa | May 2019 | A1 |
| 20190285554 | Konishi | Sep 2019 | A1 |
| 20200134807 | Ozaki | Apr 2020 | A1 |
| 20200234419 | Ota | Jul 2020 | A1 |
| 20200292462 | Chen | Sep 2020 | A1 |

| Number | Date | Country |
| --- | --- | --- |
| 2001305075 | Oct 2001 | JP |
| 2009264882 | Nov 2009 | JP |

| Number | Date | Country |
| --- | --- | --- |
| 20220392052 A1 | Dec 2022 | US |