This application is a National Phase Application of PCT International Application No. PCT/IB2020/053298, having an International Filing Date of Apr. 7, 2020 which claims priority to Italian Application No. 102019000005536 filed Apr. 10, 2019, each of which is hereby incorporated by reference in its entirety.
The present invention refers to a method for acquiring images of a part to undergo optical inspection and a viewing group implementing said method, in particular for an optical inspection machine for the quality control of parts, for example gaskets.
Optical inspection machines for the quality control of parts are known. Generally, these machines comprise a rotary table on which the parts to be inspected are placed, and inspection stations arranged above the rotary table so as to subject each part to various optical inspections. The inspection stations are configured to acquire various images of the part, which differ depending on the various positions of the video cameras and/or the various lighting conditions of the part.
It is obvious that the greater the accuracy of the required quality control, the greater the number of inspection stations required to acquire various images of the part. This results in an increase in the size of the machine and the time to complete an inspection cycle.
Efforts have been made to reduce the number of inspection stations, and consequently the overall size of the machine and the inspection cycle time, by illuminating the part in the inspection station with several light sources arranged along various directions and emitting light beams of different colors.
An image processing unit connected to the video camera acquires the image obtained by the video camera and performs a certain number of filtering operations on it, so that the part is represented with the data of a single color at a time, thereby emphasizing the surfaces most exposed to the illumination source of that color.
The overall image of the part is then obtained by interpolation of the data obtained in the filtering operation. It is obvious that the resolution of the final image is much lower than that of the image obtained before the filtering operation. In some cases this loss of resolution may entail an unacceptable drop in the accuracy of the dimensional check of the part.
The purpose of this invention is to propose an image acquisition method, particularly for a viewing group of a machine for the quality control of parts, which can reduce the number of inspection stations, and therefore the overall size of the machine and the part inspection cycle time, without suffering the serious drawback mentioned above, namely the loss of resolution of the image used for the dimensional check of the part.
This purpose is achieved with an image acquisition method, with a viewing group, and with an optical inspection machine. The disclosure further describes preferred embodiments of the invention.
The features and advantages of the method, the viewing group, and the machine according to the invention will become clear from the description given below of preferred embodiments, given solely as non-limiting examples, in reference to the enclosed figures, wherein:
FIGS. 3, 3a, and 3b are three examples of various images of the same part obtained with the method and the viewing group according to the invention;
In said drawings, number 1 refers to a viewing group as a whole for acquiring images of a part 2 to be subjected to a dimensional check.
Viewing group 1 comprises a controllable digital video camera 10 for taking a series of images of framed part 2.
Digital video camera 10 preferably has a high frame rate; for example, it is capable of operating at 100 to 600 Hz (that is, acquiring 100 to 600 images per second). Each shot is defined by a preset “exposure” time, as indicated in the example shown.
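By way of illustration only, and assuming figures that are not taken from the description: at 400 images per second each shot lasts at most 2.5 ms, so a sequence of four differently lit images of the same part can be captured in roughly 10 ms.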
Viewing group 1 also comprises a group control unit, housed for example in a special space 3 of housing 4 that contains viewing group 1 in the example of a viewing machine 100 shown in
Viewing group 1 also comprises an illuminator 12 suitable for illuminating part 2 to be inspected, and an illumination control unit 14 operationally connected to illuminator 12.
Illuminator 12 comprises a plurality of illumination sources 16a, 16b, 16c, 16d, and 16e arranged to illuminate various portions of the part or to illuminate the part from different angles.
Illumination control unit 14 has a plurality of output channels 14a-14d, one for each individual illumination source and/or for each group of illumination sources.
Illumination control unit 14 is programmed to activate an output channel or a combination of output channels in response to an input signal 18 synchronized with the “trigger” control pulses of digital video camera 10, so as to generate a sequence of flashes s0-s3 of individual illumination sources and/or of individual groups of illumination sources according to a preset illumination program.
In one embodiment, illumination control unit 14 is programmed to adjust the brightness of the individual illumination sources 16a-16e and/or of the individual groups of illumination sources on the output channels activated according to the lighting program.
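Purely by way of illustration, the behaviour of an illumination control unit of this kind could be sketched in software as follows; the names (FlashStep, IlluminationController, on_trigger) and the data layout are hypothetical and are not taken from the disclosure. The sketch assumes that one program step is consumed per camera trigger pulse, with per-channel brightness expressed as a percentage.

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class FlashStep:
    """Brightness (0-100%) to apply to each output channel for one camera shot.

    Channels not listed are kept off for that shot.
    """
    brightness: Dict[str, int] = field(default_factory=dict)


@dataclass
class IlluminationController:
    """Minimal model of an illumination control unit.

    It steps through a preset illumination program, activating one
    combination of output channels, each at its programmed brightness,
    per camera trigger pulse.
    """
    program: List[FlashStep]
    _index: int = 0

    def on_trigger(self) -> FlashStep:
        """Called once per input signal synchronized with a camera trigger pulse."""
        step = self.program[self._index % len(self.program)]
        self._index += 1
        # A real unit would drive the physical output channels here
        # (e.g. via PWM or analog dimming) for the duration of the exposure time.
        return step
```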
In one embodiment shown in
In the chart of
In response to a second control pulse from digital video camera 10, and for the duration of the exposure time, the second illumination source is turned on with 100% brightness and a third illumination source is turned on with 80% brightness, whereas the first and fourth illumination sources are kept off.
In response to a third control pulse from digital video camera 10, and for the duration of the exposure time, the first illumination source is turned on with 100% brightness, the third illumination source is turned on with 50% brightness, and the fourth illumination source is turned on with 100% brightness. The second illumination source is left off.
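Purely as an illustration, and using the same hypothetical representation sketched above, the second and third flashes of this lighting program could be written down as shown below; the source labels and percentages are those given in the description, while the combination used for the first control pulse appears only in the chart of the figures and is therefore not reproduced here.

```python
# Hypothetical encoding of the lighting chart described above.
# Keys name the illumination sources; values are brightness in percent.
flash_sequence = [
    # first control pulse: combination shown in the chart of the figures (omitted here)
    {"16b": 100, "16c": 80},              # second control pulse
    {"16a": 100, "16c": 50, "16d": 100},  # third control pulse
]
```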
FIGS. 3, 3a, and 3b show three different images of part 2 obtained under the various lighting conditions.
As an innovation, each of the various images of the part, which represent various portions of the part's surface and/or represent the part with differing contrasts, has the maximum possible resolution because it is not obtained through filtering operations but rather from an appropriate selection of the illumination sources to be activated with each shot of the video camera. Consequently, the quality of the part's dimension measurement is the best possible for a given resolution of the video camera.
In one embodiment shown in
In one embodiment shown for example in
Machine 100 is provided with a processing unit, not shown, operationally connected to digital video camera 10 and suitable for acquiring the images of the part taken by the video camera under the various lighting conditions determined by the lighting program. Machine 100 is also provided with a user interface comprising a monitor on which the processing unit displays the images of the part, for example the images in
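A minimal sketch, assuming a simple one-frame-per-flash pairing, of what such a processing unit could do with the acquired images is given below; the names (Frame, pair_frames_with_flashes) are illustrative and are not part of the disclosure.

```python
from typing import Dict, List, Tuple

# Stand-in for whatever image type the camera interface actually returns.
Frame = bytes


def pair_frames_with_flashes(
    frames: List[Frame],
    flash_sequence: List[Dict[str, int]],
) -> List[Tuple[Dict[str, int], Frame]]:
    """Associate each acquired frame with the channel/brightness combination
    that was active during its exposure, so that each dimensional check is run
    on the full-resolution image taken under the relevant lighting condition."""
    if len(frames) != len(flash_sequence):
        raise ValueError("expected one frame per flash of the illumination program")
    return list(zip(flash_sequence, frames))
```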
The invention also refers to a method for acquiring images of a part 2 to be inspected, comprising the following steps:
taking a series of images of part 2 with a controllable digital video camera 10;
illuminating part 2 with an illuminator 12 comprising a plurality of illumination sources arranged to illuminate various portions of the part or to illuminate the part from different angles; and
activating, by means of an illumination control unit 14 having a plurality of output channels, an output channel or a combination of output channels in response to an input signal 18 synchronized with the control pulses of digital video camera 10, so as to generate a sequence of flashes of individual illumination sources and/or of individual groups of illumination sources according to a preset illumination program.
In one embodiment, the individual illumination sources or the individual groups of illumination sources are controllable by the illumination control unit 14, not only in terms of being turned on and off, but also in terms of adjusting the brightness level of the emitted light.
In one embodiment, video camera 10 itself sends an input signal to illumination control unit 14 in response to the control signal received by video camera 10.
In order to satisfy contingent requirements, a person skilled in the art could make modifications, adaptations, and substitutions of parts with functionally equivalent ones to the embodiments of the method and viewing group according to the invention. Each feature described as belonging to a possible embodiment may be implemented independently of the other described embodiments.