1. Field
The following description relates to a vision inspection system and a method of inspecting an inspection object using the same, and more particularly, to a vision inspection system which obtains a number of scanned images of an inspection object for inspection, and an inspection method of inspecting the inspection object using the vision inspection system.
2. Description of the Related Art
An optical inspection system consists of a camera and a computer. The camera obtains image data by capturing images of various inspection objects, and the computer processes the image data input from the camera using an image processing program. The optical inspection system has been widely utilized in various fields, including recognition and inspection of an inspection object, sorting of work-pieces into defective and non-defective ones, and the like.
A number of patent documents, such as U.S. Pat. No. 7,030,351 and US Patent Application Publication No. 2003/0197925 A1, disclose a vision inspection system. In particular, the vision inspection system disclosed in the above-mentioned documents consists of a work-piece stage, a camera stage, a controller, a camera, and a computer. The work-piece stage is movable along an X axis and a Y axis so as to load, unload, and position a work-piece. The camera stage is installed above the work-piece stage and is operable to move and rotate along the X, Y, and Z axes for positioning and focusing the camera. The controller is connected to the computer to control the operation of the work-piece stage, the camera stage, and the camera.
According to the prior art, the vision inspection system employs a high-resolution linescan camera to precisely inspect a defect of the work-piece in micrometer units. The linescan camera scans an inspection object along one horizontal line at a time to acquire a scanned image. The inspection of large-sized inspection objects, such as a cell, a panel, a module, or a glass substrate of a thin film transistor-liquid crystal display (TFT-LCD), a plasma display panel (PDP), or an organic electroluminescent (OEL) display, is performed by a number of linescan cameras. The linescan cameras divide the entire inspection object into a plurality of areas and scan the divided areas. A plurality of markings are placed on scanned images as reference points, such that coordinates of a defect can be calculated with reference to the markings by the computer that processes the scanned images.
However, the vision inspection system according to the prior art has the respective linescan cameras positioned by the individual camera stages, so that it takes a significant amount of time and effort for the arrangement of the linescan cameras, and precise alignment of the linescan cameras is difficult to achieve. The positions of the linescan cameras are easily changed by various conditions such as vibration, impact, and mechanical modulation. Hence, to achieve reliability and reproducibility of the inspection, a method of easily recognizing positions of the linescan cameras is required and positioning of the linescan cameras must be periodically performed.
The following description relates to a vision inspection system which calculates processing parameters of linescan cameras by providing markings on a table on which an inspection object is loaded to be transferred, and an inspection method of inspecting the inspection object using the vision inspection system.
In addition, the following description relates to a vision inspection system which easily performs positioning and arrangement of linescan cameras, and an inspection method of inspecting an inspection object using the vision inspection system.
Also, the following description relates to a vision inspection system which accurately inspects a defect of an inspection object to significantly increase reliability and reproducibility, and an inspection method of inspecting an inspection object using the vision inspection system.
In one general aspect, provided is a vision inspection system including: a work-piece stage configured to include a table on which an inspection object is loaded and move the table between a first position at which the inspection object is loaded and a second position at which an image of the inspection object is scanned; a plurality of linescan cameras, each configured to be arranged at the second position along a direction orthogonal to a transfer direction of the inspection object and scan an image of the inspection object to obtain a scanned image; and a computer configured to be connected with the work-piece stage and the linescan cameras and process the scanned image of the inspection object which is input from each of the linescan cameras, wherein a plurality of markings, each of which has a marking stage coordinate value, are provided on an upper surface of the table along an arrangement direction of the linescan cameras such that the linescan cameras can obtain scanned images of the markings, two neighboring markings are placed in the field of view of each of the linescan cameras, the markings between the first and the last markings are respectively placed in overlapping portions of the fields of view of each two neighboring linescan cameras, and the computer is configured to compute marking image coordinate values from scanned images of the markings which are input from the linescan cameras and simultaneously process the scanned image of the inspection object using the marking image coordinate values.
In another general aspect, provided is an inspection method of inspecting an inspection object using a vision inspection system which comprises a work-piece stage configured to include a table on which an inspection object is loaded and move the table linearly between a first position at which the inspection object is loaded and a second position at which an image of the inspection object is scanned, a plurality of linescan cameras, each configured to be arranged at the second position along a direction orthogonal to a transfer direction of the inspection object and scan an image of the inspection object to obtain a scanned image, and a computer configured to be connected with the work-piece stage and the linescan cameras and process the scanned image of the inspection object by processing image data of the inspection object which is input from each of the linescan cameras, the inspection method including: providing a plurality of markings, each of which has a marking stage coordinate value, on an upper surface of the table along an arrangement direction of the linescan cameras such that the linescan cameras can obtain scanned images of the markings; obtaining the scanned images of the markings using the linescan cameras; calculating a marking image coordinate value from the scanned image of each of the markings; obtaining the scanned image of the inspection object using the linescan cameras when the marking image coordinate value falls within an allowable tolerance range with respect to the marking stage coordinate value; calculating a work-piece image coordinate value of the inspection object from the scanned image of the inspection object; calculating a work-piece image-stage coordinate value from the work-piece image coordinate value; and determining the inspection object as being non-defective when the work-piece image-stage coordinate value falls within an allowable tolerance range with respect to the work-piece stage coordinate value.
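For readers who prefer pseudocode, the sequence of operations in this aspect can be summarized as in the following sketch. All names (move_table, scan_cameras, locate_markings, and so on) are hypothetical placeholders for the stage control and image processing described later, not part of the described system's actual software; the marking comparison is shown after a presumed conversion of marking image coordinates into stage coordinates.

```python
def inspect(move_table, scan_cameras, locate_markings, marking_stage_coords,
            locate_work_piece, image_to_stage, work_piece_stage_coord, tol):
    """Sketch of the inspection method; every argument is a hypothetical
    placeholder (callables for stage motion, scanning, and image processing,
    plus known stage coordinates and an allowable tolerance in mm)."""
    move_table("second position")                      # move the table under the cameras
    # Scan the markings and compare them with their known stage coordinates.
    marking_coords = locate_markings(scan_cameras())
    if any(abs(mx - sx) > tol or abs(my - sy) > tol
           for (mx, my), (sx, sy) in zip(marking_coords, marking_stage_coords)):
        move_table("first position")                   # return the table
        return "re-arrange the linescan cameras"
    # Markings are within tolerance: scan the inspection object itself.
    wx, wy = locate_work_piece(scan_cameras())         # work-piece image coordinate
    WX, WY = image_to_stage(wx, wy)                    # work-piece image-stage coordinate
    TX, TY = work_piece_stage_coord
    if abs(WX - TX) <= tol and abs(WY - TY) <= tol:
        return "non-defective"
    return "defective"
```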
Other features and aspects will be apparent from the following detailed description, the drawings, and the claims.
Throughout the drawings and the detailed description, unless otherwise described, the same drawing reference numerals will be understood to refer to the same elements, features, and structures. The relative size and depiction of these elements may be exaggerated for clarity, illustration, and convenience.
The following description is provided to assist the reader in gaining a comprehensive understanding of the methods, apparatuses, and/or systems described herein. Accordingly, various changes, modifications, and equivalents of the methods, apparatuses, and/or systems described herein will be suggested to those of ordinary skill in the art. Also, descriptions of well-known functions and constructions may be omitted for increased clarity and conciseness.
Referring to
A work-piece stage 30 is installed on the upper surface of the surface table 20 to load and carry the inspection object 2. The work-piece stage 30 consists of a table 32 and a linear actuator 34. The table 32 is disposed to move on the upper surface of the surface table 20 along one direction of the surface table 20, that is, the X axis or the Y axis. The inspection object 2 is placed fixedly on an upper surface of the table 32 by a clamp or a fixture. In the example shown in
The linear actuator 34 is interposed between the upper surface of the surface table 20 and a lower surface of the table 32. The linear actuator 34 consists of a pair of linear motion guides 36 and a linear motor 38. The pair of linear motion guides 36 is interposed between the upper surface of the surface table 20 and the lower surface of the table 32, and the linear motor 38 is disposed between the pair of linear motion guides 36 and connected to the table 32. The linear motion guides 36 include a pair of guide rails 36a and a plurality of sliders 36b. The guide rails 36a are fixed on the upper surface of the surface table 20, and the sliders 36b are fixed to the lower surface of the table 32 and operable to slide along the guide rails 36a. The table 32 moves linearly, driven by the linear motor 38 and guided by the linear motion guides 36.
The linear actuator 34 may include a servo motor, a lead screw, a ball nut, and a pair of linear motion guides. The work-piece stage 30 may be implemented as a rectangular coordinate robot having X-axis and Y-axis linear actuators that move the table 32 linearly along the X- and Y-axis directions of the surface table 20. Furthermore, the work-piece stage 30 may include a multiaxial robot that linearly reciprocates the table 32 along the X-, Y-, and Z-axis directions of the surface table 20 and rotates the table 32 with respect to the X, Y, and Z axes. The rectangular coordinate robot or the multiaxial robot enables the inspection object 2 to be positioned precisely on the table 32.
A plurality of linescan cameras 40-1, 40-2, 40-3, . . . , 40-n are arranged on the upper surface of the surface table 20 along the X-axis direction so as to be aligned with respect to the second position P2. The linescan cameras 40-1, 40-2, 40-3, . . . , 40-n divide the inspection object 2 into areas, acquire images of the divided areas, and output scanned images of the inspection object 2. The linescan cameras 40-1, 40-2, 40-3, . . . , 40-n are installed, respectively, on a plurality of camera stages 50. The camera stages 50 are mounted on the overhead frame 26. Because the camera stages 50 are operable to move the linescan cameras 40-1, 40-2, 40-3, . . . , 40-n linearly in the X-, Y-, or Z-axis direction and to rotate them with respect to the X, Y, or Z axis, the positioning and focusing of the linescan cameras 40-1, 40-2, 40-3, . . . , 40-n can be precisely performed. The camera stages 50 may be configured to be driven by a linear actuator, a rectangular coordinate robot, or a multiaxial robot instead of being fixed to the overhead frame 26.
The vision inspection system 10 shown in the examples illustrated in
The computer 60 controls the operation of the work-piece stage 30 to move the inspection object 2 with respect to the linescan cameras 40-1, 40-2, 40-3, . . . , 40-n. Moreover, the computer 60 processes the scanned images input from the linescan cameras 40-1, 40-2, 40-3, . . . , 40-n using an image processing program, and outputs resultant data, such as an output scanned image of the inspection object 2 and a result of inspecting the defect 4, through an output device such as a monitor 62.
Referring to
The plurality of markings M-1, M-2, M-3, . . . , M-n are arranged along an arrangement direction of the linescan cameras 40-1, 40-2, 40-3, . . . , 40-n, that is, along the X-axis direction. Two neighboring markings of the markings M-1, M-2, M-3, . . . , M-n are placed in the field of view FOV-1, FOV-2, FOV-3, . . . , FOV-n of each linescan camera 40-1, 40-2, 40-3, . . . , 40-n. The markings between the first and the last markings M-1 and M-n are respectively placed in overlapping portions of the fields of view of each two neighboring linescan cameras. In the example illustrated in
Hereinafter, a method of inspecting an inspection object using the vision inspection system having the above-described configuration will be described with reference to
Referring to
Referring to
Then, the linescan cameras 40-1, 40-2, 40-3, . . . , 40-n are driven to scan the markings M-1, M-2, M-3, . . . , M-n and obtain scanned images (S106), and marking image coordinate values are obtained from the scanned images of the markings M-1, M-2, M-3, . . . , M-n (S108). The computer 60 transmits a frame trigger signal to the linescan cameras 40-1, 40-2, 40-3, . . . , 40-n so that the linescan cameras 40-1, 40-2, 40-3, . . . , 40-n scan the images simultaneously.
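The description does not prescribe a particular algorithm for extracting the marking image coordinate values from the scanned images. As one illustrative possibility only (an assumption, not the described method), the coordinate of a dark marking on a brighter table could be taken as the centroid of the thresholded marking pixels:

```python
import numpy as np

def marking_image_coordinate(scanned_image, threshold=128):
    """Illustrative only: compute a marking image coordinate value (mx, my) in
    pixels, measured from the zero point of the image frame, as the centroid of
    the pixels darker than a threshold. scanned_image is a 2-D grayscale array;
    the marking is assumed to appear darker than the surrounding table."""
    ys, xs = np.nonzero(scanned_image < threshold)
    if xs.size == 0:
        raise ValueError("no marking found in the scanned image")
    return float(xs.mean()), float(ys.mean())
```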
A frame trigger line FT is shown in the examples illustrated in
The linescan cameras 40-1, 40-2, 40-3, . . . , 40-n input, to the computer 60, scanned images acquired by capturing the moving table 32 and the markings M-1, M-2, M-3, . . . , M-n. As illustrated in
Thereafter, the computer 60 determines whether the marking image coordinate values fall within an allowable tolerance range with respect to the marking stage coordinate values of the markings M-1, M-2, M-3, . . . , M-n (S110). The determination of whether the difference between the marking image coordinate values and the marking stage coordinate values of the markings M-1, M-2, M-3, . . . , M-n falls within the allowable tolerance may be performed by verifying processing parameters of the linescan cameras 40-1, 40-2, 40-3, . . . , 40-n. The processing parameters include a pixel resolution, an X-axis stage coordinate value OX (mm) and a Y-axis stage coordinate value OY (mm) of the zero point of the image frame, and an inclination angle of each linescan camera. The pixel resolution refers to the actual size of one pixel in the scanned image. The inclination angle of each linescan camera 40-1, 40-2, 40-3, . . . , 40-n refers to the angle of the linescan camera with respect to the X axis. The processing parameters of the linescan cameras 40-1, 40-2, 40-3, . . . , 40-n are acquired from the marking stage coordinate values and the marking image coordinate values.
An actual size value ReX (mm/Px) of one pixel in the X-axis direction with respect to the scanned images of the markings M-1, M-2, M-3, . . . , M-n is calculated using Equation 1. Although ReX is nominally determined by the optical systems of the linescan cameras 40-1, 40-2, 40-3, . . . , 40-n, there may be a minute error due to alignment errors between the linescan cameras 40-1, 40-2, 40-3, . . . , 40-n. Thus, for precise inspection of the inspection object 2, ReX is calculated using Equation 1 below

ReX=(M2X−M1X)/(m2x−m1x) (1)

where X and x have the same positive direction. M1X represents an X-axis stage coordinate value of the left marking of the two markings located in the field of view of each linescan camera, and M2X represents an X-axis stage coordinate value of the right marking. m1x represents an X-axis image coordinate value of the left marking, and m2x represents an X-axis image coordinate value of the right marking.
The inclination angle θ (radian) of each linescan camera with respect to the X axis is obtained using Equation 2 below

θ=tan⁻¹{[(M2Y−M1Y)−(m2y−m1y)×ReY]/[(m2x−m1x)×ReX]} (2)

where M1Y and M2Y represent Y-axis stage coordinate values of the left and right markings of the two markings placed in the field of view of each linescan camera, m1y and m2y represent Y-axis image coordinate values of the left and right markings, respectively, and ReY (mm/Px) is the actual size of one pixel in the Y-axis direction given by Equation 4 below.
The X-axis stage coordinate value OX (mm) and Y-axis stage coordinate value OY (mm) of the zero point 44 of the image frame 42 can be obtained as defined by Equations 3 below. OX and OY represent the actual coordinate values on the table.
OX=M1X−m1x×ReX
OY=M1Y−m1y×ReY−m1x×ReX×tan θ (3)
Here, X and x have the same positive direction, and Y and y also have the same positive direction. M1Y represents a Y-axis stage coordinate value of the left marking of the two markings placed in the field of view of each linescan camera. m1y represents a Y-axis image coordinate value of the left marking.
An actual size ReY (mm/Px) of one pixel in the Y-axis direction with respect to the scanned images of the markings M-1, M-2, M-3, . . . , M-n is determined by a travel speed S (mm/sec) of the inspection object and a cycle C (sec) of the trigger signal, and can be calculated as defined by Equation 4:
ReY=S×C (4)
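Taken together, Equations 1 to 4 let the computer 60 derive all of the processing parameters of one linescan camera from the two markings in its field of view. The following is a minimal sketch, assuming the equation forms given above, stage coordinates in mm, and image coordinates in pixels; the numeric values in the usage line are hypothetical.

```python
import math

def processing_parameters(M1, M2, m1, m2, S, C):
    """Processing parameters of one linescan camera from the two markings in
    its field of view, following Equations 1 to 4.
    M1, M2: (X, Y) stage coordinates (mm) of the left and right markings.
    m1, m2: (x, y) image coordinates (Px) of the left and right markings.
    S: travel speed of the inspection object (mm/sec); C: trigger cycle (sec)."""
    ReY = S * C                                           # Equation 4
    ReX = (M2[0] - M1[0]) / (m2[0] - m1[0])               # Equation 1
    theta = math.atan2((M2[1] - M1[1]) - (m2[1] - m1[1]) * ReY,
                       (m2[0] - m1[0]) * ReX)             # Equation 2
    OX = M1[0] - m1[0] * ReX                              # Equation 3
    OY = M1[1] - m1[1] * ReY - m1[0] * ReX * math.tan(theta)
    return ReX, ReY, theta, OX, OY

# Hypothetical values: two markings 100 mm apart, imaged 5000 pixels apart.
print(processing_parameters((10.0, 0.0), (110.0, 0.1), (100, 20), (5100, 22),
                            S=50.0, C=0.0004))
```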
To inspect a scanned image of the inspection object 2, the processing parameters of the linescan cameras 40-1, 40-2, 40-3, . . . , 40-n should be accurate. If the processing parameters are out of an allowable tolerance range, the precise inspection of the inspection object 2 cannot be realized. If the processing parameters of the linescan cameras 40-1, 40-2, 40-3, . . . , 40-n, which are obtained by processing the marking stage coordinate values and the marking image coordinate values, are within the allowable tolerance range, the computer 60 determines that the arrangement of the linescan cameras 40-1, 40-2, 40-3, . . . , 40-n is completed.
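Operation S110 can therefore be viewed as a comparison of the derived processing parameters against nominal values. A hedged sketch of such a check, with the tolerance bands treated as assumptions rather than values given in the description:

```python
def arrangement_complete(params, nominal, tol):
    """Illustrative check for operation S110: the arrangement of a linescan
    camera is considered complete when each processing parameter stays within
    its allowable tolerance of the nominal value. params, nominal, and tol are
    dicts keyed by "ReX", "ReY", "theta", "OX", and "OY" (units as above)."""
    return all(abs(params[k] - nominal[k]) <= tol[k] for k in nominal)
```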
If the marking image coordinate values are out of the allowable tolerance range, the computer 60 stops the linescan cameras 40-1, 40-2, 40-3, . . . , 40-n, and drives the linear motor 38 of the linear actuator 34 in the other direction to return the table 32 to the first position P1 (S112). When the table 32 is returned to the first position P1, the computer 60 outputs, to an output device such as the monitor 62, a message requesting re-arrangement of the linescan cameras 40-1, 40-2, 40-3, . . . , 40-n (S114), and the procedure is terminated. An operator operates the camera stages 50 of the respective linescan cameras 40-1, 40-2, 40-3, . . . , 40-n to move the linescan cameras 40-1, 40-2, 40-3, . . . , 40-n linearly in the X-, Y-, and Z-axis directions and rotate them with respect to the X, Y, and Z axes. Accordingly, precise positioning and focusing of the linescan cameras 40-1, 40-2, 40-3, . . . , 40-n can be performed, thereby aligning the linescan cameras 40-1, 40-2, 40-3, . . . , 40-n.
Meanwhile, in operation S110, when the marking image coordinate values are within the allowable tolerance range, the computer 60 drives the linescan cameras 40-1, 40-2, 40-3, . . . , 40-n to scan the inspection object 2 to obtain the scanned image (S116). The linescan cameras 40-1, 40-2, 40-3, . . . , 40-n scan the inspection object 2 which is moved while being loaded on the table 32 to obtain the scanned image, and input the scanned image of the inspection object 2 to the computer 60.
The computer 60 calculates the work-piece image coordinate value from the scanned image of the inspection object 2 (S118), and calculates a work-piece image-stage coordinate value from the calculated work-piece image coordinate value (S120). The computer 60 substitutes the work-piece image coordinate value into the stage coordinate transformation to produce the work-piece image-stage coordinate value. The work-piece image-stage coordinate value is an actual stage coordinate value of the inspection object 2.
The stage coordinate transformation may be expressed as Equations 5 below, which calculate the work-piece stage coordinate value from the work-piece image coordinate value.
WX=OX+wx×ReX
WY=OY+wy×ReY+wx×ReX×tan θ (5)
Here, WX (mm) represents a work-piece stage coordinate value with respect to the X axis, and WY (mm) represents a work-piece stage coordinate value with respect to the Y axis. wx represents a work-piece image coordinate value with respect to the X axis, and wy represents a work-piece image coordinate value with respect to the Y axis.
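A direct sketch of the stage coordinate transformation of Equation 5, using the processing parameters derived above:

```python
import math

def image_to_stage(wx, wy, OX, OY, ReX, ReY, theta):
    """Stage coordinate transformation of Equation 5: map a work-piece image
    coordinate (wx, wy) in pixels to a work-piece image-stage coordinate
    (WX, WY) in mm on the table."""
    WX = OX + wx * ReX
    WY = OY + wy * ReY + wx * ReX * math.tan(theta)
    return WX, WY
```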
The computer 60 determines whether the work-piece image-stage coordinate value obtained from Equation 5 falls within an allowable tolerance range with respect to the work-piece stage coordinate value (S122). When the work-piece image-stage coordinate value is within the allowable tolerance range with respect to the work-piece stage coordinate value, the computer 60 determines that the inspection object 2 is non-defective (S124).
When the work-piece image-stage coordinate value is out of the allowable tolerance range with respect to the work-piece stage coordinate value, the computer 60 detects the portion at which the work-piece image-stage coordinate value deviates from the allowable tolerance range with respect to the work-piece stage coordinate value as a defect 4 (S126), and calculates a defect stage coordinate value of the defect 4 (S128). Specifically, the computer 60 computes a defect image coordinate value of the defect 4 from the scanned image of the inspection object 2, and calculates the defect stage coordinate value of the defect 4 by substituting the defect image coordinate value into the stage coordinate transformation in the same manner as when producing the work-piece image-stage coordinate value. The defect stage coordinate value is the actual coordinate value of the defect 4 on the inspection object 2.
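In outline, operations S122 to S128 apply the same transformation twice: once to the work-piece image coordinate for the defective/non-defective decision, and once to the defect image coordinate to obtain the defect stage coordinate value. The following sketch assumes a simple per-axis tolerance and reuses the image_to_stage and processing_parameters sketches above:

```python
def classify_and_locate(work_piece_img, defect_img, target_stage, tol, params):
    """Sketch of operations S122 to S128. work_piece_img and defect_img are image
    coordinates (Px); target_stage is the known work-piece stage coordinate (mm);
    tol is an assumed per-axis allowable tolerance (mm); params = (OX, OY, ReX,
    ReY, theta) from the processing-parameter sketch above."""
    OX, OY, ReX, ReY, theta = params
    WX, WY = image_to_stage(*work_piece_img, OX, OY, ReX, ReY, theta)
    TX, TY = target_stage
    if abs(WX - TX) <= tol and abs(WY - TY) <= tol:
        return "non-defective", None                    # S124
    # S126-S128: out of tolerance, so locate the defect on the stage.
    defect_stage = image_to_stage(*defect_img, OX, OY, ReX, ReY, theta)
    return "defective", defect_stage
```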
Referring to
A TFT-LCD panel has a sealed liquid crystal inlet. To inspect the location and breakage of the seal, a stage coordinate value of the seal, that is, a target value of the seal, is first acquired and input to the database of the computer 60. Then, an image coordinate value and an image-stage coordinate value are obtained from the scanned image of each linescan camera 40-1, 40-2, 40-3, . . . , 40-n. If the seal is broken, the image-stage coordinate value falls outside the allowable tolerance range with respect to the stage coordinate value, and accordingly, the TFT-LCD panel is determined to be defective. The computer 60 determines a region where the seal is broken as a defect. In addition, if the length of the seal calculated from the scanned image of the seal exceeds the allowable tolerance range, the seal is determined to be a defect.
The computer 60 displays the result of inspecting the inspection object 2 through an output device such as the monitor 62, and stores the result in the database 64 (S130). The computer 60 calculates a size of the defect 4, and determines the inspection object 2 that has the defect 4 as being defective. Finally, once the inspection of the inspection object 2 is completed, the table 32 is returned from the second position P2 to the first position P1 (S132). Thus, the defect 4 of the inspection object 2 can be precisely inspected, thereby significantly increasing the reliability and reproducibility.
A number of examples have been described above. Nevertheless, it will be understood that various modifications may be made. For example, suitable results may be achieved if the described techniques are performed in a different order and/or if components in a described system, architecture, device, or circuit are combined in a different manner and/or replaced or supplemented by other components or their equivalents. Accordingly, other implementations are within the scope of the following claims.
As described above, according to the vision inspection system and the inspection method of inspecting an inspection object using the vision inspection system, a plurality of markings are provided as references on a table which is moved while loading the inspection object thereon, processing parameters of linescan cameras are calculated with reference to the markings, and positioning and aligning of the linescan cameras are conveniently performed by verifying the processing parameters. Moreover, a defect of the inspection object is precisely and accurately inspected, so that the reliability and reproducibility can be substantially improved.
Foreign Application Priority Data: Application No. 10-2008-0014403, filed Feb. 2008, Korea (KR), national.
This application claims the benefit under 35 U.S.C. §119(a) of International Patent Application No. PCT/KR2009/000602, filed on Feb. 10, 2009, the entire disclosure of which is incorporated herein by reference for all purposes.
PCT Filing Data: PCT/KR09/00602, filed Feb. 10, 2009 (WO), 371(c) date Aug. 17, 2010.