The present application claims priority of Japanese Application Number 2017-245290, filed on Dec. 21, 2017, the disclosure of which is hereby incorporated by reference herein in its entirety.
1. Technical Field
The invention relates to an object inspection system and an object inspection method.
2. Background Art
Object inspection systems that capture images of the surface of an object such as a workpiece and inspect the surface of the object are known (e.g., Japanese Unexamined Patent Publication (Kokai) No. 2017-015396).
Such object inspection systems acquire a master image of a reference object, acquire an inspection image of the object to be inspected, and inspect, based on a comparison of the two images, whether there is any visually recognizable error in the object to be inspected relative to the reference object.
In such systems, a displacement may occur between the master image and the inspection image. A technique for quickly and easily registering the master image and the inspection image in the case of such a displacement has been desired.
In an aspect of the present disclosure, the object inspection system includes an imaging section configured to image a first object and a second object which have a common outer shape; a movement machine configured to move the first object or the second object and the imaging section relative to each other; a positional data acquisition section configured to acquire first positional data of the movement machine when the movement machine disposes the first object and the imaging section at a first relative position, and acquire second positional data of the movement machine when the movement machine disposes the second object and the imaging section at a second relative position; an image data acquisition section configured to acquire a first image of the first object imaged by the imaging section at the first relative position, and acquire a second image of the second object imaged by the imaging section at the second relative position; and an image registering section configured to register the first image and the second image with each other in an image coordinate system of the imaging section, using the first positional data, the second positional data, and a known positional relation between the image coordinate system and a movement machine coordinate system of the movement machine, wherein the object inspection system is configured to inspect whether or not there is a visually recognizable error in the second object with respect to the first object, based on the first image and the second image that are registered with each other.
In another aspect of the present disclosure, an object inspection method includes acquiring first positional data of a movement machine when the movement machine disposes a first object and an imaging section at a first relative position; acquiring a first image of the first object imaged by the imaging section at the first relative position; acquiring second positional data of the movement machine when the movement machine disposes a second object and the imaging section at a second relative position, the first object and the second object having a common outer shape; acquiring a second image of the second object imaged by the imaging section at the second relative position; registering the first image and the second image with each other in an image coordinate system of the imaging section, using the first positional data, the second positional data, and a known positional relation between the image coordinate system and a movement machine coordinate system of the movement machine; and inspecting whether or not there is a visually recognizable error in the second object with respect to the first object, based on the first image and the second image that are registered with each other.
According to an aspect of the present disclosure, when there is a displacement between the master image and the inspection image, the two images can be registered using positional data of the movement machine. This facilitates the registering task and shortens the time required for it.
In the following, embodiments of the present disclosure will be described in detail with reference to the drawings. In the various embodiments described below, similar elements are assigned the same reference numerals, and redundant descriptions thereof will be omitted. First, an object inspection system 10 according to an embodiment will be described.
The object inspection system 10 includes a controller 12, a movement machine 14, an imaging section 16, and a lighting device 18. The controller 12 includes e.g. a CPU and a storage (not illustrated), and controls the movement machine 14, the imaging section 16, and the lighting device 18.
In this embodiment, the movement machine 14 is a vertical articulated robot, and includes a robot base 20, a swiveling body 22, a robot arm 24, a wrist 26, and a robot hand (gripper) 28. The robot base 20 is fixed on a floor of a work cell. The swiveling body 22 is mounted on the robot base 20 so as to be rotatable about a vertical axis.
The robot arm 24 includes an upper arm 30 rotatably coupled to the swiveling body 22 and a forearm 32 rotatably coupled to a distal end of the upper arm 30. The wrist 26 is attached to a distal end of the forearm 32 and rotatably supports the robot hand 28.
The robot hand 28 includes a hand base 34 coupled to the wrist 26, and a plurality of fingers 36 provided on the hand base 34 so as to open and close.
The plurality of fingers 36 extend from the hand base 34 in a direction, and include stepped portions 36a on surfaces of the fingers 36 that face each other.
The movement machine 14 includes a plurality of servomotors 38. The servomotors 38 drive the movable elements of the movement machine 14, i.e., the swiveling body 22, the robot arm 24, and the wrist 26.
A robot coordinate system (movement machine coordinate system) CR is set for the movement machine 14. The controller 12 operates each movable element of the movement machine 14 with reference to the robot coordinate system CR.
A tool coordinate system CT is set for the robot hand 28. The tool coordinate system CT is one of the coordinate systems for automatic control, and the position and orientation of the robot hand 28 in space are determined by expressing the position and direction of the tool coordinate system CT in the robot coordinate system CR.
The controller 12 operates the swiveling body 22, the robot arm 24, and the wrist 26 in the robot coordinate system CR such that the position and orientation of the robot hand 28 coincide with those determined by the tool coordinate system CT. In this way, the robot hand 28 is disposed at any position and orientation in the robot coordinate system CR.
The imaging section 16 includes an optical system such as a focus lens, and an image sensor such as a CCD sensor or a CMOS sensor. In this embodiment, the imaging section 16 is fixed at a predetermined position in the robot coordinate system CR so as to be separated from the movement machine 14. The imaging section 16 images an object W in accordance with a command from the controller 12, and sends the captured image to the controller 12.
The fixation position of the imaging section 16 and the optical axis O of the imaging section 16 (i.e., the optical path of the object image entering the optical system of the imaging section 16) are expressed by coordinates in the robot coordinate system CR by calibrating them with respect to the robot coordinate system CR, and pre-stored in the storage of the controller 12. Due to this, the controller 12 can recognize the positions of the imaging section 16 and the optical axis O in the robot coordinate system CR.
The lighting device 18 includes e.g. an incandescent lamp, fluorescent lamp, or an LED, and is fixed to a predetermined position. The lighting device 18 switches on and off in accordance with a command from the controller 12, and illuminates the object W gripped by the movement machine 14.
Next, the operation of inspecting an object with the object inspection system 10 will be described.
The first object W1 and the second object W2 have a common outer shape. In this embodiment, the first object W1 and the second object W2 are rectangular plates having the same outer shape. Each of the first object W1 and the second object W2 includes a total of four holes H.
First, the controller 12 operates the movement machine 14 so as to grip the first object W1 stored in a predetermined storage place by the robot hand 28. At this time, the robot hand 28 grips the first object W1 at a gripping position (target position) designated in advance. This gripping position is determined by a position and direction of the tool coordinate system CT set by the controller 12 when the robot hand 28 grips the first object W1.
As an example, the operator designates the position of the origin of the tool coordinate system CT on the first object W1 by operating on an operation section (not illustrated) such as a keyboard or a touch panel provided in the object inspection system 10.
Assume that the operator designates the position of the origin of the tool coordinate system CT at the center of the upper surface SU of the first object W1.
The controller 12 then operates the movement machine 14 so as to dispose the robot hand 28 at the position and orientation defined by the set tool coordinate system CT, and grip the first object W1 by the robot hand 28. As a result, the robot hand 28 grips the first object W1 at the designated gripping position.
When the robot hand 28 grips the first object W1 at the designated gripping position, the position of the first object W1 relative to the robot hand 28 (i.e., the tool coordinate system CT) becomes known, and any position on the first object W1 can be expressed by the coordinates in the robot coordinate system CR, using this gripping position and the drawing data of the first object W1.
In other words, the first object W1 gripped by the robot hand 28 at a known gripping position can be regarded as a component of the movement machine 14, which is controlled in the robot coordinate system CR.
Then, the controller 12 operates the movement machine 14 so as to move the first object W1 such that at least a part of the surface SI of the first object W1 is within a field of view A of the imaging section 16, and dispose the first object W1 and the imaging section 16 at a reference relative position (“reference relative position” corresponding to “first relative position” in the claims).
The field of view A of the imaging section 16 will now be described.
An example of the view angle of the imaging section 16 is indicated as an imaginary line B in the drawings.
In other words, the field of view A indicates the area on the surface SI that can be imaged in focus by the imaging section 16 when the imaging section 16 and the surface SI are separated from each other by a distance D. The resolution of an image captured by the imaging section 16 is inversely proportional to the size of the field of view A: the smaller the field of view A, the higher the resolution of the captured image.
After gripping the first object W1 by the robot hand 28 at the gripping position, the controller 12 operates the movement machine 14 so as to dispose the robot hand 28 at a first position and orientation.
More specifically, the controller 12 sets the tool coordinate system CT at a first position and direction (i.e., the position of the origin and the directions of the axes), and operates the movement machine 14 so as to move the robot hand 28 to the position and orientation defined by the tool coordinate system CT.
As a result, the robot hand 28 is disposed at the first position and orientation, and the first object W1 gripped by the robot hand 28 is positioned at the first reference relative position with respect to the imaging section 16.
At this time, the field of view A of the imaging section 16 is disposed at an area A1 with respect to the surface SI of the first object W1.
Then, the controller 12 sends a command to the lighting device 18 so as to switch on the lighting device 18. The first object W1 gripped by the movement machine 14 is thereby illuminated by the lighting device 18.
Then, the controller 12 sends an imaging command to the imaging section 16. Upon receiving the imaging command from the controller 12, the imaging section 16 images the surface SI of the first object W1.
The imaging section 16 thereby captures the first master image 40 (first image), which corresponds to the area A1 on the surface SI of the first object W1.
Each pixel of the first master image 40 captured by the imaging section 16 is expressed by coordinates in the image coordinate system CI.
The positional relation between the robot coordinate system CR and the image coordinate system CI (i.e., the position of the origin and the axis-directions of the image coordinate system CI with respect to the robot coordinate system CR) is known by the above-described calibration.
More specifically, the operator acquires the coordinate transformation data between the robot coordinate system CR and the image coordinate system CI by calibrating the fixation position and the position of the optical axis O of the imaging section 16 with respect to the robot coordinate system CR. The coordinate transformation data is represented by e.g. a Jacobian matrix for transforming coordinates in the robot coordinate system CR into coordinates in the image coordinate system CI.
The controller 12 acquires the coordinate transformation data, and stores it in the storage. Thus, in this embodiment, the controller 12 functions as a coordinate transformation data acquisition section 42 configured to acquire the coordinate transformation data indicative of the positional relation between the robot coordinate system CR and the image coordinate system CI.
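As a rough illustration of how such coordinate transformation data might be applied, the following Python sketch maps a planar point from the robot coordinate system CR into the image coordinate system CI. The matrix T_ri and its values are hypothetical stand-ins for the calibration result, not taken from the source.

```python
import numpy as np

# Hypothetical calibration result: 3x3 homogeneous matrix mapping planar
# robot coordinates (mm) to image coordinates (px).
T_ri = np.array([[2.0, 0.0, 320.0],   # px per mm, plus principal-point offset
                 [0.0, -2.0, 240.0],  # image y typically grows downward
                 [0.0, 0.0, 1.0]])

def robot_to_image(p_robot_xy):
    """Transform a planar point from the robot coordinate system CR to the image coordinate system CI."""
    p = np.array([p_robot_xy[0], p_robot_xy[1], 1.0])
    q = T_ri @ p
    return q[:2] / q[2]

print(robot_to_image((10.0, -5.0)))  # e.g. [340. 250.]
```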
The controller 12 acquires the positional data of the movement machine 14 when the first object W1 and the imaging section 16 are disposed at the first reference relative position and the imaging section 16 images the first object W1 (first positional data, referred to as “reference positional data” hereinafter).
For example, the controller 12 acquires, as a first reference positional data, information of the position and direction of the tool coordinate system CT in the robot coordinate system CR when the first object W1 and the imaging section 16 are disposed at the first reference relative position.
Alternatively, the controller 12 records the rotation angles of the servomotors 38 of the movement machine 14 when the first object W1 and the imaging section 16 are disposed at the first reference relative position, and calculates the position and orientation of the robot hand 28 in the robot coordinate system CR from the recorded rotation angles. The controller 12 may acquire the calculated position and orientation of the robot hand 28 as the first reference positional data.
The first reference positional data are information corresponding to the position and orientation of the first object W1 in the robot coordinate system CR when the first object W1 and the imaging section 16 are disposed at the first reference relative position. Thus, in this embodiment, the controller 12 functions as a positional data acquisition section 44 configured to acquire the positional data of the movement machine 14.
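A minimal sketch of how such reference positional data might be recorded per capture, assuming the tool pose is available as a position vector and a rotation matrix; the names PositionalData and record_reference are illustrative, not from the source.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class PositionalData:
    position: np.ndarray  # origin of the tool coordinate system CT in CR (x, y, z)
    rotation: np.ndarray  # 3x3 matrix: axis directions of CT expressed in CR

reference_data = {}  # n -> positional data at the n-th reference relative position

def record_reference(n, position, rotation):
    """Store the tool pose read from the controller when the n-th master image is captured."""
    reference_data[n] = PositionalData(np.asarray(position, dtype=float),
                                       np.asarray(rotation, dtype=float))
```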
Then, the controller 12 operates the movement machine 14 so as to dispose the robot hand 28 gripping the first object W1 at a second position and orientation. When the robot hand 28 is disposed at the second position and orientation, the first object W1 gripped by the robot hand 28 is disposed at a second reference relative position with respect to the imaging section 16. At this time, the field of view A of the imaging section 16 is disposed at an area A2 with respect to the surface SI of the first object W1.
When the first object W1 and the imaging section 16 are positioned at the second reference relative position (i.e., the robot hand 28 is disposed at the second position and orientation), the controller 12 sends an imaging command to the imaging section 16 so as to image the surface SI of the first object W1. As a result, a second master image corresponding to the area A2 is captured.
The area An (n=1 to 12) indicates the region on the surface SI that falls within the field of view A when the first object W1 and the imaging section 16 are disposed at the n-th reference relative position.
The areas A1 to A12 are set so as to cover the entire surface SI of the first object W1.
When the robot hand 28 is disposed at the n-th position and orientation, the controller 12 sets the tool coordinate system CT at the n-th position and direction. The controller 12 then operates the movement machine 14 and moves the robot hand 28 so as to coincide with the position and orientation defined by the tool coordinate system CT disposed at the n-th position and direction.
In this way, the controller 12 sequentially disposes the robot hand 28 at the third position and orientation, the fourth position and orientation, . . . the n-th position and orientation, thereby sequentially positioning the first object W1 gripped by the robot hand 28 at the third reference relative position, the fourth reference relative position, . . . the n-th reference relative position with respect to the imaging section 16.
The controller 12 causes the imaging section 16 to image the surface SI of the first object W1 each time the first object W1 and the imaging section 16 are positioned at the respective reference relative positions. As a result, the controller 12 acquires the first to the twelfth master images that correspond to the areas A1 to A12 in
Further, the controller 12 functions as the positional data acquisition section 44 and acquires, as the n-th reference positional data, the position of the movement machine 14 when the first object W1 and the imaging section 16 are disposed at the n-th reference relative position and the n-th master image is captured.
The controller 12 carries out a series of positioning operations as described above, in accordance with a robot program. The robot program can be constructed e.g. by the operator teaching the actual movement machine 14 the motions of disposing the robot hand 28 at the n-th position and orientation, using a teach pendant (not illustrated).
The robot program includes information of the n-th position and direction of the tool coordinate system CT and information of the rotation angles of the servomotors 38 of the movement machine 14 when disposing the robot hand 28 at the n-th position and orientation.
Then, the controller 12 acquires an inspection image (second image) of a surface SI to be inspected of a second object W2. More specifically, the controller 12 operates the movement machine 14 so as to grip the second object W2 stored in a predetermined storage place by the robot hand 28.
At this time, the robot hand 28 grips the second object W2 at the same gripping position as that designated for the first object W1. Specifically, when the robot hand 28 is to grip the second object W2, the controller 12 sets the tool coordinate system CT for the second object W2 at the same position and direction as those set for the first object W1.
The controller 12 then operates the movement machine 14 so as to dispose the robot hand 28 at the position and orientation defined by the set tool coordinate system CT, and grip the second object W2 by the robot hand 28. In this way, the robot hand 28 grips the second object W2 at the same gripping position as that for the first object W1.
Then, the controller 12 operates the movement machine 14 so as to dispose the second object W2 gripped by the robot hand 28 at a first inspection relative position (“inspection relative position” corresponding to “second relative position” in the claims) with respect to the imaging section 16. The first inspection relative position is a position corresponding to the first reference relative position. Then, the controller 12 sends an imaging command to the imaging section 16 so as to image the surface SI of the second object W2, and acquires an inspection image (second image).
In this respect, a displacement may occur between the first reference relative position and the first inspection relative position. Such a displacement can occur when the controller 12 causes the imaging section 16 to image the second object W2 while moving the second object W2 by the movement machine 14. Alternatively, such a displacement can occur when the position and orientation of the movement machine 14 are slightly adjusted upon imaging the second object W2.
In the first inspection image 48, the image of the second object W2 is displaced from the image of the first object W1 in the first master image 40 by a positional difference δ in the x-axis positive direction of the image coordinate system CI.
Such a positional difference δ occurs when the second object W2 disposed at the first inspection relative position is displaced from the first object W1 disposed at the first reference relative position in the x-axis positive direction of the image coordinate system CI.
In the first inspection image 50, the image of the second object W2 is rotated from the image of the first object W1 in the first master image 40 by a positional difference θ.
Such a positional difference θ occurs when the second object W2 disposed at the first inspection relative position is displaced from the first object W1 disposed at the first reference relative position in the rotational direction about the optical axis O of the imaging section.
Further, in the first inspection image 52, the image of the second object W2 is reduced with respect to the image of the first object W1 in the first master image 40 by a positional difference α.
Such a positional difference α occurs when the second object W2 disposed at the first inspection relative position is displaced from the first object W1 disposed at the first reference relative position in the direction along the optical axis O of the imaging section (more specifically, displaced farther away from the imaging section 16).
When this positional difference α occurs, the distance D1 between the surface SI of the first object W1 disposed at the first reference relative position and the imaging section 16 is different from the distance D2 between the surface SI of the second object W2 disposed at the first inspection relative position and the imaging section 16 (more specifically, D1<D2). The positional difference α represents the reduction rate of the first inspection image 52 to the first master image 40.
In the first inspection image 53, the image of the second object W2 is deformed with respect to the image of the first object W1 in the first master image 40.
This first inspection image 53 is described below.
As described above, the first object W1 is positioned at the first reference relative position with respect to the imaging section 16 when the first master image 40 is captured. At this time, the surface SI of the first object W1 is orthogonal to the optical axis O of the imaging section 16.
On the other hand, when the first inspection image 53 is captured, the surface SI of the second object W2 disposed at the first inspection relative position is inclined with respect to the optical axis O of the imaging section 16.
Due to this inclination, the image of the second object W2 in the first inspection image 53 is displaced from the image of the first object W1 in the first master image 40 as indicated by the arrow E. The positional difference between the image of the second object W2 in the first inspection image 53 and the image of the first object W1 in the first master image 40 due to such an inclination can be expressed by a matrix M (e.g., a homography matrix) described later.
If the above-mentioned displacement occurs, it is necessary to register the master image and the inspection image with each other when comparing the master image and the inspection image in order to inspect the surface SI of the second object W2. The object inspection system 10 according to this embodiment calculates the positional difference δ, θ, α, M, and registers the master image and the inspection image on the basis of the positional difference δ, θ, α, M.
Herein, “to register” (or “registering”) means to cause the image of the first object W1 in the master image and the image of the second object W2 in the inspection image to coincide with each other (i.e., position alignment) in the image coordinate system CI.
More specifically, the controller 12 functions as the positional data acquisition section 44, and acquires the positional data of the movement machine 14 (second positional data, referred to as “inspection positional data” hereinafter) when the movement machine 14 disposes the second object W2 and the imaging section 16 at the first inspection relative position and the first inspection image 48, 50, 52 is captured by the imaging section 16.
For example, the controller 12 acquires, as the first inspection positional data, information of the position and direction of the tool coordinate system CT in the robot coordinate system CR when the second object W2 and the imaging section 16 are disposed at the first inspection relative position.
Alternatively, the controller 12 records the rotation angles of the servomotors 38 of the movement machine 14 when the second object W2 and the imaging section 16 are disposed at the first inspection relative position, and calculates the position and orientation of the robot hand 28 in the robot coordinate system CR from the rotation angles. The controller 12 may acquire the calculated position and orientation of the robot hand 28 as the first inspection positional data.
The first inspection positional data are the data corresponding to the position and orientation of the second object W2 in the robot coordinate system CR when the second object W2 and the imaging section 16 are disposed at the first inspection relative position.
Similarly, the controller 12 operates the movement machine 14 so as to position the second object W2 gripped by the robot hand 28 at the n-th inspection relative position (n=2 to 12) with respect to the imaging section 16. The n-th inspection relative position corresponds to the n-th reference relative position.
The controller 12 causes the imaging section 16 to image the surface SI of the second object W2 each time the second object W2 and the imaging section 16 are positioned at the n-th inspection relative position, and acquires the n-th inspection image. Then, the controller 12 functions as the positional data acquisition section 44 to acquire, as the n-th inspection positional data, the positional data of the movement machine 14 when the second object W2 and the imaging section 16 are positioned at the n-th inspection relative position.
Subsequently, the controller 12 calculates the first positional difference (e.g., δ, θ, or α) on the basis of the positional relation between the robot coordinate system CR and the image coordinate system CI, the first reference positional data, and the first inspection positional data.
As an example, regarding the above-described positional difference δ or θ, the controller 12 acquires the coordinates in the robot coordinate system CR of the position and direction of the tool coordinate system CT included in the first reference positional data, and the coordinates in the robot coordinate system CR of the position and direction of the tool coordinate system CT included in the first inspection positional data, and calculates the difference ΔR1 between these two sets of coordinates.
The controller 12 then transforms the difference ΔR1, which is expressed as the coordinates in the robot coordinate system CR, into coordinates in the image coordinate system CI, using the coordinate transformation data obtained by calibration. Thus, the positional difference δ or θ in the image coordinate system CI is calculated.
Further, regarding the above-described positional difference α, the controller 12 calculates the above-described distances D1 and D2 from the coordinates in the robot coordinate system CR of the position of the tool coordinate system CT included in the first reference positional data, and the coordinates in the robot coordinate system CR of the position of the tool coordinate system CT included in the first inspection positional data.
The controller 12 then calculates the positional difference α representing the reduction rate (or the enlargement rate) of the first inspection image 52 to the first master image 40, based on the calculated ratio between D1 and D2. The ratio between D1 and D2 correlates with the reduction rate α (or the enlargement rate α) of the first inspection image 52 to the first master image 40.
Using a similar method, the controller 12 calculates the n-th positional difference between the image of the first object W1 in the n-th master image and the image of the second object W2 in the n-th inspection image (n=2 to 12). Thus, in this embodiment, the controller 12 functions as an image positional difference acquisition section 54 configured to acquire the positional difference between the master image and the inspection image in the image coordinate system CI.
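As a sketch under stated assumptions, the δ, θ, and α computations above might look as follows; the planar Jacobian J and the helper names are hypothetical, and the pinhole-style scale model for α is an assumption rather than the patent's exact formula.

```python
import numpy as np

# Hypothetical planar Jacobian: maps a small displacement in the robot
# coordinate system CR (mm) to a displacement in the image coordinate
# system CI (px); it stands in for the calibration's coordinate transformation data.
J = np.array([[2.0, 0.0],
              [0.0, -2.0]])

def delta_px(ref_xy, insp_xy):
    """Positional difference delta in CI from the tool-origin difference in CR."""
    return J @ (np.asarray(insp_xy, dtype=float) - np.asarray(ref_xy, dtype=float))

def theta_deg(ref_rot_deg, insp_rot_deg):
    """Positional difference theta: rotation of the tool about the optical axis O."""
    return insp_rot_deg - ref_rot_deg

def alpha_scale(d1, d2):
    """Positional difference alpha: reduction rate of the inspection image
    relative to the master image (alpha < 1 when D1 < D2)."""
    return d1 / d2
```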
Subsequently, the controller 12 shifts the first master image 40 or the first inspection image 48, 50, 52 on the basis of the calculated first positional difference (δ, θ, α) so as to register these two images.
For example, if the controller 12 calculates the first positional difference δ, the controller 12 moves the first inspection image 48 (or the first master image 40) in the x-axis negative direction (or the x-axis positive direction) of the image coordinate system CI on the x-y plane of the image coordinate system CI by the first positional difference δ.
If the controller 12 calculates the first positional difference θ, the controller 12 rotates the first inspection image 50 (or the first master image 40) counterclockwise (or clockwise) in the image coordinate system CI by the first positional difference θ.
If the controller 12 calculates the first positional difference α, the controller 12 enlarges (or reduces) the first inspection image 52 (or the first master image 40) on the basis of the first positional difference α.
As a result, the image of the first object W1 in the first master image 40 and the image of the second object W2 in the first inspection image 48, 50, or 52 can be made to coincide with each other in the image coordinate system CI. Thus, in this embodiment, the controller 12 functions as an image registering section 56 configured to register the master image and the inspection image with each other.
Similarly, the controller 12 then shifts the n-th master image or the n-th inspection image on the basis of the n-th positional difference, and sequentially registers these two images.
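A minimal registration sketch using OpenCV, assuming the positional differences δ (px), θ (deg), and α computed above; the sign conventions depend on how the displacement was measured, so this is illustrative rather than the patent's exact procedure.

```python
import cv2
import numpy as np

def register_inspection(inspection_img, delta=(0.0, 0.0), theta_deg=0.0, alpha=1.0):
    """Shift, rotate, and scale the inspection image so it overlays the master image."""
    h, w = inspection_img.shape[:2]
    # Undo the rotation theta and the reduction alpha about the image center
    # (sign of the angle depends on the measurement convention)...
    M = cv2.getRotationMatrix2D((w / 2.0, h / 2.0), -theta_deg, 1.0 / alpha)
    # ...then undo the translation delta.
    M[:, 2] -= np.asarray(delta, dtype=float)
    return cv2.warpAffine(inspection_img, M, (w, h))
```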
Then, the controller 12 inspects whether or not there is any visually recognizable error in the second object W2 with respect to the first object W1, using the n-th master image (40) and the n-th inspection image (48, 50, 52) registered with each other. More specifically, the controller 12 generates the n-th differential image between the n-th master image and the n-th inspection image registered with each other.
The n-th differential image represents e.g. the difference between the luminance of each pixel of the n-th master image and the luminance of each pixel of the n-th inspection image. By analyzing the n-th differential image, the controller 12 can inspect whether or not there is an error (a scar, difference in surface roughness, etc.) on the surface SI of the second object W2 with respect to the surface SI of the first object W1.
Thus, in this embodiment, the controller 12 functions as an object inspection section 58 configured to inspect whether or not there is a visually recognizable error in the second object W2 with respect to the first object W1.
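The differential-image inspection could be sketched as below, with the luminance threshold and the minimum defect size as hypothetical tuning parameters; the source only specifies that the differential image is analyzed.

```python
import cv2
import numpy as np

def inspect(master_gray, inspection_gray, luminance_thresh=30, min_defect_px=20):
    """Return True if a visually recognizable error is detected in the registered pair."""
    diff = cv2.absdiff(master_gray, inspection_gray)   # n-th differential image
    _, mask = cv2.threshold(diff, luminance_thresh, 255, cv2.THRESH_BINARY)
    return int(np.count_nonzero(mask)) >= min_defect_px  # enough differing pixels -> error
```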
Next, an example of the operation flow of the object inspection system 10 will be described.
In step S1, the controller 12 acquires a master image (first image) and reference positional data (first positional data). This step S1 will be described below. In step S11, the controller 12 sets the number “n” identifying the reference relative position to “1”.
In step S12, the controller 12 acquires the n-th master image. Specifically, the controller 12 operates the movement machine 14 so as to dispose the first object W1 gripped by the robot hand 28 at the n-th reference relative position with respect to the imaging section 16, by the above-mentioned method.
The controller 12 then causes the imaging section 16 to image the surface SI of the first object W1 and acquires the n-th master image. If the number “n” is set as n=1 at the start of this step S12, the controller 12 acquires the first master image 40.
In step S13, the controller 12 acquires the n-th reference positional data. More specifically, the controller 12 acquires, as the n-th reference positional data, the positional data of the movement machine 14 when the first object W1 and the imaging section 16 are disposed at the n-th reference relative position, by the above-mentioned method.
In step S14, the controller 12 increments the number “n” by “1” (i.e., n=n+1).
In step S15, the controller 12 determines whether or not the number “n” is greater than “12”. This number “12” corresponds to the number of the areas A1 to A12. When the controller 12 determines that the number “n” is greater than “12” (i.e., determines YES), the controller 12 proceeds to step S2.
On the other hand, when the controller 12 determines that the number “n” is not greater than “12” (i.e., determines NO), the controller 12 returns to step S12. Thus, the controller 12 carries out the loop of steps S12 to S15 until it determines YES in step S15.
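Steps S11 to S15 amount to a simple acquisition loop. The following schematic sketch uses hypothetical stubs (move_to_reference, capture, read_positional_data) in place of the movement machine 14, the imaging section 16, and the controller's pose readout.

```python
N_POSITIONS = 12  # the areas A1 to A12

def move_to_reference(n): ...        # dispose W1 and the imaging section at the n-th reference relative position (stub)
def capture(): ...                   # trigger the imaging section 16 and return the image (stub)
def read_positional_data(): ...      # read the tool pose of the movement machine 14 (stub)

master_images, reference_data = {}, {}
n = 1                                # step S11
while n <= N_POSITIONS:              # step S15 decides whether to loop again
    move_to_reference(n)
    master_images[n] = capture()                 # step S12: n-th master image
    reference_data[n] = read_positional_data()   # step S13: n-th reference positional data
    n += 1                           # step S14
```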
In step S2, the controller 12 acquires an inspection image (second image) and inspection positional data (second positional data). In step S21, the controller 12 sets the number “n” to “1”.
In step S22, the controller 12 acquires the n-th inspection image. More specifically, the controller 12 operates the movement machine 14 so as to move the second object W2 gripped by the robot hand 28 with respect to the imaging section 16, and disposes the second object W2 and the imaging section 16 at the n-th inspection relative position, by the above-mentioned method.
The controller 12 then causes the imaging section 16 to image the surface SI of the second object W2 and acquires the n-th inspection image. At this time, the controller 12 may cause the imaging section 16 to image the second object W2 while moving the second object W2 with the movement machine 14. In such a case, the relative position of the second object W2 relative to the imaging section 16 at a time when the second object W2 is imaged by the imaging section 16 is the n-th inspection relative position.
If the number “n” is set as n=1 at the start of step S22, the controller 12 acquires the first inspection image 48, 50, or 52.
In step S23, the controller 12 acquires the n-th inspection positional data. More specifically, the controller 12 acquires, as the n-th inspection positional data, the positional data of the movement machine 14 when the second object W2 and the imaging section 16 are disposed at the n-th inspection relative position, by the above-mentioned method.
In step S24, the controller 12 increments the number “n” by “1” (i.e., n=n+1).
In step S25, the controller 12 determines whether or not the number “n” is greater than “12”. When the controller 12 determines that the number “n” is greater than “12” (i.e., determines YES), the controller 12 proceeds to step S3.
When the controller 12 determines that the number “n” is not greater than “12” (i.e., determines NO), the controller 12 returns to step S22. Thus, the controller 12 carries out the loop of steps S22 to S25 until it determines YES in step S25.
In step S3, the controller 12 acquires the positional difference between the master image and the inspection image. In step S31, the controller 12 sets the number “n” to “1”.
In step S32, the controller 12 determines whether or not there is a difference between the n-th reference relative position and the n-th inspection relative position. More specifically, the controller 12 reads out the n-th reference positional data (e.g., the position coordinates of the tool coordinate system CT in the robot coordinate system CR) and the n-th inspection positional data, and determines whether or not the difference Δ2 between them is equal to or more than a predetermined threshold value β.
When Δ2≥β is satisfied, the controller 12 determines YES, and proceeds to step S33. When Δ2<β, the controller 12 determines NO, and proceeds to step S34.
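Step S32 reduces to a threshold predicate; a sketch assuming the positional data are compared as planar tool-origin coordinates in CR and that β is a distance threshold (both assumptions):

```python
import numpy as np

def needs_registration(ref_xy, insp_xy, beta=0.1):
    """True when the n-th reference and inspection relative positions differ by beta or more."""
    return np.linalg.norm(np.asarray(insp_xy, dtype=float)
                          - np.asarray(ref_xy, dtype=float)) >= beta
```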
In step S33, the controller 12 acquires the n-th positional difference. When the number “n” is set as n=1 at the start of step S33, the controller 12 acquires the first positional difference δ, θ, or α between the first master image 40 and the first inspection image 48, 50, or 52 in the image coordinate system CI, by the above-mentioned method.
In step S34, the controller 12 increments the number “n” by “1” (i.e., n=n+1).
In step S35, the controller 12 determines whether or not the number “n” is greater than “12”. When the controller 12 determines that the number “n” is greater than “12” (i.e., determines YES), the controller 12 proceeds to step S4.
When the controller 12 determines that the number “n” is not greater than “12” (i.e., determines NO), the controller 12 returns to step S32. Thus, the controller 12 carries out the loop of steps S32 to S35 until it determines YES in step S35.
In step S4, the controller 12 registers the master image and the inspection image with each other. In step S41, the controller 12 sets the number “n” to “1”.
In step S42, the controller 12 determines whether or not the controller 12 has acquired the n-th positional difference in the above-described step S33. When the number “n” is set as n=1 at the start of step S42, the controller 12 determines whether or not the controller 12 has acquired the first positional difference (e.g., δ, θ, or α).
When the controller 12 determines that the controller 12 has acquired the n-th positional difference (i.e., determines YES), the controller 12 proceeds to step S43. When the controller 12 determines that the controller 12 has not acquired the n-th positional difference (i.e., determines NO), the controller 12 proceeds to step S44.
In step S43, the controller 12 registers the n-th master image and the n-th inspection image. When the number “n” is set as n=1 at the start of step S43, the controller 12 shifts the first master image 40 or the first inspection image 48, 50, or 52 so as to register these two images, on the basis of the first positional difference (δ, θ, α), by the above-mentioned method.
As a result, the image of the first object W1 in the first master image 40 and the image of the second object W2 in the first inspection image 48, 50, or 52 are made to coincide with each other in the image coordinate system CI.
On the other hand, when the controller 12 determines NO in step S42, in step S44, the controller 12 superimposes the n-th master image on the n-th inspection image with reference to the image coordinate system CI. In this case, there is no n-th positional difference between the n-th master image and the n-th inspection image, i.e., the n-th reference relative position is the same as the n-th inspection relative position.
Therefore, by simply superimposing the n-th master image on the n-th inspection image with reference to the image coordinate system CI, the image of the first object W1 in the n-th master image and the image of the second object W2 in the n-th inspection image coincide with each other in the image coordinate system CI.
In step S45, the controller 12 increments the number “n” by “1” (i.e., n=n+1).
In step S46, the controller 12 determines whether or not the number “n” is greater than “12”. When the controller 12 determines that the number “n” is greater than “12” (i.e., determines YES), the controller 12 proceeds to step S5.
When the controller 12 determines that the number “n” is not greater than “12” (i.e., determines NO), the controller 12 returns to step S42. Thus, the controller 12 carries out the loop of steps S42 to S46 until it determines YES in step S46.
In step S5, the controller 12 inspects the second object W2. Specifically, the controller 12 generates the n-th differential image between the n-th master image and the n-th inspection image registered with each other in step S4.
The controller 12 analyzes the n-th differential image and inspects whether or not there is any error on the surface SI of the second object W2 with respect to the surface SI of the first object W1. By executing these operations from n=1 to n=12, it is possible to inspect the entire area of the surface SI of the second object W2.
As described above, in this embodiment, the controller 12 registers the n-th master image and the n-th inspection image with each other in the image coordinate system CI, using the known positional relation (coordinate transformation data) between the robot coordinate system CR and the image coordinate system CI, the n-th reference positional data, and the n-th inspection positional data.
According to this configuration, if there is a displacement between the master image and the inspection image, it is possible to register these two images with each other using the positional data of the movement machine 14. This facilitates and expedites the work necessary for registering.
Further, in this embodiment, the controller 12 acquires the coordinate transformation data indicating the positional relation between the robot coordinate system CR and the image coordinate system CI. According to this configuration, the coordinates in the robot coordinate system CR can be transformed into those in the image coordinate system CI, and it is possible to accurately register the master image and the inspection image with each other, using the positional data of the movement machine 14.
Further, in this embodiment, the controller 12 acquires the positional difference (e.g., δ, θ, or α) (step S3), and registers the master image and the inspection image on the basis of the positional difference. According to this configuration, it is possible to more accurately register the master image and the inspection image with each other.
Next, another function of the object inspection system 10 will be described.
In this embodiment, the controller 12 executes the above-described step S1 with steps S51 and S52, described below, in place of step S13.
In step S51, the controller 12 receives designation of an n-th group of reference points to be set on the n-th master image. Assume that the number “n” is set as n=1 at the present moment. In this case, the controller 12 displays the first master image 40 on a display (not illustrated).
The operator operates an operation section, such as a keyboard or a touch panel while viewing the first master image 40 displayed on the display, and designates at least one reference point.
The reference points 60, 62, and 64 and the reference points 80, 82, 84, and 86 designated for the first master image 40 will be referred to as a first group of reference points 60, 62, and 64 and a first group of reference points 80, 82, 84, and 86, respectively.
The controller 12 receives the designation of the first group of reference points 60, 62, and 64 (or 80, 82, 84, and 86). Thus, in this embodiment, the controller 12 functions as a reference point reception section 70 configured to receive the designation of the reference points.
In step S52, the controller 12 acquires the n-th reference positional data. More specifically, the controller 12 acquires, as the n-th reference positional data, the coordinates of the n-th group of reference points in the robot coordinate system CR, as well as the coordinates of the n-th group of reference points in the tool coordinate system CT.
Assume that the number “n” is set as n=1 at the present moment. In this case, the controller 12 first acquires the coordinates in the image coordinate system CI of the first group of reference points 60, 62, and 64 (or 80, 82, 84, and 86) designated in step S51.
Then, the controller 12 transforms the coordinates of the first group of reference points 60, 62, and 64 (or 80, 82, 84, and 86) in the image coordinate system CI to those in the robot coordinate system CR, using the coordinate transformation data (i.e., transformation matrix) between the robot coordinate system CR and the image coordinate system CI, thereby acquiring the coordinates of the first group of reference points 60, 62, and 64 (or 80, 82, 84, and 86) in the robot coordinate system CR.
Then, the controller 12 acquires the coordinates of the first group of reference points 60, 62, and 64 (or 80, 82, 84, and 86) in the tool coordinate system CT. At this time, the tool coordinate system CT is disposed at the first position and direction, which correspond to the first reference relative position, in the robot coordinate system CR (step S12).
Since the positional relation between the robot coordinate system CR and the tool coordinate system CT is thus known, the controller 12 can acquire the coordinates of the first group of reference points 60, 62, and 64 (or 80, 82, 84, and 86) in the tool coordinate system CT by multiplying the coordinates of the first group of reference points 60, 62, and 64 (or 80, 82, 84, and 86) in the robot coordinate system CR by a transformation matrix. This transformation matrix is e.g. a Jacobian matrix and determined by the position and direction of the tool coordinate system CT in the robot coordinate system CR.
The coordinates of the first group of reference points 60, 62, and 64 (or 80, 82, 84, and 86) in the tool coordinate system CT are data indicative of the positions of the reference points 60, 62, and 64 (or 80, 82, 84, and 86) on the surface SI of the first object W1.
In this way, the controller 12 acquires, as the positional data of the first group of reference points 60, 62, and 64 (or 80, 82, 84, and 86), the coordinates of the reference points 60, 62, and 64 (or 80, 82, 84, and 86) in the robot coordinate system CR, as well as the coordinates of the reference points 60, 62, and 64 (or 80, 82, 84, and 86) in the tool coordinate system CT.
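The coordinate chain for a designated reference point (image coordinates to robot coordinates to tool coordinates) might be sketched as follows, assuming planar coordinates; T_ir and T_rt are hypothetical homogeneous transforms standing in for the calibration data and the inverse of the tool coordinate system pose.

```python
import numpy as np

def to_h(p):
    """Append a homogeneous coordinate."""
    return np.append(np.asarray(p, dtype=float), 1.0)

def image_to_tool(p_image, T_ir, T_rt):
    """Reference point in CI (px) -> coordinates in the tool coordinate system CT.

    T_ir: CI -> CR (from the calibration's coordinate transformation data)
    T_rt: CR -> CT (inverse of the pose of CT in CR at the n-th reference relative position)
    """
    p_robot = T_ir @ to_h(p_image)   # coordinates in the robot coordinate system CR
    p_tool = T_rt @ p_robot          # coordinates in the tool coordinate system CT
    return p_tool[:-1] / p_tool[-1]
```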
As described above, when the robot hand 28 grips the first object W1 at the designated gripping position, the first object W1 can be regarded as a component of the movement machine 14. Therefore, the positional data of the first group of reference points 60, 62, and 64 (or 80, 82, 84, and 86) on the first object W1 can be regarded as the first reference positional data of the movement machine 14 when the movement machine 14 disposes the first object W1 and the imaging section 16 at the first reference relative position.
Thus, the controller 12 functions as a positional data acquisition section 66 configured to acquire the first reference positional data.
Next, the controller 12 carries out the above-described step S2.
Step S2 according to this embodiment differs from the above-described embodiment in step S23. Specifically, in step S23, the controller 12 acquires, as the n-th inspection positional data, the positional data of points on the second object W2 (second reference points, referred to as “inspection reference points” hereinafter), that correspond to the n-th group of reference points, when the movement machine 14 disposes the second object W2 and the imaging section 16 at the n-th inspection relative position.
If the number “n” is set as n=1 at the present moment, the controller 12 acquires the positional data of a first group of inspection reference points on the second object W2 corresponding to the first group of reference points 60, 62, and 64 (or 80, 82, 84, and 86).
The first group of inspection reference points are defined such that the positions of the first group of reference points 60, 62, and 64 (or 80, 82, 84, and 86) with respect to the first object W1 (i.e., coordinates in the tool coordinate system CT) are the same as the positions of the first group of inspection reference points with respect to the second object W2.
The controller 12 acquires, as the first inspection positional data, the coordinates of the first group of inspection reference points in the robot coordinate system CR. The coordinates of the first group of inspection reference points in the robot coordinate system CR can be calculated from the coordinates of the first group of reference points 60, 62, and 64 (or 80, 82, 84, and 86) in the tool coordinate system CT acquired in step S52, and from the position and direction of the tool coordinate system CT in the robot coordinate system CR when the second object W2 and the imaging section 16 are disposed at the first inspection relative position.
Thus, in this step S23, the controller 12 functions as the positional data acquisition section 66 to acquire the n-th inspection positional data.
Next, the controller 12 carries out the above-described step S3.
Specifically, in step S33, the controller 12 acquires, as the n-th positional difference, the difference in the image coordinate system CI between the positional data of the n-th group of reference points acquired in step S52 and the positional data of the n-th group of inspection reference points acquired in step S23.
Assume that the number “n” is set as n=1 at the present moment. The controller 12 transforms the coordinates of the first group of inspection reference points in the robot coordinate system CR acquired in step S23 to the coordinates in the image coordinate system CI, using the coordinate transformation data (transformation matrix) between the robot coordinate system CR and the image coordinate system CI.
In the first inspection image 48, the inspection reference points 60a, 62a, and 64a correspond to the reference points 60, 62, and 64, respectively.
In the first inspection image 48, the image of the second object W2 is displaced from the image of the first object W1 in the first master image 40 by a positional difference δ in the x-axis positive direction of the image coordinate system CI. Such a displacement coincides with the positional difference in the image coordinate system CI between the reference point 60 and the inspection reference point 60a, between the reference point 62 and the inspection reference point 62a, and between the reference point 64 and the inspection reference point 64a.
The controller 12 acquires, as the first positional difference δ, the difference between the coordinates of the reference point 60, 62, or 64 and the coordinates of the inspection reference point 60a, 62a, or 64a in the image coordinate system CI.
In the first inspection image 50, the inspection reference points 60b, 62b, and 64b correspond to the reference points 60, 62, and 64, respectively.
In the first inspection image 50, the image of the second object W2 in the first inspection image 50 is rotated from the image of the first object W1 in the first master image 40 about the optical axis O of the imaging section by a positional difference θ.
As an example, on the basis of the coordinates in the image coordinate system CI of the reference points 60, 62, and 64 and of the inspection reference points 60b, 62b, and 64b, the controller 12 calculates an angle formed by a straight line passing through two of the reference points 60, 62, and 64 (e.g., reference points 60 and 62) and a straight line passing through two of the inspection reference points 60b, 62b, and 64b (e.g., inspection reference points 60b and 62b) that correspond to the two of the reference points 60, 62, and 64. The controller 12 acquires the angle as the positional difference θ.
In the first inspection image 52, the inspection reference points 60c, 62c, and 64c correspond to the reference points 60, 62, and 64, respectively.
In the first inspection image 52, the image of the second object W2 in the first inspection image 52 is reduced with respect to the image of the first object W1 in the first master image 40 by a positional difference α.
As an example, on the basis of the coordinates in the image coordinate system CI of the reference points 60, 62, and 64 and of the inspection reference points 60c, 62c, and 64c, the controller 12 calculates the ratio of the area of the figure (a triangle in this embodiment) defined by the reference points 60, 62, and 64 to the area of the figure defined by the inspection reference points 60c, 62c, and 64c in the image coordinate system CI. The controller 12 calculates the positional difference α based on the calculated ratio.
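The angle and area-ratio computations in the two preceding paragraphs might look like the following sketch (image-coordinate inputs; the square-root step assumes lengths scale with the square root of the area ratio):

```python
import math

def line_angle_deg(p, q):
    """Angle of the line through p and q in the image coordinate system CI."""
    return math.degrees(math.atan2(q[1] - p[1], q[0] - p[0]))

def theta_from_pairs(ref0, ref1, insp0, insp1):
    """Positional difference theta: angle between corresponding lines,
    e.g. through reference points 60 and 62 and inspection reference points 60b and 62b."""
    return line_angle_deg(insp0, insp1) - line_angle_deg(ref0, ref1)

def triangle_area(a, b, c):
    """Area of the triangle defined by three points (shoelace formula)."""
    return 0.5 * abs((b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0]))

def alpha_from_triangles(refs, insps):
    """Positional difference alpha: linear reduction rate from the area ratio
    of the triangle 60c-62c-64c to the triangle 60-62-64."""
    return math.sqrt(triangle_area(*insps) / triangle_area(*refs))
```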
As an example, the controller 12 calculates a matrix M that satisfies the following equations 1 to 4, using the coordinates CR_80, CR_82, CR_84, and CR_86 of the first group of reference points 80, 82, 84, and 86 in the robot coordinate system CR, which have been acquired in the above-described step S52, and the coordinates CR_80a, CR_82a, CR_84a, and CR_86a of the first group of inspection reference points 80a, 82a, 84a, and 86a in the robot coordinate system CR, which have been acquired in the above-described step S23.
CR_80=M·CR_80a (equation 1)
CR_82=M·CR_82a (equation 2)
CR_84=M·CR_84a (equation 3)
CR_86=M·CR_86a (equation 4)
Herein, the coordinates CR_80 represent the coordinates of the reference point 80 in the robot coordinate system CR, and the coordinates CR_80a represent the coordinates of the inspection reference point 80a in the robot coordinate system CR. The same applies to the other coordinates CR_82, CR_84, CR_86, CR_82a, CR_84a, and CR_86a.
Each parameter of the matrix M can be calculated from the equations 1 to 4. The matrix M thus calculated is e.g. a homography matrix and expresses the first positional difference between the image of the second object W2 in the first inspection image 53 and the image of the first object W1 in the first master image 40.
Thus, the controller 12 acquires the n-th positional difference (δ, θ, α, M) between the n-th master image and the n-th inspection image in the image coordinate system CI. Therefore, in this embodiment, the controller 12 functions as an image positional difference acquisition section 68 configured to acquire the n-th positional difference.
Next, the controller 12 carries out the above-described step S4. When the positional difference δ, θ, or α has been acquired as the n-th positional difference in step S33, the controller 12 registers the n-th master image and the n-th inspection image in step S43 in the same manner as in the above-described embodiment.
On the other hand, when the matrix (including the matrix M) has been acquired as the n-th positional difference in step S33, in step S43, the controller 12 transforms the n-th inspection image (53) with the matrix (M).
Then, the controller 12 superimposes the transformed n-th inspection image and the n-th master image (40) on each other with reference to the image coordinate system CI. The image of the second object W2 in the transformed n-th inspection image can thereby be made to coincide with the image of the first object W1 in the n-th master image (40).
When the n-th inspection image (53) transformed with the matrix (M) is superimposed on the n-th master image and there is still a displacement between the image of the second object W2 in the transformed n-th inspection image and the image of the first object W1 in the n-th master image (40), the controller 12 may further shift the transformed n-th inspection image or the n-th master image (40) on the x-y plane of the image coordinate system CI so as to register these two images.
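Estimating M from the four point pairs in equations 1 to 4 and then warping the inspection image, as in step S43 of this embodiment, could look like the sketch below. It assumes planar (x, y) coordinates; OpenCV's getPerspectiveTransform solves exactly the four-correspondence homography problem, and all point values are hypothetical.

```python
import cv2
import numpy as np

# Hypothetical planar coordinates of the inspection reference points 80a..86a
# and the reference points 80..86 (equations 1 to 4: M maps 80a -> 80, etc.).
inspection_pts = np.float32([[120, 80], [420, 90], [430, 300], [110, 310]])
reference_pts = np.float32([[100, 100], [400, 100], [400, 300], [100, 300]])

M = cv2.getPerspectiveTransform(inspection_pts, reference_pts)

def register_with_homography(inspection_img, M):
    """Warp the inspection image with M so the image of W2 overlays the image of W1."""
    h, w = inspection_img.shape[:2]
    return cv2.warpPerspective(inspection_img, M, (w, h))
```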
After step S4, in step S5, the controller 12 inspects whether or not there is any visually recognizable error in the second object W2 with respect to the first object W1.
Thus, in this embodiment, the controller 12 acquires the positional data of the reference points (60, 62, 64, 80, 82, 84, 86) on the first object W1 and the positional data of the inspection reference points (second reference points) on the second object W2 (steps S52 and S23), and using these positional data, acquires the positional difference between the master image (40) and the inspection image (48, 50, 52). According to this configuration, it is possible to calculate the positional difference between the master image and the inspection image with high precision.
Further, in this embodiment, the controller 12 receives the designation of the reference point (60, 62, 64, 80, 82, 84, 86) from the operator (step S51). According to this configuration, the operator can arbitrarily set the reference point to be used for registering the master image and the inspection image.
Note that the controller 12 may set a feature point of the first object W1 as a reference point. The feature point may be e.g. the center point of the hole H, an edge of the first object W1, or a pattern or shape formed on the surface SI of the first object W1. In this case, the controller 12 detects the feature point of the first object W1 in step S51, e.g. by carrying out image processing.
Next, still another embodiment will be described.
In this embodiment, the controller 12 acquires a positional displacement between the position at which the robot hand 28 grips the second object W2 and the designated gripping position. Such a positional displacement may be obtained e.g. by carrying out image processing on the image captured by the imaging section 16 (e.g., the n-th inspection image), or may be detected by providing another imaging section for detecting the positional displacement.
The controller 12 functions as a positional displacement acquisition section 72 configured to acquire this positional displacement. In the above-described step S43, the controller 12 may register the master image and the inspection image further using the acquired positional displacement.
Alternatively, the controller 12 may acquire a positional displacement in the image coordinate system, which can be obtained by transforming the positional displacement in the robot coordinate system CR into that in the image coordinate system CI, and in the above-described step S43, register the master image and the inspection image further using the positional displacement in the image coordinate system.
According to this configuration, even when the position at which the robot hand 28 grips the second object W2 is displaced from the designated gripping position, it is possible to register the master image and the inspection image.
Next, still another embodiment will be described. In this embodiment, the controller 12 acquires the n-th reference positional data by simulation.
Specifically, the controller 12 disposes an imaging section model, which is a model of the imaging section 16, a machine model, which is a model of the movement machine 14, and an object model, which is a model of the first object W1, in a model environment which is a virtual space.
The controller 12 then virtually operates the imaging section model and the machine model in the model environment so as to dispose the imaging section model and the machine model at the n-th reference relative position. The controller 12 acquires, as the n-th reference positional data, the positional data of the machine model when the imaging section model and the machine model are disposed at the n-th reference relative position in the model environment.
Thus, in this embodiment, the controller 12 functions as a positional data acquisition section 74 to acquire the n-th reference positional data, wherein the positional data acquisition section 74 includes a simulation section 76 configured to acquire the n-th reference positional data by simulation. According to this configuration, since the n-th reference positional data can be acquired by simulation without teaching the actual movement machine 14, it is possible to reduce the work for teaching.
Next, an object inspection system 100 according to another embodiment will be described. The object inspection system 100 differs from the above-described object inspection system 10 in the arrangement of the imaging section 16 and the objects W1 and W2.
Specifically, in the object inspection system 100, the imaging section 16 is fixed to the wrist 26 of the movement machine 14. On the other hand, the objects W1 and W2 are to be fixed to a workpiece holder 102, and disposed at a predetermined position in the robot coordinate system CR so as to be separated from the movement machine 14. The storage of the controller 12 pre-stores the information of the fixed positions of the objects W1 and W2 in the robot coordinate system CR.
In this embodiment, a tool coordinate system CT is set for the imaging section 16. This tool coordinate system CT is one of the coordinate systems for automatic control, and the position and orientation of the imaging section 16 in space are defined by expressing the position of the tool coordinate system CT in the robot coordinate system CR.
In this embodiment, the tool coordinate system CT is set such that the z-axis of the tool coordinate system CT coincides with the optical axis O of the imaging section 16. Thus, the image coordinate system CI is disposed in a predetermined positional relation with the tool coordinate system CT.
The controller 12 has information of the position and direction of the tool coordinate system CT in the robot coordinate system CR. Therefore, the positional relation between the robot coordinate system CR and the image coordinate system CI is known through the tool coordinate system CT.
The controller 12 operates the swiveling body 22, the robot arm 24, and the wrist 26 in the robot coordinate system CR such that the position and orientation of the imaging section 16 coincide with those defined by the tool coordinate system CT. Thus, the imaging section 16 is disposed at any position and orientation in the robot coordinate system CR.
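When the imaging section rides on the wrist, the CR-to-CI relation is composed through the tool coordinate system CT. A hedged sketch, with T_rt as the pose-dependent CR-to-CT transform and P_tc as an assumed 3x4 camera projection from CT to CI (both hypothetical names):

```python
import numpy as np

def robot_to_image(p_robot, T_rt, P_tc):
    """Point in CR (x, y, z) -> pixel coordinates in CI, via the tool coordinate system CT."""
    p_h = np.append(np.asarray(p_robot, dtype=float), 1.0)  # homogeneous CR point
    q = P_tc @ (T_rt @ p_h)   # CR -> CT (4x4 transform), then CT -> CI (3x4 projection)
    return q[:2] / q[2]
```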
The controller 12 of the object inspection system 100 can register the master image and the inspection image and inspect whether or not there is any error in the inspection image with respect to the master image, by carrying out the above-described flows.
The processes in the operational flow of the object inspection system 100 which differ from those of the object inspection system 10 will be described below.
If the number “n” is set as n=1 at the start of step S12, the controller 12 disposes the imaging section 16 and the first object W1 at the first reference relative position.
Each time the controller 12 disposes the imaging section 16 and the first object W1 at the n-th reference relative position (i.e., n=1 to 12), the controller 12 functions as the coordinate transformation data acquisition section 42 to acquire information on the positional relation between the robot coordinate system CR and the image coordinate system CI (coordinate transformation data).
In step S13, the controller 12 functions as the positional data acquisition section 44 to acquire the n-th reference positional data.
For example, the controller 12 acquires, as the n-th reference positional data, information of the position and direction of the tool coordinate system CT in the robot coordinate system CR when the imaging section 16 and the first object W1 are disposed at the n-th reference relative position.
When the above-described step S1 is completed, the controller 12 carries out step S2 to acquire the inspection images and the inspection positional data.
If the number “n” is set as n=1 at the start of step S22, the controller 12 disposes the imaging section 16 and the second object W2 at the first inspection relative position, and images the surface of the second object W2 by the imaging section 16 so as to acquire the first inspection image 48, 50, or 52.
Each time the controller 12 disposes the imaging section 16 and the second object W2 at the n-th inspection relative position, the controller 12 functions as the coordinate transformation data acquisition section 42 to acquire information on the positional relation between the robot coordinate system CR and the image coordinate system CI (coordinate transformation data).
In step S23, the controller 12 acquires the n-th inspection positional data. More specifically, the controller 12 acquires, as the n-th inspection positional data, the positional data of the movement machine 14 when the imaging section 16 and the second object W2 are disposed at the n-th inspection relative position.
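Steps S12/S13 and S22/S23 share the same pattern: dispose the imaging section at the n-th relative position, capture an image, and record the positional data of the movement machine as a pair. A schematic loop is sketched below, with stand-in callables in place of the controller 12 and the imaging section 16; all names are hypothetical, not interfaces from this document.

```python
from dataclasses import dataclass
from typing import Any, Callable, List

@dataclass
class Record:
    n: int                # index of the relative position (n = 1 to 12 here)
    image: Any            # n-th image from the imaging section
    positional_data: Any  # positional data of the movement machine (pose of CT in CR)

def acquire(move_to: Callable, capture: Callable, get_pose: Callable,
            relative_positions: List) -> List[Record]:
    # Schematic S12/S13 (and, analogously, S22/S23) loop: dispose, image, record.
    records = []
    for n, target in enumerate(relative_positions, start=1):
        move_to(target)  # dispose the imaging section at the n-th relative position
        records.append(Record(n, capture(), get_pose()))
    return records

# Dummy stand-ins so the sketch runs; a real system would wire these to the
# controller and the imaging section.
pose = {"xyz": (0, 0, 400)}
recs = acquire(lambda t: None, lambda: "img", lambda: pose, [1, 2, 3])
print([r.n for r in recs])  # -> [1, 2, 3]
```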
In step S33, the controller 12 calculates the positional difference between the n-th reference positional data and the n-th inspection positional data, as follows.
For example, regarding the first positional difference δ or θ, the controller 12 calculates the coordinates in the robot coordinate system CR of the position and direction of the tool coordinate system CT included in the first reference positional data, and the coordinates in the robot coordinate system CR of the position and direction of the tool coordinate system CT included in the first inspection positional data, and calculates the difference ΔR1 between these two sets of coordinates.
The controller 12 then transforms the difference ΔR1 expressed as the coordinates in the robot coordinate system CR into that in the image coordinate system CI at the time of execution of step S22 (or S12), using the coordinate transformation data acquired in step S22 (or S12). The positional difference δ or θ in the first inspection image 48, 50 (or the first master image 40) in the image coordinate system CI is thereby calculated.
Further, regarding the above-described positional difference α, the controller 12 calculates the distance D1 from the coordinates in the robot coordinate system CR of the position of the tool coordinate system CT included in the first reference positional data, and the distance D2 from the coordinates in the robot coordinate system CR of the position of the tool coordinate system CT included in the first inspection positional data. The controller 12 then calculates the positional difference α, which represents the reduction rate (or enlargement rate) of the first inspection image 52 with respect to the first master image 40, from the ratio of the calculated distances D1 and D2.
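The computation of the positional differences δ, θ, and α can be illustrated from two poses of CT in CR. The sketch below assumes a planar translation for δ, a rotation about the optical axis for θ, and α taken as the ratio D1/D2 of camera-to-surface distances; the sign and axis conventions, and the alignment of the CI axes with the x-y axes of CR for δ, are assumptions of the sketch.

```python
import numpy as np

def positional_differences(T_ref, T_insp, surface_point_cr, px_per_mm):
    # delta: in-plane translation difference, mapped to pixels assuming the
    #        CI axes are aligned with the x-y axes of CR (an assumption).
    d_xy = (T_insp[:3, 3] - T_ref[:3, 3])[:2]
    delta = px_per_mm * d_xy

    # theta: relative rotation about the optical axis (z-axis of CT).
    R_rel = T_ref[:3, :3].T @ T_insp[:3, :3]
    theta = np.arctan2(R_rel[1, 0], R_rel[0, 0])

    # alpha: reduction (alpha < 1) or enlargement (alpha > 1) taken from the
    #        ratio of camera-to-surface distances D1 (reference), D2 (inspection).
    D1 = np.linalg.norm(T_ref[:3, 3] - surface_point_cr)
    D2 = np.linalg.norm(T_insp[:3, 3] - surface_point_cr)
    alpha = D1 / D2
    return delta, theta, alpha

T1 = np.eye(4); T1[2, 3] = 400.0               # reference pose of CT in CR
T2 = np.eye(4); T2[:3, 3] = [2.0, 0.0, 420.0]  # inspection pose of CT in CR
print(positional_differences(T1, T2, np.zeros(3), px_per_mm=8.0))
```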
When acquiring the matrix M as a positional difference, the controller 12 calculates each parameter of the matrix M, e.g., using the coordinates CR_80, CR_82, CR_84, and CR_86 in the robot coordinate system CR of the first group of reference points 80, 82, 84, and 86, which are included in the first reference positional data, and the coordinates CR_80a, CR_82a, CR_84a, and CR_86a in the robot coordinate system CR of the first group of inspection reference points 80a, 82a, 84a, and 86a, which are included in the first inspection positional data, similarly to the above-described embodiment.
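For illustration, the parameters of such a matrix M can be estimated by least squares from the point correspondences. The sketch below fits a 2-D affine matrix, assuming the reference points and inspection reference points have already been expressed in (or projected into) the image coordinate system CI; the affine form of M is an assumption of the sketch.

```python
import numpy as np

def estimate_matrix_M(ref_pts, insp_pts):
    # Least-squares fit of a 2-D affine matrix M mapping the group of
    # reference points (80, 82, 84, 86) onto the corresponding inspection
    # reference points (80a, 82a, 84a, 86a).
    # ref_pts, insp_pts: (N, 2) arrays with N >= 3 corresponding points.
    ref = np.asarray(ref_pts, dtype=float)
    A = np.hstack([ref, np.ones((len(ref), 1))])  # rows [x, y, 1]
    X, *_ = np.linalg.lstsq(A, np.asarray(insp_pts, dtype=float), rcond=None)
    M = np.eye(3)
    M[:2, :] = X.T                                # [[a, b, tx], [c, d, ty]]
    return M

ref = [(0, 0), (100, 0), (100, 100), (0, 100)]
insp = [(5, 3), (104, 4), (103, 104), (4, 103)]  # shifted, slightly rotated
print(estimate_matrix_M(ref, insp).round(3))
```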
In step S43, the controller 12 registers the first master image 40 and the first inspection image 48, 50, 52 with each other, as follows.
When the positional difference δ, θ, α in the first inspection image 48, 50, 52 is calculated in step S33, in this step S43, the controller 12 functions as the image registering section 56 to shift the first inspection image 48, 50, 52 so as to register the first inspection image 48, 50, 52 and the first master image 40 with each other.
When the positional difference δ, θ, α in the first master image 40 is calculated in step S33, in this step S43, the controller 12 shifts the first master image 40 to register the first master image 40 and the first inspection image 48, 50, 52. As a result, the image of the first object W1 in the first master image 40 and the image of the second object W2 in the first inspection image 48, 50, 52 can be made to coincide with each other in the image coordinate system CI.
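Step S43 can be illustrated as a single affine warp that cancels the measured differences. In the sketch below, δ is assumed to be given in pixels as (x, y), θ in radians about the image centre, and α as a scale factor; the centre of rotation, the (row, column) sign conventions, and the interpolation order are simplifying assumptions, not the exact procedure of the embodiment.

```python
import numpy as np
from scipy import ndimage

def register_inspection_image(image, delta, theta, alpha):
    # Warp the inspection image so that it lines up with the master image in
    # CI, cancelling translation delta, rotation theta, and scale alpha.
    c, s = np.cos(theta), np.sin(theta)
    A = alpha * np.array([[c, -s],
                          [s,  c]])               # master -> inspection coords
    centre = (np.array(image.shape) - 1) / 2.0
    t = np.array([delta[1], delta[0]])            # (x, y) -> (row, col)
    # ndimage.affine_transform computes output[o] = input[A @ o + offset],
    # so sampling the inspection image this way aligns it with the master.
    offset = centre - A @ centre + t
    return ndimage.affine_transform(image, A, offset=offset, order=1)

img = np.zeros((64, 64)); img[20:30, 20:30] = 1.0
aligned = register_inspection_image(img, delta=(2.0, 0.0), theta=0.05, alpha=1.02)
print(aligned.shape)  # same shape as the input, now registered to the master
```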
Next, in step S5, the controller 12 inspects whether or not there is a visually recognizable error in the second object W2 with respect to the first object W1, based on the first master image 40 and the first inspection image 48, 50, 52 registered with each other.
As described above, according to this embodiment, if there is a displacement between the master image and the inspection image, these two images can be registered with each other using the positional data of the movement machine 14. Due to this, it is possible to reduce the work for registering and shorten the time therefor.
Next, an object inspection system according to still another embodiment will be described. This object inspection system includes a positional data acquisition section 112, an image data acquisition section 114, and an image registering section 116.
The positional data acquisition section 112, the image data acquisition section 114, and the image registering section 116 may be constituted by individual computers or by a single computer.
The positional data acquisition section 112 acquires the reference positional data of the movement machine 14 when the movement machine 14 disposes the first object W1 and the imaging section 16 at the reference relative position, and acquires the inspection positional data of the movement machine 14 when the movement machine 14 disposes the second object W2 and the imaging section 16 at the inspection relative position.
The image data acquisition section 114 acquires the master image imaged by the imaging section 16 at the reference relative position, and acquires the inspection image imaged by the imaging section 16 at the inspection relative position.
The image registering section 116 registers the master image and the inspection image in the image coordinate system CI, using the reference positional data, the inspection positional data, and the known positional relation between the robot coordinate system CR and the image coordinate system CI.
For example, the image registering section 116 pre-stores a data table representing the relation between the coordinates in the robot coordinate system CR and the coordinates in the image coordinate system CI, and applies the reference positional data and the inspection positional data in the robot coordinate system CR to the data table, thereby acquiring the reference positional data and the inspection positional data in the image coordinate system CI.
Then, the image registering section 116 registers the master image and the inspection image on the basis of the reference positional data and the inspection positional data in the image coordinate system CI. In this case, the image registering section 116 can register the master image and the inspection image without acquiring the above-described positional difference.
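Such a data table can be illustrated as a sampled mapping from CR coordinates to CI coordinates, queried here by nearest-neighbour lookup (interpolation over the samples would serve equally well). All names and the sampling scheme are hypothetical.

```python
import numpy as np

def ci_from_cr(table_cr, table_ci, p_cr):
    # table_cr: (N, 3) sampled positions in the robot coordinate system CR.
    # table_ci: (N, 2) corresponding pixel positions in the image coordinate
    #           system CI.
    # p_cr:     (3,) query position taken from the positional data.
    i = np.argmin(np.linalg.norm(np.asarray(table_cr, dtype=float) - p_cr, axis=1))
    return np.asarray(table_ci)[i]

cr = [(0, 0, 400), (10, 0, 400), (0, 10, 400)]
ci = [(640, 480), (720, 480), (640, 560)]
print(ci_from_cr(cr, ci, np.array([9.0, 1.0, 400.0])))  # -> [720 480]
```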
Note that the lighting device 18 may be omitted from the object inspection system 10 or 100, in which case the surface SI of the object W1 or W2 may be illuminated by, e.g., natural light.
In the embodiments described above, the movement machine 14 is a vertical articulated robot. However, the movement machine 14 may be a parallel link robot or any other machine such as a loader.
Further, in the embodiments described above, the first object W1 and the second object W2 have the same outer shape. However, the first object W1 and the second object W2 need only have outer shapes that are at least partially common to each other. In this case, the imaging section 16 acquires a master image and an inspection image of the common outer shape.
The present disclosure has been described by means of embodiments but the embodiments described above do not limit the scope of the invention according to the appended claims.