The present invention relates to a method and a system for measuring the position and attitude of an object, such as a work, in a three-dimensional space.
There has been proposed a method for measuring the position of an object by combining a distance measurement sensor with a camera. In one proposed method, the detection results of the positions of three known points on an object obtained by a camera are subjected to parallax correction based on the distance measurement results of the three points obtained by a distance measurement sensor on a reference surface that is perpendicular to the optical axis of the camera, thereby measuring the three-dimensional positions of the three points, that is, the three-dimensional position of the object (refer to Japanese Patent Application Laid-Open No. H03-92712). There has also been proposed a method for estimating the position and the attitude of a plane on the basis of the measurement results of the X-Y positions of three marks on the plane obtained by a camera and the measurement result of the Z position thereof obtained by a distance measurement sensor (refer to Japanese Patent Application Laid-Open No. H09-304013).
However, using a distance measurement sensor increases the cost of a measurement system. To avoid such an increase in cost, methods have been proposed in which an imaging device is used in place of a distance measurement sensor to measure the three-dimensional position of an object. In one such method, a plane on which four or more marks, the geometric relationship of which is known, are present is imaged by a camera, and the three-dimensional positions of the marks are estimated on the basis of the positions of the marks on the plane and their positions in a two-dimensional image coordinate system (refer to Japanese Patent Application Laid-Open No. S62-175603). In another method, three characteristic points of an object are extracted from a two-dimensional image taken by a camera, and the three-dimensional position of the object is measured on the basis of the result of the extraction and the distances among the three characteristic points, which are measured in advance (refer to Japanese Patent No. 2534517). In yet another method, a plurality of marks two-dimensionally arranged in a grid-like manner are imaged by a camera, and the three-dimensional positions of the marks are estimated according to a predetermined relational expression on the basis of the two-dimensional positions of the marks obtained by the imaging (refer to Japanese Patent Application Laid-Open No. H06-259536). In still another method, the three-dimensional positions of four marks of an object, the distances among which are measured in advance, are estimated on the basis of those distances in addition to the two-dimensional positions of the marks in an image of the object taken by a camera (refer to Japanese Patent No. 4602704).
However, there are cases where, when a component is imaged using an imaging device to check the assembled state of components, a mark provided on the component is at least partly covered by another component or a group of components, thus making it difficult to perform the checking operation.
Therefore, it is an object of the present invention to provide a technology that makes it possible to estimate the position and the attitude of an object in a three-dimensional coordinate system with ease and high accuracy even in a situation where a part of the object is inconveniently covered by another object.
A position and attitude estimation method in accordance with the present invention includes: first size measurement processing for acquiring a captured image of an object through an imaging device, the position and the attitude of which are fixed in a real space defined by an XYZ coordinate system and the optical axis of which is parallel to a Z-axis, and for measuring a size of each of a plurality of index areas in the captured image, the plurality of index areas being defined in the object; vertical real space position estimation processing for estimating a Z coordinate value of each of a plurality of index points included in the plurality of index areas on the basis of a measurement result in the first size measurement processing; first real space attitude estimation processing for estimating, as a first real space attitude, an attitude in the real space of an index straight line that passes through paired index points among the plurality of index points, on the basis of at least one of an X coordinate value and a Y coordinate value of each of the plurality of index points and an estimation result of the Z coordinate value of each of the plurality of index points; and real space attitude estimation processing for estimating a real space attitude of the object on the basis of an estimation result of the first real space attitude, wherein, in the first size measurement processing, an irradiation device, the position and attitude of which are fixed in the real space, is used to irradiate a light beam to a first index point among the plurality of index points, thereby forming a first index area as at least one index area among the plurality of index areas.
According to the position and attitude estimation method in accordance with the present invention, the Z coordinate value of each index point is estimated on the basis of the size of each index area of an object acquired through an imaging device having an optical axis parallel to the Z-axis in the real space. This relies on the qualitative relationship whereby the measured size of an index area increases as the corresponding index point comes closer to the imaging device (or the image pickup element thereof), and decreases as the index point moves farther away. Based on the Z coordinate value of each index point, the first real space attitude of an index straight line passing through paired index points (the first real space attitude being dependent upon a directional vector) is estimated. The first index area, which includes the first index point, is formed and defined by irradiating a light beam to the first index point among the plurality of index points by using the irradiation device.
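As an illustrative sketch only (not part of the specification), the qualitative size-depth relationship above can be made quantitative under an idealized pinhole camera model; the function name, the focal length in pixels, and the real diameter of the index area below are all assumptions introduced for this example.

```python
def estimate_z(phi_px, diameter_mm, focal_px):
    """Estimate the depth (Z coordinate value) of an index point from the
    apparent size of its index area, assuming an idealized pinhole camera.
    All names and values here are illustrative, not from the specification."""
    # Under a pinhole model the apparent size shrinks inversely with depth:
    #   phi_px = focal_px * diameter_mm / Z  ->  Z = focal_px * diameter_mm / phi_px
    return focal_px * diameter_mm / phi_px

# The closer index point produces the larger index area in the image.
z_near = estimate_z(phi_px=100.0, diameter_mm=10.0, focal_px=2000.0)  # 200.0 mm
z_far = estimate_z(phi_px=50.0, diameter_mm=10.0, focal_px=2000.0)    # 400.0 mm
```

In practice the mapping from size to depth would be calibrated rather than derived from nominal camera parameters, which is the role of the first preparation processing described later.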
In the imaging range of the imaging device, even in a situation where a part of an object is covered by another object, the first index area can be formed by irradiating the light beam to the first index point of the object while avoiding the other object or passing through a gap. A plurality of index areas including the plurality of index points on the same index straight line can thus be easily included in a captured image. Hence, even in such a situation, the real space position defined by at least the Z coordinate value of the object and the real space attitude of the object defined by at least the first real space attitude of the index straight line can be estimated with ease and high accuracy.
Preferably, the position and attitude estimation method in accordance with the present invention further includes second size measurement processing for measuring the size of at least one index area among the plurality of index areas in an extending direction of the index straight line in the captured image; and second real space attitude estimation processing for estimating, as a second real space attitude, the real space attitude of at least one index area on the basis of a measurement result in the second size measurement processing, wherein the estimation result of the first real space attitude and the estimation result of the second real space attitude are consolidated by the real space attitude estimation processing, thereby estimating the real space attitude of the object.
According to the position and attitude estimation method, the second real space attitude of at least one index area in the captured image (dependent upon the perpendicular vector of the index area) is estimated on the basis of the size of the index area in the extending direction of the index straight line. The estimation result of the first real space attitude and the estimation result of the second real space attitude are consolidated, thereby improving the accuracy of estimating the real space attitude of an object.
In the position and attitude estimation method according to the present invention, preferably, in the real space attitude estimation processing, the estimation results of the first real space attitudes of a plurality of index straight lines, each of which passes through a different pair of index points and which are identical or parallel to each other, are consolidated to estimate the real space attitude of the object.
The position and attitude estimation method improves the accuracy of the estimation of the real space attitude of each of a plurality of index straight lines passing through index points constituting different pairs, thus leading to higher accuracy of the estimation of the real space attitude of an object.
In the position and attitude estimation method according to the present invention, the plurality of index points are preferably set such that two specified directions that are orthogonal to each other in the real space are defined.
According to the position and attitude estimation method, the real space attitude of an object that is defined by the first real space attitude (or the consolidated result of a plurality of estimation results of the first real space attitude or the consolidated result of the first real space attitude and the second real space attitude) of each of two index straight lines orthogonal to each other in a real space is estimated.
Preferably, the position and attitude estimation method according to the present invention further includes horizontal real space position estimation processing for estimating an X coordinate value and a Y coordinate value of a second index point among the plurality of index points on the basis of the position of the second index point in the captured image, the second index point being fixed to an object coordinate system.
According to the position and attitude estimation method, the X coordinate value and the Y coordinate value of the second index point are estimated on the basis of the position of the second index point in the captured image, the second index point being fixed in the object coordinate system. With this arrangement, the real space position of an object defined by the estimation results of the X coordinate value and the Y coordinate value in addition to the estimation result of the Z coordinate value of the second index point is estimated with high accuracy.
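A minimal sketch of this back-projection under the same pinhole assumption as before (the principal point (u0, v0) and the focal length in pixels are assumed to be known from camera calibration; none of these names come from the specification):

```python
def estimate_xy(u, v, z, u0, v0, focal_px):
    """Back-project the pixel position (u, v) of the second index point to
    real-space X and Y coordinate values, given its already-estimated Z.
    A pinhole-model sketch; (u0, v0) is the assumed principal point."""
    x = (u - u0) * z / focal_px
    y = (v - v0) * z / focal_px
    return x, y

# A point imaged 200 px to the right of the principal point, at depth 400 mm.
x, y = estimate_xy(u=1200.0, v=800.0, z=400.0, u0=1000.0, v0=800.0, focal_px=2000.0)
```

This ordering (depth first, then X and Y) reflects the method: the Z coordinate value estimated from the index-area size is what scales the pixel offsets into real-space distances.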
In the position and attitude estimation method according to the present invention, a point included in a second index area, which is defined by a profile of the object that can be recognized through the imaging device, is preferably defined as the second index point.
The position and attitude estimation method improves the accuracy of estimating the real space position and the real space attitude of an object by improving the accuracy of measuring the position of the second index point in a captured image.
Preferably, the position and attitude estimation method according to the present invention further includes: first preparation processing for measuring a change in the size of at least one index area in the captured image of the object while changing the Z coordinate value of the at least one index point among the plurality of index points, thereby defining, as a first correlation, the correlation between the Z coordinate value of the at least one index point and the size of the at least one index area in the captured image, wherein the Z coordinate value of each of the plurality of index points is estimated according to the first correlation in the vertical real space position estimation processing.
According to the position and attitude estimation method, the Z coordinate value of each index point is estimated according to the first correlation defined in advance, thus improving the accuracy of estimating the real space position and the real space attitude of an object.
Preferably, the position and attitude estimation method according to the present invention further includes: second preparation processing for measuring a change in the size of the at least one index area in the captured image in an extending direction of the index straight line while changing the real space attitude of the at least one index area, thereby defining, as a second correlation, the correlation between the real space attitude of the at least one index area and the size thereof in the extending direction of the index straight line in the captured image, wherein the second real space attitude is estimated according to the second correlation in the second real space attitude estimation processing.
According to the position and attitude estimation method, the second real space attitude is estimated according to the second correlation defined in advance, so that the accuracy of estimating the real space position and the real space attitude of an object is improved.
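Analogously, the second preparation processing can be sketched under an assumed foreshortening model in which the size along the index straight line shrinks with the cosine of the tilt angle; the model and names are illustrative assumptions:

```python
import math

def fit_second_correlation(samples):
    """Fit the 'second correlation' between the tilt angle xi (radians) of
    an index area and its measured size along the index straight line,
    assuming the foreshortening model phi(xi) = phi0 * cos(xi)."""
    # Each sample (xi_rad, phi) yields an estimate phi0 = phi / cos(xi).
    return sum(phi / math.cos(xi) for xi, phi in samples) / len(samples)

def tilt_from_phi(phi, phi0):
    """Invert the correlation: the tilt angle that shrinks phi0 to phi."""
    return math.acos(min(1.0, phi / phi0))

# Second preparation processing: vary the attitude, record the size, fit.
phi0 = fit_second_correlation([(0.0, 40.0),
                               (math.pi / 6, 40.0 * math.cos(math.pi / 6))])
tilt = tilt_from_phi(20.0, phi0)
```

The `min(1.0, ...)` clamp guards against measurement noise pushing the size ratio above 1, which would otherwise make `acos` fail.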
According to the position and attitude estimation method as a first embodiment of the present invention, the position and the attitude of an object coordinate system (x, y, z) in a real space coordinate system (X, Y, Z) are estimated as the real space position and the real space attitude of a work W (object) (refer to the lower diagram of
The work W is, for example, a component to be assembled to another work disposed such that its position and attitude in the real space are maintained constant. When the work W has been assembled to the other work, the position and attitude estimation method in accordance with the present invention is carried out to evaluate the deviations of its real space position and real space attitude from a desired position and a desired attitude.
By maintaining the real space position and the real space attitude of the irradiation device 2 to be constant, an X coordinate value Xi and a Y coordinate value Yi of a first index point among a plurality of index points Pi (i=0, 1, 2, . . . N) are specified in advance. The position of a second index point in an object coordinate system is fixed. In the first embodiment, a plurality of index straight lines passing through paired index points Pi and Pj are non-parallel to each other. For example, as illustrated in the upper diagram of
The position and attitude estimation method in accordance with the present invention is implemented by processors (e.g. single core processors or multicore processors) constituting a computer, which read necessary data and software from a memory in the processor or from an external memory and carry out arithmetic processing on the data according to the software. For example, each processing element working as a functional element that carries out the processing in each step illustrated in
The first size measurement processing element may have a function for controlling the operation of the imaging device 4 in addition to the operation of a drive mechanism (which is composed primarily of a motor and a power transmission mechanism) for adjusting the position and attitude of the imaging device 4 in the real space defined by an XYZ coordinate system. The first size measurement processing element may have a function for controlling the operation of the irradiation device 2 in addition to the operation of a drive mechanism for adjusting the position and attitude of the irradiation device 2 (or the base 1).
A first preparation processing element may have a function for controlling the operation of a drive mechanism for adjusting the relative positions or the relative positions and attitudes of the imaging device 4 and the work W (object) in order to change the Z coordinate value of at least one index point P among a plurality of index points.
A second preparation processing element may have a function for controlling the operation of a drive mechanism for adjusting the relative attitudes or the relative positions and attitudes of the imaging device 4 and the work W (object) in order to change the real space attitude of at least one index area.
Using the irradiation device 2, the light beam 20 is irradiated to a first index point Pi1 among the plurality of index points Pi of the work W (STEP02 of
An area derived from the appearance or the structure of the work W that can be identified in a captured image that has been acquired by the imaging device 4 is defined as a second index area Ai2, and a point included in the second index area Ai2, such as a central point of the second index area Ai2 or a point on the boundary line of the second index area Ai2, is defined as a second index point Pi2. The second index point Pi2 has its position fixed in the object coordinate system because of the nature of the second index point Pi2. In the example illustrated in
Subsequently, the captured image of the work W is acquired through the imaging device 4 (STEP04 of
Based on the captured images, the first size measurement processing is carried out to measure a size ϕi (or the diameter) of the index area Ai in an image coordinate system (STEP10 of
Further, the vertical real space position estimation processing is carried out to estimate the Z coordinate value Zi of the index point Pi on the basis of the measurement result of the size ϕi of the index area Ai in the image coordinate system (u, v) (STEP12 of
Based on the captured image, the horizontal real space position estimation processing is carried out to measure an X coordinate value Xi2 and a Y coordinate value Yi2 of a second index point Pi2 in the real space on the basis of the position (ui2, vi2) of the second index point Pi2 in an image coordinate system (u, v) (STEP14 of
The real space position (Xi2, Yi2, Zi2) of the second index point Pi2 is estimated as the real space position of the work W by carrying out the real space position estimation processing on the basis of the result of the vertical real space position estimation processing (refer to STEP12 of
Based on the result of the vertical real space position estimation processing (refer to STEP10 of
The first real space attitude θ1(i, j) may be estimated according to relational expression (01) on the basis of a difference ΔZi(k) of the Z coordinate value Zi(k) of the index point Pi in an arbitrarily chosen state, in which the attitude of the xy plane of the object coordinate system with respect to the XY plane of the real space is unknown, from the Z coordinate value Zi(0) of the index point Pi in a reference state, in which the attitude of the xy plane of the object coordinate system with respect to the XY plane of the real space is the reference attitude (e.g. parallel). The Z coordinate value Zi(0) of the index point Pi may be measured in advance after the attitude of the xy plane of the object coordinate system with respect to the XY plane of the real space is adjusted to the reference attitude.
θ1(i, j) = arctan {(ΔZj(k) − ΔZi(k))/Lij} (01)
For example, in the state illustrated in the lower diagram of
In contrast to the above, in the state illustrated in the lower diagram of
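Relational expression (01) can be evaluated directly; the numeric values below are illustrative, not taken from the embodiments:

```python
import math

def first_attitude(dz_i, dz_j, l_ij):
    """Relational expression (01): theta1(i, j) = arctan((dZj - dZi) / Lij),
    the first real space attitude of the index straight line through the
    paired index points Pi and Pj, where dZi and dZj are the Z-offsets of
    the two points from the reference state and Lij is their separation."""
    return math.atan((dz_j - dz_i) / l_ij)

# Equal Z-offsets: the line remains parallel to the XY plane (angle 0).
flat = first_attitude(dz_i=5.0, dz_j=5.0, l_ij=100.0)
# A 100 mm Z difference over a 100 mm separation: 45 degrees (pi/4 rad).
tilted = first_attitude(dz_i=0.0, dz_j=100.0, l_ij=100.0)
```

Note that only the difference of the two Z-offsets matters, which is why a uniform translation of the work W along the Z-axis leaves the estimated attitude unchanged.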
Based on the captured image, the second size measurement processing is carried out to measure the size of at least one index area among the plurality of index areas Ai in the extending direction of the index straight line in the image coordinate system (STEP20 of
Subsequently, based on the measurement result in the second size measurement processing, the second real space attitude estimation processing is carried out to estimate the real space attitude of at least one index area Ai as the second real space attitude (STEP22 of
For example, there is a correlation (a second correlation) between a tilt angle ξ of the index straight line Q01 with respect to the XY plane of the real space (corresponding to the x-axis of the object coordinate system in this example) and a size ϕij (ξ) of the index area Ai in the image coordinate system in the extending direction of the index straight line (the x direction in this example). According to the correlation, the size ϕ01 (ξ) decreases, as illustrated in the lower diagrams of
The second real space attitude θ2(i, j) is estimated according to relational expression (02) on the basis of a difference Δϕij(k) of the size ϕij(ξ(k)) of the index area Ai in an arbitrarily chosen state, in which the attitude of the xy plane of the object coordinate system relative to the XY plane of the real space is unknown, from the size ϕij(ξ(0)) of the index area Ai in the reference state, in which the attitude of the xy plane of the object coordinate system with respect to the XY plane of the real space is the reference attitude (e.g. parallel). The size ϕij(ξ(0)) of the index area Ai may be measured in advance after the attitude of the xy plane of the object coordinate system with respect to the XY plane of the real space is adjusted to the reference attitude.
θ2(i, j) = arccos {Δϕij(k)/ϕij(k)} (02)
The real space attitude estimation processing is carried out (STEP24 of
θ(i, j) = αθ1(i, j) + (1 − α)θ2(i, j) (0 < α < 1) (04)
The weight coefficient α is set to, for example, 0.5, but may be changed as necessary according to the difference in estimation accuracy between the first real space attitude θ1(i, j) and the second real space attitude θ2(i, j). For example, if the recognition accuracy of the first index area derived from the light beam in a captured image is higher than the recognition accuracy of the second index area derived from the structure or the appearance of the work W, then α may be set to a value larger than 0.5.
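The weighted consolidation of relational expression (04) is a one-line computation; the angle values below are illustrative:

```python
def consolidate(theta1, theta2, alpha=0.5):
    """Relational expression (04): weighted consolidation of the first and
    second real space attitude estimates, with 0 < alpha < 1."""
    assert 0.0 < alpha < 1.0
    return alpha * theta1 + (1.0 - alpha) * theta2

# Equal weighting averages the two estimates; a larger alpha trusts the
# light-beam-derived first estimate more (illustrative values, in radians).
theta_even = consolidate(0.30, 0.40)
theta_biased = consolidate(0.30, 0.40, alpha=0.8)
```

Raising α above 0.5 pulls the consolidated attitude toward θ1, matching the case where the light-beam-derived first index area is recognized more reliably than the structure-derived second index area.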
The position and attitude estimation method as a second embodiment of the present invention differs from the first embodiment in that the second size measurement processing (refer to STEP20 of
According to the second embodiment, a plurality of index straight lines Qij and Qi′j′ are disposed in parallel to each other or are the same straight line. For example, as illustrated in the upper diagram of
First size measurement processing, vertical real space position estimation processing, and first real space attitude estimation processing (refer to STEP10→STEP12→STEP18 in
θ = Σ_{m=1}^{M} βmθ1(im, jm) (0 < βm < 1, Σβm = 1) (06)
For example, in a reference state illustrated in the lower diagram of
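Relational expression (06) can be sketched as a weighted sum over the M index straight lines; the attitudes and weights below are illustrative:

```python
def consolidate_lines(attitudes, weights):
    """Relational expression (06): consolidate the first real space
    attitudes theta1(im, jm) of M identical or parallel index straight
    lines using weight coefficients beta_m that sum to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9
    return sum(b * th for b, th in zip(weights, attitudes))

# Three parallel index straight lines, equally weighted (radians).
theta = consolidate_lines([0.30, 0.32, 0.34], [1.0 / 3, 1.0 / 3, 1.0 / 3])
```

Equal weights amount to averaging, which suppresses independent per-line measurement errors; unequal weights could favor index straight lines whose index areas are recognized more reliably.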
(Effect of the Present Invention)
According to the position and attitude estimation method in accordance with the present invention, the Z coordinate value of each index point Pi is estimated on the basis of the size ϕi of each index area Ai of the work W acquired through the imaging device 4 having the optical axis parallel to the Z-axis of the real space (refer to STEP04→STEP10→STEP12 of
In the imaging range of the imaging device 4, even in the situation where a part of the work W is covered by another object, the first index area Ai1 is formed by irradiating the light beam 20 to the first index point Pi1 of the work W while avoiding the other object or passing through a gap. A plurality of index areas Ai including the plurality of index points Pi on the same index straight line can thus be easily included in a captured image. Hence, even in such a situation, it is possible to easily estimate, with high accuracy, the real space position defined by at least the Z coordinate value Zi of the work W and the real space attitude of the work W defined by at least the first real space attitude θ1(i, j) of the index straight line Qij.
According to the first embodiment, the second real space attitude θ2(i, j) of at least one index area Ai in a captured image (dependent upon the perpendicular vector of the index area Ai) is estimated on the basis of the size ϕi, j of the index area Ai in the extending direction of the index straight line (refer to STEP20→STEP22 of
According to the second embodiment, in the real space attitude estimation processing (refer to STEP24 of
In the foregoing embodiments, the horizontal real space position estimation processing (refer to STEP14 of
As with the second embodiment, a plurality of index points Pi may be defined, and the angle of an index straight line Q13 (corresponding to the x-axis of the object coordinate system) passing through a first pair of second index points P1 and P3, and the angle of an index straight line Q24 (corresponding to the y-axis of the object coordinate system) passing through a second pair of second index points P2 and P4 with respect to the X-axis and the Y-axis may be estimated as first real space attitudes θ1(1, 3) and θ1(2, 4), and then the attitude of the work W may be estimated according to the same procedure as that of the first embodiment of the present invention.
In the foregoing embodiments, the first preparation processing may be carried out in advance. More specifically, the change in the size ϕk of at least one index area Ak in a captured image of the work W is measured while changing the Z coordinate value Zk of at least one index point Pk among the plurality of index points Pi, thereby defining the correlation between the Z coordinate value Zk and the size ϕk as the first correlation. Thus, the Z coordinate value Zi of each index point Pi is estimated according to the first correlation in the vertical real space position estimation processing (STEP12 of
In the foregoing embodiments, the second preparation processing may be carried out in advance. More specifically, the change in the size ϕij of at least one index area Ai in a captured image in the extending direction of the index straight line Qij is measured while changing the real space attitude θ(i, j) of that particular index area Ai, thereby defining the correlation between the real space attitude θ(i, j) and the size ϕij in the captured image in the extending direction of the index straight line Qij as the second correlation. Thus, the second real space attitude θ2(i, j) is estimated according to the second correlation in the second real space attitude estimation processing (refer to STEP22 of
(Foreign Application Priority Data)

| Number | Date | Country | Kind |
|---|---|---|---|
| 2016-190454 | Sep 2016 | JP | national |

(U.S. Patent Documents)

| Number | Name | Date | Kind |
|---|---|---|---|
| 20060290781 | Hama | Dec 2006 | A1 |

(Foreign Patent Documents)

| Number | Date | Country |
|---|---|---|
| 62-175603 | Aug 1987 | JP |
| 03-092712 | Apr 1991 | JP |
| 06-259536 | Sep 1994 | JP |
| 2534517 | Jun 1996 | JP |
| 09-304013 | Nov 1997 | JP |
| 2001-227925 | Aug 2001 | JP |
| 4602704 | Oct 2010 | JP |
| 2013-088169 | May 2013 | JP |

(Other References)

Japanese Office Action dated May 8, 2018, 3 pages.

(Publication)

| Number | Date | Country |
|---|---|---|
| 20180089853 A1 | Mar 2018 | US |