This application claims the benefit of Chinese application Serial No. 201911308696.3, filed on Dec. 18, 2019, the disclosure of which is incorporated by reference herein in its entirety.
The present disclosure relates in general to an automated calibration system for a coordinate frame, and more particularly to an automated calibration system for a workpiece coordinate frame of a robot. In addition, the present disclosure also relates to an automated calibration method applied to the automated calibration system for the workpiece coordinate frame of the robot.
As technology progresses, various robots have been applied more and more widely to different fields. Generally, a typical robot is formed as an articulated robot arm having multiple joints, and is usually furnished with a tool, such as a welding tool or a drilling tool, at an end of the robot arm. However, automated operation of the robot is conventionally achieved through human teaching in advance.
Prior to operating the robot, a position of a tool center point (TCP) needs to be precisely calibrated, such that a controller of the robot can instruct a tooling having the TCP to move along an expected path.
Nevertheless, as the working path of the robot becomes more and more complicated, accuracy of the working path is highly affected by the precision of the robot. In addition, position accuracy of the robot with respect to the workpiece coordinate system directly affects the performance of the robot. Thus, the accuracy of the workpiece coordinate system is one of the important factors for precisely operating the robot.
Currently, before the robot can be automatically operated, the relative position relationship between the workpiece and the robot shall be confirmed in advance. However, position bias is always possible due to the precision of positioning devices, manufacturing tolerances of the workpiece and so on. Thus, prior to the robot work, precise coordinates for manufacturing shall be confirmed in advance through relevant calibrations in positioning the workpiece.
Conventionally, a typical position calibration method for a workpiece is to overlap the TCP with each of the designated points on the workpiece through human teaching, and to record coordinates of the workpiece each time the points coincide.
In the human teaching, the robot is taught to overlap the tool center with each of multiple designated points in a workpiece coordinate system, after which the calibration is complete. However, the performance of this calibration method is highly influenced by the user's experience. In addition, the calibration is also degraded by limited accuracy of the robot and improper operational behaviors, through which the tool center might collide with and thus damage the workpiece.
On the other hand, an automated calibration method is also introduced in the art. However, this automated calibration method does have the following disadvantages.
(1) In this conventional automated calibration method, external image sensors are required for monitoring the robot, and coordinates of the workpiece are corrected by having designated points overlap target points or by measuring distances to the target points. When this calibration method is applied to different tooling, it is quite possible that images of some designated points might be blocked by the robot itself or some other objects.
(2) According to this conventional calibration method, the image sensors are mounted and measured in advance so as to confirm relative positions between the sensors and the robot, or the tool center is applied to directly touch the workpiece. However, both of the aforesaid operations may damage the workpiece, and human operation error might be inevitable.
(3) In this conventional calibration method, a CAD file can alternatively be applied to obtain relative distances between the robot and the calibration device or the workpiece. However, the foregoing operation is time-consuming, and the workpiece shall have obvious characteristic points so that practical measurements can be successfully made to reduce possible errors.
This disclosure is an improvement of a previous utility patent application, titled “System and method for calibrating tool center point of robot”, with application Ser. No. 15/845,168 filed on Dec. 18, 2017, hereinafter “the prior application”.
In this disclosure, improvements over the prior application include at least the following.
(1) Images of the designated points can always be obtained, without obstacles such as the robot itself or the tooling blocking the image sensors.
(2) Cost for constructing the entire system is substantially reduced.
Thus, the automated calibration system for a workpiece coordinate frame of a robot and the method thereof provided by this disclosure can perform calibration without directly contacting or colliding with the workpiece, can avoid blocking by the robot itself or the tooling, does not need to calibrate the position relationship between the image sensors and the robot prior to practical operations, needs no additional image sensors above the workpiece, can complete the position calibration of the workpiece in one single step, and can thereby improve various disadvantages of the prior art.
In one embodiment of this disclosure, an automated calibration system for a workpiece coordinate frame of a robot includes a physical image sensor having a first image central axis and disposed on a flange of the robot, and a controller for controlling the physical image sensor and the robot to rotate by an angle so as to construct a virtual image sensor having a second image central axis. The second image central axis and the first image central axis intersect at an intersection point. The controller controls the robot to move a characteristic point on a workpiece repeatedly back and forth between the first image central axis and the second image central axis until the characteristic point overlaps the intersection point; the characteristic point is then recorded as a calibration point including coordinates of a plurality of joints of the robot. A next characteristic point is introduced and the aforesaid movement is repeated for overlapping the intersection point, again and again, until a sufficient number of calibration points is obtained, and the plurality of calibration points are evaluated to calculate coordinates of a virtual tool center point and of the workpiece with respect to the robot.
In another embodiment of this disclosure, an automated calibration method for a workpiece coordinate frame of a robot includes a step of providing a physical image sensor, forming an image coordinate system and having a first image central axis, disposed on a flange at an end of the robot; a step of applying the controller to rotate the physical image sensor and the robot by an angle so as to construct a virtual image sensor having a second image central axis, the second image central axis intersecting the first image central axis to form an intersection point; a step of the controller controlling the robot to move a characteristic point on a workpiece repeatedly back and forth between the first image central axis and the second image central axis, until the characteristic point and the intersection point overlap, and recording the characteristic point as a calibration point including coordinates of a plurality of joints of the robot; a step of the controller controlling the robot to move a next characteristic point according to the aforesaid movement to overlap the intersection point so as to generate a plurality of other calibration points; and a step of calculating, based on the plurality of calibration points, coordinates of a virtual tool center point and of the workpiece with respect to the robot.
Further scope of applicability of the present application will become more apparent from the detailed description given hereinafter. However, it should be understood that the detailed description and specific examples, while indicating exemplary embodiments of the disclosure, are given by way of illustration only, since various changes and modifications within the spirit and scope of the disclosure will become apparent to those skilled in the art from this detailed description.
The present disclosure will become more fully understood from the detailed description given herein below and the accompanying drawings which are given by way of illustration only, and thus are not limitative of the present disclosure and wherein:
In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the disclosed embodiments. It will be apparent, however, that one or more embodiments may be practiced without these specific details. In other instances, well-known structures and devices are schematically shown in order to simplify the drawing.
Referring to the system framework of
It shall be emphasized that, in the system and method provided by this disclosure, the physical tooling is not needed during the calibration process; instead, the characteristic points or the designated points on the workpiece W are used for calibration. On the workpiece W, the only requirement for a qualified characteristic point is that it be a point within an intersection region of the visual fields of the physical image sensor 11 and the virtual image sensor 12, such as a center or an intersection point of lines and planes. As shown in
The physical image sensor 11 has a visual field, which has a first image central axis A, and is disposed on a flange F at an end of the robot R. The flange F is defined with a coordinate system (xf-yf-zf). The visual field of the physical image sensor 11 intersects a Z axis (zf) originating at the center of the flange F, and the Z axis (zf) is perpendicular to the horizontal plane spanned by the X axis (xf) and the Y axis (yf) of the coordinate system (xf-yf-zf).
Firstly, after the robot R moves an arbitrary designated point on the workpiece W into an arbitrary position within the visual field of the physical image sensor 11, the position of the designated point is defined as an origin O (not shown in the figure) of an image coordinate system. The image coordinate system is the coordinate system (x1C-y1C-z1C) formed by the images captured by the physical image sensor 11.
After the origin O is moved to overlap a designated point, the robot R moves in the $x_R$ direction of the coordinate system of the robot R by an arbitrary distance $L_R$ so as to obtain a projected coordinate point $P'_{x1}=(x_{11}, y_{11})$ of the physical image sensor 11, and a spatial vector for this point $P'_{x1}=(x_{11}, y_{11})$ is defined as $\vec{U_1}=(-x_{11}, -y_{11}, -z_{11})$.
Similarly, after the origin O is moved to overlap another designated point, the robot R moves in the $y_R$ direction of the coordinate system of the robot R by the arbitrary distance $L_R$ so as to obtain another projected coordinate point $P'_{y1}=(x_{21}, y_{21})$ of the physical image sensor 11, and a spatial vector for this point $P'_{y1}=(x_{21}, y_{21})$ is defined as $\vec{V_1}=(-x_{21}, -y_{21}, -z_{21})$.
Similarly, after the origin O is moved to overlap a further designated point, the robot R moves in the $z_R$ direction of the coordinate system of the robot R by the arbitrary distance $L_R$ so as to obtain a further projected coordinate point $P'_{z1}=(x_{31}, y_{31})$ of the physical image sensor 11, and a spatial vector for this point $P'_{z1}=(x_{31}, y_{31})$ is defined as $\vec{W_1}=(-x_{31}, -y_{31}, -z_{31})$.
According to the orthogonality of the coordinate system, the following simultaneous equations can be obtained:
$\vec{U_1} \cdot \vec{V_1} = 0$  (1)
$\vec{V_1} \cdot \vec{W_1} = 0$  (2)
$\vec{U_1} \cdot \vec{W_1} = 0$  (3)
Thus, constant vectors $\vec{U_1}$, $\vec{V_1}$, $\vec{W_1}$ can be obtained.
Theoretically, these simultaneous equations have two sets of solutions, $(z_{11}, z_{21}, z_{31})$ and $(z_{12}, z_{22}, z_{32})$, in which $(z_{11}, z_{21}, z_{31}) = -(z_{12}, z_{22}, z_{32})$. Thus, the distance variation between two arbitrary designated points on the workpiece in the image can be utilized to judge whether, while moving in the coordinate system (xR-yR-zR) of the robot R, the designated point moves toward or away from the physical image sensor 11, such that the correct branch solution can be determined.
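As a minimal numerical sketch of this branch structure, the following Python function (the name and argument layout are illustrative, not from the disclosure) solves equations (1)-(3) for the unknown depth components and returns both mirrored solution sets, assuming the observed motions are non-degenerate so that the square root is real:

```python
import numpy as np

def solve_depth_components(px1, py1, pz1):
    """Solve the orthogonality constraints (1)-(3) for the unknown
    z-components of the spatial vectors U1, V1, W1.

    px1, py1, pz1 are the projected image coordinates (x11, y11),
    (x21, y21), (x31, y31) observed after moving the designated point
    the distance LR along xR, yR and zR, respectively.
    """
    (x11, y11), (x21, y21), (x31, y31) = px1, py1, pz1
    # Image-plane parts of the dot products; orthogonality forces
    # z11*z21 = -a, z21*z31 = -b, z11*z31 = -c.
    a = x11 * x21 + y11 * y21
    b = x21 * x31 + y21 * y31
    c = x11 * x31 + y11 * y31
    # Eliminating z21 and z31 yields z11**2 = -a*c/b.
    z11 = np.sqrt(-a * c / b)
    z21 = -a / z11
    z31 = -c / z11
    branch = np.array([z11, z21, z31])
    return branch, -branch  # the two mirrored solution sets
```

The sign of the correct branch is then selected from the observed distance variation, as described above.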
Hence, the coordinates of the physical image sensor 11 with respect to the coordinate system of the flange F are:
in which ${}^{flange}T_{CCD}$ stands for the coordinates of the physical image sensor 11 with respect to the coordinate system of the flange F, and ${}^{base}T_{flange}$ stands for the coordinates of the flange F with respect to the base coordinate system (xb-yb-zb) of the robot R.
Thereupon, the transformation relationship between the coordinate system of the robot R and that of the physical image sensor 11 can be obtained as follows:
$S_R = {}^{base}T_{flange}\,{}^{flange}T_{CCD}\,S_c$  (5)
in which $S_c$ stands for the movement in the coordinate system (x1C-y1C-z1C) of the physical image sensor 11, and $S_R$ stands for the movement in the coordinate system (xR-yR-zR) of the robot R.
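To make the use of equation (5) concrete, a minimal sketch follows; treating $S_c$ as a pure displacement (homogeneous coordinate 0, so translations drop out) is an assumption not spelled out in the disclosure, and the function name is illustrative:

```python
import numpy as np

def image_motion_to_robot_motion(T_base_flange, T_flange_ccd, s_c):
    """Equation (5): S_R = base_T_flange @ flange_T_CCD @ S_c.

    T_base_flange, T_flange_ccd: 4x4 homogeneous transformation matrices.
    s_c: 3-vector movement expressed in the image coordinate system.
    Returns the corresponding movement in robot coordinates (xR-yR-zR).
    """
    s_c_h = np.append(s_c, 0.0)  # homogeneous 0: a direction, not a point
    s_r_h = T_base_flange @ T_flange_ccd @ s_c_h
    return s_r_h[:3]
```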
After the transformation relationship of coordinate systems between the physical image sensor 11 and the robot R has been established, then any point within the visual field of the physical image sensor 11 is rotated, with respect to an arbitrary coordinate axis of the flange F, by an angle θv to generate a second viewing angle to locate the virtual image sensor 12, as shown in
Referring now to
Step (a): Move the robot R to an arbitrary characteristic point WPi within the visual field of the physical image sensor 11, and assign $[0, 0, D_Z]$ to be the coordinates, with respect to the flange F, of another arbitrary point D within the visual field, i.e., $D = [0, 0, D_Z]$, as shown in
Step (b): Apply the physical image sensor 11 to obtain a coordinate C1 of the characteristic point WPi with respect to the image coordinate system, and then rotate the physical image sensor 11 twice about the Z axis (z1c axis) so as to generate two coordinates C2 and C3. Further, according to the arc formed by coordinates C1, C2, C3, calculate a center position D1 (a computational sketch follows Step (e)), as shown in
Step (c): Calculate a vector $(\vec{D_1I_1})_c$ from the center position D1 to a tool-image center I1, and transform the vector $(\vec{D_1I_1})_c$ into a vector with respect to the flange F, i.e., vector $(\vec{D_1I_1})_f = {}^{flange}T_{CCD}(\vec{D_1I_1})_c$. The tool-image center I1 is the intersection point of the Z axis (ztool) of the tooling coordinate system and the first image central axis A.
Step (d): Correct the coordinate of point D by $D = D + L(\vec{D_1I_1})_f$, in which $L(\cdot)$ stands for a distance function. Then, go back to perform Step (a), until the center position D1 overlaps the image center point I1; thus, the coordinate I of the virtual tool center point can be obtained. Based on each change in the vector $(\vec{D_1I_1})_c$, adjust the constants of the function $L(\cdot)$.
Step (e): Rotate the physical image sensor 11 an angle θv about an arbitrary coordinate axis of the flange F so as to generate the second viewing angle to locate the virtual image sensor 12.
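As an illustration of Step (b), the center position D1 can be computed as the circumcenter of the three sampled image coordinates. A minimal 2D sketch, assuming the three samples are not collinear (the function name is illustrative):

```python
import numpy as np

def circle_center_2d(c1, c2, c3):
    """Estimate the rotation center D1 of Step (b) as the circumcenter
    of the three image coordinates C1, C2, C3 sampled while rotating
    about the z1c axis. Each argument is an (x, y) pair."""
    (x1, y1), (x2, y2), (x3, y3) = c1, c2, c3
    # Intersect the perpendicular bisectors of C1C2 and C2C3,
    # written as a 2x2 linear system A @ [x, y] = b.
    A = np.array([[x2 - x1, y2 - y1],
                  [x3 - x2, y3 - y2]])
    b = 0.5 * np.array([x2**2 - x1**2 + y2**2 - y1**2,
                        x3**2 - x2**2 + y3**2 - y2**2])
    return np.linalg.solve(A, b)  # singular if the points are collinear
```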
Referring to
Step (a1): Apply the physical image sensor 11 to capture the image information including coordinates $(x_{c1}, y_{c1})$ of the designated point with respect to the physical image sensor 11.
Step (b1): Transform the movement $S_c$ in the image coordinate system into $S_R = {}^{base}T_{flange}\,{}^{flange}T_{CCD}\,S_c$ in the coordinate system of the robot R, as shown in
Step (c1): Move the robot R along SR, until the designated point reaches the coordinate axis of the physical image sensor 11.
Step (d1): If the designated point does not overlap the center point of the physical image sensor 11, then go back to perform Step (a1). Otherwise, if the designated point overlaps the intersection point of the axial lines, then the visual servo control process is completed.
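One iteration of this visual servo loop might be sketched as follows; the `sensor` and `controller` interfaces, the pixel tolerance `tol`, and the control law that drives the image error toward the image center are all assumptions layered on Steps (a1)-(d1), not APIs from the disclosure:

```python
import numpy as np

def visual_servo_step(controller, robot, sensor,
                      T_base_flange, T_flange_ccd, tol=1.0):
    """One pass of Steps (a1)-(d1). Returns True once the designated
    point has reached the image central axis within tolerance."""
    xc1, yc1 = sensor.locate_designated_point()        # Step (a1)
    if np.hypot(xc1, yc1) < tol:                       # Step (d1): done
        return True
    s_c = np.array([-xc1, -yc1, 0.0, 0.0])             # error toward center
    s_r = (T_base_flange @ T_flange_ccd @ s_c)[:3]     # Step (b1), eq. (5)
    controller.move_relative(robot, s_r)               # Step (c1)
    return False
```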
Referring to
The controller 13 controls the robot R as well as the physical image sensor 11 or the virtual image sensor 12 to rotate synchronously by an angle, so as to have an arbitrary characteristic point WPi on the workpiece W to move repeatedly back and forth between the first image central axis A and the second image central axis B, as shown in
From
Then, the controller 13 further determines whether or not a quantity of the calibration points is larger than or equal to a preset value. In this embodiment, the quantity of the calibration points needs to be larger than or equal to 3. If the quantity of the calibration points is less than 3, then the controller 13 can apply a random-number generator to produce Euler-angle increments ΔRx, ΔRy, ΔRz for correcting the Euler angles of the robot R so as to vary the posture of the robot R. At this time, the Euler angles of the robot R can be expressed as (Rx+ΔRx, Ry+ΔRy, Rz+ΔRz), in which (Rx, Ry, Rz) stands for the original Euler angles of the robot R, Rx stands for the yaw angle, Ry stands for the pitch angle, and Rz stands for the roll angle. If the corrected Euler angles exceed a motion region of the robot R or the overlapped region IA, then the controller 13 applies the random-number generator again to generate a new set of Euler-angle increments.
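The rejection-sampling behavior described above might look like the following sketch; the `is_valid` predicate (covering both the motion region and the overlapped region IA) and the ±10-degree bound are assumed placeholders:

```python
import numpy as np

def perturb_euler_angles(rx, ry, rz, is_valid, max_inc_deg=10.0, rng=None):
    """Draw random Euler-angle increments (dRx, dRy, dRz) and retry
    until the corrected posture (yaw, pitch, roll) passes the validity
    check, mirroring the controller's rejection step."""
    rng = rng or np.random.default_rng()
    while True:
        d_rx, d_ry, d_rz = rng.uniform(-max_inc_deg, max_inc_deg, size=3)
        candidate = (rx + d_rx, ry + d_ry, rz + d_rz)
        if is_valid(*candidate):
            return candidate
```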
Then, after a new Euler angle and a next characteristic point WPi are obtained, the controller 13 controls the robot R to move the virtual tool center point TCP repeatedly back and forth between the first image central axis A and the second image central axis B. When the virtual tool center point TCP overlaps the second image central axis B, a second calibration point CP2 is recorded.
Then, the controller 13 determines whether or not the quantity of the calibration points is larger than or equal to 3. In the case that the controller 13 judges that the quantity of the calibration points is less than 3, the controller 13 repeats the aforesaid procedures to obtain and record a third calibration point CP3. The same process is repeated until the controller 13 confirms that the quantity of the calibration points is larger than or equal to 3.
As described above, in the calibration process of this disclosure, at least 3 designated points in the workpiece coordinate system are adopted as the aforesaid designated or characteristic points. For example, these three designated points can be the origin of the workpiece coordinate system, an arbitrary point on the X axis of the workpiece coordinate system, and an arbitrary point on the X-Y plane of the workpiece coordinate system. Firstly, the robot R is controlled to move an i-th designated point in the workpiece coordinate system (i.e., the i-th characteristic point WPi on the workpiece W) into an overlapped visual region of the physical image sensor 11 and the virtual image sensor 12. The aforesaid moving operation is repeated until the number i is greater than a preset number, so that the information-collection process of the designated points for calibration can be completed. Based on the plurality of calibration points, coordinates of the virtual TCP and the workpiece can be calculated.
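Once the three designated points (origin, X-axis point, X-Y-plane point) are known in robot base coordinates, the workpiece frame itself can be assembled by elementary vector operations. A minimal sketch under that assumption (the function name is illustrative; the disclosure itself defers the solve to its equations):

```python
import numpy as np

def workpiece_frame_from_points(p_origin, p_x, p_xy):
    """Build base_T_workpiece from three points expressed in the robot
    base coordinate system: the workpiece origin, a point on its X axis,
    and a point on its X-Y plane."""
    x = p_x - p_origin
    x = x / np.linalg.norm(x)            # unit X axis
    v = p_xy - p_origin                  # any in-plane vector
    z = np.cross(x, v)
    z = z / np.linalg.norm(z)            # unit Z axis, normal to X-Y plane
    y = np.cross(z, x)                   # completes a right-handed frame
    T = np.eye(4)
    T[:3, 0], T[:3, 1], T[:3, 2], T[:3, 3] = x, y, z, p_origin
    return T
```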
As shown in
The coordinates of the virtual tool center point TCP can be derived by the following equation (6):
$T_1^i T_2 = P$  (6)
in which matrix $T_1^i$ is a 4×4 transformation matrix for transforming coordinates of the i-th calibration point from the basic coordinate system (xb-yb-zb) to the coordinate system (xf-yf-zf) of the flange F, column $T_2$ stands for the coordinates of the virtual tool center point TCP in the coordinate system of the flange F, and column $P$ stands for the coordinates of the calibration point in the basic coordinate system (xb-yb-zb). Since each calibration point yields three linear equations through equation (6), 3n linear equations can be obtained for n calibration points. Then, a pseudo-inverse matrix can be applied to derive the coordinates of the virtual tool center point TCP. As shown below, equation (7) is derived from equation (6).
in which column $(e_{11}^i, e_{21}^i, e_{31}^i)$ stands for the coordinate vector of the $x_f$ axis for the i-th calibration point in the basic coordinate system (xb-yb-zb), column $(e_{12}^i, e_{22}^i, e_{32}^i)$ stands for the coordinate vector of the $y_f$ axis for the i-th calibration point in the basic coordinate system (xb-yb-zb), and column $(e_{13}^i, e_{23}^i, e_{33}^i)$ stands for the coordinate vector of the $z_f$ axis for the i-th calibration point in the basic coordinate system (xb-yb-zb). From equation (7), equations (8) and (9) can be derived as follows:
in which $T_3^t$ is the transpose matrix of $T_3$, and $(T_3 T_3^t)^{-1}$ is the inverse matrix of $(T_3 T_3^t)$.
If the quantity of the calibration points is sufficient, the entries of matrix $T_1^i$ for each known i-th calibration point are plugged into equation (8), and, through a shift operation upon matrix $T_3$, equation (9) can be obtained. Thus, the coordinates $(T_x, T_y, T_z)$ of the virtual tool center point TCP in the coordinate system of the flange F and the coordinates $(P_x, P_y, P_z)$ of the virtual tool center point TCP in the coordinate system (xR-yR-zR) of the robot R can be obtained, the calibration of the coordinates $(T_x, T_y, T_z)$ of the virtual tool center point TCP can be completed, and the coordinates for the workpiece W can be calculated.
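The pseudo-inverse step can be illustrated by stacking the three linear equations contributed by each calibration point and solving in the least-squares sense. This sketch assumes, consistent with equation (6), that all n recorded postures bring the virtual TCP to one common spatial point P; the function name is illustrative:

```python
import numpy as np

def solve_virtual_tcp(flange_poses):
    """flange_poses: list of n 4x4 matrices base_T_flange, one per
    calibration posture. Each pose contributes R_i @ T + t_i = P,
    i.e. three equations in the six unknowns (Tx, Ty, Tz, Px, Py, Pz).
    Returns the TCP in flange coordinates and the point P in base
    coordinates."""
    rows, rhs = [], []
    for T_bf in flange_poses:
        R, t = T_bf[:3, :3], T_bf[:3, 3]
        rows.append(np.hstack([R, -np.eye(3)]))  # [R_i  -I] [T; P] = -t_i
        rhs.append(-t)
    A = np.vstack(rows)                          # 3n x 6 stacked system
    b = np.hstack(rhs)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)    # pseudo-inverse solve
    return x[:3], x[3:]                          # (Tx,Ty,Tz), (Px,Py,Pz)
```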
As described above, in this embodiment, the automated calibration system 1 for a workpiece coordinate frame of a robot can apply the visual servo means to automatically calibrate the coordinates of the robot with respect to the workpiece to be machined, with an acceptable calibration precision. Thus, related labor and time costs can be effectively reduced. In addition, since the automated calibration system 1 can calibrate the coordinates of the robot with respect to the workpiece in a single calibration process, the automated calibration system 1 provided by this disclosure can effectively improve existing shortcomings in the art.
Referring now to
Step S51: Control the robot R to move an arbitrary designated point within a visual field of the physical image sensor 11 a distance LR along a horizontal axis xR of the coordinate system (xR-yR-zR) of the robot R from an arbitrary position in an image-overlapped region IA, and obtain a first projection coordinate $P'_{x1}$ through the physical image sensor 11.
Step S52: Control the robot R to move the designated point the distance LR along a vertical axis yR of the coordinate system (xR-yR-zR) of the robot R from the aforesaid position in the image-overlapped region IA, and obtain a second projection coordinate $P'_{y1}$ through the physical image sensor 11.
Step S53: Control the robot R to move the designated point the distance LR along another vertical axis zR of the coordinate system (xR-yR-zR) of the robot R from the aforesaid position in the image-overlapped region IA, and obtain a third projection coordinate $P'_{z1}$ through the physical image sensor 11.
Step S54: Provide a first spatial vector $\vec{U_1}$, a second spatial vector $\vec{V_1}$ and a third spatial vector $\vec{W_1}$ corresponding to the first projection coordinate $P'_{x1}$, the second projection coordinate $P'_{y1}$ and the third projection coordinate $P'_{z1}$, respectively.
Step S55: Based on the orthogonal relationship formed by the first spatial vector $\vec{U_1}$, the second spatial vector $\vec{V_1}$ and the third spatial vector $\vec{W_1}$, calculate the first spatial vector $\vec{U_1}$, the second spatial vector $\vec{V_1}$ and the third spatial vector $\vec{W_1}$.
Step S56: Based on the first spatial vector $\vec{U_1}$, the second spatial vector $\vec{V_1}$ and the third spatial vector $\vec{W_1}$, calculate a transformation relationship between the coordinate system (xR-yR-zR) of the robot R and the coordinate system (x1C-y1C-z1C) of the physical image sensor 11, referring to equation (4).
Referring to
Step S61: Control the robot R to move an arbitrary designated point within a visual field of the virtual image sensor 12 a distance LR along a horizontal axis xR of the coordinate system (xR-yR-zR) of the robot R from an arbitrary position in an image-overlapped region IA, and obtain a first projection coordinate $P'_{x2}$ through the virtual image sensor 12.
Step S62: Control the robot R to move the designated point the distance LR along a vertical axis yR of the coordinate system (xR-yR-zR) of the robot R from the aforesaid position in the image-overlapped region IA, and obtain a second projection coordinate $P'_{y2}$ through the virtual image sensor 12.
Step S63: Control the robot R to move the designated point the distance LR along another vertical axis zR of the coordinate system (xR-yR-zR) of the robot R from the aforesaid position in the image-overlapped region IA, and obtain a third projection coordinate $P'_{z2}$ through the virtual image sensor 12.
Step S64: Provide a first spatial vector $\vec{U_2}$, a second spatial vector $\vec{V_2}$ and a third spatial vector $\vec{W_2}$ corresponding to the first projection coordinate $P'_{x2}$, the second projection coordinate $P'_{y2}$ and the third projection coordinate $P'_{z2}$, respectively.
Step S65: Based on the orthogonal relationship formed by the first spatial vector $\vec{U_2}$, the second spatial vector $\vec{V_2}$ and the third spatial vector $\vec{W_2}$, calculate the first spatial vector $\vec{U_2}$, the second spatial vector $\vec{V_2}$ and the third spatial vector $\vec{W_2}$.
Step S66: Based on the first spatial vector $\vec{U_2}$, the second spatial vector $\vec{V_2}$ and the third spatial vector $\vec{W_2}$, calculate a transformation relationship between the coordinate system (xR-yR-zR) of the robot R and a coordinate system (x2C-y2C-z2C) of the virtual image sensor 12, referring to equation (8).
Referring now to
Step S71: Provide a physical image sensor 11 having a first image central axis A.
Step S72: Provide a virtual image sensor 12 having a second image central axis B intersecting the first image central axis A at an intersection point I.
Step S73: Control a robot R to move a characteristic point WPi of a workpiece repeatedly back and forth between the first image central axis A and the second image central axis B.
Step S74: When the characteristic point WPi and the intersection point I overlap, record a calibration point including coordinates of a plurality of joints J1-J6 of the robot R.
Step S75: Repeat the aforesaid steps to generate a plurality of other calibration points.
Step S76: Based on all the plurality of calibration points, calculate coordinates of a virtual tool center point TCP and the workpiece.
Referring now to
Step S81: Given i=1, define an i-th characteristic point WPi of a workpiece W.
Step S82: A controller 13 controls a robot R to move the characteristic point WPi into a common visual field of a physical image sensor 11 and a virtual image sensor 12.
Step S83: The controller 13 moves the characteristic point WPi repeatedly back and forth between a first image central axis A and a second image central axis B, until the characteristic point WPi hits an intersection point I of the first image central axis A and the second image central axis B.
Step S84: The controller 13 determines whether or not a distance between the characteristic point WPi and the intersection point I of the first image central axis A and the second image central axis B is smaller than a threshold value. If positive, go to perform Step S85. If negative, go to perform Step S841.
Step S841: The controller 13 applies a random-number generator to generate Euler-angle increments (ΔRx,ΔRy,ΔRz) to correct corresponding Euler angles of the robot R, and the method goes back to perform Step S83.
Step S85: The controller 13 records a first set of joint values of the characteristic point WPi, i.e., a first calibration point.
Step S86: The controller 13 determines whether or not the quantity of recorded sets of joint values is larger than or equal to 4. If positive, go to perform Step S87. If negative, go to perform Step S841. In this disclosure, the aforesaid quantity can be any other integer.
Step S87: The controller 13 determines whether or not a quantity of the characteristic points WPi is larger than or equal to a preset number. If positive, go to perform Step S88. If negative, go to perform Step S871.
Step S871: Set i = i + 1, and go back to perform Step S82.
Step S88: Derive coordinates of a virtual tool center point TCP and the workpiece W with respect to the robot R by a tool-center calibration method which is disclosed in the prior application.
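Tying Steps S81-S88 together, the overall flow might be sketched as below; every method on the hypothetical `controller` object is a placeholder for an operation described above, not an API from the disclosure, and the counts of 3 characteristic points and 4 postures follow this embodiment:

```python
def calibrate_workpiece_frame(controller, robot, n_points=3, n_postures=4):
    """High-level sketch of the flow of Steps S81-S88."""
    calibration_points = []
    for i in range(1, n_points + 1):                   # S81 / S871
        controller.move_into_common_field(robot, i)    # S82
        joint_sets = []
        while len(joint_sets) < n_postures:            # S86
            # S83/S84: servo until WPi reaches the intersection point I,
            # re-drawing random Euler increments on failure (S841).
            while not controller.servo_to_intersection(robot, i):
                controller.apply_random_euler_increments(robot)
            joint_sets.append(controller.read_joint_values(robot))  # S85
            if len(joint_sets) < n_postures:
                controller.apply_random_euler_increments(robot)  # new posture
        calibration_points.append(joint_sets)          # S87 loop control
    # S88: tool-center calibration method of the prior application
    return controller.solve_tcp_and_workpiece(calibration_points)
```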
As described above, in the automated calibration system and method for a workpiece coordinate frame of a robot provided by this disclosure, the calibration process mainly includes four portions: (1) mounting a physical image sensor on a flange at an end of the robot; (2) obtaining a transformation relationship between the coordinate system of the flange of the robot and that of the physical image sensor so as to transform motion information captured as an image into information convenient for the robot; (3) applying a multi-vision means to construct the virtual image sensor and the position of the virtual tool center point so as to generate a 2.5D machine vision; and (4) applying a visual servo means to control the robot to overlap a designated point on the workpiece with an intersection point of two image axes. In addition, the aforesaid overlapping operation can be achieved by either of the following two methods: (41) controlling the robot, in four (for example) different postures, to overlap the virtual tool center point with the origin of the workpiece coordinate system, then overlapping the virtual tool center point of any posture of the robot with each of an arbitrary point on the X axis of the workpiece coordinate system and another arbitrary point on the X-Y plane, and recording the coordinates; or (42) controlling the robot, in four (for example) different postures, to overlap the virtual tool center point with each of four known coordinate points of the workpiece coordinate system on the workpiece, and recording the coordinates.
According to this disclosure, only one physical image sensor needs to be furnished on the flange at the end of the robot, and, by calibrating through the characteristic points, the calibration process avoids direct contact between the workpiece and the robot and can complete the position correction of the workpiece in a single calibration process. Thereupon, the calibration precision can be effectively enhanced.
With respect to the above description then, it is to be realized that the optimum dimensional relationships for the parts of the disclosure, to include variations in size, materials, shape, form, function and manner of operation, assembly and use, are deemed readily apparent and obvious to one skilled in the art, and all equivalent relationships to those illustrated in the drawings and described in the specification are intended to be encompassed by the present disclosure.