System and method for calibrating tool center point of robot

Information

  • Patent Grant
  • Patent Number
    10,926,414
  • Date Filed
    Monday, December 18, 2017
  • Date Issued
    Tuesday, February 23, 2021
Abstract
A system for calibrating the tool center point of a robot is provided, which may include a first image sensor, a second image sensor, and a controller. The first image sensor may have a first image central axis. The second image sensor may have a second image central axis not parallel to the first image central axis and intersecting the first image central axis at an intersection point. The controller may control a robot to repeatedly move a tool center point thereof between the first and the second image central axes. The controller may record a calibration point including the coordinates of the joints of the robot when the tool center point overlaps the intersection point, and then move the tool center point and repeat the above steps to generate several calibration points, whereby the controller may calculate the coordinate of the tool center point according to the calibration points.
Description
CROSS REFERENCE TO RELATED APPLICATION

All related applications are incorporated by reference. The present application is based on, and claims priority from, Taiwan Application Serial Number 106133775, filed on Sep. 29, 2017, the disclosure of which is hereby incorporated by reference herein in its entirety.


TECHNICAL FIELD

The technical field relates to a system for calibrating tool center point of robot, in particular to an automatic system for calibrating tool center point of robot. The technical field further relates to the method of the system for calibrating tool center point of robot.


BACKGROUND

With the advance of technology, industrial manipulators are being used more widely in various industries. In general, the most frequently used industrial manipulator is the articulated robotic arm, which has several joints. One end of the robotic arm is equipped with a tool, such as a welding tool or a drilling tool, in order to perform various operations. The tool center point (TCP) of the tool of the robotic arm should be precisely calibrated before the robotic arm starts an operation. Then, the controller of the robotic arm can control the tool according to the tool center point so as to move the tool along the correct path.


SUMMARY

An embodiment of the present disclosure relates to a system for calibrating the tool center point of an industrial manipulator (hereinafter “robot”). The system includes a first image sensor, a second image sensor and a controller. The first image sensor includes a first image central axis. The second image sensor includes a second image central axis intersecting the first image central axis at an intersection point. The controller can control a robot to repeatedly move the tool center point of the tool thereof between the first image central axis and the second image central axis until the tool center point overlaps the intersection point. The controller can record a calibration point including the coordinates of the joints of the robot when the tool center point overlaps the intersection point, and then move the tool center point and repeat the above steps to generate a plurality of calibration points in order to calculate the coordinate of the tool center point according to the calibration points.


Another embodiment of the present disclosure relates to a method for calibrating tool center point of robot. The method includes the following steps: providing a first image sensor having a first image central axis; providing a second image sensor having a second image central axis not parallel to the first image central axis and intersecting the first image central axis at an intersection point; controlling a robot to repeatedly move the tool center point of the tool thereof between the first image central axis and the second image central axis until the tool center point overlaps the intersection point, and recording a calibration point including the coordinates of the joints of the robot; moving the tool center point, and repeating the above steps to generate a plurality of the calibration points; and calculating the coordinate of the tool center point according to the calibration points.


Further scope of applicability of the present application will become more apparent from the detailed description given hereinafter. However, it should be understood that the detailed description and specific examples, while indicating exemplary embodiments of the disclosure, are given by way of illustration only, since various changes and modifications within the spirit and scope of the disclosure will become apparent to those skilled in the art from this detailed description.





BRIEF DESCRIPTION OF THE DRAWINGS

The present disclosure will become more fully understood from the detailed description given herein below and the accompanying drawings which are given by way of illustration only, and thus are not limitative of the present disclosure and wherein:



FIG. 1 is a first schematic diagram of a system for calibrating tool center point of robot of a first embodiment in accordance with the present disclosure.



FIG. 2A˜FIG. 2C are the second to fourth schematic diagrams of the first embodiment.



FIG. 3A˜FIG. 3C are the fifth to seventh schematic diagrams of the first embodiment.



FIG. 4 is the eighth schematic diagram of the first embodiment.



FIG. 5˜FIG. 7 are the first to third flow charts of the first embodiment.



FIG. 8 is a flow chart of a system for calibrating tool center point of robot of a second embodiment in accordance with the present disclosure.





DETAILED DESCRIPTION

In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the disclosed embodiments. It will be apparent, however, that one or more embodiments may be practiced without these specific details. In other instances, well-known structures and devices are schematically shown in order to simplify the drawing.



FIG. 1, FIG. 2A˜FIG. 2C, FIG. 3A˜FIG. 3C and FIG. 4 are the first to eighth schematic diagrams of a system for calibrating tool center point of robot of a first embodiment in accordance with the present disclosure. As shown in FIG. 1, the system 1 for calibrating tool center point of robot includes a first image sensor 11, a second image sensor 12, and a controller 13. In an embodiment, the first image sensor 11 and the second image sensor 12 may each be a camera or other similar device. The robot R includes a main body M and a tool T disposed at one end of the main body M. Besides, the main body M includes a plurality of joints J1˜J6. The system 1 for calibrating tool center point of robot can be used to calibrate the tool center point TCP of the tool T.



FIG. 2A shows the image of the first image sensor 11; FIG. 2B shows the relation between the space vectors and the coordinate system (x1C-y1C-z1C) of the first image sensor 11; FIG. 2C shows the moving path that the tool center point TCP moves along the coordinate system (x1C-y1C-z1C) of the first image sensor 11. As shown in FIG. 2A and FIG. 2B, the transformation relation between the coordinate system (xR-yR-zR) of the robot R and the coordinate system (x1C-y1C-z1C) of the first image sensor 11 should be calculated before the calibration of the tool center point TCP.


The first step is to obtain the transformation relation between the coordinate system of the robot R and the coordinate system of the first image sensor 11. The controller 13 controls the robot R to move the tool center point TCP thereof, along the horizontal axis (xR axis) of the coordinate system of the robot R, from any position of the image overlapping area IA between the first image sensor 11 and the second image sensor 12 by a distance LR to obtain a projected coordinate point of the first image sensor 11. A first projected coordinate P′x1=(x11, y11) is obtained from the first image sensor 11. Then, the first space vector U1 of the first projected coordinate can be assumed to be U1=(x11, y11, z11), where z11 is unknown.


Next, the controller 13 controls the robot R to move the tool center point TCP thereof, along the longitudinal axis (yR axis) of the coordinate system of the robot R, from the aforementioned position of the image overlapping area IA between the first image sensor 11 and the second image sensor 12 by the distance LR to obtain a projected coordinate point of the first image sensor 11. A second projected coordinate P′y1=(x21, y21) is obtained from the first image sensor 11. Then, the second space vector V1 of the second projected coordinate can be assumed to be V1=(x21, y21, z21), where z21 is unknown.


Similarly, the controller 13 controls the robot R to move the tool center point TCP thereof, along the vertical axis (zR axis) of the coordinate system of the robot R, from the aforementioned position of the image overlapping area IA between the first image sensor 11 and the second image sensor 12 by the distance LR to obtain a projected coordinate point of the first image sensor 11. A third projected coordinate P′z1=(x31, y31) is obtained from the first image sensor 11. Then, the third space vector W1 of the third projected coordinate can be assumed to be W1=(x31, y31, z31), where z31 is unknown.


As the space vectors U1, V1 and W1 are perpendicular to one another, the controller 13 can calculate the space vectors U1, V1 and W1 according to Equation (1), Equation (2) and Equation (3):

U1·V1=0  (1)
V1·W1=0  (2)
U1·W1=0  (3)
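To make this step concrete, the following is a minimal numerical sketch (not part of the original disclosure) of how the unknown z-components can be recovered in closed form from Equations (1)˜(3), assuming NumPy; the function name and the positive sign choice are illustrative assumptions:

```python
import numpy as np

def recover_space_vectors(p_x, p_y, p_z):
    """Recover U1, V1, W1 from the projected coordinates P'x1, P'y1, P'z1
    observed while the TCP moves the distance LR along the robot's xR, yR
    and zR axes.  Only the z-components are unknown; Equations (1)-(3)
    (mutual perpendicularity) determine them up to a common sign."""
    a = p_x[0] * p_y[0] + p_x[1] * p_y[1]   # U1.V1 without the z-term
    b = p_y[0] * p_z[0] + p_y[1] * p_z[1]   # V1.W1 without the z-term
    c = p_x[0] * p_z[0] + p_x[1] * p_z[1]   # U1.W1 without the z-term
    # z11*z21 = -a, z21*z31 = -b, z11*z31 = -c  =>  z11^2 = -a*c/b
    z11 = np.sqrt(-a * c / b)               # sign ambiguity: -z11 also satisfies (1)-(3)
    z21, z31 = -a / z11, -c / z11
    return (np.array([p_x[0], p_x[1], z11]),
            np.array([p_y[0], p_y[1], z21]),
            np.array([p_z[0], p_z[1], z31]))
```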


Finally, the controller 13 can obtain the transformation relation between the coordinate system (xR-yR-zR) of the robot R and the coordinate system (x1C-y1C-z1C) of the first image sensor 11, as shown in Equation (4):

$$S_R = \begin{bmatrix} \vec{U}_1/\lVert\vec{U}_1\rVert & \vec{V}_1/\lVert\vec{V}_1\rVert & \vec{W}_1/\lVert\vec{W}_1\rVert \end{bmatrix}^{-1} S_{C1} \qquad (4)$$

In Equation (4), SC1 stands for the movement amount that the tool center point TCP moves along the coordinate system (x1C-y1C-z1C) of the first image sensor 11. As shown in FIG. 2C, SR stands for the movement amount that the tool center point TCP moves along the coordinate system (xR-yR-zR) of the robot R.


Therefore, when moving the tool center point TCP of the robot R, the controller 13 can obtain the movement amount SC1 from the images of the first image sensor 11. Then, the space vectors U1, V1 and W1 and their lengths ∥U1∥, ∥V1∥ and ∥W1∥ can be calculated according to Equations (1)˜(3). The controller 13 obtains a matrix by dividing the space vectors U1, V1 and W1 by their lengths ∥U1∥, ∥V1∥ and ∥W1∥ respectively, and then multiplies the inverse of that matrix by the movement amount SC1 to obtain the movement amount SR, i.e., the movement of the tool center point TCP along the coordinate system (xR-yR-zR) of the robot R.
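As an illustrative sketch of Equation (4) (assuming NumPy and the vectors recovered in the previous sketch; treating the normalized vectors as matrix columns is an assumption about the patent's layout):

```python
import numpy as np

def image_motion_to_robot_motion(U1, V1, W1, s_c1):
    """Apply Equation (4): S_R = [U1/|U1|  V1/|V1|  W1/|W1|]^-1 * S_C1."""
    M = np.column_stack([U1 / np.linalg.norm(U1),
                         V1 / np.linalg.norm(V1),
                         W1 / np.linalg.norm(W1)])
    return np.linalg.inv(M) @ s_c1   # movement amount S_R in the robot frame
```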



FIG. 3A shows the image of the second image sensor 12; FIG. 3B shows the relation between the space vectors and the coordinate system (x2C-y2C-z2C) of the second image sensor 12; FIG. 3C shows the moving path that the tool center point TCP moves along the coordinate system (x2C-y2C-z2C) of the second image sensor 12. As shown in FIG. 3A and FIG. 3B, the transformation relation between the coordinate system (xR-yR-zR) of the robot R and the coordinate system (x2C-y2C-z2C) of the second image sensor 12 should be calculated before the calibration of the tool center point TCP.


The controller 13 controls the robot R to move the tool center point TCP thereof, along the horizontal axis (xR axis) of the coordinate system of the robot R, from any position of the image overlapping area IA between the first image sensor 11 and the second image sensor 12 by a distance LR to obtain a projected coordinate point of the second image sensor 12. A first projected coordinate P′x2=(x12, y12) is obtained from the second image sensor 12. Then, the first space vector U2 of the first projected coordinate can be assumed to be U2=(x12, y12, z12), where z12 is unknown.


Next, the controller 13 controls the robot R to move the tool center point TCP thereof, along the longitudinal axis (yR axis) of the coordinate system of the robot R, from the aforementioned position of the image overlapping area IA between the first image sensor 11 and the second image sensor 12 by the distance LR to obtain a projected coordinate point of the second image sensor 12. A second projected coordinate P′y2=(x22, y22) is obtained from the second image sensor 12. Then, the second space vector V2 of the second projected coordinate can be assumed to be V2=(x22, y22, z22), where z22 is unknown.


Similarly, the controller 13 controls the robot R to move the tool center point TCP thereof, along the vertical axis (zR axis) of the coordinate system of the robot R, from the aforementioned position of the image overlapping area IA between the first image sensor 11 and the second image sensor 12 by the distance LR to obtain a projected coordinate point of the second image sensor 12. A third projected coordinate P′z2=(x32, y32) is obtained from the second image sensor 12. Then, the third space vector W2 of the third projected coordinate can be assumed to be W2=(x32, y32, z32), where z32 is unknown.


As the space vectors U2, V2 and W2 are perpendicular to one another, the controller 13 can calculate the space vectors U2, V2 and W2 according to Equation (5), Equation (6) and Equation (7):

U2·V2=0  (5)
V2·W2=0  (6)
U2·W2=0  (7)


Finally, the controller 13 obtains the transformation relation between the coordinate system (xR-yR-zR) of the robot R and the coordinate system (x2C-y2C-z2C) of the second image sensor 12, as shown in Equation (8):

$$S_R = \begin{bmatrix} \vec{U}_2/\lVert\vec{U}_2\rVert & \vec{V}_2/\lVert\vec{V}_2\rVert & \vec{W}_2/\lVert\vec{W}_2\rVert \end{bmatrix}^{-1} S_{C2} \qquad (8)$$

In Equation (8), SC2 stands for the movement amount that the tool center point TCP moves along the coordinate system (x2C-y2C-z2C) of the second image sensor 12, as shown in FIG. 3C; SR stands for the movement amount that the tool center point TCP moves along the coordinate system (xR-yR-zR) of the robot R.


Therefore, when moving the tool center point TCP of the robot R, the controller 13 can obtain the movement amount SC2 from the images of the second image sensor 12. Then, the space vectors U2, V2 and W2 and their lengths ∥U2∥, ∥V2∥ and ∥W2∥ can be calculated according to Equations (5)˜(7). The controller 13 can obtain a matrix by dividing the space vectors U2, V2 and W2 by their lengths ∥U2∥, ∥V2∥ and ∥W2∥ respectively, and then multiplies the inverse of that matrix by the movement amount SC2 to obtain the movement amount SR, i.e., the movement of the tool center point TCP along the coordinate system (xR-yR-zR) of the robot R.


In this way, the controller 13 can control the robot R to move the tool center point TCP thereof via visual servo control according to the transformation relation between the coordinate system (xR-yR-zR) of the robot R and the coordinate system (x1C-y1C-z1C) of the first image sensor 11, and the transformation relation between the coordinate system (xR-yR-zR) of the robot R and the coordinate system (x2C-y2C-z2C) of the second image sensor 12.


As shown in FIG. 4, the first image sensor 11 has a first image central axis A; in an embodiment, the first image sensor 11 may be a camera or other similar device.


The second image sensor 12 has a second image central axis B. The first image central axis A is not parallel to the second image central axis B, and intersects the second image central axis B at an intersection point I. Besides, the first image sensor 11 and the second image sensor 12 have an image overlapping area IA, so the first image sensor 11 and the second image sensor 12 can provide 2.5D machine vision. In another embodiment, the first image central axis A is substantially perpendicular to the second image central axis B. In an embodiment, the second image sensor 12 may be a camera or other similar device.


The controller 13 controls the robot R to repeatedly move the tool center point TCP of the tool T thereof between the first image central axis A and the second image central axis B. In an embodiment, the controller 13 may be any of various computer devices. Then, the controller 13 records a calibration point when the tool center point TCP overlaps the intersection point I of the first image central axis A and the second image central axis B. The controller 13 then changes the posture of the robot R to record the next calibration point. In this way, the controller 13 can obtain a plurality of calibration points in different postures of the robot R. Finally, the controller 13 calculates the coordinate of the tool center point TCP according to the calibration points, where each calibration point includes the coordinates of the joints J1˜J6. The coordinate of each joint is the rotation angle that the joint rotates relative to a default initial point. For example, if the joint angle θ stands for the coordinate of a joint, the coordinates of the joints J1˜J6 can be expressed as θJ1, θJ2, θJ3, θJ4, θJ5 and θJ6. Thus, a calibration point can be expressed as (θJ1, θJ2, θJ3, θJ4, θJ5, θJ6).
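A calibration point is therefore nothing more than the six joint coordinates; a minimal container for it could look as follows (an illustrative sketch, the class name is an assumption):

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass(frozen=True)
class CalibrationPoint:
    """Joint coordinates (thetaJ1..thetaJ6), each the rotation angle of a
    joint relative to its default initial point, recorded when the TCP
    overlaps the intersection point I."""
    joints: Tuple[float, float, float, float, float, float]
```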


As shown in FIG. 4, the controller 13 controls the robot R to move the tool center point TCP thereof to an initial point O in the image overlapping area IA of the first image sensor 11 and the second image sensor 12. The initial point O may be any point in the image overlapping area IA. Then, the controller 13 controls the robot R to move the tool center point TCP toward the first image central axis A from the initial point O to the point T1, as shown by the path PH1. Afterward, the controller 13 controls the robot R to move the tool center point TCP toward the second image central axis B from the point T1 to the point T2, as shown by the path PH2. Similarly, the controller 13 controls the robot R to move the tool center point TCP toward the first image central axis A from the point T2 to the point T3, as shown by the path PH3. Then the controller 13 controls the robot R to move the tool center point TCP toward the second image central axis B from the point T3 to the point T4, as shown by the path PH4. Finally, the controller 13 controls the robot R to move the tool center point TCP toward the first image central axis A from the point T4 to the intersection point I, and records a first calibration point CP1 when the tool center point TCP overlaps the intersection point I. In the embodiment, when the distance between the tool center point TCP and the first image central axis A and the distance between the tool center point TCP and the second image central axis B are both less than a threshold value, the tool center point TCP is considered to overlap the intersection point I. In general, the threshold value is set as 50% of the pixel size of the tool center point TCP. In other words, when the pixels of the tool center point TCP overlap the first image central axis A and the second image central axis B by 50%, the tool center point TCP is considered to overlap the intersection point I. Of course, the aforementioned threshold value can be adjusted according to the size and the type of the tool T. As described above, the controller 13 can control the robot R to repeatedly move the tool center point TCP between the first image central axis A and the second image central axis B in order to obtain the first calibration point CP1.
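The zig-zag motion from O through T1˜T4 to I can be summarized by a small servo loop. The sketch below is illustrative only; observe_distances() and step_toward() are hypothetical helpers that would be realized with the visual-servo control and Equations (4)/(8) above:

```python
def servo_to_intersection(observe_distances, step_toward, threshold):
    """Alternately move the TCP toward axis A and axis B until its pixel
    distances to both axes fall below the threshold (e.g. 50% of the
    TCP's pixel size), i.e. the TCP is considered to overlap point I."""
    target = 'A'
    while True:
        d_a, d_b = observe_distances()   # pixel distances to axes A and B
        if d_a < threshold and d_b < threshold:
            return                        # overlap reached; record the point
        step_toward(target)               # one small visual-servo step
        target = 'B' if target == 'A' else 'A'
```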


Then, the controller 13 determines whether the number of the calibration points is greater than or equal to a default value. In the embodiment, the number of the calibration points should be greater than or equal to 3. If the number of the calibration points is less than 3, the controller 13 generates an Euler angle increment (ΔRx, ΔRy, ΔRz) by a random number generator to modify the Euler angle of the robot R in order to change the posture of the robot R. At this moment, the Euler angle of the robot R can be expressed as (Rx+ΔRx, Ry+ΔRy, Rz+ΔRz), where (Rx, Ry, Rz) stands for the original Euler angle of the robot R, Rx stands for a yaw angle, Ry stands for a pitch angle and Rz stands for a roll angle. If the modified Euler angle exceeds the motion range of the robot R or leaves the image overlapping area IA, the controller 13 re-generates a new Euler angle increment by the random number generator.
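A sketch of the random posture change (assuming NumPy; the ±15° range and the within_limits() predicate are illustrative assumptions, since the patent only requires the new pose to stay inside the motion range and the image overlapping area):

```python
import numpy as np

rng = np.random.default_rng()

def random_posture(euler, within_limits, max_delta_deg=15.0):
    """Draw an Euler angle increment (dRx, dRy, dRz) and re-draw until the
    modified Euler angle stays inside the robot's motion range and the
    image overlapping area IA."""
    while True:
        delta = rng.uniform(-max_delta_deg, max_delta_deg, size=3)
        candidate = tuple(r + d for r, d in zip(euler, delta))
        if within_limits(candidate):
            return candidate   # (Rx+dRx, Ry+dRy, Rz+dRz)
```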


Afterward, the controller 13 controls the robot R to repeatedly move the tool center point TCP thereof between the first image central axis A and the second image central axis B, and records a second calibration point CP2 when the tool center point TCP overlaps the intersection point I.


Next, the controller 13 determines whether the number of the calibration points is greater than or equal to 3. If the controller 13 determines that the number of the calibration points is less than 3, the controller 13 repeats the above steps to obtain and record the third calibration point CP3 until the controller 13 determines that the number of the calibration points is greater than or equal to 3.


As shown in FIG. 1, the controller 13 can calculate the coordinate of the tool center point TCP according to the calibration points CP1˜CP3. The coordinate of each of the calibration points CP1˜CP3 can be obtained according to the link rod parameters (Denavit-Hartenberg Parameters) of the robot R, the coordinates of the joints J1˜J6 and the coordinate system (xf-yf-zf) of the tool center point TCP in relation to the flange facing F; more specifically, the link rod parameters include link offsets d, joint angles θ, link lengths a, and link twist angles α.
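For readers unfamiliar with the link rod parameters, a standard Denavit-Hartenberg link transform and its composition from base to flange could be sketched as follows (a textbook form under the classic DH convention; the patent does not spell out its matrices, so this is an assumption):

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """4x4 homogeneous transform of one link (classic DH convention)."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([[ct, -st * ca,  st * sa, a * ct],
                     [st,  ct * ca, -ct * sa, a * st],
                     [0.0,      sa,       ca,      d],
                     [0.0,     0.0,      0.0,    1.0]])

def base_to_flange(joint_angles, dh_params):
    """Compose T1^i from the joint coordinates of one calibration point;
    dh_params is a list of (d, a, alpha) per joint J1..J6."""
    T = np.eye(4)
    for theta, (d, a, alpha) in zip(joint_angles, dh_params):
        T = T @ dh_transform(theta, d, a, alpha)
    return T
```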


The coordinate of the tool center point TCP can be calculated according to Equation (9):

T1i T2 = P  (9)


In Equation (9), the matrix T1i stands for the 4×4 homogeneous transformation for transforming the ith calibration point from the coordinate system (xb-yb-zb) of the base to the coordinate system (xf-yf-zf) of the flange facing F. The matrix T2 stands for the coordinate of the tool center point TCP in relation to the coordinate system of the flange facing F. The matrix P stands for the coordinate of the calibration point in relation to the coordinate system (xb-yb-zb) of the base in the space. Each calibration point yields 3 linear equations via Equation (9). Therefore, the coordinate of the tool center point TCP can be calculated by the pseudo-inverse matrix according to the 3n equations obtained from n calibration points. Equation (10) is derived from Equation (9):

$$\begin{bmatrix}
e_{11}^{i} & e_{12}^{i} & e_{13}^{i} & e_{14}^{i} \\
e_{21}^{i} & e_{22}^{i} & e_{23}^{i} & e_{24}^{i} \\
e_{31}^{i} & e_{32}^{i} & e_{33}^{i} & e_{34}^{i} \\
0 & 0 & 0 & 1
\end{bmatrix}
\begin{bmatrix} T_x \\ T_y \\ T_z \\ 1 \end{bmatrix}
=
\begin{bmatrix} P_x \\ P_y \\ P_z \\ 1 \end{bmatrix} \qquad (10)$$
In Equation (10), the coordinate $(e_{11}^{i}, e_{21}^{i}, e_{31}^{i})$ stands for the direction of the xf-axis vector of the ith calibration point in relation to the coordinate system (xb-yb-zb) of the base. The coordinate $(e_{12}^{i}, e_{22}^{i}, e_{32}^{i})$ stands for the direction of the yf-axis vector of the ith calibration point in relation to the coordinate system (xb-yb-zb) of the base. The coordinate $(e_{13}^{i}, e_{23}^{i}, e_{33}^{i})$ stands for the direction of the zf-axis vector of the ith calibration point in relation to the coordinate system (xb-yb-zb) of the base. Equations (11)˜(12) are derived from Equation (10):

$$\begin{bmatrix}
e_{11}^{1} & e_{12}^{1} & e_{13}^{1} & -1 & 0 & 0 \\
e_{21}^{1} & e_{22}^{1} & e_{23}^{1} & 0 & -1 & 0 \\
e_{31}^{1} & e_{32}^{1} & e_{33}^{1} & 0 & 0 & -1 \\
e_{11}^{2} & e_{12}^{2} & e_{13}^{2} & -1 & 0 & 0 \\
e_{21}^{2} & e_{22}^{2} & e_{23}^{2} & 0 & -1 & 0 \\
e_{31}^{2} & e_{32}^{2} & e_{33}^{2} & 0 & 0 & -1 \\
e_{11}^{3} & e_{12}^{3} & e_{13}^{3} & -1 & 0 & 0 \\
e_{21}^{3} & e_{22}^{3} & e_{23}^{3} & 0 & -1 & 0 \\
e_{31}^{3} & e_{32}^{3} & e_{33}^{3} & 0 & 0 & -1
\end{bmatrix}
\begin{bmatrix} T_x \\ T_y \\ T_z \\ P_x \\ P_y \\ P_z \end{bmatrix}
=
\begin{bmatrix} -e_{14}^{1} \\ -e_{24}^{1} \\ -e_{34}^{1} \\ -e_{14}^{2} \\ -e_{24}^{2} \\ -e_{34}^{2} \\ -e_{14}^{3} \\ -e_{24}^{3} \\ -e_{34}^{3} \end{bmatrix} \qquad (11)$$

$$\begin{bmatrix} T_x \\ T_y \\ T_z \\ P_x \\ P_y \\ P_z \end{bmatrix}
= (T_3^{\,t} T_3)^{-1} T_3^{\,t}
\begin{bmatrix} -e_{14}^{1} \\ -e_{24}^{1} \\ -e_{34}^{1} \\ -e_{14}^{2} \\ -e_{24}^{2} \\ -e_{34}^{2} \\ -e_{14}^{3} \\ -e_{24}^{3} \\ -e_{34}^{3} \end{bmatrix} \qquad (12)$$

In Equation (12), T3 stands for the 9×6 coefficient matrix on the left-hand side of Equation (11), T3t stands for the transpose matrix of T3, and (T3tT3)−1 stands for the inverse matrix of (T3tT3); the product (T3tT3)−1T3t is the pseudo-inverse of T3.


If the number of the calibration points is sufficient and the matrix T1i corresponding to each ith calibration point is obtained, Equation (12) can be derived by substituting all elements of the matrices into Equation (11) and then moving the coefficient matrix T3 to the right-hand side. After that, the coordinate (Tx, Ty, Tz) of the tool center point TCP in relation to the coordinate system of the flange facing F, and the coordinate (Px, Py, Pz) of the tool center point TCP in relation to the coordinate system (xR-yR-zR) of the robot R can be obtained. At this point, the calibration process of the tool center point TCP is finished.
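Numerically, Equations (11)˜(12) amount to an ordinary linear least-squares problem. Below is a sketch of the final solve, assuming NumPy and the base-to-flange transforms T1i (e.g. from the DH sketch above); np.linalg.lstsq returns the same least-squares solution as the pseudo-inverse in Equation (12):

```python
import numpy as np

def solve_tcp(transforms):
    """Stack Equation (11) for n >= 3 calibration points and solve for
    (Tx, Ty, Tz, Px, Py, Pz).  `transforms` holds the 4x4 matrices T1^i."""
    rows, rhs = [], []
    for T in transforms:
        R, p = T[:3, :3], T[:3, 3]     # rotation entries e_k1..e_k3, offset e_k4
        for k in range(3):             # 3 linear equations per calibration point
            row = np.zeros(6)
            row[:3] = R[k]
            row[3 + k] = -1.0
            rows.append(row)
            rhs.append(-p[k])
    x, *_ = np.linalg.lstsq(np.asarray(rows), np.asarray(rhs), rcond=None)
    return x[:3], x[3:]                # (Tx,Ty,Tz) and (Px,Py,Pz)
```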


The embodiment just exemplifies the present disclosure and is not intended to limit the scope of the present disclosure. Any equivalent modification and variation according to the spirit of the present disclosure is to be also included within the scope of the following claims and their equivalents.


As described above, in the embodiment, the system 1 for calibrating tool center point of robot can automatically calibrate the tool center point TCP of the robot R via visual servo control. The system 1 achieves high accuracy, which can effectively reduce labor cost and wasted time. In addition, the system 1 can precisely calibrate the tool center point TCP of the robot R by performing the calibration process only once, so the system 1 can calibrate the tool center point TCP of the robot R more efficiently. Thus, the system 1 for calibrating tool center point of robot effectively improves on the shortcomings of the prior art.



FIG. 5˜FIG. 7 are the first to third flow charts of the first embodiment. As shown in FIG. 5, the method of obtaining the transformation relation between the coordinate system (xR-yR-zR) of the robot R and the coordinate system (x1C-y1C-z1C) of the first image sensor 11 includes the following steps.


Step S51: Controlling a robot R to move the tool center point TCP thereof, along the horizontal axis xR of the coordinate system (xR-yR-zR) of the robot R, from any position of an image overlapping area IA by a distance LR, and obtaining a first projected coordinate P′x1 from a first image sensor 11.


Step S52: Controlling the robot R to move the tool center point TCP thereof, along the longitudinal axis yR of the coordinate system (xR-yR-zR) of the robot R, from the aforementioned position of the image overlapping area IA by the distance LR, and obtaining a second projected coordinate P′y1 from the first image sensor 11.


Step S53: Controlling the robot R to move the tool center point TCP thereof, along the vertical axis zR of the coordinate system (xR-yR-zR) of the robot R, from the aforementioned position of the image overlapping area IA by the distance LR, and obtaining a third projected coordinate P′z1 from the first image sensor 11.


Step S54: Providing a first space vector U1, a second space vector V1, and a third space vector W1 respectively corresponding to the first projected coordinate P′x1, the second projected coordinate P′y1, and the third projected coordinate P′z1.


Step S55: Calculating the first space vector U1, the second space vector V1, and the third space vector W1 according to the vertical relation of the first space vector U1, the second space vector V1, and the third space vector W1.


Step S56: Calculating the transformation relation between the coordinate system (xR-yR-zR) of the robot R and the coordinate system (x1C-y1C-z1C) of the first image sensor 11 according to the first space vector U1, the second space vector V1, and the third space vector W1, as shown in aforementioned Equation (4).


As shown in FIG. 6, the method of obtaining the transformation relation between the coordinate system (xR-yR-zR) of the robot R and the coordinate system (x2C-y2C-z2C) of the second image sensor 12 includes the following steps.


Step S61: Controlling the robot R to move the tool center point TCP thereof, along the horizontal axis xR of the coordinate system (xR-yR-zR) of the robot R, from any position of the image overlapping area IA by the distance LR, and obtaining a first projected coordinate P′x2 from a second image sensor 12.


Step S62: Controlling the robot R to move the tool center point TCP thereof, along the longitudinal axis yR of the coordinate system (xR-yR-zR) of the robot R, from the aforementioned position of the image overlapping area IA by the distance LR, and obtaining a second projected coordinate P′y2 from the second image sensor 12.


Step S63: Controlling the robot R to move the tool center point TCP thereof, along the vertical axis zR of the coordinate system (xR-yR-zR) of the robot R, from the aforementioned position of the image overlapping area IA by the distance LR, and obtaining a third projected coordinate P′z2 from the second image sensor 12.


Step S64: Providing a first space vector U2, a second space vector V2, and a third space vector W2 respectively corresponding to the first projected coordinate P′x2, the second projected coordinate P′y2, and the third projected coordinate P′z2.


Step S65: Calculating the first space vector U2, the second space vector V2, and the third space vector W2 according to the vertical relation of the first space vector U2, the second space vector V2, and the third space vector W2.


Step S66: Calculating the transformation relation between the coordinate system (xR-yR-zR) of the robot R and the coordinate system (x2C-y2C-z2C) of the second image sensor 12 according to the first space vector U2, the second space vector V2, and the third space vector W2, as shown in aforementioned Equation (8).


As shown in FIG. 7, the method of the system 1 for calibrating tool center point of robot includes the following steps.


Step S71: Providing a first image sensor 11 having a first image central axis A.


Step S72: Providing a second image sensor 12 having a second image central axis B not parallel to the first image central axis A and intersecting the first image central axis A at an intersection point I.


Step S73: Controlling a robot R to repeatedly move the tool center point TCP of the tool T thereof between the first image central axis A and the second image central axis B.


Step S74: Recording a calibration point including the coordinates of the joints J1˜J6 of the robot R when the tool center point TCP overlaps the intersection point I.


Step S75: Repeating the above steps to generate a plurality of the calibration points.


Step S76: Calculating the coordinate of the tool center point TCP according to the calibration points.


It is worth pointing out that the currently available technology for calibrating the tool center point of a robot may need a user to manually operate a robot in order to calibrate the tool center point of the robot, which tends to cause human errors, so the user may not precisely calibrate the tool center point. Thus, the currently available technology for calibrating the tool center point of a robot tends to result in low calibration accuracy, high labor cost and wasted time. On the contrary, the system 1 for calibrating tool center point of robot according to one embodiment of the present disclosure includes a controller 13 and image sensors 11, 12, and can automatically calibrate the tool center point TCP of a robot R via visual servo control, which can achieve high calibration accuracy and reduce labor cost and wasted time.


Besides, the currently available technology for calibrating the tool center point of a robot may need additional measurement devices to calibrate the tool center point of the robot. However, these measurement devices should also be calibrated in advance, which not only increases labor cost and wasted time, but also significantly increases the cost of the calibration instruments. On the contrary, the system 1 for calibrating tool center point of robot according to one embodiment of the present disclosure can automatically calibrate the tool center point TCP of the robot R by the controller 13 and the image sensors 11, 12 without calibrating the positions of the image sensors 11, 12 in advance, which can further reduce labor cost and wasted time.


Moreover, the currently available technology for calibrating the tool center point of a robot may need the users to repeatedly calibrate the tool center point of the robot several times, so it is inefficient in use. On the contrary, the system 1 for calibrating tool center point of robot according to one embodiment of the present disclosure can achieve high calibration accuracy by performing the calibration process only once, so the system 1 can more efficiently calibrate the tool center point TCP of the robot R.


Further, the system 1 for calibrating tool center point of robot according to one embodiment of the present disclosure can be applied to various types of robots, so the system 1 is more flexible in use. As described above, the system 1 for calibrating tool center point of robot according to one embodiment of the present disclosure definitely has an inventive step.



FIG. 8 is a flow chart of a system 1 for calibrating tool center point of robot of a second embodiment in accordance with the present disclosure. The embodiment illustrates the detailed steps by which the system 1 calibrates the tool center point TCP of a robot R.


Step S81: The controller 13 controls the robot R to move the tool center point TCP thereof to any position of the image overlapping area IA between the first image sensor 11 and the second image sensor 12; then, the process proceeds to Step S82.


Step S82: The controller 13 controls the robot R to repeatedly move the tool center point TCP thereof between the first image central axis A and the second image central axis B. The controller 13 records a calibration point when the tool center point TCP overlaps the intersection point I between the first image central axis A and the second image central axis B; then, the process proceeds to Step S83.


Step S83: The controller 13 determines whether the number of the calibration points is greater than or equal to a default value. If it is, the process proceeds to Step S84; if not, the process proceeds to Step S831.


Step S831: The controller 13 generates an Euler angle increment (ΔRx, ΔRy, ΔRz) by a random number generator to modify the Euler angle of the robot R; then, the process returns to Step S82.


Step S84: The controller 13 calculates the coordinate of the tool center point TCP according to the calibration points.
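Putting Steps S81˜S84 together, the overall loop could be sketched as follows (all controller methods are hypothetical stand-ins for the behavior described above):

```python
def calibrate_tcp(controller, default_value=3):
    """Second-embodiment flow: S81 move into IA, S82 servo to I and record,
    S83 check the number of calibration points, S831 random posture change,
    S84 solve for the TCP coordinate."""
    controller.move_tcp_into_overlap_area()              # Step S81
    calibration_points = []
    while True:
        controller.servo_tcp_to_intersection()           # Step S82
        calibration_points.append(controller.read_joint_coordinates())
        if len(calibration_points) >= default_value:     # Step S83
            break
        controller.apply_random_euler_increment()        # Step S831
    return controller.compute_tcp_coordinate(calibration_points)  # Step S84
```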


In summation of the description above, the system 1 for calibrating tool center point of robot according to one embodiment of the present disclosure includes a controller and image sensors, and can automatically calibrate the tool center point of a robot via visual servo control, which can achieve high calibration accuracy, reduce labor cost and time waste.


Besides, the system 1 for calibrating tool center point of robot according to one embodiment of the present disclosure can automatically calibrate the tool center point of the robot by the controller and image sensors without calibrating the positions of the image sensors in advance, which can further reduce labor cost and time waste.


Moreover, the system 1 for calibrating tool center point of robot according to one embodiment of the present disclosure can achieve high calibration accuracy just by performing the calibration process for only one time, so the system can more efficiently calibrate the tool center point of the robot.


Further, the system 1 for calibrating tool center point of robot according to one embodiment of the present disclosure can be applied to various types of robots, so it is more flexible in use.


It will be apparent to those skilled in the art that various modifications and variations can be made to the disclosed embodiments. It is intended that the specification and examples be considered as exemplary only, with a true scope of the disclosure being indicated by the following claims and their equivalents.

Claims
  • 1. A system for calibrating tool center points of a robot, comprising: a first image sensor, comprising a first image central axis; a second image sensor, comprising a second image central axis intersecting the first image central axis at an intersection point, wherein a field of view of the second image sensor overlaps a field of view of the first image sensor to form an image overlapping area, and the first image central axis and the second image central axis are inside the image overlapping area; and a controller, configured to control a robot to move a tool center point of a tool thereof from an initial point in the image overlapping area toward the first image central axis, change a moving direction of the tool center point so as to move the tool center point toward the second image central axis when the tool center point reaches the first image central axis, and change the moving direction of the tool center point so as to move the tool center point toward the first image central axis when the tool center point reaches the second image central axis, so as to repeatedly move the tool center point within the image overlapping area and between the first image central axis and the second image central axis until the tool center point overlaps the intersection point; wherein the controller is configured to record a calibration point including coordinates of joints of the robot when the tool center point overlaps the intersection point, and the controller is configured to move the tool center point, change a posture of the robot, and repeat the above steps to generate a plurality of the calibration points in order to calculate a coordinate of the tool center point according to the calibration points.
  • 2. The system for calibrating tool center points of the robot of claim 1, wherein the controller controls the robot according to a first transformation relation between a coordinate system of the first image sensor and a coordinate system of the robot, a second transformation relation between a coordinate system of the second image sensor and the coordinate system of the robot, and images of the first image sensor and the second image sensor.
  • 3. The system for calibrating tool center points of the robot of claim 1, wherein the coordinate of each joint is a rotation angle of the joint rotating relative to a default initial point.
  • 4. The system for calibrating tool center points of the robot of claim 3, wherein the controller calculates the coordinate of the tool center point according to the calibration points and a link rod parameter of the robot.
  • 5. The system for calibrating tool center points of the robot of claim 1, wherein a number of the calibration points is greater than or equal to a default value.
  • 6. The system for calibrating tool center points of the robot of claim 5, wherein when the number of the calibration points is less than 3, the controller generates an Euler angle increment by a random number generator to modify an Euler angle of the robot in order to change a posture of the robot.
  • 7. The system for calibrating tool center points of the robot of claim 6, wherein after the controller modifies the Euler angle of the robot, the controller controls the robot to repeatedly move the tool center point between the first image central axis and the second image central axis in order to overlap the intersection point, so as to generate the next calibration points until the number of the calibration points is greater than or equal to the default value.
  • 8. The system for calibrating tool center points of the robot of claim 1, wherein when the tool center point overlaps the intersection point between the first image central axis and the second image central axis, a first distance between the tool center point and the first image central axis and a second distance between the tool center point and the second image central axis are less than a threshold value.
  • 9. The system for calibrating tool center points of the robot of claim 1, wherein the controller generates the calibration points by different postures of the robot.
  • 10. The system for calibrating tool center points of the robot of claim 1, wherein the coordinate of the tool center point comprises a coordinate of the tool center point in relation to a base of the robot, or a coordinate of the tool center point in relation to a flange facing of the robot.
  • 11. The system for calibrating tool center points of the robot of claim 1, wherein the first image central axis is perpendicular to the second image central axis.
  • 12. A method for calibrating tool center points of a robot, comprising: providing a first image sensor having a first image central axis; providing a second image sensor having a second image central axis not parallel to the first image central axis and intersecting the first image central axis at an intersection point, wherein a field of view of the second image sensor overlaps a field of view of the first image sensor to form an image overlapping area, and the first image central axis and the second image central axis are inside the image overlapping area; controlling a robot to move a tool center point of a tool thereof from an initial point in the image overlapping area toward the first image central axis, and change a moving direction of the tool center point so as to move the tool center point toward the second image central axis when the tool center point reaches the first image central axis, and change the moving direction of the tool center point so as to move the tool center point toward the first image central axis when the tool center point reaches the second image central axis, so as to repeatedly move the tool center point within the image overlapping area and between the first image central axis and the second image central axis until the tool center point overlaps the intersection point, and recording a calibration point including coordinates of joints of the robot; moving the tool center point, changing a posture of the robot, and repeating the above steps to generate a plurality of the calibration points; and calculating a coordinate of the tool center point according to the calibration points.
  • 13. The method for calibrating tool center points of the robot of claim 12, wherein the step of controlling the robot to repeatedly move the tool center point of the tool thereof between the first image central axis and the second image central axis until the tool center point overlaps the intersection point, and recording the calibration point including the coordinates of the joints of the robot further comprises the following steps: providing a first transformation relation between a coordinate system of the first image sensor and a coordinate system of the robot, and a second transformation relation between a coordinate system of the second image sensor and the coordinate system of the robot; and controlling the robot to move the tool center point according to the first and second transformation relations and images of the first image sensor and the second image sensor.
  • 14. The method for calibrating tool center points of the robot of claim 13, wherein the step of providing the first transformation relation between the coordinate system of the first image sensor and the coordinate system of the robot, and the second transformation relation between the coordinate system of the second image sensor and the coordinate system of the robot further comprises the following steps: controlling the robot to move the tool center point, along a horizontal axis of the coordinate system of the robot, from any position of the image overlapping area between the first image sensor and the second image sensor by a distance, and obtaining a first projected coordinate from the first image sensor and the second image sensor; controlling the robot to move the tool center point, along a longitudinal axis of the coordinate system of the robot, from the position by the distance, and obtaining a second projected coordinate from the first image sensor and the second image sensor; and controlling the robot to move the tool center point, along a vertical axis of the coordinate system of the robot, from the position by the distance, and obtaining a third projected coordinate from the first image sensor and the second image sensor.
  • 15. The method for calibrating tool center points of the robot of claim 14, wherein the step of providing the first transformation relation between the coordinate system of the first image sensor and the coordinate system of the robot, and the second transformation relation between the coordinate system of the second image sensor and the coordinate system of the robot further comprises the following steps: providing a first space vector, a second space vector, and a third space vector respectively corresponding to the first projected coordinate, the second projected coordinate, and the third projected coordinate; calculating the first space vector, the second space vector, and the third space vector according to a vertical relation of the first space vector, the second space vector, and the third space vector; and calculating the first transformation relation between the coordinate system of the first image sensor and the coordinate system of the robot, and the second transformation relation between the coordinate system of the second image sensor and the coordinate system of the robot according to the first space vector, the second space vector, and the third space vector.
  • 16. The method for calibrating tool center points of the robot of claim 12, wherein the coordinate of each joint is a rotation angle of the joint rotating relative to a default initial point.
  • 17. The method for calibrating tool center points of the robot of claim 16, wherein the step of calculating the coordinate of the tool center point according to the calibration points further comprises a following step: calculating the coordinate of the tool center point according to the calibration points and a link rod parameter of the robot.
  • 18. The method for calibrating tool center points of the robot of claim 12, wherein a number of the calibration points is greater than or equal to a default value.
  • 19. The method for calibrating tool center points of the robot of claim 12, wherein the step of controlling the robot to repeatedly move the tool center point of the tool thereof between the first image central axis and the second image central axis until the tool center point overlaps the intersection point, and recording the calibration point including the coordinates of the joints of the robot further comprises the following step: controlling the robot to repeatedly move the tool center point until a first distance between the tool center point and the first image central axis and a second distance between the tool center point and the second image central axis are less than a threshold value.
  • 20. The method for calibrating tool center points of the robot of claim 19, wherein the step of controlling the robot to repeatedly move the tool center point of the tool thereof between the first image central axis and the second image central axis until the tool center point overlaps the intersection point, and recording the calibration point including the coordinates of the joints of the robot further comprises a following step: generating an Euler angle increment by a random number generator to modify an Euler angle of the robot in order to change a posture of the robot when a number of the calibration points is less than 3.
  • 21. The method for calibrating tool center points of the robot of claim 20, further comprising the following step: controlling the robot to repeatedly move the tool center point between the first image central axis and the second image central axis to generate the next calibration point after the Euler angle of the robot is modified, until the number of the calibration points is greater than or equal to a default value.
  • 22. The method for calibrating tool center points of the robot of claim 12, wherein the step of moving the tool center point, and repeating the above steps to generate a plurality of the calibration points further comprises a following step: generating the calibration points by different postures of the robot.
  • 23. The method for calibrating tool center points of the robot of claim 12, wherein the coordinate of the tool center point comprises a coordinate of the tool center point in relation to a base of the robot, or a coordinate of the tool center point in relation to a flange facing of the robot.
  • 24. The method for calibrating tool center points of the robot of claim 12, wherein the first image central axis is perpendicular to the second image central axis.
Priority Claims (1)
Number Date Country Kind
106133775 Sep 2017 TW national
US Referenced Citations (34)
Number Name Date Kind
4453085 Pryor Jun 1984 A
5321353 Furness Jun 1994 A
5767648 Morel Jun 1998 A
5929584 Gunnarsson et al. Jul 1999 A
6044308 Huissoon Mar 2000 A
6301763 Pryor Oct 2001 B1
6728582 Wallack Apr 2004 B1
6941192 Tang et al. Sep 2005 B2
7684898 Pagel et al. Mar 2010 B2
8457790 Blondel et al. Jun 2013 B2
8798794 Walser et al. Aug 2014 B2
9002516 Chiu et al. Apr 2015 B2
9043024 Chiu et al. May 2015 B2
9050728 Ban et al. Jun 2015 B2
9517560 Amano Dec 2016 B2
20030220756 Stengele Nov 2003 A1
20070036460 Koch Feb 2007 A1
20080188986 Hoppe Aug 2008 A1
20090107485 Reznik Apr 2009 A1
20090118864 Eldridge May 2009 A1
20110074674 Walberg Mar 2011 A1
20130010081 Tenney Jan 2013 A1
20130035791 Chiu Feb 2013 A1
20140365007 Trompeter Dec 2014 A1
20150142171 Li et al. May 2015 A1
20160046023 Nagendran Feb 2016 A1
20160071272 Gordon Mar 2016 A1
20160220404 Fogelberg Aug 2016 A1
20170051671 Chalaud Feb 2017 A1
20170113351 Yoshino Apr 2017 A1
20170178379 Fu Jun 2017 A1
20190022867 Deng Jan 2019 A1
20190099887 Huang Apr 2019 A1
20190212139 Allen Jul 2019 A1
Foreign Referenced Citations (24)
Number Date Country
100398274 Jul 2008 CN
102909728 Feb 2013 CN
103101060 May 2013 CN
103209809 Jul 2013 CN
103706945 Apr 2014 CN
104345688 Feb 2015 CN
105066884 Nov 2015 CN
105818167 Aug 2016 CN
106462140 Feb 2017 CN
105437230 Aug 2017 CN
0114505 Aug 1984 EP
11123678 May 1999 JP
3186797 Jul 2001 JP
2003117861 Apr 2003 JP
201117935 Jun 2011 TW
201221801 Jun 2012 TW
201411011 Mar 2014 TW
201420290 Jun 2014 TW
201618910 Jun 2016 TW
201702034 Jan 2017 TW
201716899 May 2017 TW
I587996 Jun 2017 TW
03033219 Apr 2003 WO
2012076038 Jun 2012 WO
Non-Patent Literature Citations (11)
Entry
Mendoza Thesis (Year: 2012).
Prior Art Search History (Year: 2020).
Perception for Robotics (Year: 2019).
Changjie Liu et al., Calibration method of TCP based on stereo vision robot, Infrared and Laser Engineering, 2015, vol. 44, No. 6, 1912-1917.
Frank Shaopeng Cheng, Calibration of robot reference frames for enhanced robot positioning, Robot Manipulators, 2008.
Christof Borrmann et al., Enhanced calibration of robot tool center point using analytical algorithm, International Journal of Materials Science and Engineering, 2015, vol. 3, No. 1, 12-18.
Yin, Shibin et al., Fast Recovery Technology of Tool Center Point in Robotic Visual Measurement System, ROBOT, 2013, vol. 35, No. 6, 736-743.
Gustav Bergström, Method for calibration of off-line generated robot program, Chalmers, Master of Science Thesis, 2011.
Johan Hallenberg, Robot Tool Center Point Calibration using Computer Vision, Linkoping University, Department of Electrical Engineering, 2007.
Taiwan Patent Office, “Office Action”, dated Nov. 21, 2018, Taiwan.
China Patent Office, “Office Action”, China, dated Mar. 25, 2020.
Related Publications (1)
Number Date Country
20190099887 A1 Apr 2019 US