CALIBRATION METHOD FOR TOOL CENTER POINT, TEACHING METHOD FOR ROBOTIC ARM AND ROBOTIC ARM SYSTEM USING THE SAME

Abstract
Firstly, a robotic arm drives a projection point of a tool projected on a test plane to move relative to a reference point of the test plane. Then, a conversion relationship is established according to the relative movement. Then, a tool axis vector relative to an installation surface reference coordinate system of the robotic arm is obtained. Then, a calibration point information group obtaining step is performed, wherein the calibration point information group obtaining step includes: (a1) the robotic arm drives a tool center point to coincide with the reference point of the test plane and records a calibration point information group; (a2) the robotic arm drives the tool to change an angle of the tool; and (a3) steps (a1) and (a2) are repeated to obtain several calibration point information groups. Then, a tool center point coordinate relative to the installation surface reference coordinate system is obtained according to the calibration point information groups.
Description

This application claims the benefit of Taiwan application Serial No. 109129784, filed Aug. 31, 2020, the subject matter of which is incorporated herein by reference.


TECHNICAL FIELD

The disclosure relates in general to a calibration method, a teaching method for robotic arm and a robotic arm system using the same, and more particularly to a calibration method for tool center point, a teaching method for robotic arm and a robotic arm system using the same.


BACKGROUND

Along with the advancement of science and technology, the application of robotic arms has become more and more broad in various industries. Generally speaking, robotic arms are jointed-type robotic arms with multiple joints, and one end of a robotic arm is provided with a tool, such as a welding tool or a drilling tool, to perform various operations. Before the robotic arm performs operations, the position of the Tool Center Point (TCP) of the tool needs to be accurately calibrated in advance, so that the controller of the robotic arm could control the tool to run on a calibration path according to the TCP of the tool. However, the TCP calibration technology of the robotic arm of the prior art has many disadvantages that need to be improved. For example, according to the TCP calibration technology of the prior art, the user may need to manually operate the robotic arm to calibrate the TCP of the robotic arm, which is prone to human error, so the TCP cannot be accurately calibrated. Consequently, this causes low calibration accuracy, high labor cost and high time cost. In addition, the current calibration method for the TCP cannot be applied to a virtual TCP.


SUMMARY

According to an embodiment, a calibration method for a tool center point is provided. The calibration method includes the following steps: a step of establishing a first conversion relationship between a robotic arm reference coordinate system and a camera reference coordinate system is performed, comprising: (1) driving, by a robotic arm, a projection point of a tool axis of a tool projected on a test plane to perform a relative movement relative to a reference point of the test plane; and (2) establishing the first conversion relationship according to the relative movement; a tool axis vector relative to an installation surface reference coordinate system of the robotic arm is obtained; a calibration point information group obtaining step is performed, comprising: (a1) driving, by the robotic arm, a tool center point to coincide with the reference point of the test plane, and recording a calibration point information group of the robotic arm; (a2) driving, by the robotic arm, the tool to change an angle of the tool axis; and (a3) repeating steps (a1) and (a2) to obtain a plurality of the calibration point information groups; and a tool center point coordinate relative to the installation surface reference coordinate system is obtained according to the calibration point information groups.


According to another embodiment, a teaching method for a robotic arm is provided. The teaching method includes the following steps: (d1) by using the calibration method as described above, the tool center point coordinate is obtained, and the tool is driven to a first position, so that the tool center point coincides with a designated point of a detection surface at the first position; (d2) the tool is translated by a translation distance to a second position; (d3) a detection angle of the tool is obtained according to the translation distance and a stroke difference of the tool center point of the tool along the tool axis, as sketched below; (d4) whether the detection angle meets a specification angle is determined; (d5) the tool is driven back to the first position when the detection angle does not meet the specification angle; and (d6) a posture of the robotic arm is adjusted, and steps (d2) to (d6) are performed until the detection angle meets the specification angle.
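Step (d3) relates the detection angle to the translation distance and the stroke difference along the tool axis. The following is a minimal sketch of that relation, assuming a right-triangle geometry between the lateral translation and the stroke difference; the function name, the specification check and the numeric values are illustrative assumptions, not part of the disclosure.

```python
import math

def detection_angle(translation_distance: float, stroke_difference: float) -> float:
    """Step (d3), sketched: tilt angle (degrees) of the tool axis recovered
    from a lateral translation and the resulting stroke difference of the
    tool center point along the tool axis (right-triangle assumption)."""
    return math.degrees(math.atan2(stroke_difference, translation_distance))

# Steps (d4) to (d6), sketched: a 10 mm translation with a 0.5 mm stroke
# difference; 0.5 degrees is a hypothetical specification angle.
angle = detection_angle(10.0, 0.5)
if abs(angle) > 0.5:
    pass  # drive the tool back to the first position and adjust the posture
```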


According to an alternative embodiment, a robotic arm system includes a robotic arm and a controller. The robotic arm is configured to carry a tool, wherein the tool has a tool axis. The controller is configured to: control the robotic arm to drive a projection point of the tool axis of the tool projected on a test plane to perform a relative movement relative to a reference point of the test plane; establish a first conversion relationship between a robotic arm reference coordinate system of the robotic arm and a camera reference coordinate system according to the relative movement; obtain a tool axis vector relative to an installation surface reference coordinate system of the robotic arm; perform a calibration point information group obtaining step, comprising: (a1) controlling the robotic arm to drive a tool center point to coincide with the reference point of the test plane and recording a calibration point information group of the robotic arm; (a2) controlling the robotic arm to drive the tool to change an angle of the tool axis; and (a3) repeating steps (a1) and (a2) to obtain a plurality of the calibration point information groups; and obtain a tool center point coordinate relative to the installation surface reference coordinate system according to the calibration point information groups.


The above and other aspects of the disclosure will become better understood with regard to the following detailed description of the preferred but non-limiting embodiment(s). The following description is made with reference to the accompanying drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 shows a schematic diagram of a robotic arm system for calibrating the TCP of a tool according to an embodiment of the present disclosure;



FIGS. 2A to 2D show a flowchart of the robotic arm system in FIG. 1 calibrating the tool center point of the tool;



FIG. 3A shows a schematic diagram of the robotic arm in FIG. 1 moving relative to the reference point in space;



FIG. 3B shows a schematic diagram of the image, captured by the camera, of the projection points moving on the test plane;



FIGS. 4A to 9B show schematic diagrams of process of obtaining the tool axis vector according to an embodiment of the present disclosure;



FIG. 10A shows a schematic diagram of the second light emitted by the light source of FIG. 1 and the first light emitted by the tool along the tool axis that intersect at the tool center point;



FIG. 10B shows a schematic diagram of the projection point of the second light emitted by the light source of FIG. 10A projected on the test plane and the projection point of the first light emitted by the tool along the tool axis projected on the test plane being two separated points respectively;



FIG. 11A shows a schematic diagram of the tool center point of FIG. 10A overlapping the test plane;



FIG. 11B shows a schematic diagram of the tool center point of FIG. 11A being spaced from the reference point by a projection point movement vector;



FIGS. 12A to 12B show schematic diagrams of the tool center point of FIG. 11A overlapping the reference point;



FIG. 13 shows a flowchart of an automatic teaching method for the robotic arm system according to an embodiment of the present disclosure;



FIG. 14A shows a schematic diagram of the robotic arm system of FIG. 1 performing a first detection teaching process on the tool center point; and



FIG. 14B shows a schematic diagram of the robotic arm system of FIG. 1 performing a second detection teaching process on the tool center point.





In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the disclosed embodiments. It will be apparent, however, that one or more embodiments may be practiced without these specific details. In other instances, well-known structures and devices are schematically shown in order to simplify the drawing.


DETAILED DESCRIPTION

Referring to FIG. 1, FIG. 1 shows a schematic diagram of a robotic arm system for calibrating the TCP according to an embodiment of the present disclosure. The robotic arm system 100 includes a robotic arm 110, a camera 120, a light source 130 and a controller 140. The robotic arm 110 is configured to hold the tool 10, and the tool 10 has a tool axis A1. The controller 140 is configured to: (1). control the tool 10 to make a projection point P1 of the tool axis A1 of the tool 10 projected on a test plane 20 move relative to a reference point O1 of the test plane 20; (2). according to the relative movement, establish a first conversion relationship T1 between the robotic arm reference coordinate system (xR-yR-zR) of the robotic arm 110 and a camera reference coordinate system (xC-yC-zC) of the camera 120; (3). obtain a tool axis vector Tez relative to an installation surface (or, referred to as a flange surface) reference coordinate system (xf-yf-zf) of the robotic arm 110; (4). perform a step of obtaining calibration point information groups, including: (a1). control the robotic arm 110 to drive the tool center point WO1 (shown in FIG. 10A) to coincide with the reference point O1 of the test plane 20, and record a calibration point information group of the robotic arm 110; (a2). control the robotic arm 110 to drive the tool 10 to change an angle of the tool axis A1; and (a3). repeat steps (a1) and (a2) to obtain a number of the calibration point information groups; and (5). according to the calibration point information groups, obtain a tool center point coordinate TP relative to the installation surface reference coordinate system (xf-yf-zf).


The tool 10 is shown with a luminance meter as an example. In another embodiment, the tool 10 is, for example, a machining tool.


In the present embodiment, the test plane 20 is, for example, the surface of a physical screen. The physical screen is, for example, a transparent screen or an opaque screen. In the case of the opaque screen, the test plane 20 of the physical screen is, for example, white. However, as long as the first light L1 emitted by the tool 10 and the second light L2 emitted by the light source 130 could be clearly displayed (the second light L2 is shown in FIG. 10A), the embodiment of the disclosure does not limit the surface color of the physical screen. In the case of the transparent screen, the screen is, for example, glass or plastic. When the screen is the opaque screen, the camera 120 and the robotic arm 110 could be located on the same side of the test plane 20, as shown in FIG. 1. When the screen is the transparent screen, the camera 120 and the robotic arm 110 could be located on two opposite sides of the test plane 20, or could be located on the same side of the test plane 20. In addition, the camera 120 directly faces the test plane 20, so that an image captured by the camera 120 is the image of the plane xC-yC of the camera reference coordinate system (xC-yC-zC).


Referring to FIGS. 2A to 2D, FIGS. 2A to 2D show a flowchart of the robotic arm system 100 in FIG. 1 calibrating the tool center point of the tool.


In step S110, the robotic arm system 100 executes the step of establishing the first conversion relationship T1 between the robotic arm reference coordinate system (xR-yR-zR) of the robotic arm 110 and the camera reference coordinate system (xC-yC-zC) of the camera 120. The step S110 includes sub-steps S111 to S117. The step of establishing the first conversion relationship T1 includes the following steps: the robotic arm 110 drives the projection point P1, which the tool axis A1 of the tool 10 projects on the test plane 20, to move relative to the reference point O1 of the test plane 20; then, the controller 140 establishes the first conversion relationship T1 between the robotic arm reference coordinate system (xR-yR-zR) of the robotic arm 110 and the camera reference coordinate system (xC-yC-zC) according to the relative movement.


For example, referring to FIGS. 3A and 3B, FIG. 3A shows a schematic diagram of the robotic arm 110 in FIG. 1 moving relative to the reference point O1 in space, and FIG. 3B shows a schematic diagram of the image M1, captured by the camera 120, of the projection points Px, Py and Pz moving on the test plane 20. During the calibration process, the camera 120 could continuously capture the image M1 of the projection points Px, Py and Pz moving on the test plane 20, so that the controller 140 could analyze the trajectory changes of the projection points Px, Py and Pz on the test plane 20 in real time. In FIG. 3A, xC-yC-zC is the camera reference coordinate system, and the space vectors $\vec{U_1}$, $\vec{V_1}$ and $\vec{W_1}$ are the vectors along which the projection points start from the reference point O1 (origin) of the camera reference coordinate system (xC-yC-zC) and respectively move by the length LR along the axes xR, yR and zR of the robotic arm reference coordinate system (xR-yR-zR). In an embodiment, the moving lengths LR along the axes xR, yR and zR of the robotic arm reference coordinate system (xR-yR-zR) could be equal or unequal. In FIG. 3B, the image M1 is a plane image, and the axis zC is perpendicular to the image M1. Although FIG. 3B shows the camera reference coordinate system (xC-yC-zC) and vector arrows, the actual image M1 may not have the coordinate image and the arrow images. The projection points Px(x1,y1,z1), Py(x2,y2,z2) and Pz(x3,y3,z3) in the space of FIG. 3A are, for example, vector end points, which correspond to P′x(x1,y1), P′y(x2,y2) and P′z(x3,y3) of the image M1 of FIG. 3B. Px(x1,y1,z1), Py(x2,y2,z2) and Pz(x3,y3,z3) in space are projected on the test plane 20 to form P′x(x1,y1), P′y(x2,y2) and P′z(x3,y3) respectively.


In step S111, as shown in FIGS. 1 and 3A, the controller 140 controls the robotic arm 110 to move so that the projection point Px of the first light L1 emitted by the tool 10 moves by a first space vector $\vec{U_1}$(x1,y1,z1) from the reference point O1 along a first axis (for example, the axis xR) of the robotic arm reference coordinate system (xR-yR-zR). In addition, the value (or length) of the first space vector $\vec{U_1}$(x1,y1,z1) is LR, and an end point of the first space vector $\vec{U_1}$(x1,y1,z1) is the projection point Px(x1,y1,z1) in FIG. 3A. In addition, the reference point O1 may be any point on the test plane 20, for example, the center point of the test plane 20.


In step S111, the controller 140 could analyze the image M1 captured by the camera 120, and, as shown in FIG. 3B, determine whether the projection point P1′ (shown in FIG. 6A; the position of the projection point P1′ changes as the tool 10 moves) in the image M1 corresponds to (or is located at/coincides with) the reference point O1 in the image M1. When the projection point P1′ does not correspond to the reference point O1 in the image M1, the robotic arm 110 is controlled to move until the projection point P1′ corresponds to the reference point O1 in the image M1. When the projection point P1′ corresponds to the reference point O1 in the image M1, the controller 140 controls the robotic arm 110 to move so that the projection point P1′ moves by the first space vector $\vec{U_1}$(x1,y1,z1) from the reference point O1 along the first axis (for example, the axis xR) of the robotic arm reference coordinate system (xR-yR-zR). During the movement, the controller 140 analyzes the image M1 captured by the camera 120 and determines whether the projection point in the image M1 has moved by the first space vector $\vec{U_1}$(x1,y1,z1).
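Several of the following steps depend on locating the projection point in the image M1. As a minimal sketch of that sub-task, assuming the projection point appears as the brightest blob in the camera image (the OpenCV calls are real, but the threshold value and the single-spot assumption are our own):

```python
from typing import Optional, Tuple
import cv2
import numpy as np

def find_projection_point(image_m1: np.ndarray) -> Optional[Tuple[float, float]]:
    """Return the (x, y) pixel centroid of the light spot in M1, or None."""
    gray = cv2.cvtColor(image_m1, cv2.COLOR_BGR2GRAY)
    # Keep only very bright pixels; 200 is an assumed threshold.
    _, mask = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)
    m = cv2.moments(mask)
    if m["m00"] == 0:
        return None  # no sufficiently bright spot in the image
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])
```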


In step S112, the controller 140 could analyze the image M1 captured by the camera 120. As shown in FIG. 3B, the image M1 is a planar image, and thus the point Px(x1, y1, z1) appears as P′x(x1, y1) therein; the controller 140 accordingly obtains the value of the first plane coordinate P′x(x1, y1) of the projection point P′x of the first space vector $\vec{U_1}$(x1,y1,z1), that is, the first axis coordinate value x1 and the second axis coordinate value y1.


In step S113, the controller 140 controls the tool 10 to move by a second space vector $\vec{V_1}$(x2,y2,z2) from the reference point O1 along a second axis (for example, the axis yR) of the robotic arm reference coordinate system (xR-yR-zR). The value (or length) of the second space vector $\vec{V_1}$(x2,y2,z2) is LR, and an end point of the second space vector $\vec{V_1}$(x2,y2,z2) is the projection point Py(x2, y2, z2) in FIG. 3A.


Similarly, in step S113, the controller 140 could analyze the image M1 captured by the camera 120 and determine whether the projection point P1′ in the image M1 corresponds to (or is located at/coincides with) the reference point O1 in the image M1. When the projection point P1′ does not correspond to the reference point O1 in the image M1, the robotic arm 110 is controlled to move until the projection point P1′ corresponds to the reference point O1 in the image M1. When the projection point P1′ corresponds to the reference point O1 in the image M1, the controller 140 controls the robotic arm 110 to move so that the projection point P1′ moves by the second space vector $\vec{V_1}$(x2,y2,z2) from the reference point O1 along the second axis (for example, the axis yR) of the robotic arm reference coordinate system (xR-yR-zR). During the movement, the controller 140 analyzes the image M1 captured by the camera 120 and determines whether the projection point P1′ in the image M1 has moved by the second space vector $\vec{V_1}$(x2, y2, z2).


In step S114, the controller 140 could analyze the image captured by the camera 120 to obtain the value of the first plane coordinate P′y(x2, y2) of the projection point P′y of the second space vector $\vec{V_1}$(x2, y2, z2), that is, the first axis coordinate value x2 and the second axis coordinate value y2.


In step S115, the controller 140 controls the tool 10 to move by a third space vector $\vec{W_1}$(x3,y3,z3) from the reference point O1 along a third axis (for example, the axis zR) of the robotic arm reference coordinate system (xR-yR-zR). The value (or length) of the third space vector $\vec{W_1}$(x3,y3,z3) is LR, and an end point of the third space vector $\vec{W_1}$(x3, y3, z3) is the projection point Pz(x3, y3, z3) in FIG. 3A.


Similarly, in step S115, the controller 140 could analyze the image M1 captured by the camera 120 and determine whether the projection point P1′ in the image M1 corresponds to (or is located at/coincides with) the reference point O1 in the image M1. When the projection point P1′ does not correspond to the reference point O1 in the image M1, the robotic arm 110 is controlled to move until the projection point P1′ corresponds to the reference point O1 in the image M1. When the projection point P1′ corresponds to the reference point O1 in the image M1, the controller 140 controls the robotic arm 110 to move so that the projection point P1′ moves by the third space vector $\vec{W_1}$(x3,y3,z3) from the reference point O1 along the third axis (for example, the axis zR) of the robotic arm reference coordinate system (xR-yR-zR). During the movement, the controller 140 analyzes the image M1 captured by the camera 120 and determines whether the projection point P1′ in the image M1 has moved by the third space vector $\vec{W_1}$(x3,y3,z3).


In step S116, the controller 140 could analyze the image captured by the camera 120 to obtain the value of the first plane coordinate P′z(x3, y3) of the projection point P′z of the third space vector $\vec{W_1}$(x3,y3,z3), that is, the first axis coordinate value x3 and the second axis coordinate value y3.


In step S117, the controller 140 establishes the first conversion relationship T1 between the camera reference coordinate system (xC-yC-zC) and the robotic arm reference coordinate system (xR-yR-zR) according to the mutually orthogonal characteristics of the first space vector $\vec{U_1}$(x1,y1,z1), the second space vector $\vec{V_1}$(x2,y2,z2) and the third space vector $\vec{W_1}$(x3,y3,z3). For example, the controller 140 could use the following equations (1) to (3) to obtain the third axis coordinate values z1, z2 and z3. As a result, the controller 140 obtains x1, x2, x3, y1, y2, y3, z1, z2 and z3. Then, the controller 140 establishes the first conversion relationship T1 according to the following formula (4).


As shown in formula (5), the controller 140 could use the first conversion relationship T1 to convert the projection point movement vector SW into the robotic arm movement vector SR, wherein the projection point movement vector SW is the movement vector of the projection point P1 on the test plane 20 relative to the camera reference coordinate system (xC-yC-zC), and the robotic arm movement vector SR is the movement vector of the robotic arm 110 relative to the robotic arm reference coordinate system (xR-yR-zR). The robotic arm reference coordinate system (xR-yR-zR) could be established at any position of the robotic arm 110, for example, at the base 111 of the robotic arm 110. Equations (1), (2) and (3) represent that the space vectors $\vec{U_1}$, $\vec{V_1}$ and $\vec{W_1}$ are orthogonal to each other. The first conversion relationship T1 in formula (4) is the inverse of the matrix whose columns are the space vectors $\vec{U_1}$, $\vec{V_1}$ and $\vec{W_1}$ each divided by its own length (that is, unit vectors). Formula (5) represents that the product of the first conversion relationship T1 and the projection point movement vector SW is equal to the robotic arm movement vector SR.












$$\vec{U_1}\cdot\vec{V_1}=0 \qquad (1)$$

$$\vec{V_1}\cdot\vec{W_1}=0 \qquad (2)$$

$$\vec{U_1}\cdot\vec{W_1}=0 \qquad (3)$$

$$T_1=\begin{bmatrix}\dfrac{\vec{U_1}}{\lvert\vec{U_1}\rvert} & \dfrac{\vec{V_1}}{\lvert\vec{V_1}\rvert} & \dfrac{\vec{W_1}}{\lvert\vec{W_1}\rvert}\end{bmatrix}^{-1} \qquad (4)$$

$$S_R=T_1\cdot S_W=\begin{bmatrix}\dfrac{\vec{U_1}}{\lvert\vec{U_1}\rvert} & \dfrac{\vec{V_1}}{\lvert\vec{V_1}\rvert} & \dfrac{\vec{W_1}}{\lvert\vec{W_1}\rvert}\end{bmatrix}^{-1}\cdot S_W \qquad (5)$$
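The computation behind formulas (1) to (5) can be sketched as follows, assuming the three plane coordinates of P′x, P′y and P′z have already been measured from the image M1; the sign choice for z1 (the mirror ambiguity) and all names are our assumptions rather than the patent's:

```python
import numpy as np

def build_t1(p_xy):
    """Recover T1 of formula (4) from the three measured plane coordinates
    (x1, y1), (x2, y2), (x3, y3) of P'x, P'y and P'z."""
    (x1, y1), (x2, y2), (x3, y3) = p_xy
    a = x1 * x2 + y1 * y2          # formula (1) gives z1 * z2 = -a
    b = x2 * x3 + y2 * y3          # formula (2) gives z2 * z3 = -b
    c = x1 * x3 + y1 * y3          # formula (3) gives z1 * z3 = -c
    # Assumes a generic tilt (a, b, c nonzero); one of the two mirror
    # solutions is taken, and the correct sign follows from the known
    # camera-to-robot orientation.
    z1 = np.sqrt(-a * c / b)
    z2 = -a / z1
    z3 = -c / z1
    u1 = np.array([x1, y1, z1])
    v1 = np.array([x2, y2, z2])
    w1 = np.array([x3, y3, z3])
    m = np.column_stack([u1 / np.linalg.norm(u1),
                         v1 / np.linalg.norm(v1),
                         w1 / np.linalg.norm(w1)])
    return np.linalg.inv(m)        # formula (4)

# Formula (5): a projection point movement vector SW in the camera frame
# converts to the robotic arm movement vector SR in the robot frame:
# s_r = build_t1(p_xy) @ s_w
```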







Then, in step S120, the robotic arm system 100 obtains the tool axis vector Tez of the tool 10 relative to the installation surface reference coordinate system (xf-yf-zf).


For example, referring to FIGS. 4A to 9B, which show schematic diagrams of the process of obtaining the tool axis vector Tez according to an embodiment of the present disclosure: FIG. 4A shows a schematic diagram of the image M1, captured by the camera 120 of FIG. 1, showing the projection point P1 on the test plane 20; FIG. 4B shows a schematic diagram of the test plane 20 viewed from the axis −yC of FIG. 1; and FIG. 4C shows a schematic diagram of the test plane 20 viewed from the axis −xC of FIG. 1. As shown in FIGS. 4B and 4C, the tool axis A1 of the tool 10 is inclined relative to the plane xC-yC of the camera reference coordinate system (xC-yC-zC), that is, the tool axis A1 is not perpendicular to the plane xC-yC of the camera reference coordinate system (xC-yC-zC). However, through the following process of obtaining the tool axis vector Tez, the tool axis A1 of the tool 10 could be adjusted to be perpendicular to the plane xC-yC of the camera reference coordinate system (xC-yC-zC), as shown in FIGS. 8 and 9B. Then, the controller 140 could obtain the tool axis vector Tez according to the joint angles of the joints J1 to J6 of the robotic arm 110 in such state (i.e., the state in which the tool axis A1 is perpendicular to the plane xC-yC of the camera reference coordinate system (xC-yC-zC)). Further description is given below.


In step S121, as shown in FIGS. 4B and 4C, the tool 10 emits the first light L1 along the tool axis A1 to form the projection point P1 on the test plane 20. Then, the camera 120 captures the image M1 of the test plane 20. As shown in FIG. 4A, the image M1 has the image of the projection point P1. Then, the controller 140 obtains, according to the captured image M1, the projection point movement vector SW of the projection point P1 relative to the reference point O1.


In step S122, the controller 140 obtains the robotic arm movement vector SR according to the first conversion relationship T1 and the projection point movement vector SW. For example, the controller 140 could substitute the projection point movement vector SW into the above formula (5) to obtain (or calculate) the robotic arm movement vector SR required for the robotic arm 110 to move the projection point P1 to approach or coincide with the reference point O1. The purpose of steps S122 and S123 is to prevent the projection point P1′ from falling out of the test plane 20 after the robotic arm moves or rotates.


In step S123, as shown in FIGS. 5A to 5C: FIG. 5A shows a schematic diagram of an image in which the projection point P1 of FIG. 4A coincides with the reference point O1 of the test plane 20; FIG. 5B shows a schematic diagram of the test plane 20 viewed from the axis −yC of FIG. 1; and FIG. 5C shows a schematic view of the test plane 20 viewed from the axis −xC of FIG. 1. In step S123, as shown in FIGS. 5B to 5C, the controller 140 controls the robotic arm 110 to move by the robotic arm movement vector SR to make the projection point P1 of the tool 10 approach the reference point O1. For example, the projection point P1 coincides with the reference point O1. However, in another embodiment, the projection point P1 could be moved to be close to but not coincide with the reference point O1. Then, the camera 120 captures the image M1 of the test plane 20; as shown in FIG. 5A, the image M1 has the image of the projection point P1.


Because the projection point P1 approaches the reference point O1 in step S123, the moved projection point P1′ in the subsequent step S124A (the moved projection point P1′ is shown in FIG. 6A) will not fall out of the test plane 20, and/or the projection point P1′ after the tool 10 is rotated in the subsequent step S124C (the projection point P1′ after the tool 10 is rotated is shown in FIG. 7A) will not fall outside the test plane 20. In another embodiment, if the moved projection point P1′ in step S124A does not fall out of the test plane 20 and the projection point P1′ after the tool 10 is rotated in step S124C does not fall out of the test plane 20, steps S122 and S123 could be omitted.


Then, in step S124, the controller 140 could perform the offset correction of the tool axis A1 of the tool 10 relative to the first axis (for example, the axis xC). Steps S124A to S124C are further described below.


In step S124A, as shown in FIGS. 6A to 6C: FIG. 6A shows a schematic diagram of the image in which the projection point P1 of FIG. 5A deviates from the reference point O1 of the test plane 20; FIG. 6B shows a schematic view of the test plane 20 viewed from the axis −yC of FIG. 1; and FIG. 6C shows a schematic view of the test plane 20 viewed from the axis −xC of FIG. 1. In step S124A, as shown in FIGS. 6B and 6C, the robotic arm 110 drives the tool 10 to move along the third axis (for example, the axis zC) of the camera reference coordinate system (xC-yC-zC), as indicated by the arrow showing the tool 10 moving or translating (for example, moving in a straight direction) toward the axis −zC. Then, the camera 120 captures the image M1 of the test plane 20. As shown in FIG. 6A, the image M1 has the image of the moved projection point P1′. Since the tool axis A1 is not perpendicular to the test plane 20, the position of the projection point P1 of FIG. 5A changes to the position of the projection point P1′ of FIG. 6A after the tool 10 moves along the third axis (for example, the axis zC) of the camera reference coordinate system (xC-yC-zC).


In step S124B, the controller 140 determines, according to the image captured by the camera 120, whether the position of the projection point P1 on the test plane 20 in the first axis (for example, the axis xC) changes. If so (for example, in a translation test of the first axis xC, the position of the projection point P1 of FIG. 5B moves along the axis −xC to the position of the projection point P1′ of FIG. 6B, which represents that a deviation occurs in the first axis xC and subsequent adjustments are required), the process proceeds to step S124C; if not (which represents that no deviation occurs in the first axis xC), the process proceeds to step S125A.


In step S124C, as shown in FIGS. 7A and 7B, which show schematic diagrams of the tool 10 in FIG. 6B rotating around the axis yC of the camera reference coordinate system (xC-yC-zC), the robotic arm 110 drives the tool 10 to rotate by an angle α1 around the second axis (for example, the axis yC) of the camera reference coordinate system (xC-yC-zC) to reduce the angle β1 between the tool axis A1 and the axis zC, namely, to make the tool axis A1 move toward being parallel to the axis zC. In addition, the angle α1 is, for example, an arbitrary angle, determined here by way of trial and error. In detail, the tool axis A1 is counterclockwise rotated by the arc angle α1 with the axis yC as the fulcrum or center, so as to gradually reduce the difference between the tool axis A1 and the axis zC. After being rotated, the projection point P1 on the test plane 20 may not remain at the original position.


In detail, in step S124A, the robotic arm 110 drives the tool 10 to move or translate along the axis +/−zC of the camera reference coordinate system (xC-yC-zC). As shown in FIG. 7A, if the projection point P1′ moves or shifts toward the negative first axis (for example, the axis −xC), the robotic arm 110 drives the tool 10 to rotate around the positive second axis (for example, the axis +yC) to reduce the angle β1 between the tool axis A1 and the axis zC, that is, to make the projection of the tool axis A1 on the plane xC-zC of the camera reference coordinate system (xC-yC-zC) move toward being parallel to the axis zC (in other words, to make the tool axis A1 move toward being perpendicular to the test plane 20). In addition, as long as the tool axis A1 moves toward being parallel to the axis zC, the embodiment of the disclosure does not limit whether the position of the projection point P1′ changes during rotation.


In another embodiment, as shown in FIG. 7C, which shows a schematic diagram of the projection point P1′ of FIG. 6B being offset toward the positive first axis (for example, the axis +xC): in step S124A, after the robotic arm 110 drives the tool 10 to move along the axis +/−zC of the camera reference coordinate system (xC-yC-zC), if the projection point P1′ moves or shifts toward the positive first axis (for example, the axis +xC), the robotic arm 110 drives the tool 10 to rotate around the negative second axis (for example, the axis −yC) to reduce the angle β1 between the tool axis A1 and the axis zC. The tool axis A1 is clockwise rotated by the arc angle α1 with the projection point P1′ as the fulcrum, so as to gradually reduce the angle β1 between the tool axis A1 and the axis zC, that is, to make the projection of the tool axis A1 on the plane xC-zC of the camera reference coordinate system (xC-yC-zC) move toward being parallel to the axis zC (in other words, to make the tool axis A1 move toward being perpendicular to the test plane 20).


The controller 140 repeats steps S124A to S124C until the projection of the tool axis A1 of the tool 10 on the plane xC-zC of the camera reference coordinate system (xC-yC-zC) (for example, the view of FIG. 8) is parallel to the axis zC of the camera reference coordinate system (xC-yC-zC), as shown in FIG. 8. So far, the offset correction of the tool axis A1 of the tool 10 relative to the axis xC is completed. Furthermore, when the robotic arm 110 drives the tool 10 to move along the axis +/−zC of the camera reference coordinate system (xC-yC-zC), if a position change amount of the projection point P1′ along the axis xC is substantially equal to 0 (that is, the position of the projection point P1′ along the axis xC does not change any more), it means that the projection of the tool axis A1 on the plane xC-zC of the camera reference coordinate system (xC-yC-zC) has become parallel to the axis zC. The process could then proceed to step S125, and the controller 140 executes the offset correction of the tool axis A1 of the tool 10 relative to the second axis (for example, the axis yC), as shown in the process of steps S125A to S125C.
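A minimal sketch of this translate-observe-rotate loop follows, assuming hypothetical helpers translate_z, rotate_about_camera_y and projection_x that command the robotic arm and read the projection point position from the image; the step sizes and the pixel tolerance are placeholders:

```python
def correct_x_offset(translate_z, rotate_about_camera_y, projection_x,
                     step_mm=5.0, step_deg=1.0, tol_px=1.0):
    """Rotate the tool about the camera's y axis until a +/-z translation no
    longer shifts the projection point along the x axis of the image."""
    while True:
        x_before = projection_x()
        translate_z(-step_mm)              # step S124A: move along -zC
        x_after = projection_x()
        translate_z(+step_mm)              # return to the original height
        dx = x_after - x_before            # step S124B: check the x shift
        if abs(dx) <= tol_px:
            return                         # parallel to zC in the xC-zC view
        # Step S124C: a shift toward -xC calls for rotation about +yC,
        # and a shift toward +xC calls for rotation about -yC.
        rotate_about_camera_y(step_deg if dx < 0 else -step_deg)
```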


In steps S125A to S125C, the controller 140 and the robotic arm 110 could use a process similar to steps S124A to S124C to complete the offset correction relative to the axis yC. Further examples are given below with reference to FIGS. 6A and 6C and FIGS. 9A to 9B.


In step S125A, as shown in FIG. 6C, the robotic arm 110 drives the tool 10 to move along the third axis (for example, the axis zC) of the camera reference coordinate system (xC-yC-zC), as indicated by the arrow showing the tool 10 moving or translating toward the axis −zC. Then, the camera 120 captures the image M1 of the test plane 20; as shown in FIG. 6A, the image M1 has the image of the moved projection point P1′. Since the tool axis A1 is not perpendicular to the test plane 20, after the tool 10 moves along the third axis (for example, the axis zC) of the camera reference coordinate system (xC-yC-zC), the position of the projection point P1 of FIG. 5A changes to the position of the projection point P1′ of FIG. 6A. In another embodiment, if step S124A has been performed, step S125A could be optionally omitted.


In step S125B, the controller 140 determines, according to the image M1 captured by the camera 120, whether the position of the projection point P1 on the test plane 20 in the second axis (for example, the axis yC) changes. If so (for example, the position of the projection point P1 of FIG. 5C moves to the position of the projection point P1′ of FIG. 6C along the axis −yC), the process proceeds to step S125C; if not, the process proceeds to step S126.


In step S125C, as shown in FIG. 9A, which shows a schematic diagram of the tool 10 of FIG. 6C rotating around the axis xC of the camera reference coordinate system (xC-yC-zC), the robotic arm 110 drives the tool 10 to rotate by an angle α2 around the first axis (for example, the axis −xC) of the camera reference coordinate system (xC-yC-zC) to reduce the angle β2 between the tool axis A1 and the axis zC, that is, to make the tool axis A1 move toward being parallel to the axis zC. In addition, the angle α2 is, for example, an arbitrary angle.


In detail, in step S125A, the robotic arm 110 drives the tool 10 to move or translate along the axis +/−zC of the camera reference coordinate system (xC-yC-zC). As shown in FIG. 9A, if the projection point P1′ moves or shifts toward the negative second axis (for example, the axis −yC), the robotic arm 110 drives the tool 10 to rotate around the negative first axis (for example, the axis −xC) to reduce the angle β2 between the tool axis A1 and the axis zC; that is, the tool axis A1 is clockwise rotated by the arc angle α2 with the projection point P1′ as the fulcrum, so as to gradually reduce the angle β2 between the tool axis A1 and the axis zC, that is, to make the projection of the tool axis A1 on the plane yC-zC of the camera reference coordinate system (xC-yC-zC) move toward being parallel to the axis zC (in other words, to make the tool axis A1 move toward being perpendicular to the test plane 20). In addition, as long as the tool axis A1 moves toward being parallel to the axis zC, the embodiment of the disclosure does not limit whether the position of the projection point P1′ changes during rotation.


The controller 140 repeats steps S125A to S125C until the projection of the tool axis A1 of the tool 10 on the plane yC-zC of the camera reference coordinate system (xC-yC-zC) (for example, the view of FIG. 9B) is parallel to the axis zC of the camera reference coordinate system (xC-yC-zC), as shown in FIG. 9B. So far, the offset correction of the tool axis A1 of the tool 10 relative to the axis yC is completed. Furthermore, when the robotic arm 110 drives the tool 10 to move along the axis +/−zC of the camera reference coordinate system (xC-yC-zC), if a position change amount of the projection point P1′ along the axis yC is substantially equal to 0 (that is, the position of the projection point P1′ along the axis yC does not change any more), it means that the projection of the tool axis A1 on the plane yC-zC of the camera reference coordinate system (xC-yC-zC) has become parallel to the axis zC, and the process could proceed to step S126.


In step S126, after the offset correction relative to the first axis and the second axis is completed (meaning that the tool axis A1 is perpendicular to the test plane 20; the purpose of steps S124 and S125 is to correct the offsets along the axis xC and the axis yC), the controller 140 establishes a second conversion relationship T2 according to the posture of the robotic arm 110 when the tool axis A1 is perpendicular to the test plane 20, and obtains the tool axis vector Tez according to the second conversion relationship T2, wherein the tool axis vector Tez is, for example, parallel to or coincides with the tool axis A1. For example, the controller 140 establishes the second conversion relationship T2 according to the joint angles of the joints J1 to J6 of the robotic arm 110 when the tool axis A1 is perpendicular to the test plane 20. The second conversion relationship T2 is the conversion relationship of the installation surface (or flange surface) reference coordinate system (xf-yf-zf) of an installation surface 110s of the tool 10 relative to the robotic arm reference coordinate system (xR-yR-zR). The tool 10 could be installed on the installation surface 110s, and the tool axis A1 of the tool 10 is not limited to being perpendicular to the installation surface 110s. In an embodiment, the second conversion relationship T2 could be expressed as the following formula (6), and the elements in formula (6) could be obtained from the linkage parameters (Denavit-Hartenberg parameters) of the robotic arm 110, the coordinates of the joints J1 to J6 and the coordinate of the tool center point WO1 relative to the installation surface reference coordinate system (xf-yf-zf), wherein the link parameters could include link offset, joint angle, link length and link twist. In addition, the second conversion relationship T2 could be established by using a known kinematics method.










$$T_2=\begin{bmatrix}e_{11} & e_{12} & e_{13} & e_{14}\\ e_{21} & e_{22} & e_{23} & e_{24}\\ e_{31} & e_{32} & e_{33} & e_{34}\\ 0 & 0 & 0 & 1\end{bmatrix} \qquad (6)$$







As shown in the following formula (7), the vector zw is the normal vector (i.e., along the axis zC) of the test plane 20 relative to the robotic arm reference coordinate system (xR-yR-zR), and the vector Tez is the vector of the tool axis A1 relative to the installation surface reference coordinate system (xf-yf-zf) (herein referred to as the "tool axis vector"). The controller 140 could convert the vector zw into the tool axis vector Tez through the inverse matrix of the second conversion relationship T2.






$$T_{ez}=T_2^{-1}\cdot z_w \qquad (7)$$
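A minimal sketch of formulas (6) and (7) follows, assuming a standard Denavit-Hartenberg parameterization of the six joints; the DH table, the helper names and the note that only the rotation part of T2 acts on the direction vector zw are our assumptions, not the patent's linkage parameters:

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Single-link homogeneous transform from standard DH parameters."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([[ct, -st * ca,  st * sa, a * ct],
                     [st,  ct * ca, -ct * sa, a * st],
                     [0.0,      sa,       ca,      d],
                     [0.0,     0.0,      0.0,    1.0]])

def second_conversion_t2(joint_angles, dh_table):
    """Formula (6), sketched: chain the six link transforms into T2, the pose
    of the installation (flange) surface relative to the robot base."""
    t2 = np.eye(4)
    for theta, (d, a, alpha) in zip(joint_angles, dh_table):
        t2 = t2 @ dh_transform(theta, d, a, alpha)
    return t2

def tool_axis_vector(t2, z_w):
    """Formula (7), sketched: since z_w is a direction, only the rotation
    part of T2 needs to be inverted to express it in the flange frame."""
    return np.linalg.inv(t2[:3, :3]) @ z_w
```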


In step S130, the robotic arm system 100 executes the step of obtaining calibration point information groups. Further examples are given below.


In step S131, referring to FIGS. 10A to 10B, FIG. 10A shows a schematic diagram of the second light L2 emitted by the light source 130 of FIG. 1 and the first light L1 emitted by the tool 10 along the tool axis A1 that intersect at the tool center point WO1, and FIG. 10B shows a schematic diagram of the projection point PL2 of the second light L2 emitted by the light source 130 of FIG. 10A projected on the test plane 20 and the projection point PL1 of the first light L1 emitted by the tool 10 along the tool axis A1 projected on the test plane 20 being two separated points respectively. In this step, the angle of the light source 130 could be adjusted to make the second light L2 emitted by the light source 130 and the first light L1 emitted by the tool 10 intersect at the tool center point WO1, as shown in FIG. 10A.


In the present embodiment, the tool 10 is shown by taking the luminance meter as an example, and the tool center point WO1 is a virtual tool center point. The tool center point WO1 is, for example, the focus of the first light L1 (detection light) projected by the tool 10. In another embodiment, the tool 10 is, for example, a machining tool, and the tool center point WO1 is a physical tool center point, such as a solid tool tip point. In summary, the tool center point of the embodiment of the present disclosure could be the physical tool center point or the virtual tool center point.


In one of the methods for adjusting the angle of the light source 130, the controller 140 could obtain, according to the following formula (8), the angle θ between the second light L2 emitted by the light source 130 and the direction perpendicular to the tool axis A1 (the dotted line from the rotation fulcrum 131 to the tool axis A1), and then the angle of the light source 130 could be adjusted to the angle θ manually or by a mechanism (not shown) controlled by the controller 140, so that the second light L2 emitted by the light source 130 and the first light L1 emitted by the tool 10 intersect at the tool center point WO1. The aforementioned mechanism is, for example, any of various mechanisms that could drive the light source 130 to rotate, such as a linkage mechanism, a gear set mechanism, etc. Since the angle θ is given (known), the angle of the light source 130 could be quickly adjusted so that the second light L2 emitted by the light source 130 and the first light L1 emitted by the tool 10 intersect at the tool center point WO1. When the angle of the light source 130 is adjusted to the angle θ, the relative relationship between the light source 130 and the tool 10 could be fixed so as to fix the relative relationship between the tool center point WO1 and the tool 10.









$$\theta=\tan^{-1}\!\left(\frac{H_1+H_2}{H_3}\right) \qquad (8)$$







In formula (8), H1 is the distance (for example, the focal length of the first light L1) between the tool center point WO1 and a light emitting surface 10s of the tool 10 along the tool axis A1, H2 is the distance between the light emitting surface 10s of the tool 10 and the rotation fulcrum 131 of the light source 130 along the tool axis A1, and H3 is the vertical distance (perpendicular to the tool axis A1) between the rotation fulcrum 131 of the light source 130 and the tool axis A1.
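A minimal sketch of formula (8); the numeric values below are illustrative only:

```python
import math

def light_source_angle(h1: float, h2: float, h3: float) -> float:
    """Formula (8), sketched: angle (degrees) to which the light source is
    rotated so that the second light L2 crosses the tool axis A1 at the
    tool center point WO1."""
    return math.degrees(math.atan((h1 + h2) / h3))

# Example with placeholder lengths in millimeters.
theta = light_source_angle(h1=120.0, h2=35.0, h3=60.0)
```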


As shown in FIG. 10B, since the tool center point WO1 has not yet coincided with the test plane 20, the projection point PL2 of the second light L2 emitted by the light source 130 projected on the test plane 20 and the projection point PL1 of the first light L1 emitted by the tool 10 projected on the test plane 20 are two separated points.


In step S132, the controller 140 executes the step of obtaining the calibration point information groups. For example, the controller 140 could control the robotic arm 110 to make the tool center point WO1 coincide with the reference point O1 of the test plane 20 under a plurality of different postures, and accordingly record the calibration point information group of each such calibration point. For example, the controller 140 could control the robotic arm 110 in one posture to make the tool center point WO1 coincide with the reference point O1 of the test plane 20, and accordingly record the calibration point information group in such posture. Then, the controller 140 controls the robotic arm 110 in another posture to make the tool center point WO1 coincide with the reference point O1 of the test plane 20, and accordingly records the calibration point information group in such posture. According to this principle, the controller 140 could obtain several calibration point information groups of the robotic arm 110 in several different postures. Each calibration point information group could include the coordinates of the joints J1 to J6, and the coordinate of each joint could be the rotation angle of the joint relative to its preset starting point. At least one of the rotation angles of the robotic arm 110 in different postures could be different.


For example, referring to FIGS. 11A to 11B, FIG. 11A shows a schematic diagram of the tool center point WO1 of FIG. 10A overlapping the test plane 20, and FIG. 11B shows a schematic diagram of the tool center point WO1 of FIG. 11A being spaced from the reference point O1 by the projection point movement vector SW. In steps S132A to S132B, the controller 140 could control the robotic arm 110 to drive the tool 10 to move along the tool axis A1 until the tool center point WO1 coincides with the test plane 20, as shown in FIG. 11A.


In step S132A, as shown in FIG. 11A, the robotic arm 110 drives the tool 10 to move along the tool axis vector Tez. In an embodiment, the robotic arm 110 could drive the tool 10 to move in the positive or negative direction of the tool axis A1. At this time, the tool axis vector Tez is, for example, parallel to or coincides with the tool axis A1.


In step S132B, as shown in FIG. 11B, the controller 140 determines whether the tool center point WO1 coincides with the test plane 20 according to (e.g., by analyzing) the image M1 of the test plane 20 captured by the camera 120. If yes, the process proceeds to step S132C; if not, the controller 140 repeats steps S132A to S132B until the tool center point WO1 coincides with the test plane 20, as shown in FIG. 11B. Furthermore, when the tool center point WO1 coincides with the test plane 20, a single light spot (i.e., the tool center point WO1) will appear on the test plane 20. The controller 140 could analyze the image M1 of the test plane 20 captured by the camera 120 to determine whether the single light spot has appeared on the test plane 20; if so, it means that the tool center point WO1 has coincided with the test plane 20 (for example, as shown in FIG. 11A), and the process proceeds to step S132C; if not (for example, there are two light spots, namely the projection point PL1 and the projection point PL2), it means that the tool center point WO1 has not coincided with the test plane 20, and the process returns to step S132A, in which the robotic arm 110 continues to drive the tool 10 to move in the positive or negative direction of the tool axis A1 until the tool center point WO1 coincides with the test plane 20.
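A minimal sketch of the light-spot check in step S132B, under the same single-camera assumption as before; the threshold value and the blob-counting approach are our choices, not the patent's:

```python
import cv2
import numpy as np

def count_light_spots(image_m1: np.ndarray) -> int:
    """Count bright blobs on the test plane: two spots (PL1 and PL2) mean
    WO1 is off the plane, a single spot means WO1 coincides with it."""
    gray = cv2.cvtColor(image_m1, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)
    num_labels, _ = cv2.connectedComponents(mask)
    return num_labels - 1  # label 0 is the background
```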


In step S132C, as shown in FIG. 11B, the controller 140 obtains the projection point movement vector SW according to the image of the test plane 20 captured by the camera 120.


In step S132D, the controller 140 obtains the robotic arm movement vector SR according to the first conversion relationship T1 and the projection point movement vector SW. For example, the controller 140 could substitute the projection point movement vector SW of FIG. 11B into the above formula (5) to calculate (or obtain) the required robotic arm movement vector SR for the robotic arm 110 to move the tool center point WO1 to coincide with the reference point O1.


In step S132E, referring to FIGS. 12A to 12B, FIGS. 12A to 12B show schematic diagrams of the tool center point WO1 of FIG. 11A overlapping the reference point O1. In step S132E, the controller 140 controls the robotic arm 110 to move by the robotic arm movement vector SR to make the tool center point WO1 coincide with the reference point O1.


In step S132F, the controller 140 determines whether the tool center point WO1 coincides with the reference point O1 of the test plane 20 according to (or by analyzing) the image (for example, the image M1 shown in FIG. 12B) of the test plane 20 captured by the camera 120. If yes, the process proceeds to step S132G; if not, the process returns to step S132A.


Furthermore, if the tool axis A1 in FIG. 10A is not parallel to the axis zC, then after step S132E, the control command may be inconsistent with the actual movement of the robotic arm due to the movement error of the robotic arm, which causes the projection point PL2 of the second light L2 emitted by the light source 130 projected on the test plane 20 and the projection point PL1 of the first light L1 emitted by the tool 10 projected on the test plane 20 to again become two separated points, as shown in FIG. 10B. Such a situation means that the tool center point WO1 is separated from the test plane 20 (and accordingly that the tool center point WO1 and the reference point O1 of the test plane 20 do not coincide). Therefore, the process could return to step S132A to make the tool center point WO1 coincide with the test plane 20 again. When the tool center point WO1 in step S132F coincides with the reference point O1 of the test plane 20, it means that the tool center point WO1 coincides with the test plane 20 and also coincides with the reference point O1, and the process proceeds to step S132G; if not, the process returns to step S132A.


In step S132G, the controller 140 records the joint angles of the joints J1 to J6 of the robotic arm 110 in the state where the tool center point WO1 coincides with the reference point O1 of the test plane 20, and uses them as one calibration point information group.


In step S132H, the controller 140 determines whether the number of the calibration point information groups has reached a predetermined number, for example, at least 3 groups, but could be more. When the number of the calibration point information groups has reached the predetermined number, the process proceeds to step S133; when the number of the calibration point information groups has not reached the predetermined number, the process proceeds to step S132I.


In step S132I, the controller 140 controls the robotic arm 110 to change the posture of the tool 10. For example, the controller 140 controls the robotic arm 110 to change the angle of the tool axis A1 of the tool 10 relative to at least one of the axis xC, the axis yC and the axis zC, wherein the changed angle is, for example, 30 degrees, 60 degrees or another arbitrary angle. For example, the controller 140 could generate Euler angle increments ΔRx, ΔRy, ΔRz through a random number generator to correct the azimuth angle (Euler angles) of the robotic arm 110, thereby changing the posture of the robotic arm 110. At this time, the azimuth angle of the robotic arm 110 could be expressed as (Rx+ΔRx, Ry+ΔRy, Rz+ΔRz), wherein (Rx, Ry, Rz) is the original azimuth angle of the robotic arm 110, Rx represents the yaw angle, Ry represents the pitch angle, and Rz represents the roll angle. If the corrected azimuth angle exceeds the motion range of the robotic arm 110, the controller 140 could regenerate the Euler angle increments through the random number generator.
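A minimal sketch of this random posture change, assuming a caller-supplied reachability test for the motion range; the +/-30 degree bound is a placeholder:

```python
import numpy as np

rng = np.random.default_rng()

def next_posture(rx, ry, rz, within_motion_range, max_delta=30.0):
    """Return a corrected azimuth (Rx+dRx, Ry+dRy, Rz+dRz) in degrees,
    re-drawing the random increments until the posture is reachable."""
    while True:
        d_rx, d_ry, d_rz = rng.uniform(-max_delta, max_delta, size=3)
        candidate = (rx + d_rx, ry + d_ry, rz + d_rz)
        if within_motion_range(candidate):  # user-supplied reachability test
            return candidate
```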


Then, the process returns to step S132A to record the calibration point information group of the robotic arm 110 in the new (different) posture of the tool 10. Furthermore, after the controller 140 controls the robotic arm 110 to change the posture of the tool 10, the tool center point WO1 of the tool 10 may deviate from the test plane 20. Therefore, the process returns to step S132A to make the tool center point WO1 coincide with the reference point O1 again, and, in the state where the tool center point WO1 coincides with the reference point O1, another calibration point information group under a different posture of the robotic arm 110 is recorded. Steps S132A to S132I are repeated until the number of the calibration point information groups recorded by the controller 140 reaches the predetermined number.


In step S133, when the number of the calibration point information groups recorded by the controller 140 reaches the predetermined number, the controller 140 obtains the tool center point coordinate TP of the tool center point WO1 relative to the installation surface reference coordinate system (xf-yf-zf).


As shown in the following formula (9), the tool center point coordinate TP could be established according to a plurality of the calibration point information groups of the robotic arm 110 in a plurality of different postures. The controller 140 could obtain (calculate) the coordinates of the tool center point WO1 according to the calibration point information groups, wherein the coordinates of each calibration point information group could be obtained through the linkage parameters (Denavit-Hartenberg parameters) of the robotic arm 110, the coordinates of the joints J1 to J6 and the information about the tool center point WO1 relative to the installation surface reference coordinate system (xf-yf-zf), wherein the link parameters could include link offset, joint angle, link length and link twist.


The coordinates of the tool center point WO1 could be obtained (calculated) by the following formula (9):











$$T_2^i\begin{bmatrix}T_x\\T_y\\T_z\\1\end{bmatrix}=\begin{bmatrix}e_{11}^i & e_{12}^i & e_{13}^i & e_{14}^i\\ e_{21}^i & e_{22}^i & e_{23}^i & e_{24}^i\\ e_{31}^i & e_{32}^i & e_{33}^i & e_{34}^i\\ 0 & 0 & 0 & 1\end{bmatrix}\begin{bmatrix}T_x\\T_y\\T_z\\1\end{bmatrix}=\begin{bmatrix}P_x\\P_y\\P_z\\1\end{bmatrix} \qquad (9)$$







In formula (9), the matrix T2i is the 4×4 homogeneous conversion matrix of the ith calibration point information group, which converts coordinates from the installation surface reference coordinate system (xf-yf-zf) to the robotic arm reference coordinate system (xR-yR-zR). The matrix [Tx Ty Tz 1]t in formula (9) contains the coordinate W1f(Tx, Ty, Tz) of the tool center point WO1 relative to the installation surface reference coordinate system (xf-yf-zf), and the matrix [Px Py Pz 1]t contains the coordinate W1R(Px, Py, Pz) of the tool center point WO1 relative to the robotic arm reference coordinate system (xR-yR-zR) in space. Each calibration point information group yields three linear equations through formula (9). Therefore, n calibration point information groups yield 3n equations, and then the coordinates of the tool center point WO1 could be obtained through the pseudo-inverse matrix.


Furthermore, in formula (9), (e11i, e21i, e31i) represents the direction vector, relative to the robotic arm reference coordinate system (xR-yR-zR), of the first axis (for example, the axis xf) of the ith calibration point information group. (e12i, e22i, e32i) represents the direction vector of the second axis (for example, the axis yf) relative to the robotic arm reference coordinate system (xR-yR-zR). (e13i, e23i, e33i) represents the direction vector of the third axis (for example, the axis zf) relative to the robotic arm reference coordinate system (xR-yR-zR). The following formulas (10) and (11) could be obtained from formula (9).











\begin{bmatrix}
e_{11}^{1} & e_{12}^{1} & e_{13}^{1} & -1 & 0 & 0 \\
e_{21}^{1} & e_{22}^{1} & e_{23}^{1} & 0 & -1 & 0 \\
e_{31}^{1} & e_{32}^{1} & e_{33}^{1} & 0 & 0 & -1 \\
e_{11}^{2} & e_{12}^{2} & e_{13}^{2} & -1 & 0 & 0 \\
e_{21}^{2} & e_{22}^{2} & e_{23}^{2} & 0 & -1 & 0 \\
e_{31}^{2} & e_{32}^{2} & e_{33}^{2} & 0 & 0 & -1 \\
e_{11}^{3} & e_{12}^{3} & e_{13}^{3} & -1 & 0 & 0 \\
e_{21}^{3} & e_{22}^{3} & e_{23}^{3} & 0 & -1 & 0 \\
e_{31}^{3} & e_{32}^{3} & e_{33}^{3} & 0 & 0 & -1
\end{bmatrix}
\begin{bmatrix} T_x \\ T_y \\ T_z \\ P_x \\ P_y \\ P_z \end{bmatrix}
=
\begin{bmatrix}
-e_{14}^{1} \\ -e_{24}^{1} \\ -e_{34}^{1} \\
-e_{14}^{2} \\ -e_{24}^{2} \\ -e_{34}^{2} \\
-e_{14}^{3} \\ -e_{24}^{3} \\ -e_{34}^{3}
\end{bmatrix}
\qquad (10)








\begin{bmatrix} T_x \\ T_y \\ T_z \\ P_x \\ P_y \\ P_z \end{bmatrix}
=
\left(T_3^{t} T_3\right)^{-1} T_3^{t}
\begin{bmatrix}
-e_{14}^{1} \\ -e_{24}^{1} \\ -e_{34}^{1} \\
-e_{14}^{2} \\ -e_{24}^{2} \\ -e_{34}^{2} \\
-e_{14}^{3} \\ -e_{24}^{3} \\ -e_{34}^{3}
\end{bmatrix},
\qquad
T_3 =
\begin{bmatrix}
e_{11}^{1} & e_{12}^{1} & e_{13}^{1} & -1 & 0 & 0 \\
e_{21}^{1} & e_{22}^{1} & e_{23}^{1} & 0 & -1 & 0 \\
e_{31}^{1} & e_{32}^{1} & e_{33}^{1} & 0 & 0 & -1 \\
e_{11}^{2} & e_{12}^{2} & e_{13}^{2} & -1 & 0 & 0 \\
e_{21}^{2} & e_{22}^{2} & e_{23}^{2} & 0 & -1 & 0 \\
e_{31}^{2} & e_{32}^{2} & e_{33}^{2} & 0 & 0 & -1 \\
e_{11}^{3} & e_{12}^{3} & e_{13}^{3} & -1 & 0 & 0 \\
e_{21}^{3} & e_{22}^{3} & e_{23}^{3} & 0 & -1 & 0 \\
e_{31}^{3} & e_{32}^{3} & e_{33}^{3} & 0 & 0 & -1
\end{bmatrix}
\qquad (11)







In formula (11), T3t is the transpose matrix of T3, (T3tT3)−1 is the inverse matrix of (T3tT3), and (T3tT3)−1T3t is the pseudo-inverse matrix of T3, which yields the least-squares solution of the overdetermined system of formula (10). The coordinate (Tx, Ty, Tz) is the tool center point coordinate TP, and the matrix T3 is the calibration point information group matrix composed of the calibration point information groups.


If the number of the calibration point information groups is sufficient, the elements of the matrix T2i of each ith calibration point information group are substituted into formula (10) and rearranged into the matrix T3 of formula (11), so as to obtain the coordinate W1f (Tx, Ty, Tz) of the tool center point WO1 relative to the installation surface reference coordinate system (xf-yf-zf) and the coordinate W1R (Px, Py, Pz) of the tool center point WO1 relative to the robotic arm reference coordinate system (xR-yR-zR).
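
As a concrete illustration, the stacking of formula (10) and the pseudo-inverse solution of formula (11) could be sketched numerically as follows (Python with NumPy). This is a minimal sketch assuming the matrices T2i have already been obtained from the forward kinematics of the recorded calibration point information groups; np.linalg.lstsq computes the same least-squares solution as the pseudo-inverse.

    import numpy as np

    def solve_tcp(T2_list):
        """Build the 3n x 6 calibration point information group matrix T3 and
        the right-hand side of formula (10) from the 4x4 matrices T2i, then
        solve for (Tx, Ty, Tz, Px, Py, Pz) in the least-squares sense."""
        rows, rhs = [], []
        for T2 in T2_list:
            E, t = T2[:3, :3], T2[:3, 3]
            for j in range(3):
                row = np.zeros(6)
                row[:3] = E[j]         # e_j1, e_j2, e_j3 of this group
                row[3 + j] = -1.0      # coefficient of Px, Py or Pz
                rows.append(row)
                rhs.append(-t[j])      # -e_j4 of this group
        T3, b = np.asarray(rows), np.asarray(rhs)
        solution, *_ = np.linalg.lstsq(T3, b, rcond=None)
        return solution[:3], solution[3:]   # W1f(Tx,Ty,Tz), W1R(Px,Py,Pz)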


Of course, the above-mentioned tool center point calibration method is only an example, and each component and/or the calibration method of the robotic arm system 100 could be changed according to actual needs; such exemplification is not meant to be limiting.


After the tool center point coordinate TP is obtained, the controller 140 could accordingly drive the robotic arm 110 to move the tool center point WO1 to a desired position. As a result, the robotic arm system 100 could perform an automatic teaching process for the robotic arm, which is described below with reference to FIGS. 13 and 14A to 14B.


Referring to FIGS. 13 and 14A to 14B, FIG. 13 shows a flowchart of an automatic teaching method for the robotic arm system 100 according to an embodiment of the present disclosure, and FIG. 14A shows a schematic diagram of the robotic arm system 100 of FIG. 1 performing a first detection teaching process on the tool center point WO1 of the tool 10, and FIG. 14B shows a schematic diagram of the robotic arm system 100 of FIG. 1 performing a second detection teaching process on the tool center point WO1 of the tool 10.


In the following, the process of the robotic arm system 100 of FIG. 1 performing the first detection teaching process on the tool center point WO1 of the tool 10 is described using steps S210 to S260.


In step S210, as shown in FIG. 14A, the controller 140 uses the tool center point coordinate TP (Tx, Ty, Tz) to drive the tool 10 to the first position S1, so that, in the first position S1, the tool center point WO1 coincides with a designated point 31 of a detection surface 30 (plane xd-yd). In detail, since the controller 140 could obtain (calculate) the tool center point coordinate TP according to the calibration point information group matrix T3 (for example, through the above formulas (10) and (11)), it could control the tool center point WO1 of the tool 10 to move to the desired position, so that the tool center point WO1 coincides with the designated point 31 of the detection surface 30. The detection surface 30 is, for example, a display surface of a display, and the designated point 31 is, for example, any point on the detection surface 30, such as a center point of the detection surface 30.
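
The frame arithmetic behind step S210 is straightforward: if the designated point and the flange orientation are expressed in the robotic arm reference frame, the flange origin that makes the tool center point coincide with the designated point follows directly from the calibrated offset TP. A minimal sketch is given below (Python with NumPy; the function and argument names are illustrative, not the controller's actual interface).

    import numpy as np

    def flange_position_for_tcp(p_designated, R_flange, tcp_offset):
        """Flange origin (base frame) that places the tool center point at
        p_designated, given the flange orientation R_flange (3x3 rotation in
        the base frame) and the calibrated offset TP expressed in the
        installation surface frame."""
        return np.asarray(p_designated) - R_flange @ np.asarray(tcp_offset)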


In step S220, as shown in FIG. 14A, the controller 140 translates the tool 10 to a second position S2 by a translation distance LH. For example, the robotic arm system 100 further includes a moving element 150 which is slidably disposed on the robotic arm 110. The tool 10 and the light source 130 (not shown in FIG. 14A) could be disposed on the moving element 150 so that the tool 10 and the light source 130 translate with the moving element 150. In an embodiment, the controller 140 could control the moving element 150 to translate by the translation distance LH to synchronously drive the tool 10 to translate by the translation distance LH.


In step S230, as shown in FIG. 14A, a detection angle θH of the tool 10 is obtained according to the translation distance LH and a stroke difference ΔTZ1 of the tool center point WO1 of the tool 10 along the tool axis A1 (using the translation axis module of the moving element 150 in conjunction with the triangulation method), as shown in the following formula (13). The detection angle θH is, for example, the angle between the tool axis A1 and the axis xd when the tool axis A1 is projected on the xd-yd plane.





θH=π/2−tan−1TZ1/LH)  (13)
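
For instance, a minimal numeric check of formula (13) could look as follows (Python; the 10 mm stroke difference over a 100 mm translation is an illustrative value, not from the disclosure).

    import math

    def detection_angle_h(delta_tz1, lh):
        """Formula (13): detection angle from the stroke difference along the
        tool axis and the translation distance."""
        return math.pi / 2 - math.atan(delta_tz1 / lh)

    # A 10 mm stroke difference over a 100 mm translation gives about 84.3 degrees.
    print(math.degrees(detection_angle_h(10.0, 100.0)))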


In step S240, the controller 140 determines whether the detection angle θH meets a first specification angle. When the detection angle θH does not meet the first specification angle, the process proceeds to step S250, and the controller 140 drives the tool 10 back to the first position S1. For example, the controller 140 could control the moving element 150 to translate so as to drive the tool 10 back to the first position S1. The first specification angle is, for example, a specification value of the product, namely the specification value required when the tool 10 detects the detection surface 30 along the first detection direction. In detail, when the detection angle θH meets the first specification angle, the analysis result of the display by the tool 10 (for example, a luminance meter) does not exceed the acceptable range (for example, when the display screen of the display is viewed at a skewed angle of view, a black screen or abnormal color is not generated). The value of the first specification angle depends on the type of product, for example, the maximum viewing angle or the viewing angle of a flat-panel display; such exemplification is not meant to be limiting.


In step S260, when the tool 10 returns to the first position S1, the controller 140 adjusts the posture of the robotic arm 110 to change the angle of the tool axis A1 relative to the detection surface 30, and then the process returns to step S210. The controller 140 could repeat steps S210 to S260 until the detection angle θH meets the first specification angle. For example, if the detection angle θH does not meet the first specification angle, the controller 140 controls the robotic arm 110 to rotate the tool 10 by an angle around the second axis (for example, the axis yd), and then the process returns to step S210. Steps S210 to S260 are repeated in this manner until the detection angle θH meets the first specification angle.
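
Taken together, steps S210 to S260 form a simple closed loop. A schematic sketch of the first detection teaching process is given below (Python; every callable on the arm object is an assumed interface, and the tolerance-based comparison is only one possible reading of "meets the first specification angle").

    def teach_first_direction(arm, spec_angle, measure_angle, tol=1e-3):
        """Repeat steps S210-S260 until the detection angle meets the
        (first) specification angle."""
        while True:
            arm.move_tcp_to_designated_point()      # step S210
            arm.translate_tool()                    # step S220
            theta_h = measure_angle()               # step S230, formula (13)
            if abs(theta_h - spec_angle) <= tol:    # step S240
                return theta_h
            arm.move_tool_back_to_first_position()  # step S250
            arm.rotate_tool_about_second_axis()     # step S260, e.g. axis yd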


Similarly, as shown in FIG. 14B, the robotic arm system 100 could use a similar method (using the process in FIG. 13) to perform a second detection teaching process on the tool center point WO1 of the tool 10.


For example, in step S210, as shown in FIG. 14B, the controller 140 uses the tool center point coordinate TP (Tx, Ty, Tz) to drive the tool 10 to the first position S1, so that, in the first position S1, the tool center point WO1 coincides with the designated point 31 of the detection surface 30.


In step S220, as shown in FIG. 14B, the controller 140 translates the tool 10 to a second position S2 by a translation distance LV. In an embodiment, the controller 140 could control the moving element 150 to translate by the translation distance LV to synchronously drive the tool 10 to translate by the translation distance LV.


In step S230, as shown in FIG. 14B, a detection angle θV of the tool 10 is obtained according to the translation distance LV and a stroke difference ΔTZ2 of the tool center point WO1 of the tool 10 along the tool axis A1, as shown in the following formula (14). The detection angle θV is, for example, the angle between the tool axis A1 and the axis zd when the tool axis A1 is projected on the xd-yd plane.





θV=tan−1TZ2/LV)  (14)


In step S240, the controller 140 determines whether the detection angle θV meets a second specification angle. When the detection angle θV does not meet the second specification angle, the process proceeds to step S250, and the controller 140 drives the tool 10 back to the first position S1. For example, the controller 140 could control the moving element 150 to translate so as to drive the tool 10 back to the first position S1. The second specification angle is, for example, a specification value of the product, namely the specification value required when the tool 10 detects the detection surface 30 along the second detection direction. In detail, when the detection angle θV meets the second specification angle, the analysis result of the display by the tool 10 (for example, the luminance meter) does not exceed the acceptable range (for example, when the display screen of the display is viewed at a skewed angle of view, a black screen or abnormal color is not generated). The value of the second specification angle depends on the type of product, for example, the maximum viewing angle or the viewing angle of a flat-panel display; such exemplification is not meant to be limiting.


In step S260, when the tool 10 returns to the first position S1, the controller 140 adjusts the posture of the robotic arm 110 to change the angle of the tool axis A1 relative to the detection surface 30, and then the process returns to step S210. The controller 140 could repeat steps S210 to S260 until the detection angle θV meets the second specification angle. For example, if the detection angle θV does not meet the second specification angle, the controller 140 controls the robotic arm 110 to rotate the tool 10 by an angle around the first axis (for example, the axis xd), and steps S210 to S260 are repeated until the detection angle θV meets the second specification angle.


When the detection angle θH meets the first specification angle and the detection angle θV meets the second specification angle, the controller 140 records the joint coordinate combination or performs detection according to the current posture of the robotic arm 110. For example, the controller 140 records the change amount of the motion parameters of the joints J1 to J6 of the robotic arm 110 during the steps S210 to S260.
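
A compact way to express this final check is sketched below (Python; the tolerance is an assumption, since the disclosure does not state how closely the angles must match their specification values).

    def posture_meets_spec(theta_h, theta_v, spec_h, spec_v, tol=1e-3):
        """True when both detection angles meet their specification angles,
        i.e. when the joint coordinate combination may be recorded."""
        return abs(theta_h - spec_h) <= tol and abs(theta_v - spec_v) <= tol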


In summary, according to the robotic arm system and the calibration method for the tool center point in the embodiments of the present disclosure, the calibration of the tool center point and the automatic teaching of the robotic arm could be performed even without additional sensors and measuring devices (for example, three-dimensional measurement equipment).


It will be apparent to those skilled in the art that various modifications and variations could be made to the disclosed embodiments. It is intended that the specification and examples be considered as exemplary only, with a true scope of the disclosure being indicated by the following claims and their equivalents.

Claims
  • 1. A calibration method for tool center point, comprising: performing a step of establishing a first conversion relationship between a robotic arm reference coordinate system and a camera reference coordinate system, comprising: driving, by a robotic arm, a projection point of a tool axis of a tool projected on a test plane to perform a relative movement relative to a reference point of the test plane; and establishing the first conversion relationship according to the relative movement; obtaining a tool axis vector relative to an installation surface reference coordinate system of the robotic arm; performing a calibration point information group obtaining step, comprising: (a1) driving, by the robotic arm, a tool center point to coincide with the reference point of the test plane, and recording a calibration point information group of the robotic arm; (a2) driving, by the robotic arm, the tool to change an angle of the tool axis; and (a3) repeating steps (a1) and (a2) to obtain a plurality of the calibration point information groups; and obtaining a tool center point coordinate relative to the installation surface reference coordinate system according to the calibration point information groups.
  • 2. The calibration method according to claim 1, wherein the step of driving, by the robotic arm, the projection point of the tool axis of the tool projected on the test plane to perform the relative movement relative to the reference point of the test plane further comprises: driving, by the robotic arm, the tool to move by a space vector from the reference point along a plurality of axes of the robotic arm reference coordinate system; wherein the step of establishing the first conversion relationship further comprises: capturing, by a camera, an image of the projection point moving on the test plane; wherein the step of establishing the first conversion relationship further comprises: analyzing the image captured by the camera to obtain a value of a plane coordinate of each space vector; and establishing the first conversion relationship between the robotic arm reference coordinate system and the camera reference coordinate system according to mutually orthogonal characteristics of the space vectors.
  • 3. The calibration method according to claim 1, wherein the step of driving, by the robotic arm, the projection point of the tool axis of the tool projected on the test plane to perform the relative movement relative to the reference point of the test plane further comprises: driving, by the robotic arm, the tool to move by a first space vector from the reference point along a first axis of the robotic arm reference coordinate system; driving, by the robotic arm, the tool to move by a second space vector from the reference point along a second axis of the robotic arm reference coordinate system; driving, by the robotic arm, the tool to move by a third space vector from the reference point along a third axis of the robotic arm reference coordinate system; wherein the step of establishing the first conversion relationship further comprises: capturing, by a camera, an image of the projection point moving on the test plane; wherein the step of establishing the first conversion relationship further comprises: analyzing the image captured by the camera to obtain a value of a first plane coordinate of the first space vector; analyzing the image captured by the camera to obtain a value of a second plane coordinate of the second space vector; analyzing the image captured by the camera to obtain a value of a third plane coordinate of the third space vector; and establishing the first conversion relationship between the robotic arm reference coordinate system and the camera reference coordinate system according to mutually orthogonal characteristics of the first space vector, the second space vector and the third space vector.
  • 4. The calibration method according to claim 1, wherein the step of obtaining the tool axis vector comprises: performing offset correction to the tool axis relative to a first axis of the camera reference coordinate system, comprising: (b1) driving the tool to move along a third axis of the camera reference coordinate system; (b2) determining whether a position of the projection point on the test plane in the first axis of the camera reference coordinate system changes according to an image, captured by a camera, of the tool moving relative to the test plane; (b3) when the position of the projection point on the test plane in the first axis changes, driving the tool to rotate by an angle around a second axis of the camera reference coordinate system; and (b4) repeating steps (b1) to (b3) until a position change amount of the projection point of the test plane in the first axis of the camera reference coordinate system is substantially equal to zero.
  • 5. The calibration method according to claim 4, wherein the step of obtaining the tool axis vector comprises: performing offset correction to the tool axis relative to the second axis of the camera reference coordinate system when the projection point of the test plane in the first axis of the camera reference coordinate system is substantially equal to zero, comprising: (c1) driving the tool to move along a third axis of the camera reference coordinate system; (c2) determining whether a position of the projection point on the test plane in the second axis of the camera reference coordinate system changes according to an image, captured by the camera, of the tool moving relative to the test plane; (c3) when the position of the projection point on the test plane in the second axis changes, driving the tool to rotate by an angle around the first axis of the camera reference coordinate system; and (c4) repeating steps (c1) to (c3) until a position change amount of the projection point of the test plane in the second axis of the camera reference coordinate system is substantially equal to zero.
  • 6. The calibration method according to claim 1, wherein the step of obtaining the tool axis vector relative to the installation surface reference coordinate system of the robotic arm comprises: driving the tool axis of the tool to be perpendicular to the test plane; and obtaining the tool axis vector according to a posture of the robotic arm when the tool axis is perpendicular to the test plane.
  • 7. The calibration method according to claim 1, wherein the step of obtaining the tool center point coordinate comprises: adjusting an angle of a light source so that a first light emitted by the tool and a second light emitted by the light source intersect at the tool center point; obtaining a plurality of calibration point information groups where the tool center point coincides with the reference point under a plurality of different postures of the robotic arm; driving the tool to move along the tool axis vector; establishing a calibration point information group matrix according to the calibration point information groups; and obtaining the tool center point coordinate according to the calibration point information group matrix.
  • 8. A teaching method for robotic arm, comprising: (d1) by using the calibration method as claimed in claim 1, obtaining the tool center point coordinate and driving the tool to a first position, so that the tool center point coincides with a designated point of a detection surface at the first position; (d2) translating the tool by a translation distance to a second position; (d3) obtaining a detection angle of the tool according to the translation distance and a stroke difference of the tool center point of the tool along the tool axis; (d4) determining whether the detection angle meets a specification angle; (d5) driving the tool back to the first position when the detection angle does not meet the specification angle; and (d6) adjusting a posture of the robotic arm to perform steps (d2) to (d6) until the detection angle meets the specification angle.
  • 9. A robotic arm system, comprising: a robotic arm configured to carry a tool, wherein the tool has a tool axis; a controller configured to: control the robotic arm to drive a projection point of the tool axis of the tool projected on a test plane to perform a relative movement relative to a reference point of the test plane; establish a first conversion relationship between a robotic arm reference coordinate system of the robotic arm and a camera reference coordinate system according to the relative movement; obtain a tool axis vector relative to an installation surface reference coordinate system of the robotic arm; perform a calibration point information group obtaining step, comprising: (a1) controlling the robotic arm to drive a tool center point to coincide with the reference point of the test plane and recording a calibration point information group of the robotic arm; (a2) controlling the robotic arm to drive the tool to change an angle of the tool axis; and (a3) repeating steps (a1) and (a2) to obtain a plurality of the calibration point information groups; and obtain a tool center point coordinate relative to the installation surface reference coordinate system according to the calibration point information groups.
  • 10. The robotic arm system according to claim 9, further comprising: a camera configured to capture an image of the projection point moving on the test plane; wherein the controller is further configured to: control the robotic arm to drive the tool to move by a space vector from the reference point along a plurality of axes of the robotic arm reference coordinate system; analyze the image captured by the camera to obtain a value of a plane coordinate of each space vector; and establish the first conversion relationship between the robotic arm reference coordinate system and the camera reference coordinate system according to mutually orthogonal characteristics of the space vectors.
  • 11. The robotic arm system according to claim 9, further comprising: a camera configured to capture an image of the projection point moving on the test plane; wherein the controller is further configured to: control the robotic arm to drive the tool to move by a first space vector from the reference point along a first axis of the robotic arm reference coordinate system; control the robotic arm to drive the tool to move by a second space vector from the reference point along a second axis of the robotic arm reference coordinate system; control the robotic arm to drive the tool to move by a third space vector from the reference point along a third axis of the robotic arm reference coordinate system; analyze the image captured by the camera to obtain a value of a first plane coordinate of the first space vector; analyze the image captured by the camera to obtain a value of a second plane coordinate of the second space vector; analyze the image captured by the camera to obtain a value of a third plane coordinate of the third space vector; and establish the first conversion relationship between the robotic arm reference coordinate system and the camera reference coordinate system according to mutually orthogonal characteristics of the first space vector, the second space vector and the third space vector.
  • 12. The robotic arm system according to claim 9, further comprising: a camera configured to capture an image of the projection point moving on the test plane; wherein the controller is further configured to perform offset correction to the tool axis relative to a first axis of the camera reference coordinate system, comprising: (b1) driving the tool to move along a third axis of the camera reference coordinate system; (b2) determining whether a position of the projection point on the test plane in the first axis of the camera reference coordinate system changes according to an image, captured by a camera, of the tool moving relative to the test plane; (b3) when the position of the projection point on the test plane in the first axis changes, driving the tool to rotate by an angle around a second axis of the camera reference coordinate system; and (b4) repeating steps (b1) to (b3) until a position change amount of the projection point of the test plane in the first axis of the camera reference coordinate system is substantially equal to zero.
  • 13. The robotic arm system according to claim 12, wherein the controller is further configured to perform offset correction to the tool axis relative to the second axis of the camera reference coordinate system when the projection point of the test plane in the first axis of the camera reference coordinate system is substantially equal to zero, comprising: (c1) driving the tool to move along a third axis of the camera reference coordinate system; (c2) determining whether a position of the projection point on the test plane in the second axis of the camera reference coordinate system changes according to an image, captured by the camera, of the tool moving relative to the test plane; (c3) when the position of the projection point on the test plane in the second axis changes, driving the tool to rotate by an angle around the first axis of the camera reference coordinate system; and (c4) repeating steps (c1) to (c3) until a position change amount of the projection point of the test plane in the second axis of the camera reference coordinate system is substantially equal to zero.
  • 14. The robotic arm system according to claim 9, wherein the controller is further configured to: drive the tool axis of the tool to be perpendicular to the test plane; and obtain the tool axis vector of the tool relative to the installation surface reference coordinate system of the robotic arm according to a posture of the robotic arm when the tool axis is perpendicular to the test plane.
  • 15. The robotic arm system according to claim 9, wherein a first light emitted by the tool and a second light emitted by a light source intersect at the tool center point, and the controller is further configured to: obtain a plurality of calibration point information groups where the tool center point coincides with the reference point under a plurality of different postures of the robotic arm; drive the tool to move along the tool axis vector; establish a calibration point information group matrix according to the calibration point information groups; and obtain the tool center point coordinate according to the calibration point information group matrix.
  • 16. The robotic arm system according to claim 9, wherein the controller is further configured to: (d1) drive the tool to a first position, so that the tool center point coincides with a designated point of a detection surface at the first position; (d2) translate the tool by a translation distance to a second position; (d3) obtain a detection angle of the tool according to the translation distance and a stroke difference of the tool center point of the tool along the tool axis; (d4) determine whether the detection angle meets a specification angle; (d5) drive the tool back to the first position when the detection angle does not meet the specification angle; and (d6) adjust a posture of the robotic arm to perform steps (d2) to (d6) until the detection angle meets the specification angle.
Priority Claims (1)
Number Date Country Kind
109129784 Aug 2020 TW national