This application claims the benefit of Taiwan application Serial No. 109129784, filed Aug. 31, 2020, the subject matter of which is incorporated herein by reference.
The disclosure relates in general to a calibration method, a teaching method for a robotic arm and a robotic arm system using the same, and more particularly to a calibration method for a tool center point, a teaching method for a robotic arm and a robotic arm system using the same.
Along with the advancement of science and technology, the application of robotic arms has become increasingly broad in various industries. Generally speaking, robotic arms are jointed-type robotic arms with multiple joints, and one end of the arm is provided with a tool, such as a welding tool or a drilling tool, to perform various operations. Before the robotic arm performs operations, the position of the Tool Center Point (TCP) of the tool needs to be accurately calibrated in advance, so that the controller of the robotic arm could control the tool to run along a calibration path according to the TCP of the tool. However, the TCP calibration technology of the robotic arm of the prior art has many disadvantages that need to be improved. For example, according to the TCP calibration technology of the robotic arm of the prior art, the user may need to manually operate the robotic arm to calibrate the TCP of the robotic arm. Therefore, the procedure is prone to human error, and the TCP cannot be accurately calibrated. As a result, calibration accuracy is low, and labor cost and time cost are high. In addition, the current calibration method for the TCP cannot be applied to a virtual TCP.
According to an embodiment, a calibration method for a tool center point is provided. The calibration method includes the following steps. A step of establishing a first conversion relationship between a robotic arm reference coordinate system and a camera reference coordinate system is performed, including: (1) driving, by a robotic arm, a projection point of a tool axis of a tool projected on a test plane to perform a relative movement relative to a reference point of the test plane; and (2) establishing the first conversion relationship according to the relative movement. A tool axis vector relative to an installation surface reference coordinate system of the robotic arm is obtained. A calibration point information group obtaining step is performed, including: (a1) driving, by the robotic arm, a tool center point to coincide with the reference point of the test plane, and recording a calibration point information group of the robotic arm; (a2) driving, by the robotic arm, the tool to change an angle of the tool axis; and (a3) repeating steps (a1) and (a2) to obtain a plurality of the calibration point information groups. A tool center point coordinate relative to the installation surface reference coordinate system is obtained according to the calibration point information groups.
According to another embodiment, a teaching method for a robotic arm is provided. The teaching method includes the following steps: (d1) by using the calibration method as described above, the tool center point coordinate is obtained and the tool is driven to a first position, so that the tool center point coincides with a designated point of a detection surface at the first position; (d2) the tool is translated by a translation distance to a second position; (d3) a detection angle of the tool is obtained according to the translation distance and a stroke difference of the tool center point of the tool along the tool axis; (d4) whether the detection angle meets a specification angle is determined; (d5) the tool is driven back to the first position when the detection angle does not meet the specification angle; and (d6) a posture of the robotic arm is adjusted, and steps (d2) to (d6) are performed until the detection angle meets the specification angle.
According to an alternative embodiment, a robotic arm system includes a robotic arm and a controller. The robotic arm is configured to carry a tool, wherein the tool has a tool axis. The controller is configured to control the robotic arm to drive a projection point of a tool axis of a tool projected on a test plane to perform a relative movement relative to a reference point of the test plane; establish a first conversion relationship between a robotic arm reference coordinate system of the robotic arm and a camera reference coordinate system according to the relative movement; obtain a tool axis vector relative to an installation surface reference coordinate system of the robotic arm; perform a calibration point information group obtaining step, comprising: (a1) controlling the robotic arm to drive a tool center point to coincide with the reference point of the test plane and recording a calibration point information group of the robotic arm; (a2) controlling the robotic arm to drive the tool to change an angle of the tool axis; and (a3) repeating steps (a1) and (a2) to obtain a plurality of the calibration point information groups; and obtain a tool center point coordinate relative to the installation surface reference coordinate system according to the calibration point information groups.
The above and other aspects of the disclosure will become better understood with regard to the following detailed description of the preferred but non-limiting embodiment(s). The following description is made with reference to the accompanying drawings.
In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the disclosed embodiments. It will be apparent, however, that one or more embodiments may be practiced without these specific details. In other instances, well-known structures and devices are schematically shown in order to simplify the drawing.
Referring to
The tool 10 is shown with a luminance meter as an example. In another embodiment, the tool 10 is, for example, a machining tool.
In the present embodiment, the test plane 20 is, for example, the surface of a physical screen. The physical screen is, for example, a transparent screen or an opaque screen. In the case of the opaque screen, the test plane 20 of the physical screen is, for example, white. However, as long as the first light L1 emitted by the tool 10 and the second light L2 emitted by the light source 130 could be clearly displayed (the second light L2 is shown in
Referring to
In step S110, the robotic arm system 100 executes the step of establishing a first conversion relationship T1 between the robotic arm reference coordinate system (xR-yR-zR) of the robotic arm 110 and the camera reference coordinate system (xC-yC-zC) of the camera 120. The step S110 includes sub-steps S111 to S117. The step of establishing the first conversion relationship T1 includes the following steps: the robotic arm 110 drives the projection point P1 of the tool axis A1 of the tool 10 projected on the test plane 20 to perform a relative movement relative to the reference point O1 of the test plane 20; then, the controller 140 establishes the first conversion relationship T1 between the robotic arm reference coordinate system (xR-yR-zR) of the robotic arm 110 and the camera reference coordinate system (xC-yC-zC) according to the relative movement.
For example, referring to
In step S111, as shown in
In step S111, the controller 140 could analyze the image M1 captured by the camera 120, and, as shown in
In step S112, the controller 140 could analyze the image M1 captured by the camera 120. As shown in
In step S113, the controller 140 controls the tool 10 to move by a second space vector {right arrow over (V1)}(x2,y2,z2) from the reference point O1 along a second axis (for example, the yR axis) of the robotic arm reference coordinate system (xR-yR-zR). The value (or length) of the second space vector {right arrow over (V1)}(x2,y2,z2) is LR, and an end point of the second space vector {right arrow over (V1)}(x2,y2,z2) is the projection point Py(x2, y2, z2) in
Similarly, in step S113, the controller 140 could analyze the image M1 captured by the camera 120 and determine whether the projection point P1′ in the image M1 corresponds to (or is located/coincident with) the reference point O1 in the image M1. When the projection point P1′ does not correspond to the reference point O1 in the image M1, the robotic arm 110 is controlled to move until the projection point P1′ corresponds to the reference point O1 in the image M1. When the projection point P1′ corresponds to the reference point O1 in the image M1, the controller 140 controls the robotic arm 110 to move so that the projection point P1′ moves by the second space vector {right arrow over (V1)}(x2,y2,z2) from the reference point O1 along the second axis (for example, the yR axis) of the robotic arm reference coordinate system (xR-yR-zR). During the moving, the controller 140 analyzes the image M1 captured by the camera 120 and determines whether the projection point P1′ in the image M1 has moved by the second space vector {right arrow over (V1)}(x2, y2, z2).
In step S114, the controller 140 could analyze the image captured by the camera 120 to obtain the value of the first plane coordinate P′y(x2, y2) of the projection point P′y of the second space vector {right arrow over (V1)}(x2, y2, z2), that is, the first axis coordinate value x2 and the second axis coordinate value y2.
In step S115, the controller 140 controls the tool 10 to move by a third space vector {right arrow over (W1)}(x3,y3,z3) from the reference point O1 along a third axis (for example, the zR axis) of the robotic arm reference coordinate system (xR-yR-zR). The value (or length) of the third space vector {right arrow over (W1)}(x3,y3,z3) is LR, and an end point of the third space vector {right arrow over (W1)}(x3, y3, z3) is the projection point Pz(x3, y3, z3) in
Similarly, in step S115, the controller 140 could analyze the image M1 captured by the camera 120 and determine whether the projection point P1′ in the image M1 corresponds to (or is located/coincident with) the reference point O1 in the image M1. When the projection point P1′ does not correspond to the reference point O1 in the image M1, the robotic arm 110 is controlled to move until the projection point P1′ corresponds to the reference point O1 in the image M1. When the projection point P1′ corresponds to the reference point O1 in the image M1, the controller 140 controls the robotic arm 110 to move so that the projection point P1′ moves by the third space vector {right arrow over (W1)}(x3,y3,z3) from the reference point O1 along the third axis (for example, the zR axis) of the robotic arm reference coordinate system (xR-yR-zR). During the moving, the controller 140 analyzes the image M1 captured by the camera 120 and determines whether the projection point P1′ in the image M1 has moved by the third space vector {right arrow over (W1)}(x3,y3,z3).
In step S116, the controller 140 could analyze the image captured by the camera 120 to obtain the value of the first plane coordinate P′z(x3, y3) of the projection point P′z of the third space vector {right arrow over (W1)}(x3,y3,z3), that is, the first axis coordinate value x3 and the second axis coordinate value y3.
In step S117, the controller 140 establishes the first conversion relationship T1 between the camera reference coordinate system (xC-yC-zC) and the robotic arm reference coordinate system (xR-yR-zR) according to the mutually orthogonal characteristics of the first space vector {right arrow over (U1)}(x1,y1,z1), the second space vector {right arrow over (V1)}(x2,y2,z2) and the third space vector {right arrow over (W1)}(x3,y3,z3). For example, the controller 140 could use the following equations (1) to (3) to obtain the third axis coordinate values z1, z2 and z3. As a result, the controller 140 obtains x1, x2, x3, y1, y2, y3, z1, z2 and z3. Then, the controller 140 establishes the first conversion relationship T1 according to the following formula (4).
As shown in formula (5), the controller 140 could use the first conversion relationship T1 to convert the projection point movement vector SW into the robotic arm movement vector SR, wherein the projection point movement vector SW is the movement vector of the projection point P1 on the test plane 20 relative to the camera reference coordinate system (xC-yC-zC), and the robotic arm movement vector SR is the movement vector of the robotic arm 110 relative to the robotic arm reference coordinate system (xR-yR-zR). The robotic arm reference coordinate system (xR-yR-zR) could be established at any position of the robotic arm 110, for example, at the base 111 of the robotic arm 110. Equations (1), (2) and (3) represent that the space vectors {right arrow over (U1)}, {right arrow over (V1)} and {right arrow over (W1)} are orthogonal to each other. The first conversion relationship T1 in formula (4) is the inverse of the matrix formed by the space vectors {right arrow over (U1)}, {right arrow over (V1)} and {right arrow over (W1)}, each divided by its length (that is, normalized to a unit vector). Formula (5) represents that the product of the first conversion relationship T1 and the projection point movement vector SW equals the robotic arm movement vector SR.
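For illustration only, the following is a minimal numerical sketch of steps S111 to S117 and formulas (1) to (5). The plane coordinate values and helper names are assumptions introduced here for the example, not part of the disclosure; in particular, the sign of the recovered z components depends on the physical setup.

```python
import numpy as np

# Assumed measured plane coordinates of the three projected end points
# (steps S112, S114, S116); illustrative values only.
p1 = (4.0, 1.0)    # (x1, y1) for U1
p2 = (-1.0, 3.0)   # (x2, y2) for V1
p3 = (1.0, 1.0)    # (x3, y3) for W1

def recover_z(p1, p2, p3):
    """Equations (1)-(3): orthogonality of U1, V1, W1 gives
    x1*x2 + y1*y2 + z1*z2 = 0 and the two cyclic permutations."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a = -(x1 * x2 + y1 * y2)   # equals z1*z2
    b = -(x1 * x3 + y1 * y3)   # equals z1*z3
    c = -(x2 * x3 + y2 * y3)   # equals z2*z3 (assumed nonzero here)
    z1 = np.sqrt(a * b / c)    # sign ambiguity resolved by the setup
    return z1, a / z1, b / z1

z1, z2, z3 = recover_z(p1, p2, p3)
U = np.array([p1[0], p1[1], z1])
V = np.array([p2[0], p2[1], z2])
W = np.array([p3[0], p3[1], z3])

# Formula (4): T1 is the inverse of the matrix of the unit vectors.
M = np.column_stack([U / np.linalg.norm(U),
                     V / np.linalg.norm(V),
                     W / np.linalg.norm(W)])
T1 = np.linalg.inv(M)

# Formula (5): convert a projection point movement vector SW (camera frame)
# into the robotic arm movement vector SR.
SW = np.array([1.0, 2.0, 0.0])
SR = T1 @ SW
```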
Then, in step S120, the robotic arm system 100 obtains the tool axis vector Tez of the tool 10 relative to the installation surface reference coordinate system (xf-yf-zf).
For example, referring to
In step S121, as shown in
In step S122, the controller 140 obtains the robotic arm movement vector SR according to the first conversion relationship T1 and the projection point movement vector SW. For example, the controller 140 could substitute the projection point movement vector SW into the above formula (5) to obtain (or calculate) the robotic arm movement vector SR required for the robotic arm 110 to move the projection point P1 to approach or coincide with the reference point O1. The purpose of steps S122 and S123 is to prevent the projection point P1′ from leaving the test plane 20 after the robotic arm 110 moves or rotates.
In step S123, as shown in
Because the projection point P1 approaches the reference point O1 in step S123, the moved projection point P1′ in the subsequent step S124A (the moved projection point P1′ is shown in
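For illustration, a minimal sketch of the image-guided correction of steps S122 and S123 is given below: the offset between the projection point and the reference point is measured in the image, converted through formula (5), and the arm is moved until the two points coincide. The functions find_projection_point, find_reference_point and move_arm are hypothetical stand-ins for the controller's image-analysis and motion interfaces, and the tolerance value is an assumption.

```python
import numpy as np

def servo_to_reference(find_projection_point, find_reference_point, move_arm,
                       T1, tol=1.0, max_iter=200):
    """Move the arm until the projection point P1 coincides with the
    reference point O1 on the test plane, using the conversion T1."""
    for _ in range(max_iter):
        p = np.asarray(find_projection_point())  # P1 in the image M1
        o = np.asarray(find_reference_point())   # O1 in the image M1
        error = o - p                            # image-plane offset
        if np.linalg.norm(error) < tol:
            return True                          # P1 corresponds to O1
        # Convert the image-plane correction into a robot-frame move (formula (5)).
        SW = np.array([error[0], error[1], 0.0])
        move_arm(T1 @ SW)
    return False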
Then, in step S124, the controller 140 could perform the offset correction to the tool axis A1 of the tool 10 relative to the first axis (for example, the axis xC). Steps S124A to S124C are further described below.
In step S124A, as shown in
In step S124B, the controller 140 determines whether the position of the projection point P1 on the test plane 20 in the first axis (for example, the axis xC) changes according to the image captured by the camera 120. If so (for example, in a translation test of the first axis xC, the position of the projection point P1 of
In step S124C, as shown in
In detail, in step S124A, the robotic arm 110 drives the tool 10 to move or translate along the axis +/−zC of the camera reference coordinate system (xC-yC-zC). As shown in
In another embodiment, as shown in
The controller 140 repeats steps S124A to S124C until the tool axis A1 of the tool 10 projected on the plane xC-yC of the camera reference coordinate system (xC-yC-zC) (for example, the angle viewed in
In steps S125A to S125C, the controller 140 and the robotic arm 110 could use a process similar to steps S124A to S124C to complete the offset correction to the axis yC. Hereinafter, further examples are shown in
In step S125A, as shown in
In step S125B, the controller 140 determines whether the position of the projection point P1 on the test plane 20 in the second axis (for example, the axis yC) changes according to the image M1 captured by the camera 120. If so (for example, the position of the projection point P1 of
In step S125C, as shown in
In detail, in step S125A, the robotic arm 110 drives the tool 10 to move or translate along the axis +/−zC of the camera reference coordinate system (xC-yC-zC). As shown in
The controller 140 repeats steps S125A to S125C until the tool axis A1 of the tool 10 projected on the plane yC-zC of the camera reference coordinate system (xC-yC-zC) (for example, the angle viewed in
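The iterative loops of steps S124A to S124C and S125A to S125C can be summarized in a minimal sketch, given below under stated assumptions: translate_along_zc, measure_projection and rotate_tool are hypothetical controller/camera interfaces, the step sizes and tolerance are illustrative, and the sign convention of the corrective rotation depends on the actual setup.

```python
import numpy as np

def correct_axis_offset(translate_along_zc, measure_projection, rotate_tool,
                        axis_index, tol=0.5, step_deg=0.2, max_iter=100):
    """Steps S124A-S124C (axis_index=0 for the axis xC) or S125A-S125C
    (axis_index=1 for the axis yC): translate along +/-zC, check whether
    the projection point P1 drifts on the watched axis, and rotate the
    tool until it no longer drifts."""
    for _ in range(max_iter):
        before = measure_projection()      # projection point before the test move
        translate_along_zc(+10.0)          # small translation along +zC (assumed units)
        after = measure_projection()
        drift = after[axis_index] - before[axis_index]
        translate_along_zc(-10.0)          # return for the next trial
        if abs(drift) < tol:               # no drift: no offset on this axis
            return True
        # Rotate about the orthogonal axis against the drift (S124C / S125C);
        # the sign convention here is an assumption.
        rotate_tool(axis_index, -np.sign(drift) * step_deg)
    return False
```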
In step S126, after the offset correction to the first axis and the second axis is completed (this means that the tool axis A1 is perpendicular to the test plane 20; the purpose of steps S124 and S125 is to correct the offset along the axis xC and the axis yC), the controller 140 establishes a second conversion relationship T2 according to the posture of the robotic arm 110 when the tool axis A1 is perpendicular to the test plane 20, and obtains the tool axis vector Tez according to the second conversion relationship T2, wherein the tool axis vector Tez is, for example, parallel to or coincides with the tool axis A1. For example, the controller 140 establishes the second conversion relationship T2 according to the joint angles of the joints J1 to J6 of the robotic arm 110 when the tool axis A1 is perpendicular to the test plane 20. The second conversion relationship T2 is the conversion relationship of the installation surface (or flange surface) reference coordinate system (xf-yf-zf) of an installation surface 110s of the tool 10 relative to the robotic arm reference coordinate system (xR-yR-zR). The tool 10 could be installed on the installation surface 110s, and the tool axis A1 of the tool 10 is not limited to being perpendicular to the installation surface 110s. In an embodiment, the second conversion relationship T2 could be expressed in the following formula (6), and the elements in formula (6) could be obtained from the linkage parameters (Denavit-Hartenberg parameters) of the robotic arm 110, the coordinates of the joints J1 to J6 and the tool center point WO1 relative to the installation surface reference coordinate system (xf-yf-zf), wherein the link parameters could include link offset, joint angle, link length and link twist. In addition, the second conversion relationship T2 could be established by using a known kinematics method.
As shown in the following formula (7), the vector zw is the normal vector (i.e., the axis zC) of the test plane 20 relative to the robotic arm reference coordinate system (xR-yR-zR), and the vector Tez is the vector (herein referred to as the “tool axis vector”) of the tool axis A1 relative to the installation surface reference coordinate system (xf-yf-zf). The controller 140 could convert the vector zw into the tool axis vector Tez through the inverse matrix of the second conversion relationship T2.
Tez=T2−1·zw (7)
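A minimal sketch of formula (7) follows, assuming that T2 is available as a 4×4 homogeneous transform (for example, from forward kinematics with the Denavit-Hartenberg parameters and recorded joint angles) and that zw is a pure direction, so only the rotation part of T2 is needed. The numerical posture below is an assumption for illustration.

```python
import numpy as np

def tool_axis_vector(T2, zw):
    """Formula (7): Tez = T2^-1 * zw. T2 relates the installation surface
    (flange) frame and the robotic arm base frame; zw is the unit normal
    of the test plane 20 (the zC direction) in the base frame."""
    R2 = T2[:3, :3]
    # For a direction vector, the inverse of the rotation is its transpose.
    return R2.T @ zw

# Illustrative (assumed) posture: flange rotated 30 degrees about the base x-axis.
c, s = np.cos(np.pi / 6), np.sin(np.pi / 6)
T2 = np.array([[1.0, 0.0, 0.0, 0.3],
               [0.0,   c,  -s, 0.1],
               [0.0,   s,   c, 0.8],
               [0.0, 0.0, 0.0, 1.0]])
zw = np.array([0.0, 0.0, 1.0])
Tez = tool_axis_vector(T2, zw)  # tool axis direction in the flange frame
```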
In step S130, the robotic arm system 100 executes the step of obtaining calibration point information groups. Further examples are given below.
In step S131, referring to
In the present embodiment, the tool 10 is shown by taking the luminance meter as an example, and the tool center point WO1 is a virtual tool center point. The tool center point WO1 is, for example, the focus of the first light L1 (detection light) projected by the tool 10. In another embodiment, the tool 10 is, for example, a machining tool, and the tool center point WO1 is a physical tool center point, such as a solid tool tip point. In summary, the tool center point of the embodiment of the present disclosure could be a physical tool center point or a virtual tool center point.
In one of the methods for adjusting the angle of the light source 130, the controller 140 could obtain, according to the following formula (8), the angle θ between the second light L2 emitted by the light source 130 and the first light L1 emitted by the tool 10, measured along the direction (the dotted line from the rotation fulcrum 131 to the tool axis A1) perpendicular to the tool axis A1. Then, the angle of the light source 130 could be adjusted to the angle θ manually or by a mechanism (not shown) controlled by the controller 140, so that the second light L2 emitted by the light source 130 and the first light L1 emitted by the tool 10 intersect at the tool center point WO1. The aforementioned mechanism is, for example, any of various mechanisms that could drive the light source 130 to rotate, such as a linkage mechanism or a gear set mechanism. Because the angle θ is known, the angle of the light source 130 could be quickly adjusted so that the second light L2 and the first light L1 emitted by the tool 10 intersect at the tool center point WO1. When the angle of the light source 130 is adjusted to the angle θ, the relative relationship between the light source 130 and the tool 10 could be fixed, which fixes the relative relationship between the tool center point WO1 and the tool 10.
In formula (8), H1 is the distance (for example, the focal length of the first light L1) between the tool center point WO1 and a light emitting surface 10s of the tool 10 along the tool axis A1, and H2 is the distance between the light emitting surface 10s of the tool 10 and the rotation fulcrum 131 of the light source 130 along the tool axis A1, and H3 is the vertical distance (perpendicular to the tool axis A1) between the rotation fulcrum 131 of the light source 130 and the tool axis A1.
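Formula (8) itself is not reproduced in this text. Under one consistent reading of the geometry described above, the second light L2 travels from the rotation fulcrum 131 to the tool center point WO1, which lies H1 + H2 away from the fulcrum along the tool axis and H3 away perpendicular to it; measured from the perpendicular dotted line, this would give tan θ = (H1 + H2)/H3. The following sketch encodes that assumed reconstruction, with assumed distances; it is not the patent's verbatim formula.

```python
import math

def light_source_angle(H1, H2, H3):
    """Hypothetical reconstruction of formula (8): angle of the second light
    L2, measured from the perpendicular dotted line, needed so that L2 passes
    through the tool center point WO1."""
    return math.degrees(math.atan2(H1 + H2, H3))

theta = light_source_angle(H1=50.0, H2=20.0, H3=30.0)  # assumed distances in mm
```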
As shown in
In step S132, the controller 140 executes the step of obtaining the calibration point information group. For example, the controller 140 could control the robotic arm 110 so that the tool center point WO1 coincides with the reference point O1 of the test plane 20 at a plurality of calibration points under a plurality of different postures, and accordingly record the calibration point information group of each calibration point. For example, the controller 140 could control the robotic arm 110 in one posture to make the tool center point WO1 coincide with the reference point O1 of the test plane 20, and accordingly record the calibration point information group in that posture. Then, the controller 140 controls the robotic arm 110 in another posture to make the tool center point WO1 coincide with the reference point O1 of the test plane 20, and accordingly records the calibration point information group in that posture. According to this principle, the controller 140 could obtain several calibration point information groups of the robotic arm 110 in several different postures. Each calibration point information group could include the coordinates of the joints J1 to J6, and the coordinate of each joint could be the rotation angle of the joint relative to its preset starting point. At least one of the rotation angles of the robotic arm 110 in different postures could be different.
For example, referring to
In step S132A, as shown in
In step S132B, as shown in
In step S132C, as shown in
In step S132D, the controller 140 obtains the robotic arm movement vector SR according to the first conversion relationship T1 and the projection point movement vector SW. For example, the controller 140 could substitute the projection point movement vector SW of
In step S132E, referring to
In step S132F, the controller 140 determines whether the tool center point WO1 coincides with the reference point O1 of the test plane 20 according to (or analyzes) the image (for example, the image M1 shown in
Furthermore, if the tool axis A1 in
In step S132G, the controller 140 records the joint angles of the joints J1 to J6 of the robotic arm 110 in the state where the tool center point WO1 coincides with the reference point O1 of the test plane 20, and uses them as one calibration point information group.
In step S132H, the controller 140 determines whether the number of the calibration point information groups has reached a predetermined number, for example, at least three groups, although more could be used. When the number of the calibration point information groups has reached the predetermined number, the process proceeds to step S133; when the number of the calibration point information groups has not reached the predetermined number, the process proceeds to step S132I.
In step S132I, the controller 140 controls the robotic arm 110 to change the posture of the tool 10. For example, the controller 140 controls the robotic arm 110 to change the angle of the tool axis A1 of the tool 10 relative to at least one of the axis xC, the axis yC and the axis zC, wherein the changed angle is, for example, 30 degrees, 60 degrees or another arbitrary angle. For example, the controller 140 could generate Euler angle increments ΔRx, ΔRy and ΔRz through a random number generator to correct the azimuth angle (Euler angle) of the robotic arm 110, thereby changing the posture of the robotic arm 110. At this time, the azimuth angle of the robotic arm 110 could be expressed as (Rx+ΔRx, Ry+ΔRy, Rz+ΔRz), wherein (Rx, Ry, Rz) is the original azimuth angle of the robotic arm 110, Rx represents the yaw angle, Ry represents the pitch angle, and Rz represents the roll angle. If the corrected azimuth angle exceeds the motion range of the robotic arm 110, the controller 140 could regenerate the Euler angle increments through the random number generator.
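A minimal sketch of this posture change is given below. The increment range and the in_motion_range callback are assumptions standing in for the controller's real reachability check; only the regenerate-on-failure behavior is taken from the description above.

```python
import random

def next_posture(Rx, Ry, Rz, max_step_deg=60.0, in_motion_range=lambda *a: True):
    """Step S132I: perturb the azimuth (Euler) angles with random increments;
    regenerate whenever the corrected posture would leave the motion range."""
    while True:
        dRx, dRy, dRz = (random.uniform(-max_step_deg, max_step_deg)
                         for _ in range(3))
        candidate = (Rx + dRx, Ry + dRy, Rz + dRz)
        if in_motion_range(*candidate):
            return candidate  # (yaw, pitch, roll) of the new posture
```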
Then, the process returns to step S132A to record the calibration point information group of the robotic arm 110 in the new (different) posture of the tool 10. Furthermore, after the controller 140 controls the robotic arm 110 to change the posture of the tool 10, the tool center point WO1 of the tool 10 may deviate from the test plane 20. Therefore, the process returns to step S132A to make the tool center point WO1 coincide with the reference point O1 again, and, in the state where the tool center point WO1 coincides with the reference point O1, another calibration point information group under a different posture of the robotic arm 110 is recorded. Steps S132A to S132I are repeated until the number of the calibration point information groups recorded by the controller 140 reaches the predetermined number.
In step S133, when the number of the calibration point information groups recorded by the controller 140 reaches the predetermined number, the controller 140 obtains the tool center point coordinate TP of the tool center point WO1 relative to the installation surface reference coordinate system (xf-yf-zf).
As shown in the following formula (9), the tool center point coordinate TP could be established according to a plurality of the calibration point information groups of the robotic arm 110 in a plurality of different postures. The controller 140 could obtain (calculate) the coordinates of the tool center point WO1 according to the calibration point information groups, wherein the coordinates of each calibration point information group could be obtained from the linkage parameters (Denavit-Hartenberg parameters) of the robotic arm 110, the coordinates of the joints J1 to J6 and information about the tool center point WO1 relative to the installation surface reference coordinate system (xf-yf-zf), wherein the link parameters could include link offset, joint angle, link length and link twist.
The coordinates of the tool center point WO1 could be obtained (calculated) by the following formula (9):
In formula (9), the matrix T2i is a 4×4 homogeneous conversion matrix which converts the ith calibration point information group from the robotic arm reference coordinate system (xR-yR-zR) to the installation surface reference coordinate system (xf-yf-zf). The matrix W1f in formula (9) includes [Tx Ty Tz]t, which is the coordinate W1f (Tx, Ty, Tz) of the tool center point WO1 relative to the installation surface reference coordinate system (xf-yf-zf), and the matrix [Px Py Pz 1]t includes the coordinate W1R (Px, Py, Pz) of the tool center point WO1 relative to the robotic arm reference coordinate system (xR-yR-zR) in space. Each calibration point information group yields three linear equations through formula (9). Therefore, n calibration point information groups yield 3n equations, and the coordinates of the tool center point WO1 could then be obtained through the pseudo-inverse matrix.
Furthermore, in formula (9), (e11i, e21i, e31i) represents the direction of the vector of the ith calibration point information group in the first axis (for example, the axis xf) relative to the robotic arm reference coordinate system (xR-yR-zR). (e12i, e22i, e32i) represents the direction of the vector of the ith calibration point information group in the second axis (for example, the axis yf) relative to the robotic arm reference coordinate system (xR-yR-zR). (e13i, e23i, e33i) represents the direction of the vector of the ith calibration point information group in the third axis (for example, the axis zf) relative to the robotic arm reference coordinate system (xR-yR-zR). The following formulas (10) and (11) could be obtained from formula (9).
In formula (11), T3t is the transpose matrix of T3, (T3T3t)−1 is the inverse matrix of (T3T3t), and the coordinate (Tx Ty Tz) is the tool center point coordinate TP, and the matrix T3 is a calibration point information group matrix composed of the calibration point information groups.
If the number of the calibration point information groups is sufficient, each element in the matrix T2i corresponding to the ith calibration point information group is substituted into formula (10) and rearranged into the matrix T3 to obtain formula (11), which yields the coordinate W1f (Tx, Ty, Tz) of the tool center point WO1 relative to the installation surface reference coordinate system (xf-yf-zf) and the coordinate W1R (Px, Py, Pz) of the tool center point WO1 relative to the robotic arm reference coordinate system (xR-yR-zR).
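Under the reading that formula (9) states, for every posture i, Ri·[Tx Ty Tz]t + pi = [Px Py Pz]t (with Ri and pi the rotation and translation parts of T2i), the 3n stacked equations can be solved in the least-squares (pseudo-inverse) sense of formula (11). The following is a minimal sketch under that assumption; numpy's lstsq stands in for the explicit pseudo-inverse, and at least three postures with distinct orientations are needed for a full-rank system.

```python
import numpy as np

def solve_tcp(T2_list):
    """T2_list: the 4x4 matrices T2i of the recorded calibration point
    information groups (assumed already computed via forward kinematics
    from the recorded joint angles). Returns the tool center point in the
    flange frame (Tx, Ty, Tz) and in the base frame (Px, Py, Pz)."""
    A_rows, b_rows = [], []
    for T2 in T2_list:
        R, p = T2[:3, :3], T2[:3, 3]
        # R @ t - W1R = -p  ->  [R | -I] @ [t; W1R] = -p
        A_rows.append(np.hstack([R, -np.eye(3)]))
        b_rows.append(-p)
    A, b = np.vstack(A_rows), np.hstack(b_rows)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)  # pseudo-inverse solution
    return x[:3], x[3:]
```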
Of course, the above-mentioned tool center point calibration method is only an example, and each component and/or calibration method of the robotic arm system 100 could be changed according to actual needs; however, such exemplification is not meant to be limiting.
After the tool center point coordinate TP is obtained, the controller 140 could accordingly drive the robotic arm 110 to control the tool center point WO1 to a desired position. As a result, the robotic arm system 100 could perform an automatic teaching process for the robotic arm, and the following description is submitted with
Referring to
In the following, the process of the robotic arm system 100 of
In step S210, as shown in
In step S220, as shown in
In step S230, as shown in
θH=π/2−tan−1(ΔTZ1/LH) (13)
In step S240, the controller 140 determines whether the detection angle θH meets a first specification angle. When the detection angle θH does not meet the first specification angle, the process proceeds to step S250, and the controller 140 drives the tool 10 back to the first position S1. For example, the controller 140 could control the moving element 150 to translate for driving the tool 10 back to the first position S1. The first specification angle is, for example, a specification value of the product, which is the specification value required when the tool 10 detects the detection surface 30 along the first detection direction. In detail, when the detection angle θH meets the first specification angle, the analysis result of the display by the tool 10 (for example, the luminance meter) does not exceed the allowed range (for example, when viewing the display screen of the display at a skewed angle of view, a black screen or abnormal colors will not be generated). The value of the first specification angle depends on the type of product, for example, the maximum viewing angle or the viewing angle of a flat-panel display; however, such exemplification is not meant to be limiting.
In step S260, when the tool 10 returns to the first position S1, the controller 140 adjusts the posture of the robotic arm 110 to change the angle of the tool axis A1 relative to the detection surface 30, and then the process returns to step S210. The controller 140 could repeat steps S210 to S260 until the detection angle θH meets the first specification angle. For example, if the detection angle θH does not meet the first specification angle, the controller 140 controls the robotic arm 110 to rotate the tool 10 by an angle around the second axis (for example, the axis yd), and then the process returns to step S210. Steps S210 to S260 are repeated according to this principle until the detection angle θH meets the first specification angle.
Similarly, as shown in
For example, in step S210, as shown in
In step S220, as shown in
as shown in
θV=tan−1(ΔTZ2/LV) (14)
In step S240, the controller 140 determines whether the detection angle θV meets a second specification angle. When the detection angle θV does not meet the second specification angle, the process proceeds to step S250, and the controller 140 drives the tool 10 back to the second position S2. For example, the controller 140 could control the moving element 150 to translate for driving the tool 10 back to the second position S2. The second specification angle is, for example, a specification value of the product, which is the specification value required when the tool 10 detects the detection surface 30 along the second detection direction. In detail, when the detection angle θV meets the second specification angle, the analysis result of the display by the tool 10 (for example, the luminance meter) does not exceed the allowed range (for example, when viewing the display screen of the display at a skewed angle of view, a black screen or abnormal colors will not be generated). The value of the second specification angle depends on the type of product, for example, the maximum viewing angle or the viewing angle of a flat-panel display; however, such exemplification is not meant to be limiting.
In step S260, when the tool 10 returns to the second position S2, the controller 140 adjusts the posture of the robotic arm 110 to change the angle of the tool axis A1 relative to the detection surface 30, and then the process returns to step S210. The controller 140 could repeat steps S210 to S260 until the detection angle θV meets the second specification angle. For example, if the detection angle θV does not meet the second specification angle, the controller 140 controls the robotic arm 110 to rotate the tool 10 by an angle around the first axis (for example, the axis xd) until the detection angle θV meets the second specification angle.
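For illustration, formulas (13) and (14) can be evaluated directly from the commanded translation distance and the measured stroke difference of the tool center point along the tool axis. The numeric values and the acceptance test below are assumptions for the example.

```python
import math

def detection_angle_h(delta_tz1, L_h):
    """Formula (13): theta_H = pi/2 - atan(delta_TZ1 / L_H)."""
    return math.pi / 2 - math.atan(delta_tz1 / L_h)

def detection_angle_v(delta_tz2, L_v):
    """Formula (14): theta_V = atan(delta_TZ2 / L_V)."""
    return math.atan(delta_tz2 / L_v)

theta_H = detection_angle_h(delta_tz1=2.0, L_h=100.0)  # assumed mm values
theta_V = detection_angle_v(delta_tz2=1.5, L_v=100.0)

# Assumed acceptance test against the specification angles (step S240).
meets_spec_H = math.degrees(theta_H) >= 85.0   # assumed first specification angle
meets_spec_V = math.degrees(theta_V) <= 5.0    # assumed second specification angle
```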
When the detection angle θH meets the first specification angle and the detection angle θV meets the second specification angle, the controller 140 records the joint coordinate combination or performs detection according to the current posture of the robotic arm 110. For example, the controller 140 records the change amount of the motion parameters of the joints J1 to J6 of the robotic arm 110 during steps S210 to S260.
In summary, according to the robotic arm system and the calibration method for the tool center point in the embodiments of the present disclosure, the calibration of the tool center point and the automatic teaching of the robotic arm could be performed even without additional sensors and measuring devices (for example, three-dimensional measurement equipment).
It will be apparent to those skilled in the art that various modifications and variations could be made to the disclosed embodiments. It is intended that the specification and examples be considered as exemplary only, with a true scope of the disclosure being indicated by the following claims and their equivalents.