This application claims the benefit of Taiwan application Serial No. 106142605, filed Dec. 5, 2017 and the benefit of Taiwan application Serial No. 106146427, filed Dec. 29, 2017, the subject matters of which are incorporated herein by reference.
The disclosure relates in general to a robot arm, and more particularly to a robot arm calibration device and a calibration method thereof.
In general, in order to achieve a high level of absolute precision, robot arms must pass through various checks and adjustments in the manufacturing process before shipment. However, the precision of a robot arm is hard to maintain after long-term use, due to mechanical offset or to deviations caused by maintenance of the robot arm (such as replacement of a motor or a gear set). Therefore, how to ensure that the precision of a robot arm stays within the required range, and how to calibrate that precision directly on the production line (in-line), are urgent problems to be resolved for the industry.
The disclosure is directed to a robot arm calibration device and a calibration method thereof, capable of correcting the precision of a robot arm to be corrected by means of an already corrected robot arm.
According to one embodiment, a robot arm calibration device is provided, which includes a light emitter, a light sensing module, a cooperative motion controller and a processing module. The light emitter is disposed on at least one robot arm to emit a light beam. The light sensing module is disposed on at least another robot arm to receive the light beam and convert the light beam into a plurality of image data. The cooperative motion controller is configured to drive the light emitter and the light sensing module on the at least two robot arms to a corrected position and a position to be corrected, respectively. The processing module receives the image data and the motion parameters of the at least two robot arms to calculate an error value between the corrected position and the position to be corrected, and analyzes the image data to output a corrected motion parameter. The cooperative motion controller modifies a motion command of at least one of the at least two robot arms according to the corrected motion parameter.
According to another embodiment, a robot arm calibration method is provided, including the following steps. A light emitter and a light sensing module are disposed on at least two robot arms, respectively. The light emitter and the light sensing module on the at least two robot arms are respectively driven to a corrected position and a position to be corrected. The light emitter emits a light beam to project the light beam onto the light sensing module. The light sensing module receives the light beam and converts the light beam into a plurality of image data. According to the image data and the motion parameters of the at least two robot arms, an error value between the corrected position and the position to be corrected is calculated, and the image data is analyzed to output a corrected motion parameter. The motion command of at least one of the at least two robot arms is modified according to the corrected motion parameter.
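By way of illustration only, the above sequence can be viewed as a measure-fit-compensate loop. The following Python sketch mirrors that loop under loose assumptions; the StubArm class, the measure_spot function and every other name here are hypothetical stand-ins, since the disclosure does not prescribe any programming interface.

```python
# Hypothetical sketch of the calibration loop; the hardware is reduced to
# stand-in objects so the flow is runnable end to end.
import numpy as np

class StubArm:
    """Stand-in for a robot arm driven by the cooperative motion controller."""
    def __init__(self):
        self.pose = np.zeros(2)

    def move_to(self, pose):
        self.pose = np.asarray(pose, dtype=float)

def measure_spot(emitter_arm, sensor_arm):
    """Stand-in for projecting the light beam and reading back image data."""
    return emitter_arm.pose - sensor_arm.pose

arm_a, arm_b = StubArm(), StubArm()   # corrected arm / arm to be corrected
image_data = []
for pose_a, pose_b in [((0.0, 0.0), (0.05, 0.0)),
                       ((0.2, 0.1), (0.25, 0.1))]:
    arm_a.move_to(pose_a)             # corrected position
    arm_b.move_to(pose_b)             # position to be corrected
    image_data.append(measure_spot(arm_a, arm_b))

error_value = np.mean(image_data, axis=0)   # error between the two positions
corrected_motion_parameter = -error_value   # output of the analysis step
print(corrected_motion_parameter)           # used to modify motion commands
```

The detailed description below fills in what this sketch glosses over: the projection model relating the two arms' kinematics to the measured spot, and the numerical fit that produces the corrected motion parameter.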
In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the disclosed embodiments. It will be apparent, however, that one or more embodiments may be practiced without these specific details. In other instances, well-known structures and devices are schematically shown in order to simplify the drawing.
Detailed descriptions of the disclosure are disclosed below with a number of embodiments. However, the disclosed embodiments are for explanatory and exemplary purposes only, not for limiting the scope of protection of the disclosure. Similar/identical designations are used to indicate similar/identical elements.
The light sensing module 120 is configured to receive the light beam L emitted from the light emitter 110, convert the projected image or projected image point of the light beam L into electronic signals, and transmit the electronic signals to the cooperative motion controller 130 through cable or wireless communication. The communication is performed, for example, as follows: the light sensing module 120 and the cooperative motion controller 130 are connected through an external transmission line; or the electronic signal is temporarily stored in the memory of the light sensing module 120, a universal serial bus (USB) flash drive connected to the USB terminal of the light sensing module 120 is used to access the electronic signal, and the electronic signal in the USB flash drive is then transferred to the cooperative motion controller 130; or the electronic signal is temporarily stored in the light sensing module 120 and transmitted to the cooperative motion controller 130 through wireless communication.
In step S13, the light sensing module 120 receives the light beam L and converts the light beam L into a plurality of image data for determining whether an offset is present between the corrected position and the position to be corrected. That is, the position of the robot arm 104 to be corrected is adjusted according to the image data of the light spot on the light sensing unit 121, so that the position of the robot arm 104 to be corrected can meet the measurement condition. The light sensing unit 121 senses the projection of the light beam L, and the image formed on the pixels of the light sensing unit 121 irradiated by the light beam L is converted into image data X_measure. The image data X_measure can be a single position point or a set of position points, and each point represents the projection position of the light beam L in the coordinate system of the light sensing module 120.
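As one way to obtain such a spot position, a minimal sketch follows, assuming the image data arrive as a two-dimensional intensity array; the intensity-weighted centroid used here is a common sub-pixel estimate, although the disclosure does not mandate any particular method.

```python
# Sketch: estimate the beam's projection point X_measure from a 2-D
# intensity image, using an intensity-weighted centroid.
import numpy as np

def spot_centroid(image):
    """Return the intensity-weighted centroid (x, y) of a spot image."""
    img = image.astype(float)
    img -= img.min()                    # crude background removal
    ys, xs = np.indices(img.shape)
    total = img.sum()
    return (xs * img).sum() / total, (ys * img).sum() / total

# Synthetic check: a Gaussian spot centered near pixel (12.3, 7.8).
yy, xx = np.indices((32, 32))
spot = np.exp(-((xx - 12.3) ** 2 + (yy - 7.8) ** 2) / (2 * 1.5 ** 2))
print(spot_centroid(spot))              # approximately (12.3, 7.8)
```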
In an embodiment, the coordinate position function of the light beam L projected onto the light sensing unit 121 can be expressed as X_point = G(R_A(DH_A, θ_A), H_A_emitter, R_B(DH_B, θ_B), H_B_sensor, H_A-B), where R_A(DH_A, θ_A) is a motion parameter of the corrected robot arm 102, H_A_emitter is a relative spatial transform parameter from the end E1 of the corrected robot arm 102 to the light emitter 110, R_B(DH_B, θ_B) is a motion parameter of the robot arm 104 to be corrected, H_B_sensor is a relative spatial transform parameter from the end E2 of the robot arm 104 to be corrected to the light sensing module 120, and H_A-B is a relative spatial transform parameter between the corrected robot arm 102 and the robot arm 104 to be corrected. DH_A denotes the six-axis spatial transform parameters of the corrected robot arm 102, θ_A denotes all the joint angles of the corrected robot arm 102, DH_B denotes the six-axis spatial transform parameters of the robot arm 104 to be corrected, and θ_B denotes all the joint angles of the robot arm 104 to be corrected. In the present disclosure, the motion postures of the two robot arms can be adjusted several times and the light beam L sequentially projected onto each light sensing unit 121 to obtain a plurality of image data.
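A runnable sketch of such a projection model is given below, under simplifying assumptions that the disclosure does not fix: each arm follows the standard Denavit-Hartenberg (DH) convention, the emitter fires along the z-axis of its mounting frame, and the light sensing unit lies in the x-y plane of the sensing-module frame.

```python
# Sketch of a projection model in the spirit of X_point = G(...):
# forward kinematics per arm, then intersection of the beam ray with
# the sensor plane.
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Homogeneous transform of one joint, standard DH convention."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([[ct, -st * ca,  st * sa, a * ct],
                     [st,  ct * ca, -ct * sa, a * st],
                     [0.0,      sa,       ca,      d],
                     [0.0,     0.0,      0.0,    1.0]])

def forward(dh_rows, joint_angles):
    """Pose of the arm end: chain one DH transform per joint."""
    T = np.eye(4)
    for (d, a, alpha), theta in zip(dh_rows, joint_angles):
        T = T @ dh_transform(theta, d, a, alpha)
    return T

def project_beam(T_emitter, T_sensor):
    """Intersect the emitter's z-axis ray with the sensor's x-y plane
    (assumes the beam is not parallel to that plane)."""
    origin, direction = T_emitter[:3, 3], T_emitter[:3, 2]
    p0, n = T_sensor[:3, 3], T_sensor[:3, 2]
    t = np.dot(n, p0 - origin) / np.dot(n, direction)
    hit = origin + t * direction
    local = np.linalg.inv(T_sensor) @ np.append(hit, 1.0)
    return local[:2]    # predicted X_point in sensor coordinates
```

Given end poses T_emitter and T_sensor expressed in a common base frame, for instance composed from R_A(DH_A, θ_A), H_A_emitter, H_A-B, R_B(DH_B, θ_B) and H_B_sensor, project_beam returns the predicted projection that can be compared against the measured X_measure.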
Next, in step S14, the cooperative motion controller 130 records the motion command X_A_point_k of the corrected robot arm 102 and the j-th motion command X_B_point_k_j of the robot arm 104 to be corrected (j = 1 to P, where P is the number of light sensing units 121), and repeats steps S11 to S13 to obtain the j-th image data of the light sensing module 120 in the motion posture k.
In step S15, the processing module 140 records the image data M in all the motion postures k (k = 1 to N); that is, it records the image data M of the light sensing module 120 (the number of image data M is N times P, and M is greater than or equal to the number of parameters to be corrected) and analyzes whether the spot center position of the image data M is at a predicted projecting position. If the spot center position deviates from the predicted projecting position, the motion parameter of the robot arm 104 to be corrected needs to be further adjusted. Next, in step S16, the processing module 140 may calculate a corrected motion parameter according to the motion parameter of the corrected robot arm 102, the motion parameter of the robot arm 104 to be corrected, and the corresponding image data M of the light sensing module 120 in each motion posture.
In an embodiment, the processing module 140 may obtain a correction value according to an error value between the coordinate position of the measured projecting point or projecting pattern in the coordinate system of the light sensing module 120 and the predicted projecting position, and the motion parameter to be corrected can be adjusted by numerical methods so that the error value is minimized and approaches zero.
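To make the numerical-adjustment idea concrete, the following is a self-contained sketch in a deliberately reduced setting: a planar two-link arm whose true link lengths deviate from their nominal values, fitted with scipy.optimize.least_squares as one possible numerical method. The full six-axis problem in the disclosure has the same structure, with more parameters and a projection model such as the one sketched above as the predictor.

```python
# Sketch: identify kinematic parameters by driving the prediction error
# toward zero over many measured postures (planar two-link toy problem).
import numpy as np
from scipy.optimize import least_squares

def tip(lengths, thetas):
    """Planar two-link forward kinematics: tip position for one posture."""
    l1, l2 = lengths
    t1, t2 = thetas
    return np.array([l1 * np.cos(t1) + l2 * np.cos(t1 + t2),
                     l1 * np.sin(t1) + l2 * np.sin(t1 + t2)])

rng = np.random.default_rng(0)
true_lengths = np.array([0.52, 0.31])        # unknown, to be identified
postures = rng.uniform(-np.pi, np.pi, size=(20, 2))
measured = np.array([tip(true_lengths, th) for th in postures])

def residuals(lengths):
    """Stacked prediction errors over all recorded postures."""
    return np.concatenate([tip(lengths, th) - m
                           for th, m in zip(postures, measured)])

fit = least_squares(residuals, x0=[0.50, 0.30])  # nominal values as seed
print(fit.x)                                     # approaches [0.52, 0.31]
```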
In the embodiment, the motion parameter R_A(DH_A, θ_A) of the corrected robot arm 102 is known, and all the joint angles θ_A of the corrected robot arm 102, all the joint angles θ_B of the robot arm 104 to be corrected, and the coordinate position X_point of the light beam L projected on the light sensing unit 121 have been measured. The six-axis spatial transform parameters DH_B of the robot arm 104 to be corrected are then obtained through an optimization algorithm, and the relative spatial transform parameter H_A-B between the corrected robot arm 102 and the robot arm 104 to be corrected, the relative spatial transform parameter H_A_emitter from the end E1 of the corrected robot arm 102 to the light emitter 110, and the relative spatial transform parameter H_B_sensor from the end E2 of the robot arm 104 to be corrected to the light sensing module 120 can be identified at the same time, so as to adjust the motion parameter of the robot arm 104 to be corrected. For example, the spatial coordinate of each joint angle or each arm segment can be adjusted.
Next, in step S17, the processing module 140 outputs the corrected motion parameter to the cooperative motion controller 130, so that the cooperative motion controller 130 modifies a motion command according to the corrected motion parameter to compensate for the offset error, thereby improving the absolute position precision of the robot arm 104 to be corrected and reducing the error value. The processing module 140 may transmit the corrected motion parameter to the cooperative motion controller 130 through an external connection line or through wireless communication. In this way, after receiving the corrected motion parameter, the cooperative motion controller 130 may further modify the motion command of the robot arm 104 to be corrected to complete the calibration procedure.
In step S12, in order to ensure that the light sensing module 120 can accurately receive the light beam L emitted from the light emitter 110, an image capturing device 125 may additionally be disposed on the corrected robot arm 102, and the image capturing device 125 captures a pattern mark 126 on the light sensing module 120 to determine whether the light emitter 110 and the light sensing module 120 are held at the expected relative position, thereby reducing alignment time. The pattern mark 126 may be a clearly identifiable geometric pattern or a two-dimensional feature pattern. Alternatively, the image capturing device 125 may capture the feature at the edge of each light sensing unit 121 as an identification feature.
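As an illustration of how the image capturing device 125 might verify this relative position before measurement, the sketch below assumes a chessboard-style pattern mark and uses OpenCV's corner finder; neither the target type nor the library is required by the disclosure.

```python
# Sketch: locate a chessboard-style pattern mark in a camera frame and
# report its center in pixel coordinates (None if not yet in view).
import cv2

def locate_pattern(frame, pattern_size=(7, 5)):
    """pattern_size is the count of inner corners (per row, per column)."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, pattern_size)
    if not found:
        return None                              # arms not yet aligned
    return corners.reshape(-1, 2).mean(axis=0)   # pattern center (pixels)
```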
In an embodiment, the corrected robot arm 102 and the robot arm 104 to be corrected may have the same or different numbers of joints and arm segments, and the ends E1 and E2 of the corrected robot arm 102 and the robot arm 104 to be corrected may carry the same or different functional components, such as components for drilling, gripping, laser ablation, dispensing, welding and the like. When the corrected robot arm 102 and the robot arm 104 to be corrected have the same numbers of joints and arm segments (e.g., robot arms of the same type) and perform the same function, the corrected robot arm 102 can be used to correct the precision of the robot arm 104 to be corrected, to ensure that the precision of the robot arm before shipment or in the production line is within the required range. Alternatively, when the corrected robot arm 102 and the robot arm 104 to be corrected have different numbers of joints and arm segments (e.g., robot arms of different types) and perform different functions, for example when the two robot arms perform a task in cooperation, the corrected robot arm 102 can be used to correct the precision of the robot arm 104 to be corrected directly in the production line, to ensure that the relative position of the two robot arms is within the required range of precision. It can therefore be seen from the above description that the calibration device 100 and the method thereof of the present disclosure can correct the offset error generated when two robot arms perform cooperative movement.
In the embodiment, in addition to correcting the precision of a single robot arm 104 to be corrected by a single corrected robot arm 102, the precisions of a plurality of robot arms 105 and 106 to be corrected may also be corrected by the single corrected robot arm 102 at the same time.
In the robot arm calibration device and the calibration method thereof disclosed in the above embodiments of the present disclosure, the corrected robot arm is provided to correct the precision of the robot arm to be corrected, and the robot arm manufacturer can calibrate robot arms before shipment. Offset errors in precision due to manufacturing or assembly errors can thus be compensated, and the robot user is allowed to regularly calibrate robot arms at the factory, addressing the difficulty of maintaining mechanical precision after long-term use due to mechanical offset, as well as the offset error caused by maintenance of the robot arm (such as replacement of a motor). In addition, the calibration device and the method thereof of the present disclosure can meet the need of correcting the absolute precision of a robot arm directly on the production line, and can correct the offset error in precision generated when two robot arms perform cooperative movement, so as to ensure that the relative position of the two robot arms is within the required range of precision.
It will be apparent to those skilled in the art that various modifications and variations can be made to the disclosed embodiments. It is intended that the specification and examples be considered as exemplary only, with a true scope of the disclosure being indicated by the following claims and their equivalents.